Ansys Products

Discuss installation & licensing of our Ansys Teaching and Research products.

RSM Configuration Error: AWP_ROOT231 is not set on cluster compute node.

    • Alex Pritchard
      Subscriber

      I have been trying to use the RSM tool to connect to my university's high-performance computing facility. So far, the test tool in the RSM configuration can successfully connect, write, and submit the test. After this, the test fails, stating 'AWP_ROOT231 is not set on cluster compute node'. Any ideas on what I can try to resolve this would be really useful!

      The job report is as follows:

                              Ansys Job Report


         Job Name:  K2-lowpri Test Job
         Owner:  ADS\3057254
         Submitted:  01/06/2023 11:44:45
         Priority:  
         Server:  Kelvin2 HPC
         Queue:  K2-lowpri
         Final Status:  Failed


         Job Log:

      RSM Version: 23.1.0.0, Build Date: 18/11/2022 18:15:44
      RSM File Version: 23.1.0.0+Branch.releases-release-23.1.Sha.1309ddcecdb7e4a72b8bdc4bae53e2b96fffead9
      RSM Library Version: 23.1.0.0
      Job Name: K2-lowpri Test Job
          Type: SERVERTEST
          Client Directory: C:\Users\3057254\AppData\Local\Temp\RsmConfigTest\jcsf1egx.5dg
          Client Machine: MAE-139825
          Queue: K2-lowpri [Kelvin2 HPC, k2-lowpri]
          Template: SERVERTEST
      Cluster Configuration: Kelvin2 HPC [kelvin2.qub.ac.uk]
          Cluster Type: SLURM
          Custom Keyword: blank
          Transfer Option: External
          External Storage Type: SSH-SCP
          Staging Directory: /mnt/scratch2/users/3057254
          Delete Staging Directory: False
          Local Scratch Directory: /mnt/scratch2/users/3057254
          Platform: Linux
          Using External Communication Protocol
              Remote Account: 3057254
          Cluster Submit Options: blank
          Normal Inputs: [*,commands.xml,*.in]
          Cancel Inputs: [-]
          Excluded Inputs: [-]
          Normal Outputs: [*]
          Failure Outputs: [-]
          Cancel Outputs: [-]
          Excluded Outputs: [-]
          Inquire Files:
            normal: [*]
            inquire: [*.out]
      Submission in progress...
      Runtime Settings:
          Job Owner: ADS\3057254
          Submit Time: 01 June 2023 11:44
          Directory: C:\Users\3057254\AppData\Local\Temp\RsmConfigTest\jcsf1egx.5dg
          Alternate Platform Directory: /mnt/scratch2/users/3057254/ua112kx2.5j2
      Transferring file: commands.xml              | 2 kB |   2.5 kB/s | ETA: 00:00:00 | 100%
      Transferring file: tEst WiTh sPaCe.in        | 0 kB |   0.0 kB/s | ETA: 00:00:00 | 100%
      Transferring file: tEsT.in                   | 0 kB |   0.0 kB/s | ETA: 00:00:00 | 100%
      Transferring file: tEsT.in                   | 0 kB |   0.0 kB/s | ETA: 00:00:00 | 100%
      2.68 KB, 10.9 sec (.25 KB/sec)
      Submission in progress...
      JobType is: SERVERTEST
      Final command platform: Linux
      RSM_PYTHON_HOME=C:\Program Files\ANSYS Inc\v231\commonfiles\CPython\3_7\winx64\Release\python
      RSM_HPC_JOBNAME=RSMTest
      Distributed mode requested: True
      RSM_HPC_DISTRIBUTED=TRUE
      Running 4 commands
      Job working directory: C:\Users\3057254\AppData\Local\Temp\RsmConfigTest\jcsf1egx.5dg
      Number of CPU requested: 1
      AWP_ROOT231=C:\Program Files\ANSYS Inc\v231
      Testing writability of working directory...
      C:\Users\3057254\AppData\Local\Temp\RsmConfigTest\jcsf1egx.5dg
      If you can read this, file was written successfully to working directory
      Writability test complete
      Checking queue k2-lowpri exists ...
      Job will run locally on each node in: /mnt/scratch2/users/3057254/ua112kx2.5j2
      Transferring file: control_9a89de12-98e4-4a7 | 3 kB |   3.6 kB/s | ETA: 00:00:00 | 100%
      Transferring file: clusterjob_9a89de12-98e4- | 1 kB |   1.6 kB/s | ETA: 00:00:00 | 100%
      5.35 KB, 5.79 sec (.92 KB/sec)
      JobId was parsed as: 11710003
      Job submission was successful.
      Trying to download diagnostic output files.
      Transferring file: stdout_9a89de12-98e4-4a79 | 0 kB |   0.1 kB/s | ETA: 00:00:00 | 100%
      Transferring file: exitcode_9a89de12-98e4-4a | 0 kB |   0.0 kB/s | ETA: 00:00:00 | 100%
      AWP_ROOT231 is not set on cluster compute node.
      ClusterJobs Command Exit Code: 1000
      Trying to download diagnostic output files.
      Transferring file: stdout_9a89de12-98e4-4a79 | 0 kB |   0.1 kB/s | ETA: 00:00:00 | 100%
      AWP_ROOT231 is not set on cluster compute node.
      Trying to cleanup diagnostic files.
      No files to transfer.

    • MangeshANSYS
      Ansys Employee

      Please see this documentation page for setting the variable
      https://ansyshelp.ansys.com/account/secured?returnurl=/Views/Secured/corp/v231/en/wb_rsm/wb_nx_path_conf_426.html
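
      For reference, one common way to do this on a Linux cluster looks like the sketch below (this is not necessarily the exact procedure on the linked page, and the install path is a placeholder for your cluster's actual Ansys v231 location):

      # Export the variable in your shell startup file on the cluster;
      # home directories are usually shared, so compute nodes will see it.
      # /opt/ansys_inc/v231 is a placeholder, not necessarily your site's path.
      echo 'export AWP_ROOT231=/opt/ansys_inc/v231' >> ~/.bashrc

      # Confirm it resolves in a fresh login shell
      # (how RSM invokes the remote shell may differ).
      bash -lc 'echo "$AWP_ROOT231"'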

    • Alex Pritchard
      Subscriber

      Hello Mangesh. Thank you for the reply! I followed the instructions and I now get the error message 'AWP_ROOT231 directory does not exist from execution node.' Any ideas on what to try next?

      The error report is now as follows:

      RSM Version: 23.1.0.0, Build Date: 18/11/2022 18:15:44
      RSM File Version: 23.1.0.0+Branch.releases-release-23.1.Sha.1309ddcecdb7e4a72b8bdc4bae53e2b96fffead9
      RSM Library Version: 23.1.0.0
      Job Name: Sandbox Test Job
          Type: SERVERTEST
          Client Directory: C:\Users\3057254\AppData\Local\Temp\RsmConfigTest\ql5lifuh.gok
          Client Machine: MAE-139825
          Queue: Sandbox [Kelvin2 HPC, sandbox]
          Template: SERVERTEST
      Cluster Configuration: Kelvin2 HPC [kelvin2.qub.ac.uk]
          Cluster Type: SLURM
          Custom Keyword: blank
          Transfer Option: External
          External Storage Type: SSH-SCP
          Staging Directory: /mnt/scratch2/users/3057254
          Delete Staging Directory: True
          Local Scratch Directory: /mnt/scratch2/users/3057254
          Platform: Linux
          Using External Communication Protocol
              Remote Account: 3057254
          Cluster Submit Options: blank
          Normal Inputs: [*,commands.xml,*.in]
          Cancel Inputs: [-]
          Excluded Inputs: [-]
          Normal Outputs: [*]
          Failure Outputs: [-]
          Cancel Outputs: [-]
          Excluded Outputs: [-]
          Inquire Files:
            normal: [*]
            inquire: [*.out]
      Submission in progress...
      Runtime Settings:
          Job Owner: ADS\3057254
          Submit Time: 13 June 2023 19:29
          Directory: C:\Users\3057254\AppData\Local\Temp\RsmConfigTest\ql5lifuh.gok
          Alternate Platform Directory: /mnt/scratch2/users/3057254/ppnerzs3.jfs
      Transferring file: commands.xml              | 2 kB |   2.5 kB/s | ETA: 00:00:00 | 100%
      Transferring file: tEst WiTh sPaCe.in        | 0 kB |   0.0 kB/s | ETA: 00:00:00 | 100%
      Transferring file: tEsT.in                   | 0 kB |   0.0 kB/s | ETA: 00:00:00 | 100%
      Transferring file: tEsT.in                   | 0 kB |   0.0 kB/s | ETA: 00:00:00 | 100%
      2.68 KB, 3.6 sec (.74 KB/sec)
      Submission in progress...
      JobType is: SERVERTEST
      Final command platform: Linux
      RSM_PYTHON_HOME=C:\Program Files\ANSYS Inc\v231\commonfiles\CPython\3_7\winx64\Release\python
      RSM_HPC_JOBNAME=RSMTest
      Distributed mode requested: True
      RSM_HPC_DISTRIBUTED=TRUE
      Running 4 commands
      Job working directory: C:\Users\3057254\AppData\Local\Temp\RsmConfigTest\ql5lifuh.gok
      Number of CPU requested: 1
      AWP_ROOT231=C:\Program Files\ANSYS Inc\v231
      Testing writability of working directory...
      C:\Users\3057254\AppData\Local\Temp\RsmConfigTest\ql5lifuh.gok
      If you can read this, file was written successfully to working directory
      Writability test complete
      Checking queue sandbox exists ...
      Job will run locally on each node in: /mnt/scratch2/users/3057254/ppnerzs3.jfs
      Transferring file: control_e6184b2c-ae62-4a1 | 3 kB |   3.6 kB/s | ETA: 00:00:00 | 100%
      Transferring file: clusterjob_e6184b2c-ae62- | 1 kB |   1.6 kB/s | ETA: 00:00:00 | 100%
      5.34 KB, 2.33 sec (2.3 KB/sec)
      JobId was parsed as: 11773405
      Job submission was successful.
      Trying to download diagnostic output files.
      Transferring file: exitcode_e6184b2c-ae62-4a | 0 kB |   0.0 kB/s | ETA: 00:00:00 | 100%
      Transferring file: stdout_e6184b2c-ae62-4a1a | 0 kB |   0.1 kB/s | ETA: 00:00:00 | 100%
      AWP_ROOT231 directory does not exist from execution node.
      ClusterJobs Command Exit Code: 1009
      Trying to download diagnostic output files.
      Transferring file: stdout_e6184b2c-ae62-4a1a | 0 kB |   0.1 kB/s | ETA: 00:00:00 | 100%
      AWP_ROOT231 directory does not exist from execution node.
      Trying to cleanup diagnostic files.
      No files to transfer.

      • MangeshANSYS
        Ansys Employee

        Please refer to the link provided on June 9th. I am pasting it here for reference:

        Please see this documentation page for setting the environment variable
        https://ansyshelp.ansys.com/account/secured?returnurl=/Views/Secured/corp/v231/en/wb_rsm/wb_nx_path_conf_426.html
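
        Since the variable is now set but the directory is reported as missing, one quick check is to see what an actual compute node resolves. A sketch, assuming SLURM's srun is available (partition name taken from your job log; adjust as needed):

        # Print the variable and test that the directory exists on a compute node
        srun --partition=sandbox bash -lc 'echo "$AWP_ROOT231"; ls -d "$AWP_ROOT231"'

        If the ls fails, AWP_ROOT231 is probably pointing at a path that exists on the login node but is not mounted on the compute nodes, or at the Windows-style path from the client machine.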

    • karagol
      Subscriber

      Generally, university HPC systems run Linux, so you have to set the Linux path in your .bashrc on the cluster. For example:

      export AWP_ROOT231=/opt/apps/ansys_ins/v231

      Note that environment variable names are case-sensitive on Linux, so it must be AWP_ROOT231 exactly.
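
      After editing .bashrc, log in again (or source the file) and confirm the directory actually exists, for example:

      # Reload the startup file and check the path
      # (a sketch; substitute your site's actual install location).
      source ~/.bashrc
      ls -d "$AWP_ROOT231"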
