-
October 7, 2019 at 3:49 am
learner
Subscriber
Hello all,
I am running an HPC analysis for an ANSYS HFSS project across more than 2 nodes in ANSYS Electronics Desktop.
It works fine with one remote machine, but it fails when using 2 or more remote nodes.
When the analysis starts, it hangs with the message "Determining memory availability on distributed machine on XXX".
After a long time, it reports "could not start memory inquiry solver. check distributed installations, MPI availability, MPI authentication and firewall settings".
I have already searched Google and found this article /forum/forums/topic/best-way-to-install-a-mpi-for-ansys-electronics-desktop-19-2-share-memory-and-cores/
but the error has not gone away.
Please help me. Thanks.
-
October 7, 2019 at 11:11 pm
tsiriaks
Ansys Employee
If you are sure you have already set things up according to those 6 steps, this will be tricky.
First, let's check whether IBM MPI works. The requirement for any MPI to work is that the solving machines must be on a domain; 'Workgroup' will not work.
If they are on a domain, try this command to test IBM MPI (change the installation path and version accordingly):
"%MPI_ROOT%\bin\mpirun" -hostlist localhost:2,MachineB:2 "C:\Program Files\AnsysEM\AnsysEM19.4\Win64\schedulers\diagnostics\Utils\pcmpi_test.exe"
NOTE: Change MachineB to the hostname or IP of the other machine.
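For reference, a minimal sketch of running the diagnostic with the password cached up front (the `-cache` flag appears in mpirun's own "No cached password" warning later in this thread; MachineB and the AnsysEM19.4 path are placeholders to adjust for your install):

```shell
:: Run the two-node diagnostic, caching the Windows password for MPI runs
:: so the remote mpid service can authenticate without prompting each time.
"%MPI_ROOT%\bin\mpirun" -cache -hostlist localhost:2,MachineB:2 ^
  "C:\Program Files\AnsysEM\AnsysEM19.4\Win64\schedulers\diagnostics\Utils\pcmpi_test.exe"
```

On success, each rank should print a "Hello world!" line naming the machine it ran on; an authentication failure instead produces InitializeSecurityContext errors.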
-
July 22, 2020 at 7:15 am
hexuan
Subscriber
Hi Win,
I ran the command to test IBM MPI (with the installation path and version changed accordingly):
"%MPI_ROOT%\bin\mpirun" -hostlist localhost:2,MachineB:2 "C:\Program Files\AnsysEM\AnsysEM20.1\Win64\schedulers\diagnostics\Utils\pcmpi_test.exe"
but I got the output below:
C:\Users\86188>"%MPI_ROOT%\bin\mpirun" -hostlist localhost:2,121.248.51.72:2 "C:\Program Files\AnsysEM\AnsysEM20.1\Win64\schedulers\diagnostics\Utils\pcmpi_test.exe"
mpirun: Drive is not a network mapped - using local drive.
WARNING: No cached password or password provided.
use '-pass' or '-cache' to provide password
ERR-Client: InitializeSecurityContext failed (0x8009030e)
ERR - Client Authorization of socket failed.
Command sent to service failed.
mpirun: ERR: Error adding task to job (-1).
mpirun: mpirun_mpid_start: thread 2172 exited with code -1
mpirun: mpirun_winstart: unable to start all mpid processes.
mpirun: Unable to contact remote service or mpid
mpirun: An mpid process may still be running on 121.248.51.72
Could you please tell me how to solve this problem?
What else should I do besides those 6 steps in /forum/forums/topic/hpc-setup-for-ansys-2020r1/?order=all ?
Is there something I should do to activate MPI?
Thanks
Xuan
-
July 23, 2020 at 11:33 pm
tsiriaks
Ansys Employee
Hi Xuan,
The key error is this:
ERR-Client: InitializeSecurityContext failed (0x8009030e)
ERR - Client Authorization of socket failed.
but I have not seen this error before. What happens if you run the command from the other machine to this local machine? Do you get the same error? If so, you probably need to try temporarily disabling the anti-virus and firewall on both machines for a quick test.
Thanks,
Win
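For the quick firewall test suggested above, a hedged sketch of the Windows commands (run from an elevated command prompt on both machines, and re-enable the firewall as soon as the test is done):

```shell
:: Temporarily turn off Windows Firewall for all profiles (elevated prompt).
netsh advfirewall set allprofiles state off

:: ... rerun the mpirun diagnostic from each machine ...

:: Re-enable the firewall once the test is complete.
netsh advfirewall set allprofiles state on
```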
-
December 1, 2020 at 6:51 am
mahesh2444
Subscriber,nI would like to know how you configured the IBM MPI for successful simulations. I am having two pc's and I described my issue atn/forum/discussion/22480/facing-issues-while-setting-up-distributed-memory-simulations-in-ansys-edt-hfss-2020r1#latestnDo I need to do anything post installation of IBM MPI apart from the above mentioned things ? Do I need to register my credentials anywhere else for the mpirun to work ?nI have done the credential registration part as specified user in remote analysis available in AEDT GUI. nCan you please help me to get out of this problem ?nThanksnMaheshn -
December 1, 2020 at 10:17 am
mahesh2444
Subscriber
Hello all,
An update on my question. I have two machines: DESKTOP-CLH2LM1 (A) and DESKTOP-B4I9FQ7 (B).
When I run the test with A as localhost and B as the other machine, the MPI test command prints "Hello world!" output, indicating a good connection between A and B:
C:\Users\Mahesh>%MPI_ROOT%\bin\mpirun -pass -hostlist localhost:2,DESKTOP-B4I9FQ7:2 %ANSYSEM_ROOT201%\schedulers\diagnostics\Utils\pcmpi_test.exe
Password for MPI runs:
mpirun: Drive is not a network mapped - using local drive.
Hello world! I'm rank 0 of 4 running on DESKTOP-CLH2LM1
Hello world! I'm rank 1 of 4 running on DESKTOP-CLH2LM1
Hello world! I'm rank 2 of 4 running on DESKTOP-B4I9FQ7
Hello world! I'm rank 3 of 4 running on DESKTOP-B4I9FQ7
But when I tried to run the same MPI test command with B as localhost and A as the other machine, I got the following output:
C:\Users\HP>%MPI_ROOT%\bin\mpirun -pass -hostlist localhost:2,DESKTOP-CLH2LM1:2 %ANSYSEM_ROOT201%\schedulers\diagnostics\Utils\pcmpi_test.exe
Password for MPI runs:
mpirun: Drive is not a network mapped - using local drive.
ERR-Client: InitializeSecurityContext failed (0x80090308)
ERR - Client Authorization of socket failed.
Command sent to service failed.
mpirun: ERR: Error adding task to job (-1).
mpirun: mpirun_mpid_start: thread 19792 exited with code -1
mpirun: mpirun_winstart: unable to start all mpid processes.
mpirun: Unable to contact remote service or mpid
mpirun: An mpid process may still be running on DESKTOP-CLH2LM1
I want to know why the output is like this and what settings I have to change to get the same output in both directions.
To test the distributed simulation feature, I started the Helical_Antenna simulation on Machine A (it is available in the examples, and the ANSYS 2020 R1 Help advises using it as a test case). I set up an analysis configuration consisting of two machines, with Machine B first in the list followed by localhost.
But the simulation steps such as meshing and solving are performed only on Machine B and do not use any of the hardware on Machine A. Why did this occur?
What settings do I need to modify to use both machines in the simulation?
P.S.: Machine A runs Windows 10 Pro while Machine B runs Windows 10 Home. There is also one generation of difference between the processors of the two machines. I have completely disabled the firewalls on both machines. They are on the 'WorkGroup' domain.
Thanks
Mahesh
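One thing worth verifying, given the note earlier in the thread that IBM MPI requires the machines to be on a domain rather than a workgroup: you can check each machine's membership from a command prompt. A minimal sketch:

```shell
:: Prints "Domain: <name>" on a domain-joined machine, or "Domain: WORKGROUP"
:: on a workgroup machine, which (per the advice above) IBM MPI does not support.
systeminfo | findstr /B /C:"Domain"
```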
-
- The topic ‘mpi authentication in HPC using multiple nodes in ANSYS Electronics’ is closed to new replies.