-
September 5, 2019 at 5:35 pm
soloviev
Subscriber
Hello,
I am running a model that a support person at ANSYS helped create and ran successfully on their own machine. When I try to run it, it does not move past 13 time steps before crashing with the following error:
 *** Error in `/cm/shared/apps/ansys_inc/v194/fluent/fluent19.4.0/lnamd64/3ddp_node/fluent_mpi.19.4.0': double free or corruption (!prev): 0x00000000058dd3e0 ***
MPI was killed with signal 9.
I had IT look at our workstation and HPC, and everything was fine.

I also already tried switching the MPI type in the launcher, which did not change anything.

What could be causing this, and what could be a solution?
Thanks,
Alex
-
September 6, 2019 at 9:26 am
Rob
Forum Moderator
If it's run for some steps, it's usually either diverged (the MPI errors are triggered as the node(s) fail) or the hardware has done something interesting. Can you check where it's saving monitors etc. to make sure they're OK and that there is still disc space? Also try running on one node more or fewer in case it's a parallel issue.
Does anything happen at 12-14 time steps into the calculation, e.g. mesh motion?
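For the disc-space part of that check, a minimal sketch in Python; the path is a hypothetical stand-in for wherever Fluent writes its case/data files and monitor output:

```python
import shutil

# Hypothetical path: point this at the directory Fluent is writing
# case/data files and monitor output to.
work_dir = "/scratch/fluent_run"

total, used, free = shutil.disk_usage(work_dir)
print(f"Free: {free / 1e9:.1f} GB of {total / 1e9:.1f} GB")
```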
-
September 6, 2019 at 4:22 pm
soloviev
Subscriber
Mesh adaption, periodic boundaries, and evaporation were all turned on at the 100-time-step mark, and this error occurs 13 time steps after that. The Ansys contact said it ran past this point on their personal computer. The files I am loading are exactly the same ones he ran.
I have two error files from two runs, which have different error outputs. Please see below:
Thanks,
Alex
-
September 9, 2019 at 12:26 pm
Rob
Forum Moderator
Turn off (dynamic?) adaption and run the model. If you adapt after 13 time steps, how localised would the increase in cell count be? How much RAM have you got per node (and will it be enough)?
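To put the cell-count question in numbers, a back-of-envelope sketch; all figures are illustrative assumptions (hanging-node refinement of a hex cell gives 8 children per level, and ~2 GB per million cells is a commonly quoted rule of thumb for 3D double precision, not a measured value for this case):

```python
# Illustrative numbers only -- not taken from this case.
cells_initial = 10e6        # hypothetical starting cell count
flagged_fraction = 0.05     # hypothetical fraction of cells meeting the refine criterion
max_level = 3               # 8 children per level for a hex -> up to 8**3 = 512x locally

cells_refined = cells_initial * flagged_fraction * 8**max_level
cells_total = cells_initial * (1 - flagged_fraction) + cells_refined

gb_per_million = 2          # rough rule of thumb for 3D double precision
gb_needed = cells_total / 1e6 * gb_per_million
print(f"~{cells_total/1e6:.0f} M cells -> ~{gb_needed:.0f} GB across all nodes")
```

Even a small flagged fraction can multiply the cell count enormously at level 3, which is why a run can be stable right up until the adaption kicks in.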
-
September 9, 2019 at 6:38 pm
soloviev
Subscriber
We have 192 GB of RAM per node and are currently running on 12 nodes.
I turned off dynamic mesh adaption and the model ran past 13 time steps.
Thanks,
Alex
-
September 10, 2019 at 5:12 am
Amine Ben Hadj Ali
Ansys Employee
Which adaptive mesh method is used? How many levels of refinement? I would highly recommend switching to a more recent release when it comes to dynamic grid adaption and VOF-to-DPM, as mentioned in other posts.
-
September 10, 2019 at 7:36 pm
soloviev
Subscriber
I am running on 2019 R2.
These are the current adaption settings:

Calculation activities:
Execute commands (3 defined):
Command-1: every 50 time steps: par part meth metis
Command-2: every 50 time steps: par part reorder-partitions-to-arch
Command-3: every 50 time steps: par part use-stored-part

Mesh adaption:
Refinement criterion: dynamic adapt refine
- field value; more than 1e-08 water vof
Coarsening criterion: dynamic adapt coarsen
- field value; less than 1e-14 water vof
Minimum cell volume: 1e-14
Maximum refinement level: 3
Dynamic adaption: frequency 4
Thanks,
Alex
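One side note on the settings above: assuming the minimum cell volume is in m³ (the units are not stated), the finest cells the adaption can create are around 20 µm across:

```python
# Assuming SI units (m^3) for the minimum cell volume setting.
min_cell_volume = 1e-14
edge = min_cell_volume ** (1 / 3)   # edge length of an equivalent cube
print(f"~{edge * 1e6:.0f} um minimum cell edge")   # ~22 um
```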
 -
October 2, 2019 at 8:31 pm
soloviev
Subscriber
Hello,
Is there any update to this issue?
Thanks,
Alex
-
December 17, 2020 at 8:12 pm
puh69
Subscriber
Alex,
Did you get anywhere with this? I'm having a similar MPI problem when running my overset mesh.
-
December 18, 2020 at 10:41 am
Rob
Forum Moderator
Check the error right at the top of the message. It may be memory related, or it could be the solver: the MPI error can just mean the node crashed as a result of something happening, which is hopefully described in the first line or so of the message.
-
December 18, 2020 at 3:13 pm
puh69
Subscriber
Thanks for the response on this thread. I actually started my own discussion on the matter: /forum/discussion/22932/hpc-failure-fluent#latest. I attached the entire output message file in this comment. My 15-million-cell case runs with 16 GB per process (4 nodes / 20 processes per node), which is crazy, but it won't run with anything less.

Thanks,
Pierce
-
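Reading the original "16Gb of PPN (4N/20PPN)" as 16 GB per process across 4 nodes × 20 processes per node (an assumption about the wording), a quick sketch of what that implies per cell:

```python
# Assumed reading of "16Gb of PPN (4N/20PPN)": 16 GB per process,
# 4 nodes x 20 processes per node = 80 processes in the job.
processes = 4 * 20
total_gb = 16 * processes             # 1280 GB across the whole job
cells = 15e6
kb_per_cell = total_gb * 1e6 / cells  # GB -> KB is a factor of 1e6
print(f"{total_gb} GB total -> ~{kb_per_cell:.0f} KB per cell")
# For comparison, 3D double-precision cases are often quoted at only
# ~1-2 KB per cell, so ~85 KB per cell would indeed be unusually high.
```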
December 18, 2020 at 3:34 pm
Rob
Forum Moderator
OK, thanks. I'll leave that thread open and leave this one for Alex to comment.
-
The topic ‘Fluent MPI Error’ is closed to new replies.