TAGGED: ansys-fluent, hpc
August 5, 2021 at 8:42 am
katz
Subscriber
Hello,
I am simulating residence time distribution for my master's thesis. My problem is the following: to solve the task I use a Linux high-performance cluster at my university. When I iterate on my own PC, a .dat file several GB in size is written; the one written by the HPC is only 1 KB. Do I have a setting wrong? Is this a problem with Fluent, or do I have to contact the HPC staff?
Has anyone had a similar problem and already solved it?
Greetings,
Katz
August 5, 2021 at 8:45 am
DrAmine
Ansys Employee
For sure something is wrong. How are you executing your job on the cluster: an interactive run? If using a batch job, are you reading the case and then writing the .dat file? Maybe there is an issue while writing and the whole process gets stuck; have you checked with cluster management and university IT?
August 5, 2021 at 9:14 am
katz
Subscriber
I execute it as a batch job.
My .jou reads the following:
file/read-case xxxxxxxxxxxx.cas
parallel/partition/method/cartesian-axes 4
solve/initialize/initialize-flow
solve/iterate 10000
wd yyyyyyyyyyyyyyyyyyy.dat
file/confirm-overwrite yes
exit
yes
My .sh file is longer but widely used here at my chair; I changed the run length and the number of nodes but kept everything else the same.
When I watched the simulation in progress, it looked exactly like Fluent on my PC, with the residuals shown, so it is at least calculating. But unlike Fluent on my own computer, it won't fill the .dat file with the calculated data. I have over 700 GB free on the HPC cluster and don't think the data file would exceed this; I expect it to be around 10 GB.
I had this problem before, but IT can't see anything wrong with my scripts.
August 5, 2021 at 10:43 am
Rob
Forum Moderator
You may want to check the script and use
/file/....
The leading / forces the full menu path and prevents partly executed commands and user tinkering from messing up journals. Not sure why you want the confirm-overwrite; it's better to use a unique file name so you avoid the need for it.
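Putting that advice together, a minimal batch journal with fully qualified TUI paths might look like the sketch below. File names are placeholders, not the ones from this thread, and exact TUI menu paths can vary slightly between Fluent versions:

```
; Sketch of a batch journal using fully qualified TUI commands
; (placeholder file names; verify menu paths against your Fluent version)
/file/read-case mycase.cas
/parallel/partition/method/cartesian-axes 4
/solve/initialize/initialize-flow
/solve/iterate 10000
/file/write-data mycase_final.dat
exit
yes
```

Writing to a new, unique file name each run also sidesteps the overwrite prompt entirely.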
August 5, 2021 at 12:37 pm
katz
Subscriber
I deleted the overwrite from the script and put a slash before file, and still the same problem. My supervisor doesn't know how to solve the issue, and the IT/HPC people also don't know what could cause it, or their ideas didn't work either.
I used a .dat file from the first couple of iterations to "initiate" the writing of the .dat file, but it didn't help. Does anyone else have an idea? At this point I would try anything to solve the issue.
August 5, 2021 at 12:43 pm
Rob
Forum Moderator
Which version of the code are you using? Try
/file/wd filename_
and see what happens. The other thing to do is look in the working folder for the transcript and other solver files; they may help identify the problem.
August 5, 2021 at 1:11 pm
katz
Subscriber
I tried what you suggested, but it didn't work. I checked my old files: they look exactly the same as the one I copied into the comment, and they worked. The Fluent file was also looked over by my supervisor.
August 5, 2021 at 3:55 pm
Rob
Forum Moderator
Can you initialise the model on the local machine? If so, also set up autosave (at, for example, every 1000 iterations) and use the journal to read the case & data, run 1001 iterations, and close.
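That autosave workaround can be set from the TUI; a sketch of what the journal might contain (file names are placeholders, and menu paths may differ slightly between Fluent versions):

```
; Sketch of the autosave suggestion (verify TUI paths for your version)
/file/auto-save/data-frequency 1000   ; write a .dat file every 1000 iterations
/file/read-case-data mycase.cas       ; read the locally initialised case & data
/solve/iterate 1001                   ; at least one autosave should fire
exit
yes
```

If the autosaved .dat file appears but the final write does not, that narrows the problem to the end-of-job write step rather than the solver itself.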
August 6, 2021 at 8:49 am
katz
Subscriber
It seems that my mesh has too many elements, and because of that the .dat file is not written. I tried it with a coarser mesh and the .dat file reappears magically. Could anyone tell me why this is an issue on the HPC?
August 6, 2021 at 9:15 am
Rob
Forum Moderator
How many elements does it have? Are you using a Research licence on the HPC? Check in case it's picking up a Teaching licence. Also check the load on the head node (IT may need to help with this).
August 6, 2021 at 2:41 pmkatz
Subscriberme and the hpc uses a Research Licence. The nodes are no Problem, i didnt choose enough at the beginning but figured it out. There are 36 Million elements in my mesh. Since a good computer can solve the simulation if given a few days time and after that writes a .dat file if i ask him too the problm seems to be within the way the hpc works/distributes the date. We will discuss this next week with the it people.
August 6, 2021 at 3:03 pm
Rob
Forum Moderator
With 36M cells, check you're not overloading the system. You'll want 70-90 GB of RAM if running double precision. Also check the cluster memory is well allocated, i.e. all nodes on a box can see all the RAM.
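As a rough sanity check, that 70-90 GB figure is consistent with a commonly quoted rule of thumb of roughly 2 KB of RAM per cell for a double-precision run (an assumption for illustration; actual usage depends heavily on the physics models and solver settings enabled):

```shell
# Back-of-the-envelope RAM estimate for a double-precision run.
# Assumption: ~2 KB per cell (rule of thumb, not an exact figure).
cells=36000000
kb_per_cell=2
echo "$(( cells * kb_per_cell / 1024 / 1024 )) GB"   # prints "68 GB"
```

If the per-node memory request in the batch script is below an estimate like this, the solver can run out of memory exactly at the write step, which would match the empty .dat file seen here.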
Viewing 11 reply threads. The topic 'My .dat file is empty' is closed to new replies.