December 23, 2019 at 5:40 pm
Deus Ex Machina
Subscriber

Hello everyone,
I am looking for a method or protocol to estimate the total run time of a simulation on an external cluster that charges a high price per hour, so I need to know (approximately, of course) how much time the machine will spend on the job.
For desktop computers it is enough to run the solver for a few seconds, time it, and then extrapolate.
For instance, if I have 10 million elements to solve in Fluent and the workstation has 96 virtual cores at 2.6 GHz and at least 300 GB of RAM, how many hours will it take?
What is the standard protocol for estimating this in industrial problems? It is very important to know this in order to budget the project correctly.
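For example, on the desktop the extrapolation I mean looks roughly like the sketch below (all the numbers in it are hypothetical placeholders, not measurements, and the billing model of hours times cores times a per-core-hour price is an assumption about how a cluster might charge):

```python
# Hypothetical numbers for illustration only -- replace with your own
# benchmark timings and your cluster's actual pricing.

def estimate_cost(bench_iters, bench_wall_s, total_iters,
                  n_cores, price_per_core_hour):
    """Extrapolate total wall time and cost from a short benchmark run.

    bench_iters:          iterations timed in the benchmark run
    bench_wall_s:         wall-clock seconds the benchmark took
    total_iters:          iterations expected for the full solution
    n_cores:              cores the job will be billed for
    price_per_core_hour:  assumed cluster price per core-hour
    """
    s_per_iter = bench_wall_s / bench_iters
    total_hours = s_per_iter * total_iters / 3600.0
    cost = total_hours * n_cores * price_per_core_hour
    return total_hours, cost

# Example: 50 iterations took 600 s; we expect ~5000 iterations to converge,
# running on 96 cores at $0.05 per core-hour (all assumed values).
hours, cost = estimate_cost(bench_iters=50, bench_wall_s=600,
                            total_iters=5000, n_cores=96,
                            price_per_core_hour=0.05)
print(f"~{hours:.0f} h wall time, ~${cost:.0f}")
```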
Thank you very much and merry Christmas to all!
December 24, 2019 at 10:12 am
Rob
Forum Moderator

There's not really a good way to estimate this, as it's very model dependent.
For 10M cells you'll need (roughly) 20 GB RAM (assuming minimal chemistry and no/few additional phases). You want "real" cores, so if the system has hyperthreading you need to avoid using both threads on a core. 10M cells will (again, physics depending) scale fairly well at around 100 cores; I'd not use more than about 200 as you'll begin to lose efficiency.
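Those rules of thumb can be captured in a quick sketch (a minimal Python illustration; the 2 GB per million cells and the cells-per-core figures are rough guides that vary with the physics, not guarantees):

```python
def size_job(n_cells_millions):
    """Rough Fluent job sizing from the rules of thumb above."""
    ram_gb = 2.0 * n_cells_millions           # ~20 GB RAM for 10M cells
    sweet_spot_cores = n_cells_millions * 10  # ~100k cells per core scales well
    upper_cores = n_cells_millions * 20       # below ~50k cells/core, efficiency drops
    return ram_gb, sweet_spot_cores, upper_cores

ram, cores, cap = size_job(10)  # the 10M-cell example
print(f"~{ram:.0f} GB RAM, run on ~{cores:.0f} cores, no more than ~{cap:.0f}")
```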
December 24, 2019 at 8:07 pm
Deus Ex Machina
Subscriber

Dear rwoolhou,
First of all, thanks for your comments.
10M cells was an arbitrary example. Maybe I didn't explain my question well: when we have to solve an industrial problem whose simulation will be huge (on the order of millions of cells), we usually need an external computing facility, which is expensive per hour and per core. How do we estimate the time (and therefore the money) needed, so we know whether the budget will be accepted?
I know it depends on the model type and physics, but (in my opinion) there must be a protocol, or something that engineers do in these cases.
Thank you,
Regards.
The topic ‘Estimate the calculation time (Fluent solver) for huge simulations’ is closed to new replies.