June 5, 2024 at 1:24 pm | Geoff (Subscriber)
I have written a compiled UDF in C. I have several static arrays of fixed length for storing boundary IDs and other model parameters. When I define the array dimensions directly, as below, the code runs fine:
static int boundary_IDs[7] = {30,4,36,37,34,5,6};
When I define the array dimensions using an int variable:
const int N_IDs = 7;
static int boundary_IDs[N_IDs] = {30,4,36,37,34,5,6};
the code compiles, but when I run the simulation I get a memory fault and Fluent crashes when it tries to access values from the array within my DEFINE_... macros. Furthermore, the memory fault only occurs on some machines: the code runs fine on single compute nodes of larger HPC clusters (Linux based), but the problems occur when I run the code on a smaller Linux-based server where I do not have access to a full compute node, or when I try to test the code on a Windows desktop.
I have also tried defining N_IDs as a constant:
#define N_IDs 7
but that did not seem to make any difference.
-
June 5, 2024 at 3:00 pm | Rob (Forum Moderator)
I can't see const int used in any of the examples, so I wonder if it's an oddity of C, C++, C#, etc. that will pass the compiler step, as it's not technically an illegal statement, but may not be recognised by all node or OS configurations. Does integer N_IDs = 7 work?
The #define approach may not store N_IDs as an integer?
-
June 5, 2024 at 4:10 pm | Geoff (Subscriber)
Part of the problem must have been the loop over the boundary ID array inside my DEFINE_ macro. I changed from a do-while loop to a for loop with a break statement inside, and changed the definition of N_IDs back to #define. Now the code appears to run on all platforms... I have no clue why this change fixed things, but at least it works now, and that's what matters.
© 2024 Copyright ANSYS, Inc. All rights reserved.