SimScale CAE Forum

The problem in discrete phase model run in multi-CPU

I downloaded a discrete phase model simulation and ran it on my local computer. It runs fine on a single CPU, but when I run it on multiple CPUs, an MPI error occurs with the following message:
mpirun has exited due to process rank 2 with PID 8123 on
node ubuntu exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
[ubuntu:08120] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[ubuntu:08120] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Can you help me, @dheiny? Thank you very much.

Hello @15244624494,

I will allow myself to step in before David does :wink:

It seems that the problem you are facing comes down to an incorrect MPI domain decomposition performed locally. Check that the decomposeParDict in your case's system folder is set up correctly. Unfortunately, you might also be experiencing trouble with the MPI installation and its integration with OpenFOAM on your local machine; that we cannot help you with.
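For reference, a minimal system/decomposeParDict might look like the sketch below. The subdomain count of 4 is illustrative — it must match the number of processes you pass to mpirun (e.g. `mpirun -np 4`) — and scotch is just one of several available decomposition methods:

```
// system/decomposeParDict — minimal sketch, values are illustrative
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      decomposeParDict;
}

// Must equal the process count given to mpirun (-np)
numberOfSubdomains  4;

// scotch needs no geometric coefficients; alternatives include
// simple, hierarchical, and metis (availability depends on your build)
method              scotch;
```

After editing this file, rerun decomposePar before launching the parallel solver, so the domain split on disk matches the new settings.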

Finally, I highly recommend running your cases directly on the SimScale platform. With its straightforward and clean interface you are less prone to typos and setup errors.

Good luck with the project!


Thank you very much. I found what was causing the problem: it was the OpenFOAM version.

Hi @15244624494,

@psosnowski was faster - happy simulating!