We are trying to simulate airflow at Mach 3 over a diamond-shaped airfoil with a 1 m chord and a 15-degree half-angle. Our goal is to calculate lift at angles of attack of 0, 2, 4, 6, 8, 10, 12, 14, and 16 degrees and compare against analytical results from basic compressible flow theory. We have consistently been hitting "Maximum number of iterations exceeded" errors, as well as exceeding the specified maximum Courant number. We are not trying to resolve the boundary layer (at a Reynolds number above 12e6 it would be very thin), nor the shock structures; we only want lift values at the different angles of attack.
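For reference, the analytical baseline we are comparing against is a quick sketch using linearized (thin-airfoil) supersonic theory, c_l = 4α/√(M² − 1). This neglects thickness, so for a 15-degree half-angle diamond it is only a rough check; shock-expansion theory would be the more accurate comparison:

```python
import math

# Linearized supersonic thin-airfoil theory: c_l = 4*alpha / sqrt(M^2 - 1).
# Thickness effects are ignored, so treat this as a rough baseline only
# for a wedge this thick.
def cl_linearized(mach, alpha_deg):
    alpha = math.radians(alpha_deg)  # theory requires alpha in radians
    return 4.0 * alpha / math.sqrt(mach**2 - 1.0)

# Table over our planned angle-of-attack sweep at Mach 3
for aoa in range(0, 18, 2):
    print(aoa, round(cl_linearized(3.0, aoa), 4))
```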
Thus far we have had convergence on a few laminar transient simulations and a smaller number of transient k-ω SST simulations. The end solution is fairly steady, so we would like to use a steady-state solver, but we haven't had any luck getting those to converge.
There are four "projects" associated with this endeavor. The first is mine and contains mostly laminar transient simulations. The end of the solver log is shown below for the simulation named "DAirfoil15 - 12aoa, 0mesh", run "Run 1". The error it presents is "Maximum number of iterations exceeded."
Solver log:
ExecutionTime = 347.54 s ClockTime = 380 s
Mean and max Courant Numbers = 0.0164781097357 0.0999838461065
deltaT = 2.61086930744e-06
Time = 0.0177818727461
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0
diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0
diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0
smoothSolver: Solving for Ux, Initial residual = 6.78940078722e-11, Final residual = 6.78940078722e-11, No Iterations 0
smoothSolver: Solving for Uy, Initial residual = 4.40786243635e-11, Final residual = 4.40786243635e-11, No Iterations 0
smoothSolver: Solving for Uz, Initial residual = 1.10328821598e-08, Final residual = 1.10328821598e-08, No Iterations 0
diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0
smoothSolver: Solving for h, Initial residual = 4.60981725109e-11, Final residual = 4.60981725109e-11, No Iterations 0
ExecutionTime = 347.59 s ClockTime = 380 s
Mean and max Courant Numbers = 0.0164799706806 0.0999707283277
deltaT = 2.61150419501e-06
Time = 0.0177844842503
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0
diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0
diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0
smoothSolver: Solving for Ux, Initial residual = 6.78572395445e-11, Final residual = 6.78572395445e-11, No Iterations 0
smoothSolver: Solving for Uy, Initial residual = 4.41030964155e-11, Final residual = 4.41030964155e-11, No Iterations 0
smoothSolver: Solving for Uz, Initial residual = 1.08999660025e-08, Final residual = 1.08999660025e-08, No Iterations 0
diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0
smoothSolver: Solving for h, Initial residual = 4.61279963388e-11, Final residual = 4.61279963388e-11, No Iterations 0
[9]
[9]
[9] ExecutionTime = 347.65 s ClockTime = 380 s
→ FOAM FATAL ERROR:
[9] Maximum number of iterations exceeded
[9]
[9] From function thermo::T(scalar f, scalar T0, scalar (thermo::*F)(const scalar) const, scalar (thermo::*dFdT)(const scalar) const, scalar (thermo::*limit)(const scalar) const) const
[9] in file at line 76.
[9]
FOAM parallel run aborting
[9]
[9] #0 Foam::error::printStack(Foam::Ostream&) at ??:?
[9] #1 Foam::error::abort() at ??:?
[9] #2 Foam::species::thermo >, Foam::sensibleEnthalpy>::THs(double, double, double) const at ??:?
[9] #3 Foam::hePsiThermo >, Foam::sensibleEnthalpy> > > >::calculate() at ??:?
[9] #4 Foam::hePsiThermo >, Foam::sensibleEnthalpy> > > >::correct() at ??:?
[9] #5
[9] at ??:?
[9] #6 __libc_start_main in "
[9] #7
[9] at ??:?
MPI_ABORT was invoked on rank 9 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
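From what we understand, the abort comes from the Newton iteration inside OpenFOAM's thermo::T(), which recovers temperature from the sensible enthalpy field. A minimal sketch of that loop (our simplification, assuming constant cp; OpenFOAM uses polynomial cp and a limit() function) shows where the message originates. The point is that the error usually means the enthalpy handed to the thermo package is unphysical (e.g. overshoots near the shocks), not that the iteration cap is too low:

```python
# Simplified sketch (NOT OpenFOAM code) of the guarded Newton solve
# behind thermo::T(): find T such that h(T) = h_target, aborting with
# "Maximum number of iterations exceeded" if it fails to converge.
def T_from_h(h, T0=300.0, cp=1004.5, Tlow=100.0, Thigh=5000.0,
             max_iter=100, tol=1e-4):
    Tref = 298.15  # hypothetical reference temperature for h = cp*(T - Tref)
    T = T0
    for _ in range(max_iter):
        f = cp * (T - Tref) - h      # residual of h(T) - h_target
        Tnew = T - f / cp            # Newton step (dh/dT = cp here)
        Tnew = min(max(Tnew, Tlow), Thigh)  # clamp to physical bounds
        if abs(Tnew - T) < tol:
            return Tnew
        T = Tnew
    # With nonlinear cp and an unphysical h (NaN, negative, or out of
    # range), the loop can fail to converge -- this is the error we see.
    raise RuntimeError("Maximum number of iterations exceeded")
```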
The second associated project, also mine, consists primarily of transient k-ω SST runs. Below is another snippet of the end of the solver log, for the simulation "Airfoil Turbulent 3 - coarser mesh", run "Same but GAMG & Smooth". The same max-iteration error appeared, along with: "The Courant number (CFL) exceeded the limit of 1. You may experience either instability or bad temporal accuracy. It is recommended to keep the CFL number below 0.7. In order to achieve this you need to decrease the time step."
Solver log:
GAMG: Solving for p, Initial residual = 1.42414603943e-05, Final residual = 2.39870784188e-16, No Iterations 1
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 6.64022390833e-05, global = 6.6402239083e-05, cumulative = 4.28467666308
rho max/min : 2.51716593916 0.5
smoothSolver: Solving for omega, Initial residual = 1.55792535405e-07, Final residual = 1.55792535405e-07, No Iterations 0
smoothSolver: Solving for k, Initial residual = 8.30096238902e-06, Final residual = 8.30096238902e-06, No Iterations 0
PIMPLE: iteration 4
smoothSolver: Solving for Ux, Initial residual = 3.70865215024e-06, Final residual = 3.70865215024e-06, No Iterations 0
smoothSolver: Solving for Uy, Initial residual = 7.62429351071e-07, Final residual = 7.62429351071e-07, No Iterations 0
smoothSolver: Solving for Uz, Initial residual = 1.7901142161e-06, Final residual = 1.7901142161e-06, No Iterations 0
smoothSolver: Solving for h, Initial residual = 1.80451190642e-05, Final residual = 5.83363381981e-09, No Iterations 1
GAMG: Solving for p, Initial residual = 1.96769145308e-05, Final residual = 2.09961985396e-16, No Iterations 1
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors : sum local = 6.64022390825e-05, global = 6.64022390823e-05, cumulative = 4.28474306532
rho max/min : 2.5172132253 0.5
smoothSolver: Solving for omega, Initial residual = 1.5566417153e-07, Final residual = 1.5566417153e-07, No Iterations 0
smoothSolver: Solving for k, Initial residual = 7.34234180476e-06, Final residual = 7.34234180476e-06, No Iterations 0
PIMPLE: iteration 5
smoothSolver: Solving for Ux, Initial residual = 8.33388464656e-06, Final residual = 8.33388464656e-06, No Iterations 0
smoothSolver: Solving for Uy, Initial residual = 2.16772705588e-06, Final residual = 2.16772705588e-06, No Iterations 0
smoothSolver: Solving for Uz, Initial residual = 4.49962524042e-06, Final residual = 4.49962524042e-06, No Iterations 0
smoothSolver: Solving for h, Initial residual = 2.23971677768e-05, Final residual = 6.81859483086e-09, No Iterations 1
[1]
[1]
[1] → FOAM FATAL ERROR:
[1] Maximum number of iterations exceeded
[1]
[1] From function thermo::T(scalar f, scalar T0, scalar (thermo::*F)(const scalar) const, scalar (thermo::*dFdT)(const scalar) const, scalar (thermo::*limit)(const scalar) const) const
[1] in file at line 76.
[1]
FOAM parallel run aborting
[1]
[1] #0 Foam::error::printStack(Foam::Ostream&) at ??:?
[1] #1 Foam::error::abort() at ??:?
[1] #2 Foam::species::thermo >, Foam::sensibleEnthalpy>::THs(double, double, double) const at ??:?
[1] #3 Foam::hePsiThermo >, Foam::sensibleEnthalpy> > > >::calculate() at ??:?
[1] #4 Foam::hePsiThermo >, Foam::sensibleEnthalpy> > > >::correct() at ??:?
[1] #5
[1] at ??:?
[1] #6 __libc_start_main in "
[1] #7
[1] at ??:?
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
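To sanity-check the CFL warning, here is a back-of-envelope estimate of the time step needed to hold an acoustic CFL of 0.7. The temperature and smallest cell size below are assumptions for illustration, not values from our case:

```python
import math

# Acoustic CFL estimate for a compressible solver: dt <= CFL * dx / (|U| + a).
gamma, R = 1.4, 287.0
T = 300.0                       # freestream static temperature [K] (assumed)
a = math.sqrt(gamma * R * T)    # speed of sound ~347 m/s
U = 3.0 * a                     # Mach 3 freestream speed
dx = 1e-3                       # smallest cell edge length [m] (assumed)
CFL_target = 0.7
dt_max = CFL_target * dx / (U + a)
print(dt_max)                   # largest stable-ish time step for this dx
```

Halving the smallest cell halves the allowable time step, which is why the adjustable-time-step runs above sit near deltaT ~ 2.6e-6 s.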
The third project is my partner's (we're doing a group project for a CFD course). We were told there is a bug in the code when simulating an angle of attack (using x and y velocity components for the initial and boundary conditions) with a mesh that has refinements. So, since I have run out of computing hours, he has been working with uniform meshes in a fourth project, trying to establish mesh convergence; we will then attempt that level of mesh fineness at our specified angles of attack.
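For the mesh-convergence study, the plan is a standard Richardson-extrapolation check: run three uniformly refined meshes at a fixed refinement ratio, estimate the observed order of accuracy, and extrapolate a mesh-independent lift. A sketch (the lift values here are made up for illustration):

```python
import math

# Richardson extrapolation from three mesh levels, finest first,
# with uniform refinement ratio r between levels.
def richardson(f_fine, f_med, f_coarse, r=2.0):
    # observed order of accuracy
    p = math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)
    # extrapolated (mesh-independent) estimate
    f_exact = f_fine + (f_fine - f_med) / (r**p - 1.0)
    return p, f_exact

# Hypothetical lift coefficients on fine/medium/coarse meshes
p, cl_ext = richardson(0.25001, 0.25004, 0.25016)
print(p, cl_ext)
```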
Any advice is much appreciated. We're new at this, so I'm sure our meshing and simulation setup could both use some improvement. Thanks!