
Incompressible LBM (Lattice Boltzmann) Advanced Options

Geometry

The LBM solver can handle many CAD types and is generally more tolerant of geometry quality than most solvers: open geometry, poor faces, and small faces rarely cause problems. That said, on the odd occasion that you do run into issues or errors, and inspecting the geometry does not reveal anything fundamentally wrong, converting the geometry to an STL and loading that will normally work.

Turbulence models

URANS, DES and LES

(coming soon)

Y+ Requirements

The Y+ requirements for LBM tend to be less strict than those of the equivalent finite volume methods. For example, the K-omega SST (URANS) model in the FVM implementation has an approximate requirement of 30 < Y+ < 300. In the LBM implementation, however, the lower bound is not considered a requirement, and instead an upper bound of less than 500, and certainly no higher than 1000, is recommended. The solver will additionally warn for Y+ values higher than 2000 in the near-wall voxel.

 

The DES models ‘K-omega SST DDES’ and ‘K-omega SST IDDES’ have similar requirements to the URANS ‘K-omega SST’, since their wall treatment is based upon the same model. Pure LES models such as ‘LES Smagorinsky’, however, have Y+ requirements similar to the equivalent FVM model, where Y+ should be around or below 1. This is one of the main reasons why LES is a more expensive simulation.
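As a rough sanity check, the near-wall voxel size needed to hit a given Y+ target can be estimated from a flat-plate skin-friction correlation. The sketch below is only an estimate; the correlation, air properties, and free-stream values are illustrative assumptions, not solver defaults.

# Rough estimate of the wall-adjacent voxel size needed to reach a target Y+,
# using a flat-plate skin-friction correlation (illustrative assumption).
def wall_voxel_size_for_yplus(y_plus_target, u_inf, length, nu=1.5e-5, rho=1.225):
    re_l = u_inf * length / nu                 # Reynolds number over the reference length
    cf = 0.026 / re_l ** (1.0 / 7.0)           # flat-plate skin-friction estimate
    tau_wall = 0.5 * cf * rho * u_inf ** 2     # wall shear stress
    u_tau = (tau_wall / rho) ** 0.5            # friction velocity
    return y_plus_target * nu / u_tau          # first-voxel height in metres

# Example: 10 m/s wind over a 100 m tall building, targeting the recommended Y+ < 500
print(wall_voxel_size_for_yplus(500, u_inf=10.0, length=100.0))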

If the Y+ is much higher than expected, to the point where results are likely to be impacted, the user is warned in the interface:

  • ‘High velocities encountered that might not be handled by the current mesh resolution. Please check your results and consider refining the mesh further.’
  • ‘Mesh resolution might not be sufficient for correct turbulence modeling. Please check your results and consider refining the mesh.’

And in the solver log with an error:

‘ERROR @ DomainHealthStatusExporter.cpp:60: simulationTime=748, domainHealthStatus=(maxVelMag=0.323367, minRho=0.697381, maxRho=1.48795, maxNuT=0.000625238, maxWallCellSizeYp=102696)’

 

This shows a maximum Y+ of roughly 100,000, which is obviously far too high and needs to be reduced. The main ways of doing this are to apply some Reynolds scaling (see the section below) or to refine the surface. If the surface is already refined to a reasonable level, scaling is the only option that does not excessively increase the cost of your simulation.

 

‘Regarding Y+ targets, Pacefish is much more flexible than FVM codes with wall functions. It has no limitation regarding the low-bound value. The results should not suffer from wall resolution as long as the Y+ of the wall-next voxels does not exceed 500 to 1000.

When you rerun the simulations, please consider using the “SST IDDES” turbulence model instead of the plain kOmega-SST and Smagorinsky models. The “SST IDDES” turbulence model is a powerful hybrid LES-URANS model which uses the RANS formulation in the boundary layer and the LES formulation in the farfield, achieving an optimum between both worlds. In the present case the kOmega-SST model probably swallows some of the transient effects. For good results with the plain Smagorinsky model, wall resolution has to be around or below a Y+ of 1.’ – (Eugen, 2018)

Boundary Conditions

(coming soon)

Reynolds Scaling factor

It is commonplace to scale down a model physically for wind tunnel testing, or to slow down the flow; examples include testing a scaled building or a plane in subsonic flow. The Reynolds scaling factor applies this scaling automatically to a full-scale geometry.

This scaling is not only important in wind tunnels for obvious sizing reasons, it is also useful in the LBM method: a high Reynolds number produces a thin boundary layer, and a thin boundary layer needs a finer mesh to resolve it. Since the LBM requires a lattice with an aspect ratio of 1 (a perfect cube), refining to the required Y+ values can become expensive. On top of that, if you were to refine to the required level at the surface without scaling, then, because the Courant number is kept below 1, the number of time steps required for the same physical time would increase, further increasing the cost of the simulation.

The depicted validation case for pedestrian wind comfort, AIJ Case E, is compared to a wind tunnel test where the scale of the city is 1:250; therefore a scaling factor of 0.004 is applied. When dealing with high Reynolds numbers it is recommended to review the literature to understand an acceptable scaling factor for the application or, if doing research, to choose the scale factor matching the wind tunnel you are comparing against.

The Reynolds scaling factor is located in an Incompressible LBM simulation under the Model node.

The Reynolds number is defined as Re = U·L/ν, where L is the reference length, U is the velocity, and ν is the kinematic viscosity of the fluid. When a scaling factor is applied, instead of sizing the geometry down, the viscosity is increased so that the Reynolds number is reduced by the correct ratio.
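To illustrate what the factor does, the sketch below compares the Reynolds number obtained by shrinking the geometry with the one obtained by keeping the geometry at full scale and dividing the viscosity by the scaling factor. The velocity, length, and viscosity values are illustrative assumptions based on the 1:250 example above, not solver defaults.

# Minimal sketch of how the Reynolds scaling factor can be interpreted:
# rather than shrinking the geometry, the kinematic viscosity is increased
# so that the resulting Reynolds number matches the scaled case.
def reynolds(u, length, nu):
    return u * length / nu

u = 5.0            # reference velocity [m/s] (assumed)
L_full = 250.0     # full-scale reference length [m] (assumed)
nu_air = 1.5e-5    # kinematic viscosity of air [m^2/s]
scale = 0.004      # Reynolds scaling factor (1:250)

re_scaled_geometry = reynolds(u, L_full * scale, nu_air)    # shrink the model
re_scaled_viscosity = reynolds(u, L_full, nu_air / scale)   # keep geometry, raise viscosity

print(re_scaled_geometry, re_scaled_viscosity)  # identical Reynolds numbers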

 

(References coming soon)

Meshing

We applied a simple rule of thumb: the mesh of the worst-resolved solid should be at most two refinement levels below that of the best-resolved solid. Because memory consumption scales with second order and computational effort scales with third order, you already get a huge saving compared to resolving all solids at the highest refinement level (93% less memory and 99% less computation time), while at the same time keeping a stable (non-changing) resolution at the wall and getting rid of numerical effects at the transitions.
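As a back-of-the-envelope check of those figures, and under the stated assumption that memory scales with the second order and computation with the third order of the resolution, dropping two refinement levels (voxels four times larger) gives savings of roughly 94% and 98%, close to the quoted 93% and 99%:

# Rough illustration of the quoted savings under the scaling assumptions above.
resolution_ratio = 1 / 2 ** 2                 # two refinement levels below the finest

memory_saving = 1 - resolution_ratio ** 2     # second-order scaling
compute_saving = 1 - resolution_ratio ** 3    # third-order scaling

print(f"memory saving:  {memory_saving:.0%}")   # ~94% (quoted as 93%)
print(f"compute saving: {compute_saving:.0%}")  # ~98% (quoted as 99%)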

Please consider a grid transition at a solid to be an expensive operation in terms of result quality, even if you do not get any NaNs and do not directly see the effects. This means you can use it, but do so carefully, and try to maintain the same refinement level for solids as far as possible. If you follow the above rule of thumb using a VoxelizedVolume with a unidirectional extrusion size of 4 voxels and a directional downstream extrusion of 16 voxels, you will get very good geometry-adapted meshes that are, in almost any case, far better suited to the simulation than refinement regions built from manual boxes. In general, consider refinement boxes a tool from the Navier-Stokes world; they still work for Pacefish, but VoxelizedVolumes work much better.

Results

The difference between the standard and LBM solvers is vast, but the SimScale user interface does an excellent job of making the transition between the two as seamless as possible. One of the biggest differences that cannot be hidden, however, is the vast number of options available for data export.

 

The reason for this level of control is that ordinary OpenFOAM-based solvers are usually run in steady state, and usually on grids of fewer than 20 million cells, so saving the entire result set for the final step is no issue. With the LBM solver, however, it is normal to have grids larger than 100 million cells and, since the simulation is transient, results are produced at every time step. A complete result set is so large that it cannot realistically be returned in full.

 

For this reason, the LBM solver offers three main methods of export: Transient, Statistics, and Snapshot. For each of these, we have the option to specify the interval, the region to be saved, and whether to save surface data, fluid data, or both. Let’s go through these three options.

 

If the machine runs out of storage for the requested results, an error will start appearing in the logs:

 

FATAL @ EnSightExport.cpp:3679: EnSight data export to “export/trans_Pedestrian__PACEFISHSPACE__Level__PACEFISHSPACE__SlicePACEFISH” FAILED because of file I/O issue. Please check the access rights and the available disk space at the destination.

 

If this starts appearing, it is advised to stop the simulation immediately and adjust the result controls to reduce the size of the written data. Any further data produced is unlikely to be written, so additional solve time will not gain you additional results and will simply waste GPU hours.

 

The amount of data a machine can hold is not an exact science; it depends on the mesh size, the export domain size, the frequency of transient result writes, and how long the simulation runs. Although it can be hard to judge, being conservative and realistic, and putting thought into what you actually need at the end of a simulation, will usually produce results without error. If errors like the one above are observed, it does not take long to get an idea of how much data is too much.

 

The general advice for reducing the size of the results is to be conservative, and this can be elaborated upon. If you are interested in results over a large area, for example peak velocities at various points at pedestrian height in a city, you could simply export transient data for the encompassing area. To get good transient results, however, many writes are needed, realistically at every time step, and this will not be possible for a realistic case, i.e. one with cells near or exceeding 100 million and appropriate wall refinements.

An alternative would be to save a much smaller region: we could slice a region using a small region height, which exports a region one cell thick in the vertical direction and drastically reduces the size of the results. We could be even more conservative still: if we know the points we are interested in, we can upload them as a CSV file and export every time step. This reduces the results footprint drastically, freeing up space for other things and getting more out of the simulation.

 

Another example of being conservative might be wind loading, where you simply want to understand the pressures on the surfaces of a building. You could export fluid and surface data around a whole city, or reduce it to just the building of interest. Furthermore, you could drop the volume data and export only surface data, which reduces the results to two dimensions. Further still, we could take a leaf out of the wind tunnel book and once again introduce points on the surface from a CSV file as virtual pressure taps, which export the data only at those points.

 

In the above two examples, it is up to the user to determine the level of results they require; however, every time you drop a level, a significant amount of space is freed up on the machine, and these methods can lead to highly productive simulation runs.

Transient

Transient results are the time-dependent result fields and can be saved at a specified interval. It is generally recommended that only small domains are saved; if an animation is desired, save a small slice frequently. If the machine runs out of storage, your simulation will fail, potentially wasting a lot of solve time, so be conservative with the transient output and think about the exact results you need. (Todo)

Statistics

Statistics can be analysed as a percentage from the end, at intervals greater than or equal to the time step. The percentage from the end defines where the analysis starts. For example, a percentage from the end of 1 (100%) analyses all data from the beginning of the simulation; this might be undesirable, however, since the flow takes some time to initialise and settle into a somewhat periodic, statistically steady flow. Therefore values such as 0.5 (analysis starts 50% into the simulation) or the default of 0.2 (the last 20%, i.e. analysis starts 80% into the simulation) are better.
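As a quick worked example of how the percentage from the end maps to an analysis window (the simulation end time below is an arbitrary assumption, not a default):

# How "percentage from the end" maps to an analysis start time (illustrative values).
simulation_end = 100.0        # total simulated time [s] (assumed)
fraction_from_end = 0.2       # default: analyse only the last 20% of the run

analysis_start = simulation_end * (1.0 - fraction_from_end)
print(analysis_start)         # 80.0 -> statistics are gathered from 80 s to 100 s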

Snapshot

(coming soon)

Probe point

Probe points can be added to act as velocity measuring devices (virtual hot wires, pitot tubes, etc.) or to monitor pressure at a point (virtual pressure taps). Data for each probe is returned as velocity components and pressure, and the full time sequence for each probe is returned at the rate specified in the result control.

 

The format for specifying the Pacefish probe points is:

 

Name, X coordinate, Y coordinate, Z coordinate

 

Where an example is:

 

probe0,8.5,9.25,2.5
probe1,15.0,9.25,2.5
probe2,20.0,9.25,2.5

 

This can easily be done in Excel or your spreadsheet software of choice, which can export in .csv format.
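If you have many probes, the CSV can also be generated with a short script. The sketch below (a standalone helper, not part of SimScale, with assumed coordinates) writes a line of probes at pedestrian height in the name, x, y, z format shown above:

# Write a probe-point CSV in the "name, x, y, z" format shown above.
import csv

# A line of probes spaced 2.5 m apart at 2.5 m height (illustrative values).
probes = [(f"probe{i}", 8.5 + 2.5 * i, 9.25, 2.5) for i in range(10)]

with open("probes.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for name, x, y, z in probes:
        writer.writerow([name, x, y, z])   # e.g. probe0,8.5,9.25,2.5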

Sample Rate

It is important to note that if the time step is larger than the requested capture rate, the data is returned at the rate of the time step instead, and the user is warned in the interface:

‘Time step is too large when compared to the capture rate’

This matters when doing a spectral analysis, since the effective sampling frequency can differ from the one requested in the interface. This applies to probe points, force plots, statistical sampling, and transient result field output.
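A quick way to see the consequence for spectral analysis is to compute the effective sampling interval and the corresponding Nyquist frequency; the time step and requested interval below are illustrative assumptions:

# If the solver time step exceeds the requested write interval, data arrives
# once per time step, which lowers the highest resolvable frequency.
dt = 0.002                     # solver time step [s] (assumed)
requested_interval = 0.001     # requested capture interval [s] (assumed)

effective_interval = max(dt, requested_interval)
nyquist_frequency = 1.0 / (2.0 * effective_interval)   # highest resolvable frequency

print(effective_interval, nyquist_frequency)   # 0.002 s -> 250 Hz, not the 500 Hz expected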

 
