Ansys Fluent
Ansys Fluent is proprietary fluid simulation software.
The container can be run on both GPU and CPU partitions. It can be operated interactively (through either a text user interface or a graphical user interface) or through a text batch mode. You can run it through Open OnDemand or through an SSH session.
The current version is ANSYS Fluent 2023R1 in the form of the ansys_fluent_workbench_2023r1.sif container. This contains an Ubuntu 22.04 Linux installation of ANSYS Fluent 2023R1.
It is designed to be operated from within the cluster. It does not accept network commands from the Parallel Settings or Remote section of a Fluent Launcher running outside the cluster.
The Fluent 2023R1 container can only open Workbench and Fluent files that are created in version 2023R1 and earlier.
Quick start (OnDemand graphical mode)
The easiest way to get started is through Open OnDemand, which provides a responsive graphical Linux desktop environment through a web browser.
1. Get an account on the Aoraki cluster.
2. Go to the Open OnDemand dashboard.
3. One of the Pinned Apps is [Otago HPC Desktop]. Click this. You may need to select RAM and partition type for your simulation under Advanced Settings.
4. Click Launch to start the HPC desktop. This brings you to a list of your sessions. When the new session has the status of Running, click the [Launch Otago HPC Desktop] button at the bottom; this brings up the HPC Linux environment.
5. Click on the Terminal Emulator icon in the top bar of the desktop environment. You can also access a terminal under XFCE by going Applications -> Terminal Emulator.
Basic console command:
By default, when you run the container with no arguments, it loads the graphical Workbench and prints the different options in the terminal. From there you can design simulations and open Fluent. To run the graphical ANSYS Fluent Workbench you type:
apptainer run /opt/apptainer_img/ansys_fluent_workbench_2023r1.sif
Starting Fluent from the Workbench:
To start Fluent from the Workbench:
Double click on Fluid Flow (Fluent) under Analysis Systems (that is, under Toolbox) in the left pane of the Workbench app. This brings up a Fluid Flow (Fluent) box in the main Project Schematic area, i.e. double click: Toolbox -> Analysis Systems -> Fluid Flow (Fluent).

If you want to run the Fluent Launcher: double click on Setup within the newly created Fluid Flow (Fluent) block. From there you can select simulation parameters (dimensions, double precision, 2D/3D, etc.). Clicking the [Start] button starts Fluent.

If you want to go straight to Fluent: double click on the Solution entry within the newly created Fluid Flow (Fluent) block and Fluent should load right up.
SSH graphical mode
You can run ANSYS Fluent graphically through an SSH connection with X11 forwarding enabled. You need an X11 server on your local machine to display X11 programs (through WSL on Windows, XQuartz on Mac and Xorg on Linux). This requires a bit more setup to function correctly.
Note that 3D software rendering is used, which is a lot slower than the direct hardware acceleration available on somebody's local machine. This means that the graphical interface is useful for initial setup and validation (confirming that it works) rather than for design and analysis.
One issue is that if the SSH connection dies (e.g. due to the network dropping out), the Fluent/Workbench program will cease to operate. Please take this into consideration when running a lengthy simulation.
# Log into Aoraki with X11 support (X) and compression (C):
ssh -XC userXXX@aoraki-login.otago.ac.nz
# Next you need to create a bash terminal on a node in the cluster. Both options are below:
# Option 1: With GPU: To run a bash shell with 64GB RAM, X11 forwarding on a partition that has a GPU ('aoraki_gpu') for 24 hours:
[userXXX@aoraki-login ~]$ srun --gres=gpu --mem=64G --x11 --pty --partition aoraki_gpu -t 24:00:00 /bin/bash
# Option 2: No GPU: To run a bash shell with 64GB RAM, X11 forwarding on a partition that has no GPU ('aoraki') for 24 hours:
[userXXX@aoraki-login ~]$ srun --mem=64G --x11 --pty --partition aoraki -t 24:00:00 /bin/bash
# Once the bash shell is running on the cluster node it is time to run the graphical Fluent Workbench:
[userXXX@rtis-hpc-YY ~]$ apptainer run /opt/apptainer_img/ansys_fluent_workbench_2023r1.sif
Container modes
The ANSYS Fluent container can be run in different modes depending on the arguments that are given at runtime.
Interactive shell mode (--shell):
The container is run in interactive shell mode by running the container with the --shell argument. This drops you into a bash shell where you can enter in commands:
apptainer run /opt/apptainer_img/ansys_fluent_workbench_2023r1.sif --shell
This will load into an Ubuntu Linux Bash shell environment where the user can issue commands:
To start the Fluent Launcher, type in fluent. If you want to start the graphical Workbench, type in runwb2 (see the example commands section above). Most of the other container modes are wrappers around this inbuilt bash shell.
Text based Scripted Automation
Fluent has built-in automation, i.e. the runtime can be scripted. This takes the form of IronPython 2.7 script files (also known as journal files: .wbjn)
that the ANSYS IronPython runtime uses to open, initialize, run, save and close a simulation.
Python script file example:
You are free to reuse the code developed for ‘Batch mode’ (see below) by having a look at the script found (within the container) at:
/run-files/runwb2-wbprj.py
This can be executed by the ANSYS runtime by the following (when the container is being run in --shell mode):
# Python 2.7 script file:
runwb2 -B -R /run-files/runwb2-wbprj.py
There is an open source project on github that also provides another example using design points (parameterization). This can be found here: https://github.com/sikvelsigma/ANSYS-WB-Batch-Script.
Journal file generation:
The Python journal files can also be generated through the graphical ANSYS Workbench. This is done by enabling journaling, which records your actions in the GUI.
You can then play the recording back in text mode. To do this (in the Workbench) go to File => Scripting => Record Journal.... The Workbench will then ask you for the
journal file name and start recording your actions. This .wbjn file can be modified with a text editor. The Workbench also records your actions automatically when
you enable automatic journaling. To enable this, go to Tools => Options => Journals and Logs (side menu) and check “Record Journal File”. You can set the journal
directory here as well.
To run or ‘replay’ the script/journal you run the following within the container:
# Journal file that was recorded through the GUI:
runwb2 -B -R file_name.wbjn
In the above example, -B runs the Workbench in batch mode (text mode), i.e. it does not load up the GUI. -R runs or replays the
script/journal file that is given (in this case: file_name.wbjn).
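Because a journal file is plain text, it can also be generated programmatically rather than recorded. The sketch below is a minimal illustration in plain Python: it writes out a small .wbjn journal using the Open/Update/Save commands described in the Workbench Scripting Guide. The project path is a placeholder for your own .wbpj file, and Update() here is assumed to update all out-of-date systems (i.e. run the solvers).

```python
# Generate a minimal Workbench journal (.wbjn) file.
# The Open/Update/Save commands follow the Workbench Scripting Guide;
# "projectdir/model1.wbpj" is a placeholder for your own project file.
journal_lines = [
    '# encoding: utf-8',
    'Open(FilePath="projectdir/model1.wbpj")',
    'Update()  # update all out-of-date systems (runs the solvers)',
    'Save(Overwrite=True)',
]

with open("run_model1.wbjn", "w") as f:
    f.write("\n".join(journal_lines) + "\n")
```

The generated file would then be replayed in batch mode from within the container with runwb2 -B -R run_model1.wbjn.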
Batch mode
The container can be run in an automated batch mode where it runs in text mode only (it does not display a GUI).
You give it the path to the workbench file (.wbpj) as a single parameter and it initializes and
runs the solvers within the provided workbench file. It exits after it finishes its simulation(s).
The steps that the container goes through are:
1. Searches the directory that the .wbpj file is in and then opens it.
2. Creates a list of Solution objects that have solvers.
3. For each object with a solver, initializes it with the chosen method (see below).
4. Once the solver has been initialized, any prior simulation data is deleted/cleaned and the simulation is started.
5. Once the simulation has completed, the project is saved and the container exits.
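The first step above can be sketched in plain Python. This is a simplified illustration, not the container's actual code: a hypothetical helper that locates the project file and enforces the rule that only one .wbpj file may be present in the directory at a time.

```python
import glob
import os

def find_project(directory):
    """Locate the single .wbpj workbench file in a directory.

    Mirrors the container's batch-mode behaviour: exactly one
    .wbpj file may be present, otherwise the run is ambiguous.
    """
    projects = glob.glob(os.path.join(directory, "*.wbpj"))
    if len(projects) != 1:
        raise RuntimeError(
            "expected exactly one .wbpj file, found %d" % len(projects))
    return projects[0]
```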
Environmental variables
Environmental variables are used to control the initialization of the model before the simulation is run:
ANSFLU_INITIALIZE_METHOD: Sets the initialization method. The options are: none, hybrid, standard and custom; the default is hybrid. If custom is set you have to define the
TUI initialization command in the ANSFLU_INITIALIZE_METHOD_CUSTOM environmental variable.
For example:
export ANSFLU_INITIALIZE_METHOD="hybrid"
apptainer run /opt/apptainer_img/ansys_fluent_workbench_2023r1.sif projectdir/model1.wbpj
# or:
export ANSFLU_INITIALIZE_METHOD="custom"
export ANSFLU_INITIALIZE_METHOD_CUSTOM="/solve/initialize/init-flow-statistics"
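To illustrate how the two variables combine, here is a hypothetical helper (not the container's actual code) that resolves the environment variables into a Fluent TUI initialization command: hybrid by default, a user-supplied command when custom is selected, and None for none. The TUI command strings for hybrid and standard are assumptions based on standard Fluent TUI syntax.

```python
import os

# Assumed Fluent TUI initialization commands for each method;
# "none" skips the initialization step entirely.
TUI_COMMANDS = {
    "hybrid": "/solve/initialize/hyb-initialization",
    "standard": "/solve/initialize/initialize-flow",
    "none": None,
}

def resolve_init_command(env=os.environ):
    """Map ANSFLU_INITIALIZE_METHOD (+ _CUSTOM) to a TUI command."""
    method = env.get("ANSFLU_INITIALIZE_METHOD", "hybrid")
    if method == "custom":
        custom = env.get("ANSFLU_INITIALIZE_METHOD_CUSTOM")
        if not custom:
            raise ValueError("custom method selected but "
                             "ANSFLU_INITIALIZE_METHOD_CUSTOM is not set")
        return custom
    if method not in TUI_COMMANDS:
        raise ValueError("unknown initialization method: %r" % method)
    return TUI_COMMANDS[method]
```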
An example of a workbench based simulation run:
[user@aorakiXX ~]$ apptainer run /opt/apptainer_img/ansys_fluent_workbench_2023r1.sif projectdir/model1.wbpj
WARN: CUDA not found. NVIDIA container will operate without GPU acceleration.
*******************************************
>> OPENING: model1.wbpj @ 2024-09-05 16:12:54
*******************************************
>> ENUMERATING 'model1.wbpj'
SYSTEM OBJECT:
UserId: Geom 3
Caption: MX1 Geometry
Solver: []
SYSTEM OBJECT:
UserId: FLTG 8
Caption: 400k cells
Solver: ['FLUENT']
*******************************************
>> SOLVING SOLUTIONS
* SYSTEM: FLTG 8
> INITIALIZATION
> INITIALIZE: Hybrid method...
> RUNNING
*******************************************
>> COMPLETED
Started: 2024-09-05 16:12:54
Completed: 2024-09-05 16:21:10
Duration: 0:08:16
*******************************************
>> SAVING
[user@aorakiXX ~]$
There can be only a single .wbpj workbench file present in that directory at a time.
No job completion information is given while the simulation(s) are underway.
Example SLURM files are given (see above) to submit jobs to the SLURM job manager.
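A minimal SLURM submission script for a batch-mode run could look like the following sketch. The partition name, memory, time limit and project path are placeholders to adapt to your own job:

```shell
#!/bin/bash
#SBATCH --job-name=fluent-batch
#SBATCH --partition=aoraki
#SBATCH --mem=64G
#SBATCH --time=24:00:00

# Select the initialization method before launching the container.
export ANSFLU_INITIALIZE_METHOD="hybrid"

# Run the workbench project in text-only batch mode.
apptainer run /opt/apptainer_img/ansys_fluent_workbench_2023r1.sif projectdir/model1.wbpj
```

Submit it with sbatch and check its state with squeue; the solver output lands in the usual slurm-<jobid>.out file.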
Home directory quota constraints
A quota system is in place on Aoraki that limits the data in a user's home directory to 40GB. This can easily be exceeded by CFD simulations with large datasets and multiple runs. The possible solutions involve moving your data onto another storage medium (Ohau or HCS) and running the container from that directory. Please contact RTIS for more information.
Resources
Ansys Workbench Scripting Guide: https://dl.cfdexperts.net/cfd_resources/Ansys_Documentation/Workbench/Workbench_Scripting_Guide.pdf
Fluent TUI commands (text commands within Fluent): https://www.afs.enea.it/project/neptunius/docs/fluent/html/ug/node48.htm
Journaling and Scripting within Fluent (basics): https://www.afs.enea.it/project/neptunius/docs/fluent/html/wbug/node45.htm
Running Fluent on a different HPC cluster using SLURM: https://www.hkhlr.de/sites/default/files/field_download_file/HKHLR-HowTo-Ansys_Fluent.pdf
Starting Parallel Ansys Fluent on a Linux System: https://ansyshelp.ansys.com/public/account/secured?returnurl=//Views/Secured/corp/v242/en/flu_ug/flu_ug_parallel_start_linux_unix.html