@daidezhi
2017-07-22T22:22:16.000000Z
OpenFOAM
TACC
A typical directory structure of an OpenFOAM simulation case is shown below:
$case                        # case root directory
├── constant                 # mesh and transport properties
│   ├── polyMesh
│   └── transportProperties
├── 0                        # initial and boundary conditions
│   ├── alpha.water
│   ├── p
│   └── U
└── system                   # flow solver configurations
    ├── controlDict          # computation control, i.e., time step, etc.
    ├── fvSchemes            # fvm operator schemes, i.e., ddt(rho, U), etc.
    ├── fvSolution           # algebraic equation solvers and PISO, SIMPLE or PIMPLE algorithm settings, etc.
    └── sampleDict           # sample data, i.e., 0.5 fraction value iso-surface, etc.
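The skeleton above can be scaffolded from the shell. This is a minimal sketch with 'myCase' as a placeholder case name; a real case also needs valid OpenFOAM dictionary contents in each of these files:

```shell
# Scaffold the empty case skeleton shown above ("myCase" is a placeholder).
case=myCase
mkdir -p "$case"/constant/polyMesh "$case"/0 "$case"/system
touch "$case"/constant/transportProperties
touch "$case"/0/alpha.water "$case"/0/p "$case"/0/U
touch "$case"/system/controlDict "$case"/system/fvSchemes \
      "$case"/system/fvSolution "$case"/system/sampleDict
ls -R "$case"   # verify the layout matches the tree above
```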
A SLURM job script for a serial OpenFOAM case is shown below:
#!/bin/bash
#----------------------------------------------------
# SLURM job script to run applications on
# TACC's Lonestar 5 system.
#
# Your job description...
#----------------------------------------------------
#SBATCH -J your_job_name # Job name
#SBATCH -o your_job_name_%j.out # Name of stdout output file (%j expands to jobId)
#SBATCH -e your_job_name_%j.err # Name of stderr output file (%j expands to jobId)
#SBATCH -p normal # Queue name
#SBATCH -N 1 # Total number of nodes requested
#SBATCH -n 1 # Total number of mpi tasks requested
#SBATCH -t 48:00:00 # Run time (hh:mm:ss) - 48 hours (maximum)
# Slurm email notifications are now working on Lonestar 5
#SBATCH --mail-user=your_email_address
#SBATCH --mail-type=all
# Launch the executable flow solver based on OpenFOAM
flow_solver_name -case $case
# flow_solver_name ==> the OpenFOAM flow solver name
# $case ==> the absolute (full) path of your root case directory
# For example, if you want to run the 'interFoam' solver and your case is located in
# '$WORK/aa/bb/cc/your_case', the launch command above should be
# interFoam -case $WORK/aa/bb/cc/your_case
Save this script as your_job_script, then cd to the directory where your script file is located and run sbatch your_job_script in the terminal to submit the job. The command showq -U your_tacc_user_id may help you monitor your job status.
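The submit-and-monitor workflow can be sketched as a small helper; 'sbatch' and 'showq' exist only on the cluster, so this sketch just prints the commands it would run (drop the 'echo' inside the function to execute for real on Lonestar 5; the script path is a placeholder):

```shell
# Sketch: cd to the script's directory, then submit it with sbatch.
submit_job() {
    script="$1"
    cd "$(dirname "$script")" || return 1
    echo sbatch "$(basename "$script")"        # remove 'echo' on the cluster
}
submit_job /tmp/your_job_script                # placeholder path
echo showq -U your_tacc_user_id                # then monitor the queue
```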
Note that before downloading the numerical results, it is better to compress them first, again using a SLURM job script:
#!/bin/bash
#----------------------------------------------------
# SLURM job script to run applications on
# TACC's Lonestar 5 system.
#
# Your job description...
#----------------------------------------------------
#SBATCH -J your_job_name # Job name
#SBATCH -o your_job_name_%j.out # Name of stdout output file (%j expands to jobId)
#SBATCH -e your_job_name_%j.err # Name of stderr output file (%j expands to jobId)
#SBATCH -p normal # Queue name
#SBATCH -N 1 # Total number of nodes requested
#SBATCH -n 1 # Total number of mpi tasks requested
#SBATCH -t 48:00:00 # Run time (hh:mm:ss) - 48 hours (maximum)
# Slurm email notifications are now working on Lonestar 5
#SBATCH --mail-user=your_email_address
#SBATCH --mail-type=all
# Compress the case directory
tar -zcvf saved_name.tar.gz $case
# saved_name ==> the name you want the compressed file to be
# $case ==> the absolute (full) path of your root case directory
# For example, if your case is located in
# '$WORK/aa/bb/cc/your_case', the compression command above should be
# tar -zcvf saved_name.tar.gz $WORK/aa/bb/cc/your_case
# Once the compression is finished, the compressed file 'saved_name.tar.gz' will be
# stored in the current working directory of the terminal.
DO NOT run the compression directly on a login node; doing so may lead to account suspension.
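As a minimal sketch of the compression step (runnable on any machine for testing; 'demo_case' and 'saved_name' are placeholder names):

```shell
# Build a dummy case directory and compress it; the archive is written to
# the current working directory.
mkdir -p demo_case/0 demo_case/constant demo_case/system
tar -zcvf saved_name.tar.gz demo_case
ls -lh saved_name.tar.gz   # archive appears in the current directory
```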
Dezhi Dai @ MAE Department, UT Arlington