@daidezhi · 2017-07-22

How to submit an OpenFOAM job to a TACC supercomputer system

Tags: OpenFOAM, TACC


1. OpenFOAM case directory structure

A typical directory structure of an OpenFOAM simulation case is shown below:

$case                         case root directory
├── constant                  mesh and transport properties
│   ├── polyMesh
│   └── transportProperties
├── 0                         initial and boundary conditions
│   ├── alpha.water
│   ├── p
│   └── U
└── system                    flow solver configuration
    ├── controlDict           computation control, e.g., time step, end time, etc.
    ├── fvSchemes             FVM operator schemes, e.g., ddt(rho, U), etc.
    ├── fvSolution            algebraic equation solvers and the PISO, SIMPLE or PIMPLE algorithm settings
    └── sampleDict            data sampling, e.g., the 0.5 phase-fraction iso-surface
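
If you do not already have a case with this layout, one convenient starting point is to copy a tutorial case shipped with OpenFOAM into your TACC work directory. The following is a minimal sketch, assuming the interFoam damBreak tutorial; the exact tutorial sub-path varies between OpenFOAM versions, so adjust it to match your installation.

  # Copy a tutorial case into $WORK as a template (run in a TACC shell after
  # loading your OpenFOAM environment so that $FOAM_TUTORIALS is defined).
  # The damBreak sub-path below is version dependent; adjust as needed.
  mkdir -p $WORK/openfoam_cases
  cp -r $FOAM_TUTORIALS/multiphase/interFoam/laminar/damBreak \
        $WORK/openfoam_cases/your_case

  # Verify that the expected sub-directories (0, constant, system) are present
  ls $WORK/openfoam_cases/your_case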


2. SLURM job script

A SLURM job script for running a serial OpenFOAM case is shown below:

  #!/bin/bash
  #----------------------------------------------------
  # SLURM job script to run applications on
  # TACC's Lonestar 5 system.
  #
  # Your job description...
  #----------------------------------------------------
  #SBATCH -J your_job_name            # Job name
  #SBATCH -o your_job_name_%j.out     # Name of stdout output file (%j expands to jobId)
  #SBATCH -e your_job_name_%j.err     # Name of stderr output file (%j expands to jobId)
  #SBATCH -p normal                   # Queue name
  #SBATCH -N 1                        # Total number of nodes requested
  #SBATCH -n 1                        # Total number of MPI tasks requested
  #SBATCH -t 48:00:00                 # Run time (hh:mm:ss) - 48 hours (maximum)
  # Slurm email notifications are now working on Lonestar 5
  #SBATCH --mail-user=your_email_address
  #SBATCH --mail-type=all

  # Make sure your OpenFOAM environment is loaded (e.g., via the module system
  # or your shell start-up files) before launching the solver.

  # Launch the OpenFOAM-based flow solver
  flow_solver_name -case $case
  # flow_solver_name ==> the name of the OpenFOAM flow solver
  # $case            ==> the absolute (full) path of your case root directory
  # For example, if you want to run the 'interFoam' solver and your case is located in
  # '$WORK/aa/bb/cc/your_case', the launch command above becomes
  # interFoam -case $WORK/aa/bb/cc/your_case

Save this script as your_job_script, then cd into the directory where the script file is located and run sbatch your_job_script in the terminal to submit the job. The standard SLURM command squeue -u your_tacc_user_id (or TACC's showq utility) can help you monitor your job status.
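
For example, the submission and monitoring steps look like the sketch below; the paths and names are the hypothetical placeholders used earlier, not fixed values.

  # Submit the job from the directory that contains the job script
  cd $WORK/aa/bb/cc
  sbatch your_job_script

  # Check the status of your own jobs
  squeue -u your_tacc_user_id

  # Follow the solver log while the job is running
  # (replace <jobId> with the id printed by sbatch)
  tail -f your_job_name_<jobId>.out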


3. Numerical results packaging

Note that before downloading the numerical results, you should compress them, again via a SLURM job script:

  #!/bin/bash
  #----------------------------------------------------
  # SLURM job script to run applications on
  # TACC's Lonestar 5 system.
  #
  # Your job description...
  #----------------------------------------------------
  #SBATCH -J your_job_name            # Job name
  #SBATCH -o your_job_name_%j.out     # Name of stdout output file (%j expands to jobId)
  #SBATCH -e your_job_name_%j.err     # Name of stderr output file (%j expands to jobId)
  #SBATCH -p normal                   # Queue name
  #SBATCH -N 1                        # Total number of nodes requested
  #SBATCH -n 1                        # Total number of MPI tasks requested
  #SBATCH -t 48:00:00                 # Run time (hh:mm:ss) - 48 hours (maximum)
  # Slurm email notifications are now working on Lonestar 5
  #SBATCH --mail-user=your_email_address
  #SBATCH --mail-type=all

  # Compress the case directory into a single archive
  tar -zcvf saved_name.tar.gz $case
  # saved_name ==> the name you want for the compressed file
  # $case      ==> the absolute (full) path of your case root directory
  # For example, if your case is located in '$WORK/aa/bb/cc/your_case',
  # the tar command above becomes
  # tar -zcvf saved_name.tar.gz $WORK/aa/bb/cc/your_case
  # Once the compression finishes, 'saved_name.tar.gz' is written to the
  # directory from which the job was submitted.

Do NOT run the compression directly on a login node; doing so may lead to account suspension.
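
Once the archive has been created, copy it to your local machine and unpack it there. A minimal sketch, assuming the Lonestar 5 login address ls5.tacc.utexas.edu and a placeholder remote path (run these commands on your local machine, not on TACC):

  # Download the archive (replace /full/path/to with the directory that holds it)
  scp your_tacc_user_id@ls5.tacc.utexas.edu:/full/path/to/saved_name.tar.gz .

  # Unpack it locally
  tar -zxvf saved_name.tar.gz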


Dezhi Dai @ MAE Department, UT Arlington
