The Stagger-code

The Stagger-code is a 3D radiation magnetohydrodynamics code originally written by Aake Nordlund and Klaus Galsgaard at the Niels Bohr Institute in Copenhagen. It is a flexible, multi-purpose code comprising many different modules that can be used for a variety of numerical experiments. A general overview of the finite-difference scheme and the components of the Stagger-code is given by Nordlund and Galsgaard (1995). The official code is maintained in a CVS repository hosted at the Niels Bohr Institute. To use the very latest version of the code from the CVS repository, please contact Aake.

How do I compile the Stagger-code?

Before compiling, you need to set up some environment variables, depending on the system on which you intend to run the Stagger-code. First, load the necessary modules.

module load intel
module load impi
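
To check that the Intel compiler and MPI wrappers are actually picked up, a quick sanity check can help (module and wrapper names may differ on your system):

module list
which ifort
which mpiifort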

download the Stagger-code and unpack it

cd stagger/
tar -xvzf stagger-code_*.tar.gz

compile

cd ~/stagger/stagger-code/src/
make EXPERIMENT=remo MP=_mpi

rename the executable

mv remo_ifort_mpi.x stagger.x

Which files do I need?

You need a starting model (*.scr) together with its mesh files (*.msh and *.dx).

  • t5777g44m0004.scr
  • t5777g44m0004.msh
  • t5777g44m0004.dx

You also need the equation-of-state table and the opacity table.

  • EOSrhoe_AGSS09+0.00.tab
  • kaprhoT.tab_t5777g44m00v01_6bin

Finally, some control files are necessary; a quick check that everything is in place is sketched after this list.

  • input04.txt
  • fluxes.txt
  • qsub.t5777g44m00
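
Before starting a run, it can be useful to verify that all of these files are present in the run directory. A minimal shell sketch, using the example filenames listed above:

for f in t5777g44m0004.scr t5777g44m0004.msh t5777g44m0004.dx \
         EOSrhoe_AGSS09+0.00.tab kaprhoT.tab_t5777g44m00v01_6bin \
         input04.txt fluxes.txt qsub.t5777g44m00; do
  [ -f "$f" ] || echo "missing: $f"
done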

How do I run the Stagger-code?

interactive run with openmpi

module load openmpi
mpiexec -n 8 ./stagger.x input04.txt > stagger.log 
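
For production runs it is usually better to go through the queueing system instead of running interactively. Assuming the SLURM script shown further below is saved as qsub.t5777g44m00 (the submission script listed above), it can be submitted and monitored with:

sbatch qsub.t5777g44m00
squeue -u $USER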

Stagger-code CVS repository

login

cvs -d :pserver:user@astro.ku.dk:/ast/home/aake/cvs login

checkout

cvs -d :pserver:user@astro.ku.dk:/ast/home/aake/cvs checkout codes/stagger
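
To avoid repeating the -d option on every command, the repository location can be stored in the CVSROOT environment variable (purely a convenience; the commands above work as-is):

export CVSROOT=:pserver:user@astro.ku.dk:/ast/home/aake/cvs
cvs login
cvs checkout codes/stagger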

Table code SVN repository

checkout

svn checkout svn://user@lcd.colorado.edu:/convec <path>

Download the IDL routines

svn checkout svn://user@lcd.colorado.edu:/idl_convec <path>

Later on, one only needs to update the working copy.

svn update

One can look up information with

svn info
svn st -q

SLURM script

# load intel/mpi compiler
source /users/software/astro/startup.d/modules.sh
module purge
source /software/astro/intel/parallel_studio_xe_2015_upd1/composer_xe_2015.1.133/bin/compilervars.sh intel64
source /software/astro/intel/parallel_studio_xe_2015_upd1/impi/5.0.2.044/bin64/mpivars.sh 
# Intel MPI fabric and SLURM process-manager (bootstrap/PMI) libraries
export I_MPI_FABRICS=shm:dapl
export I_MPI_HYDRA_JMI_LIBRARY=libjmi_slurm.so.1.1
export I_MPI_HYDRA_BOOTSTRAP=jmi
export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi.so
# debug output level and DAPL provider selection
export I_MPI_DEBUG=6
export I_MPI_DAPL_PROVIDER=ofa-v2-mlx4_0-1 # best performance
# OpenMP thread affinity
export KMP_AFFINITY="verbose,granularity=thread,scatter,1,0"
# Change to directory where script was submitted
cd ${SLURM_SUBMIT_DIR}
# read/write unformatted Fortran files as big-endian
export F_UFMTENDIAN=big
# run application
time srun --label --distribution=block:block ./stagger.x input.txt > log_`date +%y%m%d_%H%M`
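
The listing above shows only the body of the job script. A complete SLURM script also needs a shebang and the usual #SBATCH resource requests at the top; a minimal sketch, where the job name, node count, tasks per node and wall-clock limit are placeholders to be adapted to the actual run:

#!/bin/bash
#SBATCH --job-name=stagger
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=16
#SBATCH --time=24:00:00
#SBATCH --output=stagger-%j.out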