7.10 LQCD

7.10.1 Introduction to LQCD

7.10.2 Commonly Used Software

LQCD software currently in common use at IHEP:

  • Chroma: developed by the USQCD Collaboration.
  • QUDA: supported by NVIDIA; performs LQCD calculations on GPUs, used in combination with Chroma.
  • GWU-Code: developed by the χQCD Collaboration.

7.10.3 Software Usages

7.10.3.1 Software Versions and Location

Before using Chroma, QUDA, or GWU-Code, the following commands must be executed:

mkdir -p ~/privatemodules
ln -sf /afs/ihep.ac.cn/soft/lqcd/modulefiles ~/privatemodules/lqcd
echo "module load use.own" >> ~/.bashrc
source ~/.bashrc
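The steps above can also be written defensively so they are safe to re-run (re-executing the `echo` line as shown would append a duplicate `module load` entry each time). A sketch, with the home directory and module path passed as parameters so the logic is testable; the function name is illustrative, not part of the IHEP setup:

```shell
#!/bin/bash
# setup_lqcd_modules <home_dir> <module_src>
# Idempotent version of the setup commands in this section.
setup_lqcd_modules() {
  local home_dir="$1" module_src="$2"
  mkdir -p "$home_dir/privatemodules"
  # -n keeps ln from descending into an already-existing symlinked directory
  ln -sfn "$module_src" "$home_dir/privatemodules/lqcd"
  # Append the module-load line only if it is not already present
  grep -qx 'module load use.own' "$home_dir/.bashrc" 2>/dev/null \
    || echo 'module load use.own' >> "$home_dir/.bashrc"
}

# On the cluster one would call:
# setup_lqcd_modules "$HOME" /afs/ihep.ac.cn/soft/lqcd/modulefiles
```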

Check the available software versions:

$ module av
----------------------------------------------------------------------------------------------------------------------------- /usr/share/Modules/modulefiles -----------------------------------------------------------------------------------------------------------------------------
dot         module-git  module-info modules     null        use.own

------------------------------------------------------------------------------------------------------------------------------------ /etc/modulefiles ------------------------------------------------------------------------------------------------------------------------------------
mpi/compat-openmpi16-x86_64 mpi/mpich-3.0-x86_64        mpi/mpich-x86_64

---------------------------------------------------------------------------------------------------------------------- /afs/ihep.ac.cn/users/b/biyj/privatemodules -----------------------------------------------------------------------------------------------------------------------
lqcd/base/sl7                              lqcd/cmake/3.16.4                          lqcd/grid/2.0                              lqcd/llvm/6.0.0-gcc7.3.1                   lqcd/mpi/openmpi/4.0.2-ucx-gcc4.8.5        lqcd/ucx/1.7.0-gcc4.8.5
lqcd/chroma/double/latest-afs              lqcd/cuda/10.1                             lqcd/gwu-code/mvapich2                     lqcd/mpi/mvapich2/2.3.2                    lqcd/mpi/openmpi/4.0.2-ucx-gcc4.8.5-afs    lqcd/ucx/1.7.0-gcc7.3.1
lqcd/chroma/double/latest-openmpi-afs      lqcd/eigen/3.3.7                           lqcd/gwu-code/mvapich2.v2                  lqcd/mpi/mvapich2/2.3.2-gdr-mcast-afs      lqcd/mpi/openmpi/4.0.2-ucx-gcc7.3.1        lqcd/xqcd.io/mvapich2-double
lqcd/chroma/single/latest-afs              lqcd/gcc/4.9.0                             lqcd/gwu-code/openmpi                      lqcd/mpi/mvapich2/2.3.2-gdr-mcast-gcc4.8.5 lqcd/python/3.6                            lqcd/xqcd.io/mvapich2-single
lqcd/chroma/single/latest-openmpi-afs      lqcd/gcc/7.3.1                             lqcd/libxml2/2.6.24-sl7                    lqcd/mpi/mvapich2/2.3.3-mcast-gcc4.8.5     lqcd/root/6.14                             lqcd/xqcd.io/openmpi-double
lqcd/cmake/3.13.4                          lqcd/gdrcopy/2.0                           lqcd/llvm/6.0.0-gcc4.8.5                   lqcd/mpi/mvapich2/2.3.3-mcast-gcc7.3.1     lqcd/root/6.18                             lqcd/xqcd.io/openmpi-single
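To use one of the builds listed above, load its module, together with a matching MPI module where one is required. A minimal sketch, taking the module names from the `module av` listing above; the pairing of the `latest-openmpi-afs` Chroma build with the listed gcc7.3.1 OpenMPI module is an assumption, not confirmed by this section:

```shell
# Load an MPI implementation first, then a Chroma build
# (names taken from the `module av` listing; the pairing is an assumption)
module load lqcd/mpi/openmpi/4.0.2-ucx-gcc7.3.1
module load lqcd/chroma/double/latest-openmpi-afs

# Confirm what is loaded and where the chroma binary resolves
module list
which chroma
```

`module show <name>` can also be used to inspect what a given modulefile sets before loading it.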

7.10.3.2 Usage Demonstrations

#!/bin/bash
#### Job Queue
#SBATCH --partition=gpu

#### Group Name: lqcd
#SBATCH --account=lqcd

#### CPU Cores
#SBATCH --ntasks=8

#### Not necessary when logged in with an AFS account
#SBATCH --uid=biyj

#### Output redirection
#SBATCH --output=<some_writable_directory>/job-%j.out
#SBATCH --job-name=gpu_test

#### Specify memory to use, or Slurm will allocate all available memory
#SBATCH --mem-per-cpu=2048

#### Specify how many GPU cards to use (8 in total per node)
#SBATCH --gres=gpu:v100:2

test -d $HOME/privatemodules || mkdir -p $HOME/privatemodules
test -e $HOME/privatemodules/lqcd || ln -sfn /afs/ihep.ac.cn/soft/lqcd/modulefiles $HOME/privatemodules/lqcd
module load use.own
module load lqcd/chroma/double
srun -n 8 chroma -i sample.in.xml -o sample.out.xml > sample.out.log &
wait
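Once saved, the script above can be submitted and monitored with the standard Slurm commands; the filename `chroma_gpu.sh` below is illustrative:

```shell
# Submit the batch script above (filename is illustrative)
sbatch chroma_gpu.sh

# Watch your own jobs in the queue; ST changes from PD (pending) to R (running)
squeue -u $USER

# After completion, inspect the redirected Slurm log (%j expands to the job ID)
less <some_writable_directory>/job-<jobid>.out
```

The Chroma results themselves go to `sample.out.xml` and `sample.out.log` in the job's working directory, as specified by the `srun` line in the script.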
