====== Orca ======

To run ORCA, you should register at [[https://orcaforum.cec.mpg.de/]]. At present we are permitted to provide a central ORCA installation in the SCC; this may change in the future!
- 
===== Using ORCA =====
- 
Log in via SSH to gwdu101 or gwdu102 and load the following modules:
- 
<code>
module load openmpi/gcc/64/1.6.4
module load orca</code>
\\
For a serial job, create a job script like this:
- 
<code>
#!/bin/bash
#SBATCH -p medium
#SBATCH -n 1
#SBATCH -t 1-00:00:00

INPUTFILE=test.inp

$ORCA_PATH/orca ${INPUTFILE}</code>
\\
This tells the batch system to submit the job to the medium partition and to request one core for 24 hours.
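For reference, a minimal ''test.inp'' might look like the following. This is an illustrative sketch only; the method, basis set, and geometry are placeholders you should replace with your own:

<code>
! HF def2-SVP

* xyz 0 1
O   0.000000   0.000000   0.000000
H   0.000000   0.757200   0.586500
H   0.000000  -0.757200   0.586500
*
</code>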
- 
Parallel jobs need a small trick, because ORCA cannot run on shared file systems such as NFS, CVFS, or FHGFS. We therefore use /local as a node-local file system for the run:
- 
<code>
#!/bin/bash
#SBATCH -p medium
#SBATCH -J ORCA
#SBATCH -n 20
#SBATCH -N 1
#SBATCH -t 1-00:00:00
#SBATCH --ntasks-per-node=20
#SBATCH --signal=B:12@600
#SBATCH -C local

INPUTFILE=test.inp

work=$PWD

# If signal 12 arrives (sent 600 s before the time limit, as requested
# via --signal), copy the results back from local scratch and exit.
trap 'srun -n ${SLURM_JOB_NUM_NODES} --ntasks-per-node=1 cp -af ${TMP_LOCAL}/* ${work}/; exit 12' 12

# Stage the input and any restart files (.gbw, .pot) to local scratch.
cp -af ${INPUTFILE} ${work}/*.gbw ${work}/*.pot ${TMP_LOCAL}/

cd $TMP_LOCAL

# Run ORCA in the background so the shell can handle the trap, then wait.
$ORCA_PATH/orca ${INPUTFILE} &
wait

# Copy the results back to the submit directory.
srun -n ${SLURM_JOB_NUM_NODES} --ntasks-per-node=1 cp -af ${TMP_LOCAL}/* ${work}/ >/dev/null 2>&1
</code>
\\
This tells the batch system to submit the job to the medium partition and to request 20 cores on one node for 24 hours. **Please make sure that in this case (-n 20) your input file contains the line '%pal nprocs 20 end' (without quotes)!** The number after 'nprocs' must equal the number of processes you reserve with the '-n' option.
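A quick pre-submission sanity check can catch a mismatch between '-n' and the '%pal' line. The snippet below is a hypothetical helper, not part of the cluster setup; inside a real job Slurm sets ''SLURM_NTASKS'' itself, and ''INPUTFILE'' would be your actual input file:

<code>
#!/bin/bash
# Hypothetical check: abort if the %pal line in the ORCA input does not
# match the task count requested from Slurm.
SLURM_NTASKS=20                 # set by Slurm inside a job; hard-coded here
INPUTFILE=$(mktemp)             # stands in for test.inp
echo '%pal nprocs 20 end' > "${INPUTFILE}"

# Extract the number after 'nprocs' from the input file.
nprocs=$(grep -oiE 'nprocs[[:space:]]+[0-9]+' "${INPUTFILE}" | grep -oE '[0-9]+')

if [ "${nprocs}" != "${SLURM_NTASKS}" ]; then
    echo "Mismatch: %pal nprocs ${nprocs} vs. Slurm -n ${SLURM_NTASKS}" >&2
    exit 1
fi
echo "OK: ${nprocs} MPI processes requested in both places"
</code>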
- 
Save the script as, for example, myjob.job and submit it with

<code>
sbatch myjob.job</code>
\\
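As a side note, the interplay of ''#SBATCH --signal=B:12@600'' and the ''trap'' line in the parallel script can be demonstrated locally without Slurm. In this sketch, temporary directories stand in for the submit directory and ''${TMP_LOCAL}'':

<code>
#!/bin/bash
# Local sketch of the copy-back-on-signal pattern from the parallel job
# script. No Slurm involved; mktemp dirs replace $PWD and ${TMP_LOCAL}.
work=$(mktemp -d)          # stands in for the submit directory
tmp_local=$(mktemp -d)     # stands in for the node-local scratch

echo "intermediate data" > "${tmp_local}/result.txt"

# Copy scratch contents back to the work directory when signal 12
# (SIGUSR2 on x86-64 Linux) arrives, like the trap in the job script.
trap 'cp -af "${tmp_local}"/. "${work}/"' 12

kill -12 $$                # simulate Slurm sending the warning signal

cat "${work}/result.txt"
</code>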
[[Kategorie: Scientific Computing]]