ORCA

ORCA is an ab initio quantum chemistry program package that contains modern electronic structure methods, including density functional theory, many-body perturbation theory, coupled-cluster and multireference methods, and semi-empirical quantum chemistry methods. Its main fields of application are larger molecules, transition metal complexes, and their spectroscopic properties.

ORCA requires a license per individual or research group (cf. https://cec.mpg.de/orcadownload/ or the ORCA forum https://orcaforum.kofo.mpg.de/). Once you can prove that you are eligible, contact hpc-support@fau.de for activation of the ORCA module.

Availability / Target HPC systems

  • throughput cluster Woody and TinyFat
  • owing to its limited scalability, ORCA is not suitable for the parallel clusters

New versions of ORCA are installed by RRZE upon request, with low priority, provided the users supply the installation files.

Notes

  • orca has to be called with its full path; otherwise, parallel runs may fail.
  • The orca module also takes care of loading an appropriate openmpi module.
  • ORCA often produces massive I/O ("communication through files"); thus, put temporary files into /dev/shm (RAM disk) or a local scratch directory, as shown in the sketch after this list.
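
The following is a minimal sketch of that scratch handling for a run on a compute node; the scratch location, file names, and the copy-back step are illustrative assumptions, not a fixed recipe:

module add orca/4.1.1                    # also loads a matching openmpi module

# stage the calculation in the RAM disk to keep ORCA's heavy file I/O off the shared file systems
SCRATCH=/dev/shm/$USER/orca-$$           # per-run scratch directory (illustrative layout)
mkdir -p "$SCRATCH"
cp orca.inp "$SCRATCH"
cd "$SCRATCH"

# always call orca with its full path, otherwise parallel runs may fail
${ORCABASE}/orca orca.inp > orca.out

# copy back the files you need (output, .gbw, etc.) and clean up
cp orca.out "$OLDPWD"
cd "$OLDPWD" && rm -rf "$SCRATCH"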

Sample job scripts

parallel orca on a Woody node

#!/bin/bash -l
#PBS -l nodes=1:ppn=4,walltime=10:00:00
#PBS -N my-orca
#PBS -j eo

cd $PBS_O_WORKDIR

module add orca/4.1.1

### No mpirun required as ORCA starts the parallel processes internally as needed.
### The number of processes is specified in the input file using '%pal  nprocs #  end'

${ORCABASE}/orca   orca.inp   "optional openmpi arguments"
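
The number of processes requested via '%pal nprocs # end' has to match the cores requested from the batch system (ppn=4 above). For illustration only, a minimal orca.inp with such a %pal block might look as follows; the method, basis set, and geometry are arbitrary placeholders:

! BP86 def2-SVP TightSCF
%pal nprocs 4 end
* xyz 0 1
  O   0.000000   0.000000   0.000000
  H   0.000000   0.000000   0.970000
  H   0.940000   0.000000  -0.240000
*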

Further information

  • https://orcaforum.kofo.mpg.de/
  • Note in the ORCA forum on improving MKL performance on AMD Epyc processors: https://orcaforum.kofo.mpg.de/viewtopic.php?f=8&t=3340&hilit=mkl&start=20
    We recommend not only setting MKL_DEBUG_CPU_TYPE=5 but also setting MKL_CBWR=AUTO as environment variables (as long as ORCA still uses an Intel MKL version before 2020.1; these environment variables no longer have any effect with newer MKL versions). A corresponding snippet follows below.
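
Assuming the ORCA binary in use is indeed linked against an MKL version older than 2020.1, the two variables from the forum thread can simply be exported in the job script before orca is started:

export MKL_DEBUG_CPU_TYPE=5   # make pre-2020.1 MKL use its optimized code paths on AMD Epyc CPUs
export MKL_CBWR=AUTO          # MKL conditional numerical reproducibility mode, as recommended in the forum thread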

Mentors

  • please volunteer!

 
