HPC User Report from S. Seiler (Interdisciplinary Center for Molecular Materials)

Molecular Friction in Graphite Intercalation Compounds

Contact:

Steffen Seiler
Interdisciplinary Center for Molecular Materials (ICMM)
Friedrich-Alexander-Universität Erlangen-Nürnberg

Mainly used HPC resources at RRZE

Emmy cluster

Oxidative wet-chemical graphite delamination is a promising method for large-scale graphene production. Ab initio molecular dynamics simulations show that ideal stacking and oxidation of the graphite layers reduce the friction of sulfuric acid molecules, thereby facilitating intercalation.

Motivation and problem definition

Over the past decade, graphene has attracted considerable attention in physics, chemistry and materials science due to its outstanding electronic, mechanical and chemical properties. Chemical vapor deposition on a metal substrate is currently the method of choice for preparing large areas of high-quality graphene. However, the overall amount of graphene synthesized this way remains rather small. An alternative process, which has the potential for large-scale graphene production for industrial applications, is the wet-chemical delamination of graphite. A widely used approach is based on the so-called Hummers’ method, which consists of the following steps: first, graphite is intercalated with concentrated sulfuric acid. Then, the graphite intercalation compound (GIC) is oxidized. Finally, graphene oxide (GO) layers are separated in solution by hydrolysis, and post-processing of the GO with a reducing agent yields graphene. The overall process, however, is not well understood and is hampered by defect formation. Molecular dynamics (MD) simulations can provide new atomic-scale insights into how oxidation and perturbations in the graphite lattice influence the overall intercalation process.

Methods and codes

To this end, we performed ab initio molecular dynamics simulations within the Car-Parrinello framework using the CPMD software package. In a recent KONWIHR project, Tobias Klöffel, in collaboration with Gerald Mathias from LRZ, significantly improved the OpenMP and MPI parallelization of CPMD. These code modifications clearly boosted performance, enabling us to treat quite large systems (unit cells with more than 500 atoms) in a reasonable amount of time. For typical simulations on 9 Emmy nodes (180 processes), the improved OpenMP parallelization saved up to 40 % of the computing time.
Energies and forces were determined from quantum-chemical density-functional theory calculations employing Vanderbilt ultrasoft pseudopotentials, the PBE exchange-correlation functional and Grimme D2 dispersion corrections. The simulations were performed in the canonical ensemble close to room-temperature conditions using Nosé-Hoover thermostats. A typical simulation covering a trajectory of 70 ps requires 700,000 successive MD steps (each taking about 6 seconds on 9 Emmy nodes), resulting in a total simulation time of roughly 7 weeks. From the MD trajectory we determined equilibrium distributions of atoms (which can be converted into free energy profiles), diffusion coefficients, pair correlation functions and the friction coefficient, which describes the resistance the sulfuric acid molecules experience when moving between the carbon layers of the graphite crystal. It turned out that, in particular, the computation of a converged friction coefficient requires well-equilibrated systems and quite long simulation times of several tens of picoseconds.
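
As an illustration of this kind of trajectory post-processing, the following NumPy sketch estimates a lateral diffusion coefficient from the mean-squared displacement (Einstein relation) and converts an equilibrium distribution along the interlayer axis into a free energy profile. It is not the analysis code used in this project; the array layout, frame spacing and fitting window are assumptions.

# Minimal post-processing sketch (not the project's actual analysis code).
# Assumes `pos` is a NumPy array of shape (n_frames, n_atoms, 3) with
# unwrapped coordinates in Angstrom and `dt_ps` is the spacing between
# stored frames in picoseconds; names and fitting window are illustrative.
import numpy as np

def lateral_diffusion_coefficient(pos, dt_ps, fit_start=0.2, fit_end=0.8):
    """Estimate D from the in-plane (x, y) Einstein relation MSD(t) = 4 D t."""
    n_frames = pos.shape[0]
    lags = np.arange(1, n_frames // 2)
    msd = np.empty(len(lags))
    for i, lag in enumerate(lags):
        disp = pos[lag:, :, :2] - pos[:-lag, :, :2]   # in-plane displacements
        msd[i] = np.mean(np.sum(disp**2, axis=-1))    # average over origins and atoms
    t = lags * dt_ps
    lo, hi = int(fit_start * len(t)), int(fit_end * len(t))
    slope = np.polyfit(t[lo:hi], msd[lo:hi], 1)[0]    # fit the roughly linear part
    return slope / 4.0                                # Angstrom^2 per ps

def free_energy_profile(z, temperature_K=300.0, bins=100):
    """Convert an equilibrium distribution along z into F(z) = -kB*T*ln(rho(z))."""
    kB = 0.008314462                                  # kJ/(mol K)
    rho, edges = np.histogram(z, bins=bins, density=True)
    centers = 0.5 * (edges[1:] + edges[:-1])
    F = -kB * temperature_K * np.log(np.where(rho > 0, rho, np.nan))
    return centers, F - np.nanmin(F)                  # set the minimum to zero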

Results

Our main results were obtained from simulations of three model setups of the sulfuric acid graphite intercalation compound. Friction coefficients can be extracted from the plateau value of the running Green-Kubo integral of the force autocorrelation function (see Figure). The plot shows that friction is significantly lower in graphite with ideal AB stacking (green line) and after oxidation (blue line) than in the setup with perturbed stacking of the graphite layers (red line). These observations are corroborated by the sulfur-atom diffusion coefficient and the lateral distribution of the oxygen atoms in the simulation box, and they are explained by a detailed analysis of free energy profiles and the electronic structure. Altogether, we conclude that the intercalation process strongly benefits from an initial oxidation and a high crystallinity of the graphite lattice. This explains experimental observations of the differing behavior of different types of natural graphite.
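
For reference, a commonly used Green-Kubo-type estimator (the exact convention in the publication may differ) takes the friction coefficient as the plateau value reached by the running integral of the force autocorrelation function,

\gamma(t) = \frac{1}{d\, k_{\mathrm{B}} T} \int_0^{t} \bigl\langle \delta\mathbf{F}(0) \cdot \delta\mathbf{F}(\tau) \bigr\rangle \, \mathrm{d}\tau ,
\qquad \delta\mathbf{F}(t) = \mathbf{F}(t) - \langle \mathbf{F} \rangle ,

where F is the total force acting on the intercalated molecules, d is the number of spatial dimensions considered and T is the temperature. Because the running integral of a finite system eventually decays again, the friction coefficient is read off from the plateau at intermediate times, which is why well-equilibrated systems and long trajectories are needed.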

Outreach

This work has been published in Nature Communications 9 (2018) 836 (DOI: 10.1038/s41467-018-03211-1) and will be part of Steffen Seiler’s PhD thesis. The project is funded by the Collaborative Research Center SFB 953 “Synthetic Carbon Allotropes”.

Researcher’s Bio and Affiliation

Steffen Seiler obtained his Master’s degree in Molecular Science at the Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) and he is currently a PhD student in the group of Prof. Bernd Meyer at the Interdisciplinary Center for Molecular Materials (ICMM) and the Computer-Chemistry-Center (CCC).
