Overview#

All HPC clusters at NHR@FAU have to be accessed through their respective frontend nodes. All frontends use private IPv4 addresses, which means that they are not globally reachable.

Depending on your affiliation, there are several options for connecting to the cluster frontends:

  • FAU users:

    • directly from within the FAU university network
    • via VPN
    • via SSH proxy jump through our dialog server
    • directly with an IPv6 connection
  • non-FAU users:

    • SSH proxy jump through our dialog server
    • directly with an IPv6 connection

Whichever option you choose, you need to use SSH to connect to the clusters.
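For example, an SSH proxy jump through the dialog server can be configured once in your ~/.ssh/config. The following is only a sketch: the host aliases, the frontend hostname fritz.nhr.fau.de, and the username yourusername are assumptions; replace them with the frontend of the cluster you want to reach and your own HPC account.

```
# Sketch of an ~/.ssh/config entry for a proxy jump through the dialog server.
# "fritz.nhr.fau.de" and "yourusername" are placeholders -- adjust them to the
# cluster frontend and the HPC account you were assigned.
Host cshpc
    HostName cshpc.rrze.fau.de
    User yourusername

Host fritz
    HostName fritz.nhr.fau.de
    User yourusername
    ProxyJump cshpc
```

With such an entry, `ssh fritz` first connects to the dialog server and continues from there to the cluster frontend; the one-off equivalent without a config file is `ssh -J yourusername@cshpc.rrze.fau.de yourusername@fritz.nhr.fau.de`.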

Documentation on how to set up SSH is available separately.

JupyterHub#

Access to some HPC systems is also possible via our central JupyterHub instances.

Remote desktop#

It is possible to run a remote graphical Linux desktop on our dialog servers.

This is intended for normal desktop use only; it is not suitable for graphically demanding tasks like remote 3D visualization.

Remote visualization#

With Fritz we provide a remote (3D) visualization node.

Dialog server#

  • cshpc.rrze.fau.de: cshpc is a Linux system that permits login for all HPC accounts.
  • csnhr.nhr.fau.de: csnhr is a Linux system that permits login for all HPC accounts. This machine will replace cshpc soon. It can already be used, but expect that not everything is fully set up yet.

File transfer#

The standard filesystems ($HOME, $HPCVAULT, $WORK) are directly accessible on the dialog servers. The dialog servers can therefore be used to copy data to the HPC systems or to mount the remote filesystems on your local machine.
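As an illustration, data can be copied through the dialog server with standard tools such as rsync or scp, or a remote directory can be mounted locally with sshfs. The username and paths below are placeholders, not actual paths on the systems.

```
# Copy a local directory into your remote home via the dialog server;
# "yourusername" and the paths are placeholders.
rsync -av ./results/ yourusername@cshpc.rrze.fau.de:results/

# Mount your remote home directory on your local machine (requires sshfs);
# an empty path after the colon selects the remote home directory.
mkdir -p ~/hpc-home
sshfs yourusername@cshpc.rrze.fau.de: ~/hpc-home
```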