The NHR systems in Berlin ("Lise") and Göttingen ("Emmy") are each equipped with several file systems. Their properties and intended use are described here.

Disk quotas based on group ownership are enforced on each site's global (non-local) file systems.

We support file transfer tools such as scp and rsync, which use the SSH protocol to establish the connection and to encrypt the data transfer. A working SSH connection is therefore a prerequisite for data transfer. Each of the following sections deals with a specific direction for establishing the transfer connection. Independent of the connection direction, data can always be transferred "from" or "to" the connected target host.

Data Transfer Connecting from the Outside World

External connections to the NHR systems in Berlin or Göttingen require an SSH key pair for authentication. More details can be found here. The location of the private key file can be specified when calling scp or rsync on the user's local machine. Examples covering both data transfer directions are shown below.

Using scp, the option -i <fullpath_of_privatekeyfile> can be added:

Example 1: Using scp to copy data from Lise to the user's local machine
$ scp -i <fullpath_of_privatekeyfile> <username>@<login_host>:<remote_source> <local_target>
Example 2: Using scp to copy data from the user's local machine to Lise
$ scp -i <fullpath_of_privatekeyfile> <local_source> <username>@<login_host>:<remote_target>

With rsync, the nested option -e 'ssh -i <fullpath_of_privatekeyfile>' can be added: 

Example 3: Using rsync to copy data from Lise to the user's local machine
$ rsync -e 'ssh -i <fullpath_of_privatekeyfile>' <username>@<login_host>:<remote_source> <local_target>
Example 4: Using rsync to copy data from the user's local machine to Lise
$ rsync -e 'ssh -i <fullpath_of_privatekeyfile>' <local_source> <username>@<login_host>:<remote_target>
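When copying directories with rsync, note that a trailing slash on the source changes what is copied: without it, the directory itself is created inside the target; with it, only the directory's contents are copied. This can be demonstrated locally without any remote host (the /tmp/rsync_demo paths below are made up for this demo):

```shell
# Local demonstration of rsync's trailing-slash rule (no remote host needed;
# the /tmp/rsync_demo paths are placeholders invented for this demo).
mkdir -p /tmp/rsync_demo/src /tmp/rsync_demo/a /tmp/rsync_demo/b
touch /tmp/rsync_demo/src/FOO
# Without a trailing slash the directory itself is copied into the target:
rsync -r /tmp/rsync_demo/src /tmp/rsync_demo/a    # creates /tmp/rsync_demo/a/src/FOO
# With a trailing slash only the directory's contents are copied:
rsync -r /tmp/rsync_demo/src/ /tmp/rsync_demo/b   # creates /tmp/rsync_demo/b/FOO
```

The same rule applies unchanged when the source or target is a remote host.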

Tip: Alternatively, the additional options shown above for specifying the location of the private key file can be omitted. In this case, a corresponding SSH configuration must be present on the user's local machine as described here. To verify this, the SSH connection must work without specifying the private key file on the command line.
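Such a configuration could look like the following sketch; the host alias and all values in angle brackets are placeholders to be adapted according to the SSH documentation page linked above:

```
# ~/.ssh/config on the user's local machine (sketch; all values are placeholders)
Host lise
    HostName <fully_qualified_login_host>
    User <username>
    IdentityFile <fullpath_of_privatekeyfile>
```

With such an entry in place, a command like `scp lise:<remote_source> <local_target>` works without the -i option.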

Data Transfer Connecting to the Outside World

Connections to external machines located anywhere in the world can be established interactively from the login nodes. In this case, the SSH key pair mentioned above for external connections to the login nodes is not required. However, additional rules imposed by the external host or institution may apply.
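For example, a transfer to an external server could be started interactively on a login node as in the following sketch; the host name and paths are placeholders, and authentication is whatever the external host requires:

```
$ rsync -av <local_source> <external_username>@<external_host>:<remote_target>
```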

Data transfer in the context of a batch job is restricted due to the limited network access of the compute nodes. Please send a message to the support mailing list if you need further help.

Internal Data Transfer

Internal data transfer between a Berlin and a Göttingen login node using scp or rsync works right out of the box, i.e. without specifying any keys or passwords. This is enabled by host-based authentication, which is active by default.

For internal data transfer, please always use the short host name, omitting the domain suffix. You can use the generic names blogin and glogin or specific names like blogin5, glogin2, etc. This way NHR-internal links are used, which are faster than the external routes taken when fully qualified host names are specified; the latter is therefore not recommended here.
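For example, the following sketch, run on a Berlin login node, copies a directory to Göttingen via the internal link; the paths are placeholders, and no key or password options are needed thanks to host-based authentication:

```
$ rsync -av <source_directory> glogin:<target_directory>
```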

Data Transfer Between Emmy (Göttingen) And The GWDG SCC

If you have previously been working on the SCC in Göttingen at the GWDG, follow these steps to transfer data to/from the Emmy system:

  1. On an Emmy frontend node (glogin[1-9]), generate a new SSH key (also documented at the SCC).
  2. Add the SSH key at the GWDG Website -> My Account -> Security.
  3. From an Emmy frontend node, transfer the files using rsync (see the SCC documentation and the rsync man page) to/from the SCC transfer node. Note that some frontend nodes have access to both the Emmy and Grete scratches, while glogin[1-8] only have access to the Emmy scratch; all frontend nodes have access to $HOME. Some examples are given below.
    (In the following examples, <transfer_host> is a placeholder for the SCC transfer node's host name; see the SCC documentation.)
    • Copy a single file FOO from your SCC $HOME into your current directory on Emmy

      rsync -e 'ssh -i <fullpath_of_privatekeyfile>' <username>@<transfer_host>:FOO .
    • Copy a single file FOO in your current directory on Emmy to $HOME on the SCC

      rsync -e 'ssh -i <fullpath_of_privatekeyfile>' FOO <username>@<transfer_host>:~
    • Copy a directory in your SCC /scratch to your current directory on Emmy

      rsync -e 'ssh -i <fullpath_of_privatekeyfile>' -r <username>@<transfer_host>:/scratch/<directory> .
    • Copy a directory synthetic_trees in your current directory on Emmy to /scratch on the SCC

      rsync -e 'ssh -i <fullpath_of_privatekeyfile>' -r synthetic_trees <username>@<transfer_host>:/scratch/<target_directory>
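The key generation in step 1 can be done with ssh-keygen, for example as in the following sketch; the key type, file name, and comment here are assumptions, so follow the SCC documentation for the recommended settings:

```shell
# Sketch of step 1: generate a new Ed25519 key pair.
# File name and comment are placeholder choices; the empty passphrase (-N "")
# only keeps this sketch non-interactive - consider setting a passphrase.
mkdir -p "$HOME/.ssh"
ssh-keygen -t ed25519 -N "" -f "$HOME/.ssh/id_ed25519_scc" -C "emmy-to-scc"
```

The public key $HOME/.ssh/id_ed25519_scc.pub is the one to add at the GWDG website in step 2.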

If you have terabytes of data that need to be transferred, please contact us so that we can provide a custom solution for this.
