bwForCluster NEMO

The bwForCluster NEMO is a high-performance compute resource with a high-speed interconnect. It is intended for compute activities of researchers from the fields of Neuroscience, Elementary Particle Physics, Microsystems Engineering and Material Sciences (NEMO).

For detailed NEMO documentation, refer to the NEMO Wiki.



Compute Nodes

For researchers from these scientific fields, the bwForCluster NEMO offers 900 compute nodes plus several special purpose nodes for login, interactive jobs, and other services.

Special Purpose Nodes

Besides the classical compute nodes, several nodes serve as login and preprocessing nodes, nodes for interactive jobs, visualization nodes, and nodes providing a virtualized service environment.

Storage Architecture

The bwForCluster NEMO consists of two separate storage systems: one for the user's home directory $HOME and one serving workspaces. The home directory is limited in space and parallel access, but offers snapshots of your files and backup. The workspace storage is a parallel file system based on BeeGFS, which offers fast parallel file access from many nodes and a larger capacity than the home directory. Additionally, each compute node provides high-speed temporary storage on a node-local solid state disk (SSD) via the $TMPDIR environment variable.
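A typical workflow combines these three storage tiers: allocate a workspace for project data, stage hot data to the node-local SSD inside a job, and copy results back before the job ends. The sketch below assumes the workspace tools (ws_allocate, ws_find) commonly deployed on bwForCluster systems; check the NEMO Wiki for the exact commands and limits on this cluster.

```shell
# Allocate a workspace (assumed workspace tools; name and duration are examples):
#   ws_allocate myproject 30
#   WORKSPACE=$(ws_find myproject)

# Inside a job, stage data to the node-local SSD via $TMPDIR.
# The fallback to mktemp only exists so this sketch runs outside a batch job.
TMPDIR=${TMPDIR:-$(mktemp -d)}
SCRATCH="$TMPDIR/run1"
mkdir -p "$SCRATCH"
echo "input data" > "$SCRATCH/input.dat"

# ... run the computation against files in $SCRATCH ...

# Copy results back to the workspace before the job ends,
# since $TMPDIR is cleaned up after the job:
#   cp -r "$SCRATCH/results" "$WORKSPACE/"
echo "staged to $SCRATCH"
```

Keeping I/O-intensive temporary files on $TMPDIR avoids load on the shared BeeGFS file system and is usually much faster for small, random accesses.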

High Performance Network

All compute nodes are interconnected through the high-performance Omni-Path network, which offers very low latency and 100 Gbit/s throughput. The parallel storage for the workspaces is also attached to all cluster nodes via Omni-Path. For non-blocking communication, 20 islands with 44 nodes and 880 cores each are available. The islands are interconnected with a blocking factor of 1:11 (i.e. 400 Gbit/s for 44 nodes).
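The relation between the blocking factor and the quoted inter-island bandwidth follows from simple arithmetic: 44 nodes injecting at 100 Gbit/s each gives 4400 Gbit/s inside an island, and a 1:11 blocking factor leaves one eleventh of that as uplink capacity.

```shell
# Back-of-the-envelope check of the island uplink bandwidth:
NODES_PER_ISLAND=44
LINK_GBITS=100          # Omni-Path link speed per node
BLOCKING=11             # blocking factor 1:11

# Aggregate injection bandwidth of one island:
ISLAND_GBITS=$((NODES_PER_ISLAND * LINK_GBITS))
# Uplink bandwidth leaving the island under 1:11 blocking:
UPLINK_GBITS=$((ISLAND_GBITS / BLOCKING))

echo "island: ${ISLAND_GBITS} Gbit/s, uplink: ${UPLINK_GBITS} Gbit/s"
```

In practice this means jobs that fit within one island (up to 44 nodes / 880 cores) get fully non-blocking communication, while larger jobs share the reduced inter-island bandwidth.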

News and Newsletters

We publish news and other important information about NEMO and Freiburg-specific HPC topics on our newsletter and news pages. To subscribe to the news mailing list, please send an e-mail to