Quickstart Guide

This is a quick start guide to help you get started on the HPC clusters CARL and EDDY.

If you have questions that are not answered in this guide, please contact the Servicedesk of the IT Services: servicedesk@uni-oldenburg.de.

HPC Cluster Overview

The HPC cluster, located at the Carl von Ossietzky Universität Oldenburg, consists of two clusters named CARL and EDDY. They are connected via FDR InfiniBand for parallel computations and parallel I/O. CARL uses an 8:1 blocking network topology and EDDY a fully non-blocking network topology. In addition, they are connected via an Ethernet network for management and IPMI. They also share a GPFS parallel file system with about 900 TB net capacity and 17/12 GB/s parallel read/write performance. Additional storage is provided by the central NAS system of the IT Services.

Both clusters are based on the Lenovo NeXtScale system.

CARL (271 TFlop/s theoretical peak performance):

  • 327 compute nodes (9 of them with a GPU)
  • 7,640 CPU cores
  • 77 TB of RAM
  • 360 TB of local storage

EDDY (201 TFlop/s theoretical peak performance):

  • 244 compute nodes (3 of them with a GPU)
  • 5,856 CPU cores
  • 21 TB of RAM

For more detailed information about the clusters, you can visit our Overview.

Login

If you want to access the HPC cluster, you need an authorized university account. If you are not authorized yet, request an account.

You can use an SSH client of your choice or the command line on Linux computers to connect to the cluster via ssh. To do so, use either

carl.hpc.uni-oldenburg.de

or

eddy.hpc.uni-oldenburg.de
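
For example, on a Linux command line you would connect to CARL like this (replace abcd1234 with your own university account):

[user@laptop ~]$ ssh abcd1234@carl.hpc.uni-oldenburg.de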

For further information about the login, please look at the guide on the page Login to the HPC cluster.

File System

Software and Environment

Many software packages are pre-installed, such as compilers, libraries, pre- and post-processing tools, and further applications. We use the module command to manage them. With this command you can:

  • list the available software
  • access/load software (even in different versions)

Example: show the available software on CARL and EDDY and load the Intel compiler:

[abcd1234@hpcl001 ~]$ module avail
-----------/cm/shared/uniol/modules/compiler-----------
... icc/2016.3.210
[abcd1234@hpcl001 ~]$ module load icc/2016.3.210
[abcd1234@hpcl001 ~]$ module list
Currently loaded modules: ... icc/2016.3.210 ...
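
To remove a loaded module again, use module unload; module purge unloads all currently loaded modules at once:

[abcd1234@hpcl001 ~]$ module unload icc/2016.3.210
[abcd1234@hpcl001 ~]$ module purge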

Basic Job Submission
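
Jobs on CARL and EDDY are managed by the SLURM workload manager. The following is a minimal sketch of a job script; the file name hello_job.sh and the requested resources are placeholder assumptions, so check the full documentation for the partitions and limits that apply on the clusters:

#!/bin/bash
#SBATCH --job-name=hello     # name of the job
#SBATCH --ntasks=1           # run a single task
#SBATCH --time=00:10:00      # requested walltime (10 minutes)
#SBATCH --mem=1G             # requested memory

echo "Hello from $(hostname)"

Submit the script with sbatch and check its status with squeue:

[abcd1234@hpcl001 ~]$ sbatch hello_job.sh
[abcd1234@hpcl001 ~]$ squeue -u abcd1234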