Welcome to the HPC User Wiki of the University of Oldenburg

__NOTOC__
__NOEDITSECTION__
<div style="text-align:justify;">
<center>
{| style="text-align:justify;font-size:1.2em;line-height:1.2em;background-color:#eeeeff;" border="1" cellspacing="0"
|-
| [[Image:picture_of_nodes.jpg|155px]]
| [[Image:picture_of_cluster_closed.jpg|70px]]
| ''This is the HPC-Wiki of the University of Oldenburg''<br>
| [[Image:picture_of_gpfs.jpg|82px]]
| [[Image:picture_of_infinyband.jpg|155px]]
|}
</center>


= Basic Information =
<center>
{| style="background-color:#eeeeff;" cellpadding="10" border="1" cellspacing="0"
|- style="background-color:#ddddff;"
! HPC Facilities
! Login
! User environment
! Compiling and linking
! Job Management (Queueing) System
! Altix UV 100 system
! Examples
|- valign="top"
|
* [[HPC Facilities of the University of Oldenburg| Overview]]
* [[HPC Facilities of the University of Oldenburg#FLOW| FLOW]]
* [[HPC Facilities of the University of Oldenburg#HERO| HERO]]
* [[HPC Policies| HPC Policies]]
* [[Unix groups| Groups ]]
* [[Acknowledging_the_HPC_facilities| Acknowledging FLOW/HERO]]
* [[User Meetings]]
|
* [[Logging in to the system#From within the University (intranet) | From University]]
* [[Logging in to the system#From outside the University (internet) | From Home]]
|
* [[User environment - The usage of module| Usage of module]]
* [[File system| File System / Quotas]]
* [[Mounting Directories of FLOW and HERO#Windows | Shares under Windows]]
* [[Mounting Directories of FLOW and HERO#Linux | Shares under Linux]]
* [[License servers]]
|
* [[Compiling and linking|Basics]]
* [[GNU Compiler]]
* [[Intel Compiler]]
* [[PGI Compiler]]
* [[Open64 Compiler]]
* [[Using the Altix UV 100 system#Compiling and linking applications| Altix UV 100]]
|
* [[SGE Job Management (Queueing) System| Overview]]
* [[SGE Job Management (Queueing) System#Submitting jobs| Submitting ]]
* [[SGE Job Management (Queueing) System#Specifying job requirements| Job requirements ]]
* [[SGE Job Management (Queueing) System#Parallel environments (PEs) | Parallel jobs ]]
* [[SGE Job Management (Queueing) System#Interactive jobs | Interactive jobs ]]
* [[SGE Job Management (Queueing) System#Monitoring and managing your jobs | Commands ]]
* [[SGE Job Management (Queueing) System#Array jobs| Job arrays  ]]
* [[SGE Job Management (Queueing) System#Environment variables | Environment variables]]
* [[Brief_Introduction_to_HPC_Computing#Checking_the_status_of_the_job | Checking the job status]] [[Brief_Introduction_to_HPC_Computing#Checking_the_status_of_the_job_2| (par. jobs)]]
* [[Brief_Introduction_to_HPC_Computing#Details_for_finished_jobs| Obtaining details for finished jobs]]
* [[SGE Job Management (Queueing) System#Documentation | Documentation]]
* [[Queues_and_resource_allocation| On Queues and resource allocation]]
|
* [[Using the Altix UV 100 system#Compiling and linking applications| Compiling]]
* [[Using the Altix UV 100 system#Submitting SGE jobs| Submitting]]
* [[Using the Altix UV 100 system#Documentation| Documentation]]
|
* [[Brief Introduction to HPC Computing| Brief Introduction to HPC Computing]]
* [[Matlab Examples using MDCS| Matlab examples using MDCS]]
* [[MDCS Basic Example]] (for R2014b and later)
* [[HPC Tutorial No1| HPC Tutorial 2013]]
* [[HPC Introduction October 6-8, 2014| HPC Tutorial 2014]]
* [[HPC Introduction October 7-9, 2015| HPC Tutorial 2015]]
|-
|}
</center>


== Introduction ==

The central HPC facilities of the University of Oldenburg comprise three systems:

* FLOW ('''F'''acility for '''L'''arge-Scale C'''O'''mputations in '''W'''ind Energy Research): an IBM iDataPlex cluster (Westmere-EP processors, 2.66 GHz) with 2232 CPU cores, 6 TB of (distributed) main memory, and a Quad Data Rate (QDR) InfiniBand interconnect (theoretical peak performance: 24 TFlop/s).

* HERO ('''H'''igh-'''E'''nd Computing '''R'''esource '''O'''ldenburg): a hybrid system composed of two components:
** an IBM iDataPlex cluster with 1800 CPU cores, 4 TB of (distributed) main memory, and a Gigabit Ethernet interconnect (theoretical peak performance: 19.2 TFlop/s),
** an SGI Altix UltraViolet shared-memory system ("SMP" component, Nehalem-EX "Beckton" processors) with 120 CPU cores, 640 GB of globally addressable memory, and a NumaLink5 interconnect (theoretical peak performance: 1.3 TFlop/s).

* [http://www.csc.uni-oldenburg.de GOLEM]: an older, AMD Opteron-based cluster with 390 cores and 800 GB of (distributed) main memory (theoretical peak performance: 1.6 TFlop/s).

FLOW and HERO use a dedicated, shared storage system (a high-performance NAS cluster) with a net capacity of 130 TB.

FLOW is employed for computationally demanding CFD calculations in wind energy research, conducted by the research group [http://twist.physik.uni-oldenburg.de/en/index.html TWiST] (Turbulence, Wind Energy, and Stochastics) and the [http://www.forwind.de/forwind/index.php?article_id=1&clang=1 ForWind] Center for Wind Energy Research. It is, to the best of our knowledge, the largest system in Europe dedicated solely to this purpose.

The main application areas of the HERO cluster are Quantum Chemistry, Theoretical Physics, and the Neurosciences and Audiology. Besides these, the system is used by many other research groups of the [http://www.fk5.uni-oldenburg.de Faculty of Mathematics and Science] and the [http://www.informatik.uni-oldenburg.de Department of Informatics] of the School of Computing Science, Business Administration, Economics, and Law.


= Application Software and Libraries =
<center>
{| style="background-color:#eeeeff;" cellpadding="10" border="1" cellspacing="0"
|- style="background-color:#ddddff;"
!Compiler and Development Tools
!Quantum Chemistry
!Computational Fluid Dynamics
!Mathematics/Scripting
!Visualisation
!Libraries
|- valign="top"
|
* [[debugging]]
* [[git]]
* [[GNU Compiler]]
* [[Intel Compiler]]
* [[Open64 Compiler]]
* [[PGI Compiler]]
* [[Profiling_using_gprof| profiling]]
* [[scalasca]]
* [[subversion (svn)]]
* [[valgrind]]
|
* [[Gaussian 09]]
* [[MOLCAS]]
* [[MOLPRO]]
* [[NBO]]
* [[ORCA]]
|
* [[Ansys]]
* [[FOAMpro]]
* [[Nektar++]]
* [[Nek 5000]]
* [[OpenFOAM]]
* [[PALM]]
* [[STAR-CCM++]]
* [[THETA]]
* [[WRF/WPS]]
|
* [[Configuration MDCS]] (2014b and later)  
* [[MATLAB Distributing Computing Server]]
* [[Python]]
* [[R]]
* [[STATA| STATA]]
|
* [[iso99]]
* [[NCL]]
* [[ncview]]
* [[paraview]]
|
* [[BLAS and LAPACK]]
* [[EGSnrc]]
* [[FLUKA]]
* [[GEANT4]]
* [[Gurobi]]
* [[HDF5]]
* [[Intel MPI]]
* [[LEDA]]
* [[NetCDF]]
* [[OpenMPI]]
|-
|}
</center>


= Courses and Tutorials =
<center>
{| style="background-color:#eeeeff;" cellpadding="10" border="1" cellspacing="0"
|- style="background-color:#ddddff;"
!Introduction to HPC Courses
!Matlab Tutorials
!New OS
|- valign="top"
|
* [[HPC Introduction October 6-8, 2014]]
* [[HPC Introduction October 7-9, 2015]]
|
* [[Audio Data Processing]]
* [[Using the MEX Compiler]]
|
* [[media:New_OS_On_FLOW.pdf | New OS on FLOW ]]
|-
|}
</center>


== Basic Usage ==


=== Log in to the system ===

==== From within the University (intranet) ====

Within the internal network of the University, access to the systems is granted via ssh. Use your favorite ssh client (e.g., OpenSSH or PuTTY). For example, on a UNIX/Linux system, users of FLOW may type on the command line (replace "abcd1234" with your own account):

 ssh abcd1234@flow.hpc.uni-oldenburg.de


Similarly, users of HERO may log in by typing:

 ssh abcd1234@hero.hpc.uni-oldenburg.de


Use <tt>ssh -X</tt> for X11 forwarding (i.e., if you need to export the graphical display to your local system).
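
For example, to log in to FLOW with X11 forwarding enabled:

 ssh -X abcd1234@flow.hpc.uni-oldenburg.de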

For security reasons, access to the HPC systems is denied from certain subnets. In particular, you cannot log in from the WLAN of the University or from "public" PCs (located, e.g., in libraries, PC rooms, or at other places).
 
==== From outside the University (internet) ====

From outside the University, you first have to establish a VPN tunnel into the University intranet before you can log in:

 Gateway       : vpn2.uni-oldenburg.de
 Group name    : hpc-vpn
 Group password: hqc-vqn

Supply the above data when configuring your VPN client. After the VPN tunnel is established, you can log in to HERO or FLOW via ssh, as described above.
 
=== User Environment  ===
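
The software environment of the clusters is managed with the <tt>module</tt> command (see [[User environment - The usage of module| Usage of module]]). As a brief sketch (the module name used here is a placeholder; run <tt>module avail</tt> to see the modules actually installed on FLOW and HERO):

 module avail            # list all available modules
 module load intel       # load a module, e.g., an Intel compiler module (exact name may differ)
 module list             # show the currently loaded modules
 module unload intel     # unload a module again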
 
=== Compiling and linking programs ===
 
==== Intel compiler ====
 
===== Documentation =====

* [http://software.intel.com/sites/products/documentation/hpc/composerxe/en-us/cpp/lin/index.htm C/C++ Compiler]
* [http://software.intel.com/sites/products/documentation/hpc/composerxe/en-us/start/lin/cpp/index.htm Getting started tutorial]
* [http://software.intel.com/sites/products/documentation/hpc/composerxe/en-us/fortran/lin/index.htm Fortran compiler User and Reference Guides]
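
As a minimal sketch of compiling and linking with the Intel compilers (assuming the corresponding module has been loaded as described above; the file names are placeholders):

 icc -O2 -o myprog myprog.c        # C
 ifort -O2 -o myprog myprog.f90    # Fortran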
 
=== Job Submission and Monitoring  ===
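
A minimal serial job script might look as follows (a sketch only; the resource requests are placeholders, and the queues and parallel environments that actually apply on FLOW and HERO are described in the [[SGE Job Management (Queueing) System| Overview]]):

 #!/bin/bash
 #$ -N myjob             # job name
 #$ -cwd                 # run the job in the current working directory
 #$ -l h_rt=01:00:00     # requested wall-clock time (here: 1 hour)
 #$ -l h_vmem=1G         # requested memory per slot
 ./myprog

Such a script is submitted with <tt>qsub</tt>, and the status of pending and running jobs can be checked with <tt>qstat</tt>.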
 
 
== Advanced Usage ==
 
Here you will find, among other things, hints on how to analyse and optimize your programs using HPC tools (profilers, debuggers, performance libraries), and other useful information.
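
For example, a first profile of a program can be obtained with gprof (a sketch assuming the GNU toolchain; see [[Profiling_using_gprof| profiling]] for details):

 gcc -pg -O2 -o myprog myprog.c    # compile and link with profiling instrumentation
 ./myprog                          # running the program writes gmon.out
 gprof myprog gmon.out > profile.txt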
 
... tbc ...

= Contact =
<center>
{| style="background-color:#eeeeff;" cellpadding="10" border="1" cellspacing="0"
|- style="background-color:#ddddff;"
!HPC Resource
!E-mail
|- valign="top"
|
FLOW and HERO<br>
Both (in case of vacation)<br>
|
Stefan.Harfst@uni-oldenburg.de<br>
hpcuniol@uni-oldenburg.de<br>
|-
|}
</center>


'''''Note:''' This Wiki is under construction and still a preliminary version. Contributions are welcome; please ask Stefan Harfst (Stefan.Harfst(at)uni-oldenburg.de) for further information.''

<center>
''Only for editors: [[Formatting rules for this Wiki]]''
</center>
</div>

[[HPC User Wiki 2016]]