HDF5 2016
Introduction
HDF5 is a unique technology suite that makes possible the management of extremely large and complex data collections.
The HDF5 technology suite includes:
- A versatile data model that can represent very complex data objects and a wide variety of metadata.
- A completely portable file format with no limit on the number or size of data objects in the collection.
- A software library that runs on a range of computational platforms, from laptops to massively parallel systems, and implements a high-level API with C, C++, Fortran 90, and Java interfaces (a short C example follows this list).
- A rich set of integrated performance features that allow for access time and storage space optimizations.
- Tools and applications for managing, manipulating, viewing, and analyzing the data in the collection.
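To give an impression of the C interface mentioned above, the following minimal sketch creates a new HDF5 file and writes a small two-dimensional integer dataset to it. The file name example.h5 and the dataset name /dset are placeholders chosen for this illustration.

#include "hdf5.h"

int main(void)
{
    /* 2x3 integer array to be stored */
    int     data[2][3] = { {1, 2, 3}, {4, 5, 6} };
    hsize_t dims[2]    = {2, 3};

    /* create a new file (truncating any existing one) */
    hid_t file  = H5Fcreate("example.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);

    /* describe the shape of the data: rank 2, extent 2x3 */
    hid_t space = H5Screate_simple(2, dims, NULL);

    /* create the dataset "/dset" holding native integers */
    hid_t dset  = H5Dcreate2(file, "/dset", H5T_NATIVE_INT, space,
                             H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    /* write the whole array and release all handles */
    H5Dwrite(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);
    H5Dclose(dset);
    H5Sclose(space);
    H5Fclose(file);
    return 0;
}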
Installed versions
The following versions are currently available on the cluster:
... on environment hpc-uniol-env:
- HDF/4.2.11-intel-2016b
- HDF/4.2.12-gimkl-7.2017.3
- HDF/4.2.12-goolf-7.2.11
- HDF5/1.8.17-intel-2016b
... on environment hpc-env/6.4:
- HDF/4.2.14-GCCcore-6.4.0
- HDF5/1.10.1-foss-2017b
- HDF5/1.10.1-intel-2018a
- HDF5/1.10.2-gimkl-2018a
- HDF5/1.10.2-intel-2018a
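The set of installed modules changes over time, so it is worth checking yourself which versions are currently available. Assuming the standard module tools on the cluster, the following command lists the installed HDF modules:
module avail HDF
If the cluster uses Lmod, the command module spider HDF5 additionally shows versions that belong to other toolchain environments.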
Using HDF on the HPC cluster
If you want to use HDF on the cluster, you will need to load the corresponding module first. You can do that with the command:
module load HDF
This loads HDF4 in version 4.2.11. If you need a different version, or HDF5 instead, specify the full module name, for example:
module load HDF5/1.8.17-intel-2016b
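After loading an HDF5 module, the compiler wrapper h5cc that ships with HDF5 is typically available in the search path and adds the required include and library flags automatically. As a minimal sketch, assuming the example program from the introduction is saved as example.c:
module load HDF5/1.8.17-intel-2016b
h5cc -o example example.c
./example
h5dump example.h5
The last command uses h5dump, another tool distributed with HDF5, to print the contents of the newly created file.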
Documentation
The full documentation and further information can be found here.