WRF/WPS

The software WRF (Weather Research and Forecasting Model) is a mesoscale numerical weather prediction system.

Available modules

On FLOW, WRF/WPS is available as a module. So far, WRF for em_real is available in single (SP) and double (DP) precision. The available modules can be listed with

 module avail wrf

The modules are complemented by two helper scripts

  setup_wps_dir.sh [DIRECTORY_NAME]

and (for em_real)

  setup_wrf_dir.sh [DIRECTORY_NAME]

which set up a directory for the preprocessing with WPS and for the simulation with WRF, respectively. The scripts create a new directory wps or wrf, respectively, and copy the needed files into it. Optionally, another DIRECTORY_NAME can be specified.
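
As a minimal sketch of how the module and the helper scripts work together (the directory names my_wps_run and my_wrf_run are only examples; take the module version from the output of module avail wrf):

  # load a WRF module, here WRF 3.4.1 for em_real in single precision
  module load wrf/3.4.1/em_real/SP
  
  # set up a preprocessing directory for WPS (example name)
  setup_wps_dir.sh my_wps_run
  
  # set up a run directory for WRF with em_real (example name)
  setup_wrf_dir.sh my_wrf_run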

Private installation

To get the best performance, WRF should be compiled with the Intel compilers and with Intel MPI. For the installation on FLOW, four installation scripts are available (for em_real and up to WRF 3.4.1):

  • install_wrf_intel.sh or install_wrf_intel_double.sh for the compilation with the Intel compilers and Intel MPI with single- or double-precision floating points, respectively
  • install_wrf_gcc_ompi.sh or install_wrf_gcc_ompi_double.sh for the compilation with gcc 4.7.1 and OpenMPI 1.4.3 with single- or double-precision floating points, respectively

In all scripts ungrib2 is supported. The usage of the scripts is

   install_wrf_XXXX.sh WRF_RELEASE DIR_TO_TAR_GZ_FILES
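
As a hedged example, an installation of WRF 3.4.1 in double precision with the Intel toolchain could look like the following (the path ~/wrf_sources is only a placeholder for the directory holding the downloaded WRF and WPS tar.gz files):

   # example invocation; ~/wrf_sources is a placeholder for your source directory
   install_wrf_intel_double.sh 3.4.1 ~/wrf_sources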


Note:

  • The final installation of WRF and WPS will be in the directories WRFV3_v${WRF_RELEASE}_XXXXX and WPS_v${WRF_RELEASE}_XXXXX, respectively.
  • The original folders WRFV3 and WPS will be removed during the installation process!
  • WRF with single-precision floating point operations leads to strong deviations in the results of WRF when the compiler or the WRF release is changed. Thus WRF should be compiled with double-precision floating points.

SGE script

A basic script to submit WRF to SGE is shown below:

  #!/bin/bash
  #
  # ==== SGE options ====
  #
  # --- Which shell to use ---
  #$ -S /bin/bash
  #
  # --- Name of the job ---
  #$ -N MY_WRF_JOB
  #
  # --- Change to directory where job was submitted from ---
  #$ -cwd
  #
  # --- merge stdout and stderr ---
  #$ -j y
  #
  # ==== Resource requirements of the job ====
  #
  # --- maximum walltime of the job (hh:mm:ss) ---
  # PLEASE MODIFY TO YOUR NEEDS!
  #$ -l h_rt=18:00:00
  #
  # --- memory per job slot (= core) ---
  #$ -l h_vmem=1800M
  #
  # --- disk space ---
  # OPTIONALLY, PLEASE MODIFY TO YOUR NEEDS!
  ##$ -l h_fsize=100G
  #
  # --- which parallel environment to use, and number of slots (should be multiple of 12) ---
  # PLEASE MODIFY TO YOUR NEEDS!
  #$ -pe impi41 24
  #
  
  # load module, here WRF 3.4.1 in single precision
  module load wrf/3.4.1/em_real/SP
  
  # Start real.exe in parallel (OPTIONAL, ONLY IF NEEDED)
  #mpirun real.exe
  
  # Start wrf.exe in parallel
  mpirun wrf.exe

The job script has to be submitted from your WRF data directory.
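
A minimal sketch of the submission, assuming the script above has been saved as run_wrf.sge inside the run directory created by setup_wrf_dir.sh (both names are only examples):

  # change into the WRF run directory created by setup_wrf_dir.sh
  cd wrf
  
  # submit the job script to SGE
  qsub run_wrf.sge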

Scaling

Here is a short example (January test case, see the WRF tutorial) of the scaling and wall clock times for different compilers and for single (SP) and double (DP) precision:

 #Cores | GNU 4.7.1 SP [s] | GNU 4.7.1 DP [s] | Intel 13.0.1 SP [s] | Intel 13.0.1 DP [s] | Intel vs. GNU (SP) | Intel vs. GNU (DP)
 1      | 351              | 292              | 127                 | 212                 | 2.73               | 1.37
 2      | 181              | 151              | 62                  | 109                 | 2.92               | 1.39
 4      | 98               | 88               | 37                  | 62                  | 2.65               | 1.42
 8      | 56               | 51               | 22                  | 39                  | 2.55               | 1.31
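
The last two columns give the ratio of the GNU to the Intel wall clock time for the same core count and precision, for example 351 s / 127 s ≈ 2.73 for single precision on one core.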

A short visualisation of the times is given below:

 [Figure: Scaling of WRF for different compilers and precisions]
 [Figure: Speedup of WRF for different compilers and precisions; the times are normalized by the time for one core with the same setup]


Known issues

  • The Intel compiler release 12.0.0 produces a bug in the binary ungrib.exe of the WPS suite. Due to this bug, waves can occur in the preprocessed data.
  • The default installation rules of WRF with single-precision floating point operations lead to strong deviations in the results of WRF when the compiler or the WRF release is changed. Thus WRF should be compiled with double-precision floating points.


External links