Yambo 2016
Introduction
YAMBO implements Many-Body Perturbation Theory (MBPT) methods (such as GW and BSE) and Time-Dependent Density Functional Theory (TDDFT), which allow for accurate prediction of fundamental properties such as band gaps of semiconductors, band alignments, defect quasi-particle energies, optics, and out-of-equilibrium properties of materials. ¹
Installed version(s)
The following versions are installed and currently available on environment hpc-env/8.3:
- Yambo/5.0.2-intel-2019b
Loading / Using Yambo
To load the desired version of the module, use the module load command, e.g.
module load hpc-env/8.3
module load Yambo
Always remember: this command is case sensitive!
After loading the Yambo module, yambo and its associated executables (a2y, c2y, p2y, ypp) can be called from any directory. But since the software package was installed with Intel MPI to enable parallel computing, you must call the desired executables with mpirun -np <ntasks> <executable>, where <ntasks> is the number of MPI tasks, for example like this:
mpirun -np 4 yambo
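If you want to make sure that the module environment is set up correctly before starting larger runs, a quick (optional) sanity check is to look up where the executables come from; the $EBROOTYAMBO variable (also referred to further down in this tutorial) is set by the module system and points to the root of the loaded installation:

module list        # the Yambo module and its toolchain should be listed
which yambo        # should resolve to a path below $EBROOTYAMBO/bin
echo $EBROOTYAMBO  # root directory of the loaded Yambo installation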
Start Working with Yambo: Quick Tutorial
Our installed module version of Yambo on CARL differs slightly from the official documentation in how the specific commands are called. For this reason, below you can find a shortened and slightly modified tutorial based on the wiki pages Bulk material: h-BN and Initialization from the yambo tutorial wiki.
In our version of the tutorial, we focus on the commands and output files rather than on the scientific substance of the results. If you want to know more about the latter, we highly recommend visiting the wiki pages this tutorial is based on.
Bulk material: h-BN
Prerequisites
We will use the following modules, executables and files:
- PWSCF input files and pseudopotentials for hBN bulk
- the modules QuantumESPRESSO/6.6-intel-2019b and Yambo/5.0.2-intel-2019b on hpc-env/8.3
- pw.x executable, version 5.0 or later
- p2y and yambo executables
DFT calculations
Firstly, create a test folder and get the test files for this tutorial:
mkdir $HOME/yambo_test && cd $HOME/yambo_test
wget http://www.yambo-code.org/educational/tutorials/files/hBN.tar.gz
tar xvf hBN.tar.gz
Load the modules and cd into the untarred directory
ml hpc-env/8.3
ml QuantumESPRESSO/6.6-intel-2019b   # It must be this version, not the 6.7 version based on intel-iompi. Otherwise, you will run into MPI errors
ml Yambo/5.0.2-intel-2019b
cd hBN/PWSCF
Now, run the SCF calculation to generate the ground-state charge density, occupations, Fermi level, and so on. For this step, we need the executable pw.x, which is part of QuantumESPRESSO and not callable from the Yambo module alone. Also, since QuantumESPRESSO and Yambo are built with MPI, you always have to call the executables with mpirun:
mpirun pw.x < hBN_scf.in > hBN_scf.out # can take up to 1 minute
Next, run a non-SCF calculation to generate a set of Kohn-Sham eigenvalues and eigenvectors for both occupied and unoccupied states (100 bands):
mpirun -np 4 pw.x < hBN_nscf.in > hBN_nscf.out # this step can be parallelized, here with 4 MPI tasks (-np 4)
Here we use a 6x6x2 grid giving 14 k-points, but denser grids should be used for checking convergence of Yambo runs.
Note the presence of the following flags in the input file:
wf_collect=.true.
force_symmorphic=.true.
diago_thr_init=5.0e-6,
diago_full_acc=.true.
which are needed for generating the Yambo databases accurately. Full explanations of these variables are given on the quantum-ESPRESSO input variables page.
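If you want to verify that these flags are actually present before starting the non-SCF run, one optional way is to search the provided input file for them (just a convenience check, not part of the original tutorial):

grep -iE 'wf_collect|force_symmorphic|diago_thr_init|diago_full_acc' hBN_nscf.in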
After these two runs, you should have a hBN.save directory:
$ ls hBN.save
B.pz-vbc.UPF          N.pz-vbc.UPF  wfc12.hdf5  wfc1.hdf5  wfc4.hdf5  wfc7.hdf5
charge-density.hdf5   wfc10.hdf5    wfc13.hdf5  wfc2.hdf5  wfc5.hdf5  wfc8.hdf5
data-file-schema.xml  wfc11.hdf5    wfc14.hdf5  wfc3.hdf5  wfc6.hdf5  wfc9.hdf5
Conversion to Yambo format
The PWscf hBN.save output is converted to the Yambo format using the p2y executable (pwscf to yambo), found in the yambo bin directory ($EBROOTYAMBO/bin, callable from everywhere). Enter hBN.save and launch p2y:
cd hBN.save
mpirun p2y
This run repeats some information about the system in its output and generates a SAVE directory:
ls SAVE
ns.db1 ns.wf ns.kb_pp_pwscf
ns.wf_fragments_1_1 ...
ns.kb_pp_pwscf_fragment_1 ...
The n prefix of these files indicates that they are in netCDF format, and thus not human readable. However, they are perfectly transferable across different architectures. You can check that the databases contain the information you expect by launching Yambo with the -D option:
mpirun yambo -D
Initialization
For Yambo to initialize, you must be in the directory containing the SAVE folder, not inside the SAVE folder itself. The error message yambo: cannot access CORE database (SAVE/*db1 and/or SAVE/*wf) usually indicates that you are not in the right directory.
But since we created the SAVE folder in the steps above, you are probably in the right place and can call yambo directly:
mpirun -np 4 yambo # you can parallelize the process, but this is not mandatory
There is now an additional report file, r_setup_02. In the case of parallel runs, CPU-dependent log files will appear inside a LOG folder, e.g.:
ls LOG
l_p2y_CPU_1 l_p2y_CPU_13 l_p2y_CPU_17 ...
Run-time output
This is typically written to standard output (on screen) and tracks the progress of the run in real time:
<---> [01] MPI/OPENMP structure, Files & I/O Directories
<---> [02] CORE Variables Setup
<---> [02.01] Unit cells
<---> [02.02] Symmetries
<---> [02.03] Reciprocal space
<---> Shells finder |########################################| [100%] --(E) --(X)
<---> [02.04] K-grid lattice
<---> Grid dimensions      :  6  6  2
<---> [02.05] Energies & Occupations
<---> [03] Transferred momenta grid and indexing
<---> BZ -> IBZ reduction |########################################| [100%] --(E) --(X)
<---> [03.01] X indexes
<---> X [eval] |########################################| [100%] --(E) --(X)
<---> X[REDUX] |########################################| [100%] --(E) --(X)
<---> [03.01.01] Sigma indexes
<---> Sigma [eval] |########################################| [100%] --(E) --(X)
<---> Sigma[REDUX] |########################################| [100%] --(E) --(X)
<---> [04] Timing Overview
<---> [05] Memory Overview
<---> [06] Game Over & Game summary
Specific runlevels are indicated with numeric labels like [02.02]. The hashes (#) track the progress of the run in wall-clock time, showing the elapsed (E) and expected (X) time to complete a runlevel and the percentage of the task completed.
New core databases
New databases appear in the SAVE folder:
ls SAVE
ns.db1 ns.wf ns.kb_pp_pwscf ndb.gops ndb.kindx
ns.wf_fragments_1_1 ...
ns.kb_pp_pwscf_fragment_1 ...
These contain information about the G-vector shells and k/q-point meshes as defined by the DFT calculation.
In general: a database called ns.xxx is a static database, generated once by p2y, while databases called ndb.xxx are dynamically generated while you use yambo.
TIP: if you launch yambo, but it does not seem to do anything, check that these files are present.
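As a small convenience sketch (not part of the original tutorial), such a check could look like this, run from the directory in which you launched yambo:

ls SAVE/ns.db1 SAVE/ns.wf 2>/dev/null || echo "core databases missing - re-run p2y"

If the two static databases are listed, the SAVE folder is usable; otherwise re-run p2y as described above.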
Report File
A report file r_setup is generated in the run directory. It mostly reports information about the ground-state system as defined by the DFT run, but also adds information about the band gaps, occupations, shells of G-vectors, IBZ/BZ grids, the CPU structure (for parallel runs), and so on. More about this can be found in the original tutorial.
Different Ways of Running Yambo
So far, we have only run Yambo interactively.
Let's try to re-run the setup with the command:
nohup mpirun yambo &
ls
l_setup  nohup.out  r_setup  r_setup_01  SAVE
If Yambo is launched via a script, as a background process, or in parallel, this output will appear in a log file prefixed by the letter l, in this case l_setup. If this log file already exists from a previous run, it will not be overwritten. Instead, a new file will be created with an incrementing numerical label, e.g. l_setup_01, l_setup_02, etc. This applies to all files created by Yambo. Here we see that l_setup was created for the first time, but r_setup already existed from the previous run, so now we have r_setup_01. If you check the differences between the two, you will notice that in the second run yambo reads the previously created ndb.kindx instead of re-computing the indexes. Indeed, the output inside l_setup does not show the timing for X and Sigma.
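On CARL, longer Yambo runs should be submitted through the batch system instead of being started interactively on a login node. The following is only a minimal sketch of such a job script, assuming the cluster's Slurm setup; the job name, task count, runtime, memory, and path are placeholders that you have to adapt to your own calculation:

#!/bin/bash
#SBATCH --job-name=yambo_setup
#SBATCH --ntasks=4
#SBATCH --time=00:30:00
#SBATCH --mem-per-cpu=2G
#SBATCH --output=yambo_setup.%j.out

module load hpc-env/8.3
module load Yambo/5.0.2-intel-2019b

# change to the directory that contains the SAVE folder before calling yambo
cd $HOME/yambo_test/hBN/PWSCF/hBN.save
mpirun yambo

Submit the script with sbatch <scriptname>; the run-time output then ends up in the Slurm output file and the l_* log files described above instead of on the screen.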
Cite Yambo
The yambo team kindly asks everyone who uses yambo to also cite their work on yambo, a request we think is very reasonable for reasons of scientific fairness.
Their quote:
"Cite us
It is scientifically fair to cite the two following articles in any publication based on results obtained with Yambo
- Many-body perturbation theory calculations using the yambo code, Davide Sangalli, Andrea Ferretti, Henrique Miranda, Claudio Attaccalite, Ivan Marri, Elena Cannuccia, Pedro Miguel Melo, Margherita Marsili, Fulvio Paleari, Antimo Marrazzo, Gianluca Prandini, Pietro Bonfà, Michael O Atambo, Fabio Affinito, Maurizia Palummo, Alejandro Molina Sanchez, Conor Hogan, Myrta Grüning, Daniele Varsano, and Andrea Marini, Journal of Physics: Condensed Matter 31, 325902 (2019).
- Yambo: an ab initio tool for excited state calculations, Andrea Marini, Conor Hogan, Myrta Grüning, Daniele Varsano, Comp. Phys. Comm. 180, 1392 (2009).
Check the Yambo researcher ID page for citation information." ²
Documentation
The developers created a very elaborate wiki including tutorials for learning and using Yambo. You can find the wiki entry page here.
You can find the software's main page here.