Quantum Espresso Charge Density Backup

In the past couple of months, we have been actively developing a machine-learning method to learn charge-density functionals, mapping the charge density to a property of interest (such as the band gap). We are very excited about this code, and I will update this site once the publication is complete.

In the meantime, I wanted to show a way to automatically back up the charge density during a QE calculation. In QE, if you are running a relax/vc-relax/md calculation, the charge density is overwritten after every SCF cycle. If we are going to develop a neural-network potential to predict properties from charge densities, we are going to need a lot of data. By performing ab initio MD calculations and backing up the charge density after every SCF cycle, we can later post-process these densities and match them up with the property we are interested in.

To back these up, we will need to pull out all the tricks from our bash toolbox. In the following runscript, we use tail, awk, and a simple bash script to copy the charge density from the *.save directory into the current directory.

#!/bin/bash
#SBATCH -N 1                  # Number of nodes
#SBATCH -p batch              # Partition to submit to
#SBATCH -o hostname.out       # File to which STDOUT will be written
#SBATCH -e hostname.err       # File to which STDERR will be written
#SBATCH -J sn1

module load espresso-6.1-gnu

# Guarantee the output file exists before tail starts following it.
touch scf.out

# Each time QE reports 'Writing', back up the charge density.
tail -f scf.out | awk '/Writing/ {system("./movecharge.sh")}' &

mpirun -n 16 pw.x -in sn.in > scf.out

# Stop the background tail/awk watcher once pw.x has finished.
kill %1

Essentially, this script touches scf.out to guarantee the file exists before running tail on it. By following the output via tail, any time 'Writing' appears in the output (a string indicating that QE is writing the charge density to *.save), awk calls the movecharge.sh script. This whole pipeline is forked to the background so that we can actually run pw.x.
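The awk trigger is worth isolating. The sketch below feeds a few made-up lines (standing in for QE output) through the same pattern; every line containing 'Writing' fires a shell command via system(), here just appending to a log file:

```shell
# Demo of the awk trigger pattern: each line matching /Writing/ runs a
# shell command via system(). The input lines here are made up; in the
# runscript they arrive continuously from `tail -f scf.out`.
rm -f triggers.log
printf '%s\n' \
  "     iteration #  1" \
  "     Writing output data file sn.save" \
  "     iteration #  2" \
  "     Writing output data file sn.save" \
| awk '/Writing/ {system("echo backup >> triggers.log")}'

wc -l < triggers.log    # two matching lines -> movecharge.sh would run twice
```

The same pattern works for any marker string; 'Writing' is just the one QE prints when it dumps the density.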

movecharge.sh is as follows:

#!/usr/bin/env bash

# Give QE time to finish writing the charge-density file.
sleep 20

# Find the smallest unused index and copy the density there.
i=1
while [[ -e charge-density-$i.dat ]] ; do
  (( i++ ))
done
cp *.save/charge-density.dat charge-density-$i.dat

When called, this script first sleeps for 20 seconds to make sure QE has finished writing the charge-density file. It then finds the smallest index i for which charge-density-i.dat does not yet exist in the current working directory and copies the charge density there. At the end of the simulation our directory will contain N files named charge-density-i.dat, where i ranges from 1 to N and N is the number of SCF cycles in the calculation. We can then post-process these files to match each SCF cycle in scf.out with our charge densities.
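As a first post-processing step, a one-liner like the following can pair each backed-up density with the converged total energy of its SCF cycle. This is only a sketch: it assumes the standard pw.x output, where each converged cycle prints a total-energy line starting with '!', and that the Nth such line corresponds to charge-density-N.dat. The two-line mock scf.out is just for illustration; in a real run the file comes from pw.x.

```shell
# Mock scf.out with two converged SCF cycles (for illustration only; a real
# run produces these lines itself).
printf '%s\n' \
  '!    total energy              =     -15.84452726 Ry' \
  '!    total energy              =     -15.84460000 Ry' \
  > scf.out

# Pair charge-density-N.dat with the Nth converged total energy: the '!'
# lines are pulled out in order, and awk's record number NR supplies N.
grep '^!' scf.out \
  | awk '{print "charge-density-" NR ".dat", $(NF-1), $NF}' \
  > density_energy_map.txt

cat density_energy_map.txt
```

Each line of density_energy_map.txt then reads, e.g., `charge-density-1.dat -15.84452726 Ry`, which is a convenient starting point for assembling training data. Swap the energy for any other per-step property you can grep out of scf.out.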

I hope you can find some way to use this. Please keep an eye out for a coming update on our neural network code!
