Install PFLOTRAN on NERSC

Last updated: 03/19/2024 (update doc for installing on NERSC-Perlmutter)

See this documentation for installation on a Linux machine.

Setup environment

By default, Perlmutter loads the GPU programming environment. Make sure to switch to the CPU environment:

module load cpu

Install PETSc

Download PETSc from GitLab into the directory petsc_v3.20.2, and check out version 3.20.2.

git clone https://gitlab.com/petsc/petsc.git petsc_v3.20.2
cd petsc_v3.20.2
git checkout v3.20.2

Set the current directory as PETSC_DIR, and set PETSC_ARCH to any name you like; a subdirectory with that name will be created under PETSC_DIR (e.g. petsc_v3.20.2/perl-c-opt).

export PETSC_DIR=$PWD
export PETSC_ARCH=perl-c-opt
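A quick sanity check before configuring (perl-c-opt is just the arbitrary arch label chosen above; run this from inside the PETSc checkout):

```shell
# Both variables must be set before running ./configure.
export PETSC_DIR=$PWD
export PETSC_ARCH=perl-c-opt
[ -n "$PETSC_DIR" ] && [ -n "$PETSC_ARCH" ] && echo "PETSc env OK"
```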

Configure PETSc

  • Use the recommended configuration. This is an optimized (non-debug) build that downloads and builds MPICH, HDF5 (with Fortran bindings), BLAS/LAPACK, METIS, and ParMETIS.
./configure --CFLAGS='-O3' --CXXFLAGS='-O3' --FFLAGS='-O3' --with-debugging=no --download-mpich=yes --download-hdf5=yes --download-hdf5-fortran-bindings=yes --download-fblaslapack=yes --download-metis=yes --download-parmetis=yes
  • After configure completes, you will see a message similar to the following:
xxx=========================================================================xxx
 # Configure stage complete. Now build PETSc libraries with (gnumake build):
   make PETSC_DIR=/global/cfs/cdirs/m1800/pin/pflotran-perl/petsc_v3.20.2 PETSC_ARCH=perl-c-opt all
xxx=========================================================================xxx
  • Follow the prompt to build PETSc:
make PETSC_DIR=/global/cfs/cdirs/m1800/pin/pflotran-perl/petsc_v3.20.2 PETSC_ARCH=perl-c-opt all

After the build is complete, you may check if the build is successful using:

# Now to check if the libraries are working do:
make PETSC_DIR=/global/cfs/cdirs/m1800/pin/pflotran-perl/petsc_v3.20.2 PETSC_ARCH=perl-c-opt check

Download and compile PFLOTRAN

git clone https://bitbucket.org/pflotran/pflotran pflotran

# optional. To checkout a specific version
# cd pflotran && git checkout maint/v5.0

cd pflotran/src/pflotran
make -j4 pflotran # compile with 4 parallel jobs; try -j8, -j16, ... if more cores are available
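The -j4 above uses four parallel make jobs. As a sketch (assuming the GNU coreutils `nproc` command is available), you could size the job count from the machine instead:

```shell
# Pick a parallel-make level from the available core count,
# capped at 8 to stay polite on a shared login node.
NCORES=$(nproc)
JOBS=$(( NCORES < 8 ? NCORES : 8 ))
echo "would build with: make -j${JOBS} pflotran"
```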

After compilation is complete, an executable named pflotran* is generated in the current directory. You can also move this executable to another directory, e.g. `./bin/pflotran*`, and then add that directory to your PATH.

mkdir bin && cd bin && cp ../pflotran .
export PATH=$PATH:/global/project/projectdirs/m1800/pin/pflotran/src/pflotran/bin

Fast compilation (use with caution)

Caution! This only works if a small change was made, because the compilation will not rebuild all the dependencies.

make pflotran fast=1

Regression test

Run a regression test to see if pflotran is working. First, request an interactive node to run the regression test.

salloc -N 1 -C cpu -q interactive -t 01:00:00 -L SCRATCH -A m1800
srun -n 1 pflotran -pflotranin $PFLOTRAN_DIR/regression_tests/default/543/543_flow.in # this example runs on a single core

Within seconds, the test model should finish, and the installation is done! 🎉

Create modulefile

Using modulefiles is a great way to organize compiled codes. Here is a sample that I use for PFLOTRAN on NERSC.

#%Module1.0#####################################################################
##
## modules modulefile
##
proc ModulesHelp { } {
    global mpi_bin

    puts stderr "\tPFLOTRAN pflotran/5.0 repository, opt build"
    puts stderr ""
}

module-whatis   "PFLOTRAN pflotran/5.0 repository, opt build"
# #############################################################################

module load cpu

setenv PFLOTRAN_DIR /PATH/TO/PFLOTRAN_DIR
setenv PETSC_DIR /PATH/TO/PETSC_DIR
setenv PETSC_ARCH perl-c-opt

prepend-path    PATH            $env(PFLOTRAN_DIR)/src/pflotran/bin
prepend-path    PATH            $env(PETSC_DIR)/$env(PETSC_ARCH)/bin
prepend-path    PYTHONPATH      $env(PFLOTRAN_DIR)/src/python

Realization dependent runs

PFLOTRAN supports running multiple realizations at once; this requires a realization-dependent dataset.

  • Run a single realization:
srun -n 32 pflotran pflotran.in -realization_id 1
  • Run all realizations:
srun -n 128 pflotran pflotran.in -stochastic -num_realizations 4 -num_groups 4

Note: the number of realizations per group equals num_realizations/num_groups, and the number of cores per simulation equals the total core count divided by num_groups. In this example, 128/4 = 32 cores are used for each realization.
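The arithmetic in the note can be sketched as follows, using the numbers from the srun example above:

```shell
# Process-group arithmetic for the stochastic run above.
TOTAL_CORES=128
NUM_REALIZATIONS=4
NUM_GROUPS=4
echo "realizations per group: $(( NUM_REALIZATIONS / NUM_GROUPS ))"
echo "cores per realization:  $(( TOTAL_CORES / NUM_GROUPS ))"
```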

Update PFLOTRAN

  • Make sure PETSC_DIR and PETSC_ARCH are set in your environment (PETSC_DIR must point to the PETSc checkout, not the PFLOTRAN directory):
export PETSC_DIR=/PATH/TO/PETSC_DIR
export PETSC_ARCH=perl-c-opt
  • Pull the changes from the remote repo:
git pull 
  • Recompile PFLOTRAN
cd pflotran/src/pflotran
make -j4 pflotran
# make pflotran fast=1

Automatically update

Use the following bash script to automatically update PFLOTRAN to the latest commit on the master branch.

#!/bin/bash
# Run this script from the top-level PFLOTRAN directory.
set -e  # stop on the first error

echo "<<<<<<<<<<<<<<<<<<update pflotran repo<<<<<<<<<<<<<<<<<<"
git pull

echo "<<<<<<<<<<<<<<<<<<compile pflotran<<<<<<<<<<<<<<<<<<<<<<<"
cd ./src/pflotran
make -j8 pflotran

Common issues

  1. module: command not found. This can happen on an interactive node with the zsh shell.
    • To fix this, run source $LMOD_PKG/init/zsh before running any module command.
