Getting Started
Clone the ufs-srweather-app repository, checking out the ufs-v1.0.0 release branch, and change into the directory:
git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git
cd ufs-srweather-app
Then, check out the submodules for the SRW application.
./manage_externals/checkout_externals
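To verify that the external components were retrieved as expected, you can query their status; this assumes the standard manage_externals command-line options (run ./manage_externals/checkout_externals --help to confirm on your checkout):
./manage_externals/checkout_externals --status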
See the supported platforms and compilers documentation to determine the level of support for your platform and compiler. You will need to know if your platform is pre-configured, configurable, limited-test, or build-only.
If you are running on Stampede, see the separate Getting Started instructions for that machine.
There are a number of prerequisite libraries that must be installed before building and running this application. If you are running on a pre-configured platform, those libraries are already installed on your system and this step may be skipped.
If you are not running on a pre-configured platform, you will need to build the required libraries. The NCEPLIBS-external wiki page is the starting point for building the libraries.
Once you have the prerequisite libraries built, the workflow must be able to find them via the environment variable NCEPLIBS_DIR. If you followed the instructions on the NCEPLIBS-external wiki page, NCEPLIBS_DIR should be set to $WORK/NCEPLIBS-ufs-v1.1.0. On some systems (Stampede2, for example), you MUST source the provided setenv_nceplibs.sh script in your .bashrc or equivalent script in order for compute nodes to find the libraries. For example, you would add the following to your .bashrc:
source $WORK/NCEPLIBS-ufs-v1.1.0/bin/setenv_nceplibs.sh
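If your platform does not provide such a script, or you installed the libraries elsewhere, you can point the workflow at them directly; a minimal sketch, assuming the default install location from the NCEPLIBS-external instructions:
export NCEPLIBS_DIR=$WORK/NCEPLIBS-ufs-v1.1.0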
If you are on a pre-configured platform, there is an input data directory with sample initialization/lateral boundary condition (IC/LBC) data for one experiment (GST case) already prepared. The data can be found in the following locations on these machines:
On Cheyenne:
/glade/p/ral/jntp/UFS_SRW_app/staged_extrn_mdl_files
On Hera:
/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data
On Jet:
/lfs4/BMC/wrfruc/FV3-LAM/model_data
On Orion:
/work/noaa/fv3-cam/UFS_SRW_app/v1p0/model_data
On Gaea:
/lustre/f2/pdata/esrl/gsd/ufs/ufs-srw-release-v1.0.0/staged_extrn_mdl_files
On WCOSS Cray:
/gpfs/hps3/emc/meso/noscrub/UFS_SRW_App/model_data
On WCOSS Dell:
/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/model_data
If you are not on a pre-configured platform, you can download the sample data file from one of two locations provided below.
AWS S3 bucket:
https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/ic/gst_model_data.tar
EMC ftp site:
https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/gst_model_data.tar.gz
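For example, you might download and unpack the GST data as follows. The destination directory is your choice, and the directory layout inside the tar file is an assumption here, so point the config.sh settings described later at wherever the FV3GFS files actually land:
mkdir -p /path-to/model_data
cd /path-to/model_data
wget https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/ic/gst_model_data.tar
tar -xf gst_model_data.tar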
Once the data are staged, the path will need to be set in your config.sh, as described below in Section 5.
Instructions for loading the proper modules and/or setting the correct environment variables can be found in the env/ directory, in files named build_<platform>_<compiler>.env. The commands in those files can be copy-pasted directly to the command line, or the file can be sourced. You may need to modify certain variables, such as the path to the NCEP libraries for your individual platform, or use setenv rather than export depending on your shell:
ls -l env/
-rw-rw-r-- 1 user ral 466 Jan 21 08:41 build_cheyenne_intel.env
-rw-rw-r-- 1 user ral 461 Jan 21 08:41 build_hera_intel.env
-rw-rw-r-- 1 user ral 543 Jan 21 08:41 build_jet_intel.env
...
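For example, on a bash shell you might source the file that matches your platform and compiler (file name taken from the listing above; substitute your own platform):
source env/build_hera_intel.env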
From the top-level ufs-srweather-app directory, create a build directory and change into it:
mkdir build
cd build
Run cmake to set up the Makefile, then run make:
cmake .. -DCMAKE_INSTALL_PREFIX=..
make -j4 >& build.out &
If this step is successful, there should be twelve executables in ufs-srweather-app/bin, including the model executable NEMS.exe.
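As a quick sanity check from the build directory (this assumes the -DCMAKE_INSTALL_PREFIX=.. used above, so the executables are installed one level up):
ls ../bin    # expect twelve executables, including NEMS.exe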
Next, change to the regional_workflow/ush directory and copy the community configuration template:
cd ../regional_workflow/ush
cp config.community.sh config.sh
Edit config.sh to set MACHINE to the machine you are running on, ACCOUNT to an account you can charge, and EXPT_SUBDIR to the name of the experiment:
MACHINE="your machine, e.g., hera or cheyenne"
ACCOUNT="your account"
EXPT_SUBDIR="my_expt_name"
If you have access to the NOAA HPSS from the machine you are running on, those changes should be sufficient; however, if that is not the case (for example, on Cheyenne), or if you have pre-staged the initialization data, the following parameters should also be set:
USE_USER_STAGED_EXTRN_FILES="TRUE"
EXTRN_MDL_SOURCE_BASEDIR_ICS="/path-to/model_data/FV3GFS"
EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" )
EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path-to/model_data/FV3GFS"
EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" "gfs.pgrb2.0p25.f018" "gfs.pgrb2.0p25.f024" "gfs.pgrb2.0p25.f030" "gfs.pgrb2.0p25.f036" "gfs.pgrb2.0p25.f042" "gfs.pgrb2.0p25.f048" )
For pre-configured machines, see Section 2 (Prepare the build) for the locations of the pre-staged IC/LBC data for the GST case; set path-to/model_data above to the appropriate path.
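For example, on Hera the base directories might point at the pre-staged data listed in Section 2 (the FV3GFS subdirectory is assumed from the template above; confirm it exists on your system):
EXTRN_MDL_SOURCE_BASEDIR_ICS="/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS"
EXTRN_MDL_SOURCE_BASEDIR_LBCS="/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS"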
You will need to load the appropriate Python environment for the workflow. The workflow requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python environment has already been set up on Level 1 platforms and can be activated in the following way:
source ufs-srweather-app/env/wflow_<platform>.env
Then generate the workflow:
./generate_FV3LAM_wflow.sh
If the Rocoto software is available on your platform, you can follow the steps in this section to run the workflow. If Rocoto is not available, the workflow can be run using the stand-alone scripts described in the separate documentation on running the workflow without Rocoto.
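The experiment directory ($EXPTDIR) is created by generate_FV3LAM_wflow.sh. If EXPTDIR is not already set in your shell, you can set it by hand; a minimal sketch, assuming the default layout where experiments are created under an expt_dirs directory named after your EXPT_SUBDIR:
export EXPTDIR=/path-to/expt_dirs/my_expt_name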
Go to the experiment directory:
cd $EXPTDIR
Then run the workflow using the launch script:
./launch_FV3LAM_wflow.sh
or by calling Rocoto manually:
rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
To check the progress of the workflow using Rocoto:
rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
For automatic re-submission of the workflow (every 3 minutes), the following line can be added to the user's crontab (use "crontab -e" to edit the cron table; examples are for Cheyenne):
*/3 * * * * cd /glade/p/ral/jntp/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 && /glade/p/ral/jntp/tools/rocoto/rocoto-1.3.1/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
-or-
*/3 * * * * cd /glade/p/ral/jntp/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 && ./launch_FV3LAM_wflow.sh
If you want to turn off entries in the crontab, you can comment them out using a "#" at the beginning of each line.
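For example, the launch-script entry shown above would be disabled like this:
# */3 * * * * cd /glade/p/ral/jntp/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 && ./launch_FV3LAM_wflow.sh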
The Python plotting scripts are located in the ufs-srweather-app/regional_workflow/ush/Python directory. The plot_allvars.py script plots the output from a single run, while the plot_allvars_diff.py script plots the difference between two runs. If you are plotting the difference, the runs must be on the same domain and available for the same forecast hours.
To generate the Python plots, first change to the plotting directory:
cd ufs-srweather-app/regional_workflow/ush/Python
The appropriate environment will need to be loaded to run the scripts, which require Python 3 with the pygrib and cartopy packages. This Python environment has already been set up on Level 1 platforms and can be activated in the following way. (Note: if you are using the batch submission scripts, the environments are set for you, and you do not need to activate them on the command line before running the script; see further instructions below.)
On Cheyenne:
module load ncarenv
ncar_pylib /glade/p/ral/jntp/UFS_SRW_app/ncar_pylib/python_graphics
On Hera and Jet:
module use -a /contrib/miniconda3/modulefiles
module load miniconda3
conda activate pygraf
On Orion:
module use -a /apps/contrib/miniconda3-noaa-gsl/modulefiles
module load miniconda3
conda activate pygraf
On Gaea:
module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles
module load miniconda3/4.8.3-regional-workflow
To run the Python plotting script for a single run, six command line arguments are required:
- Cycle date/time in YYYYMMDDHH format
- Starting forecast hour in HHH format
- Ending forecast hour in HHH format
- Forecast hour increment in HHH format
- EXPT_DIR: Experiment directory, where post-processed data are found under EXPT_DIR/YYYYMMDDHH/postprd
- CARTOPY_DIR: Base directory of cartopy shapefiles with a file structure of CARTOPY_DIR/shapefiles/natural_earth/cultural/*.shp
An example for plotting output from the default config.sh settings (using the GFSv15p2 suite definition file) is as follows:
python plot_allvars.py 2019061500 6 48 6 /path-to/expt_dirs/test_CONUS_25km_GFSv15p2 /path-to/NaturalEarth
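The differencing script is run in a similar way; a hypothetical invocation, assuming plot_allvars_diff.py takes the same arguments with a second experiment directory added before the Cartopy directory (the argument order here is an assumption, so check the script header for the actual usage):
python plot_allvars_diff.py 2019061500 6 48 6 /path-to/expt_dirs/expt1 /path-to/expt_dirs/expt2 /path-to/NaturalEarth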
The Cartopy shapefiles are available for use on a number of Level 1 platforms in the following locations:
On Cheyenne:
/glade/p/ral/jntp/UFS_SRW_app/tools/NaturalEarth
On Hera:
/scratch2/BMC/det/UFS_SRW_app/v1p0/fix_files/NaturalEarth
On Jet:
/lfs4/BMC/wrfruc/FV3-LAM/NaturalEarth
On Orion:
/work/noaa/gsd-fv3-dev/UFS_SRW_App/v1p0/fix_files/NaturalEarth
On Gaea:
/lustre/f2/pdata/esrl/gsd/ufs/NaturalEarth
If the Python scripts are being used to create plots of multiple forecast lead times and forecast variables, then they should be submitted through the batch system using one of the following scripts.
On Hera, Jet, Orion, and Gaea:
sbatch sq_job.sh
On Cheyenne:
qsub qsub_job.sh
If the batch script is being used, multiple environment variables (HOMErrfs, plus EXPTDIR for a single run or EXPTDIR1 and EXPTDIR2 for a difference plot) need to be set prior to submitting the script. For a single run:
setenv HOMErrfs /path-to/ufs-srweather-app/regional_workflow or export HOMErrfs=/path-to/ufs-srweather-app/regional_workflow
setenv EXPTDIR /path-to/EXPTDIR or export EXPTDIR=/path-to/EXPTDIR
For differencing two runs:
setenv HOMErrfs /path-to/ufs-srweather-app/regional_workflow or export HOMErrfs=/path-to/ufs-srweather-app/regional_workflow
setenv EXPTDIR1 /path-to/EXPTDIR1 or export EXPTDIR1=/path-to/EXPTDIR1
setenv EXPTDIR2 /path-to/EXPTDIR2 or export EXPTDIR2=/path-to/EXPTDIR2
In addition, the following variables can be modified in the batch script depending on your needs. For example, to plot hourly forecast output, set FCST_INC to 1; to plot only a subset of your model output, set FCST_START, FCST_END, and FCST_INC accordingly:
export CDATE=${DATE_FIRST_CYCL}${CYCL_HRS}
export FCST_START=6
export FCST_END=${FCST_LEN_HRS}
export FCST_INC=6
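For example, to plot hourly output for just the first 12 forecast hours, the block above might be changed to (values chosen for illustration):
export FCST_START=1
export FCST_END=12
export FCST_INC=1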
The output files (.png format) will be located in the experiment directory (EXPT_DIR) under the YYYYMMDDHH/postprd directory.
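For example, for the default experiment and cycle used in the plotting example above, the generated plots could be listed with (paths assumed from that example):
ls /path-to/expt_dirs/test_CONUS_25km_GFSv15p2/2019061500/postprd/*.png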
For more detailed information on the application, see the UFS Short-Range Weather App Users Guide.