# Jobs overview

The jobs to be run in the workflow are defined in a set of configuration files stored in `conf/common/jobs/`. Including a file in the configuration by uncommenting the respective line in `conf/.yml` adds its jobs to the workflow.

## Base workflow

### BUILD_ICON

This job builds the ICON model. It clones the model git repository according to the values stored in the `ICON` section of `conf/common/build.yml`. Afterwards, the model is configured with the script specified by `ICON.CONFIG_FILE` (usually set in the platform-specific config file), then built and installed. Details on how to configure the job can be found on the [build page](/Userguide/Building-ICON.md).

### BUILD_PYTHON

This job sets up a Python virtual environment and installs all required packages into it. A few configuration options are available in `conf/common/build.yml`.

### PREPARE_EXPERIMENT

This job prepares the remote HPC system for the current experiment. It copies all custom Python modules and the core experiment configuration file `/experiment.yml` to the HPC system.

### PRE_FIND_FILES

This job locates all input files on the system. For details, see the page on [input files](/Userguide/Input-data.md).

### PREPARE_MEMBER

This job creates the working/output directory and links all input files there. The input files are specified in the `DIRECTORIES.LINK_FILES` section of `/experiment.yml`; see the comments there for details and the [step-by-step guide](/Introductory-guides/Step-by-step-guide.md). By default, the job requires the working directory to be non-existent or empty so that nothing is overwritten. This behavior can be altered in `conf/art/simulation.yml` with the parameter `SIMULATION.REQUIRE_CLEAN_OUTDIR` (a configuration sketch is shown further down this page).

### PREPARE_NAMELIST

This job creates the namelist for the experiment (for a specific member and chunk, if applicable). For details, see the [namelist page](/Userguide/Namelists.md).

### RUN_ICON

This job runs the actual simulation for a single chunk. It uses a so-called `envmodules.sh` file (platform-specific, in `templates/common/platforms/`) to set up the environment required for the run on a specific HPC system.

## Additional jobs

### TRANSFER_PROJECT

This job is only required for the cases `real-from-ideal` and `real-from-dwd-ana`; it transfers _auto-icon_ to the HPC system.

### CLEAN_RESTART

This job deletes restart files regularly to save disk space. By default, it leaves the latest three files in place. It has to be included manually (in the include file or via the init script).

:::{note}
Only available for the `art` ICON case.
:::

### TRANSFER

With this job, one can transfer some or all output data to a different location on the HPC system or to the local platform.

:::{important}
Do not copy any output data to the *auto-icon* server of KIT!
:::

## Preprocessing

### BUILD_DWD_ICON_TOOLS

Build the DWD ICON tools, which are required for some input data remapping tasks.

### PRECHECK_REMAP_ERA2ICON

This job checks for the existence of remapped ERA5 data. If the data is present, the PREP_REMAP_ERA2ICON job can be skipped.

### PREP_REMAP_ERA2ICON / PREP_REMAP_IFS2ICON

This job does the actual remapping of ERA5 or IFS input data to the ICON grid.

### PRE_CREATE_GRID

This job creates grids for the ICON model using the DWD ICON tools grid generator.

### PRE_REMAP

This is a general job to remap input data to the target grid with CDO. It is intended for all non-ERA5 and non-IFS input data.

## Postprocessing

### GET_SAMOA

Pull the SAMOA project from the git repository.
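As a rough illustration of the PREPARE_MEMBER setting referenced above, the overwrite protection might look like the following in `conf/art/simulation.yml`. Only the key path `SIMULATION.REQUIRE_CLEAN_OUTDIR` comes from this page; the value and the surrounding layout are assumptions for illustration and the exact syntax may differ.

```yaml
# Hypothetical excerpt of conf/art/simulation.yml (exact syntax may differ)
SIMULATION:
  # Require the working/output directory to be absent or empty before input files are linked.
  # An assumed value of "false" would allow reusing an existing, non-empty directory.
  REQUIRE_CLEAN_OUTDIR: "true"
```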
### POST_SAMOA

Run SAMOA checks on the output.

### POST_CDO

Do postprocessing with CDO. This can be configured in the `CDO` section of the main experiment configuration file `/experiment.yml`.

### VIS_GLOBAL_2D

Plot some global fields in a 2D plot. Can be configured in `/visualization.yml`.

:::{note}
Only available for the `art` ICON case.
:::

### VIS_ANIMATION_2D

Make an animation of subsequent global field 2D plots. Can be configured in `/visualization.yml`.

:::{note}
Only available for the `art` ICON case.
:::

## Testing

This set of jobs, defined in the file `conf/common/jobs/testcase.yml`, checks the output files against identical files in a reference set. On successful comparison, it deletes all output data. This is especially useful for testing purposes.

### PRE_CLEAN_REMOTE

Clean all files on the remote HPC system. This job prepares a fresh test run.

### POST_VALIDATE_BIN_EQUAL_TO_REF

This job does the actual comparison. It uses CDO to compare each pair of files and check whether their content matches exactly. This requires the reference to have been run on the same hardware (how strict this requirement is has not yet been fully established). The location of the reference set is defined via `DIRECTORIES.REFDIR` in `conf/art/experiments/.yml` (a configuration sketch is shown at the end of this section).

### POST_CLEAN_EXPERIMENT

This job deletes all output data. It is intended for cleaning up a test run.
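As a minimal sketch of the testing setup, the reference location used by POST_VALIDATE_BIN_EQUAL_TO_REF might be set like this in the experiment configuration file mentioned above. Only the key path `DIRECTORIES.REFDIR` is taken from the description; the path shown is a placeholder and the surrounding layout is an assumption.

```yaml
# Hypothetical excerpt of the experiment configuration (conf/art/experiments/.yml, see above)
DIRECTORIES:
  # Placeholder path to the reference output set used for the binary comparison.
  REFDIR: /path/to/reference/experiment/output
```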