Commit 54289d89 authored by Nicolas Martin

Clean symbolic links before relocation

../../../cfgs/README.rst
********************************
Run the Reference configurations
********************************
.. todo::

   Lack of illustrations for ref. cfgs, and more generally in the guide.
NEMO is distributed with a set of reference configurations allowing both
users to set up their own first applications and
developers to test/validate NEMO developments (using the SETTE package).
.. contents::
   :local:
   :depth: 1
.. attention::

   Concerning the configurations,
   the NEMO System Team is only in charge of the so-called reference configurations described below.
.. hint::

   Configurations developed by external research projects or initiatives that
   make use of NEMO are welcome to be publicized through the website by
   filling in the form :website:`to add an associated project<projects/add>`.
How to compile an experiment from a reference configuration
===========================================================
To compile the ORCA2_ICE_PISCES_ reference configuration using :file:`makenemo`,
one should use the following command, selecting among the available architecture files or
providing a user-defined one:

.. code-block:: console

   $ ./makenemo -r 'ORCA2_ICE_PISCES' -m 'my_arch' -j '4'

A new ``EXP00`` folder will be created within the selected reference configurations,
namely ``./cfgs/ORCA2_ICE_PISCES/EXP00``.
It will be necessary to uncompress the archives listed in the table below for
the given reference configuration, which include its input & forcing files.
Then the execution of the model can be launched through a runscript
(suitably adapted to the user's system).
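Putting these steps together, a typical session might look like the following sketch
(archive locations, the processor count and the MPI launcher are illustrative and
depend on your system):

.. code-block:: console

   $ ./makenemo -r 'ORCA2_ICE_PISCES' -m 'my_arch' -j '4'
   $ cd cfgs/ORCA2_ICE_PISCES/EXP00
   $ tar xvf /path/to/ORCA2_ICE_v4.0.tar        # uncompress the input & forcing archives
   $ tar xvf /path/to/INPUTS_PISCES_v4.0.tar
   $ mpirun -np 4 ./nemo                        # normally wrapped in a runscript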
List of Configurations
======================
All forcing files listed in the table below are available from |DOI data|_.
=================== === === === === === ==================================
Configuration       Component(s)        Archives (input & forcing files)
------------------- ------------------- ----------------------------------
Name                O   S   T   P   A
=================== === === === === === ==================================
AGRIF_DEMO_         X   X           X   AGRIF_DEMO_v4.0.tar,
                                        ORCA2_ICE_v4.0.tar
AMM12_              X                   AMM12_v4.0.tar
C1D_PAPA_           X                   INPUTS_C1D_PAPA_v4.0.tar
GYRE_BFM_           X       X           *none*
GYRE_PISCES_        X       X   X       *none*
ORCA2_ICE_PISCES_   X   X   X   X       ORCA2_ICE_v4.0.tar,
                                        INPUTS_PISCES_v4.0.tar
ORCA2_OFF_PISCES_           X   X       ORCA2_OFF_v4.0.tar,
                                        INPUTS_PISCES_v4.0.tar
ORCA2_OFF_TRC_              X           ORCA2_OFF_v4.0.tar
ORCA2_SAS_ICE_          X               ORCA2_ICE_v4.0.tar,
                                        INPUTS_SAS_v4.0.tar
SPITZ12_            X   X               SPITZ12_v4.0.tar
=================== === === === === === ==================================

.. admonition:: Legend for component combination

   O for OCE, S for SI\ :sup:`3`, T for TOP, P for PISCES and A for AGRIF

AGRIF_DEMO
----------
``AGRIF_DEMO`` is based on the ``ORCA2_ICE_PISCES`` global configuration at 2° of resolution with
the inclusion of 3 online nested grids to demonstrate the overall capabilities of AGRIF in
a realistic context (including the nesting of sea ice models).
The configuration includes a 1:1 grid in the Pacific and two successively nested grids with
odd and even refinement ratios over the Arctic ocean,
with the finest grid spanning the whole Svalbard archipelago that is of
particular interest to test sea ice coupling.

.. image:: _static/AGRIF_DEMO_no_cap.jpg
   :scale: 66%
   :align: center

The 1:1 grid can be used alone as a benchmark to check that
the model solution is not corrupted by grid exchanges.
Note that, since grids interact only at the baroclinic time level,
numerically exact results cannot be achieved in the 1:1 case.
Perfect reproducibility is obtained only by switching to a fully explicit setup instead of
the split-explicit free surface scheme.
AMM12
-----
``AMM12`` stands for *Atlantic Margin Model at 12 km*: a regional configuration covering
the Northwest European Shelf domain on
a regular horizontal grid of ~12 km resolution (see :cite:`ODEA2012`).

.. image:: _static/AMM_domain.png
   :align: center

This configuration allows one to test several NEMO features specifically designed for the shelf seas.
In particular, ``AMM12`` uses a vertical s-coordinate system, the GLS turbulence scheme and
tidal lateral boundary conditions using a Flather scheme (see more in ``BDY``).
Boundaries may be completely omitted by setting ``ln_bdy = .false.`` in ``nambdy``.
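As a sketch of the relevant namelist fragment (only ``ln_bdy`` is taken from the text above;
check the distributed :file:`namelist_ref` for the full set of ``nambdy`` options):

.. code-block:: fortran

   &nambdy              !  unstructured open boundaries
      ln_bdy = .false.  ! =F to omit the open boundaries completely
   /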
Sample surface fluxes, river forcing and an initial restart file are included to test a realistic model run
(``AMM12_v4.0.tar``).
Note that the Baltic boundary is included within the river input file and is specified as a river source,
but unlike ordinary river points the Baltic inputs also include salinity and temperature data.
C1D_PAPA
--------

.. figure:: _static/Papa2015.jpg
   :height: 225px
   :align: left

``C1D_PAPA`` is a 1D configuration for the `PAPA station`_ located in
the north-eastern Pacific Ocean at 50.1°N, 144.9°W.
See :gmd:`Reffray et al. (2015) <8/69/2015>` for the description of
its physical and numerical turbulent-mixing behaviour.
| The water column setup, called NEMO1D, is activated by
setting ``ln_c1d = .true.`` in ``namdom`` and
has a horizontal domain of 1x1 grid point.
| This reference configuration uses a 75-level vertical grid (1 m at the surface),
  the GLS turbulence scheme with k-epsilon closure and the NCAR bulk formulae.
Data provided with the ``INPUTS_C1D_PAPA_v4.2.tar`` file comprise:
- :file:`forcing_PAPASTATION_1h_y201[0-1].nc`:
  ECMWF operational analysis atmospheric forcing rescaled to 1h
  (with long and short waves flux correction) for years 2010 and 2011
- :file:`init_PAPASTATION_m06d15.nc`: Initial Conditions from
  observed data and Levitus 2009 climatology
- :file:`chlorophyll_PAPASTATION.nc`: surface chlorophyll file from Seawifs data
GYRE_BFM
--------
``GYRE_BFM`` shares the same physical setup as GYRE_PISCES_,
but NEMO is coupled with the `BFM`_ biogeochemical model, as described in ``./cfgs/GYRE_BFM/README``.
GYRE_PISCES
-----------
``GYRE_PISCES`` is an idealized configuration representing a Northern hemisphere double-gyre system,
in the beta-plane approximation with a regular 1° horizontal resolution and 31 vertical levels,
with the PISCES BGC model :cite:`gmd-8-2465-2015`.
Analytical forcings for the heat, freshwater and wind-stress fields are applied.
This configuration also acts as a demonstrator of the **user defined setup**
(``ln_read_cfg = .false.``): grid settings are handled through
the ``&namusr_def`` controls in :file:`namelist_cfg`:

.. literalinclude:: ../../../cfgs/GYRE_PISCES/EXPREF/namelist_cfg
   :language: fortran
   :lines: 35-41

Note that the default grid size is 30x20 grid points (with ``nn_GYRE = 1``) and
the number of vertical levels is set by ``jpkglo``.
The specific code changes can be inspected in :file:`./src/OCE/USR`.
.. rubric:: Running GYRE as a benchmark
| This simple configuration can be used as a benchmark, since it is easy to increase resolution,
  with the drawback of getting results that have very limited physical meaning.
| The GYRE grid resolution can be increased at runtime by setting a different value of ``nn_GYRE``
  (an integer multiplier scaling factor), as described in the following table:
=========== ============ ============ ============ ===============
``nn_GYRE`` ``jpiglo``   ``jpjglo``   ``jpkglo``   Equivalent to
=========== ============ ============ ============ ===============
1           30           20           31           GYRE 1°
25          750          500          101          ORCA 1/2°
50          1500         1000         101          ORCA 1/4°
150         4500         3000         101          ORCA 1/12°
200         6000         4000         101          ORCA 1/16°
=========== ============ ============ ============ ===============
| Note that it is necessary to set ``ln_bench = .true.`` in ``&namusr_def`` to
  avoid problems in the physics computation, and that
  the model timestep should be adequately rescaled.
| For example, if ``nn_GYRE = 150``, equivalent to an ORCA 1/12° grid,
  the timestep ``rn_rdt`` should be set to 1200 seconds.
Unlike previous versions of NEMO, the code uses the time-splitting scheme by default and
internally computes the number of sub-steps.
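Putting the benchmark settings above together, an ORCA 1/12°-equivalent run would use
a :file:`namelist_cfg` along these lines (a sketch only; the values are taken from the table
and the text above, and all other parameters keep their reference values):

.. code-block:: fortran

   &namusr_def    !  GYRE user defined namelist
      nn_GYRE  = 150      ! multiplier scaling factor (ORCA 1/12 deg equivalent)
      jpkglo   = 101      ! number of vertical levels
      ln_bench = .true.   ! mandatory to avoid problems in the physics computation
   /
   &namdom        !  time and space domain
      rn_rdt   = 1200.    ! rescaled model timestep [s]
   /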
ORCA2_ICE_PISCES
----------------
``ORCA2_ICE_PISCES`` is a reference configuration for the global ocean with
a 2°x2° curvilinear horizontal mesh and 31 vertical levels,
distributed using a z-coordinate system with 10 levels in the top 100 m.
ORCA is the generic name given to the global ocean Mercator mesh
(i.e. the meridional scale factor varies as the cosine of the latitude),
with two poles in the northern hemisphere so that
the ratio of anisotropy is nearly one everywhere.
This configuration uses three components:

- |OCE|, the ocean dynamical core
- |ICE|, the thermodynamic-dynamic sea ice model
- |MBG|, the passive tracer transport module and PISCES BGC model :cite:`gmd-8-2465-2015`

All components share the same grid.
The model is forced with CORE-II normal year atmospheric forcing and
it uses the NCAR bulk formulae.
.. rubric:: Ocean Physics
:horizontal diffusion on momentum:
   the eddy viscosity coefficient depends on the geographical position.
   It is taken as 40000 m\ :sup:`2`/s, reduced in the equatorial region (2000 m\ :sup:`2`/s)
   except near the western boundaries.
:isopycnal diffusion on tracers:
   the diffusion acts along the isopycnal surfaces (neutral surfaces) with
   an eddy diffusivity coefficient of 2000 m\ :sup:`2`/s.
:Eddy induced velocity parametrization:
   with a coefficient that depends on the growth rate of baroclinic instabilities
   (it usually varies from 15 m\ :sup:`2`/s to 3000 m\ :sup:`2`/s).
:lateral boundary conditions:
   Zero fluxes of heat and salt and no-slip conditions are applied through lateral solid boundaries.
:bottom boundary condition:
   Zero fluxes of heat and salt are applied through the ocean bottom.
   The Beckmann [19XX] simple bottom boundary layer parameterization is applied along
   continental slopes.
   A linear friction is applied on momentum.
:convection:
   the vertical eddy viscosity and diffusivity coefficients are increased to 1 m\ :sup:`2`/s in
   case of static instability.
:time step:
   5400 s (1h30') so that there are 16 time steps in one day.
ORCA2_OFF_PISCES
----------------
``ORCA2_OFF_PISCES`` shares the same general offline configuration as ``ORCA2_OFF_TRC``,
but only the PISCES model is an active component of TOP.
ORCA2_OFF_TRC
-------------
| ``ORCA2_OFF_TRC`` is based on the ORCA2 global ocean configuration
  (see ORCA2_ICE_PISCES_ for a general description) along with
  the passive tracer transport module (TOP),
  but dynamical fields are pre-calculated and read with a specific time frequency.
| This enables offline coupling of TOP components,
  here specifically inert tracers (CFC11, CFC12, SF6, C14) and the water age module (age).
See :file:`namelist_top_cfg` to inspect the selection of
each component with the dedicated logical keys.
Pre-calculated dynamical fields are provided to NEMO through
the ``&namdta_dyn`` namelist in :file:`namelist_cfg`,
in this case with a 5-day frequency (120 hours):

.. literalinclude:: ../../namelists/namdta_dyn
   :language: fortran

Input dynamical fields for this configuration (:file:`ORCA2_OFF_v4.0.tar`) come from
a 2000-year-long climatological simulation of ORCA2_ICE using ERA40 atmospheric forcing.
| Note that this configuration by default uses a linear free surface (``ln_linssh = .true.``),
  assuming that the model mesh is not varying in time, and
  it includes the bottom boundary layer parameterization (``ln_trabbl = .true.``), which
  requires the provision of BBL coefficients through the ``sn_ubl`` and ``sn_vbl`` fields.
| It is also possible to activate PISCES model (see ``ORCA2_OFF_PISCES``) or
a user defined set of tracers and source-sink terms with ``ln_my_trc = .true.``
(and adaptation of ``./src/TOP/MY_TRC`` routines).
In addition, the offline module (OFF) allows for the provision of further fields:
1. **River runoff** can be provided to TOP components by setting ``ln_dynrnf = .true.`` and
   by including an input datastream similar to the following:

   .. code-block:: fortran

      sn_rnf = 'dyna_grid_T', 120, 'sorunoff' , .true., .true., 'yearly', '', '', ''

2. **VVL dynamical fields**: in the case input data were produced by a dynamical core using
   variable volume (``ln_linssh = .false.``),
   it is also necessary to provide the divergence and E-P fields at the before time step by
   including input datastreams similar to the following:

   .. code-block:: fortran

      sn_div  = 'dyna_grid_T', 120, 'e3t'      , .true., .true., 'yearly', '', '', ''
      sn_empb = 'dyna_grid_T', 120, 'sowaflupb', .true., .true., 'yearly', '', '', ''

More details can be found by inspecting the offline data manager in
the routine :file:`./src/OFF/dtadyn.F90`.
ORCA2_SAS_ICE
-------------
| ``ORCA2_SAS_ICE`` is a demonstrator of the Stand-Alone Surface (SAS) module and
  relies on the ORCA2 global ocean configuration (see ORCA2_ICE_PISCES_ for a general description).
| The standalone surface module allows surface elements such as sea-ice, iceberg drift, and
surface fluxes to be run using prescribed model state fields.
It can profitably be used to compare different bulk formulae or
adjust the parameters of a given bulk formula.
More information about SAS can be found in the :doc:`NEMO manual <cite>`.
SPITZ12
-------
``SPITZ12`` is a regional configuration around the Svalbard archipelago
at 1/12° of horizontal resolution and 75 vertical levels.
See :gmd:`Rousset et al. (2015) <8/2991/2015>` for more details.
This configuration is set up for the year 2002,
with atmospheric forcing provided every 2 hours using the NCAR bulk formulae,
while lateral boundary conditions for the dynamical fields have a 3-day time frequency.
.. rubric:: References

.. bibliography:: cfgs.bib
   :all:
   :style: unsrt
   :labelprefix: C

../../../cfgs/SHARED/README.rst
***********
Diagnostics
***********
.. todo::
.. contents::
   :local:
Output of diagnostics in NEMO is usually done using XIOS.
This is an efficient way of writing diagnostics because
the time averaging, file writing and even some simple arithmetic or regridding are carried out in
parallel with the NEMO model run.
This page gives a basic introduction to using XIOS with NEMO.
Much more information is available from the :xios:`XIOS homepage<>` and from the NEMO manual.
Use of XIOS for diagnostics is activated using the pre-compiler key ``key_xios``.
Extracting and installing XIOS
==============================
1. Install the NetCDF4 library.
   If you want to use single file output you will need to compile the HDF & NetCDF libraries to
   allow parallel IO.
2. Download the version of XIOS that you wish to use.
   The recommended version is now XIOS 2.5:

   .. code-block:: console

      $ svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/branchs/xios-2.5

   and follow the instructions in :xios:`XIOS documentation <wiki/documentation>` to compile it.
   If you find problems at this stage, support can be found by subscribing to
   the :xios:`XIOS mailing list <../mailman/listinfo.cgi/xios-users>` and sending a mail message to it.
XIOS Configuration files
------------------------
XIOS is controlled using XML input files that should be copied to
your model run directory before running the model.
Examples of these files can be found in the reference configurations (:file:`./cfgs`).
The XIOS executable expects to find a file called :file:`iodef.xml` in the model run directory.
In NEMO we have made the decision to use include statements in the :file:`iodef.xml` file to include:
- :file:`field_def_nemo-oce.xml` (for physics),
- :file:`field_def_nemo-ice.xml` (for ice),
- :file:`field_def_nemo-pisces.xml` (for biogeochemistry) and
- :file:`domain_def.xml` from the :file:`./cfgs/SHARED` directory.
Most users will not need to modify :file:`domain_def.xml` or :file:`field_def_nemo-???.xml` unless
they want to add new diagnostics to the NEMO code.
The definition of the output files is organized into separate :file:`file_definition.xml` files which
are included in the :file:`iodef.xml` file.
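As an illustration, a minimal file definition requesting 5-day means of a field could look like
the fragment below (ids, names and frequencies are hypothetical; the :file:`file_def_nemo-???.xml`
files distributed with the reference configurations are the authoritative templates):

.. code-block:: xml

   <file_definition type="multiple_file" sync_freq="1d" min_digits="4">
      <file id="file1" name_suffix="_grid_T" output_freq="5d" enabled=".TRUE.">
         <field field_ref="toce" name="votemper" operation="average" />
      </file>
   </file_definition>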
Modes
=====
Detached Mode
-------------
In detached mode the XIOS executable is executed on separate cores from the NEMO model.
This is the recommended method for using XIOS for realistic model runs.
To use this mode set ``using_server`` to ``true`` at the bottom of the :file:`iodef.xml` file:

.. code-block:: xml

   <variable id="using_server" type="boolean">true</variable>

Make sure there is a copy of (or link to) your XIOS executable in the working directory and
allocate processors to XIOS in your job submission script.
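In a typical MPMD job submission this amounts to launching both executables together; for example
(processor counts and launcher syntax are illustrative and system-dependent):

.. code-block:: console

   $ mpirun -np 92 ./nemo : -np 4 ./xios_server.exe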
Attached Mode
-------------
In attached mode XIOS runs on each of the cores used by NEMO.
This method is less efficient than the detached mode but can be more convenient for testing or
with small configurations.
To activate this mode simply set ``using_server`` to ``false`` in the :file:`iodef.xml` file

.. code-block:: xml

   <variable id="using_server" type="boolean">false</variable>

and don't allocate any cores to XIOS.

.. note::

   Due to the different domain decompositions between XIOS and NEMO,
   if the total number of cores is larger than the number of grid points in the ``j`` direction then
   the model run will fail.

Adding new diagnostics
======================
If you want to add a new diagnostic to the NEMO code you will need to do the following:

1. Add any necessary code to calculate your new diagnostic in NEMO.
2. Send the field to XIOS using ``CALL iom_put( 'field_id', variable )`` where
   ``field_id`` is a unique id for your new diagnostic and
   ``variable`` is the Fortran variable containing the data.
   This should be called at every model timestep, regardless of how often you want to output the field.
   No time averaging should be done in the model code.
3. If it is computationally expensive to calculate your new diagnostic,
   you should also use ``iom_use`` to determine whether it is requested in the current model run.
   For example:

   .. code-block:: fortran

      IF( iom_use('field_id') ) THEN
         ! Some expensive computation
         ! ...
         CALL iom_put( 'field_id', variable )
      ENDIF

4. Add a variable definition to the :file:`field_def_nemo-???.xml` file.
5. Add the variable to the :file:`iodef.xml` or :file:`file_definition.xml` file.
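For example, steps 4 and 5 might translate into entries like the following
(the id, grid reference and unit are hypothetical and must match your diagnostic):

.. code-block:: xml

   <!-- step 4: in the appropriate field_def_nemo-???.xml, inside a suitable field_group -->
   <field id="field_id" long_name="My new diagnostic" unit="m/s" grid_ref="grid_T_2D" />

   <!-- step 5: in a file definition included from iodef.xml -->
   <file id="file1" name_suffix="_mydiag" output_freq="1d">
      <field field_ref="field_id" />
   </file>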
../../../README.rst
**Contents**
.. contents::
   :local:
.. _`Former web platform forge`: https://forge.ipsl.jussieu.fr/nemo
.. _`NEMO users' guide`: https://sites.nemo-ocean.io/user-guide
.. _`Migration Guide`: https://sites.nemo-ocean.io/user-guide/migration.html
.. _`Change list`: https://sites.nemo-ocean.io/user-guide/changes.html
.. _`Test case repository`: https://github.com/NEMO-ocean/NEMO-examples
.. _`How to cite`: https://www.nemo-ocean.eu/bibliography/how-to-cite/
.. _`NEMO forums`: https://nemo-ocean.discourse.group
.. _`NEMO newsletter`: https://listes.ipsl.fr/sympa/subscribe/nemo-newsletter
.. _`NEMO publications`: https://www.nemo-ocean.eu/bibliography/publications/add
.. _`NEMO projects`: https://www.nemo-ocean.eu/projects/add
.. _`Special Issue`: https://gmd.copernicus.org/articles/special_issue40.html
.. _`NEMO System Team wiki`: https://forge.nemo-ocean.eu/developers/home/-/wikis/Home
.. _`NEMO ocean engine`: https://zenodo.org/record/1464816
.. _`NEMO Tracers engine`: https://zenodo.org/record/1471700
.. _`NEMO Sea Ice engine`: https://zenodo.org/record/1471689
**Welcome to NEMO home page!**
NEMO (*Nucleus for European Modelling of the Ocean*) is a state-of-the-art modelling
framework for research activities and forecasting services in ocean and climate sciences,
developed in a sustainable way by the NEMO European consortium since 2008.
This page intends to help you to get started using the NEMO platform and to introduce you
to the different levels of information available. It starts here with NEMO release 4.2.0.
Reminder: Our `Former web platform forge`_ (SVN+Trac) contains the previous documentation
and releases made available from the beginning of the project up to NEMO 4.0.
Getting started
===============
Getting your hands on NEMO: the first steps are described in detail in the
`NEMO users' guide`_. This explains how to download the code, build the environment,
create the executable, and perform a first integration.
If you are already using a previous release of NEMO, please refer to the
`Migration Guide`_ which aims to help you to make the move to 4.2.0.
The above user guides cover in detail what is available from GitLab and supported by the NEMO
System Team. Aside from this web platform, a set of test cases is also available from the
`Test case repository`_. These test cases can be useful for students, outreach, and
exploring specific aspects of NEMO with light configurations. The web page also allows you
to submit test cases you have developed and want to share with the community. Feel free to
contribute!
Project documentation
=====================
Reference manuals fully describing NEMO for the three main components

* |OCE| models the ocean {thermo}dynamics and solves the primitive equations (`./src/OCE <./src/OCE>`_)
* |ICE| simulates sea-ice {thermo}dynamics, brine inclusions and subgrid-scale thickness
  variations (`./src/ICE <./src/ICE>`_)
* |MBG| models the {on,off}line oceanic tracers transport and biogeochemical processes
  (`./src/TOP <./src/TOP>`_)

are available from Zenodo:
============ ======================== =====
Component    Reference Manual         DOI
============ ======================== =====
|NEMO-OCE|   `NEMO ocean engine`_     .. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.6334656.svg
                                         :target: https://doi.org/10.5281/zenodo.6334656
|NEMO-ICE|   `NEMO Sea Ice engine`_   *not yet available*
|NEMO-MBG|   `NEMO Tracers engine`_   .. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.1471700.svg
                                         :target: https://doi.org/10.5281/zenodo.1471700
============ ======================== =====
These reference manuals are the publications that should be cited in your own
publications. Please visit `How to cite`_ for details.
New features of the 4.2.0 release are described in the `Change list`_ section of the `NEMO users' guide`_.
Asking questions, and exchanging information
============================================
- Register once and for all and use the `NEMO forums`_ on Discourse to share and discuss with the NEMO community.
- Register once and for all and receive by mail the `NEMO newsletter`_: recommended for all
  users to receive the major announcements from the project (new releases, open meetings and
  main information). Low traffic: about ten messages a year.
Contributing to NEMO visibility: projects and publications
==========================================================
Please help us justify the NEMO development efforts by:

- Adding your publications using NEMO and its outputs to the `NEMO publications`_ page
- Describing your project using NEMO on the `NEMO projects`_ page
NEMO also has a `Special Issue`_ in the open-access journal
Geoscientific Model Development (GMD) of the European Geosciences Union.
Its main scope is to collect relevant manuscripts covering various topics and
to provide a single portal to assess the model potential and evolution.
Contributing to NEMO development
================================
NEMO strives to be written in a way which allows the easy incorporation of developments.
You are welcome to contribute to the development of the NEMO Shared reference. NEMO
development is driven by the NEMO Consortium, which plans and produces NEMO's sustainable
development in order to keep a reliable evolving framework. Development is organised and
scheduled through a five-year development strategy, working groups and the activities of
the development team (named the NEMO System Team) in a yearly workplan. More information is
available on the `NEMO System Team wiki`_.
Disclaimer
==========
The NEMO source code is freely available and distributed under the
`CeCILL v2.0 license <./LICENSE>`_ (GNU GPL compatible).
You can use, modify and/or redistribute the software under its terms,
but users are provided only with a limited warranty, and the software's authors and
successive licensors have only limited liability.
../../../src/OCE/USR/README.rst
******************************
Setting up a new configuration
******************************
.. todo::
.. contents::
   :local:
Starting from an existing configuration
=======================================
There are four options to build a new configuration from an existing one.
Option 1: Duplicate an existing configuration
---------------------------------------------
The NEMO so-called Reference Configurations cover a number of major features for NEMO setup
(global, regional, 1D, using embedded zoom with AGRIF...).
One can create a new configuration by duplicating one of the reference configurations
(``ORCA2_ICE_PISCES`` in the following example):

.. code-block:: console

   $ ./makenemo -n 'ORCA2_ICE_PISCES_MINE' -r 'ORCA2_ICE_PISCES' -m 'my_arch'

Option 2: Duplicate with differences
------------------------------------
Create and compile a new configuration based on a reference configuration
(``ORCA2_ICE_PISCES`` in the following example) but with different pre-processor options.
For this, either add ``add_key`` or ``del_key`` keys as required; e.g.

.. code-block:: console

   $ ./makenemo -n 'ORCA2_ICE_PISCES_MINE' -r 'ORCA2_ICE_PISCES' -m 'my_arch' del_key 'key_xios' add_key 'key_diahth'

Option 3: Use the SIREN tools to subset an existing model
---------------------------------------------------------
Define a regional configuration which is a {sub,super}-set of an existing configuration.
This last option employs the SIREN software tools that are included in the standard distribution.
The software is written in Fortran 95 and available in the :file:`./tools/SIREN` directory.
SIREN allows you to create your own regional configuration embedded in a wider one.
SIREN is a set of programs to create all the input files you need to
run a NEMO regional configuration.
:Demo: Set of GLORYS files (GLObal ReanalYSis on the ORCA025 grid),
   as well as examples of namelists, are available `here`_.
:Doc: :forge:`chrome/site/doc/SIREN/html/index.html`
:Support: Any questions or comments regarding the use of SIREN should be posted in
   :forge:`the corresponding forum <discussion/forum/2>`.
.. _here: https://prodn.idris.fr/thredds/catalog/ipsl_public/rron463/catalog.html
Option 4: Use the nesting tools to create embedded zooms or regional configurations from an existing grid
---------------------------------------------------------------------------------------------------------
(see :download:`NESTING README <../../../tools/NESTING/README>`).
Creating a completely new configuration
=======================================
From NEMO version 4.0 there are two ways to build configurations from scratch.
The appropriate method to use depends largely on the target configuration.
Option 1 is for more complex/realistic global or regional configurations and
option 2 is intended for simpler, idealised configurations whose
domains and characteristics can be described with simple geometries and formulae.
Option 1: Create and use a domain configuration file
----------------------------------------------------
This method is used by each of the reference configurations,
so downloading the input files linked to their descriptions can help.
Although starting from scratch,
it is advisable to create the directory structure to house your new configuration by
duplicating the closest reference configuration to your target application.
For example, if your application requires both sea ice and passive tracers,
then use ``ORCA2_ICE_PISCES`` as a template,
and execute the following command to build your ``MY_NEW_CONFIG`` configuration:

.. code-block:: console

   $ ./makenemo -n 'MY_NEW_CONFIG' -r 'ORCA2_ICE_PISCES' -m 'my_arch'

where ``MY_NEW_CONFIG`` can be substituted with
a suitably descriptive name for your new configuration.
The purpose of this step is simply to create and populate the appropriate :file:`WORK`,
:file:`MY_SRC` and :file:`EXP00` subdirectories for your new configuration.
Other choices for the base reference configuration might be
:GYRE: If your target application is ocean-only
:AMM12: If your target application is regional with open boundaries
All the domain information for your new configuration will be contained within
a netcdf file called :file:`domain_cfg.nc` which you will need to create and
place in the :file:`./cfgs/MY_NEW_CONFIG/EXP00` sub-directory.
Firstly though, ensure that your configuration is set to use such a file by checking that

.. code-block:: fortran

   ln_read_cfg = .true.

in :file:`./cfgs/MY_NEW_CONFIG/EXP00/namelist_cfg`
Create the :file:`domain_cfg.nc` file, which must contain the following fields:

.. code-block:: c

   /* configuration name, configuration resolution */
   int    ORCA, ORCA_index
   /* lateral global domain b.c. */
   int    Iperio, Jperio, NFoldT, NFoldF
   /* flags for z-coord, z-coord with partial steps and s-coord */
   int    ln_zco, ln_zps, ln_sco
   /* flag for ice shelf cavities */
   int    ln_isfcav
   /* geographic position */
   double glamt, glamu, glamv, glamf
   /* geographic position */
   double gphit, gphiu, gphiv, gphif
   /* Coriolis parameter (if not on the sphere) */
   double iff, ff_f, ff_t
   /* horizontal scale factors */
   double e1t, e1u, e1v, e1f
   /* horizontal scale factors */
   double e2t, e2u, e2v, e2f
   /* U and V surfaces (if grid size reduction in some straits) */
   double ie1e2u_v, e1e2u, e1e2v
   /* reference vertical scale factors at T and W points */
   double e3t_1d, e3w_1d
   /* vertical scale factors 3D coordinate at T,U,V,F and W points */
   double e3t_0, e3u_0, e3v_0, e3f_0, e3w_0
   /* vertical scale factors 3D coordinate at UW and VW points */
   double e3uw_0, e3vw_0
   /* last wet T-points, 1st wet T-points (for ice shelf cavities) */
   int    bottom_level, top_level

There are two options for creating a :file:`domain_cfg.nc` file:
- Users can use tools of their own choice to build a :file:`domain_cfg.nc` with all mandatory fields.
- Users can adapt and apply the supplied tool available in :file:`./tools/DOMAINcfg`.
This tool is based on code extracted from NEMO version 3.6 and will allow similar choices for
the horizontal and vertical grids that were available internally to that version.
See :ref:`tools <DOMAINcfg>` for details.
Option 2: Adapt the usr_def configuration module of NEMO for your own purposes
--------------------------------------------------------------------------------
This method is intended for easily configuring simple/idealised setups, which
are often used as demonstrators or for process evaluation and comparison.
It can be used whenever the domain geometry has a simple mathematical description and
the ocean initial state and boundary forcing are described analytically.
As a start, consider the case of starting a completely new ocean-only test case based on
the ``LOCK_EXCHANGE`` example.

.. note::

   We probably need an even more basic example than this, with only one namelist and
   minimal changes to the usrdef modules.

Firstly, construct the directory structure, starting in the :file:`cfgs` directory:
.. code-block:: console
$ ./makenemo -n 'MY_NEW_TEST' -t 'LOCK_EXCHANGE' -m 'my_arch'
where the ``-t`` option has been used to locate the new configuration in
the :file:`tests` subdirectory
(it is recommended practice to keep full configurations and idealised cases clearly distinguishable).
This command will create (amongst others) the following files and directories::

   ./tests/MY_NEW_TEST:
   BLD  EXP00  MY_SRC  WORK  cpp_MY_NEW_TEST.fcm

   ./tests/MY_NEW_TEST/EXP00:
   context_nemo.xml  domain_def_nemo.xml  field_def_nemo-oce.xml  file_def_nemo-oce.xml  iodef.xml
   namelist_cfg  namelist_ref

   ./tests/MY_NEW_TEST/MY_SRC:
   usrdef_hgr.F90  usrdef_nam.F90  usrdef_zgr.F90  usrdef_istate.F90  usrdef_sbc.F90  zdfini.F90
The key to setting up an idealised configuration lies in
adapting a small set of short Fortran 90 modules, which
should be dropped into the :file:`MY_SRC` directory.
The ``LOCK_EXCHANGE`` example uses five such routines, but the full set available in
the :file:`src/OCE/USR` directory is::
./src/OCE/USR:
usrdef_closea.F90 usrdef_fmask.F90 usrdef_hgr.F90 usrdef_istate.F90
usrdef_nam.F90 usrdef_sbc.F90 usrdef_zgr.F90
Before discussing these in more detail, it is worth noting the various namelist controls that
engage the different user-defined aspects.
These controls are either set using two new logical switches or implied by the settings of existing ones.
For example, the mandatory requirement for an idealised configuration is to provide routines which
define the horizontal and vertical domains.
Templates for these are provided in the :file:`usrdef_hgr.F90` and :file:`usrdef_zgr.F90` modules.
The application of these modules is activated whenever:
.. code-block:: fortran
ln_read_cfg = .false.
in any configuration's :file:`namelist_cfg` file.
This setting also activates the reading of an optional ``&nam_usrdef`` namelist which can be used to
supply configuration specific settings.
These need to be declared and read in the :file:`usrdef_nam.F90` module.
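For illustration, a minimal :file:`namelist_cfg` fragment combining these switches might look as
follows. This is a sketch based on the GYRE template: the user-defined block is spelled
``&namusr_def`` there, and the parameter names shown are illustrative only; each configuration
declares and reads its own parameters in :file:`usrdef_nam.F90`.

.. code-block:: fortran

   &namcfg        !   parameters of the configuration
      ln_read_cfg = .false.   ! domain built by the usrdef_* modules, not read from domain_cfg.nc
   /
   &namusr_def    !   GYRE-style user-defined parameters (illustrative)
      nn_GYRE     =     1     ! GYRE resolution multiplier
      jpkglo      =    31     ! number of model levels
   /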
Another explicit control is available in the ``&namsbc`` namelist, which
activates the use of analytical forcing when setting:
.. code-block:: fortran
ln_usr = .true.
Other usrdef modules are activated by less explicit means.
For example, code in :file:`usrdef_istate.F90` is used to
define initial temperature and salinity fields if
.. code-block:: fortran
ln_tsd_init = .false.
in the ``&namtsd`` namelist.
The remaining modules, namely :file:`usrdef_closea.F90` and :file:`usrdef_fmask.F90`, are specific to
ORCA configurations and set local variations of some specific fields for
the various resolutions of the global models.
They do not need to be considered here in the context of idealised cases, but
it is worth noting that all configuration-specific code has now been isolated in the usrdef modules.
These last two modules are activated only if an ORCA configuration is detected.
Currently,
this requires a specific integer variable named ``ORCA`` to be set in a :file:`domain_cfg.nc` file.
.. note::
This would be less confusing if the ``cn_cfg`` string were read directly as
a character attribute from the :file:`domain_cfg.nc` file.
So, in most cases, the set-up of idealised model configurations can be completed by
copying the template routines from :file:`./src/OCE/USR` into
your new :file:`./tests/MY_NEW_TEST/MY_SRC` directory and
editing the appropriate modules as needed.
The default set is the one used for the GYRE reference configuration.
The contents of :file:`MY_SRC` directories from other idealised configurations may provide
more convenient templates if they share common characteristics with your target application.
Whatever the starting point,
it should not require many changes or additional lines of code to produce routines in
:file:`MY_SRC` that define analytically the domain,
the initial state and the surface boundary conditions for your new configuration.
To summarize, the base set of modules is:
:usrdef_hgr.F90: Define horizontal grid
:usrdef_zgr.F90: Define vertical grid
:usrdef_sbc.F90: Provides at each time-step the surface boundary condition,
i.e. the momentum, heat and freshwater fluxes
:usrdef_istate.F90: Defines initialization of the dynamics and tracers
:usrdef_nam.F90: Configuration-specific namelist processing to
set any associated run-time parameters
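As an illustration of how little code such a module may need, here is a sketch of an idealised
:file:`usrdef_istate.F90` body: an ocean at rest with uniform salinity and an exponentially
decaying temperature profile. The argument list is modelled on the version 4.0 GYRE template and
may differ between releases; check :file:`./src/OCE/USR/usrdef_istate.F90` in your own source tree.

.. code-block:: fortran

   SUBROUTINE usr_def_istate( pdept, ptmask, pts, pu, pv, pssh )
      REAL(wp), DIMENSION(jpi,jpj,jpk)     , INTENT(in   ) ::   pdept    ! depth of t-points [m]
      REAL(wp), DIMENSION(jpi,jpj,jpk)     , INTENT(in   ) ::   ptmask   ! t-point land/ocean mask
      REAL(wp), DIMENSION(jpi,jpj,jpk,jpts), INTENT(  out) ::   pts      ! T & S fields [Celsius, psu]
      REAL(wp), DIMENSION(jpi,jpj,jpk)     , INTENT(  out) ::   pu       ! i-velocity [m/s]
      REAL(wp), DIMENSION(jpi,jpj,jpk)     , INTENT(  out) ::   pv       ! j-velocity [m/s]
      REAL(wp), DIMENSION(jpi,jpj)         , INTENT(  out) ::   pssh     ! sea surface height [m]
      !
      pu  (:,:,:) = 0._wp                  ! ocean initially at rest
      pv  (:,:,:) = 0._wp
      pssh(:,:)   = 0._wp
      !                                    ! uniform salinity, exponential temperature profile
      pts(:,:,:,jp_sal) = 35._wp * ptmask(:,:,:)
      pts(:,:,:,jp_tem) = ( 2._wp + 10._wp * EXP( -pdept(:,:,:) / 1000._wp ) ) * ptmask(:,:,:)
   END SUBROUTINE usr_def_istate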
with two specialised ORCA modules
(not related to idealised configurations, but used to isolate configuration-specific code that
is used in the ORCA2 reference configuration and in established global configurations using
the ORCA tripolar grid):
:usrdef_fmask.F90: only used in ORCA configurations for
alteration of f-point land/ocean mask in some straits
:usrdef_closea.F90: only used in ORCA configurations for
specific treatments associated with closed seas
From version 4.0, the NEMO release includes a :file:`tests` subdirectory containing available and
up-to-date :doc:`test cases <tests>` built by the community.
These are not fully supported in the way NEMO reference configurations are,
but they should provide a useful source of raw material.
@article{ brodeau.barnier.ea_JPO16,
title = "Climatologically Significant Effects of Some
Approximations in the Bulk Parameterizations of Turbulent
Air–Sea Fluxes",
pages = "5--28",
journal = "Journal of Physical Oceanography",
volume = "47",
number = "1",
author = "Brodeau, Laurent and Barnier, Bernard and Gulev, Sergey K.
and Woods, Cian",
year = "2016",
month = "Dec",
publisher = "American Meteorological Society",
issn = "1520-0485",
doi = "10.1175/jpo-d-16-0169.1"
}
@techreport{ burchard.bolding_trpt02,
title = "GETM, A General Estuarine Transport Model: Scientific
Documentation",
series = "Tech. Rep. EUR 20253 EN",
author = "Burchard, Hans and Bolding, Karsten",
institution = "European Commission",
year = "2002",
month = "01"
}
@article{ haidvogel.beckmann_SESM99,
title = "Numerical Ocean Circulation Modeling",
journal = "Series on Environmental Science and Management",
author = "Haidvogel, Dale B and Beckmann, Aike",
year = "1999",
month = "Apr",
publisher = "IMPERIAL COLLEGE PRESS",
issn = "0219-9793",
isbn = "9781860943935",
doi = "10.1142/p097"
}
@article{ ilıcak.adcroft.ea_OM12,
title = "Spurious dianeutral mixing and the role of momentum
closure",
pages = "37--58",
journal = "Ocean Modelling",
volume = "45-46",
author = "Ilıcak, Mehmet and Adcroft, Alistair J. and Griffies,
Stephen M. and Hallberg, Robert W.",
year = "2012",
month = "Jan",
publisher = "Elsevier BV",
issn = "1463-5003",
doi = "10.1016/j.ocemod.2011.10.003"
}
@article{ lipscomb.hunke_MWR04,
title = "Modeling Sea Ice Transport Using Incremental Remapping",
pages = "1341--1354",
journal = "Monthly Weather Review",
volume = "132",
number = "6",
author = "Lipscomb, William H. and Hunke, Elizabeth C.",
year = "2004",
month = "Jun",
publisher = "American Meteorological Society",
issn = "1520-0493",
doi = "10.1175/1520-0493(2004)132<1341:msitui>2.0.co;2"
}
@article{ losch_JGR08,
title = "Modeling ice shelf cavities in a z coordinate ocean
general circulation model",
journal = "Journal of Geophysical Research",
volume = "113",
number = "C8",
author = "Losch, M.",
year = "2008",
month = "Aug",
publisher = "American Geophysical Union (AGU)",
issn = "0148-0227",
doi = "10.1029/2007jc004368"
}
@article{ mathiot.jenkins.ea_GMD17,
title = "Explicit representation and parametrised impacts of under
ice shelf seas in the ${z}^{\ast}$ coordinate ocean model
NEMO 3.6",
pages = "2849--2874",
journal = "Geoscientific Model Development",
volume = "10",
number = "7",
author = "Mathiot, Pierre and Jenkins, Adrian and Harris,
Christopher and Madec, Gurvan",
year = "2017",
month = "Jul",
publisher = "Copernicus GmbH",
issn = "1991-9603",
doi = "10.5194/gmd-10-2849-2017"
}
@article{ schär.smolarkiewicz_JCP96,
title = "A Synchronous and Iterative Flux-Correction Formalism for
Coupled Transport Equations",
pages = "101--120",
journal = "Journal of Computational Physics",
volume = "128",
number = "1",
author = "Schär, Christoph and Smolarkiewicz, Piotr K.",
year = "1996",
month = "Oct",
publisher = "Elsevier BV",
issn = "0021-9991",
doi = "10.1006/jcph.1996.0198"
}
**********************
Explore the test cases
**********************
.. todo::
CANAL animated gif is missing
.. contents::
:local:
:depth: 1
Installation
============
Download
--------
| The complete and up-to-date set of test cases is available on
:github:`NEMO test cases repository <NEMO-examples>`.
| Download it directly into the :file:`./tests` root directory with
.. code-block:: console
$ git clone https://github.com/NEMO-ocean/NEMO-examples
Compilation
-----------
The compilation of the test cases is very similar to
the compilation of the reference configurations.
If you are not familiar with how to compile NEMO,
it is recommended to first read :doc:`the instructions <install>`.
| Whereas the reference configurations are compiled with the ``-r`` option,
test cases are compiled using :file:`makenemo` with the ``-a`` option.
| Here is an example that compiles a copy, named WAD2, of the wetting and drying test case (WAD):
.. code-block:: console
$ ./makenemo -n 'WAD2' -a 'WAD' -m 'my_arch' -j '4'
Run and analysis
----------------
The test cases presented here require no specific input files.
The XIOS XML input files and namelists are already set up correctly.
For detailed descriptions and Jupyter notebooks, the reader is directed to
the :github:`NEMO test cases repository <NEMO-examples>`.
The descriptions below are brief overviews of some of the test cases.
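Once a test case has been compiled, a typical run simply launches the executable from its
:file:`EXP00` folder. The launcher depends on your system and MPI environment; a sketch for the
WAD2 copy compiled above might be:

.. code-block:: console

   $ cd tests/WAD2/EXP00
   $ mpirun -np 4 ./nemo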
List of test cases
==================
ICE_AGRIF
---------
.. figure:: _static/ICE_AGRIF_UDIAG_43days_UM5.gif
:width: 200px
:align: left
..
| This test case illustrates the advection of an ice patch across
an East/West and North/South periodic channel over a slab ocean (i.e. one ocean layer),
and with an AGRIF zoom (1:3) in the center.
| The purpose of this configuration is to
test the advection of the ice patch in and across the AGRIF boundary.
One can either impose the ice velocities, or impose the ice-atmosphere stresses and
let the rheology define the velocities (see :file:`README` for details).
VORTEX
------
.. figure:: _static/VORTEX_anim.gif
:width: 200px
:align: right
..
This test case illustrates the propagation of an anticyclonic eddy over a beta plane with a flat bottom.
It is implemented here with an online refined subdomain (1:3) out of which the vortex propagates.
It serves as a benchmark for quantitative estimates of nesting errors as in :cite:`DEBREU2012`,
:cite:`PENVEN2006` or :cite:`SPALL1991`.
The animation (sea level anomaly in meters) illustrates with
two 1:2 successively nested grids how the vortex smoothly propagates out of the refined grids.
ISOMIP
------
.. figure:: _static/ISOMIP_moc.png
:width: 200px
:align: left
..
| The purpose of this test case is to evaluate the impact of various schemes and new developments on
the circulation and melt in ice shelf cavities.
This configuration served as an initial assessment of the ice shelf module in :cite:`LOSCH2008` and
:cite:`MATHIOT2017`.
The default setup is the one described |ISOMIP|_.
| The figure (meridional overturning circulation) illustrates
the circulation generated after 10000 days by the ice shelf melting (ice pump).
.. |ISOMIP| replace:: here
LOCK_EXCHANGE
-------------
.. figure:: _static/LOCK-FCT4_flux_ubs.gif
:width: 200px
:align: right
..
| The LOCK EXCHANGE experiment is a classical fluid dynamics experiment that has been adapted
by :cite:`HAIDVOGEL1999` for testing advection schemes in ocean circulation models.
It has been used by several authors including :cite:`BURCHARD2002` and :cite:`ILICAK2012`.
The LOCK EXCHANGE experiment can in particular illustrate
the impact of different choices of numerical schemes and/or subgrid closures on
spurious interior mixing.
| Here, the animation shows the LOCK_EXCHANGE test case using
the fourth-order advection scheme FCT4 for tracers and the UBS scheme for dynamics.
OVERFLOW
--------
.. figure:: _static/OVF-sco_FCT4_flux_cen-ahm1000.gif
:width: 200px
:align: left
..
| The OVERFLOW experiment illustrates the impact of different choices of numerical schemes and/or
subgrid closures on spurious interior mixing close to bottom topography.
The OVERFLOW experiment is adapted from the non-rotating overflow configuration described in
:cite:`HAIDVOGEL1999` and further used by :cite:`ILICAK2012`.
Here we can assess the behaviour of the second-order tracer advection scheme FCT2 and
the fourth-order FCT4, in z-coordinates and sigma coordinates (...).
| Here, the animation shows the OVERFLOW test case in sigma coordinates with
the fourth-order advection scheme FCT4.
WAD
---
.. figure:: _static/wad_testcase_7.gif
:width: 200px
:align: right
..
| A set of simple closed-basin geometries for testing the wetting and drying capabilities.
Examples range from a closed channel with EW linear bottom slope to
a parabolic EW channel with a Gaussian ridge.
| Here the animation of the test case 7.
This test case is a simple linear slope with a mid-depth shelf with
an open boundary forced with a sinusoidally varying ssh.
This test case has been introduced to emulate a typical coastal application with
a tidally forced open boundary with an adverse SSH gradient that,
when released, creates a surge up the slope.
The parameters are chosen such that
the surge rises above sea-level before falling back and oscillating towards an equilibrium position.
CANAL
-----
.. figure:: _static/CANAL_image.gif
:width: 200px
:align: left
..
East-west periodic canal of variable size with several initial states and
associated geostrophic currents (zonal jets or vortex).
ICE_ADV2D
---------
| This test case illustrates the advection of an ice patch across
an East/West and North/South periodic channel over a slab ocean (i.e. one ocean layer).
The configuration is similar to ICE_AGRIF, except for the AGRIF zoom.
| The purpose of this configuration is to test the advection schemes available in the sea-ice code
(for now, Prather and Ultimate-Macho from 1st to 5th order),
and especially the occurrence of overshoots in ice thickness.
ICE_ADV1D
---------
| This experiment is the classical :cite:`SCHAR1996` test case,
which has been used in :cite:`LIPSCOMB2004`, and in which very specific shapes of ice concentration,
thickness and volume converge toward the center of a basin.
Convergence is unidirectional (in x) while fields are homogeneous in y.
| The purpose of this configuration is to
test the characteristics of the advection schemes available in the sea-ice code
(for now, Prather and Ultimate-Macho from 1st to 5th order),
especially the consistency between concentration, thickness and volume,
and the preservation of the initial shapes.
.. rubric:: References
.. bibliography:: tests.bib
:all:
:style: unsrt
:labelprefix: T
ICE_RHEO
--------
|
BENCH
-----
| Benchmark configuration. It allows one to run any configuration (including ORCA-type or BDY ones)
with an idealized grid and initial state, so that no input files other than the namelists are needed.
As usual, all configuration changes can be made through the namelists.
Three examples of :file:`namelist_cfg` are provided to mimic the ORCA1, ORCA025 and ORCA12 configurations.
By default, no output files are produced. An extensive description of BENCH will be available in
Irrmann et al. 2021.
CPL_OASIS
---------
| This test case checks the OASIS interface in OCE/SBC, allowing one to set up
a coupled configuration through OASIS. See :file:`CPL_OASIS/README.md` for more information.
DIA_GPU
---------
| This is a demonstrator of the DIAHSB diagnostic ported to GPU using CUDA Fortran.
Memory transfers between host and device are asynchronous, provided the device has that capability.
This experiment targets ORCA2_ICE_PISCES.
TSUNAMI
---------
| Simply uses ``dynspg_ts`` to simulate the propagation of an SSH anomaly (cosine) in a box configuration
with a flat bottom and ``jpk = 2``.
DONUT
-----
| Donut-shaped configuration to test the MPI decomposition with BDY.
C1D_ASICS
---------
|
DOME
----
|
ICB
----
| ICB is a very idealized configuration used to test and debug the icb module.
The configuration is a box with a shallow shelf (40 m) on the east and west parts of the domain
and a deep central trough (> 100 m).
Icebergs are generated using the test capability of the icb model along an E-W line (this can easily be tuned).
STATION_ASF
-----------
| This demonstration test case can be used to perform a sanity test of the SBCBLK interface of
NEMO. It tests all the bulk-parameterization algorithms using an idealized
forcing that includes a wide range of *SSX / surface atmospheric state*
conditions to detect potential errors or inconsistencies. Both a short report and
a boolean output (*passed* or *failed*) are provided.
SWG
---
| Square basin forced with an analytical wind. The vertical structure allows only one mode,
associated with reduced gravity, to develop. This configuration is based on Adcroft & Marshall 1998.
It can also be run with RK3 time stepping.
ADIAB_WAVE
----------
| The purpose of this test case is to validate the implementation of the Generalized Lagrangian Mean
equations for the coupling of NEMO with waves.
This test case was first proposed by Ardhuin et al. (2008) and was later detailed by Bennis et al. (2011).
*****
Tools
*****
.. todo::
The 'Tools' chapter needs to be enriched
.. contents::
:local:
:depth: 1
A set of tools is provided with NEMO to set up user-defined configurations and to pre- and post-process data.
How to compile a tool
=====================
A tool can be compiled using the :file:`maketools` script in the :file:`tools` directory as follows:
.. code-block:: console
$ ./maketools -m 'my_arch' -n '<TOOL_NAME>'
where ``my_arch`` is selected among the available architecture files, or a user-defined one is provided.
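For example, building the REBUILD_NEMO tool described below with the same architecture file used
for the model would look like this (the architecture name is the same placeholder used throughout
this guide):

.. code-block:: console

   $ ./maketools -m 'my_arch' -n 'REBUILD_NEMO'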
List of tools
=============
BDY_TOOLS
---------
It contains the utility *bdy_reorder*, used to reorder old BDY data files from
versions of the model before 3.4 to make them compatible with NEMO 3.4.
DMP_TOOLS
---------
Used to create a netcdf file called :file:`resto.nc` containing
restoration coefficients for use with the :file:`tra_dmp` module in NEMO
(see :download:`DMP_TOOLS README <../../../tools/DMP_TOOLS/README>`).
DOMAINcfg
---------
A toolbox allowing the creation of regional configurations from a curvilinear grid
(see :download:`DOMAINcfg README <../../../tools/DOMAINcfg/README.rst>`).
GRIDGEN
-------
This tool allows the creation of a domain configuration file (``domain_cfg.nc``) containing
the ocean domain information required to define an ocean configuration from scratch.
(see :download:`GRIDGEN documentation <../../../tools/GRIDGEN/doc_cfg_tools.pdf>`).
MISCELLANEOUS
-------------
These tools allow users to create alternative configurations and share them with the community without
having to rely on NEMO System Team sponsorship and support.
MPP_PREP
--------
This tool provides the user with information to choose the best domain decomposition.
The tool computes the number of water processors for all possible decompositions,
up to a maximum number of processors
(see :download:`MPP_PREP documentation <../../../tools/MPP_PREP/mpp_nc.pdf>` and
:download:`MPP_PREP archive <../../../tools/MPP_PREP/mpp_prep-1.0.tar.gz>`).
NESTING
-------
The AGRIF nesting tool allows for the seamless two-way coupling of nested sub-models within
the NEMO framework, as long as these are defined on subsets of the original root grid.
It allows one to create the grid coordinates, the surface forcing and the initial conditions required by
each sub-model when running NEMO in AGRIF embedded mode
(see :download:`NESTING README <../../../tools/NESTING/README>`).
OBSTOOLS
--------
A series of Fortran utilities which are helpful for handling observation files and
the feedback files output by the NEMO observation operator.
Further information is available in the :doc:`NEMO manual <cite>`.
REBUILD_NEMO
------------
REBUILD_NEMO is a tool to rebuild NEMO output files produced by multiple processors
(mesh_mask, restart or XIOS output files) into a single file
(see :download:`REBUILD_NEMO README <../../../tools/REBUILD_NEMO/README.rst>`).
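As a sketch of a typical invocation (the file name is illustrative; see the tool's README for the
exact option set), rebuilding a restart file split over 4 subdomains
(``ORCA2_00000060_restart_0000.nc`` to ``ORCA2_00000060_restart_0003.nc``) might look like:

.. code-block:: console

   $ ./rebuild_nemo ORCA2_00000060_restart 4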
REBUILD
-------
It contains the old version of the REBUILD_NEMO tool, based on the IOIPSL code.
SCOORD_GEN
----------
Offline tool to generate a vertical coordinates input file for use with S coordinates.
This has been carried out by copying the model code to an offline tool and then
modifying it to suppress the use of 3D arrays (to reduce memory usage).
The tool has been created in preparation for the removal of the vertical grid definition from
the code.
The output file should contain all variables that are necessary to restart the model.
SECTIONS_DIADCT
---------------
When the Transport across sections diagnostic is activated (``key_diadct``),
this tool is used to build the binary file containing the pathways between
the extremities of each section.
Further information is available in the :doc:`NEMO manual <cite>`.
SIREN
-----
SIREN is a configuration management tool to set up regional configurations with NEMO
(see :download:`SIREN README <../../../tools/SIREN/README>`).
WEIGHTS
-------
This directory contains software for generating and manipulating interpolation weights for use with
the Interpolation On the Fly (IOF) option in NEMO v3 onwards
(see :download:`WEIGHTS README <../../../tools/WEIGHTS/README>`).
TOYATM
------
This directory contains a simplified model that sends/receives atmospheric fields to/from NEMO, for use in the CPL_OASIS test case to test the NEMO-OASIS coupling interface.
This toy requires OASIS3-MCT to be installed and properly defined in the arch file.
(see :download:`CPL_OASIS README <../../../tests/CPL_OASIS/README.md>`).
ABL_TOOLS
---------
Three steps to generate atmospheric forcing from ECMWF products for the ABL1d model with NEMO:
- :file:`main_uvg_hpg` (optional): computation of the geostrophic wind or horizontal pressure gradient on ECMWF eta-levels (to force the ABL dynamics)
- :file:`main_vinterp`: vertical interpolation from ECMWF eta-levels to ABL Z-levels
- :file:`main_hdrown`: horizontal drowning of 3D fields (extrapolation over land, largely inspired by SOSIE by L. Brodeau)
(more details are available in Lemarié et al. 2020, GMD)
***************
Oceanic tracers
***************
.. todo::
.. contents::
:local:
TOP (Tracers in the Ocean Paradigm) is the NEMO hardwired interface to
biogeochemical models; it provides the physical constraints/boundaries for oceanic tracers.
It consists of a modular framework to handle multiple ocean tracers,
and also includes a variety of built-in modules.
This component of the NEMO framework allows one to exploit the available modules (see below) and to
further develop a range of applications, spanning from the implementation of a dye passive tracer to
evaluate dispersion processes (by means of MY_TRC), tracking water mass age (AGE module), and
assessing the ocean interior penetration of persistent chemical compounds
(e.g. gases like CFCs or even PCBs), up to the full set of equations involving
marine biogeochemical cycles.
Structure
=========
The TOP interface is located in the source code at :file:`./src/TOP`, where
the following modules are available:
:file:`TRP`
Interface to NEMO physical core for computing tracers transport
:file:`CFC`
Inert carbon tracers (CFC11, CFC12, SF6)
:file:`C14`
Radiocarbon passive tracer
:file:`AGE`
Water age tracking
:file:`MY_TRC`
Template for creation of new modules and external BGC models coupling
:file:`PISCES`
Built-in BGC model. See :cite:`gmd-8-2465-2015` for a thorough description.
The usage of TOP is activated
*i)* by including the component ``TOP`` in the configuration definition, and
*ii)* by adding the macro ``key_top`` to the configuration CPP file
(see :forge:`"Learn more about the model" <wiki/Users>` for more details).
As an example, the user can refer to the configurations already available in the code:
``GYRE_PISCES``, the NEMO biogeochemical demonstrator, and
``GYRE_BFM``, which shows the configuration elements required to couple with an external biogeochemical model
(see also Section 4).
Note that, since version 4.0,
the TOP interface core functionalities are activated by means of logical keys, and
all submodule preprocessing macros from previous versions were removed.
Below is the list of preprocessing keys that apply to the TOP interface (besides ``key_top``):
``key_xios``
use XIOS I/O
``key_agrif``
enable AGRIF coupling
``key_trdtrc`` & ``key_trdmxl_trc``
trend computation for tracers
Synthetic Workflow
==================
A synthetic description of the TOP interface workflow is given below to
summarize the steps involved in the computation of biogeochemical and physical trends,
their time integration and outputs,
also reporting the principal Fortran subroutines involved.
Model initialization (:file:`./src/OCE/nemogcm.F90`)
----------------------------------------------------
Call to ``trc_init`` subroutine (:file:`./src/TOP/trcini.F90`) to initialize TOP.
.. literalinclude:: ../../../src/TOP/trcini.F90
:language: fortran
:lines: 41-86
:emphasize-lines: 21,30-32,38-40
:caption: ``trc_init`` subroutine
Time marching procedure (:file:`./src/OCE/step.F90`)
----------------------------------------------------
Call to ``trc_stp`` subroutine (:file:`./src/TOP/trcstp.F90`) to compute/update passive tracers.
.. literalinclude:: ../../../src/TOP/trcstp.F90
:language: fortran
:lines: 46-125
:emphasize-lines: 42,55-57
:caption: ``trc_stp`` subroutine
BGC trends computation for each submodule (:file:`./src/TOP/trcsms.F90`)
------------------------------------------------------------------------
.. literalinclude:: ../../../src/TOP/trcsms.F90
:language: fortran
:lines: 21
:caption: :file:`trcsms` snippet
Physical trends computation (:file:`./src/TOP/TRP/trctrp.F90`)
--------------------------------------------------------------
.. literalinclude:: ../../../src/TOP/TRP/trctrp.F90
:language: fortran
:lines: 46-95
:emphasize-lines: 17,21,29,33-35
:caption: ``trc_trp`` subroutine
Namelists walkthrough
=====================
:file:`namelist_top`
--------------------
Listed below are the features/options of the TOP interface accessible through
:file:`namelist_top_ref` and modifiable by means of :file:`namelist_top_cfg`
(as for the NEMO physical namelists).
Note that ``##`` is used to refer to a number in an array field.
.. literalinclude:: ../../namelists/namtrc_run
:language: fortran
.. literalinclude:: ../../namelists/namtrc
:language: fortran
.. literalinclude:: ../../namelists/namtrc_dta
:language: fortran
.. literalinclude:: ../../namelists/namtrc_adv
:language: fortran
.. literalinclude:: ../../namelists/namtrc_ldf
:language: fortran
.. literalinclude:: ../../namelists/namtrc_rad
:language: fortran
.. literalinclude:: ../../namelists/namtrc_snk
:language: fortran
.. literalinclude:: ../../namelists/namtrc_dmp
:language: fortran
.. literalinclude:: ../../namelists/namtrc_ice
:language: fortran
.. literalinclude:: ../../namelists/namtrc_trd
:language: fortran
.. literalinclude:: ../../namelists/namtrc_bc
:language: fortran
.. literalinclude:: ../../namelists/namtrc_bdy
:language: fortran
.. literalinclude:: ../../namelists/namage
:language: fortran
Two main types of data structures are used within the TOP interface:
(1) to initialize tracer properties and
(2) to provide the related initial and boundary conditions.
1. TOP tracers initialization: ``sn_tracer`` (``&namtrc``)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Besides providing names and metadata for the tracers,
the use of initial (``sn_tracer%llinit``) and
boundary (``sn_tracer%llsbc``, ``sn_tracer%llcbc``, ``sn_tracer%llobc``) conditions is also defined here.
In the following, an example of the full structure definition is given for
two idealized tracers, both with initial conditions given,
where the first has only surface boundary forcing and
the second has both surface and coastal forcings:
.. code-block:: fortran
! ! name ! title of the field ! units ! initial data ! sbc ! cbc ! obc !
sn_tracer(1) = 'TRC1' , 'Tracer 1 Concentration ', ' - ' , .true. , .true., .false., .true.
sn_tracer(2) = 'TRC2 ' , 'Tracer 2 Concentration ', ' - ' , .true. , .true., .true. , .false.
As the number of tracers in BGC models keeps growing,
the same structure can also be written in a more compact and readable way:
.. code-block:: fortran
! ! name ! title of the field ! units ! initial data !
sn_tracer(1) = 'TRC1' , 'Tracer 1 Concentration ', ' - ' , .true.
sn_tracer(2) = 'TRC2 ' , 'Tracer 2 Concentration ', ' - ' , .true.
! sbc
sn_tracer(1)%llsbc = .true.
sn_tracer(2)%llsbc = .true.
! cbc
sn_tracer(2)%llcbc = .true.
The data structure is initialized internally by the code with dummy names and with
all initialization/forcing logical fields set to ``.false.``.
2. Structures to read input initial and boundary conditions: ``&namtrc_dta`` (``sn_trcdta``), ``&namtrc_bc`` (``sn_trcsbc`` / ``sn_trccbc`` / ``sn_trcobc``)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The overall data structure (Fortran type) is based on the general one defined for NEMO core in the SBC component
(see details in ``SBC`` Chapter of :doc:`Reference Manual <cite>` on Input Data specification).
Input fields are prescribed within ``&namtrc_dta`` (with ``sn_trcdta`` structure),
while Boundary Conditions are applied to the model by means of ``&namtrc_bc``,
with dedicated structure fields for surface (``sn_trcsbc``), riverine (``sn_trccbc``), and
lateral open (``sn_trcobc``) boundaries.
The following example illustrates the data structure in the case of an initial condition for
a single tracer contained in the file :file:`tracer_1_data.nc`
(the ``.nc`` extension is implicitly appended to the namelist filename),
with the initial value doubled, and located in the :file:`usr/work/model/inputdata` folder:
.. code-block:: fortran
! ! file name ! frequency (hours) ! variable ! time interp. ! clim ! 'yearly'/ ! weights ! rotation ! land/sea mask !
! ! ! (if <0 months) ! name ! (logical) ! (T/F) ! 'monthly' ! filename ! pairing ! filename !
sn_trcdta(1) = 'tracer_1_data' , -12 , 'TRC1' , .false. , .true. , 'yearly' , '' , '' , ''
rf_trfac(1) = 2.0
cn_dir = 'usr/work/model/inputdata/'
Note that the lateral open boundary conditions are applied on
the segments defined for the physical core of NEMO
(see the ``BDY`` description in the :doc:`Reference Manual <cite>`).
:file:`namelist_trc`
--------------------
Described below is :file:`namelist_trc_ref`, used to handle the carbon tracer modules,
namely CFC and C14.
.. literalinclude:: ../../../cfgs/SHARED/namelist_trc_ref
:language: fortran
:lines: 7,17,26,34
:caption: :file:`namelist_trc_ref` snippet
``MY_TRC`` interface for coupling external BGC models
=====================================================
The generalized interface is built around the MY_TRC module, which contains template files for
coupling NEMO with any external BGC model.
The call to MY_TRC is activated by setting ``ln_my_trc = .true.`` (in ``&namtrc``).
The following six Fortran files are available in MY_TRC, with the specific purposes described below.
:file:`par_my_trc.F90`
This module allows the definition of additional arrays and public variables to
be used within the MY_TRC interface.
:file:`trcini_my_trc.F90`
User-defined namelists are initialized here, and the external BGC model initialization procedures
are called to populate the general tracer arrays (``trn`` and ``trb``).
Support arrays related to system metrics that
may be needed by the BGC model are also likely to be defined here.
:file:`trcnam_my_trc.F90`
This routine is called at the beginning of ``trcini_my_trc`` and
should contain the initialization of additional namelists for the BGC model or user-defined code.
:file:`trcsms_my_trc.F90`
This routine performs the call to the boundary conditions; its main purpose is to
contain the source-minus-sink terms due to the biogeochemical processes of the external model.
Be aware that lateral boundary conditions are applied in the ``trcnxt`` routine.
.. warning::
The routines to compute the light penetration along the water column and
the tracer vertical sinking should be defined/called in here,
as generalized modules are still missing in the code.
:file:`trcice_my_trc.F90`
Here it is possible to prescribe the tracer concentrations in the sea ice that
will be used as boundary conditions when ice melting occurs (``nn_ice_tr = 1`` in ``&namtrc_ice``).
See, e.g., the corresponding PISCES subroutine.
:file:`trcwri_my_trc.F90`
This routine performs the output of the model tracers (only those defined in ``&namtrc``) using
IOM module (see chapter “Output and Diagnostics” in the :doc:`Reference Manual <cite>`).
It is possible to place here the output of additional variables produced by the model,
if not done elsewhere in the code, using the call to ``iom_put``.

Coupling an external BGC model using the NEMO framework
=======================================================

The coupling of an external BGC model through the NEMO compilation framework can be achieved in
different ways, according to the complexity of the biogeochemical model code:
for example, the whole code may consist of a single file,
or it may have multiple modules and interfaces spread across several subfolders.
Besides the six core files of the MY_TRC module, let's assume an external BGC model named *MYBGC*
with a rather simple code structure, likely a few Fortran files.
The new coupled configuration name is *NEMO_MYBGC*.
The best solution is to place all files (the modified ``MY_TRC`` routines and the BGC model ones)
in a single folder rooted at ``MYBGCPATH`` and
to use the makenemo external readdressing of the ``MY_SRC`` folder.

The coupled configuration listed in :file:`work_cfgs.txt` will look like

::

   NEMO_MYBGC OCE TOP

and the related ``cpp_MYBGC.fcm`` content will be

.. code-block:: perl

   bld::tool::fppkeys key_xios key_top

The compilation with :file:`makenemo` is then executed with the following syntax:

.. code-block:: console

   $ makenemo -n 'NEMO_MYBGC' -m '<arch_my_machine>' -j 8 -e '<MYBGCPATH>'

The makenemo ``-e`` option was introduced to
readdress, at compilation time, the standard MY_SRC folder (usually found in NEMO configurations) with
a user-defined external one.

The compilation of a more structured BGC model code & infrastructure,
as in the case of BFM (|BFM man|_), requires some additional features.
As before, let's assume a coupled configuration named *NEMO_MYBGC*,
but in this case the MYBGC model root is the :file:`MYBGC` path, which
contains separate subfolders for the biogeochemistry code,
named :file:`initialization`, :file:`pelagic`, and :file:`benthic`,
plus one named :file:`nemo_coupling` including the modified ``MY_SRC`` routines.
The latter folder, which contains the modified NEMO coupling interface, will still be linked using
the makenemo ``-e`` option.
In order to include the BGC model subfolders in the compilation of the NEMO code,
it is necessary to extend the configuration :file:`cpp_NEMO_MYBGC.fcm` file with
the specific paths of the :file:`MYBGC` folders, as in the following example:

.. code-block:: perl

   bld::tool::fppkeys key_xios key_top

   src::MYBGC::initialization   <MYBGCPATH>/initialization
   src::MYBGC::pelagic          <MYBGCPATH>/pelagic
   src::MYBGC::benthic          <MYBGCPATH>/benthic

   bld::pp::MYBGC               1
   bld::tool::fppflags::MYBGC   %FPPFLAGS
   bld::tool::fppkeys           %bld::tool::fppkeys MYBGC_MACROS

where *MYBGC_MACROS* is the space-delimited list of macros used in the *MYBGC* model for
selecting/excluding specific parts of the code.
The BGC model code will be preprocessed in the configuration :file:`BLD` folder as for NEMO,
but with an independent path, like :file:`NEMO_MYBGC/BLD/MYBGC/<subfolders>`.

The compilation is performed similarly to the previous case, with the following command:

.. code-block:: console

   $ makenemo -n 'NEMO_MYBGC' -m '<arch_my_machine>' -j 8 -e '<MYBGCPATH>/nemo_coupling'

.. note::

   The additional lines specific to the BGC model source and build paths can be written into
   a separate file, e.g. named :file:`MYBGC.fcm`,
   and then simply included in :file:`cpp_NEMO_MYBGC.fcm` as follows:

   .. code-block:: perl

      bld::tool::fppkeys key_xios key_top

      inc <MYBGCPATH>/MYBGC.fcm

   This will enable a more portable compilation structure for all MYBGC-related configurations.

.. warning::

   The coupling interface contained in :file:`nemo_coupling` cannot be added using the FCM syntax,
   as the same files already exist in NEMO; they are overridden only through
   the readdressing of the MY_SRC contents, in order to avoid
   compilation conflicts due to duplicate routines.

All the modifications illustrated above can easily be implemented using shell or python scripting
to edit the NEMO configuration :file:`CPP.fcm` file and
to create the BGC-model-specific FCM compilation file with the code paths.
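
As a sketch of such scripting (the helper name and the folder list are illustrative,
following the MYBGC example above), a small Python function can generate the
FCM fragment to be saved as :file:`MYBGC.fcm`:

.. code-block:: python

   # Sketch: build the FCM fragment listing the external BGC source folders,
   # mirroring the cpp_NEMO_MYBGC.fcm example above.
   def make_fcm_fragment(root, subdirs, macros):
       """Return the FCM lines declaring BGC sources, build rules and macros."""
       lines = [f"src::MYBGC::{d} {root}/{d}" for d in subdirs]
       lines += [
           "bld::pp::MYBGC 1",
           "bld::tool::fppflags::MYBGC %FPPFLAGS",
           "bld::tool::fppkeys %bld::tool::fppkeys " + " ".join(macros),
       ]
       return "\n".join(lines) + "\n"

   fragment = make_fcm_fragment(
       "<MYBGCPATH>", ["initialization", "pelagic", "benthic"], ["MYBGC_MACRO1"]
   )
   print(fragment)

The resulting text can then be written to :file:`MYBGC.fcm` and pulled into the
configuration file with the ``inc`` statement.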
.. |BFM man| replace:: BFM-NEMO coupling manual