NEMO
====
.. contents::
:local:
.. _`Former web platform forge`: https://forge.ipsl.jussieu.fr/nemo
.. _`NEMO users' guide`: https://sites.nemo-ocean.io/user-guide
.. _`Migration Guide`: https://sites.nemo-ocean.io/user-guide/migration.html
.. _`Change list`: https://sites.nemo-ocean.io/user-guide/changes.html
.. _`Test case repository`: https://github.com/NEMO-ocean/NEMO-examples
.. _`How to cite`: https://www.nemo-ocean.eu/bibliography/how-to-cite/
.. _`NEMO forums`: https://nemo-ocean.discourse.group
.. _`NEMO newsletter`: https://listes.ipsl.fr/sympa/subscribe/nemo-newsletter
.. _`NEMO publications`: https://www.nemo-ocean.eu/bibliography/publications/add
.. _`NEMO projects`: https://www.nemo-ocean.eu/projects/add
.. _`Special Issue`: https://gmd.copernicus.org/articles/special_issue40.html
.. _`NEMO System Team wiki`: https://forge.nemo-ocean.eu/developers/home/-/wikis/Home
.. _`NEMO ocean engine`: https://zenodo.org/record/1464816
.. _`NEMO Tracers engine` : https://zenodo.org/record/1471700
.. _`NEMO Sea Ice engine`: https://zenodo.org/record/1471689
**Welcome to NEMO home page!**
NEMO (*Nucleus for European Modelling of the Ocean*) is a state-of-the-art modelling
*Nucleus for European Modelling of the Ocean* (NEMO) is a state-of-the-art modelling
framework for research activities and forecasting services in ocean and climate sciences,
developed in a sustainable way by the NEMO European consortium since 2008.
This page is intended to help you get started with the NEMO platform and to introduce you
to the different levels of information available, starting with NEMO release 4.2.0.
Reminder: our `Former web platform forge`_ (SVN+Trac) contains the previous documentation
and the releases made available from the beginning of the project up to NEMO 4.0.
Getting started
===============
Getting your hands on NEMO: the first steps are described in detail in the
`NEMO users' guide`_ . This explains how to download the code, build the environment,
create the executable, and perform a first integration.
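A minimal sketch of those first steps, assuming the 4.2.0 release
(the repository URL, architecture name and reference configuration below are illustrative;
the `NEMO users' guide`_ remains the authoritative reference):

.. code-block:: console

   # Download the 4.2.0 release (URL assumed from this forge)
   $ git clone --branch 4.2.0 https://forge.nemo-ocean.eu/nemo/nemo.git nemo_4.2.0
   $ cd nemo_4.2.0
   # Build the GYRE_PISCES reference configuration with an arch file matching your machine
   $ ./makenemo -m X64_JUNO -r GYRE_PISCES -j 8
   # Perform a first integration from the experiment directory
   $ cd cfgs/GYRE_PISCES/EXP00
   $ mpirun -np 4 ./nemo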
If you are already using a previous release of NEMO, please refer to the
`Migration Guide`_, which aims to help you make the move to 4.2.0.
The above user guides cover in detail what is available from GitLab and supported by the
NEMO System Team. Aside from this web platform, a set of test cases is also available from
the `Test case repository`_. These test cases can be useful for students, outreach, and
exploring specific aspects of NEMO with light configurations (see the example below). The
web page also allows you to submit test cases you have developed and want to share with
the community. Feel free to contribute!
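For instance, the test cases can be retrieved with git (URL taken from the
`Test case repository`_ link above):

.. code-block:: console

   $ git clone https://github.com/NEMO-ocean/NEMO-examples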
Project documentation
=====================
Reference manuals fully describing NEMO for the three main components
* |OCE| models the ocean {thermo}dynamics and solves the primitive equations (`./src/OCE <./src/OCE>`_)
* |ICE| simulates sea-ice {thermo}dynamics, brine inclusions and subgrid-scale thickness
variations (`./src/ICE <./src/ICE>`_)
* |MBG| models the {on,off}line oceanic tracers transport and biogeochemical processes
(`./src/TOP <./src/TOP>`_)
are available from Zenodo:
============ ======================== =====
Component Reference Manual DOI
============ ======================== =====
|NEMO-OCE| `NEMO ocean engine`_ .. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.6334656.svg
:target: https://doi.org/10.5281/zenodo.6334656
|NEMO-ICE| `NEMO Sea Ice engine`_ *not yet available*
|NEMO-MBG| `NEMO Tracers engine`_ .. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.1471700.svg
:target: https://doi.org/10.5281/zenodo.1471700
============ ======================== =====
These reference manuals are the publications that should be cited in your own
publications. Please visit `How to cite`_ for details.
New features of the 4.2.0 release are described in the `Change list`_ section of the `NEMO users' guide`_.
Asking questions, and exchanging information
============================================
- Register once and for all and use the `NEMO forums`_ on Discourse to share and discuss with the NEMO community.
- Register once and for all to receive the `NEMO newsletter`_ by mail: recommended for all
  users in order to receive the major announcements from the project (new releases, open
  meetings and other main information). Low traffic: about ten messages a year.
Contributing to NEMO visibility: projects and publications
==========================================================
Please help us justify the NEMO development efforts by
- Adding your publications using NEMO and its outputs to the `NEMO publications`_ page
- Describing your project using NEMO on the `NEMO projects`_ page
NEMO also has a `Special Issue`_ in the open-access journal
Geoscientific Model Development (GMD) from the European Geosciences Union.
The main scope is to collect relevant manuscripts covering various topics and
to provide a single portal to assess the model potential and evolution.
Contributing to NEMO development
================================
For all information please refer to the `NEMO wiki <https://forge.nemo-ocean.eu/nemo/nemo/-/wikis/home>`_.
NEMO strives to be written in a way that allows the easy incorporation of developments.
You are welcome to contribute to the development of the NEMO shared reference. NEMO
development is driven by the NEMO Consortium, which plans and produces NEMO's sustainable
development in order to keep a reliable, evolving framework. Development is organised and
scheduled through a five-year development strategy, working groups and the activities of
the development team (the NEMO System Team) in a yearly workplan. More information is
available on the `NEMO System Team wiki`_.
How to cite
===========
To acknowledge the sustainable development efforts of the NEMO Consortium, please cite `these references <https://www.nemo-ocean.eu/bibliography/how-to-cite/>`_ in your publications and presentations using NEMO.
Disclaimer
==========
......
module purge
module load intel-2021.6.0/cmake/3.25.1-7wfsx
module load oneapi-2022.1.0/compiler-rt/2022.1.0
module load intel-2021.6.0/2021.6.0
module load impi-2021.6.0/2021.6.0
module load intel-2021.6.0/impi-2021.6.0/hdf5-threadsafe/1.13.3-zbgha
module load intel-2021.6.0/impi-2021.6.0/netcdf-c-threadsafe/4.9.0-wpe4t
module load intel-2021.6.0/impi-2021.6.0/netcdf-fortran-threadsafe/4.6.0-75oow
module load intel-2021.6.0/impi-2021.6.0/parallel-netcdf/1.12.3-eshb5
module load intel-2021.6.0/perl/5.36.0-jj4hw
module load intel-2021.6.0/perl-uri/1.72-6at2i
module load intel-2021.6.0/impi-2021.6.0/xios/2.5-36kwn
# set linker path to 64-bit libraries
export LD_LIBRARY_PATH="/lib64/":$LD_LIBRARY_PATH
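# ---------------------------------------------------------------------------
# Hedged usage sketch: makenemo is expected to pick up the matching arch-*.env
# file automatically when this architecture is selected; the arch name
# (X64_JUNO) and reference configuration (GYRE_PISCES) below are illustrative,
# adjust them to your installation.
#   ./makenemo -m X64_JUNO -r GYRE_PISCES -j 8
# ---------------------------------------------------------------------------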
# mpi ifort compiler options for JUNO cluster + XIOS
#
#
# NCDF_INC netcdf4 include file
# NCDF_LIB netcdf4 library
# XIOS_INC xios include file (taken into account only if key_xios is activated)
# XIOS_LIB xios library (taken into account only if key_xios is activated)
#
# CPP Pre-processor
# FC Fortran compiler command
# FCFLAGS Fortran compiler flags
# FFLAGS Fortran 77 compiler flags
# LD linker
# FPPFLAGS pre-processing flags
# LDFLAGS linker flags, e.g. -L<lib dir> if you have libraries
# AR assembler
# ARFLAGS assembler flags
# MK make (usually GNU make)
# USER_INC complete list of include files
# USER_LIB complete list of libraries to pass to the linker
# CC C compiler used to compile conv for AGRIF
# CFLAGS compiler flags used with CC
#
# Note that:
# - unix variables "$..." are accepted and will be evaluated before calling fcm.
# - fcm variables start with a % (and not a $)
#
# Environment variables are set automatically when loading modules on JUNO cluster (see arch-X64_JUNO.env)
%NCDF_INC -I${NETCDF_FORTRAN}/include -I${NETCDF_C}/include -I${PARALLEL_NETCDF}/include
%NCDF_LIB -L${NETCDF_FORTRAN}/lib -lnetcdff -L${NETCDF_C}/lib -lnetcdf -L${PARALLEL_NETCDF}/lib -lpnetcdf
%HDF5_INC -I${HDF5}/include
%HDF5_LIB -L${HDF5}/lib -lhdf5_hl -lhdf5
%XIOS_INC -I${XIOS}/inc
%XIOS_LIB -L${XIOS}/lib -lxios
%USER_INC %XIOS_INC %NCDF_INC %HDF5_INC
%USER_LIB %XIOS_LIB %NCDF_LIB %HDF5_LIB
%FC mpiifort
%FCFLAGS -r8 -O3 -fp-model source -traceback -qmkl=cluster -march=icelake-client -mtune=icelake-client -qopt-zmm-usage=low -no-fma
%FFLAGS %FCFLAGS
%CC mpiicc
%CFLAGS -O0
%LD mpiifort
%LDFLAGS -lstdc++ -lgpfs
%AR ar
%ARFLAGS -r
%CPP icc
%FPPFLAGS -E -P -traditional
%MK gmake
arch-X64_JUNO.env
\ No newline at end of file
# mpi ifort compiler options for JUNO cluster + XIOS
#
#
# NCDF_INC netcdf4 include file
# NCDF_LIB netcdf4 library
# XIOS_INC xios include file (taken into account only if key_xios is activated)
# XIOS_LIB xios library (taken into account only if key_xios is activated)
#
# CPP Pre-processor
# FC Fortran compiler command
# FCFLAGS Fortran compiler flags
# FFLAGS Fortran 77 compiler flags
# LD linker
# FPPFLAGS pre-processing flags
# LDFLAGS linker flags, e.g. -L<lib dir> if you have libraries
# AR assembler
# ARFLAGS assembler flags
# MK make (usually GNU make)
# USER_INC complete list of include files
# USER_LIB complete list of libraries to pass to the linker
# CC C compiler used to compile conv for AGRIF
# CFLAGS compiler flags used with CC
#
# Note that:
# - unix variables "$..." are accepted and will be evaluated before calling fcm.
# - fcm variables start with a % (and not a $)
#
# Environment variables are set automatically when loading modules on JUNO cluster (see arch-X64_JUNO.env)
%NCDF_INC -I${NETCDF_FORTRAN}/include -I${NETCDF_C}/include -I${PARALLEL_NETCDF}/include
%NCDF_LIB -L${NETCDF_FORTRAN}/lib -lnetcdff -L${NETCDF_C}/lib -lnetcdf -L${PARALLEL_NETCDF}/lib -lpnetcdf
%HDF5_INC -I${HDF5}/include
%HDF5_LIB -L${HDF5}/lib -lhdf5_hl -lhdf5
%XIOS_INC -I${XIOS}/inc
%XIOS_LIB -L${XIOS}/lib -lxios
%USER_INC %XIOS_INC %NCDF_INC %HDF5_INC
%USER_LIB %XIOS_LIB %NCDF_LIB %HDF5_LIB
%FC mpiifort
%FCFLAGS -r8 -g -O0 -check all -fp-model source -traceback -qmkl=cluster -march=icelake-client -mtune=icelake-client -qopt-zmm-usage=low -no-fma
%FFLAGS %FCFLAGS
%CC mpiicc
%CFLAGS -O0
%LD mpiifort
%LDFLAGS -lstdc++ -lgpfs
%AR ar
%ARFLAGS -r
%CPP icc
%FPPFLAGS -E -P -traditional
%MK gmake
@@ -54,5 +54,5 @@
%USER_INC %XIOS_INC %OASIS_INC %NCDF_INC
%USER_LIB %XIOS_LIB %OASIS_LIB %NCDF_LIB
%CC cc
%CC icc
%CFLAGS -O0
@@ -54,5 +54,5 @@
%USER_INC %XIOS_INC %OASIS_INC %NCDF_INC
%USER_LIB %XIOS_LIB %OASIS_LIB %NCDF_LIB
%CC cc
%CC icc
%CFLAGS -O0
# compiler options for AA (using GCC compiler)
#
#
# NCDF_INC netcdf4 include file
# NCDF_LIB netcdf4 library
# XIOS_INC xios include file (taken into account only if key_xios is activated)
# XIOS_LIB xios library (taken into account only if key_xios is activated)
# OASIS_INC oasis include file (taken into account only if key_oasis3 is activated)
# OASIS_LIB oasis library (taken into account only if key_oasis3 is activated)
#
# FC Fortran compiler command
# FCFLAGS Fortran compiler flags
# FFLAGS Fortran 77 compiler flags
# LD linker
# LDFLAGS linker flags, e.g. -L<lib dir> if you have libraries
# FPPFLAGS pre-processing flags
# AR assembler
# ARFLAGS assembler flags
# MK make
# USER_INC complete list of include files
# USER_LIB complete list of libraries to pass to the linker
# CC C compiler used to compile conv for AGRIF
# CFLAGS compiler flags used with CC
#
# Note that:
# - unix variables "$..." are accepted and will be evaluated before calling fcm.
# - fcm variables start with a % (and not a $)
#
#---------------------------------------------------------------------------------------------
#---------------------------------------------------------------------------------------------
# All NETCDF and HDF paths are empty as they are automatically defined through environment
# variables by the load of modules
#---------------------------------------------------------------------------------------------
#---------------------------------------------------------------------------------------------
#
#
%NCDF_INC ${NETCDF4_INCLUDE}
%NCDF_LIB -L${NETCDF4_DIR}/lib -lnetcdff -lnetcdf -L${HDF5_DIR}/lib -lhdf5_hl -lm
%XIOS_INC -I${XIOS_INC}
%XIOS_LIB -L${XIOS_LIB} -lxios -lstdc++
%OASIS_INC -I${OASIS_DIR}/build/lib/mct -I${OASIS_DIR}/build/lib/psmile.MPI1
%OASIS_LIB -L${OASIS_DIR}/lib -lpsmile.MPI1 -lmct -lmpeu -lscrip
%CPP cpp -Dkey_nosignedzero
%FC mpif90 -c -cpp
# -O3 breaks reproducibility/restartability with gcc/12.2.0
%FCFLAGS -fdefault-real-8 -O2 -funroll-all-loops -fcray-pointer -ffree-line-length-none -fallow-argument-mismatch -Wno-missing-include-dirs
%FFLAGS %FCFLAGS
%LD mpif90
%LDFLAGS -Wl,-rpath,${HDF5_DIR}/lib -Wl,-rpath=${NETCDF4_DIR}/lib -Wl,-rpath=${XIOS_DIR}/lib
%FPPFLAGS -P -traditional
%AR ar
%ARFLAGS rs
%MK make
%USER_INC %XIOS_INC %OASIS_INC %NCDF_INC
%USER_LIB %XIOS_LIB %OASIS_LIB %NCDF_LIB
%CC gcc
%CFLAGS -O0 -fcommon
# compiler options for AA (using GCC compiler)
#
#
# NCDF_INC netcdf4 include file
# NCDF_LIB netcdf4 library
# XIOS_INC xios include file (taken into account only if key_xios is activated)
# XIOS_LIB xios library (taken into account only if key_xios is activated)
# OASIS_INC oasis include file (taken into account only if key_oasis3 is activated)
# OASIS_LIB oasis library (taken into account only if key_oasis3 is activated)
#
# FC Fortran compiler command
# FCFLAGS Fortran compiler flags
# FFLAGS Fortran 77 compiler flags
# LD linker
# LDFLAGS linker flags, e.g. -L<lib dir> if you have libraries
# FPPFLAGS pre-processing flags
# AR assembler
# ARFLAGS assembler flags
# MK make
# USER_INC complete list of include files
# USER_LIB complete list of libraries to pass to the linker
# CC C compiler used to compile conv for AGRIF
# CFLAGS compiler flags used with CC
#
# Note that:
# - unix variables "$..." are accepted and will be evaluated before calling fcm.
# - fcm variables start with a % (and not a $)
#
#---------------------------------------------------------------------------------------------
#---------------------------------------------------------------------------------------------
# All NETCDF and HDF paths are empty as they are automatically defined through environment
# variables by the load of modules
#---------------------------------------------------------------------------------------------
#---------------------------------------------------------------------------------------------
#
#
%NCDF_INC ${NETCDF4_INCLUDE}
%NCDF_LIB -L${NETCDF4_DIR}/lib -lnetcdff -lnetcdf -L${HDF5_DIR}/lib -lhdf5_hl -lm
%XIOS_INC -I${XIOS_INC}
%XIOS_LIB -L${XIOS_LIB} -lxios -lstdc++
%OASIS_INC -I${OASIS_DIR}/build/lib/mct -I${OASIS_DIR}/build/lib/psmile.MPI1
%OASIS_LIB -L${OASIS_DIR}/lib -lpsmile.MPI1 -lmct -lmpeu -lscrip
%CPP cpp -Dkey_nosignedzero
%FC mpif90 -c -cpp
%FCFLAGS -fdefault-real-8 -Og -g -fbacktrace -funroll-all-loops -fcray-pointer -ffree-line-length-none -fcheck=all,no-array-temps -finit-real=nan -ffpe-trap=invalid,zero,overflow -ffpe-summary=invalid,zero,overflow -fallow-argument-mismatch -Wno-missing-include-dirs
%FFLAGS %FCFLAGS
%LD mpif90
%LDFLAGS -Wl,-rpath,${HDF5_DIR}/lib -Wl,-rpath=${NETCDF4_DIR}/lib -Wl,-rpath=${XIOS_DIR}/lib
%FPPFLAGS -P -traditional
%AR ar
%ARFLAGS rs
%MK make
%USER_INC %XIOS_INC %OASIS_INC %NCDF_INC
%USER_LIB %XIOS_LIB %OASIS_LIB %NCDF_LIB
%CC gcc
%CFLAGS -O0 -fcommon
# compiler options for AA (using INTEL compiler & OpenMPI)
#
# module purge
# module use /home/ar0s/modules
# module load prgenv/intel intel/2021.4.0 openmpi/4.1.1.1 hdf5-parallel/1.10.6 netcdf4-parallel/4.7.4 xios/trunk/rev2320-nmpi
#
#
# NCDF_INC netcdf4 include file
# NCDF_LIB netcdf4 library
# XIOS_INC xios include file (taken into account only if key_xios is activated)
# XIOS_LIB xios library (taken into account only if key_xios is activated)
# OASIS_INC oasis include file (taken into account only if key_oasis3 is activated)
# OASIS_LIB oasis library (taken into account only if key_oasis3 is activated)
#
# FC Fortran compiler command
# FCFLAGS Fortran compiler flags
# FFLAGS Fortran 77 compiler flags
# LD linker
# LDFLAGS linker flags, e.g. -L<lib dir> if you have libraries
# FPPFLAGS pre-processing flags
# AR assembler
# ARFLAGS assembler flags
# MK make
# USER_INC complete list of include files
# USER_LIB complete list of libraries to pass to the linker
# CC C compiler used to compile conv for AGRIF
# CFLAGS compiler flags used with CC
#
# Note that:
# - unix variables "$..." are accepted and will be evaluated before calling fcm.
# - fcm variables start with a % (and not a $)
#
#---------------------------------------------------------------------------------------------
#---------------------------------------------------------------------------------------------
# All NETCDF and HDF paths are empty as they are automatically defined through environment
# variables by the load of modules
#---------------------------------------------------------------------------------------------
#---------------------------------------------------------------------------------------------
#
#
%NCDF_INC ${NETCDF4_INCLUDE}
%NCDF_LIB ${NETCDF4_LIB} -L${HDF5_DIR}/lib -Wl,-rpath,${HDF5_DIR}/lib -lhdf5_hl -lhdf5 -lz
%XIOS_INC -I${XIOS_INC}
%XIOS_LIB -L${XIOS_LIB} -lxios -lstdc++
%OASIS_INC -I${OASIS_DIR}/include
%OASIS_LIB -L${OASIS_DIR}/lib -lpsmile.MPI1 -lmct -lmpeu -lscrip
%CPP cpp
%FC mpifort -c -cpp
%FCFLAGS -march=core-avx2 -i4 -r8 -O3 -fp-model strict -fno-alias -align array64byte
%FFLAGS %FCFLAGS
%LD mpifort
%LDFLAGS
%FPPFLAGS -P -traditional
%AR ar
%ARFLAGS rs
%MK gmake
%USER_INC %XIOS_INC %OASIS_INC %NCDF_INC
%USER_LIB %XIOS_LIB %OASIS_LIB %NCDF_LIB
%CC mpicc
%CFLAGS -O0
# compiler options for AA (using INTEL compiler & OpenMPI)
#
# module purge
# module use /home/ar0s/modules
# module load prgenv/intel intel/2021.4.0 openmpi/4.1.1.1 hdf5-parallel/1.10.6 netcdf4-parallel/4.7.4 xios/trunk/rev2320-nmpi
#
#
# NCDF_INC netcdf4 include file
# NCDF_LIB netcdf4 library
# XIOS_INC xios include file (taken into account only if key_xios is activated)
# XIOS_LIB xios library (taken into account only if key_xios is activated)
# OASIS_INC oasis include file (taken into account only if key_oasis3 is activated)
# OASIS_LIB oasis library (taken into account only if key_oasis3 is activated)
#
# FC Fortran compiler command
# FCFLAGS Fortran compiler flags
# FFLAGS Fortran 77 compiler flags
# LD linker
# LDFLAGS linker flags, e.g. -L<lib dir> if you have libraries
# FPPFLAGS pre-processing flags
# AR assembler
# ARFLAGS assembler flags
# MK make
# USER_INC complete list of include files
# USER_LIB complete list of libraries to pass to the linker
# CC C compiler used to compile conv for AGRIF
# CFLAGS compiler flags used with CC
#
# Note that:
# - unix variables "$..." are accepted and will be evaluated before calling fcm.
# - fcm variables start with a % (and not a $)
#
#---------------------------------------------------------------------------------------------
#---------------------------------------------------------------------------------------------
# All NETCDF and HDF paths are empty as they are automatically defined through environment
# variables by the load of modules
#---------------------------------------------------------------------------------------------
#---------------------------------------------------------------------------------------------
#
#
%NCDF_INC ${NETCDF4_INCLUDE}
%NCDF_LIB ${NETCDF4_LIB} -L${HDF5_DIR}/lib -Wl,-rpath,${HDF5_DIR}/lib -lhdf5_hl -lhdf5 -lz
%XIOS_INC -I${XIOS_INC}
%XIOS_LIB -L${XIOS_LIB} -lxios -lstdc++
%OASIS_INC -I${OASIS_DIR}/include
%OASIS_LIB -L${OASIS_DIR}/lib -lpsmile.MPI1 -lmct -lmpeu -lscrip
%CPP cpp
%FC mpifort -c -cpp
%FCFLAGS -march=core-avx2 -i4 -r8 -g -O0 -debug all -traceback -fp-model strict -ftrapuv -check all,noarg_temp_created -fpe-all0 -ftz -init=arrays,snan,huge
%FFLAGS %FCFLAGS
%LD mpifort
%LDFLAGS
%FPPFLAGS -P -traditional
%AR ar
%ARFLAGS rs
%MK gmake
%USER_INC %XIOS_INC %OASIS_INC %NCDF_INC
%USER_LIB %XIOS_LIB %OASIS_LIB %NCDF_LIB
%CC mpicc
%CFLAGS -O0
# compiler options for BELENOS/TARANIS (using INTEL compiler)
# compiler options for BELENOS/TARANIS (using INTEL compiler + INTEL MPI)
#
# INTEL_IMPI (XIOS-2.5)
# intel/2018.5.274 intelmpi/2018.5.274 phdf5/1.8.18 netcdf_par/4.7.1_V2 xios-2.5_rev1903
# --------------------------------
# INTEL_IMPI (NEMO 4.0 + XIOS-2.5)
# --------------------------------
# module use /home/ext/mr/smer/samsong/modules
# module load intel/2018.5.274 intelmpi/2018.5.274 phdf5/1.8.18 netcdf_par/4.7.1_V2 xios/2.5/rev1903
#
# INTEL_IMPI (XIOS-TRUNK must be compiled)
# gcc/9.2.0 intel/2018.5.274 intelmpi/2018.5.274 phdf5/1.8.18 netcdf_par/4.7.1_V2
# ----------------------------------
# INTEL_IMPI (NEMO 4.2 + XIOS-TRUNK)
# ----------------------------------
# module use /home/ext/mr/smer/samsong/modules
# module load gcc/9.2.0 intel/2018.5.274 intelmpi/2018.5.274 phdf5/1.8.18 netcdf_par/4.7.1_V2 xios/trunk/rev2134
#
#
# NCDF_INC netcdf4 include file
@@ -44,7 +50,7 @@
%NCDF_LIB -L${NETCDF_LIB} -lnetcdff -lnetcdf -L${PHDF5_LIB_DIR} -lhdf5_hl -lhdf5
%XIOS_INC -I${XIOS_INC}
%XIOS_LIB -L${XIOS_LIB} -lxios -lstdc++
%OASIS_INC -I${OASIS_DIR}/build/lib/mct -I${OASIS_DIR}/build/lib/psmile.MPI1
%OASIS_INC -I${OASIS_DIR}/include
%OASIS_LIB -L${OASIS_DIR}/lib -lpsmile.MPI1 -lmct -lmpeu -lscrip
%CPP cpp
......
# compiler options for BELENOS/TARANIS (using INTEL compiler)
# compiler options for BELENOS/TARANIS (using INTEL compiler + INTEL MPI)
#
# INTEL_IMPI (XIOS-2.5)
# intel/2018.5.274 intelmpi/2018.5.274 phdf5/1.8.18 netcdf_par/4.7.1_V2 xios-2.5_rev1903
# --------------------------------
# INTEL_IMPI (NEMO 4.0 + XIOS-2.5)
# --------------------------------
# module use /home/ext/mr/smer/samsong/modules
# module load intel/2018.5.274 intelmpi/2018.5.274 phdf5/1.8.18 netcdf_par/4.7.1_V2 xios/2.5/rev1903
#
# INTEL_IMPI (XIOS-TRUNK must be compiled)
# gcc/9.2.0 intel/2018.5.274 intelmpi/2018.5.274 phdf5/1.8.18 netcdf_par/4.7.1_V2
# ----------------------------------
# INTEL_IMPI (NEMO 4.2 + XIOS-TRUNK)
# ----------------------------------
# module use /home/ext/mr/smer/samsong/modules
# module load gcc/9.2.0 intel/2018.5.274 intelmpi/2018.5.274 phdf5/1.8.18 netcdf_par/4.7.1_V2 xios/trunk/rev2134
#
#
# NCDF_INC netcdf4 include file
@@ -44,7 +50,7 @@
%NCDF_LIB -L${NETCDF_LIB} -lnetcdff -lnetcdf -L${PHDF5_LIB_DIR} -lhdf5_hl -lhdf5
%XIOS_INC -I${XIOS_INC}
%XIOS_LIB -L${XIOS_LIB} -lxios -lstdc++
%OASIS_INC -I${OASIS_DIR}/build/lib/mct -I${OASIS_DIR}/build/lib/psmile.MPI1
%OASIS_INC -I${OASIS_DIR}/include
%OASIS_LIB -L${OASIS_DIR}/lib -lpsmile.MPI1 -lmct -lmpeu -lscrip
%CPP cpp
......
This diff is collapsed.
@@ -19,12 +19,13 @@
</variable_definition>
<!-- Fields definition -->
<field_definition src="./field_def_nemo-oce.xml"/> <!-- NEMO ocean dynamics -->
<field_definition src="./field_def_nemo-pisces.xml"/> <!-- NEMO ocean dynamics -->
<field_definition src="./field_def_nemo-oce.xml"/> <!-- NEMO ocean dynamics -->
<field_definition src="./field_def_bfm.xml"/> <!-- BFM BGC dynamics -->
<!-- Files definition -->
<file_definition src="./file_def_nemo.xml"/> <!-- NEMO ocean dynamics -->
<file_definition src="./file_def_nemo.xml"/> <!-- NEMO ocean dynamics -->
<file_definition src="./file_def_bfm.xml"/> <!-- BFM BGC dynamics -->
<!-- Axis definition -->
<axis_definition src="./axis_def_nemo.xml"/>
......
@@ -19,13 +19,14 @@
!-----------------------------------------------------------------------
&namrun ! parameters of the run
!-----------------------------------------------------------------------
cn_exp = "GYRE" ! experience name
nn_it000 = 1 ! first time step
nn_itend = 4320 ! last time step
nn_leapy = 30 ! Leap year calendar (1) or not (0)
nn_stock = 4320 ! frequency of creation of a restart file (modulo referenced to 1)
nn_write = 60 ! frequency of write in the output file (modulo referenced to nn_it000)
nn_istate = 0 ! output the initial state (1) or not (0)
cn_exp = "GYRE_BFM" ! experience name
nn_date0 = 20000101 ! date at nit_0000 (format yyyymmdd) used if ln_rstart=F or (ln_rstart=T and nn_rstctl=0 or 1)
nn_it000 = 1 ! first time step
nn_itend = 4320 ! last time step
nn_leapy = 30 ! Leap year calendar (1) or not (0)
nn_stock = 4320 ! frequency of creation of a restart file (modulo referenced to 1)
nn_write = 60 ! frequency of write in the output file (modulo referenced to nn_it000)
nn_istate = 0 ! output the initial state (1) or not (0)
/
!-----------------------------------------------------------------------
&namcfg ! parameters of the configuration (default: user defined GYRE)
@@ -49,6 +50,8 @@
ln_linssh = .true. ! =T linear free surface ==>> model level are fixed in time
!
rn_Dt = 7200. ! time step for the dynamics
!
ln_meshmask = .false. ! =T create a mesh file
/
!!======================================================================
@@ -245,10 +248,14 @@
!-----------------------------------------------------------------------
&nammpp ! Massively Parallel Processing
!-----------------------------------------------------------------------
nn_hls = 1 ! halo width (applies to both rows and columns)
nn_comm = 1 ! comm choice
/
!-----------------------------------------------------------------------
&namctl ! Control prints (default: OFF)
!-----------------------------------------------------------------------
sn_cfctl%l_runstat = .FALSE. ! switches and which areas produce reports with the proc integer settings.
ln_timing = .false. ! timing by routine write out in timing.output file
/
!-----------------------------------------------------------------------
&namsto ! Stochastic parametrization of EOS (default: OFF)
......
@@ -4,11 +4,14 @@
&namtrc_run ! run information
!-----------------------------------------------------------------------
ln_top_euler = .true. ! use Euler time-stepping for TOP
ln_rsttr = .false.
/
!-----------------------------------------------------------------------
&namtrc ! tracers definition
!-----------------------------------------------------------------------
ln_trcdta = .false. ! Initialisation from data input file (T) or not (F)
jp_bgc = 1 ! Modified runtime by BFM interface
ln_my_trc = .true.
ln_trcdta = .false. ! Initialisation from data input file (T) or not (F)
!
! ! name ! title of the field ! units ! initial data from file or not !
sn_tracer(1) = 'DUMMY ' , 'Dummy tracer ' , 'dummy-units' , .false.
@@ -23,9 +26,8 @@
/
!-----------------------------------------------------------------------
&namtrc_adv ! advection scheme for passive tracer (default: NO selection)
ln_trcadv_fct = .true. ! FCT scheme
nn_fct_h = 2 ! =2/4, horizontal 2nd / 4th order
nn_fct_v = 2 ! =2/4, vertical 2nd / COMPACT 4th order
ln_trcadv_fct = .false.
ln_trcadv_mus = .true.
!-----------------------------------------------------------------------
/
!-----------------------------------------------------------------------
@@ -36,6 +38,7 @@
!-----------------------------------------------------------------------
&namtrc_rad ! treatment of negative concentrations
!-----------------------------------------------------------------------
ln_trcrad = .true.
/
!-----------------------------------------------------------------------
&namtrc_dmp ! passive tracer newtonian damping
@@ -44,6 +47,7 @@
!-----------------------------------------------------------------------
&namtrc_ice ! Representation of sea ice growth & melt effects
!-----------------------------------------------------------------------
nn_ice_tr = -1
/
!-----------------------------------------------------------------------
&namtrc_trd ! diagnostics on tracer trends ('key_trdtrc')
@@ -52,6 +56,12 @@
!----------------------------------------------------------------------
&namtrc_bc ! data for boundary conditions
!-----------------------------------------------------------------------
cn_dir_sbc = './'
cn_dir_cbc = './'
cn_dir_obc = './'
ln_rnf_ctl = .false.
rn_sbc_time = 86400.
rn_cbc_time = 86400.
/
!----------------------------------------------------------------------
&namtrc_bdy ! Setup of tracer boundary conditions
......
#! /bin/sh
### This is an example of a runscript for the LSF queueing system
#BSUB -a poe
#BSUB -J GYRE_BFM # Name of the job.
#BSUB -o GYRE_BFM_%J.out # Appends std output to file %J.out.
#BSUB -e GYRE_BFM_%J.err # Appends std error to file %J.err.
#BSUB -P nemo
#BSUB -q poe_short # queue
#BSUB -n 4 # Number of CPUs
set -evx
export MP_WAIT_MODE=poll
export MP_POLLING_INTERVAL=30000000
export MP_SHARED_MEMORY=yes
export MP_EUILIB=us
export MP_EUIDEVICE=sn_all
export LDR_CNTRL=TEXTPSIZE=64K@STACKPSIZE=64K@DATAPSIZE=64K
export MP_TASK_AFFINITY=core
EXP="EXP00"
workdir="TO_BE_SET_BY_USER"
execdir=`pwd`
if [ ! -d ${workdir} ] ; then
mkdir -p ${workdir}
fi
cd ${workdir}
rm -rf *
# Copy files to exp folder
cp ${execdir}/opa ./opa.x
cp ${execdir}/* ./
# Launch the model
mpirun.lsf opa.x
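A hedged usage note: with LSF, a runscript like the one above is normally submitted with
bsub, which reads the #BSUB directives from the file, e.g.

    bsub < ./runscript_gyre_bfm

(the script file name is illustrative; standard output and error then appear as
GYRE_BFM_<jobid>.out and GYRE_BFM_<jobid>.err, as requested by the -o and -e directives).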
-----------------------------------------------------------------------
Coupling with the Biogeochemical Flux Model (BFM)
-----------------------------------------------------------------------
Author: M. Vichi, BFM system team and NEMO system team
INFO HELPDESK: info@bfm-community.eu
WEB SITE: www.bfm-community.eu
REVISION DATE: October 2013
Please address any technical query to the BFM System Team
bfm_st@lists.cmcc.it
-----------------------------------------------------------------------
-----------------------------------------------------------------------
What is the BFM?
-----------------------------------------------------------------------
The Biogeochemical Flux Model (BFM) is a numerical model for the
simulation of the dynamics of major biogeochemical properties
in marine ecosystems. The BFM is open source software freely available
under the GNU Public License. The model can be used in standalone mode
to simulate a 0-D system or coupled with other OGCMs.
The coupling with NEMO is maintained by CMCC as part of the
NEMO System Team activity.
-----------------------------------------------------------------------
How to get the BFM code
-----------------------------------------------------------------------
The code can be downloaded from http://www.bfm-community.eu after
the registration of a new user. Follow the instructions on how to
install the code. It is recommended to run the STANDALONE test cases
before using the NEMO-BFM coupled system.
-----------------------------------------------------------------------
Compile NEMO with the BFM
-----------------------------------------------------------------------
NEMO-BFM is compiled from the BFM configuration script relying on the
NEMO FCM compilation environment. This is done to allow BFM users to
use new configurations in NEMO that are not part of the NEMO
standard distribution code.
The BFM configuration shipped with NEMO is GYRE_BFM (see next section).
Make sure that the BFMDIR variable is defined in your environment
and define the variable NEMODIR pointing to the root of the NEMO source code.
It is assumed here that you have expanded the BFM in /home/user/bfm
and the root of this NEMO directory in /home/user/nemo,
and that you have already adjusted the appropriate ARCHFILE that
is used for the NEMO compilation with makenemo in ../../ARCH.
Execute the following commands:
>> export BFMDIR=/home/user/bfm
>> export NEMODIR=/home/user/nemo
>> cd $BFMDIR/build
>> ./bfm_config.sh -gcd -p GYRE_BFM
The script will generate (-g) the BFM code, then launch
makenemo for compilation (-c) and create the run directory
(-d) in $BFMDIR/run.
To get information on how to use the BFM configuration script, run
>> ./bfm_config.sh -h
-----------------------------------------------------------------------
Standard test case
-----------------------------------------------------------------------
The distributed standard test case is GYRE_BFM, a version of GYRE
with a full-blown BFM. It is a demonstration simulation and is not
meant to produce any published result.
GYRE_BFM runs with analytical input data only.
The namelists for the BFM are not distributed with NEMO but are
generated directly by the BFM, in directory $BFMDIR/run/gyre_bfm.
The generation of the BFM namelists also copies the required NEMO
namelist and namelist_top files to this directory.
This is why there are no namelist files found in the standard
run directory $NEMODIR/NEMOGCM/CONFIG/GYRE_BFM/EXP00.
Note for expert users:
If a user prefers to work in the NEMO directory, then the
generated namelists have to be copied there:
>> cp $BFMDIR/run/gyre_bfm/* $NEMODIR/NEMOGCM/CONFIG/GYRE_BFM/EXP00
Once the BFM code has been generated the first time, the code can
also be rebuilt with the standard NEMO command:
>> ./makenemo -n GYRE_BFM -m ARCHFILE -e $BFMDIR/src/nemo
-----------------------------------------------------------------------
Other examples
-----------------------------------------------------------------------
Other couplings with NEMO are available in $BFMDIR/build/configurations.
Run the command
>> ./bfm_config.sh -P
to get a list of available presets.
Please refer to the README file in each directory for more information.
# NEMO coupling with the Biogeochemical Flux Model (BFM)
## What is the BFM?
The Biogeochemical Flux Model (BFM) is a numerical model for the simulation of the dynamics of major biogeochemical properties in marine ecosystems (see www.bfm-community.eu). BFM is open source software freely available under the GNU Public License.
The model can be used in standalone mode to simulate a 0-D system or coupled with other OGCMs.
The coupling with NEMO is maintained by CMCC as part of the NEMO System Team activity.
## How to get the BFM code
Access to the code is provided through the BFM website http://www.bfm-community.eu, along with instructions on how to install and use it in the documentation `Quick Guide`.
It is recommended to run the STANDALONE test cases before using the NEMO-BFM coupled system.
## Compile NEMO with BFM
NEMO-BFM is compiled from the BFM configuration script exploiting the NEMO FCM compilation environment. This is done to allow BFM users to create new configurations in NEMO that are not part of the NEMO standard distribution code.
The BFM configuration shipped with NEMO is `GYRE_BFM` (described in next section).
Make sure to define the following variables in your shell environment, pointing to the code root paths:
- `BFMDIR`, pointing to the root of the BFM source code
- `NEMODIR`, pointing to the root of the NEMO source code
Check that the appropriate ARCHFILE used for the NEMO compilation with makenemo is associated with the ARCH field within the configuration file of the selected BFM preset, e.g. `$BFMDIR/build/configurations/GYRE_BFM/configuration`.
Below is an example of the command sequence for the `GYRE_BFM` preset (-p):
```
$> export BFMDIR=/home/user/bfm
$> export NEMODIR=/home/user/nemo
$> cd $BFMDIR/build
$> ./bfm_configure.sh -gcd -p GYRE_BFM
```
The script will generate (-g) the BFM code, then launch makenemo for compilation (-c) and create the run directory (-d) in $BFMDIR/run.
To get information on how to use the BFM configuration script execute the following:
```$> ./bfm_configure.sh -h```
## GYRE_BFM standard configuration
The distributed standard test case is GYRE_BFM, a version of GYRE with a full-blown BFM.
It is a demonstration simulation and it is not meant to produce any published result. GYRE_BFM runs with analytical input data only.
The namelists for the BFM are not distributed with NEMO but are generated directly by the BFM, in directory `$BFMDIR/run/gyre_bfm`.
The generation of the BFM namelists also copies the required NEMO namelist and namelist_top files to this directory.
This is why there are no namelist files found in the standard run directory `$NEMODIR/cfgs/GYRE_BFM/EXPREF`.
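If you prefer to work from the NEMO tree instead, the generated namelists can be copied there. A minimal sketch, assuming the 4.2 directory layout (the destination experiment directory is illustrative and may differ in your setup):
```
$> cp $BFMDIR/run/gyre_bfm/* $NEMODIR/cfgs/GYRE_BFM/EXP00
```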
Please refer to the README file in the preset directory for more information.
## Contacts
Please visit www.bfm-community.eu for further information and address any technical query to the BFM System Team at `bfm_st@lists.cmcc.it`.