Running NextGen at CONUS Scale - A Step by Step Instruction #794
Merged
Conversation
program-- requested changes (Apr 18, 2024)
Thank you Justin for the careful review. I'll implement the suggestions tomorrow.
On Thu, Apr 18, 2024 at 9:30 AM, Justin Singh-M. - NOAA requested changes on this pull request:
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +# Download the Codes
+
+To download the `ngen` source code, run the following commands:
+
+`git clone https://github.com/NOAA-OWP/ngen.git` <https://github.com/NOAA-OWP/ngen.git>
+`cd ngen`
+
+Then we need all the submodule codes. So run the command below:
+
+`git submodule update --init --recursive`
+
+# Setting up the Environment
+
+For setting up the build and computation environment, we refer the users to our documentation chapter [DEPENDENCIES.md](DEPENDENCIES.md) for details. Basically, you will need to have access to C/C++ compiler, MPI, Boost, NetCDF, Cmake, SQLite3. Some of them may already be on your system. Otherwise, you have to install your own version. There are also some required software packages that come with `ngen` as submodules, such as `Udunits libraries`, `pybind11`, and `iso_c_fortran_bmi`.
+
+You are most likely need to use Python. For that we recommend setting up a virtual environment. For details, see [PYTHON_ROUTING.md](PYTHON_ROUTING.md). After setting up the Python virtual environment and activating it, you may need install additional python modules depending what `ngen submodules` you want to run.
⬇️ Suggested change
-You are most likely need to use Python. For that we recommend setting up a virtual environment. For details, see [PYTHON_ROUTING.md](PYTHON_ROUTING.md). After setting up the Python virtual environment and activating it, you may need install additional python modules depending what `ngen submodules` you want to run.
+You most likely need to use Python. For that we recommend setting up a virtual environment. For details, see [PYTHON_ROUTING.md](PYTHON_ROUTING.md). After setting up the Python virtual environment and activating it, you may need to install additional Python modules depending on what `ngen` submodules you want to run.
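As an aside, a minimal sketch of setting up such a virtual environment is shown below; the package names are assumptions, and [PYTHON_ROUTING.md](PYTHON_ROUTING.md) plus the individual submodule READMEs remain the authoritative references.

```
# Create and activate a Python virtual environment (illustrative sketch)
python3 -m venv venv-ngen
source venv-ngen/bin/activate

# Install commonly needed packages; the exact set depends on which ngen
# submodules you enable (the names below are assumptions, not a complete list)
pip install numpy netCDF4
```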
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +`git clone https://github.com/NOAA-OWP/ngen.git` <https://github.com/NOAA-OWP/ngen.git>
+`cd ngen`
+
+Then we need all the submodule codes. So run the command below:
+
+`git submodule update --init --recursive`
+
+# Setting up the Environment
+
+For setting up the build and computation environment, we refer the users to our documentation chapter [DEPENDENCIES.md](DEPENDENCIES.md) for details. Basically, you will need to have access to C/C++ compiler, MPI, Boost, NetCDF, Cmake, SQLite3. Some of them may already be on your system. Otherwise, you have to install your own version. There are also some required software packages that come with `ngen` as submodules, such as `Udunits libraries`, `pybind11`, and `iso_c_fortran_bmi`.
+
+You are most likely need to use Python. For that we recommend setting up a virtual environment. For details, see [PYTHON_ROUTING.md](PYTHON_ROUTING.md). After setting up the Python virtual environment and activating it, you may need install additional python modules depending what `ngen submodules` you want to run.
+
+# Build the Executable
+
+After setting up the environment variables, we need first build the necessay dynamically linked librares. Although `ngen` has capability for automated building of submodule libraries, we build them explicitly so that users have a better understanding. For simplicity, we display the content a script which we name it `build_libs`.
⬇️ Suggested change
-After setting up the environment variables, we need first build the necessay dynamically linked librares. Although `ngen` has capability for automated building of submodule libraries, we build them explicitly so that users have a better understanding. For simplicity, we display the content a script which we name it `build_libs`.
+After setting up the environment variables, we need to first build the necessary dynamically linked libraries. Although `ngen` has the capability for automated building of submodule libraries, we build them explicitly so that users have a better understanding. For simplicity, we display the content of a script which we name `build_libs`.
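For readers who want the general pattern before the full script: each submodule library is configured and then built with a pair of CMake commands, as in the sketch below for `SoilMoistureProfiles` (one of the libraries built by `build_libs`); the other submodules differ only in source path and target name.

```
# Configure, then build, one submodule library (SoilMoistureProfiles as an example)
cmake -B extern/SoilMoistureProfiles/SoilMoistureProfiles/cmake_build \
      -S extern/SoilMoistureProfiles/SoilMoistureProfiles
cmake --build extern/SoilMoistureProfiles/SoilMoistureProfiles/cmake_build --target smpbmi -- -j 2
```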
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +cmake --build extern/SoilMoistureProfiles/SoilMoistureProfiles/cmake_build --target smpbmi -- -j 2 &&
+```
+
+Copy the content into the file named `build_libs` and run the command:
+
+```
+source build_libs
+```
+
+This will build all libraries we need to run `ngen` at the time of this writing.
+
+Then, with the Python virtual environment activated, we can build the MPI executable using the following script:
+
+```
+cmake -S . -B cmake_build_mpi -DCMAKE_C_COMPILER=/local/lib/bin/mpicc -DCMAKE_CXX_COMPILER=/local/lib/bin/mpicxx \
+-DBOOST_ROOT=/home/shengting.cui/usr/boost_1_79_0/ \
⬇️ Suggested change
--DBOOST_ROOT=/home/shengting.cui/usr/boost_1_79_0/ \
+-DBOOST_ROOT=<path-to-Boost-ROOT-Dir> \
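On a related note, the compiler and Boost paths in this configure command are site-specific; a small sketch of checking them up front (the values below are placeholders, not recommendations):

```
# Site-specific paths; the values below are placeholders to adjust
export BOOST_ROOT=$HOME/usr/boost_1_79_0   # wherever your Boost installation lives
which mpicc mpicxx                         # confirm the MPI compiler wrappers that will be
                                           # passed to -DCMAKE_C_COMPILER / -DCMAKE_CXX_COMPILER
```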
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> + -DNGEN_WITH_EXTERN_PET:BOOL=OFF \
+ -DNGEN_WITH_EXTERN_NOAH_OWP_MODULAR:BOOL=ON
+cmake --build cmake_build_mpi --target all -j 8
+```
+
+For the meaning of each option in the script, see `ngen/wiki` [build](https://github.com/NOAA-OWP/ngen/wiki/Building) page.
+
+Suppose the above script is named `build_mpi`, execute the following command to build:
+
+`source build_mpi`
+
+This will build an executable in the `cmake_build_mpi` directory named `ngen` and another named `partitionGenerator` as well as all the unit tests in the `cmake_build_mpi/test`.
+
+# CONUS Hydrofabric
+
+The CONUS hydrofabric is downloaded from [here](https://www.lynker-spatial.com/#v20.1/). The file name under the list is `conus.gpkg`. It is cautioned that since the data there are evolving and newer version may be available in the future. When using a newer version, be mindful that the corresponding initial configuration file generation and validation for all submodules at CONUS scale are necessary, which may be a non-trivial process due to the shear size of the spatial scale.
⬇️ Suggested change
-The CONUS hydrofabric is downloaded from [here](https://www.lynker-spatial.com/#v20.1/). The file name under the list is `conus.gpkg`. It is cautioned that since the data there are evolving and newer version may be available in the future. When using a newer version, be mindful that the corresponding initial configuration file generation and validation for all submodules at CONUS scale are necessary, which may be a non-trivial process due to the shear size of the spatial scale.
+The CONUS hydrofabric is downloaded from [here](https://www.lynker-spatial.com/#v20.1/). The file name under the list is `conus.gpkg`. Note that since the data there is continually evolving, a newer version may be available in the future. When using a newer version, be mindful that the corresponding initial configuration file generation and validation for all submodules at CONUS scale is necessary, which may be a non-trivial process due to the sheer size of the spatial scale.
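Once the build finishes, a quick smoke test along these lines can confirm the expected artifacts exist (running `ngen` with no arguments is only expected to print a usage summary rather than start a simulation; that behavior is an assumption worth verifying on your build):

```
# Confirm the parallel build produced the executables and unit tests
ls -l cmake_build_mpi/ngen cmake_build_mpi/partitionGenerator
ls cmake_build_mpi/test | head

# Invoking ngen with no arguments should just print usage information
./cmake_build_mpi/ngen
```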
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +cmake --build cmake_build_mpi --target all -j 8
+```
+
+For the meaning of each option in the script, see `ngen/wiki` [build](https://github.com/NOAA-OWP/ngen/wiki/Building) page.
+
+Suppose the above script is named `build_mpi`, execute the following command to build:
+
+`source build_mpi`
+
+This will build an executable in the `cmake_build_mpi` directory named `ngen` and another named `partitionGenerator` as well as all the unit tests in the `cmake_build_mpi/test`.
+
+# CONUS Hydrofabric
+
+The CONUS hydrofabric is downloaded from [here](https://www.lynker-spatial.com/#v20.1/). The file name under the list is `conus.gpkg`. It is cautioned that since the data there are evolving and newer version may be available in the future. When using a newer version, be mindful that the corresponding initial configuration file generation and validation for all submodules at CONUS scale are necessary, which may be a non-trivial process due to the shear size of the spatial scale.
+
+As the file is fairly large, it is worth some consideration to store it in a proper place, then simply build a symbolic link in the `ngen` home directory, thus named `./hydrofabric/conus.gpkg`. Note the easiest way to create the symbolic link is to `makedir hydrofabric` and then create the full path.
⬇️ Suggested change
-As the file is fairly large, it is worth some consideration to store it in a proper place, then simply build a symbolic link in the `ngen` home directory, thus named `./hydrofabric/conus.gpkg`. Note the easiest way to create the symbolic link is to `makedir hydrofabric` and then create the full path.
+As the file is fairly large, it is worth storing it in a suitable location and then simply creating a symbolic link in the `ngen` home directory so that it can be referenced as `./hydrofabric/conus.gpkg`. Note the easiest way to do this is to create a `hydrofabric` directory and place the symbolic link to the file inside it.
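A concrete sketch of that layout, with the storage path as an assumption for illustration:

```
# Store the large geopackage somewhere with ample space (source path is illustrative)
# and expose it to ngen as ./hydrofabric/conus.gpkg
mkdir -p hydrofabric
ln -s /data/hydrofabric_v20.1/conus.gpkg hydrofabric/conus.gpkg
```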
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +
+As the file is fairly large, it is worth some consideration to store it in a proper place, then simply build a symbolic link in the `ngen` home directory, thus named `./hydrofabric/conus.gpkg`. Note the easiest way to create the symbolic link is to `makedir hydrofabric` and then create the full path.
+
+# Generate Partition For Parallel Computation
+
+For parallel computation using MPI on hydrofabric, a [partition generate tool](Distributed_Processing.md) is used to partition the hydrofabric features ids into a number of partitions equal to the number of MPI processing CPU cores. To generate the partition file, run the following command:
+
+```
+./cmake-build_mpi/partitionGenerator ./hydrofabric/conus.gpkg ./hydrofabric/conus.gpkg ./partition_config_32.json 32 '' ''
+```
+
+In the command above, `conus.gpkg` is the NextGen hydrofabric version 2.01 for CONUS, `partition_config_32.json` is the partition file that contains all features ids and their interconnected network information. The number `32` is intended number of processing cores for running parallel build `ngen` using MPI. The last two empty strings, as indicated by `''`, indicate there is no subsetting, i.e., we intend to run the whole CONUS hydrofabric.
+
+# Prepare the Input Data
+
+Input data include the forcing data and initial parameter data for various submodules. These depend on what best suit the user need. For our case, as of this documentation, beside forcing data, which can be accessed at `./forcing/NextGen_forcing_2016010100.nc` using the symbolic link scheme, we also generated initial input data for various submodules `noah-owp-modular`, `PET`, `CFE`, `SoilMoistureProfiles (SMP)`, `SoilFreezeThaw (SFT)`. The first three are located in `./conus_config/`, the SMP initial configus are located in `./conus_smp_configs/` and the SFT initial configs are located in `./conus_sft_configs/`.
⬇️ Suggested change
-Input data include the forcing data and initial parameter data for various submodules. These depend on what best suit the user need. For our case, as of this documentation, beside forcing data, which can be accessed at `./forcing/NextGen_forcing_2016010100.nc` using the symbolic link scheme, we also generated initial input data for various submodules `noah-owp-modular`, `PET`, `CFE`, `SoilMoistureProfiles (SMP)`, `SoilFreezeThaw (SFT)`. The first three are located in `./conus_config/`, the SMP initial configus are located in `./conus_smp_configs/` and the SFT initial configs are located in `./conus_sft_configs/`.
+Input data includes the forcing data and initial parameter data for various submodules. These depend on what best suits the user's needs. For our case, as of this documentation, besides forcing data, which can be accessed at `./forcing/NextGen_forcing_2016010100.nc` using the symbolic link scheme, we also generated initial input data for various submodules: `noah-owp-modular`, `PET`, `CFE`, `SoilMoistureProfiles (SMP)`, and `SoilFreezeThaw (SFT)`. The first three are located in `./conus_config/`, the SMP initial configs are located in `./conus_smp_configs/`, and the SFT initial configs are located in `./conus_sft_configs/`.
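The forcing file can be wired up with the same symbolic-link scheme as the hydrofabric; a minimal sketch, assuming the NetCDF forcing file has already been downloaded to a storage area (the source path is a placeholder):

```
# Expose the forcing NetCDF file at the path referenced by the realization configs
mkdir -p forcing
ln -s /data/forcing/NextGen_forcing_2016010100.nc forcing/NextGen_forcing_2016010100.nc
```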
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +./cmake-build_mpi/partitionGenerator ./hydrofabric/conus.gpkg ./hydrofabric/conus.gpkg ./partition_config_32.json 32 '' ''
+```
+
+In the command above, `conus.gpkg` is the NextGen hydrofabric version 2.01 for CONUS, `partition_config_32.json` is the partition file that contains all features ids and their interconnected network information. The number `32` is intended number of processing cores for running parallel build `ngen` using MPI. The last two empty strings, as indicated by `''`, indicate there is no subsetting, i.e., we intend to run the whole CONUS hydrofabric.
+
+# Prepare the Input Data
+
+Input data include the forcing data and initial parameter data for various submodules. These depend on what best suit the user need. For our case, as of this documentation, beside forcing data, which can be accessed at `./forcing/NextGen_forcing_2016010100.nc` using the symbolic link scheme, we also generated initial input data for various submodules `noah-owp-modular`, `PET`, `CFE`, `SoilMoistureProfiles (SMP)`, `SoilFreezeThaw (SFT)`. The first three are located in `./conus_config/`, the SMP initial configus are located in `./conus_smp_configs/` and the SFT initial configs are located in `./conus_sft_configs/`.
+
+For code used to generate the initial config files for the various modules, the interested users are directed to this [web location](https://github.com/NOAA-OWP/ngen-cal/tree/master/python/ngen_config_gen).
+
+The users are warned that since the simulated region is large, some of the initial config parameters values for some catchments may be unsuitable and cause the `ngen` execution to stop due to errors. Usually, in such cases, either `ngen` or the submodule itself may provide some hint as to the catchment ids or the location of the code that caused the error. Users may follow these hints to figure out as to which initial input parameter or parameters are initialized with inappropriate values. In the case of SFT, an initial value of `smcmax=1.0` would be too large. In the case of SMP, an initial value of `b=0.01` would be too small, for example.
+
+# Build the Realization Configurations
+
+The realization configuration file in `Json` format contains high level information to run a `ngen` simulation, such as interconnected submodules, paths to forcing file, shared libraries, initialization parameters, duration of simulation, I/O variables, etc. We have built the realization configurations for several commonly used submodules which are located in `data/baseline/`. These are built by adding one submodule at a time, test run for 10 days simulation time. The successive submodules used are:
⬇️ Suggested change
-The realization configuration file in `Json` format contains high level information to run a `ngen` simulation, such as interconnected submodules, paths to forcing file, shared libraries, initialization parameters, duration of simulation, I/O variables, etc. We have built the realization configurations for several commonly used submodules which are located in `data/baseline/`. These are built by adding one submodule at a time, test run for 10 days simulation time. The successive submodules used are:
+The realization configuration file, in JSON format, contains the high-level information needed to run an `ngen` simulation, such as the interconnected submodules, paths to the forcing file, shared libraries, initialization parameters, duration of simulation, I/O variables, etc. We have built the realization configurations for several commonly used submodules, which are located in `data/baseline/`. These are built by adding one submodule at a time, performing a test run for a 10-day simulation. The successive submodules used are:
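Regarding the warning above about unsuitable initial parameter values: when `ngen` or a submodule points at a particular catchment, a quick grep over the corresponding config files is often enough to spot the offending entry. The sketch below assumes the key=value config format and a file-naming convention that includes the catchment id; both are assumptions to adapt.

```
# Illustrative only: file naming and key=value layout are assumptions.
# If an error points at, say, catchment cat-100000, inspect its SFT and SMP configs:
grep -Hn "smcmax" conus_sft_configs/*cat-100000*
grep -Hn "b="     conus_smp_configs/*cat-100000*
```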
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +# Build the Realization Configurations
+
+The realization configuration file in `Json` format contains high level information to run a `ngen` simulation, such as interconnected submodules, paths to forcing file, shared libraries, initialization parameters, duration of simulation, I/O variables, etc. We have built the realization configurations for several commonly used submodules which are located in `data/baseline/`. These are built by adding one submodule at a time, test run for 10 days simulation time. The successive submodules used are:
+```
+sloth (conus_bmi_multi_realization_config_w_sloth.json)
+sloth+noah-owp-modular (conus_bmi_multi_realization_config_w_sloth_noah.json)
+sloth+noah-owp-modular+pet (conus_bmi_multi_realization_config_w_sloth_noah_pet.json)
+sloth+noah-owp-modular+pet+cfe (conus_bmi_multi_realization_config_w_sloth_noah_pet_cfe.json)
+sloth+noah-owp-modular+pet+smp (conus_bmi_multi_realization_config_w_sloth_noah_pet_smp.json)
+sloth+noah-owp-modular+pet+smp+sft (conus_bmi_multi_realization_config_w_sloth_noah_pet_smp_sft.json)
+sloth+noah-owp-modular+pet+smp+sft+cfe (conus_bmi_multi_realization_config_w_sloth_noah_pet_smp_sft_cfe.json)
+```
+
+# Run Computations with Submodules
+
+With all preparation steps completed, we are now ready to run computations. We use MPI as our parallel processing application with 32 cores as an example. Users are free to choose whatever number cores they want, just make sure you will need to have the appropriate corresponding partition json file for the number of cores used. The command line for running a MPI job is as sollows:
⬇️ Suggested change
-With all preparation steps completed, we are now ready to run computations. We use MPI as our parallel processing application with 32 cores as an example. Users are free to choose whatever number cores they want, just make sure you will need to have the appropriate corresponding partition json file for the number of cores used. The command line for running a MPI job is as sollows:
+With all preparation steps completed, we are now ready to run computations. We use MPI as our parallel processing application with 32 cores as an example. Users are free to choose whatever number of cores they want; just make sure you have the appropriate corresponding partition JSON file for the number of cores used. The command line for running an MPI job is as follows:
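For orientation on the realization files listed above, a heavily abbreviated skeleton of such a configuration is sketched below; the key names and values are illustrative assumptions only, and the actual files in `data/baseline/` are the authoritative reference.

```
{
  "global": {
    "formulations": [
      {
        "name": "bmi_multi",
        "params": {
          "model_type_name": "sloth_noah_pet_cfe",
          "main_output_variable": "Q_OUT",
          "modules": [ "... one entry per BMI submodule ..." ]
        }
      }
    ],
    "forcing": {
      "path": "./forcing/NextGen_forcing_2016010100.nc",
      "provider": "NetCDF"
    }
  },
  "time": {
    "start_time": "2016-01-01 00:00:00",
    "end_time": "2016-01-10 23:00:00",
    "output_interval": 3600
  }
}
```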
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +sloth+noah-owp-modular (conus_bmi_multi_realization_config_w_sloth_noah.json)
+sloth+noah-owp-modular+pet (conus_bmi_multi_realization_config_w_sloth_noah_pet.json)
+sloth+noah-owp-modular+pet+cfe (conus_bmi_multi_realization_config_w_sloth_noah_pet_cfe.json)
+sloth+noah-owp-modular+pet+smp (conus_bmi_multi_realization_config_w_sloth_noah_pet_smp.json)
+sloth+noah-owp-modular+pet+smp+sft (conus_bmi_multi_realization_config_w_sloth_noah_pet_smp_sft.json)
+sloth+noah-owp-modular+pet+smp+sft+cfe (conus_bmi_multi_realization_config_w_sloth_noah_pet_smp_sft_cfe.json)
+```
+
+# Run Computations with Submodules
+
+With all preparation steps completed, we are now ready to run computations. We use MPI as our parallel processing application with 32 cores as an example. Users are free to choose whatever number cores they want, just make sure you will need to have the appropriate corresponding partition json file for the number of cores used. The command line for running a MPI job is as sollows:
+
+For a simple example run and quick turn around, you can run:
+
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth.json conus_partition_32.json
⬇️ Suggested change
-run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth.json conus_partition_32.json
+mpiexec -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth.json conus_partition_32.json
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +```
+
+# Run Computations with Submodules
+
+With all preparation steps completed, we are now ready to run computations. We use MPI as our parallel processing application with 32 cores as an example. Users are free to choose whatever number cores they want, just make sure you will need to have the appropriate corresponding partition json file for the number of cores used. The command line for running a MPI job is as sollows:
+
+For a simple example run and quick turn around, you can run:
+
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth.json conus_partition_32.json
+```
+
+For a more substantial example simulation, you can run:
+
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah.json conus_partition_32.json
⬇️ Suggested change
-run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah.json conus_partition_32.json
+mpiexec -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah.json conus_partition_32.json
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +
+For a simple example run and quick turn around, you can run:
+
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth.json conus_partition_32.json
+```
+
+For a more substantial example simulation, you can run:
+
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah.json conus_partition_32.json
+```
+
+For an example taken into account more realistic contributions, you can try:
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah_pet_smp_sft_cfe.json conus_partition_32.json
⬇️ Suggested change
-run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah_pet_smp_sft_cfe.json conus_partition_32.json
+mpiexec -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah_pet_smp_sft_cfe.json conus_partition_32.json
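On an HPC system these commands would normally go through a batch scheduler rather than be run interactively; a sketch under the assumption of a SLURM cluster (job name, queue, walltime, and paths are placeholders) might look like:

```
#!/bin/bash
#SBATCH --job-name=ngen_conus
#SBATCH --nodes=1
#SBATCH --ntasks=32            # must match the partition JSON used (32 here)
#SBATCH --time=04:00:00        # walltime is a placeholder
#SBATCH --partition=compute    # queue name is site-specific

source venv-ngen/bin/activate  # the Python virtual environment used when building

mpiexec -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' \
    data/baseline/conus_bmi_multi_realization_config_w_sloth_noah.json conus_partition_32.json
```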
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +
+With all preparation steps completed, we are now ready to run computations. We use MPI as our parallel processing application with 32 cores as an example. Users are free to choose whatever number cores they want, just make sure you will need to have the appropriate corresponding partition json file for the number of cores used. The command line for running a MPI job is as sollows:
+
+For a simple example run and quick turn around, you can run:
+
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth.json conus_partition_32.json
+```
+
+For a more substantial example simulation, you can run:
+
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah.json conus_partition_32.json
+```
+
+For an example taken into account more realistic contributions, you can try:
⬇️ Suggested change
-For an example taken into account more realistic contributions, you can try:
+For an example taking into account more realistic contributions, you can try:
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> +```
+
+For a more substantial example simulation, you can run:
+
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah.json conus_partition_32.json
+```
+
+For an example taken into account more realistic contributions, you can try:
+```
+run -n 32 ./cmake_build_mpi/ngen ./hydrofabric/conus.gpkg '' ./hydrofabric/conus.gpkg '' data/baseline/conus_bmi_multi_realization_config_w_sloth_noah_pet_smp_sft_cfe.json conus_partition_32.json
+```
+
+where `ngen` is the executable we build in the [Building the Executable](#Building the Executable) section. All other terms have been discussed above in details. With the current existing realization config files, the above jobs run 10 days simulation time on CONUS scale.
+
+Be aware that the above commands will generate over a million output files associated with catchment and nexus ids so if you issue a `ls` command in `ngen` directory, it will be significantly slower than usual to list all the file names. The exact time will depend on the computer you are working on.
You may want to note/include the use of the output_root realization
config option here, so that outputs can be stored in a separate directory
from the source tree.
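A sketch of how the `output_root` option might be added near the top of the realization config so that the per-catchment and per-nexus output files land outside the source tree; the placement and path shown are assumptions, with the ngen realization documentation as the authoritative reference.

```
{
  "output_root": "/scratch/ngen_conus_output/",
  "global": { "...": "formulations and forcing as before" },
  "time":   { "...": "start_time, end_time, output_interval as before" }
}
```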
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> @@ -0,0 +1,173 @@
+# NextGen on CONUS
+
+This documentation provides instructions on all neccessary steps and components to run NextGen jobs at CONUS scale. Considering the computations large scale, we focus only on running parallel jobs using MPI.
⬇️ Suggested change
-This documentation provides instructions on all neccessary steps and components to run NextGen jobs at CONUS scale. Considering the computations large scale, we focus only on running parallel jobs using MPI.
+This documentation provides instructions on all necessary steps and components to run NextGen jobs at CONUS scale. Considering the computation's large scale, we focus only on running parallel jobs using MPI.
program-- previously approved these changes (Apr 23, 2024)
program-- reviewed (Apr 23, 2024)
program-- reviewed (Apr 23, 2024)
Good catch for both. Thanks.
On Tue, Apr 23, 2024 at 10:46 AM, Justin Singh-M. - NOAA commented on this pull request:
------------------------------
In doc/NextGen_ON_CONUS.md
<#794 (comment)>:
> @@ -0,0 +1,174 @@
+# NextGen on CONUS
+
+This documentation provides instructions on all neccessary steps and components to run NextGen jobs at CONUS scale. Considering the computation's large scale, we focus only on running parallel jobs using MPI.
+
+* [Summary](#summary)
+* [Download the Codes](#doenload-the-codes)
+* [Setting Up the Environment](#setting-up-the-environment)
+* [Build the Executable](#build-the-executable)
+* [CONUS Hydrofabric](#CONUS-hydrofabric)
This doesn't link in the rich diff; does changing the URL slightly to `#conus-hydrofabric` resolve that?
⬇️ Suggested change
-* [CONUS Hydrofabric](#CONUS-hydrofabric)
+* [CONUS Hydrofabric](#conus-hydrofabric)
program-- approved these changes (Apr 23, 2024)
This PR provides detailed documentation on how to run `ngen` at CONUS scale.
Additions
NextGen_ON_CONUS.md
Removals
Changes
Testing
All Linux-related commands in the document have been tested.
The mpirun jobs listed have been tested.
Visualization in browser
Screenshots
Notes
Todos
Add Topmodel
Add Routing
Checklist
Testing checklist (automated report can be put here)
Target Environment support