From fd33d446a62ed8b8b9669d7adfaa87b08dc098c6 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Tue, 27 May 2025 14:31:15 -0500 Subject: [PATCH 01/14] Update Contributing and Readme --- CONTRIBUTING.md | 24 ++++++++++++++----- README.md | 62 +++++++++++++++++++++++++++++++++++++++---------- 2 files changed, 68 insertions(+), 18 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 82801b928..35fa8859b 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -71,17 +71,29 @@ for a minimal build of NVIDIA cuOpt without using conda are also listed below. Compilers: -* `gcc` version 11.4+ -* `nvcc` version 11.8+ -* `cmake` version 3.29.6+ +These will be installed while creating the conda environment + +* `gcc` version 13.0+ +* `nvcc` version 12.8+ +* `cmake` version 3.30.4+ CUDA/GPU Runtime: -* CUDA 11.4+ +* CUDA 12.8 * Volta architecture or better ([Compute Capability](https://docs.nvidia.com/deploy/cuda-compatibility/) >=7.0) -You can obtain CUDA from -[https://developer.nvidia.com/cuda-downloads](https://developer.nvidia.com/cuda-downloads). +Python: + +* Python >=3.10.x, <= 3.12.x + +OS: + +* Only Linux is supported + +Architecture: + +* x86_64 (64-bit) +* aarch64 (64-bit) ### Build NVIDIA cuOpt from source diff --git a/README.md b/README.md index 61011ecf2..da4fc7036 100644 --- a/README.md +++ b/README.md @@ -6,20 +6,24 @@ NVIDIA® cuOpt™ is a GPU-accelerated optimization engine that excels in mixed For the latest stable version ensure you are on the `main` branch. -## Build from Source +## Supported APIs -Please see our [guide for building cuOpt from source](CONTRIBUTING.md#build-nvidia-cuopt-from-source) +cuOpt supports the following APIs: -## Contributing Guide - -Review the [CONTRIBUTING.md](CONTRIBUTING.md) file for information on how to contribute code and issues to the project. +- C API support + - Linear Programming (LP) + - Mixed Integer Linear Programming (MILP) +- C++ API support + - cuOpt is written in C++ and includes a native C++ API. 
However, we do not provide documentation for the C++ API at this time. We anticipate that the C++ API will change significantly in the future. Use it at your own risk. +- Python support + - Routing (TSP, VRP, and PDP) + - Linear Programming (LP) and Mixed Integer Linear Programming (MILP) + - cuOpt includes a Python API that is used as the backend of the cuOpt server. However, we do not provide documentation for the Python API at this time. We suggest using cuOpt server to access cuOpt via Python. We anticipate that the Python API will change significantly in the future. Use it at your own risk. +- Server support + - Linear Programming (LP) + - Mixed Integer Linear Programming (MILP) + - Routing (TSP, VRP, and PDP) -## Resources - -- [cuopt (Python) documentation](https://docs.nvidia.com/cuopt/user-guide/latest/introduction.html) -- [libcuopt (C++/CUDA) documentation](https://docs.nvidia.com/cuopt/user-guide/latest/introduction.html) -- [Examples and Notebooks](https://github.com/NVIDIA/cuopt-examples) -- [Test cuopt with Brev](https://brev.nvidia.com/launchable/deploy?launchableID=env-2qIG6yjGKDtdMSjXHcuZX12mDNJ): Examples notebooks are pulled and hosted on [Brev](https://docs.nvidia.com/brev/latest/). ## Installation @@ -29,6 +33,16 @@ Review the [CONTRIBUTING.md](CONTRIBUTING.md) file for information on how to con * NVIDIA driver >= 525.60.13 (Linux) and >= 527.41 (Windows) * Volta architecture or better (Compute Capability >=7.0) +### Python requirements + +* Python >=3.10.x, <= 3.12.x + +### OS requirements + +* Only Linux is supported and Windows via WSL2 +* x86_64 (64-bit) +* aarch64 (64-bit) + ### Pip cuOpt can be installed via `pip` from the NVIDIA Python Package Index. 
@@ -52,7 +66,31 @@ conda install -c rapidsai -c conda-forge -c nvidia \ cuopt=25.05 python=3.12 cuda-version=12.8 ``` +### Container + +Users can pull the cuOpt container from the NVIDIA container registry + We also provide [nightly Conda packages](https://anaconda.org/rapidsai-nightly) built from the HEAD of our latest development branch. -Note: cuOpt is supported only on Linux, and with Python versions 3.10 and later. \ No newline at end of file +```bash +docker pull nvidia/cuopt:25.5.0-cuda12.8-py312 +``` +More information about the cuOpt container can be found [here](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-server/quick-start.html#container-from-docker-hub) + + +## Build from Source and Test + +Please see our [guide for building cuOpt from source](CONTRIBUTING.md#setting-up-your-build-environment) + +## Contributing Guide + +Review the [CONTRIBUTING.md](CONTRIBUTING.md) file for information on how to contribute code and issues to the project. + +## Resources + +- [libcuopt (C) documentation](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-c/index.html) +- [cuopt (Python) documentation](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-python/index.html) +- [cuopt (Server) documentation](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-server/index.html) +- [Examples and Notebooks](https://github.com/NVIDIA/cuopt-examples) +- [Test cuopt with Brev](https://brev.nvidia.com/launchable/deploy?launchableID=env-2qIG6yjGKDtdMSjXHcuZX12mDNJ): Examples notebooks are pulled and hosted on [Brev](https://docs.nvidia.com/brev/latest/). 
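The container tag shown in the pull command above encodes three versions at once: the cuOpt release, the CUDA version, and the Python version. A small sketch assembling that tag from its components (values are the ones used in this README):

```shell
# Assemble the Docker Hub image reference from its parts.
# Tag layout: <cuopt-version>-cuda<cuda-version>-py<python-version>
CUOPT_VER="25.5.0"
CUDA_VER="12.8"
PY_TAG="py312"
IMAGE="nvidia/cuopt:${CUOPT_VER}-cuda${CUDA_VER}-${PY_TAG}"
echo "${IMAGE}"   # → nvidia/cuopt:25.5.0-cuda12.8-py312
```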
\ No newline at end of file From 653db5ab9409cf3a07064dfbb39f8c6489833777 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Tue, 27 May 2025 17:22:48 -0500 Subject: [PATCH 02/14] Add readme to all the directories --- README.md | 6 ++- benchmarks/README.md | 6 +++ ci/README.md | 49 ++++++++++++++++++ cmake/README.md | 6 +++ conda/README.md | 8 +++ cpp/README.md | 61 +++++++++++++++++++++++ docs/cuopt/README.md | 23 ++++----- docs/cuopt/source/cuopt-c/quick-start.rst | 3 ++ docs/cuopt/source/faq.rst | 6 +-- notebooks/README.md | 5 ++ python/README.md | 42 ++++++++++++++++ 11 files changed, 196 insertions(+), 19 deletions(-) create mode 100644 benchmarks/README.md create mode 100644 ci/README.md create mode 100644 cmake/README.md create mode 100644 conda/README.md create mode 100644 cpp/README.md create mode 100644 notebooks/README.md create mode 100644 python/README.md diff --git a/README.md b/README.md index da4fc7036..03086aab8 100644 --- a/README.md +++ b/README.md @@ -2,7 +2,10 @@ [![Build Status](https://github.com/NVIDIA/cuopt/actions/workflows/build.yaml/badge.svg)](https://github.com/NVIDIA/cuopt/actions/workflows/build.yaml) -NVIDIA® cuOpt™ is a GPU-accelerated optimization engine that excels in mixed integer programming (MIP), linear programming (LP), and vehicle routing problems (VRP). It enables near real-time solutions for large-scale challenges with millions of variables and constraints, offering easy integration into existing solvers and seamless deployment across hybrid and multi-cloud environments. +NVIDIA® cuOpt™ is a GPU-accelerated optimization engine that excels in mixed integer programming (MIP), linear programming (LP), and vehicle routing problems (VRP). It enables near real-time solutions for large-scale challenges with millions of variables and constraints, offering +easy integration into existing solvers and seamless deployment across hybrid and multi-cloud environments. 
+
+The core engine is written in C++ and is wrapped by the C, Python, and Server APIs.

 For the latest stable version ensure you are on the `main` branch.

@@ -24,7 +27,6 @@ cuOpt supports the following APIs:
   - Mixed Integer Linear Programming (MILP)
   - Routing (TSP, VRP, and PDP)

-
 ## Installation

 ### CUDA/GPU requirements
diff --git a/benchmarks/README.md b/benchmarks/README.md
new file mode 100644
index 000000000..9ce20f69e
--- /dev/null
+++ b/benchmarks/README.md
@@ -0,0 +1,6 @@
+# Benchmark Scripts
+
+This directory contains the scripts for the benchmarks.
+
+
+
diff --git a/ci/README.md b/ci/README.md
new file mode 100644
index 000000000..d24225216
--- /dev/null
+++ b/ci/README.md
@@ -0,0 +1,49 @@
+# CI scripts
+
+This directory contains the scripts for the CI pipeline.
+
+CI builds are triggered by the `pr.yaml`, `build.yaml`, and `test.yaml` files in the `.github/workflows` directory, and these scripts are used from those workflows to build and test the code.
+
+cuOpt is packaged in the following ways:
+
+## PIP package
+
+### Build
+
+The scripts for building the PIP packages are named `build_wheel_<package>.sh`. For example, `build_wheel_cuopt.sh` builds the PIP package for cuOpt.
+
+Please refer to the existing scripts for more details and for how to add a new script for a new package.
+
+### Test
+
+The scripts for testing the PIP packages are named `test_wheel_<package>.sh`. For example, `test_wheel_cuopt.sh` tests the PIP package for cuOpt.
+
+Please refer to the existing scripts for more details and for how to add a new script for a new package.
+
+## Conda Package
+
+### Build
+
+For the conda package:
+
+- all cpp libraries are built by one script, `build_cpp.sh`.
+- all python bindings are built by one script, `build_python.sh`.
+
+So if there are new cpp libraries or python bindings, you need to add them to the respective scripts.
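The per-package naming convention above can be sketched as a tiny dispatch loop: derive the script name from the package name, then invoke it. The package list here is illustrative, not the full set built in CI.

```shell
# Hypothetical sketch of the build_wheel_<package>.sh convention:
# each package name maps directly to the script that builds its wheel.
for pkg in cuopt libcuopt; do
  script="build_wheel_${pkg}.sh"
  echo "ci/${script}"
done
```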
+
+### Test
+
+Similarly, for the conda package:
+
+- all cpp libraries are tested by one script, `test_cpp.sh`.
+- all python bindings are tested by one script, `test_python.sh`.
+
+There are other scripts in this directory which are used to build and test the code, and which are also used in the workflows as utilities.
diff --git a/cmake/README.md b/cmake/README.md
new file mode 100644
index 000000000..b0ab92484
--- /dev/null
+++ b/cmake/README.md
@@ -0,0 +1,6 @@
+# CMake for RAPIDS configuration
+
+This directory contains the CMake files for the RAPIDS configuration.
+
+
+
diff --git a/conda/README.md b/conda/README.md
new file mode 100644
index 000000000..8188d0a7d
--- /dev/null
+++ b/conda/README.md
@@ -0,0 +1,8 @@
+# Conda Recipes and Environment
+
+This directory contains the conda recipes for the cuOpt packages, which are used to build the conda packages in CI.
+
+It also contains the environment files, which are used to create the conda environment for cuOpt development and CI testing.
+
+
+
diff --git a/cpp/README.md b/cpp/README.md
new file mode 100644
index 000000000..94362b812
--- /dev/null
+++ b/cpp/README.md
@@ -0,0 +1,61 @@
+# C++ Modules
+
+This directory contains the C++ modules for the cuOpt project.
+
+Please refer to the [CMakeLists.txt](CMakeLists.txt) file for details on how to add new modules and tests.
+
+Most of the dependencies are defined in the [dependencies.yaml](../dependencies.yaml) file; please refer to its different sections for more details. Some dependencies are instead defined in the [thirdparty modules](cmake/thirdparty/) in cases where the source code is needed for the build, for example `cccl` and `rmm`.
+
+## Include Structure
+
+Add any new modules in the `include` directory under the `include/cuopt/` directory.
+
+```bash
+cpp/
+├── include/
+│   ├── cuopt/
+│   │   ├── linear_programming/
+│   │   │   └── ...
+│   │   └── routing/
+│   │       └── ...
+│   └── ...
+└── ...
+``` + +## Source Structure + +Add any new modules in the `src` directory under `src/cuopt/` directory. + +```bash +cpp/ +├── src/ +│ ├── cuopt/ +│ │ └── linear_programming/ +│ │ └── ... +│ │ └── routing/ +│ │ └── ... +└── ... +``` + +## Test Structure + +Add any new modules in the `test` directory under `test/cuopt/` directory. + +```bash +cpp/ +├── test/ +│ ├── cuopt/ +│ │ └── linear_programming/ +│ │ └── ... +│ │ └── routing/ +│ │ └── ... +└── ... +``` + +## MPS parser + +The MPS parser is a standalone module that parses MPS files and converts them into a format that can be used by the cuOpt library. + +It is located in the `libmps_parser` directory. This also contains the `CMakeLists.txt` file to build the module. + diff --git a/docs/cuopt/README.md b/docs/cuopt/README.md index f6b36f7f8..94ebe06ce 100644 --- a/docs/cuopt/README.md +++ b/docs/cuopt/README.md @@ -1,36 +1,33 @@ # Building Documentation -Documentation dependencies are installed while installing conda environment, please refer to the [CONTRIBUTING](https://github.com/NVIDIA/cuopt/blob/main/CONTRIBUTING.md) for more details. Doc generation -does not get run by default. There are two ways to generate the docs: +Documentation dependencies are installed while installing conda environment, please refer to the [Build and Test](../../CONTRIBUTING.md#building-with-a-conda-environment) for more details. Assuming you have set-up conda environment, you can build the documentation along with all the cuopt libraries by running: -Note: It is assumed that all required libraries are already installed locally. If they haven't been installed yet, please first install all libraries by running: ```bash ./build.sh ``` -1. Run +In subsequent runs where there are no changes to the cuopt libraries, documentation can be built by running: + +1. From the root directory: ```bash -make clean;make html +./build.sh docs ``` -from the `docs/cuopt` directory. -2. Run + +2. 
From the `docs/cuopt` directory:
 ```bash
-./build.sh docs
+make clean;make html
 ```
-from the root directory. Outputs to `build/html/index.html`

 ## View docs web page by opening HTML in browser:

-First navigate to `/build/html/` folder, i.e., `cd build/html` and then run the following command:
-
 ```bash
-python -m http.server
+python -m http.server --directory=build/html/
 ```

 Then, navigate a web browser to the IP address or hostname of the host machine at port 8000:

 ```
-https://<host>:8000
+http://<host>:8000
 ```

 Now you can check that your docs edits are formatted correctly and read well.
diff --git a/docs/cuopt/source/cuopt-c/quick-start.rst b/docs/cuopt/source/cuopt-c/quick-start.rst
index 18a18334a..c36fe13bc 100644
--- a/docs/cuopt/source/cuopt-c/quick-start.rst
+++ b/docs/cuopt/source/cuopt-c/quick-start.rst
@@ -13,6 +13,9 @@ pip

 For CUDA 12.x:

+This wheel is a Python wrapper around the C++ library that eases installation of, and access to, libcuopt. It also lets pip environments load the libraries dynamically when using the Python SDK.
+
+
 .. code-block:: bash

    # This is deprecated module and not longer used, but share same name for the CLI, so we need to uninstall it first if it exists.
diff --git a/docs/cuopt/source/faq.rst b/docs/cuopt/source/faq.rst
index ba1c48d39..74c93b8fe 100644
--- a/docs/cuopt/source/faq.rst
+++ b/docs/cuopt/source/faq.rst
@@ -11,7 +11,6 @@ General FAQ
 - NVIDIA docker hub (https://hub.docker.com/r/nvidia/)
 - NVIDIA NGC registry (https://catalog.ngc.nvidia.com/orgs/nvidia/teams/cuopt/containers/cuopt/tags) with NVAIE license.

-
 .. dropdown:: How to get a NVAIE license?

 Please refer to `NVIDIA NVAIE `_ for more information.
@@ -44,14 +43,13 @@ General FAQ

 docker pull

-
 .. dropdown:: Do I need a GPU to use cuOpt?

 Yes, please refer to `system requirements `_ for GPU specifications. You can acquire a cloud instance with a supported GPU and launch cuOpt; alternatively, you can launch it in your local machine if it meets the requirements.

-.. dropdown:: Does cuOpt use multiple GPUs?
+.. dropdown:: Does cuOpt use multiple GPUs (multi-GPU)?

-    #. Yes, in cuOpt self-hosted server, a solver process per GPU can be configured to run multiple solvers. Requests are accepted in a round-robin queue. More details are available in `server api `_.
+    #. Yes, in cuOpt self-hosted server, a solver process per GPU can be configured to run multiple solvers. Requests are accepted in a round-robin queue. More details are available in `server api `_.
     #. There is no support for leveraging multiple GPUs to solve a single problem or oversubscribing a single GPU for multiple solvers.

 .. dropdown:: The cuOpt Service is not starting: Issue with port?
diff --git a/notebooks/README.md b/notebooks/README.md
new file mode 100644
index 000000000..58bea996e
--- /dev/null
+++ b/notebooks/README.md
@@ -0,0 +1,5 @@
+# Notebooks
+
+This directory contains the sample notebooks for the cuOpt project.
+
+Users can find more advanced examples in the [cuOpt Examples](https://github.com/nvidia/cuopt-examples) repository.
\ No newline at end of file
diff --git a/python/README.md b/python/README.md
new file mode 100644
index 000000000..6eb9cf26f
--- /dev/null
+++ b/python/README.md
@@ -0,0 +1,42 @@
+# Python Modules
+
+This directory contains the Python modules for the cuOpt project.
+
+## Package Structure
+
+- Each subdirectory contains the Python modules for a specific cuOpt package. For example, the `libcuopt` directory contains the Python wrapper for the cuOpt C++ library; this is the main package for the cuOpt project, and it simply loads the shared libraries and makes them available to other Python modules. The `cuopt` Python package depends on `libcuopt` and builds on top of it.
+
+```bash
+python/
+├── libcuopt/
+├── cuopt/
+└── ...
+```
+- Each of these Python modules has a `tests` directory which contains the tests for the module. Python tests are written using `pytest`.
For example, the `python/cuopt/cuopt/tests/` directory contains the tests for the `cuopt` Python package.
+
+```bash
+python/
+├── cuopt/
+│   ├── cuopt/
+│   │   └── tests/
+│   └── ...
+└── ...
+```
+
+- Each of these Python modules has a `pyproject.toml` file which lists the dependencies for the module. For example, the `python/cuopt/pyproject.toml` file lists the dependencies for the `cuopt` Python package.
+
+```bash
+python/
+├── cuopt/
+│   ├── pyproject.toml
+│   └── ...
+└── ...
+```
+
+- The dependencies themselves are defined in the [dependencies.yaml](../dependencies.yaml) file in the root folder, so any changes to dependencies should be made in that file. Please refer to the different sections in the [dependencies.yaml](../dependencies.yaml) file for more details.

From ebca2981ebebf607a3ae47c948ce85d37dd85085 Mon Sep 17 00:00:00 2001
From: Ramakrishna Prabhu
Date: Tue, 27 May 2025 17:26:33 -0500
Subject: [PATCH 03/14] update CONTRIBUTING.md on dependency details.
---
 CONTRIBUTING.md | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 35fa8859b..082944322 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -231,6 +231,11 @@ set_source_files_properties(src/routing/data_model_view.cu PROPERTIES COMPILE_OP

 This will add the device debug symbols for this object file in `libcuopt.so`. You can then use `cuda-gdb` to debug into the kernels in that source file.

+## Adding dependencies
+
+Please refer to the [dependencies.yaml](dependencies.yaml) file for details on how to add new dependencies.
+Add any new dependencies in the `dependencies.yaml` file; it takes care of the conda, requirements (pip-based), and pyproject outputs. Please don't add dependencies directly to the environment.yaml files under the `conda/environments` directory or the pyproject.toml files under the `python` directories.
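As a rough illustration of the single-source approach described above (the entry name and package below are hypothetical, not taken from the real file), a `dependencies.yaml` entry declares once which generated outputs receive a dependency:

```yaml
# Hypothetical entry: one declaration feeds the conda environment files,
# the requirements files, and the pyproject.toml sections alike.
dependencies:
  example_run_deps:
    common:
      - output_types: [conda, requirements, pyproject]
        packages:
          - numpy>=1.23
```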
 ## Code Formatting

 ### Using pre-commit hooks
@@ -315,6 +320,5 @@ You can skip these checks with `git commit --no-verify` or with the short versio
     (d) I understand and agree that this project and the contribution are public and that a record of the contribution (including all personal information I submit with it, including my sign-off) is maintained indefinitely and may be redistributed consistent with this project or the open source license(s) involved.
 ```
-

From 69b747a826a05d1a86c825bac8793a1e19709d53 Mon Sep 17 00:00:00 2001
From: Ramakrishna Prabhu
Date: Tue, 27 May 2025 18:04:15 -0500
Subject: [PATCH 04/14] address reviews on adding details for pip, conda and
 container
---
 README.md                                     | 26 +++++++++++++------
 docs/cuopt/source/cuopt-c/quick-start.rst     |  2 +-
 .../cuopt/source/cuopt-python/quick-start.rst |  4 +--
 .../cuopt/source/cuopt-server/quick-start.rst |  8 +++---
 docs/cuopt/source/introduction.rst            | 14 +++++-----
 5 files changed, 32 insertions(+), 22 deletions(-)

diff --git a/README.md b/README.md
index 03086aab8..0ffb032cb 100644
--- a/README.md
+++ b/README.md
@@ -42,11 +42,17 @@ cuOpt supports the following APIs:
 ### OS requirements

 * Only Linux is supported and Windows via WSL2
-* x86_64 (64-bit)
-* aarch64 (64-bit)
+  * x86_64 (64-bit)
+  * aarch64 (64-bit)
+
+Note: WSL2 is tested for running cuOpt, but not for building it.
+
+More details on system requirements can be found [here](https://docs.nvidia.com/cuopt/user-guide/latest/system-requirements.html).

 ### Pip

+Pip wheels are easy to install and configure. Users whose existing workflows are built on pip can use pip to install cuOpt.
+
 cuOpt can be installed via `pip` from the NVIDIA Python Package Index.
Be sure to select the appropriate cuOpt package depending on the major version of CUDA available in your environment:
@@ -54,36 +60,40 @@ on the major version of CUDA available in your environment:
 For CUDA 12.x:

 ```bash
-pip install --extra-index-url=https://pypi.nvidia.com cuopt-cu12
+pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.5 cuopt-sh-client==25.5
 ```

 ### Conda

 cuOpt can be installed with conda (via [miniforge](https://github.com/conda-forge/miniforge)) from the `nvidia` channel:

+All other dependencies are installed automatically when cuopt-server and cuopt-sh-client are installed.
+
+Users with conda-environment-based workflows benefit from the readily available cuOpt conda packages.

 For CUDA 12.x:

 ```bash
 conda install -c rapidsai -c conda-forge -c nvidia \
-    cuopt=25.05 python=3.12 cuda-version=12.8
+    cuopt-server=25.05 cuopt-sh-client=25.05 python=3.12 cuda-version=12.8
 ```

+We also provide [nightly Conda packages](https://anaconda.org/rapidsai-nightly) built from the HEAD
+of our latest development branch.
+
 ### Container

 Users can pull the cuOpt container from the NVIDIA container registry

-We also provide [nightly Conda packages](https://anaconda.org/rapidsai-nightly) built from the HEAD
-of our latest development branch.
-
 ```bash
 docker pull nvidia/cuopt:25.5.0-cuda12.8-py312
 ```
 More information about the cuOpt container can be found [here](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-server/quick-start.html#container-from-docker-hub)

+The cuOpt container is a good fit for quick testing or research, and for users who plan to plug cuOpt into their workflow as a service. Note that users are responsible for building security layers around the service to safeguard it from untrusted users.
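To go beyond `docker pull`, the server can be launched with its REST port published, mirroring the run command in the server quick-start later in this series (the tag and port are examples; actually running it requires a supported GPU and the NVIDIA Container Toolkit):

```shell
# Compose the run command for the cuOpt server container.
PORT=8000
IMAGE="nvidia/cuopt:25.5.0-cuda12.8-py312"
CMD="docker run --gpus all --rm -p ${PORT}:${PORT} -e CUOPT_SERVER_PORT=${PORT} ${IMAGE} python3 -m cuopt_server.cuopt_service"
echo "${CMD}"
# eval "${CMD}"   # uncomment on a GPU host with the NVIDIA Container Toolkit installed
```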
## Build from Source and Test -Please see our [guide for building cuOpt from source](CONTRIBUTING.md#setting-up-your-build-environment) +Please see our [guide for building cuOpt from source](CONTRIBUTING.md#setting-up-your-build-environment). This will be helpful if users want to add new features or fix bugs for cuOpt. This would also be very helpful in case users want to customize cuOpt for their own use cases which require changes to the cuOpt source code. ## Contributing Guide diff --git a/docs/cuopt/source/cuopt-c/quick-start.rst b/docs/cuopt/source/cuopt-c/quick-start.rst index c36fe13bc..9962381ce 100644 --- a/docs/cuopt/source/cuopt-c/quick-start.rst +++ b/docs/cuopt/source/cuopt-c/quick-start.rst @@ -35,7 +35,7 @@ For CUDA 12.x: # This is deprecated module and not longer used, but share same name for the CLI, so we need to uninstall it first if it exists. conda remove cuopt-thin-client conda install -c rapidsai -c conda-forge -c nvidia \ - libcuopt=25.5.* python=3.12 cuda-version=12.8 + libcuopt=25.05.* python=3.12 cuda-version=12.8 Please visit examples under each section to learn how to use the cuOpt C API. \ No newline at end of file diff --git a/docs/cuopt/source/cuopt-python/quick-start.rst b/docs/cuopt/source/cuopt-python/quick-start.rst index 27d46b8f7..b47af8716 100644 --- a/docs/cuopt/source/cuopt-python/quick-start.rst +++ b/docs/cuopt/source/cuopt-python/quick-start.rst @@ -27,7 +27,7 @@ For CUDA 12.x: .. code-block:: bash conda install -c rapidsai -c conda-forge -c nvidia \ - cuopt=25.5.* python=3.12 cuda-version=12.8 + cuopt=25.05.* python=3.12 cuda-version=12.8 Container @@ -43,7 +43,7 @@ The container includes both the Python API and self-hosted server components. To .. code-block:: bash - docker run --gpus all -it --rm nvidia/cuopt:25.5.0 + docker run --gpus all -it --rm nvidia/cuopt:25.5.0-cuda12.8-py312 This will start an interactive session with cuOpt pre-installed and ready to use. 
diff --git a/docs/cuopt/source/cuopt-server/quick-start.rst b/docs/cuopt/source/cuopt-server/quick-start.rst index 4aba1e6aa..007809fa8 100644 --- a/docs/cuopt/source/cuopt-server/quick-start.rst +++ b/docs/cuopt/source/cuopt-server/quick-start.rst @@ -12,7 +12,7 @@ For CUDA 12.x: .. code-block:: bash - pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.5.* cuopt-sh==25.5.* + pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.05.* cuopt-sh-client==25.05.* Conda @@ -25,7 +25,7 @@ For CUDA 12.x: .. code-block:: bash conda install -c rapidsai -c conda-forge -c nvidia \ - cuopt-server=25.5.* cuopt-sh=25.5.* python=3.12 cuda-version=12.8 + cuopt-server=25.05.* cuopt-sh-client=25.05.* python=3.12 cuda-version=12.8 Container from Docker Hub @@ -35,13 +35,13 @@ NVIDIA cuOpt is also available as a container from Docker Hub: .. code-block:: bash - docker pull nvidia/cuopt:25.5.0 + docker pull nvidia/cuopt:25.5.0-cuda12.8-py312 The container includes both the Python API and self-hosted server components. To run the container: .. code-block:: bash - docker run --gpus all -it --rm -p 8000:8000 -e CUOPT_SERVER_PORT=8000 nvidia/cuopt:25.5.0 /bin/bash -c "python3 -m cuopt_server.cuopt_service" + docker run --gpus all -it --rm -p 8000:8000 -e CUOPT_SERVER_PORT=8000 nvidia/cuopt:25.5.0-cuda12.8-py312 /bin/bash -c "python3 -m cuopt_server.cuopt_service" .. note:: Make sure you have the NVIDIA Container Toolkit installed on your system to enable GPU support in containers. See the `installation guide `_ for details. 
diff --git a/docs/cuopt/source/introduction.rst b/docs/cuopt/source/introduction.rst index 85949293c..202fff2a8 100644 --- a/docs/cuopt/source/introduction.rst +++ b/docs/cuopt/source/introduction.rst @@ -104,15 +104,15 @@ Supported APIs cuOpt supports the following APIs: - C API support - - Linear Programming (LP) - - Mixed Integer Linear Programming (MILP) + - `Linear Programming (LP) `_ + - `Mixed Integer Linear Programming (MILP) `_ - C++ API support - cuOpt is written in C++ and includes a native C++ API. However, we do not provide documentation for the C++ API at this time. We anticipate that the C++ API will change significantly in the future. Use it at your own risk. - Python support - - Routing (TSP, VRP, and PDP) - - Linear Programming (LP) and Mixed Integer Linear Programming (MILP) + - `Routing (TSP, VRP, and PDP) `_ + - `Linear Programming (LP) and Mixed Integer Linear Programming (MILP) - cuOpt includes a Python API that is used as the backend of the cuOpt server. However, we do not provide documentation for the Python API at this time. We suggest using cuOpt server to access cuOpt via Python. We anticipate that the Python API will change significantly in the future. Use it at your own risk. 
- Server support - - Linear Programming (LP) - - Mixed Integer Linear Programming (MILP) - - Routing (TSP, VRP, and PDP) + - `Linear Programming (LP) `_ + - `Mixed Integer Linear Programming (MILP) `_ + - `Routing (TSP, VRP, and PDP) `_ From 0058ee4a24b69a1fe23a6fd72b867400f0f1d680 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 12:09:11 -0500 Subject: [PATCH 05/14] update introduction --- docs/cuopt/source/introduction.rst | 32 ++++++++++++++++++++++++++++++ 1 file changed, 32 insertions(+) diff --git a/docs/cuopt/source/introduction.rst b/docs/cuopt/source/introduction.rst index 202fff2a8..3e5776c96 100644 --- a/docs/cuopt/source/introduction.rst +++ b/docs/cuopt/source/introduction.rst @@ -15,6 +15,9 @@ As part of `NVIDIA AI Enterprise `__ for more information about the NVIDIA Developer Program. +Core engine is built on C++ and all the APIs are built on top of it as wrappers. For example, cuOpt python API uses cython to wrap the C++ core engine and provide a Python interface. +Similarly, other interfaces wrap different layers to communicate with the core engine. + Routing (TSP, VRP, and PDP) ============================= @@ -116,3 +119,32 @@ cuOpt supports the following APIs: - `Linear Programming (LP) `_ - `Mixed Integer Linear Programming (MILP) `_ - `Routing (TSP, VRP, and PDP) `_ + +================================== +INSTALLATION OPTIONS +================================== + +NVIDIA cuOpt is available in several formats to suit different deployment needs: + +Source Code +---------- +For users who want to customize cuOpt or contribute to its development, the source code is available on `GitHub `_. Building from source allows maximum flexibility but requires setting up the build environment. + +Pip Wheels +---------- +For Python users with existing pip-based workflows, cuOpt can be installed directly via pip from the NVIDIA Python Package Index. This is the simplest installation method for most users. 
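The wheel names carry the CUDA major version as a suffix (currently `cu12` packages are the ones published). A sketch of deriving that suffix from a CUDA version string (the string here is canned; in practice it might come from `nvcc --version`):

```shell
# Map a CUDA toolkit version string to the CUDA-suffixed package name.
CUDA_VERSION="12.8"
CUDA_MAJOR="${CUDA_VERSION%%.*}"   # strip everything after the first dot
PKG="cuopt-cu${CUDA_MAJOR}"
echo "${PKG}"   # → cuopt-cu12
```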
+ +Conda Packages +------------- +Available from the NVIDIA channel, conda packages provide a convenient way to manage cuOpt and its dependencies in conda environments. This is ideal for users who prefer conda-based workflow management. + +Containers +--------- +NVIDIA provides ready-to-use containers with cuOpt pre-installed, available from: + +- Docker Hub (``nvidia/cuopt``) +- NVIDIA NGC (for NVIDIA AI Enterprise subscribers) + +Containers offer a consistent, isolated environment and are particularly useful for cloud deployments or microservices architectures. + +For detailed installation instructions for each option, please refer to the respective quickstart guides in the documentation. \ No newline at end of file From db5dfa6f2a53c871dbe8afde6e919a0af6e30525 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 12:21:15 -0500 Subject: [PATCH 06/14] fix doc errors --- docs/cuopt/Makefile | 3 +-- docs/cuopt/source/introduction.rst | 22 +++++++++++----------- 2 files changed, 12 insertions(+), 13 deletions(-) diff --git a/docs/cuopt/Makefile b/docs/cuopt/Makefile index 9a675115b..7102ea1d4 100644 --- a/docs/cuopt/Makefile +++ b/docs/cuopt/Makefile @@ -32,9 +32,8 @@ help: clean: @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) - rm -rf "$(SOURCEDIR)/user_guide/api_docs/api" # Catch-all target: route all unknown targets to Sphinx using the new # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). 
%: Makefile - @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) \ No newline at end of file + @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) diff --git a/docs/cuopt/source/introduction.rst b/docs/cuopt/source/introduction.rst index 3e5776c96..cac22bae6 100644 --- a/docs/cuopt/source/introduction.rst +++ b/docs/cuopt/source/introduction.rst @@ -107,18 +107,18 @@ Supported APIs cuOpt supports the following APIs: - C API support - - `Linear Programming (LP) `_ - - `Mixed Integer Linear Programming (MILP) `_ + - `Linear Programming (LP) - C `_ + - `Mixed Integer Linear Programming (MILP) - C `_ - C++ API support - cuOpt is written in C++ and includes a native C++ API. However, we do not provide documentation for the C++ API at this time. We anticipate that the C++ API will change significantly in the future. Use it at your own risk. - Python support - - `Routing (TSP, VRP, and PDP) `_ - - `Linear Programming (LP) and Mixed Integer Linear Programming (MILP) + - `Routing (TSP, VRP, and PDP) - Python `_ + - Linear Programming (LP) and Mixed Integer Linear Programming (MILP) - cuOpt includes a Python API that is used as the backend of the cuOpt server. However, we do not provide documentation for the Python API at this time. We suggest using cuOpt server to access cuOpt via Python. We anticipate that the Python API will change significantly in the future. Use it at your own risk. 
- Server support - - `Linear Programming (LP) `_ - - `Mixed Integer Linear Programming (MILP) `_ - - `Routing (TSP, VRP, and PDP) `_ + - `Linear Programming (LP) - Server `_ + - `Mixed Integer Linear Programming (MILP) - Server `_ + - `Routing (TSP, VRP, and PDP) - Server `_ ================================== INSTALLATION OPTIONS @@ -127,19 +127,19 @@ INSTALLATION OPTIONS NVIDIA cuOpt is available in several formats to suit different deployment needs: Source Code ----------- +=========== For users who want to customize cuOpt or contribute to its development, the source code is available on `GitHub `_. Building from source allows maximum flexibility but requires setting up the build environment. Pip Wheels ----------- +========== For Python users with existing pip-based workflows, cuOpt can be installed directly via pip from the NVIDIA Python Package Index. This is the simplest installation method for most users. Conda Packages -------------- +=============== Available from the NVIDIA channel, conda packages provide a convenient way to manage cuOpt and its dependencies in conda environments. This is ideal for users who prefer conda-based workflow management. 
Containers ---------- +=========== NVIDIA provides ready-to-use containers with cuOpt pre-installed, available from: - Docker Hub (``nvidia/cuopt``) - NVIDIA NGC (for NVIDIA AI Enterprise subscribers) From e4e4c8c51ad8a3062a5c730f50265f5f858126cb Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 14:11:31 -0500 Subject: [PATCH 07/14] update examples --- docs/cuopt/source/cuopt-server/examples/lp-examples.rst | 2 +- docs/cuopt/source/cuopt-server/examples/routing-examples.rst | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/cuopt/source/cuopt-server/examples/lp-examples.rst b/docs/cuopt/source/cuopt-server/examples/lp-examples.rst index c1b82d8ea..01bb356ff 100644 --- a/docs/cuopt/source/cuopt-server/examples/lp-examples.rst +++ b/docs/cuopt/source/cuopt-server/examples/lp-examples.rst @@ -662,7 +662,7 @@ To use a previous solution as the initial/warm start solution for a new request # Please update these values if the server is running on a different IP address or port export ip="localhost" export port=5000 - reqId=$(cuopt_sh -t LP data.json -i $ip -p $port -k | sed "s/'/\"/g" | jq -r '.reqId') + reqId=$(cuopt_sh -t LP data.json -i $ip -p $port -k | sed "s/'/\"/g" | sed 's/False/false/g' | jq -r '.reqId') cuopt_sh data.json -t LP -i $ip -p $port -wid $reqId diff --git a/docs/cuopt/source/cuopt-server/examples/routing-examples.rst b/docs/cuopt/source/cuopt-server/examples/routing-examples.rst index 2d2aeb013..f66a58f9f 100644 --- a/docs/cuopt/source/cuopt-server/examples/routing-examples.rst +++ b/docs/cuopt/source/cuopt-server/examples/routing-examples.rst @@ -338,7 +338,7 @@ To use a previous solution as an initial solution for a new request ID, you are cuopt_sh data.json -i $ip -p $port -id $reqId # delete previous saved solutions using the following command - cuopt_sh $ip $port -d $reqId + cuopt_sh -i $ip -p $port -d $reqId Uploading a Solution From 3e8b3f0dad8939ef3a6e7f06d0606bff587a0328 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 14:24:12
-0500 Subject: [PATCH 08/14] update to docs --- README.md | 2 +- docs/cuopt/source/cuopt-c/quick-start.rst | 2 +- docs/cuopt/source/cuopt-python/quick-start.rst | 2 +- docs/cuopt/source/cuopt-server/quick-start.rst | 2 +- docs/cuopt/source/system-requirements.rst | 7 ++++++- 5 files changed, 10 insertions(+), 5 deletions(-) diff --git a/README.md b/README.md index 0ffb032cb..3b35dd2b1 100644 --- a/README.md +++ b/README.md @@ -60,7 +60,7 @@ on the major version of CUDA available in your environment: For CUDA 12.x: ```bash -pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.5 cuopt-sh-client==25.5 +pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.5 cuopt-sh-client==25.5 nvidia-cuda-runtime-cu12==12.8.0 ``` ### Conda diff --git a/docs/cuopt/source/cuopt-c/quick-start.rst b/docs/cuopt/source/cuopt-c/quick-start.rst index 9962381ce..6ac41b333 100644 --- a/docs/cuopt/source/cuopt-c/quick-start.rst +++ b/docs/cuopt/source/cuopt-c/quick-start.rst @@ -20,7 +20,7 @@ This wheel is python wrapper around the C++ library and eases installation and a # This is deprecated module and not longer used, but share same name for the CLI, so we need to uninstall it first if it exists. pip uninstall cuopt-thin-client - pip install --extra-index-url=https://pypi.nvidia.com libcuopt-cu12==25.5.* + pip install --extra-index-url=https://pypi.nvidia.com libcuopt-cu12==25.5.* nvidia-cuda-runtime-cu12==12.8.0 Conda diff --git a/docs/cuopt/source/cuopt-python/quick-start.rst b/docs/cuopt/source/cuopt-python/quick-start.rst index b47af8716..a7c34c12d 100644 --- a/docs/cuopt/source/cuopt-python/quick-start.rst +++ b/docs/cuopt/source/cuopt-python/quick-start.rst @@ -43,7 +43,7 @@ The container includes both the Python API and self-hosted server components. To .. 
code-block:: bash - docker run --gpus all -it --rm nvidia/cuopt:25.5.0-cuda12.8-py312 + docker run --gpus all -it --rm nvidia/cuopt:25.5.0-cuda12.8-py312 nvidia-cuda-runtime-cu12==12.8.0 This will start an interactive session with cuOpt pre-installed and ready to use. diff --git a/docs/cuopt/source/cuopt-server/quick-start.rst b/docs/cuopt/source/cuopt-server/quick-start.rst index 007809fa8..9c0ca57a3 100644 --- a/docs/cuopt/source/cuopt-server/quick-start.rst +++ b/docs/cuopt/source/cuopt-server/quick-start.rst @@ -12,7 +12,7 @@ For CUDA 12.x: .. code-block:: bash - pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.05.* cuopt-sh-client==25.05.* + pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.05.* cuopt-sh-client==25.05.* nvidia-cuda-runtime-cu12==12.8.0 Conda diff --git a/docs/cuopt/source/system-requirements.rst b/docs/cuopt/source/system-requirements.rst index 7313ff0ed..54eb4394b 100644 --- a/docs/cuopt/source/system-requirements.rst +++ b/docs/cuopt/source/system-requirements.rst @@ -2,6 +2,8 @@ System Requirements =================== +Dependencies are installed automatically when using the pip and conda installation methods. But users would still need to make sure the system meets the minimum requirements. + .. 
dropdown:: Minimum Requirements * System Architecture: @@ -23,6 +25,9 @@ System Requirements * CUDA: - 12.0+ + * Python: + - >= 3.10.* and <= 3.12.* + * NVIDIA drivers: - 525.60.13+ (linux) - 527.41+ (windows) @@ -91,4 +96,4 @@ Thin-client for Self-Hosted - x86-64 - ARM64 -* Python > 3.10.x \ No newline at end of file +* Python >= 3.10.x <= 3.12.x \ No newline at end of file From f1a39c28d0d05e436cd1f8ade2b85f574657dca5 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 14:51:13 -0500 Subject: [PATCH 09/14] udpate --- CONTRIBUTING.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 082944322..6a77f4b0a 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -234,7 +234,8 @@ This will add the device debug symbols for this object file in `libcuopt.so`. Y ## Adding dependencies Please refer to [dependencies.yaml](dependencies.yaml) file for details on how to add new dependencies. -Add any new dependencies in the `dependencies.yaml` file. It takes care of conda, requirements (pip based dependencies) and pyproject.Please don't try to add dependencies directly to environment.yaml files under `conda/environments` directory and pyproject.toml files under `python` directories. +Add any new dependencies in the `dependencies.yaml` file. It takes care of conda, requirements (pip based dependencies) and pyproject. +Please don't try to add dependencies directly to environment.yaml files under `conda/environments` directory and pyproject.toml files under `python` directories. 
## Code Formatting From eb13d199bcfb9230d7436313047bf077dad6c47b Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 15:13:31 -0500 Subject: [PATCH 10/14] update examples --- README.md | 2 +- docs/cuopt/source/cuopt-python/quick-start.rst | 8 ++++---- docs/cuopt/source/cuopt-server/quick-start.rst | 6 +++--- docs/cuopt/source/resources.rst | 2 +- 4 files changed, 9 insertions(+), 9 deletions(-) diff --git a/README.md b/README.md index 3b35dd2b1..ae946b3bd 100644 --- a/README.md +++ b/README.md @@ -105,4 +105,4 @@ Review the [CONTRIBUTING.md](CONTRIBUTING.md) file for information on how to con - [cuopt (Python) documentation](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-python/index.html) - [cuopt (Server) documentation](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-server/index.html) - [Examples and Notebooks](https://github.com/NVIDIA/cuopt-examples) -- [Test cuopt with Brev](https://brev.nvidia.com/launchable/deploy?launchableID=env-2qIG6yjGKDtdMSjXHcuZX12mDNJ): Examples notebooks are pulled and hosted on [Brev](https://docs.nvidia.com/brev/latest/). \ No newline at end of file +- [Test cuopt with NVIDIA Launchable](https://brev.nvidia.com/launchable/deploy?launchableID=env-2qIG6yjGKDtdMSjXHcuZX12mDNJ): Examples notebooks are pulled and hosted on [NVIDIA Launchable](https://docs.nvidia.com/brev/latest/). \ No newline at end of file diff --git a/docs/cuopt/source/cuopt-python/quick-start.rst b/docs/cuopt/source/cuopt-python/quick-start.rst index a7c34c12d..6f635d82a 100644 --- a/docs/cuopt/source/cuopt-python/quick-start.rst +++ b/docs/cuopt/source/cuopt-python/quick-start.rst @@ -43,7 +43,7 @@ The container includes both the Python API and self-hosted server components. To .. 
code-block:: bash - docker run --gpus all -it --rm nvidia/cuopt:25.5.0-cuda12.8-py312 nvidia-cuda-runtime-cu12==12.8.0 + docker run --gpus all -it --rm nvidia/cuopt:25.5.0-cuda12.8-py312 This will start an interactive session with cuOpt pre-installed and ready to use. @@ -51,10 +51,10 @@ This will start an interactive session with cuOpt pre-installed and ready to use Make sure you have the NVIDIA Container Toolkit installed on your system to enable GPU support in containers. See the `installation guide `_ for details. -Brev ----- +NVIDIA Launchable +------------------- -NVIDIA cuOpt can be tested with `Brev Launchable `_ with `example notebooks `_. For more details, please refer to the `Brev documentation `_. +NVIDIA cuOpt can be tested with `NVIDIA Launchable `_ with `example notebooks `_. For more details, please refer to the `NVIDIA Launchable documentation `_. Smoke Test ---------- diff --git a/docs/cuopt/source/cuopt-server/quick-start.rst b/docs/cuopt/source/cuopt-server/quick-start.rst index 9c0ca57a3..990844a22 100644 --- a/docs/cuopt/source/cuopt-server/quick-start.rst +++ b/docs/cuopt/source/cuopt-server/quick-start.rst @@ -82,10 +82,10 @@ The container includes both the Python API and self-hosted server components. To docker run --gpus all -it --rm -p 8000:8000 -e CUOPT_SERVER_PORT=8000 /bin/bash -c "python3 -m cuopt_server.cuopt_service" -Brev ----- +NVIDIA Launchable +------------------- -NVIDIA cuOpt can be tested with `Brev Launchable `_ with `example notebooks `_. For more details, please refer to the `Brev documentation `_. +NVIDIA cuOpt can be tested with `NVIDIA Launchable `_ with `example notebooks `_. For more details, please refer to the `NVIDIA Launchable documentation `_. 
Smoke Test ---------- diff --git a/docs/cuopt/source/resources.rst b/docs/cuopt/source/resources.rst index 978778ef7..e5952be20 100644 --- a/docs/cuopt/source/resources.rst +++ b/docs/cuopt/source/resources.rst @@ -6,7 +6,7 @@ Resources `Sample Notebooks `_ ---------------------------------------------------------------------------------- -`Test cuopt with Brev `_ +`Test cuopt with NVIDIA Launchable `_ ------------------------------------------------------------------------------------------------------------------------ `File a Bug `_ From 4ededa28f41eca4c03c50e5e5de0e715087a0606 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 15:33:18 -0500 Subject: [PATCH 11/14] Address review comments --- CONTRIBUTING.md | 4 ++-- README.md | 10 +++++----- ci/README.md | 12 ++++++------ cpp/README.md | 4 ++-- docs/cuopt/README.md | 4 ++-- docs/cuopt/source/cuopt-c/quick-start.rst | 6 +++--- docs/cuopt/source/cuopt-python/quick-start.rst | 2 +- docs/cuopt/source/cuopt-server/quick-start.rst | 2 +- docs/cuopt/source/faq.rst | 2 +- docs/cuopt/source/introduction.rst | 5 ++--- docs/cuopt/source/system-requirements.rst | 7 ++++--- python/README.md | 8 ++++---- 12 files changed, 33 insertions(+), 33 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 6a77f4b0a..12e8efa33 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -71,7 +71,7 @@ for a minimal build of NVIDIA cuOpt without using conda are also listed below. Compilers: -These will be installed while creating the conda environment +These will be installed while creating the Conda environment * `gcc` version 13.0+ * `nvcc` version 12.8+ @@ -233,7 +233,7 @@ This will add the device debug symbols for this object file in `libcuopt.so`. Y ## Adding dependencies -Please refer to [dependencies.yaml](dependencies.yaml) file for details on how to add new dependencies. +Please refer to the [dependencies.yaml](dependencies.yaml) file for details on how to add new dependencies. 
Add any new dependencies in the `dependencies.yaml` file. It takes care of conda, requirements (pip based dependencies) and pyproject. Please don't try to add dependencies directly to environment.yaml files under `conda/environments` directory and pyproject.toml files under `python` directories. diff --git a/README.md b/README.md index ae946b3bd..7120d3207 100644 --- a/README.md +++ b/README.md @@ -2,7 +2,7 @@ [![Build Status](https://github.com/NVIDIA/cuopt/actions/workflows/build.yaml/badge.svg)](https://github.com/NVIDIA/cuopt/actions/workflows/build.yaml) -NVIDIA® cuOpt™ is a GPU-accelerated optimization engine that excels in mixed integer programming (MIP), linear programming (LP), and vehicle routing problems (VRP). It enables near real-time solutions for large-scale challenges with millions of variables and constraints, offering +NVIDIA® cuOpt™ is a GPU-accelerated optimization engine that excels in mixed integer linear programming (MILP), linear programming (LP), and vehicle routing problems (VRP). It enables near real-time solutions for large-scale challenges with millions of variables and constraints, offering easy integration into existing solvers and seamless deployment across hybrid and multi-cloud environments. Core engine is written in C++ which is wrapped into C API, Python API and Server API. @@ -60,7 +60,7 @@ on the major version of CUDA available in your environment: For CUDA 12.x: ```bash -pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.5 cuopt-sh-client==25.5 nvidia-cuda-runtime-cu12==12.8.0 +pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.5 cuopt-sh-client==25.5 nvidia-cuda-runtime-cu12==12.8.* ``` ### Conda @@ -82,14 +82,14 @@ of our latest development branch. ### Container -Users can pull the cuOpt container from the NVIDIA container registry +Users can pull the cuOpt container from the NVIDIA container registry. 
```bash docker pull nvidia/cuopt:25.5.0-cuda12.8-py312 ``` -More information about the cuOpt container can be found [here](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-server/quick-start.html#container-from-docker-hub) +More information about the cuOpt container can be found [here](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-server/quick-start.html#container-from-docker-hub). -Users who are using cuOpt for quick testing or research can use the cuOpt container. Or users who are planning to pluging cuOpt as a service in their workflow can quickly start with the cuOpt container. But users are required to build security layers around the service to safeguard the service from untrusted users. +Users who are using cuOpt for quick testing or research can use the cuOpt container. Alternatively, users who are planning to plug cuOpt as a service in their workflow can quickly start with the cuOpt container. But users are required to build security layers around the service to safeguard the service from untrusted users. ## Build from Source and Test diff --git a/ci/README.md b/ci/README.md index d24225216..b1344a947 100644 --- a/ci/README.md +++ b/ci/README.md @@ -2,7 +2,7 @@ This directory contains the scripts for the CI pipeline. -CI builds are triggered by `pr.yaml`, `build.yaml` and `test.yaml` files in in `.github/workflows` directory. And these scripts are use from those workflows to build and test the code. +CI builds are triggered by `pr.yaml`, `build.yaml` and `test.yaml` files in the `.github/workflows` directory. And these scripts are used from those workflows to build and test the code. cuOpt is packaged in following ways: @@ -10,13 +10,13 @@ cuOpt is packaged in following ways: ### Build -The scripts for building the PIP packages are named as `build_wheel_.sh`. For Example, `build_wheel_cuopt.sh` is used to build the PIP package for cuOpt. +The scripts for building the PIP packages are named as `build_wheel_.sh`. 
For example, `build_wheel_cuopt.sh` is used to build the PIP package for cuOpt. Please refer to existing scripts for more details and how you can add a new script for a new package. ### Test -The scripts for testing the PIP packages are named as `test_wheel_.sh`. For Example, `test_wheel_cuopt.sh` is used to test the PIP package for cuOpt. +The scripts for testing the PIP packages are named as `test_wheel_.sh`. For example, `test_wheel_cuopt.sh` is used to test the PIP package for cuOpt. Please refer to existing scripts for more details and how you can add a new script for a new package. @@ -24,17 +24,17 @@ Please refer to existing scripts for more details and how you can add a new scri ### Build -For conda package, +For Conda package, - all cpp libraries are built under one script called `build_cpp.sh`. - all python bindings are built under one script called `build_python.sh`. -So if if there are new cpp libraries or python bindings, you need to add them to the respective scripts. +So if there are new cpp libraries or python bindings, you need to add them to the respective scripts. ### Test -Similarly, for conda package, +Similarly, for Conda package, - all cpp libraries are tested under one script called `test_cpp.sh`. - all python bindings are tested under one script called `test_python.sh`. diff --git a/cpp/README.md b/cpp/README.md index 94362b812..974f22cc7 100644 --- a/cpp/README.md +++ b/cpp/README.md @@ -2,9 +2,9 @@ This directory contains the C++ modules for the cuOpt project. -Please refer to [CMakeLists.txt](CMakeLists.txt) file for details on how to add new modules and tests. +Please refer to the [CMakeLists.txt](CMakeLists.txt) file for details on how to add new modules and tests. -Most of the dependencies are defined in the [dependencies.yaml](../dependencies.yaml) file. Please refer to different sections in the [dependencies.yaml](../dependencies.yaml) file for more details. 
But some of the dependencies are defined in [thirdparty modules](cmake/thirdparty/) in case where source code is needed to build, for example, `cccl` and `rmm`. +Most of the dependencies are defined in the [dependencies.yaml](../dependencies.yaml) file. Please refer to different sections in the [dependencies.yaml](../dependencies.yaml) file for more details. However, some of the dependencies are defined in [thirdparty modules](cmake/thirdparty/) in cases where the source code is needed to build, for example, `cccl` and `rmm`. ## Include Structure diff --git a/docs/cuopt/README.md b/docs/cuopt/README.md index 94ebe06ce..435224424 100644 --- a/docs/cuopt/README.md +++ b/docs/cuopt/README.md @@ -1,12 +1,12 @@ # Building Documentation -Documentation dependencies are installed while installing conda environment, please refer to the [Build and Test](../../CONTRIBUTING.md#building-with-a-conda-environment) for more details. Assuming you have set-up conda environment, you can build the documentation along with all the cuopt libraries by running: +Documentation dependencies are installed while installing the Conda environment; please refer to the [Build and Test](../../CONTRIBUTING.md#building-with-a-conda-environment) section for more details. Assuming you have set up the Conda environment, you can build the documentation along with all the cuOpt libraries by running: ```bash ./build.sh ``` -In subsequent runs where there are no changes to the cuopt libraries, documentation can be built by running: +In subsequent runs where there are no changes to the cuOpt libraries, documentation can be built by running: 1.
From the root directory: ```bash diff --git a/docs/cuopt/source/cuopt-c/quick-start.rst b/docs/cuopt/source/cuopt-c/quick-start.rst index 6ac41b333..a913f9593 100644 --- a/docs/cuopt/source/cuopt-c/quick-start.rst +++ b/docs/cuopt/source/cuopt-c/quick-start.rst @@ -13,12 +13,12 @@ pip For CUDA 12.x: -This wheel is python wrapper around the C++ library and eases installation and access to libcuopt. This also help in pip environment to load libraries dynamically while using python SDK. +This wheel is a Python wrapper around the C++ library and eases installation and access to libcuopt. This also helps in the pip environment to load libraries dynamically while using the Python SDK. .. code-block:: bash - # This is deprecated module and not longer used, but share same name for the CLI, so we need to uninstall it first if it exists. + # This is a deprecated module and no longer used, but it shares the same name for the CLI, so we need to uninstall it first if it exists. pip uninstall cuopt-thin-client pip install --extra-index-url=https://pypi.nvidia.com libcuopt-cu12==25.5.* nvidia-cuda-runtime-cu12==12.8.0 @@ -32,7 +32,7 @@ For CUDA 12.x: .. code-block:: bash - # This is deprecated module and not longer used, but share same name for the CLI, so we need to uninstall it first if it exists. + # This is a deprecated module and no longer used, but it shares the same name for the CLI, so we need to uninstall it first if it exists. conda remove cuopt-thin-client conda install -c rapidsai -c conda-forge -c nvidia \ libcuopt=25.05.* python=3.12 cuda-version=12.8 diff --git a/docs/cuopt/source/cuopt-python/quick-start.rst b/docs/cuopt/source/cuopt-python/quick-start.rst index 6f635d82a..993b25e1c 100644 --- a/docs/cuopt/source/cuopt-python/quick-start.rst +++ b/docs/cuopt/source/cuopt-python/quick-start.rst @@ -14,7 +14,7 @@ For CUDA 12.x: .. 
code-block:: bash - pip install --extra-index-url=https://pypi.nvidia.com cuopt-cu12==25.5.* + pip install --extra-index-url=https://pypi.nvidia.com cuopt-cu12==25.5.* nvidia-cuda-runtime-cu12==12.8.* Conda diff --git a/docs/cuopt/source/cuopt-server/quick-start.rst b/docs/cuopt/source/cuopt-server/quick-start.rst index 990844a22..5eed0cdc8 100644 --- a/docs/cuopt/source/cuopt-server/quick-start.rst +++ b/docs/cuopt/source/cuopt-server/quick-start.rst @@ -12,7 +12,7 @@ For CUDA 12.x: .. code-block:: bash - pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.05.* cuopt-sh-client==25.05.* nvidia-cuda-runtime-cu12==12.8.0 + pip install --extra-index-url=https://pypi.nvidia.com cuopt-server-cu12==25.05.* cuopt-sh-client==25.05.* nvidia-cuda-runtime-cu12==12.8.* Conda diff --git a/docs/cuopt/source/faq.rst b/docs/cuopt/source/faq.rst index 74c93b8fe..60c24f5c2 100644 --- a/docs/cuopt/source/faq.rst +++ b/docs/cuopt/source/faq.rst @@ -47,7 +47,7 @@ General FAQ Yes, please refer to `system requirements `_ for GPU specifications. You can acquire a cloud instance with a supported GPU and launch cuOpt; alternatively, you can launch it in your local machine if it meets the requirements. -.. dropdown:: Does cuOpt use multiple GPUs/multi-GPUs/multi gpus? +.. dropdown:: Does cuOpt use multiple GPUs/multi-GPUs/multi GPUs? #. Yes, in cuOpt self-hosted server, a solver process per GPU can be configured to run multiple solvers. Requests are accepted in a round-robin queue. More details are available in `server api `_. #. There is no support for leveraging multiple GPUs to solve a single problem or oversubscribing a single GPU for multiple solvers. diff --git a/docs/cuopt/source/introduction.rst b/docs/cuopt/source/introduction.rst index cac22bae6..d3878d3ea 100644 --- a/docs/cuopt/source/introduction.rst +++ b/docs/cuopt/source/introduction.rst @@ -15,8 +15,7 @@ As part of `NVIDIA AI Enterprise `__ for more information about the NVIDIA Developer Program. 
-Core engine is built on C++ and all the APIs are built on top of it as wrappers. For example, cuOpt python API uses cython to wrap the C++ core engine and provide a Python interface. -Similarly, other interfaces wrap different layers to communicate with the core engine. +The core engine is built on C++ and all the APIs are built on top of it as wrappers. For example, cuOpt Python API uses Cython to wrap the C++ core engine and provide a Python interface. Similarly, other interfaces wrap different layers to communicate with the core engine. Routing (TSP, VRP, and PDP) ============================= @@ -121,7 +120,7 @@ cuOpt supports the following APIs: - `Routing (TSP, VRP, and PDP) - Server `_ ================================== -INSTALLATION OPTIONS +Installation Options ================================== NVIDIA cuOpt is available in several formats to suit different deployment needs: diff --git a/docs/cuopt/source/system-requirements.rst b/docs/cuopt/source/system-requirements.rst index 54eb4394b..216b4e4e2 100644 --- a/docs/cuopt/source/system-requirements.rst +++ b/docs/cuopt/source/system-requirements.rst @@ -2,7 +2,7 @@ System Requirements =================== -Dependencies are installed automatically when using the pip and conda installation methods. But users would still need to make sure the system meets the minimum requirements. +Dependencies are installed automatically when using the pip and Conda installation methods. However, users would still need to make sure the system meets the minimum requirements. .. 
dropdown:: Minimum Requirements @@ -29,8 +29,9 @@ Dependencies are installed automatically when using the pip and conda installati - >= 3.10.* and <= 3.12.* * NVIDIA drivers: - - 525.60.13+ (linux) - - 527.41+ (windows) + - 525.60.13+ (Linux) + - 527.41+ (Windows) + * OS: - Linux distributions with glibc>=2.28 (released in August 2018): * Arch Linux (minimum version 2018-08-02) diff --git a/python/README.md b/python/README.md index 6eb9cf26f..47536d698 100644 --- a/python/README.md +++ b/python/README.md @@ -4,7 +4,7 @@ This directory contains the Python modules for the cuOpt project. ## Package Structure -- Each subdirectory contains the Python modules for a specific cuOpt package. For example, `libcuopt` directory contains the Python wrappers for the cuOpt C++ library. This is the main package for the cuOpt project. And it just loads shared libraries and make it available for other python modules. `cuopt` python package uses `libcuopt` package as dependency and build on top of it. +- Each subdirectory contains the Python modules for a specific cuOpt package. For example, the `libcuopt` directory contains the Python wrappers for the cuOpt C++ library. This is the main package for the cuOpt project; it just loads shared libraries and makes them available for other Python modules. The `cuopt` Python package uses the `libcuopt` package as a dependency and builds on top of it. ```bash python/ ├── libcuopt/ ├── cuopt/ └── ... ``` -- Each of these python modules have `tests` directory which contains the tests for the module. Python tests are written using `pytest`. For example, `python/cuopt/cuopt/tests/` directory contains the tests for the `cuopt` python package. +- Each of these Python modules has a `tests` directory that contains the tests for the module. Python tests are written using `pytest`. For example, the `python/cuopt/cuopt/tests/` directory contains the tests for the `cuopt` Python package. ```bash python/ ├── cuopt/ │ ├── cuopt/ │ │ ├── tests/ └── ...
``` -- Each of these pyhon modules have pyproject.toml file which contains the dependencies for the module. For example, `python/cuopt/pyproject.toml` file contains the dependencies for the `cuopt` python package. +- Each of these Python modules has a `pyproject.toml` file that contains the dependencies for the module. For example, the `python/cuopt/pyproject.toml` file contains the dependencies for the `cuopt` Python package. ```bash python/ ├── cuopt/ │ ├── pyproject.toml └── ... ``` -- The dependencies are defined in the [dependencies.yaml](../dependencies.yaml) file in root folder. For example, `python/cuopt/pyproject.toml` file contains the dependencies for the `cuopt` python package. So any changes to dependencies should be done in the [dependencies.yaml](../dependencies.yaml) file. Please refer to different sections in the [dependencies.yaml](../dependencies.yaml) file for more details. +- The dependencies are defined in the [dependencies.yaml](../dependencies.yaml) file in the root folder. For example, the `python/cuopt/pyproject.toml` file contains the dependencies for the `cuopt` Python package. Therefore, any changes to dependencies should be done in the [dependencies.yaml](../dependencies.yaml) file. Please refer to different sections in the [dependencies.yaml](../dependencies.yaml) file for more details.
From 8da06b6a9e471f05d7886a73c9acea1a4da6cf13 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 15:39:57 -0500 Subject: [PATCH 12/14] fix doc build failure --- docs/cuopt/source/resources.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/cuopt/source/resources.rst b/docs/cuopt/source/resources.rst index e5952be20..d8cd4f103 100644 --- a/docs/cuopt/source/resources.rst +++ b/docs/cuopt/source/resources.rst @@ -7,7 +7,7 @@ Resources ---------------------------------------------------------------------------------- `Test cuopt with NVIDIA Launchable `_ ------------------------------------------------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------------ `File a Bug `_ ----------------------------------------------------------------- From 9fc0be0c5b6ce089d98395b3d30192fa8a0a59a8 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 15:59:32 -0500 Subject: [PATCH 13/14] Add google colab links --- README.md | 3 ++- docs/cuopt/source/resources.rst | 4 ++++ 2 files changed, 6 insertions(+), 1 deletion(-) diff --git a/README.md b/README.md index 7120d3207..39d45e4af 100644 --- a/README.md +++ b/README.md @@ -105,4 +105,5 @@ Review the [CONTRIBUTING.md](CONTRIBUTING.md) file for information on how to con - [cuopt (Python) documentation](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-python/index.html) - [cuopt (Server) documentation](https://docs.nvidia.com/cuopt/user-guide/latest/cuopt-server/index.html) - [Examples and Notebooks](https://github.com/NVIDIA/cuopt-examples) -- [Test cuopt with NVIDIA Launchable](https://brev.nvidia.com/launchable/deploy?launchableID=env-2qIG6yjGKDtdMSjXHcuZX12mDNJ): Examples notebooks are pulled and hosted on [NVIDIA Launchable](https://docs.nvidia.com/brev/latest/). 
\ No newline at end of file +- [Test cuopt with NVIDIA Launchable](https://brev.nvidia.com/launchable/deploy?launchableID=env-2qIG6yjGKDtdMSjXHcuZX12mDNJ): Examples notebooks are pulled and hosted on [NVIDIA Launchable](https://docs.nvidia.com/brev/latest/). +- [Test cuopt on Google Colab](https://colab.research.google.com/github/nvidia/cuopt-examples/): Examples notebooks can be opened in Google Colab. Please note that you need to choose a `Runtime` as `GPU` in order to run the notebooks. \ No newline at end of file diff --git a/docs/cuopt/source/resources.rst b/docs/cuopt/source/resources.rst index d8cd4f103..dde06479d 100644 --- a/docs/cuopt/source/resources.rst +++ b/docs/cuopt/source/resources.rst @@ -9,6 +9,10 @@ Resources `Test cuopt with NVIDIA Launchable `_ ------------------------------------------------------------------------------------------------------------------------------ +`Test cuOpt on Google Colab `_ +------------------------------------------------------------------------------------------------------------------------ +Please note that you need to choose a `Runtime` as `GPU` in order to run the notebooks. + `File a Bug `_ ----------------------------------------------------------------- From e20a9711a50255c28a6535b5961fe77327a22533 Mon Sep 17 00:00:00 2001 From: Ramakrishna Prabhu Date: Wed, 28 May 2025 17:02:38 -0500 Subject: [PATCH 14/14] update faq --- docs/cuopt/source/faq.rst | 13 +++++++++++++ 1 file changed, 13 insertions(+) diff --git a/docs/cuopt/source/faq.rst b/docs/cuopt/source/faq.rst index 60c24f5c2..d452813c9 100644 --- a/docs/cuopt/source/faq.rst +++ b/docs/cuopt/source/faq.rst @@ -81,6 +81,19 @@ General FAQ #. The complete round-trip solve time might be more than what was set. +.. dropdown:: Why am I getting "libcuopt.so: cannot open shared object file: No such file or directory" error? + + This error indicates that the cuOpt shared library is not found. 
Please check the following: + + - cuOpt is installed + - Use ``find / -name libcuopt.so`` to search for the library path from the root directory. You might need to run this command as the root user. + - If the library is found, please add it to the ``LD_LIBRARY_PATH`` environment variable as shown below: + + .. code-block:: bash + + export LD_LIBRARY_PATH=/path/to/cuopt/lib:$LD_LIBRARY_PATH + + - If the library is not found, it means it is not yet installed. Please check the cuOpt installation guide for more details. .. dropdown:: Is there a way to make cuOpt also account for other overheads in the same time limit provided?
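The ``LD_LIBRARY_PATH`` advice in the libcuopt.so FAQ entry above relies on the dynamic loader searching a colon-separated list of directories for the library file. As an illustrative sketch only — ``find_shared_library`` is not part of cuOpt, and it mimics just the ``LD_LIBRARY_PATH`` step (the real loader also consults the ldconfig cache and default system directories) — that search looks like this:

```python
import os


def find_shared_library(name, search_path=None):
    """Return the first match for `name` in a colon-separated directory list.

    Mimics only the LD_LIBRARY_PATH portion of the dynamic loader's lookup;
    the real loader also checks the ldconfig cache and default directories.
    """
    if search_path is None:
        search_path = os.environ.get("LD_LIBRARY_PATH", "")
    for directory in search_path.split(":"):
        if not directory:
            continue  # skip empty entries left by leading/trailing colons
        candidate = os.path.join(directory, name)
        if os.path.isfile(candidate):
            return candidate
    return None  # the loader would fail with "cannot open shared object file"
```

If this returns ``None`` for ``libcuopt.so`` even though ``find`` located the file, the containing directory simply is not on ``LD_LIBRARY_PATH`` yet; exporting it as shown in the FAQ entry is what makes the loader succeed.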