Fixed installation docs for 1.34 release (#3667)
* Fixed wheel file naming mechanism in installation (#3292)
* Updated versioning and naming of wheel files in conf.py and install RST file
* Updated installation instructions and fixed wheel file installation commands (#3335)
* Updated docker and host wheel install instructions to make them more generic across releases
* Updated install instructions: torch GPU to CUDA 12.1 and ONNX to 1.16
* Added pytorch variant cuda versions

Signed-off-by: Bharath Ramaswamy <quic_bharathr@quicinc.com>
Signed-off-by: Joel Polizzi <quic_jpolizzi@quicinc.com>
Co-authored-by: Joel Polizzi <quic_jpolizzi@quicinc.com>
Co-authored-by: Bharath Ramaswamy <quic_bharathr@quicinc.com>
Signed-off-by: Bharath Ramaswamy <quic_bharathr@quicinc.com>
quic-bharathr and quic-jpolizzi committed Sep 13, 2024
1 parent 4f7bc22 commit 89184ab
Showing 4 changed files with 68 additions and 116 deletions.
1 change: 1 addition & 0 deletions Docs/conf.py
@@ -173,6 +173,7 @@ def setup(app):
.. |author| replace:: {author}
.. |project| replace:: {project}
.. |default-quantsim-config-file| replace:: aimet_common/quantsim_config/default_config.json
.. |version| replace:: {version}
""".format(project=project, author=author, version=version)

# -- Options for LaTeX output ------------------------------------------------
39 changes: 19 additions & 20 deletions Docs/install/index.rst
@@ -47,10 +47,10 @@ The AIMET PyTorch GPU PyPI packages are available for environments that meet the

* 64-bit Intel x86-compatible processor
* Linux Ubuntu 22.04 LTS [Python 3.10] or Ubuntu 20.04 LTS [Python 3.8]
* Cuda 12.0
* CUDA 12.0
* Torch 2.2.2

**Pip install:**
**Pip install**

.. code-block::
@@ -61,48 +61,47 @@ The AIMET PyTorch GPU PyPI packages are available for environments that meet the
Release Packages
~~~~~~~~~~~~~~~~

For other aimet variants, install the latest version from the .whl files hosted at https://github.com/quic/aimet/releases
For other AIMET variants, install the *latest* version from the .whl files hosted at https://github.com/quic/aimet/releases

**PyTorch**

.. parsed-literal::
# Pytorch 2.1 with CUDA 12.x
python3 -m pip install |download_url|\ |version|/aimet_torch-\ |version|.cu121\ |whl_suffix|
# Pytorch 2.1 CPU only
python3 -m pip install |download_url|\ |version|/aimet_torch-\ |version|.cpu\ |whl_suffix|
# Pytorch 1.13 with CUDA 11.x
python3 -m pip install |download_url|\ |version|/aimet_torch-torch_gpu\_\ |version|\ |whl_suffix|
# Pytorch 1.13 CPU only
python3 -m pip install |download_url|\ |version|/aimet_torch-torch_cpu\_\ |version|\ |whl_suffix|
python3 -m pip install |download_url|\ |version|/aimet_torch-\ |version|.cu117\ |whl_suffix|
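
For illustration, once the |download_url|, |version| and |whl_suffix| substitutions defined later in this file are applied, the PyTorch 2.1 / CUDA 12.x command above would expand to something like the following. The release tag and wheel file name are assumptions, so verify the exact name on the release page:

.. code-block:: bash

    # Illustrative only -- assumes the 1.34.0 release and its cu121 wheel
    python3 -m pip install https://github.com/quic/aimet/releases/download/1.34.0/aimet_torch-1.34.0.cu121-cp310-cp310-manylinux_2_34_x86_64.whl
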
**TensorFlow**

.. parsed-literal::
# Tensorflow 2.10 GPU with CUDA 11.x
python3 -m pip install |download_url|\ |version|/aimet_tensorflow-tf_gpu\_\ |version|\ |whl_suffix|
python3 -m pip install |download_url|\ |version|/aimet_tensorflow-\ |version|.cu118\ |whl_suffix|
# Tensorflow 2.10 CPU only
python3 -m pip install |download_url|\ |version|/aimet_tensorflow-tf_cpu\_\ |version|\ |whl_suffix|
python3 -m pip install |download_url|\ |version|/aimet_tensorflow-\ |version|.cpu\ |whl_suffix|
**ONNX**

.. parsed-literal::
# ONNX 1.14 GPU
python3 -m pip install |download_url|\ |version|/aimet_onnx-onnx_gpu\_\ |version|\ |whl_suffix|
# ONNX 1.14 CPU
python3 -m pip install |download_url|\ |version|/aimet_onnx-onnx_cpu\_\ |version|\ |whl_suffix|
# ONNX 1.16 GPU with CUDA 11.x
python3 -m pip install |download_url|\ |version|/aimet_onnx-\ |version|.cu117\ |whl_suffix|
For previous AIMET releases, browse packages at https://github.com/quic/aimet/releases. Each release includes multiple Python packages of the following format:

.. parsed-literal::
# ONNX 1.16 CPU
python3 -m pip install |download_url|\ |version|/aimet_onnx-\ |version|.cpu\ |whl_suffix|
# VARIANT in {torch_gpu, torch_cpu, tf_gpu, tf_cpu, onnx_gpu, onnx_cpu}
# PACKAGE_PREFIX in {aimet_torch, aimet_tensorflow, aimet_onnx}
<PACKAGE_PREFIX>-<VARIANT>_<VERSION>\ |whl_suffix|
For older versions, please browse the releases at https://github.com/quic/aimet/releases and follow the documentation corresponding to that release to select and install the appropriate package.
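
As an assumed example of that older naming scheme (using the 1.31.0 torch_gpu wheel purely for illustration), the install command would look like:

.. code-block:: bash

    # Illustrative only -- confirm the actual file name on the chosen release page
    python3 -m pip install https://github.com/quic/aimet/releases/download/1.31.0/aimet_torch-torch_gpu_1.31.0-cp38-cp38-linux_x86_64.whl
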

.. |version| replace:: 1.31.0
.. |whl_suffix| replace:: -cp38-cp38-linux_x86_64.whl
.. |whl_suffix| replace:: -cp310-cp310-manylinux_2_34_x86_64.whl
.. |download_url| replace:: \https://github.com/quic/aimet/releases/download/

System Requirements
74 changes: 25 additions & 49 deletions Docs/install/install_docker.rst
@@ -48,7 +48,6 @@ Set the `<variant_string>` to ONE of the following depending on your desired var
#. For the PyTorch 2.1 GPU variant, use `torch-gpu`
#. For the PyTorch 2.1 CPU variant, use `torch-cpu`
#. For the PyTorch 1.13 GPU variant, use `torch-gpu-pt113`
#. For the PyTorch 1.13 CPU variant, use `torch-cpu-pt113`
#. For the TensorFlow GPU variant, use `tf-gpu`
#. For the TensorFlow CPU variant, use `tf-cpu`
#. For the ONNX GPU variant, use `onnx-gpu`
@@ -121,73 +120,50 @@ Install AIMET packages
From PyPI
=========

Aimet Torch GPU can install from pypi through the following method:
The default AIMET Torch GPU variant may be installed from PyPI as follows:
- Go to https://pypi.org/project/aimet-torch
- Browse the Requirements section of each Release to identify the version you wish to install. Following are some tips:
- For Pytorch 2.2.2 GPU with CUDA 12.1, use aimet-torch>=1.32.2
- For Pytorch 2.1.2 GPU with CUDA 12.1, use aimet-torch==1.32.1.post1
- For PyTorch 1.13 GPU with CUDA 11.7, use aimet-torch==1.31.2

Go to https://pypi.org/project/aimet-torch to identify a version you wish to install

- For PyTorch 1.13 GPU you should use aimet-torch==1.31.1
- For Pytorch 2.1.2 GPU you should use aimet-torch >= 1.32.0
Run the following commands to install the package (prepend with "sudo" and/or package version as needed):

.. code-block:: bash
sudo apt-get install liblapacke -y
pip install aimet-torch
apt-get install liblapacke -y
python3 -m pip install aimet-torch
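
For example, to follow the version tips above and pin a specific build (an illustrative choice, not a requirement), the install command could be:

.. code-block:: bash

    # Hypothetical pin for the PyTorch 2.1.2 / CUDA 12.1 build mentioned in the tips above
    python3 -m pip install "aimet-torch==1.32.1.post1"
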
From Release Package
====================

Alternatively, we host .whl packages for each release at https://github.com/quic/aimet/releases. Identify the release tag
of the package you wish to install, then follow the instructions below to install AIMET from the .whl file.

Set the <variant_string> to ONE of the following depending on your desired variant
We also host Python wheel packages for different variants, which may be installed as follows:
- Go to https://github.com/quic/aimet/releases
- Identify the release tag of the package that you wish to install
- Identify the .whl file corresponding to the package variant that you wish to install
- Follow the instructions below to install AIMET from the .whl file

#. For the PyTorch 2.1 GPU variant, use "torch_gpu"
#. For the PyTorch 2.1 CPU variant, use "torch_cpu"
#. For the PyTorch 1.13 GPU variant, use "torch_gpu-pt113"
#. For the PyTorch 1.13 CPU variant, use "torch_cpu-pt113"
#. For the TensorFlow GPU variant, use "tf_gpu"
#. For the TensorFlow CPU variant, use "tf_cpu"
#. For the ONNX GPU variant, use "onnx_gpu"
#. For the ONNX CPU variant, use "onnx_cpu"
Set the package details as follows:

.. code-block:: bash
export AIMET_VARIANT=<variant_string>
Replace <release_tag> in the steps below with the appropriate tag:

.. code-block:: bash
export release_tag=<release_tag>
Set the package download URL as follows:

.. code-block:: bash
# Set the release tag ex. "1.34.0"
export release_tag="<version release tag>"
# Construct the download root URL
export download_url="https://github.com/quic/aimet/releases/download/${release_tag}"
Set the common suffix for the package files as follows:

.. code-block:: bash
export wheel_file_suffix="cp310-cp310-linux_x86_64.whl"
Install the AIMET packages in the order specified below:
# Set the wheel file name with extension
# ex. "aimet_torch-1.34.0.cu121-cp310-cp310-manylinux_2_34_x86_64.whl"
export wheel_file_name="<wheel file name>"
**NOTE:**
#. Please pre-pend the "apt-get install" and "pip3 install" commands with "sudo -H" as appropriate.
#. These instructions assume that pip packages will be installed in the path: /usr/local/lib/python3.10/dist-packages. If that is not the case, please modify it accordingly.
#. Python dependencies will automatically get installed.
Install the selected AIMET package as specified below:
**NOTE:** Python dependencies will automatically get installed.

.. code-block:: bash
# Install ONE of the following depending on the variant
python3 -m pip install ${download_url}/aimet_torch-${AIMET_VARIANT}_${release_tag}-${wheel_file_suffix} -f https://download.pytorch.org/whl/torch_stable.html
# OR
python3 -m pip install ${download_url}/aimet_tensorflow-${AIMET_VARIANT}_${release_tag}-${wheel_file_suffix}
# OR
python3 -m pip install ${download_url}/aimet_onnx-${AIMET_VARIANT}_${release_tag}-${wheel_file_suffix}
python3 -m pip install ${download_url}/${wheel_file_name}
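
Putting the steps above together, a worked example with assumed values (check the release page for the real file name) might look like:

.. code-block:: bash

    # Assumed release tag and wheel file name, for illustration only
    export release_tag="1.34.0"
    export download_url="https://github.com/quic/aimet/releases/download/${release_tag}"
    export wheel_file_name="aimet_torch-1.34.0.cu121-cp310-cp310-manylinux_2_34_x86_64.whl"
    python3 -m pip install ${download_url}/${wheel_file_name}
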
Environment setup
~~~~~~~~~~~~~~~~~
70 changes: 23 additions & 47 deletions Docs/install/install_host.rst
@@ -100,73 +100,49 @@ Install AIMET packages
From PyPI
=========

Aimet Torch GPU can install from pypi through the following method:
The default AIMET Torch GPU variant may be installed from PyPI as follows:
- Go to https://pypi.org/project/aimet-torch
- Browse the Requirements section of each Release to identify the version you wish to install. Following are some tips:
- For Pytorch 2.2.2 GPU with CUDA 12.1, use aimet-torch>=1.32.2
- For Pytorch 2.1.2 GPU with CUDA 12.1, use aimet-torch==1.32.1.post1
- For PyTorch 1.13 GPU with CUDA 11.7, use aimet-torch==1.31.2

Go to https://pypi.org/project/aimet-torch to identify a version you wish to install

- For PyTorch 1.13 GPU you should use aimet-torch==1.31.1
- For Pytorch 2.1.2 GPU you should use aimet-torch >= 1.32.0
Run the following commands to install the package (prepend with "sudo" and/or package version as needed):

.. code-block:: bash
sudo apt-get install liblapacke -y
pip install aimet-torch
apt-get install liblapacke -y
python3 -m pip install aimet-torch
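
For instance, following the tips above, the build for PyTorch 2.2.2 with CUDA 12.1 could be selected with a version specifier (shown here as an assumed example):

.. code-block:: bash

    # Hypothetical version pin; adjust to the version identified on PyPI
    python3 -m pip install "aimet-torch>=1.32.2"
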
From Release Package
====================

Alternatively, we host .whl packages for each release at https://github.com/quic/aimet/releases. Identify the release tag
of the package you wish to install, then follow the instructions below to install AIMET from the .whl file.

Set the <variant_string> to ONE of the following depending on your desired variant

#. For the PyTorch 2.1 GPU variant, use "torch_gpu"
#. For the PyTorch 2.1 CPU variant, use "torch_cpu"
#. For the PyTorch 1.13 GPU variant, use "torch_gpu_pt113"
#. For the PyTorch 1.13 CPU variant, use "torch_cpu_pt113"
#. For the TensorFlow GPU variant, use "tf_gpu"
#. For the TensorFlow CPU variant, use "tf_cpu"
#. For the ONNX GPU variant, use "onnx_gpu"
#. For the ONNX CPU variant, use "onnx_cpu"
We also host Python wheel packages for different variants, which may be installed as follows:
- Go to https://github.com/quic/aimet/releases
- Identify the release tag of the package that you wish to install
- Identify the .whl file corresponding to the package variant that you wish to install
- Follow the instructions below to install AIMET from the .whl file

.. code-block:: bash
export AIMET_VARIANT=<variant_string>
Replace <release_tag> in the steps below with the appropriate tag:
Set the package details as follows:

.. code-block:: bash
export release_tag=<release_tag>
Set the package download URL as follows:

.. code-block:: bash
# Set the release tag ex. "1.34.0"
export release_tag="<version release tag>"
# Construct the download root URL
export download_url="https://github.com/quic/aimet/releases/download/${release_tag}"
Set the common suffix for the package files as follows:

**NOTE:** Set wheel_file_suffix to cp310-cp310-linux_x86_64.whl OR cp38-cp38-linux_x86_64.whl OR cp36-cp36m-linux_x86_64 OR cp37-cp37m-linux_x86_64 OR py3-none-any as appropriate depending on the actual wheel filename(s) on https://github.com/quic/aimet/releases.

.. code-block:: bash
export wheel_file_suffix="cp310-cp310-linux_x86_64.whl"
Install the AIMET packages in the order specified below:
# Set the wheel file name with extension
# ex. "aimet_torch-1.33.0.cu121-cp310-cp310-manylinux_2_34_x86_64.whl"
export wheel_file_name="<wheel file name>"
Install the selected AIMET package as specified below:
**NOTE:** Python dependencies will automatically get installed.

.. code-block:: bash
# Install ONE of the following depending on the variant
python3 -m pip install ${download_url}/aimet_torch-${AIMET_VARIANT}_${release_tag}-${wheel_file_suffix} -f https://download.pytorch.org/whl/torch_stable.html
# OR
python3 -m pip install ${download_url}/aimet_tensorflow-${AIMET_VARIANT}_${release_tag}-${wheel_file_suffix}
# OR
python3 -m pip install ${download_url}/aimet_onnx-${AIMET_VARIANT}_${release_tag}-${wheel_file_suffix}
python3 -m pip install ${download_url}/${wheel_file_name}
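
As an end-to-end sketch, assuming (purely for illustration) the ONNX CPU wheel of a 1.34.0 release, the steps above would reduce to:

.. code-block:: bash

    # Assumed release tag and wheel file name -- verify both on the release page
    export release_tag="1.34.0"
    export download_url="https://github.com/quic/aimet/releases/download/${release_tag}"
    export wheel_file_name="aimet_onnx-1.34.0.cpu-cp310-cp310-manylinux_2_34_x86_64.whl"
    python3 -m pip install ${download_url}/${wheel_file_name}
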
Install common debian packages
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
