bump ao to 0.5.0 to enable torchtune across platforms #1136

Merged: 7 commits, Sep 12, 2024

32 changes: 4 additions & 28 deletions install/install_requirements.sh
@@ -52,9 +52,6 @@ PYTORCH_NIGHTLY_VERSION=dev20240814
# Nightly version for torchvision
VISION_NIGHTLY_VERSION=dev20240814

# Nightly version for torchao
AO_NIGHTLY_VERSION=dev20240905

# Nightly version for torchtune
TUNE_NIGHTLY_VERSION=dev20240910

@@ -79,10 +76,6 @@ fi
REQUIREMENTS_TO_INSTALL=(
torch=="2.5.0.${PYTORCH_NIGHTLY_VERSION}"
torchvision=="0.20.0.${VISION_NIGHTLY_VERSION}"
)

LINUX_REQUIREMENTS_TO_INSTALL=(
torchao=="0.5.0.${AO_NIGHTLY_VERSION}"
torchtune=="0.3.0.${TUNE_NIGHTLY_VERSION}"
)

@@ -94,27 +87,10 @@ LINUX_REQUIREMENTS_TO_INSTALL=(
"${REQUIREMENTS_TO_INSTALL[@]}"
)

PLATFORM=$(uname -s)

# Install torchtune and torchao requirements for Linux systems using nightly.
# For non-Linux systems (e.g., macOS), install torchao from GitHub since nightly
# build doesn't have macOS build.
# TODO: Remove this and install nightly build, once it supports macOS
if [ "$PLATFORM" == "Linux" ];
then
(
set -x
$PIP_EXECUTABLE install --pre --extra-index-url "${TORCH_NIGHTLY_URL}" --no-cache-dir \
"${LINUX_REQUIREMENTS_TO_INSTALL[@]}"
)
else
# For torchao need to install from github since nightly build doesn't have macos build.
# TODO: Remove this and install nightly build, once it supports macos
(
set -x
$PIP_EXECUTABLE install git+https://github.com/pytorch/ao.git@e11201a62669f582d81cdb33e031a07fb8dfc4f3
)
fi
(
set -x
$PIP_EXECUTABLE install torchao=="0.5.0"
)

Contributor (review comment on the torchao install line above):
I think it's fine to have it here for now since we might move to nightlies, but this makes more sense in the install_requirements.txt

if [[ -x "$(command -v nvidia-smi)" ]]; then
(
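Not part of the PR, but a minimal sanity check of the change above, assuming a Python environment with pip on PATH: the stable torchao wheel should now resolve from PyPI the same way on macOS as on Linux.

```bash
# Sketch only (assumption): confirm the platform-independent install path this PR
# switches to, i.e. pulling the stable torchao 0.5.0 wheel from PyPI.
pip install torchao==0.5.0
python -c "import torchao, importlib.metadata as m; print(m.version('torchao'))"
```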
7 changes: 2 additions & 5 deletions torchchat/cli/builder.py
@@ -35,11 +35,8 @@
from torchchat.utils.measure_time import measure_time
from torchchat.utils.quantize import quantize_model

# bypass the import issue before torchao is ready on macos
try:
from torchtune.models.convert_weights import meta_to_tune
except:
pass
from torchtune.models.convert_weights import meta_to_tune




11 changes: 3 additions & 8 deletions torchchat/model.py
@@ -30,14 +30,9 @@

from torchchat.utils.build_utils import find_multiple, get_precision

# bypass the import issue, if any
# TODO: remove this once the torchao is ready on macos
try:
from torchtune.models.flamingo import flamingo_decoder, flamingo_vision_encoder
from torchtune.modules.model_fusion import DeepFusionModel
from torchtune.models.llama3_1._component_builders import llama3_1 as llama3_1_builder
except:
pass
from torchtune.models.flamingo import flamingo_decoder, flamingo_vision_encoder
from torchtune.modules.model_fusion import DeepFusionModel
from torchtune.models.llama3_1._component_builders import llama3_1 as llama3_1_builder

config_path = Path(f"{str(Path(__file__).parent)}/model_params")

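Not part of the PR, but a rough follow-up check, assuming the torchtune nightly from install_requirements.sh is installed: the imports this PR un-guards in torchchat/cli/builder.py and torchchat/model.py should now resolve on macOS as well as Linux.

```bash
# Sketch only (assumption): exercise the torchtune imports that used to be wrapped
# in try/except and are now imported unconditionally.
python - <<'EOF'
from torchtune.models.convert_weights import meta_to_tune
from torchtune.models.flamingo import flamingo_decoder, flamingo_vision_encoder
from torchtune.modules.model_fusion import DeepFusionModel
print("torchtune imports OK")
EOF
```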