update README for updated docker images #328
Conversation
README.md (Outdated)
```diff
@@ -107,7 +106,7 @@ accelerate launch scripts/finetune.py examples/openllama-3b/lora.yml \
 3. Install torch
    ```bash
-   pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
+   pip3 install torch==2.0.1+cu118 --index-url https://download.pytorch.org/whl/cu118
```
Suggested change:
```diff
-pip3 install torch==2.0.1+cu118 --index-url https://download.pytorch.org/whl/cu118
+pip3 install -U torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
```
I think you don't need to specify the torch version, since the latest is already 2.0.1. Setting that index URL also selects the CUDA version. The thing I forgot to add was the `-U`.
Thanks, I've updated it with the `-U` and removed the explicit version. I'm going to keep the removal of torchvision and torchaudio, since I don't think those are needed.
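As a side note on the exchange above: after installing from the cu118 index, the resolved build can be checked directly. This is a minimal sketch, not part of the PR; it assumes the install succeeded and that `python3` is on PATH.

```shell
# Install/upgrade torch from the CUDA 11.8 wheel index (the command the PR settles on)
pip3 install -U torch --index-url https://download.pytorch.org/whl/cu118

# Print the resolved torch version and the CUDA toolkit it was built against,
# e.g. "2.0.1+cu118 11.8" if the index URL picked the cu118 wheel as expected
python3 -c "import torch; print(torch.__version__, torch.version.cuda)"
```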
```diff
-- `winglian/axolotl-runpod:main-py3.9-cu118-2.0.0-gptq`: for gptq
 - `winglian/axolotl:dev`: dev branch (not usually up to date)
+- `winglian/axolotl-runpod:main-py3.10-cu118-2.0.1`: for runpod
+- `winglian/axolotl-runpod:main-py3.9-cu118-2.0.1-gptq`: for gptq
```
Is there a reason gptq is on 3.9 vs the runpod image's 3.10?
I haven't touched gptq in a while, so I'd rather leave it at 3.9, where it is believed to still work, than try to upgrade and introduce another potential issue.
* update README for updated docker images
* update readme from pr feedback