Cannot create conda environment #45
Comments
I encountered a similar issue:
Any help would be greatly appreciated. I'm on a Windows machine. Is this package not supported on Windows?
Same issue. I'm on Ubuntu.
Ugh, this is frustrating. I wish the requirements and instructions were clearer.
Same issue on Mac.
I removed this line first, did the installation, and then executed:
Any update? I am still having this issue. ResolvePackageNotFound:

CuPy installation commands by CUDA version:
- CUDA v10.2 (x86_64): pip install cupy-cuda102
- CUDA v10.2 (aarch64 - JetPack 4): pip install cupy-cuda102 -f https://pip.cupy.dev/aarch64
- CUDA v11.0 (x86_64): pip install cupy-cuda110
- CUDA v11.1 (x86_64): pip install cupy-cuda111
- CUDA v11.2 ~ 11.8 (x86_64): pip install cupy-cuda11x
- CUDA v11.2 ~ 11.8 (aarch64 - JetPack 5 / Arm SBSA): pip install cupy-cuda11x -f https://pip.cupy.dev/aarch64
- CUDA v12.x (x86_64): pip install cupy-cuda12x
- CUDA v12.x (aarch64 - JetPack 5 / Arm SBSA): pip install cupy-cuda12x -f https://pip.cupy.dev/aarch64
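If you do end up installing CuPy from pip as in the table above, a quick sanity check (my own suggestion, not something from this thread) is to run a trivial kernel and confirm the wheel can actually see a GPU:

```python
# Sanity check for a pip-installed CuPy wheel: confirm it can see a CUDA device
# and round-trip an array through the GPU. Suggested check, not from the thread.
import cupy as cp

print("CUDA devices visible:", cp.cuda.runtime.getDeviceCount())
x = cp.arange(10) ** 2          # runs a small kernel on the GPU
print(cp.asnumpy(x))            # copies the result back to the host
```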
I am getting the same issue with Mamba.
Getting the following errors:
Running on macOS 12.16.3. It would be nice if the README could add the prerequisites for setting up the environment.
I'll update the README. I believe these packages are only available on Linux. Windows users might be able to use WSL (see issue #19), but I don't think this will run on macOS.
This helped on Ubuntu:
For anyone trying to run inference on a Mac (FYI, the training scripts will not work): this environment.yml worked for me. Since Macs don't have a CUDA device, you're going to have to use CPU packages. There is a way to leverage GPU acceleration with MPS, but I haven't tried that yet. For inference, you'd have to modify the Python script to use CPU (see the sketch below). I'm going to put up a PR soon, but for now, reference this bot.py. Changes:
environment.yml for Mac:
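Since the change list and environment file above didn't survive this capture, here is a minimal sketch of what a CPU-only inference call of the kind described could look like, assuming a Hugging Face-style model load; "gpt2" is used purely as a stand-in, not the actual model OpenChatKit's bot.py loads:

```python
# Hypothetical CPU-only inference sketch; "gpt2" is a stand-in model, not the
# one OpenChatKit's bot.py actually loads.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder model

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float32,  # half precision is poorly supported on CPU
).to("cpu")
model.eval()

inputs = tokenizer("Hello!", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```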
Managed to run, thanks to @orangetin. Mac M1.
@EdgBuc make sure the dtype is not float16.
Indeed.
You don't need to modify the if/else statement if you pass the device explicitly.

Yeah, that is expected if you're running this on CPU. However, on Silicon, you should be able to set the device to MPS.
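For illustration, a device/dtype pick along those lines could look like this (my sketch, not the repository's actual if/else; requires a PyTorch build with MPS support, i.e. 1.12 or newer):

```python
import torch

# Prefer Apple's MPS backend when available, otherwise fall back to CPU.
# Many ops are not implemented for float16 on CPU, so use float32 there.
if torch.backends.mps.is_available():
    device, dtype = torch.device("mps"), torch.float16
else:
    device, dtype = torch.device("cpu"), torch.float32

x = torch.randn(4, 4).to(device, dtype)  # tiny smoke test on the chosen device
print(device, x.dtype)
```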
Well, when I tried to change to
and then:
I can't reproduce this error on an M2 (following the instructions I provided). This seems like a PyTorch error. The only modification to the code was to change the device setting.

Getting better speeds but still not great. Read this blog post if you want faster CPU inference.
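One generic knob for CPU inference (a common PyTorch setting, not taken from the linked blog post) is the intra-op thread count:

```python
import os
import torch

# Let PyTorch use all available cores for intra-op parallelism on CPU.
# A common generic tweak; tune for your machine.
torch.set_num_threads(os.cpu_count() or 1)
print("intra-op threads:", torch.get_num_threads())
```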
Describe the bug
I followed the instructions but could not get conda env create -f environment.yml to work because of an error.
To Reproduce
Steps to reproduce the behavior:
1. Install Miniconda
2. Run conda env create -f environment.yml
Expected behavior
An environment called OpenChatKit should be created, but creation fails.
Desktop: Mac