Failure to import openmmtorch #67
Can you post more details? We've been running into a lot of conda issues lately with PyTorch, CUDA, OpenMM, and other packages. See for example torchmd/torchmd-net#266 and conda-forge/openmm-torch-feedstock#48. I'm not sure what the cause is. @RaulPPelaez might have some insight.
I do not know what is going on these days. Seems like conda is falling apart -.- Please provide the commands you ran from a clean environment and the result of `conda list` so we can provide more help. Be careful with settings like `CONDA_CUDA_OVERRIDE`; they are tricky and may cause unexpected inconsistencies.

For me, running something like this:

```shell
mamba create -n test openmm-torch "cuda-version>=12" openmm-ml
```

tries to install the CPU versions of PyTorch and some other packages, which is incorrect behavior. I also checked with conda and saw the same behavior.

```shell
mamba create -n test openmm-torch openmm-ml pytorch=*=*cuda*
```

This results in an env in which I can import openmm-ml without issue.

There are these commented lines in the openmm-torch feedstock:

```yaml
#run_constrained:
  # 2022/02/05 hmaarrfk
  # While conda packaging seems to allow us to specify
  # constraints on the same package in different lines
  # the resulting package doesn't have the ability to
  # be specified in multiples lines
  # This makes it tricky to use run_exports
  # we add the GPU constraint in the run_constrained
  # to allow us to have "two" constraints on the
  # running package
  #- pytorch =*={{ torch_proc_type }}*
```

It really looks to me like that should be uncommented. Let's try...
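The effect of a build-string pin like `pytorch=*=*cuda*` can be checked against `conda list` output after solving. A minimal sketch of that check — the package lines in `pkgs.txt` below are invented for illustration, not taken from this thread:

```shell
# Hypothetical excerpt of `conda list` output: the third column is the
# build string, which encodes the variant (cpu_* vs cuda*). A pin of
# pytorch=*=*cuda* can only ever match the CUDA-built package.
cat > pkgs.txt <<'EOF'
pytorch  2.1.2  cpu_py311h6f93aff_0       conda-forge
pytorch  2.1.2  cuda120_py311h25b6552_30  conda-forge
EOF

# Lines the CUDA pin would match:
grep 'cuda' pkgs.txt

# Lines indicating a CPU-only build slipped into the solve:
grep 'cpu_' pkgs.txt
```

If the second grep matches anything in a real environment that is supposed to be CUDA-enabled, the solver picked a CPU variant and an ABI mismatch at import time is likely.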
Working on it here: conda-forge/openmm-torch-feedstock#49
Thank you both!
We have new packages for OpenMM and OpenMM-Torch. Hopefully they fix the problem. Can you try again and see?
Confirmed, no problems now, thanks again!
I'm stuck trying to make dependencies work, related to openmm/openmm-torch#127. After a standard mamba installation of `openmm-ml`, I get an error on trying to `createSystem`. I have the same problem with separately installing `openmmtorch`, and this solution worked. But trying to install `openmm-ml` on top of that working `openmmtorch` (which requires using `--no-deps`) leads to the 'undefined symbol' error from the `torchani` import, as seen here.