Describe the bug / To reproduce
I run a ClearML pipeline locally, but its steps are executed with the default Docker image.
I added another step, did not commit it to the repo, and ran the pipeline to test how it works.
I also have a module with utility functions used by the pipeline steps, so my project structure is as follows:
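```
# Layout reconstructed from the files named in this report; the other step
# scripts are elided.
.
├── pipeline.py          # modified, uncommitted
├── step3_new.py         # newly added, uncommitted
├── ...                  # other step scripts
└── lib/
    └── utils.py         # unchanged
```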
When I ran it locally (steps still executed in a Docker container), the first step failed with `ModuleNotFoundError: No module named 'lib'`. Note that only `step3_new.py` was added and `pipeline.py` was modified; `lib/utils.py` remained unchanged.
When I committed my changes to the repo and ran it locally again, everything worked as expected. This is really weird, because the Docker container is supposed to have `$PYTHONPATH` and everything else set up independently, so why does it behave like that?
Curiously enough, I still got the `[MainProcess] [INFO] No repository found, storing script code instead` message for each step in the VSCode terminal (there are no such messages in the Web UI, though).
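For reference, the pipeline definition looks roughly like this (a simplified sketch, not the actual code; the pipeline/project names, the step body, and the `load_data` helper are placeholders):

```python
# pipeline.py (simplified sketch, placeholder names)
from clearml import PipelineController


def step1():
    # Import a helper from the local package; this is the import that fails
    # with "ModuleNotFoundError: No module named 'lib'" inside the step's
    # Docker container when the uncommitted code is stored as script code only.
    from lib.utils import load_data  # load_data is a placeholder helper
    return load_data()


pipe = PipelineController(
    name="example-pipeline",  # placeholder
    project="examples",       # placeholder
    version="1.0.0",
)
pipe.add_function_step(name="step1", function=step1, function_return=["data"])

# The controller itself runs locally, but each step is still executed by an
# agent inside its (default) Docker image.
pipe.start_locally(run_pipeline_steps_locally=False)
```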
Expected behaviour
Either fix the Python path when there is an uncommitted diff, or simply refuse to run pipeline steps when a diff is detected. The way it works now feels broken.
Environment
Server type - self hosted
ClearML SDK Version - 1.16.3
ClearML Server Version - 1.16.2
Python Version - 3.8
OS - Linux