[ENH] - include conda env activation for kernel with custom environment #1903
Labels
status:Need Info
type:Enhancement
What docker image(s) is this feature applicable to?
base-notebook
What changes are you proposing?
I'd like to find out how to best integrate the activation of a custom conda environment upon kernel launch.
Problem that needs this improvement:
When the kernel of the custom environment is started from JupyterLab, the active conda environment remains the "base" environment, where Jupyter is installed and run from, because the .bashrc with its conda environment activation never runs in that context. This means that the PATH environment variable is also the one of the "base" environment, even though the python executable is from the custom env. As a result, modules that are not purely Python, although properly installed in the custom env, fail to load.
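A quick way to confirm the symptom (a sketch, assuming conda lives in /opt/conda, the custom env is named customenv, and pgrep is available in the image) is to inspect the environment of the running kernel process from a terminal in the container:

```bash
# Diagnostic sketch. Assumptions: conda in /opt/conda, custom env named
# "customenv", a kernel of that env currently running, pgrep available.
KERNEL_PID=$(pgrep -f ipykernel_launcher | head -n 1)
tr '\0' '\n' < "/proc/${KERNEL_PID}/environ" | grep '^PATH='
# Expected symptom: PATH starts with /opt/conda/bin (the base env) and does
# not contain /opt/conda/envs/customenv/bin, even though the kernel's
# interpreter is /opt/conda/envs/customenv/bin/python.
```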
If I activate the custom conda env, e.g. via a shell script called from the kernelspec, the PATH is correct and the modules that previously failed to load (e.g. plotly's kaleido) work.
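For reference, a minimal sketch of such a wrapper script, assuming conda is installed in /opt/conda as in the docker-stacks images and the custom env is named customenv (both are placeholders, not something the images prescribe):

```bash
#!/bin/bash
# kernel-launch.sh -- wrapper that the kernelspec's argv points to instead
# of the env's python binary. It activates the custom env so PATH and the
# other activation hooks match an interactive shell, then starts the kernel
# and passes all kernel arguments through.

# Assumption: conda installed under /opt/conda.
source /opt/conda/etc/profile.d/conda.sh
conda activate customenv

exec python -m ipykernel_launcher "$@"
```

The kernelspec's argv would then be something like ["/usr/local/bin/kernel-launch.sh", "-f", "{connection_file}"] instead of pointing at the env's python directly (the script's install path is, again, only an assumption).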
Questions:
How should the activation of the custom environment, for the kernelspec that `ipykernel install` creates, be implemented optimally in the Dockerfile?
Is a `kernel-launch.sh` script a good idea, or should just the standard command be augmented with `conda activate customenv` instead (see the sketch after the reference below)?
This is the reference to the custom env kernel creation:
docker-stacks/docs/using/recipes.md
Lines 88 to 102 in b378681
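As a sketch of the second option from the questions above (augmenting the standard command rather than adding a wrapper script), the kernelspec that `ipykernel install` created could be rewritten so that the kernel is started through `conda run`, which performs the activation itself. The env name customenv and the kernelspec location are assumptions matching the recipe, not verified against the current images:

```bash
# Sketch: rewrite the installed kernelspec so the kernel is launched through
# `conda run`, which activates the env (and thereby fixes PATH) before
# starting the process.
# Assumptions: conda in /opt/conda; env and kernel both named "customenv";
# kernelspec installed system-wide under /opt/conda/share/jupyter/kernels
# (adjust the path if it was installed with --user).
KERNEL_DIR=/opt/conda/share/jupyter/kernels/customenv
cat > "${KERNEL_DIR}/kernel.json" <<'EOF'
{
  "argv": [
    "/opt/conda/bin/conda", "run", "-n", "customenv", "--no-capture-output",
    "python", "-m", "ipykernel_launcher", "-f", "{connection_file}"
  ],
  "display_name": "Python (customenv)",
  "language": "python"
}
EOF
```

Compared with a wrapper script, this keeps everything in the kernelspec, at the cost of some `conda run` overhead on every kernel start; which of the two fits the images better is exactly what the questions above ask.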
How does this affect the user?
The user would be able to make use of conda packages that rely on the full spectrum of a conda environment right away, without having to wonder why additional binaries can't be found, and without manually manipulating the PATH environment variable or resorting to absolute paths.
Anything else?
No response