I find myself limited by Nebari when working on larger analyses (multiple notebooks spread across a directory tree). Locally, I would either:

- start JupyterLab with a modified `PYTHONPATH` variable that includes the root of the analysis repo, and get relative imports and other fun things for free, or
- install the package in editable mode in the appropriate environment.

However, in Nebari I can do neither (because of the lack of exposed spawner customization and conda-store, respectively).
Nebari now offers documentation on How to Develop Local Packages on Nebari; however, I find the documented solution unfit for the scenario of using Nebari for data analysis in notebooks. In particular, I find the requirement to include a boilerplate block at the top of each notebook cumbersome.
Another approach is taken by https://bluss.github.io/pyproject-local-kernel/, which detects the Python environment based on the `pyproject.toml` file. This would be amazing to have. It also allows swapping out the kernel startup arguments, so we could pass a custom `PYTHONPATH`.
I am not sure whether there would be consensus to add it to the base Docker image (maybe?), but an option to install it (without having to ship one's own Docker image) would be great.
I completely agree that the current status is functional but not practical. I raised this as an issue on this repo a while back.
I'm not sure how pyproject-local-kernel would work with conda-store. I think the proper solution would be to sort out how to make this work within conda-store.
Feature description
I find myself limited by Nebari when working on larger analyses (multiple notebooks spread across a directory tree). Locally, I would either:

- start JupyterLab with a modified `PYTHONPATH` variable that includes the root of the analysis repo, and get relative imports and other fun things for free, or
- install the package in editable mode in the appropriate environment.

However, in Nebari I can do neither (because of the lack of exposed spawner customization and conda-store, respectively).
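To make the first option concrete, here is a minimal, self-contained sketch (the `/tmp/analysis-repo` path is a made-up placeholder, not a path from the Nebari docs) showing that a directory listed in `PYTHONPATH` ends up on the interpreter's `sys.path`, which is what makes imports from the repo root work:

```python
import os
import subprocess
import sys

# Launch a child interpreter (a stand-in for "start JupyterLab") with a
# modified PYTHONPATH and check that the extra directory lands on sys.path.
# "/tmp/analysis-repo" is a placeholder for the root of the analysis repo.
out = subprocess.run(
    [sys.executable, "-c", "import sys; print('/tmp/analysis-repo' in sys.path)"],
    env={**os.environ, "PYTHONPATH": "/tmp/analysis-repo"},
    capture_output=True,
    text=True,
)
print(out.stdout.strip())  # True
```

The second option is the usual `pip install -e .` run from the repo root, which achieves the same importability without touching `PYTHONPATH`.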
Nebari now offers documentation on How to Develop Local Packages on Nebari; however, I find the documented solution unfit for the scenario of using Nebari for data analysis in notebooks.
In particular, I find the requirement to include a boilerplate block at the top of each notebook cumbersome, and it makes collaboration difficult.
It is really counter-productive to have to add this to each notebook; moreover, if another user wants to run it, they need to change the user name.
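For illustration, the per-notebook block in question is of roughly this shape (a hypothetical reconstruction, not the exact snippet from the Nebari docs; the user name and repo path are assumptions):

```python
import sys

# Prepend the analysis-repo checkout so its modules can be imported.
# "jane-doe" and "my-analysis-repo" are made-up names; every collaborator
# would have to edit this line to point at their own home directory.
sys.path.insert(0, "/home/jane-doe/my-analysis-repo")
```

Because the path is user-specific, the snippet cannot be shared as-is between collaborators, which is the crux of the complaint above.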
Developing in `~/shared` would help a bit, but would make that snippet even larger.

Parametrized kernels (jupyter/enhancement-proposals#87) could help here once they land (it may be worth dedicating some time to help that proposal land).
Value and/or benefit
Happier users
Anything else?
No response