Jupyterhub + Sparkmagic/Livy + Spark on K8s? #575
I have no personal experience, but apparently Amazon EMR does Sparkmagic: https://aws.amazon.com/blogs/machine-learning/build-amazon-sagemaker-notebooks-backed-by-spark-in-amazon-emr/
Rather fine, with some minor issues; we already have three production deployments.
@ogidogi if you have any useful patches to Sparkmagic, could you submit them as PRs? Thanks!
Feel free to refer to the Helm setup, ready for use: https://github.com/jahstreet/spark-on-kubernetes-helm . I'm open to questions and feedback on usage, and can assist you with deployment.
Also some information can be found here: apache/incubator-livy#249 |
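For readers new to Livy: sparkmagic kernels talk to the cluster through Livy's REST API, which is what makes this architecture work on K8s. A minimal sketch of the JSON payloads involved (the field names follow Livy's documented `/sessions` and `/statements` endpoints; the host name below is an assumption for illustration):

```python
import json

# Assumed Livy endpoint; substitute your in-cluster service address.
LIVY_URL = "http://livy.example.com:8998"

def session_payload(kind="pyspark", executor_memory="2g", num_executors=2):
    """Body for POST /sessions: asks Livy to start an interactive session."""
    return {
        "kind": kind,
        "executorMemory": executor_memory,
        "numExecutors": num_executors,
    }

def statement_payload(code):
    """Body for POST /sessions/{id}/statements: runs code in that session."""
    return {"code": code}

# Usage against a live Livy server (shown but not executed here):
# import urllib.request
# req = urllib.request.Request(
#     f"{LIVY_URL}/sessions",
#     data=json.dumps(session_payload()).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     session = json.load(resp)

print(json.dumps(session_payload(), indent=2))
```

Sparkmagic builds and polls these requests for you; the sketch only shows what crosses the wire.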
@neerawrt I am currently using this exact architecture. The only problem I have found is that sparkmagic support for JupyterLab has some bugs in the progress bar, plus other visual issues when using the Scala kernel.
Hi @PedroRossi , is there any chance you could share how this architecture is set up on k8s?
Hi @wjxiz1992 there are some resources online for Spark on K8s + Apache Livy. I know Viaduct has some open-source repos for their Docker images: https://github.com/viaduct-ai
Hey guys, I've developed a module for deploying the latest Spark 3.4.0 in server-client (Spark Connect) mode on k8s, with support for configuring a PySpark session for direct connections! https://github.com/Wh1isper/sparglim#spark-connect-server-on-k8s Ping @devstein @wjxiz1992 @PedroRossi @jahstreet @ogidogi @itamarst @neerawrt
Does anyone have Jupyterhub + Sparkmagic/Livy + Spark running on Kubernetes at scale (in production)?
How stable is it?
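For anyone evaluating this stack: the glue between JupyterHub and the rest is sparkmagic's config file, which points each kernel at a Livy endpoint. A sketch of the relevant fragment (normally stored as `~/.sparkmagic/config.json`; the Livy service URL and container image are assumptions for illustration):

```python
import json

# Fragment of sparkmagic's config wiring a Jupyter kernel to Livy.
config = {
    "kernel_python_credentials": {
        "username": "",
        "password": "",
        "url": "http://livy.spark.svc.cluster.local:8998",  # assumed Livy service
        "auth": "None",
    },
    "session_configs": {
        # Spark conf forwarded to Livy when the session is created.
        "conf": {
            "spark.kubernetes.container.image": "your-spark-image"  # placeholder
        },
    },
}

print(json.dumps(config, indent=2))
```

With this in place, each notebook user gets their own Livy-managed Spark session on the cluster, which is what makes the architecture multi-tenant under JupyterHub.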