java.lang.UnsupportedOperationException using pandas in Spark #1168
Hello @angelcervera, thanks for reporting this issue. We are not sure we want to downgrade to Java 8. So if you want to draft a PR to implement the … Many thanks.
Please add these config options to resolve the issue:

conf.set("spark.sql.legacy.setCommandRejectsSparkCoreConfs", "false")
conf.set("spark.driver.extraJavaOptions", "-Dio.netty.tryReflectionSetAccessible=true")
conf.set("spark.executor.extraJavaOptions", "-Dio.netty.tryReflectionSetAccessible=true")

I am going to try upgrading the JRE to see if this issue is fixed in newer versions, but the last time I tried, Spark would not work with anything higher than JRE 11, which is very old these days.

Hi Darek.
If you want, I can work on it, but I have no time until the weekend.
If it's OK with you, I can create a PR.
Regards
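The options quoted above can also be applied when building a PySpark session from Python. This is a minimal sketch, assuming PySpark is installed and the session is created programmatically; the application name `arrow-java11-workaround` is an illustrative placeholder, not from the thread:

```python
# The -Dio.netty.tryReflectionSetAccessible=true flag suggested in the
# comment above, applied via SparkSession.builder.
NETTY_FLAG = "-Dio.netty.tryReflectionSetAccessible=true"

SPARK_CONF = {
    "spark.sql.legacy.setCommandRejectsSparkCoreConfs": "false",
    "spark.driver.extraJavaOptions": NETTY_FLAG,
    "spark.executor.extraJavaOptions": NETTY_FLAG,
}

def build_session(app_name="arrow-java11-workaround"):
    # pyspark is imported lazily so SPARK_CONF can be inspected
    # without a Spark installation.
    from pyspark.sql import SparkSession
    builder = SparkSession.builder.appName(app_name)
    for key, value in SPARK_CONF.items():
        builder = builder.config(key, value)
    return builder.getOrCreate()
```

Setting `extraJavaOptions` this way only takes effect if the JVM has not already been launched, so it belongs at session creation, not after.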
I just tried upgrading OpenJDK to 14; Spark works OK, but this error still exists.
I may do a PR for JDK 14 anyway, hoping that it fixes this issue.
Hello, I just submitted PR #1198 to fix it.
What docker image you are using?
jupyter/all-spark-notebook
What complete docker command do you run to launch the container (omitting sensitive values)?
docker run -d -p 8888:8888 jupyter/all-spark-notebook
What steps do you take once the container is running to reproduce the issue?
Launch a spylon-kernel notebook.
What do you expect to happen?
What actually happens?
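For reference, a minimal sketch of the kind of pandas/Arrow call that hits the reported exception. This assumes PySpark 3.x with pyarrow installed; the function and the `range(100)` DataFrame are illustrative, not from the original report:

```python
def reproduce(spark):
    # Assumed repro: on Java 11 without -Dio.netty.tryReflectionSetAccessible=true,
    # the Arrow-backed conversion to pandas raises
    # java.lang.UnsupportedOperationException.
    spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")
    df = spark.range(100)   # any trivial Spark DataFrame will do
    return df.toPandas()    # Arrow conversion path; fails here without the flag
```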
There is a known limitation with PySpark (more precisely, with Arrow) and Java 11, and there are a few options to fix it: downgrade to Java 8, or patch the configuration by adding the -Dio.netty.tryReflectionSetAccessible=true property.
Reference: from the Spark documentation.
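Equivalently, the same property can be set once for all sessions in the image's Spark defaults instead of per-notebook. A sketch, assuming the standard $SPARK_HOME/conf/spark-defaults.conf location:

```
spark.driver.extraJavaOptions    -Dio.netty.tryReflectionSetAccessible=true
spark.executor.extraJavaOptions  -Dio.netty.tryReflectionSetAccessible=true
```

Baking this into the Docker image avoids each user having to apply the workaround in their own notebook.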