Unable to use S3 File as mainApplicationFile #996
Comments
Hi all, I was able to get this working by adding an extra sparkConf. Now my PySpark file is being submitted. But in the PySpark file I am reading data from S3 and writing it back to S3 at a different path, and that part is failing again with an error. In the PySpark file, I am setting the following configuration before reading the data:
Can somebody please suggest if I'm missing something? I appreciate all the help :)
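The configuration snippet referenced in the comment above was not captured in this thread. Purely as an illustration, a minimal PySpark sketch of setting S3A options on the Hadoop configuration before reading from and writing to S3 could look like the following (bucket names, paths, and credentials are placeholders, not values from this issue):

```python
# Illustrative sketch only -- not the commenter's actual code.
# Assumes the hadoop-aws / aws-java-sdk jars are already on the classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3a-read-write").getOrCreate()

# Set S3A options on the underlying Hadoop configuration before any S3 I/O.
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
hadoop_conf.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
hadoop_conf.set("fs.s3a.access.key", "<access-key>")  # placeholder
hadoop_conf.set("fs.s3a.secret.key", "<secret-key>")  # placeholder

# Read from one S3 path and write the result to another (placeholder paths).
df = spark.read.json("s3a://myBucket/input/data.json")
df.write.mode("overwrite").parquet("s3a://myBucket/output/data")
```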
Closing this as the issue is solved.
@batCoder95 how did you get this issue solved? I am facing the same problem. Can you post the final YAML settings you used?
I'm facing the same issue; I'd appreciate it if you could share your YAML settings.
Hi,
I am trying to install this operator (gcr.io/spark-operator/spark-py:v3.0.0) on my EKS cluster and then run a simple PySpark file that resides in my S3 bucket. I went through some of the documentation and found out that we need to set Spark configurations in the YAML file in order to enable the S3 file system. Therefore, I configured the same, and below is how my YAML file looks now:
apiVersion: "sparkoperator.k8s.io/v1beta2" kind: SparkApplication metadata: name: pyspark-pi namespace: default spec: type: Python pythonVersion: "3" mode: cluster image: "gcr.io/spark-operator/spark-py:v3.0.0" imagePullPolicy: Always mainApplicationFile: s3a://myBucket/input/appFile.py sparkVersion: "3.0.0" sparkConf: "spark.jars.packages": "com.amazonaws:aws-java-sdk-pom:1.11.271,org.apache.hadoop:hadoop-aws:3.1.0" "spark.hadoop.fs.s3a.impl": "org.apache.hadoop.fs.s3a.S3AFileSystem" "spark.hadoop.fs.s3a.access.key": "<access-key>" "spark.hadoop.fs.s3a.secret.key": "<secret-key>"
But now that I am deploying this YAML file, I am running into the following issue in the driver pod:
```
Exception in thread "main" java.io.FileNotFoundException: /opt/spark/.ivy2/cache/resolved-org.apache.spark-spark-submit-parent-d3d506ae-d79f-45f6-b459-cfa5dc649610-1.0.xml (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
	at org.apache.ivy.plugins.parser.xml.XmlModuleDescriptorWriter.write(XmlModuleDescriptorWriter.java:70)
	at org.apache.ivy.plugins.parser.xml.XmlModuleDescriptorWriter.write(XmlModuleDescriptorWriter.java:62)
	at org.apache.ivy.core.module.descriptor.DefaultModuleDescriptor.toIvyFile(DefaultModuleDescriptor.java:563)
	at org.apache.ivy.core.cache.DefaultResolutionCacheManager.saveResolvedModuleDescriptor(DefaultResolutionCacheManager.java:176)
	at org.apache.ivy.core.resolve.ResolveEngine.resolve(ResolveEngine.java:245)
	at org.apache.ivy.Ivy.resolve(Ivy.java:523)
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1387)
	at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:54)
	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:308)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:871)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
Would really appreciate it if someone could suggest how to fix this issue.
Thanks in advance :)
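For readers who land on the same FileNotFoundException: when spark.jars.packages is set, spark-submit resolves those dependencies with Ivy inside the driver, and this error typically means the default Ivy directory (/opt/spark/.ivy2 here) cannot be created or written. A commonly suggested workaround is to point Ivy at a writable path via sparkConf. The snippet below is a hypothetical sketch based on that diagnosis, not the "extra sparkConf" confirmed by the comments above:

```yaml
# Hypothetical addition to the sparkConf block above (not confirmed as the
# fix used in this issue): point Ivy at a writable directory inside the
# driver container so dependency resolution for spark.jars.packages works.
sparkConf:
  "spark.jars.ivy": "/tmp/.ivy2"
```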