As EMR does not have the /dbfs folder, hsfs defaults to using PyHive even though Spark is available. I could work around this issue by creating the folder manually.
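The behaviour described above suggests that hsfs uses the presence of /dbfs (Databricks' FUSE mount point) as an environment marker when picking an engine. A minimal sketch of that kind of detection, assuming this is how the fallback works (the function name and logic here are illustrative, not the actual hsfs code):

```python
import os

def choose_engine(marker_path="/dbfs"):
    # Hypothetical detection sketch: if the Databricks-style marker
    # directory exists, use the Spark engine; otherwise fall back to
    # PyHive. On EMR the marker is absent, so the Hive path is taken
    # even when a Spark session is available -- which is why manually
    # creating the folder works around the issue.
    if os.path.isdir(marker_path):
        return "spark"
    return "hive"
```

Under this assumption, creating /dbfs on the EMR master node makes the check succeed, which matches the workaround reported above.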
SirOibaf added a commit to SirOibaf/feature-store-api that referenced this issue on Nov 29, 2020:

    Also remove downloading of the certificates in the connect method for
    PySpark clients. Certificates should already be present when the
    application is started.

    Also update the Spark external cluster documentation.

    Closes logicalclocks#170