[SPARK-15116] In REPL we should create SparkSession first and get SparkContext from it #12890
Changes from all commits (`repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala`):
```diff
@@ -71,35 +71,32 @@ object Main extends Logging {
     }
   }

-  def createSparkContext(): SparkContext = {
+  def createSparkSession(): SparkSession = {
     val execUri = System.getenv("SPARK_EXECUTOR_URI")
     conf.setIfMissing("spark.app.name", "Spark shell")
-      // SparkContext will detect this configuration and register it with the RpcEnv's
-      // file server, setting spark.repl.class.uri to the actual URI for executors to
-      // use. This is sort of ugly but since executors are started as part of SparkContext
-      // initialization in certain cases, there's an initialization order issue that prevents
-      // this from being set after SparkContext is instantiated.
-      .set("spark.repl.class.outputDir", outputDir.getAbsolutePath())
+    // SparkContext will detect this configuration and register it with the RpcEnv's
+    // file server, setting spark.repl.class.uri to the actual URI for executors to
+    // use. This is sort of ugly but since executors are started as part of SparkContext
+    // initialization in certain cases, there's an initialization order issue that prevents
+    // this from being set after SparkContext is instantiated.
+    conf.set("spark.repl.class.outputDir", outputDir.getAbsolutePath())
     if (execUri != null) {
       conf.set("spark.executor.uri", execUri)
     }
     if (System.getenv("SPARK_HOME") != null) {
       conf.setSparkHome(System.getenv("SPARK_HOME"))
     }
-    sparkContext = new SparkContext(conf)
-    logInfo("Created spark context..")
-    Signaling.cancelOnInterrupt(sparkContext)
-    sparkContext
-  }
-
-  def createSparkSession(): SparkSession = {
+    val builder = SparkSession.builder.config(conf)
     if (SparkSession.hiveClassesArePresent) {
-      sparkSession = SparkSession.builder.enableHiveSupport().getOrCreate()
+      sparkSession = builder.enableHiveSupport().getOrCreate()
       logInfo("Created Spark session with Hive support")
     } else {
-      sparkSession = SparkSession.builder.getOrCreate()
+      sparkSession = builder.getOrCreate()
```
Review thread (on the `builder.getOrCreate()` line):

- Contributor: At here, maybe it is better to explicitly set
- Contributor: hm, I think it's better to keep that flag contained rather than duplicating it everywhere.
The diff continues:

```diff
       logInfo("Created Spark session")
     }
+    sparkContext = sparkSession.sparkContext
+    Signaling.cancelOnInterrupt(sparkContext)
     sparkSession
   }
```
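Taken together, the diff inverts the old initialization order: the REPL now builds the `SparkSession` first and derives the `SparkContext` from it, rather than constructing the context directly. A minimal sketch of the resulting flow, assuming a Spark 2.x REPL classpath where `Main` exposes the members shown in the diff (not runnable standalone):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SparkSession
import org.apache.spark.repl.Main

// Session is created first; Hive support is enabled automatically
// when the Hive classes are present on the classpath.
val spark: SparkSession = Main.createSparkSession()

// The context is now obtained from the session instead of via `new SparkContext(conf)`.
val sc: SparkContext = spark.sparkContext
```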
Review thread:

- Contributor: I guess we want to use `builder.config("spark.sql.catalogImplementation", "hive")`?
- Contributor: that's what `enableHiveSupport` does?
- Contributor: oh, right. We still have this method.
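For context on the exchange above: `enableHiveSupport()` is shorthand for setting the catalog-implementation config on the builder, so the two forms below are interchangeable. A sketch, assuming a Hive-enabled Spark 2.x build (not runnable without Spark on the classpath):

```scala
import org.apache.spark.sql.SparkSession

// Form used by the patch: the helper method sets the config internally.
val viaHelper = SparkSession.builder.enableHiveSupport()

// Equivalent explicit form suggested in the review thread.
val viaConfig = SparkSession.builder.config("spark.sql.catalogImplementation", "hive")
```

Keeping the helper call, as the patch does, avoids duplicating the `spark.sql.catalogImplementation` key outside `SparkSession`.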