Commit 3afe221

Author: Marcelo Vanzin

Drop "spark." requirement.

After thinking about it a bit more, the "spark." requirement can lead to confusing behavior. Since substitution only applies to configs declared internally by Spark, random user configs are already unaffected, so no extra check is needed to disable that.

1 parent: 6618bc4

File tree

1 file changed: +4 −8 lines changed


core/src/main/scala/org/apache/spark/internal/config/ConfigEntry.scala

Lines changed: 4 additions & 8 deletions

```diff
@@ -171,14 +171,10 @@ private object ConfigEntry {
       val replacement = prefix match {
         case null =>
           require(!usedRefs.contains(name), s"Circular reference in $value: $name")
-          if (name.startsWith("spark.")) {
-            Option(findEntry(name))
-              .flatMap(_.readAndExpand(conf, getenv, usedRefs = usedRefs + name))
-              .orElse(Option(conf.get(name)))
-              .orElse(defaultValueString(name))
-          } else {
-            None
-          }
+          Option(findEntry(name))
+            .flatMap(_.readAndExpand(conf, getenv, usedRefs = usedRefs + name))
+            .orElse(Option(conf.get(name)))
+            .orElse(defaultValueString(name))
         case "system" => sys.props.get(name)
         case "env" => Option(getenv(name))
         case _ => None
```
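The resolution order the patch leaves in place can be illustrated with a minimal standalone sketch. This is hypothetical code, not Spark's actual API: `defaults`, `conf`, and `resolve` are toy stand-ins for the `ConfigEntry` registry, `SparkConf`, and `readAndExpand`. It shows the same shape of lookup chain (user value, then declared default) with recursive `${...}` expansion and the same cycle guard via `require`.

```scala
import scala.util.matching.Regex

// Hypothetical minimal sketch of the substitution chain after this commit.
// Names here are illustrative, not Spark's real API.
object SubstitutionSketch {
  // Toy stand-ins for declared config defaults and user-supplied conf.
  private val defaults = Map("spark.driver.memory" -> "1g")
  private val conf = Map(
    "spark.app.name"        -> "demo",
    "spark.executor.memory" -> "${spark.driver.memory}")

  private val Ref: Regex = """\$\{([^}]+)\}""".r

  // Resolve a config name: user conf first, then declared default,
  // recursively expanding "${...}" references. `usedRefs` tracks names
  // already being expanded so a cycle fails fast, as in the real code.
  def resolve(name: String, usedRefs: Set[String] = Set.empty): Option[String] = {
    require(!usedRefs.contains(name), s"Circular reference: $name")
    conf.get(name).orElse(defaults.get(name)).map { raw =>
      Ref.replaceAllIn(raw, m =>
        Regex.quoteReplacement(
          resolve(m.group(1), usedRefs + name).getOrElse(m.matched)))
    }
  }
}
```

Because only names declared in the registry (or set in the conf) resolve at all, unknown references simply fall through unexpanded, which is why the dropped `spark.`-prefix check was redundant.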

0 commit comments