Conversation

@michellemay
Contributor

Refactor Utils class and create ShutdownHookManager.

NOTE: I wasn't able to run dev/run-tests on a Windows machine. Manual tests were conducted locally using a custom log4j.properties file with a Redis appender and a logstash formatter (bundled in the fat jar submitted to Spark), for example:
log4j.rootCategory=WARN,console,redis
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.spark.graphx.Pregel=INFO

log4j.appender.redis=com.ryantenney.log4j.FailoverRedisAppender
log4j.appender.redis.endpoints=hostname:port
log4j.appender.redis.key=mykey
log4j.appender.redis.alwaysBatch=false
log4j.appender.redis.layout=net.logstash.log4j.JSONEventLayoutV1
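The refactor moves Spark's shutdown-hook logic out of Utils into a dedicated manager that runs hooks in priority order. As a rough, self-contained sketch of that pattern (hypothetical names and signatures for illustration, not Spark's actual implementation):

```java
import java.util.PriorityQueue;

// Hypothetical sketch of a priority-ordered shutdown hook manager
// (illustrative only; names and signatures are not Spark's actual API).
class ShutdownHookManagerSketch {
    private static class Hook implements Comparable<Hook> {
        final int priority;
        final Runnable body;
        Hook(int priority, Runnable body) { this.priority = priority; this.body = body; }
        // Higher priority sorts first so it runs earlier at shutdown.
        public int compareTo(Hook other) { return Integer.compare(other.priority, priority); }
    }

    private final PriorityQueue<Hook> hooks = new PriorityQueue<>();
    private boolean installed = false;

    // Register a hook; lazily install a single JVM-level hook that drains the queue.
    synchronized Object addShutdownHook(int priority, Runnable body) {
        Hook hook = new Hook(priority, body);
        hooks.add(hook);
        if (!installed) {
            Runtime.getRuntime().addShutdownHook(new Thread(this::runAll));
            installed = true;
        }
        return hook; // token that a remove operation could later use
    }

    // Run all registered hooks, highest priority first.
    synchronized void runAll() {
        while (!hooks.isEmpty()) {
            hooks.poll().body.run();
        }
    }
}
```

The key difference from calling Runtime.getRuntime.addShutdownHook directly is that a single JVM-level hook drains the queue in a defined order, rather than many independent hook threads running in an arbitrary order at exit.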

Contributor

This line is too long and will fail style checks.

Contributor Author

How do you suggest splitting this line up? I'm not too familiar with Scala best practices and style.

@vanzin
Contributor

vanzin commented Aug 11, 2015

The original reason (from what I remember) to hide this under the Utils class was to avoid exposing the ShutdownHookManager, which clashes with a Hadoop class with the same name (different package though). I guess since Spark code should not use the Hadoop one, it's ok to use the same name.

The change looks ok, but let's see what tests say.

@vanzin
Contributor

vanzin commented Aug 11, 2015

ok to test

@vanzin
Contributor

vanzin commented Aug 11, 2015

Jenkins, test this please.


@michellemay
Contributor Author

Where do we see test results?

@SparkQA

SparkQA commented Aug 12, 2015

Test build #1469 has finished for PR 8109 at commit cdcd1f6.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@vanzin
Contributor

vanzin commented Aug 12, 2015

@michellemay the last build has a link to the results (it's failing the style checks).

@SparkQA

SparkQA commented Aug 12, 2015

Test build #40624 has finished for PR 8109 at commit 2e2a3ee.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@michellemay
Contributor Author

@vanzin about your comment on ShutdownHookManager, should I rename it?

Also, while refactoring calls to Utils.addShutdownHook, I've seen instances of calls to Runtime.getRuntime.addShutdownHook. Should I replace them with calls to the new class?

org/apache/spark/deploy/ExternalShuffleService.scala:126
org/apache/spark/deploy/mesos/MesosClusterDispatcher.scala:113
org/apache/spark/storage/ExternalBlockStore.scala:181
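For context, migrating one of those call sites would mean trading an unordered, fire-and-forget JDK hook for a managed, cancellable one. A minimal self-contained sketch of the add/remove behavior such a manager enables (hypothetical API for illustration, not Spark's actual class):

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Hypothetical sketch: unlike a raw Runtime.getRuntime().addShutdownHook(Thread),
// a managed registry lets callers cancel a hook before shutdown happens.
class RemovableHooksSketch {
    private final Set<Runnable> hooks = new LinkedHashSet<>();

    // Register a hook and return it as a token for later removal.
    synchronized Runnable add(Runnable hook) {
        hooks.add(hook);
        return hook;
    }

    // Cancel a previously registered hook; returns false if it was not registered.
    synchronized boolean remove(Runnable token) {
        return hooks.remove(token);
    }

    // Invoked once at shutdown (e.g. from a single JVM-level hook).
    synchronized void runAll() {
        for (Runnable hook : hooks) {
            hook.run();
        }
        hooks.clear();
    }
}
```

Raw JDK hooks cannot be cancelled short of keeping the Thread reference around and calling removeShutdownHook yourself, which is part of why centralizing registration is attractive.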

@vanzin
Contributor

vanzin commented Aug 12, 2015

about your comment on ShutdownHookManager, should I rename it?

No, the name is fine. I was just giving some historical context.

I've seen instances of calls to Runtime.getRuntime.addShutdownHook..

From a quick look, it seems that at least the one in ExternalBlockStore should be changed; the other two seem to be separate processes (i.e. they are not part of a Spark application) so that's questionable. But I'd say leave it for a different change (maybe file a separate bug so it doesn't fall through the cracks).

Contributor

nit: sorting order

@vanzin
Contributor

vanzin commented Aug 12, 2015

LGTM, just have style nits left. I'll let this sit for a little bit to see if anyone else has comments.

@vanzin
Contributor

vanzin commented Aug 12, 2015

Alright, merging. Will fix the imports in the process.

asfgit pushed a commit that referenced this pull request Aug 12, 2015
Refactor Utils class and create ShutdownHookManager.
Author: michellemay <mlemay@gmail.com>

Closes #8109 from michellemay/SPARK-9826.

(cherry picked from commit ab7e721)
Signed-off-by: Marcelo Vanzin <vanzin@cloudera.com>
asfgit closed this in ab7e721 Aug 12, 2015
asfgit pushed a commit that referenced this pull request Aug 13, 2015
Refactor Utils class and create ShutdownHookManager.
Author: michellemay <mlemay@gmail.com>

Closes #8109 from michellemay/SPARK-9826.

(cherry picked from commit ab7e721)

Conflicts:
	core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala
	core/src/main/scala/org/apache/spark/rdd/NewHadoopRDD.scala
	core/src/main/scala/org/apache/spark/util/Utils.scala
	sql/core/src/main/scala/org/apache/spark/sql/sources/SqlNewHadoopRDD.scala
	sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2.scala
	sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala
	sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala
@vanzin
Contributor

vanzin commented Aug 13, 2015

Merged to master, branch-1.5 and branch-1.4. Thanks!

I fixed some minor conflicts in the 1.4 port (mostly import statements), made sure the build passed, and ran core/ unit tests locally.

@michellemay
Contributor Author

Awesome! Thanks, Marcelo, for your involvement!

@michellemay deleted the SPARK-9826 branch August 13, 2015 11:36
CodingCat pushed a commit to CodingCat/spark that referenced this pull request Aug 17, 2015
Refactor Utils class and create ShutdownHookManager.
Author: michellemay <mlemay@gmail.com>

Closes apache#8109 from michellemay/SPARK-9826.