
Conversation

@andrewor14 (Contributor)

The peak execution memory metric was introduced in SPARK-8735. That was before Tungsten was enabled by default, so it assumed that `spark.sql.unsafe.enabled` must be explicitly set to `true`. As a result, the metric is not displayed by default.
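
A minimal sketch of the behavioral change being described, assuming the UI gates the "Peak Execution Memory" column on `spark.sql.unsafe.enabled`. The object and method names below are illustrative, not a quote of the actual patch:

```scala
import org.apache.spark.SparkConf

// Illustrative sketch only: the helper names here are assumptions.
object PeakMemoryDisplaySketch {
  // Old behavior: show the column only if the flag was explicitly set to true.
  def showPeakMemoryOld(conf: SparkConf): Boolean =
    conf.getOption("spark.sql.unsafe.enabled").exists(_.toBoolean)

  // New behavior: default the flag to true (Tungsten is on by default),
  // so the column is shown unless the user explicitly disables it.
  def showPeakMemoryNew(conf: SparkConf): Boolean =
    conf.getBoolean("spark.sql.unsafe.enabled", defaultValue = true)

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(loadDefaults = false) // no flag set
    println(showPeakMemoryOld(conf)) // false: hidden unless explicitly enabled
    println(showPeakMemoryNew(conf)) // true: displayed by default
  }
}
```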

@andrewor14 (Contributor, Author)

@rxin @yhuai

@SparkQA

SparkQA commented Aug 21, 2015

Test build #41338 has finished for PR 8345 at commit 5d3d81a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@yhuai (Contributor)

yhuai commented Aug 24, 2015

LGTM.

asfgit pushed a commit that referenced this pull request Aug 24, 2015
The peak execution memory metric was introduced in SPARK-8735. That was before Tungsten was enabled by default, so it assumed that `spark.sql.unsafe.enabled` must be explicitly set to true. The result is that the memory is not displayed by default.

Author: Andrew Or <andrew@databricks.com>

Closes #8345 from andrewor14/show-memory-default.

(cherry picked from commit 662bb96)
Signed-off-by: Yin Huai <yhuai@databricks.com>
asfgit closed this in 662bb96 on Aug 24, 2015
andrewor14 deleted the show-memory-default branch on August 24, 2015 at 23:10
