
Conversation

@gatorsmile
Member

What changes were proposed in this pull request?

Hive index tables are not supported by Spark SQL, so we issue an exception when users try to access them. However, when the internal function tableExists touches a Hive index table, it always fails with the same error message: Hive index table is not supported. This message can confuse users, since their SQL operations may be completely unrelated to Hive index tables. For example, when users try to rename a table to a name that is already taken by an index table, the expected exception is a TableAlreadyExistsException.

This PR makes the following changes:

  • Introduced a new AnalysisException subtype, SQLFeatureNotSupportedException. When users try to access an index table, we now issue a SQLFeatureNotSupportedException.
  • tableExists returns true when it hits a SQLFeatureNotSupportedException whose unsupported feature is the Hive index table.
  • Added a requireTableNotExists check to SessionCatalog's createTable API; previously, the implementation relied on Hive's internal checking (a sketch follows this list).
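A minimal sketch of the createTable-side check from the last bullet, assuming it reuses SessionCatalog's existing tableExists, currentDb, and TableAlreadyExistsException conventions; the exact merged code may differ:

import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException

// Hypothetical helper (assumed shape): fail fast with TableAlreadyExistsException
// if the target name is already taken before delegating to the external catalog.
private def requireTableNotExists(name: TableIdentifier): Unit = {
  if (tableExists(name)) {
    throw new TableAlreadyExistsException(
      db = name.database.getOrElse(currentDb), table = name.table)
  }
}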

How was this patch tested?

Added a test case

@rxin
Contributor

rxin commented Aug 25, 2016

Can we avoid introducing new exception types? It is super annoying to match those in Python.

@SparkQA

SparkQA commented Aug 25, 2016

Test build #64404 has finished for PR 14801 at commit c400c52.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • class SQLFeatureNotSupportedException(val feature: String)

@gatorsmile
Member Author

Sure, will revert it back and use the existing AnalysisException. Thanks!

@SparkQA

SparkQA commented Aug 26, 2016

Test build #64455 has finished for PR 14801 at commit 3f75605.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Aug 26, 2016

Test build #64456 has finished for PR 14801 at commit 8bcd946.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gatorsmile
Member Author

retest this please

@SparkQA

SparkQA commented Aug 28, 2016

Test build #64545 has finished for PR 14801 at commit 8bcd946.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gatorsmile
Member Author

cc @rxin @cloud-fan

try {
  client.getTableOption(db, table).isDefined
} catch {
  case e: AnalysisException if e.message.contains("Hive index table is not supported") => true
Contributor

This looks hacky. Actually, why do we use getTableOption to implement the semantics of tableExists? I'd like to make HiveClient provide the tableExists API directly.

Member Author

Yeah, that is a good idea. Let me add it.
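A minimal sketch of such a direct API, assuming HiveClientImpl's usual withHiveState wrapper and Hive's getTable(db, table, throwException) overload; the merged code may differ:

// Hypothetical sketch: ask Hive for the table without throwing on a miss, so the
// existence check no longer depends on getTableOption and its index-table error path.
override def tableExists(dbName: String, tableName: String): Boolean = withHiveState {
  client.getTable(dbName, tableName, false /* do not throw exception */) != null
}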

@SparkQA

SparkQA commented Aug 29, 2016

Test build #64599 has finished for PR 14801 at commit ce35aa0.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

}

test(s"$version: tableExists") {
  // No exception should be thrown
Contributor

Hive's tableExists may throw an exception?
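For context, a minimal sketch of how such a tableExists test could look; the table names and assertions here are assumptions, not lines taken from the actual diff:

test(s"$version: tableExists") {
  // No exception should be thrown for either an existing or a missing table
  assert(client.tableExists("default", "src"))
  assert(!client.tableExists("default", "nonexistent"))
}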


@cloud-fan
Contributor

LGTM, pending Jenkins.

@SparkQA

SparkQA commented Aug 30, 2016

Test build #64618 has finished for PR 14801 at commit 439db0b.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gatorsmile
Member Author

retest this please

@SparkQA

SparkQA commented Aug 30, 2016

Test build #64619 has finished for PR 14801 at commit 439db0b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor

thanks, merging to master!

@asfgit closed this in bca79c8 on Aug 30, 2016

if (tableExists(db, table) && !ignoreIfExists) {
  throw new TableAlreadyExistsException(db = db, table = table)
}
Member Author

@gatorsmile commented Nov 26, 2016


After a few months, I found that the above code looks weird. We should follow the same logic as in InMemoryCatalog.scala. Will improve it.
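For comparison, a paraphrased sketch of the InMemoryCatalog-style structure referred to above (not the literal source): existence is checked first, and an existing table with ignoreIfExists = true skips creation instead of falling through to the create call.

if (tableExists(db, table)) {
  if (!ignoreIfExists) {
    throw new TableAlreadyExistsException(db = db, table = table)
  }
  // Table exists and ignoreIfExists = true: silently skip the create.
} else {
  // Table does not exist: proceed with the actual create call.
}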
