If you have a very large dataset in Druid, refreshing metadata can take a very long time and results in a timeout.

Not sure if we can pass analysisTypes with just cardinality, or make it configurable, which would result in a much faster query:
https://github.com/airbnb/superset/blob/2d866e3ffa9bfedd3b3dad0d3463767aae879a14/superset/models.py#L2018
http://druid.io/docs/latest/querying/segmentmetadataquery.html#analysistypes

By default the segment metadata query asks for all analysis types, and it seems like we only care about the columns (see the sketch below).

Superset version
Latest

Expected results
faster refresh of Druid metadata

Actual results
refreshing Druid metadata takes a very long time and times out

Steps to reproduce
superset refresh_druid -m true
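For illustration only, a restricted segmentMetadata query along the lines this report suggests might look like the sketch below. The broker URL, datasource, and interval are placeholders, not values from this issue, and the query is issued directly against a Druid broker rather than through Superset.

import json

import requests

# Hypothetical broker URL -- placeholder for illustration.
BROKER_URL = "http://localhost:8082/druid/v2/"

# Native Druid segmentMetadata query. By default Druid computes a fuller set of
# analysis types; restricting analysisTypes to just "cardinality" (or an empty
# list) makes the query much cheaper on large datasources.
query = {
    "queryType": "segmentMetadata",
    "dataSource": "my_datasource",            # hypothetical datasource
    "intervals": ["2017-01-01/2017-02-01"],   # hypothetical interval
    "merge": True,
    "analysisTypes": ["cardinality"],
}

response = requests.post(
    BROKER_URL,
    data=json.dumps(query),
    headers={"Content-Type": "application/json"},
)
response.raise_for_status()
print(json.dumps(response.json(), indent=2))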
analysisTypes is supported by pydruid.
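A minimal sketch of passing analysisTypes through pydruid, assuming a local broker; the datasource and interval are placeholders, and the exact keyword arguments accepted can vary by pydruid version.

from pydruid.client import PyDruid

# Hypothetical broker URL and datasource -- placeholders for illustration.
client = PyDruid("http://localhost:8082", "druid/v2")

# Ask only for column cardinality instead of the full default analysis,
# which keeps the segment metadata query fast on large datasources.
query = client.segment_metadata(
    datasource="my_datasource",
    intervals="2017-01-01/2017-02-01",
    merge="true",
    analysisTypes=["cardinality"],
)
print(query.result)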
Has this been fixed in #1983?
Yes, it's fixed in #1983. I'm closing the issue.