Feature Request: Tableau TDE hook and Tableau TDE create/refresh operator #196
Comments
We've been doing different things around Tableau integration, but I don't think we've found something we're happy with. I wrote a little framework that would scan a Hive database ("tableau"), discover all of its tables from the metastore, and push CSV files using the HiveToSamba operator. I think it would do differential loads where tables were partitioned. I think we ended up settling on a pull approach using the Hortonworks hiveserver2 connector. We're still interested in better integration between Tableau and Hive/Presto, though. I haven't looked at the Tableau APIs lately.
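For anyone curious what that push approach looks like in DAG code, here is a minimal sketch using the classic `Hive2SambaOperator`; the "tableau" database, table name, connection IDs, and share path below are placeholders, not the original framework:

```python
# Sketch only: assumes Airflow 1.x with the classic Hive2SambaOperator and
# hypothetical connection IDs / share paths. The original framework also
# discovered tables from the metastore, which is omitted here.
from datetime import datetime

from airflow import DAG
from airflow.operators.hive_to_samba_operator import Hive2SambaOperator

with DAG(
    dag_id="tableau_push_example",
    start_date=datetime(2016, 1, 1),
    schedule_interval="@daily",
) as dag:
    # Export one Hive table from the "tableau" database to a CSV on a Samba
    # share that Tableau can read from.
    export_orders = Hive2SambaOperator(
        task_id="export_orders",
        hql="SELECT * FROM tableau.orders",
        destination_filepath="tableau_extracts/orders.csv",
        samba_conn_id="samba_default",
        hiveserver2_conn_id="hiveserver2_default",
    )
```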
Seems like it can be distributed with the software as long as they are notified. I'll reach out to Tableau and see what their thoughts are.
+1 for this feature!!
Using the Tableau Data Extract API, this should be possible.
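For reference, a rough sketch of creating an extract with Tableau's legacy Python `dataextract` module; the file path and columns are made up, and the exact module and method names depend on the Extract API version Tableau ships:

```python
# Sketch only: assumes Tableau's legacy "dataextract" Python module from the
# Data Extract API is installed separately (it would not be bundled with Airflow).
import dataextract as tde

# Create a .tde file and define the schema of its table.
extract = tde.Extract("orders.tde")
table_def = tde.TableDefinition()
table_def.addColumn("order_id", tde.Type.INTEGER)
table_def.addColumn("amount", tde.Type.DOUBLE)

# The extract's single table is named "Extract".
table = extract.addTable("Extract", table_def)

# Insert a single example row.
row = tde.Row(table_def)
row.setInteger(0, 1001)
row.setDouble(1, 19.99)
table.insert(row)

extract.close()
```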
Caveat 1 - @mistercrunch, curious to get your thoughts on this: I don't believe the code from Tableau is open source, so we would either need to review the license or not include their code in the package (leaving it up to the user to install it outside of Airflow).
Caveat 2 - The code runs on Windows and Linux only, which should be fine but would need to be called out as a constraint.
Considering the above caveats, I'm thinking this may be more useful as a plugin rather than an integration into Airflow itself.
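If it did ship as a plugin, the wiring might look roughly like the sketch below; the `TableauTdeHook` / `TableauTdeOperator` names are hypothetical, and the Tableau SDK stays a separate, user-installed dependency:

```python
# Sketch only: hypothetical TableauTdeHook / TableauTdeOperator names, shown
# just to illustrate how an Airflow plugin could expose them. The Tableau
# Extract API itself remains an external dependency the user installs.
from airflow.hooks.base_hook import BaseHook
from airflow.models import BaseOperator
from airflow.plugins_manager import AirflowPlugin


class TableauTdeHook(BaseHook):
    """Thin wrapper around the separately installed Tableau Extract API."""

    def __init__(self, tableau_conn_id="tableau_default"):
        self.tableau_conn_id = tableau_conn_id

    def create_extract(self, path):
        # Import lazily so Airflow itself never ships or imports Tableau's code.
        import dataextract as tde
        return tde.Extract(path)


class TableauTdeOperator(BaseOperator):
    """Creates or refreshes a TDE file at the given path."""

    def __init__(self, tde_path, *args, **kwargs):
        super(TableauTdeOperator, self).__init__(*args, **kwargs)
        self.tde_path = tde_path

    def execute(self, context):
        extract = TableauTdeHook().create_extract(self.tde_path)
        # ... define the table, insert rows from the upstream query, etc. ...
        extract.close()


class TableauTdePlugin(AirflowPlugin):
    name = "tableau_tde_plugin"
    hooks = [TableauTdeHook]
    operators = [TableauTdeOperator]
```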