
Conversation

@snreddygopu
Contributor

This PR introduces a new feature in the Airflow Teradata provider to support high-performance data loading and unloading using Teradata Parallel Transporter (TPT) from within Airflow DAGs. The new operator deepens the Teradata integration by enabling flexible orchestration of data transfer workloads, both locally and remotely via SSH.

Key Features:

  • Multi-mode data operations: file-to-table, table-to-file, table-to-table, and SELECT-to-file transfers
  • Flexible source and target specifications: source files, source tables, SELECT statements, and custom INSERT statements
  • Batch and single operations: handles individual data transfers as well as complex data pipeline scenarios
  • Connection management: integrates with Airflow's connection management for secure database access
  • Observability: comprehensive logging of execution results and performance metrics
  • Local and remote execution: runs locally or remotely via SSH for distributed data operations
  • Dynamic job variable file generation: automatically generates and manages TPT job variable files based on the operation mode

🛠️ Additional Enhancements:

Includes utility functions for:

  • Job variable file preparation and validation
  • Data format specification (Delimited, Fixed-length, etc.)
  • Custom delimiter configuration for source and target files
  • Comprehensive parameter validation for operation modes
  • Robust error handling with detailed error messages
  • Secure file encryption for remote transfers
  • Resource cleanup and subprocess management
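To make the "dynamic job variable file generation" and "parameter validation" points above concrete, here is a minimal pure-Python sketch of what such a utility could look like. Everything in it is an illustrative assumption: the function names (`build_job_variables`, `write_job_variable_file`), the `mode` values, and the variable keys are hypothetical and do not represent the provider's actual API.

```python
# Hypothetical sketch of TPT job-variable-file generation; all names here are
# assumptions for illustration, not the Teradata provider's real API.
import tempfile


def build_job_variables(mode: str, delimiter: str = "|", **params) -> str:
    """Render the body of a TPT job variable file for a given transfer mode."""
    # Comprehensive parameter validation for operation modes (per the PR notes).
    if mode not in {"file_to_table", "table_to_file", "table_to_table"}:
        raise ValueError(f"unsupported mode: {mode}")

    job_vars = {"TargetWorkingDatabase": params.get("database", "")}
    if mode == "file_to_table":
        job_vars.update(
            FileReaderFileName=params["source_file"],
            FileReaderTextDelimiter=delimiter,
            TargetTable=params["target_table"],
        )
    elif mode == "table_to_file":
        job_vars.update(
            SelectStmt=params.get(
                "select_stmt", f"SELECT * FROM {params['source_table']}"
            ),
            FileWriterFileName=params["target_file"],
            FileWriterTextDelimiter=delimiter,
        )
    else:  # table_to_table
        job_vars.update(
            SelectStmt=f"SELECT * FROM {params['source_table']}",
            TargetTable=params["target_table"],
        )
    # TPT job variable files use simple `Name = 'value'` assignments.
    return ",\n".join(f"{k} = '{v}'" for k, v in job_vars.items())


def write_job_variable_file(content: str) -> str:
    """Write the rendered variables to a temporary file and return its path."""
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        f.write(content)
        return f.name
```

A caller might render and persist the variables with `write_job_variable_file(build_job_variables("file_to_table", database="demo", source_file="/data/in.csv", target_table="sales"))`, then pass the resulting path to the `tbuild` invocation; the actual operator presumably handles this internally.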

Teradata Provider documentation build status: https://github.com/Teradata/airflow/actions/runs/15688902771

Teradata Provider Unit tests build status: https://github.com/Teradata/airflow/actions/runs/15688585657



@boring-cyborg

boring-cyborg bot commented Dec 2, 2025

Congratulations on your first Pull Request, and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contributors' Guide (https://github.com/apache/airflow/blob/main/contributing-docs/README.rst)
Here are some useful points:

  • Pay attention to the quality of your code (ruff, mypy and type annotations). Our prek-hooks will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide and consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it is a heavyweight Docker setup, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow the ASF Code of Conduct for all communication, including (but not limited to) comments on Pull Requests, the mailing list, and Slack.
  • Be sure to read the Airflow Coding style.
  • Always keep your Pull Requests rebased, otherwise your build might fail due to changes not related to your commits.
    Apache Airflow is a community-driven project and together we are making it better 🚀.
    In case of doubts contact the developers at:
    Mailing List: dev@airflow.apache.org
    Slack: https://s.apache.org/airflow-slack

@potiuk potiuk merged commit d739a98 into apache:main Dec 10, 2025
121 checks passed
@boring-cyborg

boring-cyborg bot commented Dec 10, 2025

Awesome work, congrats on your first merged pull request! You are invited to check our Issue Tracker for additional contributions.

@github-actions

Backport failed to create: v3-1-test. View the failure log for run details.

  Status   Branch      Result
  Failed   v3-1-test   Commit Link

You can attempt to backport this manually by running:

cherry_picker d739a98 v3-1-test

This should apply the commit to the v3-1-test branch and leave the repository in a conflicted state, marking the files that need manual conflict resolution.

After you have resolved the conflicts, you can continue the backport process by running:

cherry_picker --continue

@snreddygopu
Contributor Author

Hi @potiuk,

Do I need to address the comment added by the github-actions bot regarding the v3-1-test backport? The workflow log shows that the backport actually completed successfully.

@potiuk
Member

potiuk commented Dec 13, 2025

No need.

@eladkal
Contributor

eladkal commented Jan 15, 2026

@snreddygopu The Teradata system test dashboard linked from the provider documentation shows that all system tests are failing:
https://teradata.github.io/airflow/index.html

[Screenshot, 2026-01-15: Teradata system test dashboard showing failing tests]

@snreddygopu
Contributor Author

Hi @eladkal,

There appears to be an issue with the automation that updates the Teradata system test dashboard; I am currently investigating the root cause. Please note that the Teradata system tests themselves are running successfully, and only the dashboard update process is affected. I will update you once it is resolved.

