[Feature] Support for Model-Specific dbt_vars in DbtTaskGroup #1497
Comments
Hi @ame589, thanks for reporting this feature request. Cosmos may already support what you'd like to accomplish. In version 1.8.0, @wornjs introduced support for customizing Airflow operator arguments per dbt node via PR #1339 - more information here. More recently, we made a few improvements to this feature, available in 1.9.0a4 as part of PR #1492. The main improvement is ensuring that operator arguments defined at the model level take precedence over those set at a higher level. This documentation summarises the existing feature.
Given that Cosmos exposes the argument defined in astronomer-cosmos/cosmos/operators/base.py, lines 27 to 29 (commit 9c175f6), you can probably accomplish what you want by using something like the following in your dbt_project.yml:
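The suggested snippet itself isn't preserved in this text; a minimal sketch of per-model operator kwargs via the meta.cosmos key (the project name, folder, model, and variable names are assumptions) could look like this:

```yaml
# dbt_project.yml -- a sketch of per-model operator kwargs via meta.cosmos.
# The project name, folder, model, and variable names are assumptions and must
# match your real project layout.
models:
  my_project:                     # your dbt project name
    silver:
      my_model:                   # applies only to models/silver/my_model.sql
        +meta:
          cosmos:
            operator_kwargs:
              vars:
                my_model_var: "value_for_this_model_only"
```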
If this does not work with Cosmos 1.8.0, please try it out with 1.9.0a4. Although I haven't tested it, I'm optimistic this will solve the problem, so I'm closing the issue for now. Please feel free to reopen it if it does not meet your needs.
Hi @tatiana, unfortunately, with your approach on version 1.6.0 we receive:
[2025-02-03, 15:16:29 UTC] {logging_mixin.py:188} INFO - 15:16:29 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources. There are 1 unused configuration paths:
- models.staging.folder_model_1.model_name.meta.cosmos.operator_kwargs.vars
We have set the dbt_project.yml in this way:
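The file contents aren't shown here, but a reconstruction based purely on the flagged path (the variable name and value are assumptions) would be roughly:

```yaml
# dbt_project.yml -- reconstructed from the warning path
# models.staging.folder_model_1.model_name.meta.cosmos.operator_kwargs.vars;
# the variable name and value are assumptions.
models:
  staging:                        # note: no dbt project name at the top level
    folder_model_1:
      model_name:
        +meta:
          cosmos:
            operator_kwargs:
              vars:
                my_var: "my_value"
```

dbt resolves the keys under models: starting with a project (or package) name and then the on-disk directory structure, so a path that skips the project name or does not mirror the folder layout is reported as an unused configuration path.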
@ame589, this warning message from dbt is letting you know that there's a configuration path in your dbt_project.yml that does not apply to any resources. Could you try the following kind of configuration in your dbt_project.yml?
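The suggested configuration isn't preserved above; a sketch of a path that would resolve, assuming the project is named my_project and the model lives at models/staging/folder_model_1/model_name.sql (both of these are assumptions), is:

```yaml
# dbt_project.yml -- sketch; "my_project" and the folder layout are assumptions.
# The key path must start with the dbt project name and mirror the directory
# structure under models/.
models:
  my_project:
    staging:
      folder_model_1:
        model_name:
          +meta:
            cosmos:
              operator_kwargs:
                vars:
                  my_var: "my_value"   # illustrative variable (assumption)
```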
You can use the dbt CLI to confirm that the configuration is valid.
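The specific command isn't preserved above; assuming it was a standard dbt invocation, either of the following (run from the dbt project root, with the selector path being an assumption) would re-surface the unused-path warning if the configuration still does not match a resource:

```bash
# Run from the dbt project root. Both commands parse dbt_project.yml and will
# re-emit the "unused configuration paths" warning if the models: path still
# does not resolve to a resource. The selector path is an assumption.
dbt parse
dbt ls --select path:models/staging/folder_model_1
```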
Description
Currently, when creating a DbtTaskGroup that points to a folder (e.g., using a select path like +path:models/silver/{group_name}), it is only possible to pass shared dbt_vars to all models in that folder via the operator_args parameter or via dbt_vars inside ProjectConfig (deprecated).
This limitation makes it challenging to handle scenarios where each model within the folder requires different variables.
For example, consider the following code snippet:
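The snippet itself wasn't captured in this text; a minimal sketch of the pattern being described (the DAG name, paths, profile details, and the variable are all assumptions) might look like:

```python
# A sketch of the pattern being described: a single DbtTaskGroup selecting a
# folder, where the vars in operator_args are shared by every selected model.
# All names, paths, and profile details below are assumptions.
from datetime import datetime

from airflow import DAG
from cosmos import DbtTaskGroup, ProfileConfig, ProjectConfig, RenderConfig

with DAG(dag_id="silver_models_dag", start_date=datetime(2025, 1, 1), schedule=None):
    silver_group = DbtTaskGroup(
        group_id="silver_models",
        project_config=ProjectConfig(dbt_project_path="/usr/local/airflow/dbt/my_project"),
        profile_config=ProfileConfig(
            profile_name="my_profile",
            target_name="dev",
            profiles_yml_filepath="/usr/local/airflow/dbt/my_project/profiles.yml",
        ),
        render_config=RenderConfig(select=["+path:models/silver/group_name"]),
        # These vars are applied to every dbt command rendered for the selected
        # folder -- there is no per-model override at this level.
        operator_args={"vars": {"layer": "silver"}},
    )
```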
In this case, all models within the models/silver/{group_name} folder will share the same dbt_vars. However, I would like to pass a unique variable (e.g., the output of an upstream task like DatabricksRunNowOperator) to each model. This is currently not possible without creating separate task groups for each model or pre-processing variables externally.
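For contrast, a sketch of the workaround mentioned above (one task group per model, so each one can receive its own vars) could look like the following; the names, paths, and upstream task IDs are assumptions, and the Jinja-templated values only render if vars is a templated field in the Cosmos version in use:

```python
# Workaround sketch: one DbtTaskGroup per model, so each model can receive its
# own vars. Names, paths, and upstream task IDs are assumptions, and the Jinja
# expressions only render if "vars" is a templated field in your Cosmos version.
from datetime import datetime

from airflow import DAG
from cosmos import DbtTaskGroup, ProfileConfig, ProjectConfig, RenderConfig

project = ProjectConfig(dbt_project_path="/usr/local/airflow/dbt/my_project")
profile = ProfileConfig(
    profile_name="my_profile",
    target_name="dev",
    profiles_yml_filepath="/usr/local/airflow/dbt/my_project/profiles.yml",
)

with DAG(dag_id="silver_models_per_model_dag", start_date=datetime(2025, 1, 1), schedule=None):
    model_a = DbtTaskGroup(
        group_id="model_a",
        project_config=project,
        profile_config=profile,
        render_config=RenderConfig(select=["model_a"]),
        operator_args={"vars": {"customer_run_id": "{{ ti.xcom_pull(task_ids='customer_databricks_job') }}"}},
    )
    model_b = DbtTaskGroup(
        group_id="model_b",
        project_config=project,
        profile_config=profile,
        render_config=RenderConfig(select=["model_b"]),
        operator_args={"vars": {"product_run_id": "{{ ti.xcom_pull(task_ids='product_databricks_job') }}"}},
    )
```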
Use case/motivation
In our workflows, we often have multiple models grouped under the same folder, but each model represents a different flow or use case. For example:
Model A requires the output of an upstream Databricks task specific to customer data.
Model B requires the output of an upstream Databricks task specific to product data.
Currently, there is no straightforward way to pass these unique variables directly to each model within a DbtTaskGroup. The only workaround is to create separate task groups for each model or handle variables externally, which adds complexity and reduces maintainability.
Adding support for model-specific dbt_vars within a single DbtTaskGroup would:
Simplify orchestration by allowing dynamic variable assignment per model.
Improve flexibility and usability for complex dbt projects.
Align with dbt's ability to handle per-model configurations via CLI arguments or YAML files.
Related issues
I am not aware of any existing issues directly addressing this feature request. However, this enhancement would align with Cosmos's goal of simplifying dbt orchestration in Airflow.
Are you willing to submit a PR?