Commit
Added `full_refresh` attribute to the `pipeline_task` in `databricks_job` (#2444)

This allows forcing a full refresh of the pipeline from the job.

Fixes #2362
alexott authored and nkvuong committed Aug 2, 2023
1 parent bacb45c commit 9bb853f
Showing 2 changed files with 3 additions and 1 deletion.
1 change: 1 addition & 0 deletions docs/resources/job.md
@@ -233,6 +233,7 @@ You can invoke Spark submit tasks only on new clusters. **In the `new_cluster` s
### pipeline_task Configuration Block

* `pipeline_id` - (Required) The pipeline's unique ID.
* `full_refresh` - (Optional) (Bool) Specifies whether a full refresh of the pipeline should be triggered.

-> **Note** The following configuration blocks are only supported inside a `task` block

3 changes: 2 additions & 1 deletion jobs/resource_job.go
@@ -60,7 +60,8 @@ type PythonWheelTask struct {

// PipelineTask contains the information for pipeline jobs
type PipelineTask struct {
	PipelineID  string `json:"pipeline_id"`
	FullRefresh bool   `json:"full_refresh,omitempty"`
}

type SqlQueryTask struct {
