Describe the issue
We are implementing LakeFlow Connect SQL Replication. When deploying with DABs, some configuration fields trigger "unknown field" warnings. One does not appear to be a functional issue (source_type), but the other is, because updates to it are not applied (dbr_version).
The source_type field actually works on initial deployments: it deploys the gateway pipeline and ingestion pipeline just fine despite the warning. I have never tried changing it (it wouldn't make sense to do so), so I don't know whether updates to it are applied.
The dbr_version field does not work when updating a pipeline (I did not specify it on the initial deployment). We need to update our pipeline to use a custom image because of an error it encountered on the default image (engineering provided us with the custom image). Ideally we should not have to destroy the pipeline and re-replicate the entire dataset just to change the image. Using the Python SDK I am able to do exactly this and update the gateway pipeline's image.
Steps to reproduce the behavior
1. Run databricks bundle deploy ... for the initial deployment
2. Set dbr_version to the custom image in the bundle configuration
3. Run databricks bundle deploy ... to update the pipeline
4. See the "unknown field" warning; check the pipeline and notice the custom dbr_version is not set
Expected Behavior
The gateway pipeline should restart with the custom image specified in dbr_version
Actual Behavior
The gateway pipeline did not update its image.
OS and CLI version
Deploying with a VM Scale Set via Azure DevOps build pipeline.
Starting: Deploy bundle
==============================================================================
Task : Command line
Description : Run a command line script using Bash on Linux and macOS and cmd.exe on Windows
Version : 2.246.1
Author : Microsoft Corporation
Help : https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/command-line
==============================================================================
Generating script.
Script contents:
databricks bundle deploy -t prod --auto-approve
Is this a regression?
Did not test other versions
The "unknown field" warning means that the CLI doesn't know about the property. One possibility is that the CLI version you're using was released prior to the field being added. Another possibility is that the product hasn't externalized this property yet.
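The effect described above can be illustrated with a toy typed model (illustrative only, not the CLI's actual code): a client that validates configuration against the fields it knows will warn about, and then drop, anything newer than itself, so the unknown field never reaches the backend.

```python
# Toy illustration of the "unknown field" behavior described above.
# This is NOT the Databricks CLI's real code; PipelineSpec is a stand-in
# for whatever typed model an older client ships with.
from dataclasses import dataclass, fields


@dataclass
class PipelineSpec:
    # The toy client only knows these two fields.
    name: str
    channel: str = "CURRENT"


def serialize(config: dict) -> dict:
    """Keep only fields the typed model knows; warn about the rest."""
    known = {f.name for f in fields(PipelineSpec)}
    for key in sorted(config.keys() - known):
        print(f"Warning: unknown field {key!r} (ignored by this client)")
    return {k: v for k, v in config.items() if k in known}


# dbr_version never reaches the API in this toy model, even though
# the user wrote it in their configuration.
payload = serialize({"name": "gateway", "dbr_version": "custom-image"})
```

If the real cause here is the same (the CLI version predates the field), upgrading the CLI should make the warning disappear and the field take effect.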
My mistake. I copy/pasted in the wrong section for version.
Databricks CLI v0.239.0
I have confirmed as recently as yesterday that I'm able to use the Python SDK and update these pipelines to use the custom image. But any time a DAB redeploy happens, it reverts to the old image and does not adhere to the dbr_version image that I have in the asset bundle configuration file.
I have also tried deleting the pipeline entirely, and running the initial deployment of the pipeline using the custom image defined in dbr_version and it also does not set it in that case.
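For reference, the Python SDK workaround mentioned above can be sketched roughly like this. This is a sketch, not our exact code: it assumes the raw Pipelines REST API accepts dbr_version even though the typed CLI/SDK models warn about it, and the pipeline ID and image version are placeholders.

```python
def with_dbr_version(spec: dict, version: str) -> dict:
    """Return a copy of a pipeline spec dict with dbr_version set."""
    patched = dict(spec)
    patched["dbr_version"] = version  # assumption: accepted by the raw API
    return patched


def update_gateway_image(pipeline_id: str, version: str) -> None:
    """Sketch: PUT the full pipeline spec back with a custom image injected."""
    # Imported here so the pure helper above stays dependency-free.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    # Fetch the current spec so all other settings are preserved.
    spec = w.pipelines.get(pipeline_id).spec.as_dict()
    # Bypass the typed model and send the raw spec, dbr_version included.
    w.api_client.do(
        "PUT",
        f"/api/2.0/pipelines/{pipeline_id}",
        body=with_dbr_version(spec, version),
    )
```

The point of going through `api_client.do` rather than the typed `pipelines.update` call is that the raw request body is sent as-is, so a field the SDK's dataclasses don't model can still reach the service.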