Follow the instructions in set up environment to create all required resources in Azure and Azure Databricks. Once all resources have been created, follow these steps to run the demo.
- Generate data in ADLS and the Event Hubs Kafka topic using the data-generator.
- Modify the notebooks in delta-live-tables to use the correct config values.
- Go to the Jobs page, open the Delta Live Tables tab, and deploy your pipeline using the settings from dlt-pipeline-settings-continuous.json.
- Once the pipeline starts, stop it and modify the settings to apply the Spark config for secrets from dlt-pipeline-settings-continuous.json.
- Check that the database and tables were created correctly using the audit_tables notebook.
- Audit the DLT pipeline event logs using the audit_event_log notebook.
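The secrets step above relies on Databricks' `{{secrets/<scope>/<key>}}` substitution in the pipeline's Spark configuration. A minimal sketch of what that section of dlt-pipeline-settings-continuous.json might look like is shown below; the pipeline name, secret scope, and key names are hypothetical, so use the values from the actual settings file.

```json
{
  "name": "dlt-demo-pipeline",
  "continuous": true,
  "configuration": {
    "adls.account.key": "{{secrets/demo-scope/adls-account-key}}",
    "eventhubs.connection.string": "{{secrets/demo-scope/eventhubs-connection-string}}"
  }
}
```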
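The data-generation step above can be sketched as follows. This is an illustrative stand-in for the actual data-generator: the event schema, field names, and ranges here are assumptions, and the Event Hubs/ADLS write is described only in comments since it needs live credentials.

```python
import json
import random
import uuid
from datetime import datetime, timezone

def generate_events(n: int):
    """Generate n sample events. The schema (event_id, event_time,
    product_id, amount) is illustrative, not the demo's actual schema."""
    events = []
    for _ in range(n):
        events.append({
            "event_id": str(uuid.uuid4()),
            "event_time": datetime.now(timezone.utc).isoformat(),
            "product_id": random.randint(1, 100),
            "amount": round(random.uniform(1.0, 500.0), 2),
        })
    return events

if __name__ == "__main__":
    batch = generate_events(10)
    # In the demo, events like these would be landed as files in an ADLS
    # container and produced to the Event Hubs Kafka endpoint
    # (<namespace>.servicebus.windows.net:9093 over SASL_SSL). Here they
    # are just emitted as JSON lines:
    for e in batch:
        print(json.dumps(e))
```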
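For the event-log audit, each row of the DLT event log carries fields such as `event_type` and `level` alongside a `details` payload. A minimal sketch of pulling those fields out of one event serialized as JSON (the sample record is invented; in the audit_event_log notebook you would query the event log table itself):

```python
import json

def summarize_event(raw: str):
    """Extract the event type and severity level from a DLT event-log
    row serialized as JSON. Illustrative only; field names follow the
    DLT event log schema."""
    e = json.loads(raw)
    return {"event_type": e.get("event_type"), "level": e.get("level")}

if __name__ == "__main__":
    # Hypothetical sample record for demonstration.
    sample = '{"event_type": "flow_progress", "level": "INFO", "details": {}}'
    print(summarize_event(sample))
```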