Add workflow_with_ai_parse_document example #111
Merged
Conversation
This example demonstrates incremental document processing using:
- ai_parse_document for extracting structured data from PDFs/images
- ai_query for LLM-based content analysis
- Databricks Workflows with Structured Streaming and serverless compute

Key features:
- Python notebooks with Structured Streaming for incremental processing
- Serverless compute for cost efficiency
- Parameterized workflow with catalog, schema, and table names
- Checkpointed streaming to process only new data
- Visual debugging notebook with interactive bounding boxes
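The "checkpointed streaming to process only new data" idea can be illustrated with a minimal, pure-Python sketch: a checkpoint records which inputs were already handled, so each run touches only new files. (This is an illustration of the concept only; the actual example uses Spark Structured Streaming checkpoints, and the function and file names below are hypothetical.)

```python
# Simplified illustration of checkpointed incremental processing.
# The real job uses Spark Structured Streaming; this sketch only shows
# the "skip already-processed inputs" semantics via a JSON checkpoint.
import json
from pathlib import Path

def process_new_files(input_dir: Path, checkpoint_file: Path) -> list[str]:
    """Return the files processed in this run, skipping checkpointed ones."""
    seen = set(json.loads(checkpoint_file.read_text())) if checkpoint_file.exists() else set()
    new_files = sorted(str(p) for p in input_dir.glob("*.pdf") if str(p) not in seen)
    # ... parse each new file here (ai_parse_document in the real job) ...
    checkpoint_file.write_text(json.dumps(sorted(seen | set(new_files))))
    return new_files
```

Running this twice over the same directory processes each file exactly once; deleting the checkpoint (as with a Spark checkpoint directory) would reprocess everything.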
Force-pushed from 0aeff00 to d5151d4
pietern reviewed Oct 13, 2025
knowledge_base/workflow_with_ai_parse_document/resources/ai_parse_document_workflow.job.yml
Outdated
knowledge_base/workflow_with_ai_parse_document/resources/ai_parse_document_workflow.job.yml
Outdated
...e_base/workflow_with_ai_parse_document/src/explorations/ai_parse_document -- debug output.py
- Add job-level parameters block for catalog and schema (shared across all tasks)
- Move optional schedule configuration to top of job definition
- Replace Python notebook with Jupyter notebook format including visual outputs
The ipynb format was not preserving the visual output properly, so the example now uses .py notebook source with an .html export to show outputs correctly.
- Move directory from knowledge_base/ to contrib/
- Replace "workflow" terminology with "job" throughout
- Add recursiveFileLookup option for subdirectory file discovery
- Make all parameters explicit at job level
- Update checkpoint paths to use Unity Catalog volumes instead of /tmp
- Remove hardcoded personal paths from notebooks
- Add 6 visual screenshots showing parsing capabilities
- Update README with Example Output section
- Remove HTML output file in favor of Python notebook
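Making parameters explicit at the job level means declaring them once in the bundle's job YAML so every task sees the same values. A rough sketch of what such a block looks like in a Databricks Asset Bundles job definition (resource name and default values here are illustrative, not copied from the PR):

```yaml
# Sketch of job-level parameters shared across all tasks (illustrative values)
resources:
  jobs:
    ai_parse_document_job:
      parameters:
        - name: catalog
          default: main
        - name: schema
          default: default
```

Tasks then reference these via parameter substitution rather than hardcoding catalog and schema names per notebook.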
pietern approved these changes Oct 15, 2025
```yaml
        default: /Volumes/main/default/parsed_output
      checkpoint_base_path:
        description: Base path for Structured Streaming checkpoints
        default: /tmp/checkpoints/ai_parse_workflow
```
Contributor
This path is local to a driver. Checkpoints will be gone by the time the cluster terminates. Intentional?
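The reviewer's point is that /tmp resolves to ephemeral driver-local storage, so streaming state would not survive cluster termination. The later commit addresses this by moving checkpoints to a Unity Catalog volume; a durable default would look something like the following (the exact volume path is illustrative):

```yaml
# Durable checkpoint location on a Unity Catalog volume (illustrative path)
checkpoint_base_path:
  description: Base path for Structured Streaming checkpoints
  default: /Volumes/main/default/checkpoints/ai_parse_workflow
```

With a volume-backed path, a restarted job resumes from the last committed offsets instead of reprocessing the full input.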