
Better auto batching for resolve LLM calls #128

Merged: 79 commits merged into ucbepic:staging on Oct 28, 2024

Conversation

sushruth2003 (Contributor) commented Oct 27, 2024

This PR changes auto_batch in resolve to heuristically characterize the workload and choose batch sizes. It also builds batches that skip pairs already merged by earlier clustering, so parallel LLM calls are used more efficiently; a sketch of the idea follows.

Testing:
See test_resolve_auto_batch for a sample test suite to run.
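
The following is a minimal sketch of the batching idea described above, not the actual implementation: `auto_batch_size`, `build_batches`, and the union-find helpers are hypothetical names, and the 500-request cap is an assumption taken from the note below.

```python
# Hypothetical sketch: estimate a batch size from the workload, then fill
# batches only with pairs whose items have not already been merged by
# earlier clustering (tracked here with a simple union-find).

from itertools import combinations


def find(parent, i):
    # Path-compressing find over item indices.
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i


def auto_batch_size(num_pairs, max_concurrent=500):
    # Heuristic: small workloads go out in a single batch; larger ones are
    # capped by the assumed concurrent-request limit.
    return min(num_pairs, max_concurrent)


def build_batches(items, parent, batch_size):
    # Yield batches of distinct candidate pairs, skipping pairs that a
    # previous clustering step already placed in the same cluster.
    batch = []
    for i, j in combinations(range(len(items)), 2):
        if find(parent, i) == find(parent, j):
            continue  # already clustered together; no LLM call needed
        batch.append((i, j))
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch
```

After each batch of LLM calls resolves, the matched pairs would be unioned into the same cluster, so later batches contain progressively fewer redundant comparisons.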

Not done:
The model's concurrency limit is currently hardcoded to 500 simultaneous requests. We need a better way to source this limit from the model API (see the sketch below).
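
A minimal sketch of one way such a cap could be enforced, using an asyncio semaphore; `call_llm` and `resolve_batch` are hypothetical placeholders, not the project's API, and the constant mirrors the hardcoded limit mentioned above.

```python
# Cap the number of in-flight LLM requests with a semaphore. In a real fix,
# MAX_CONCURRENT_REQUESTS would come from the model API rather than a constant.

import asyncio

MAX_CONCURRENT_REQUESTS = 500  # hardcoded for now, per the PR note


async def call_llm(pair, semaphore):
    # Placeholder for the real pairwise-comparison call.
    async with semaphore:
        await asyncio.sleep(0)  # stand-in for the network request
        return pair, True


async def resolve_batch(pairs):
    # Launch all comparisons; the semaphore limits how many run at once.
    semaphore = asyncio.Semaphore(MAX_CONCURRENT_REQUESTS)
    tasks = [call_llm(p, semaphore) for p in pairs]
    return await asyncio.gather(*tasks)
```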

shreyashankar and others added 30 commits October 12, 2024 12:41
feat: add reduce operation lineage
fix: change gleaning prompt to validation_prompt
docs: add 'output' argument to ResolveOp code eg
docs: add sample and cluster to docs
shreyashankar and others added 27 commits October 20, 2024 20:06
feat: adding human in the loop for split-map-gather decomp
fix: cache partial pipeline runs
only compare distinct pairs in resolve
@shreyashankar shreyashankar changed the base branch from main to staging October 28, 2024 02:15
@shreyashankar shreyashankar merged commit 32cd39d into ucbepic:staging Oct 28, 2024
1 of 4 checks passed