eqimp/hogwild_llm
About
Official PyTorch implementation for Hogwild! Inference: Parallel LLM Generation with a Concurrent Attention Cache