Update PL ecosystem libraries to the stable Accelerator/Strategy API #12026
Comments
I think it is related to the EcoCI feature request: Lightning-AI/ecosystem-ci#24
Hi, is this still being worked on?
Would you be interested in helping us? 🐰
Sure, I can give it a shot. Is there a good source to start with for this: what are the required aspects of a strategy, the changes made between 1.6 and 1.7+ for the Strategy API, etc.?
@ChickenTarm This is an old ticket that I created a year ago and it got forgotten, sorry! The main motivation was to update the APIs used over in ray_lightning. There are some overrides for Accelerator, Launcher, and Strategy here: https://github.com/ray-project/ray_lightning/tree/main/ray_lightning
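For anyone picking this up, here is a rough sketch of the hooks the stable `Strategy` base class (Lightning 1.6+) expects a subclass to implement. It is only illustrative: the exact set of abstract methods and their signatures can differ slightly between 1.6 and 1.7, so check the base class in the version you target.

```python
# Illustrative single-process strategy in which every collective op is a no-op.
# The abstract-method set shown here is based on the 1.6 Strategy base class
# and may not be exhaustive for every release.
from typing import Any, Optional

import torch
from pytorch_lightning.strategies import Strategy


class ToySingleProcessStrategy(Strategy):
    @property
    def root_device(self) -> torch.device:
        return torch.device("cpu")

    @property
    def is_global_zero(self) -> bool:
        return True

    def model_to_device(self) -> None:
        self.model.to(self.root_device)

    def barrier(self, name: Optional[str] = None) -> None:
        pass  # single process, nothing to wait for

    def broadcast(self, obj: Any, src: int = 0) -> Any:
        return obj

    def reduce(self, tensor: Any, group: Optional[Any] = None, reduce_op: Any = "mean") -> Any:
        return tensor

    def all_gather(self, tensor: torch.Tensor, group: Optional[Any] = None, sync_grads: bool = False) -> torch.Tensor:
        return tensor
```

In practice, distributed libraries such as ray_lightning subclass an existing strategy (e.g. `DDPSpawnStrategy`) rather than the bare base class, and plug in a custom launcher, which keeps the amount of code to override much smaller.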
Proposed refactor
Create an issue and send a PR to each repository that depends on the outdated `TrainingTypePlugin` API, updating it to the stable Accelerator/Strategy API.
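For most downstream code the change boils down to importing from `pytorch_lightning.strategies` instead of `pytorch_lightning.plugins` and passing the instance through the Trainer's `strategy` argument. A minimal before/after sketch (the class names are just one example of the renames; see the 1.6 changelog for the full mapping):

```python
# Before (Lightning <= 1.5): the training type plugin lived under pytorch_lightning.plugins
# from pytorch_lightning.plugins import DDPPlugin
# trainer = Trainer(gpus=2, plugins=DDPPlugin(find_unused_parameters=False))

# After (Lightning >= 1.6): the same behaviour via the strategies package
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DDPStrategy

trainer = Trainer(
    accelerator="gpu",
    devices=2,
    strategy=DDPStrategy(find_unused_parameters=False),
)
```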
Motivation
As we are releasing an updated, stable API for Strategy (formerly known as `TrainingTypePlugin`) with Lightning 1.6, we want to notify and support the community through these changes. A smooth transition ensures that early adopters can benefit from the improvements the new API brings. Since the API of some strategies has changed dramatically, we can offer some help.
Candidates
Additional context
#9932 #10416
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.
cc @justusschock @awaelchli @rohitgr7 @akihironitta