[Docs] Update project names and links in awesome-transformers #28878

Merged 1 commit on Feb 6, 2024
10 changes: 5 additions & 5 deletions awesome-transformers.md
@@ -21,7 +21,7 @@ This repository contains examples and best practices for building recommendation

Keywords: Recommender systems, AzureML

-## [lama-cleaner](https://github.com/Sanster/lama-cleaner)
+## [IOPaint](https://github.com/Sanster/IOPaint)

Image inpainting tool powered by Stable Diffusion. Remove unwanted objects, defects, or people from your pictures, or erase and replace anything in them.

@@ -105,9 +105,9 @@ An open-source Implementation of Imagen, Google's closed-source Text-to-Image Ne

Keywords: Imagen, Text-to-image

-## [adapter-transformers](https://github.com/adapter-hub/adapter-transformers)
+## [adapters](https://github.com/adapter-hub/adapters)

-[adapter-transformers](https://github.com/adapter-hub/adapter-transformers) is an extension of Hugging Face's Transformers library that integrates adapters into state-of-the-art language models via AdapterHub, a central repository for pre-trained adapter modules. It is a drop-in replacement for transformers, regularly updated to keep pace with Transformers development.
+[adapters](https://github.com/adapter-hub/adapters) is an extension of Hugging Face's Transformers library that integrates adapters into state-of-the-art language models via AdapterHub, a central repository for pre-trained adapter modules. It is a drop-in replacement for transformers, regularly updated to keep pace with Transformers development.

Keywords: Adapters, LoRA, Parameter-efficient fine-tuning, Hub

@@ -601,9 +601,9 @@ All Hugging Face models and pipelines can be seamlessly integrated into BentoML

Keywords: BentoML, Framework, Deployment, AI Applications

-## [LLaMA-Efficient-Tuning](https://github.com/hiyouga/LLaMA-Efficient-Tuning)
+## [LLaMA Factory](https://github.com/hiyouga/LLaMA-Factory)

-[LLaMA-Efficient-Tuning](https://github.com/hiyouga/LLaMA-Efficient-Tuning) offers a user-friendly fine-tuning framework that incorporates PEFT. The repository includes training (fine-tuning) and inference examples for LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, and other LLMs. A ChatGLM version is also available in [ChatGLM-Efficient-Tuning](https://github.com/hiyouga/ChatGLM-Efficient-Tuning).
+[LLaMA Factory](https://github.com/hiyouga/LLaMA-Factory) offers a user-friendly fine-tuning framework that incorporates PEFT. The repository includes training (fine-tuning) and inference examples for LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, and other LLMs. A ChatGLM version is also available in [ChatGLM-Efficient-Tuning](https://github.com/hiyouga/ChatGLM-Efficient-Tuning).

Keywords: PEFT, fine-tuning, LLaMA-2, ChatGLM, Qwen