
Adapter for Dinov2 and ViT transformers #593

Open
FBehrad opened this issue Oct 16, 2023 · 1 comment
Labels
enhancement New feature or request

Comments


FBehrad commented Oct 16, 2023

Environment info

  • adapter-transformers version: ?
  • Platform: windows
  • Python version: 3.8
  • PyTorch version (GPU?): torch | 2.0.1+cu117 | 2.1.0
  • peft: 0.5.0

Details

Hello,
How can we use the provided library for ViT and dinov2?
I checked the documentation and found this page. However, I don't know how to use it in my code.
Also, as Dinov2 is a state-of-the-art model with promising performance on many tasks, it would be wonderful to have an adapter for it.

FBehrad added the question (Further information is requested) label on Oct 16, 2023
hSterz (Member) commented Oct 20, 2023

Hey @FBehrad, ViT is supported: you can use the ViTAdapterModel, which you can load with from_pretrained just as you would with transformers. The model provides all the adapter functionality, such as adding, activating, and training adapters. You can also use the transformers model classes, as they provide the adapter functionality as well. To get started, it might be helpful to check out the quickstart in the documentation and the example notebooks.
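
A minimal sketch of that flow might look like the following (assuming the adapter-transformers fork, where ViTAdapterModel and its head/adapter methods are exposed through transformers; the checkpoint name, adapter name, and num_labels below are just placeholders):

```python
from transformers import ViTAdapterModel

# Load a pretrained ViT checkpoint exactly as with plain transformers.
model = ViTAdapterModel.from_pretrained("google/vit-base-patch16-224-in21k")

# Add a bottleneck adapter and a matching image classification head
# for a hypothetical downstream task.
model.add_adapter("my_task")
model.add_image_classification_head("my_task", num_labels=10)

# Freeze the ViT backbone and activate the adapter so that only the
# adapter (and head) weights are updated during training.
model.train_adapter("my_task")
```

From there you can train with your usual training loop (or the library's AdapterTrainer); only the adapter and head parameters receive gradients.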

But we currently don't support dinov2. I will change the label of this issue to enhancement and leave this open as a feature request.

hSterz added the enhancement (New feature or request) label and removed the question (Further information is requested) label on Oct 20, 2023