

Adding Phi3 support in BetterTransformer (to use the microsoft/phi-4 model) #2171

Open
majdabd opened this issue Jan 26, 2025 · 1 comment
Comments


majdabd commented Jan 26, 2025

Feature request

Hello,

Is it possible to add the Phi3 architecture to the list of models supported by BetterTransformer?

Motivation

None

Your contribution

None

@IlyasMoutawwakil (Member) commented

Phi3 supports the SDPA attention implementation in transformers; see https://huggingface.co/docs/transformers/v4.48.0/en/model_doc/siglip#using-scaled-dot-product-attention-sdpa
Please let us know if this answers your request.
