Add Phi-3-mini adapter #145

Merged
merged 2 commits into main from pashmina/phi-3 on May 13, 2024

Conversation

@pashminacameron (Contributor) commented on May 12, 2024

Adds Phi-3-mini support. This requires updating transformers to a near-latest git revision, because there has not yet been a release of the transformers package since Phi-3 support was added.
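For orientation only (this snippet is not part of the PR), loading Phi-3-mini through the updated transformers might look like the sketch below; the microsoft/Phi-3-mini-4k-instruct checkpoint name is an assumption, since the PR only notes that the 128k variant was not tried.

```python
# Illustrative sketch, not from this PR: a quick check that the pinned
# transformers revision can load and run Phi-3-mini natively.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "microsoft/Phi-3-mini-4k-instruct"  # assumed checkpoint; the 128k variant is not covered here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")

prompt = "Slicing large language models"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```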

Dependencies:

- A commit from Friday in the transformers package has an issue, so transformers is pinned to the commit just before it (a sketch of what such a pin might look like follows this list). Once the issue is fixed, we can move to the latest git commit of transformers, and once there is a release, we can point to the released package.
- peft is also updated, to peft==0.6.2.
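For illustration, a git-commit pin of transformers in pyproject.toml could look roughly like the sketch below. The layout is an assumption, not the actual change in this PR, and the SHA is a placeholder for the commit just before the broken one.

```toml
# Hypothetical pyproject.toml fragment; <commit-sha> is a placeholder,
# not the real commit this PR pins to.
[project]
dependencies = [
    "transformers @ git+https://github.com/huggingface/transformers.git@<commit-sha>",
    "peft==0.6.2",
]
```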

I have checked slicing and fine-tuning for Phi-2 and run the tests; we get nearly the same results as the paper:

model: Phi-2
piqa: original 79.3, sliced@25%: 74.5, recovery fine-tuned: 74.4
ppl on alpaca: original 2.98, sliced@25%: 3.22, recovery fine-tuned: 2.99

We can look at renaming the adapters (llama -> llama2 and Phi-3 -> Phi-3-mini) in a separate change. I have not tried the Phi-3-mini-128k variant; we will need to figure out how to handle that as well if we want to support it.

@nailimixaM (Collaborator) left a comment:

Awesome!

pashminacameron merged commit 1a437b8 into main on May 13, 2024
2 checks passed
pashminacameron deleted the pashmina/phi-3 branch on May 16, 2024 at 20:29
nailimixaM added a commit that referenced this pull request on Jun 18, 2024
* Update dependencies (#144)

* Separate out dependencies for experiments

* Raise peft version

* Bump transformers

* Bump datasets

* Update README

* Rollback peft to 0.6.0 and make it optional

* Use a task metric map in lm_eval runner (#146)

* Add Phi-3-mini adapter (#145)

* Add Phi-3 adapter

* Removed cast. Aligned type with base class.

* Add support for llama3 adapter (#147)

* Update transformers to 4.41.0 (#150)

* Update transformers to latest

* Spaces

* Point to bug fix commit we want to pick

* Update pyproject.toml

* update README

* update

---------

Co-authored-by: Dmitry Kats <dmitrykats@microsoft.com>
Co-authored-by: Pashmina Cameron <pcameron@microsoft.com>
Co-authored-by: radhikamp99 <47057131+radhikamp99@users.noreply.github.com>