PhiMoE #33363 (Merged)

ArthurZucker merged 31 commits into huggingface:main from garg-amit:gargamit/onboard_phi3_5_moe on Oct 4, 2024.

Commits (31):
- 6ad863a onboard phimoe model (garg-amit)
- 1a1e547 removed debug code (garg-amit)
- 08d73d7 added unit tests (garg-amit)
- 1277bc8 updated docs (garg-amit)
- 232588d formatted (garg-amit)
- 8462783 fixed unit tests (garg-amit)
- 3668c5d fixed test case (garg-amit)
- e6ed8dc fixed format (garg-amit)
- 89f51ea refactored code (garg-amit)
- e552e33 fixed expected outputs in the integration tests (garg-amit)
- c8173d7 Added a warning msg (garg-amit)
- 290514e Merge branch 'main' into gargamit/onboard_phi3_5_moe (garg-amit)
- 5dda7d6 Merge branch 'main' into gargamit/onboard_phi3_5_moe (garg-amit)
- 11a0f17 Merge branch 'gargamit/onboard_phi3_5_moe' of https://github.com/garg… (garg-amit)
- dd02bf9 Merge branch 'main' of https://github.com/garg-amit/transformers into… (garg-amit)
- 43f9cc9 Addressed comments (garg-amit)
- b3f8af5 Merge branch 'main' of https://github.com/garg-amit/transformers into… (garg-amit)
- b6acc3e Addressed comments (garg-amit)
- 33caa63 fixed test cases (garg-amit)
- e01a78e added paper link (garg-amit)
- d1f847e Addressed comments (garg-amit)
- dd8b8b0 Refactored PhimoeForCausalLM forward fn (garg-amit)
- 42b59c6 Refactored PhimoeRotaryEmbedding class (garg-amit)
- e4e2f1a fixed test cases (garg-amit)
- 8359a59 fixed testcase (garg-amit)
- bebad97 fixed test case (garg-amit)
- 4c28bd5 Merge branch 'main' of https://github.com/garg-amit/transformers into… (garg-amit)
- 1311e80 Addressed comments (garg-amit)
- 2671887 fixed test cases (garg-amit)
- 18830f5 fixed testcases (garg-amit)
- 36891bf Used cache position instead to get the seq len (garg-amit)
New file: @@ -0,0 +1,118 @@
<!--Copyright 2024 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.

⚠️ Note that this file is in Markdown but contains specific syntax for our doc-builder (similar to MDX) that may not be
rendered properly in your Markdown viewer.

-->
# PhiMoE

## Overview

The PhiMoE model was proposed in [Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone](https://arxiv.org/abs/2404.14219) by Microsoft.

### Summary

The abstract from the Phi-3 paper is the following:

We introduce phi-3-mini, a 3.8 billion parameter language model trained on 3.3 trillion tokens, whose overall performance, as measured by both academic benchmarks and internal testing, rivals that of models such as Mixtral 8x7B and GPT-3.5 (e.g., phi-3-mini achieves 69% on MMLU and 8.38 on MT-bench), despite being small enough to be deployed on a phone. Our training dataset is a scaled-up version of the one used for phi-2, composed of heavily filtered publicly available web data and synthetic data. The model is also further aligned for robustness, safety, and chat format. We also provide parameter-scaling results with a 7B, 14B models trained for 4.8T tokens, called phi-3-small, phi-3-medium, both significantly more capable than phi-3-mini (e.g., respectively 75%, 78% on MMLU, and 8.7, 8.9 on MT-bench). To enhance multilingual, multimodal, and long-context capabilities, we introduce three models in the phi-3.5 series: phi-3.5-mini, phi-3.5-MoE, and phi-3.5-Vision. The phi-3.5-MoE, a 16 x 3.8B MoE model with 6.6 billion active parameters, achieves superior performance in language reasoning, math, and code tasks compared to other open-source models of similar scale, such as Llama 3.1 and the Mixtral series, and on par with Gemini-1.5-Flash and GPT-4o-mini. Meanwhile, phi-3.5-Vision, a 4.2 billion parameter model derived from phi-3.5-mini, excels in reasoning tasks and is adept at handling both single-image and text prompts, as well as multi-image and text prompts.

The original code for PhiMoE can be found [here](https://huggingface.co/microsoft/Phi-3.5-MoE-instruct).
## Usage tips

- This model is very similar to `Mixtral`; the main difference is [`Phi3LongRoPEScaledRotaryEmbedding`], which is used to extend the context of the rotary embeddings. The query, key and value projections are fused, and the MLP's up and gate projection layers are also fused (see the configuration sketch right after this list).
- The tokenizer used for this model is identical to the [`LlamaTokenizer`], with the exception of additional tokens.
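
As a quick way to see the LongRoPE-related settings the released checkpoint ships with, the configuration can be inspected directly. This is only an exploratory sketch; the exact keys inside `rope_scaling` (and whether `original_max_position_embeddings` is set) depend on the checkpoint, so the last attribute access below is guarded.

```python
from transformers import AutoConfig

# Load only the configuration of the released checkpoint.
config = AutoConfig.from_pretrained("microsoft/Phi-3.5-MoE-instruct", trust_remote_code=True)

# rope_scaling holds the LongRoPE parameters used to extend the rotary embeddings.
print(config.rope_scaling)
print(config.max_position_embeddings)
print(getattr(config, "original_max_position_embeddings", None))
```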

## How to use PhiMoE

<Tip warning={true}>

Phi-3.5-MoE-instruct has been integrated into the development version (4.44.2.dev) of `transformers`. Until the official version is released through `pip`, ensure that you are doing the following:
* When loading the model, ensure that `trust_remote_code=True` is passed as an argument of the `from_pretrained()` function.

The current `transformers` version can be verified with: `pip list | grep transformers`.
Examples of required packages:
```
flash_attn==2.5.8
torch==2.3.1
accelerate==0.31.0
transformers==4.43.0
```
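
The installed version can also be checked from Python (a trivial check, not specific to PhiMoE):

```python
import transformers

# Print the version of the currently installed transformers package.
print(transformers.__version__)
```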

</Tip>

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

torch.random.manual_seed(0)

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3.5-MoE-instruct",
    device_map="cuda",
    torch_dtype="auto",
    trust_remote_code=True,
)

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3.5-MoE-instruct")

messages = [
    {"role": "system", "content": "You are a helpful AI assistant."},
    {"role": "user", "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"},
    {"role": "assistant", "content": "Sure! Here are some ways to eat bananas and dragonfruits together: 1. Banana and dragonfruit smoothie: Blend bananas and dragonfruits together with some milk and honey. 2. Banana and dragonfruit salad: Mix sliced bananas and dragonfruits together with some lemon juice and honey."},
    {"role": "user", "content": "What about solving an 2x + 3 = 7 equation?"},
]

pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

generation_args = {
    "max_new_tokens": 500,
    "return_full_text": False,
    "temperature": 0.0,
    "do_sample": False,
}

output = pipe(messages, **generation_args)
print(output[0]['generated_text'])
```
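
The same conversation can also be run without the `pipeline` helper. The sketch below reuses `model`, `tokenizer`, and `messages` from the example above and relies on the checkpoint's default chat template; it is an illustration rather than the canonical usage.

```python
# Tokenize the chat with the model's chat template and generate directly.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=500, do_sample=False)

# Strip the prompt tokens and decode only the newly generated text.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```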

## PhimoeConfig

[[autodoc]] PhimoeConfig
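
A minimal sketch of the usual configuration/model round trip (shown only to illustrate the relationship; all default values come from `PhimoeConfig` itself and are not spelled out here):

```python
from transformers import PhimoeConfig, PhimoeModel

# Initializing a PhiMoE-style configuration with the library defaults.
configuration = PhimoeConfig()

# Initializing a model (with random weights) from that configuration.
model = PhimoeModel(configuration)

# Accessing the model configuration.
configuration = model.config
```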
<frameworkcontent>
<pt>

## PhimoeModel

[[autodoc]] PhimoeModel
    - forward

## PhimoeForCausalLM

[[autodoc]] PhimoeForCausalLM
    - forward
    - generate

## PhimoeForSequenceClassification

[[autodoc]] PhimoeForSequenceClassification
    - forward
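
A minimal sketch of using the sequence-classification head. This is only a fine-tuning starting point: the classification head is newly initialized when loading the base checkpoint, and `num_labels=2` and the example sentence are assumptions made for illustration.

```python
import torch
from transformers import AutoTokenizer, PhimoeForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3.5-MoE-instruct")
model = PhimoeForSequenceClassification.from_pretrained(
    "microsoft/Phi-3.5-MoE-instruct", num_labels=2, torch_dtype="auto"
)

inputs = tokenizer("Bananas pair well with dragonfruit.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch_size, num_labels)
print(logits)
```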

</pt>
</frameworkcontent>
Changed file: @@ -190,6 +190,7 @@
     persimmon,
     phi,
     phi3,
+    phimoe,
     phobert,
     pix2struct,
     pixtral,
New file: @@ -0,0 +1,28 @@
# Copyright 2024 Microsoft and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from typing import TYPE_CHECKING

from ...utils import _LazyModule
from ...utils.import_utils import define_import_structure


if TYPE_CHECKING:
    from .configuration_phimoe import *
    from .modeling_phimoe import *
else:
    import sys

    _file = globals()["__file__"]
    sys.modules[__name__] = _LazyModule(__name__, _file, define_import_structure(_file), module_spec=__spec__)
Review comment: you no longer need this complicated structure! See the `__init__` for Albert, for example. You just need to define `__all__` in the modeling and config files and that's it.
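
For reference, a sketch of what this suggests, using the class names from this PR (the exact contents of `__all__` are up to the implementation):

```python
# configuration_phimoe.py
__all__ = ["PhimoeConfig"]

# modeling_phimoe.py
__all__ = ["PhimoeModel", "PhimoeForCausalLM", "PhimoeForSequenceClassification"]
```

With those defined, the lazy `define_import_structure(_file)` call in the package `__init__.py` (shown in the diff above) is meant to pick up the public objects automatically.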