forked from mlc-ai/mlc-llm
BYO-FT support, with some LoRA support #224
Closed
Commits (33)
All 33 commits are authored by Lunderberg.

b8b9280  [PR-1685][ParamManager] Preserve variable names in transform_dequantize
29c49fc  [PR-1686][ParamManager] Simplify get_param_loading_functions signature
1ded3c2  [PR-1687][ParamManager] Separate get_param_loading_functions for get/set
8573cc4  [PR-1756][Transform] Add check for function.attrs
18a6d1c  [PR-1757][Transform] Handle non-schedulable func in LiftTIRGlobalBuff…
76cb787  [PR-1758][Models] Define sharding strategy when combine_matmul=False
de2b289  [PR-1760][Bugfix] Remove mutation of IRModule in ReorderTransformFunc
aadff29  [PR-1851][Bugfix] Handle model names with multiple path components
f1fde2a  [PR-1852][Build] Replace mod_transform_before_build with IRModule pass
1db8a2b  [PR-1855][Utils][Transform] Added SetEntryFuncs transform
c136c11  [PR-1856][Build] Update transform_params_for_each_rank to IRModule pass
ec531cf  [PR-1857][Utils] Allow ReorderTransformFunc to be used without param …
f5330d4  [Utils][Bugfix] Provide full path to shutil.copy in copy_tokenizer
035c915  [Model] Update Mixtral to have well-formed TIR
1069f69  Apply black auto-format
7e826c1  [BYO-FT] Generate a `transform_params` function in compiled module
ec048b0  [BYO-FT] Set combine_matmul=False for llama.py, VLLM-llama, mistral
de36ed9  [BYO-FT] Support execution of transform_params during initialization
ad51336  [Llama] Produce well-formed TIR for PagedAttention
03082e8  [Debug] Output `original_params` directory
a4376c3  [Debug] Implement validate_transform_params
234e777  [Debug] Add verify_well_formed calls
6f4aad1  [Debug] Print optimized model
71e5e93  [Debug] Added assert in ParamManager indicating required call sequence
febc355  [Debug] Add LOG.debug statements for safetensor loading
8b51544  [LoRA] Add --lora=path/to/lora.safetensors argument
4e09fa7  [LoRA] Implement utility functions to get the rank of each LoRA
2b62e5a  [LoRA] Assert that `num_input` is present instead of redefining it
d202e7b  [LoRA] Implement optimization passes for LoRA models
c15ed34  [LoRA] Add transforms to inject/optimize LoRA
4741afb  Handle bfloat16 -> float16 conversions for any dimension of tensor
677fe40  Normalize from (bfloat16 | float16) to float16
1b5620e  Black auto-format
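The [LoRA] commits above inject a low-rank update alongside each base matmul. As a minimal numpy sketch of the injected computation (illustrative only; the function name, shapes, and `scaling` parameter are assumptions, not code from this PR):

```python
import numpy as np

def lora_linear(x, w, lora_a, lora_b, scaling=1.0):
    """Linear layer with a LoRA low-rank update.

    Computes x @ (W + scaling * B @ A).T as x @ W.T + scaling * (x @ A.T) @ B.T,
    keeping the frozen base weight W untouched and the update rank-limited.

    x:      (batch, in_features)
    w:      (out_features, in_features)  frozen base weight
    lora_a: (rank, in_features)          trained low-rank factor A
    lora_b: (out_features, rank)         trained low-rank factor B
    """
    base = x @ w.T
    update = (x @ lora_a.T) @ lora_b.T  # two small matmuls; never materializes B @ A
    return base + scaling * update

# Illustrative usage with random rank-4 tensors, in float16 to mirror the
# (bfloat16 | float16) -> float16 normalization commits above.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 16)).astype(np.float16)
w = rng.standard_normal((32, 16)).astype(np.float16)
a = rng.standard_normal((4, 16)).astype(np.float16)
b = rng.standard_normal((32, 4)).astype(np.float16)
assert lora_linear(x, w, a, b, scaling=0.5).shape == (2, 32)
```

Keeping the update in factored form is what makes swapping in new LoRA weights (for example via `--lora=path/to/lora.safetensors`) cheap: the base parameters are never rewritten.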
Conversations

> Is `CombineParallelMatmul` with LoRA supported now?

Unfortunately, no. It will require improvements to `LiftTransformParams`, in order to lift out a parameter transformation that can be used for every function in an `IRModule`.
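To illustrate why (a hand-written numpy sketch, not code from this PR; all names and shapes are assumptions): `CombineParallelMatmul` fuses parallel projections such as Q/K/V into a single matmul over a concatenated weight, so the per-projection LoRA factors need a matching restructuring, with the A factors stacked and the B factors placed block-diagonally. That restructuring is the parameter transformation that would have to be lifted so it applies to every function in the `IRModule`.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 16))
weights = [rng.standard_normal((8, 16)) for _ in range(3)]   # e.g. Q, K, V projections
loras = [(rng.standard_normal((4, 16)),                      # A_i: (rank, in_features)
          rng.standard_normal((8, 4))) for _ in range(3)]    # B_i: (out_features, rank)

# CombineParallelMatmul-style fusion: one matmul over the concatenated base weights.
w_combined = np.concatenate(weights, axis=0)                 # (24, 16)

# The LoRA factors must be transformed to match: stack the A factors, and put
# each B factor on the block diagonal so the projections' updates stay independent.
a_combined = np.concatenate([a for a, _ in loras], axis=0)   # (12, 16)
b_combined = np.zeros((24, 12))
for i, (_, b) in enumerate(loras):
    b_combined[8 * i:8 * (i + 1), 4 * i:4 * (i + 1)] = b

fused = x @ (w_combined + b_combined @ a_combined).T
separate = np.concatenate(
    [x @ (w + b @ a).T for w, (a, b) in zip(weights, loras)], axis=1
)
assert np.allclose(fused, separate)  # fused path matches the three separate matmuls
```

Because this weight restructuring depends only on the parameters, not on the activations, it belongs in a lifted `transform_params`-style function shared by every entry point, which is the improvement the reply refers to.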