CI: https://gitlab.com/coremltools1/coremltools/-/pipelines/1413983249 ✅
Release Notes

- Dependency updates:
  - `protobuf` python package: improves serialization latency.
  - `numpy` 2.0.
  - `scikit-learn` 1.5.
- New utilities:
  - `coremltools.models.utils.bisect_model` can break a large Core ML model into two smaller models with similar sizes.
  - `coremltools.models.utils.materialize_dynamic_shape_mlmodel` can convert a flexible input shape model into a static input shape model.
- `coremltools.optimize.coreml`:
  - With `cluster_dim > 1` in `coremltools.optimize.coreml.OpPalettizerConfig`, you can do vector palettization, where each entry in the lookup table is a vector of length `cluster_dim`.
  - With `enable_per_channel_scale=True` in `coremltools.optimize.coreml.OpPalettizerConfig`, weights are normalized along the output channel using per-channel scales before being palettized.
- `coremltools.optimize.torch`:
  - Conversion support for models compressed with `coremltools.optimize.torch`.
  - Updates to `SKMPalettizer`.
  - Updates to `PostTrainingPalettizer` and `DKMPalettizer`.
  - Deprecated the `cluter_dtype` option in favor of `lut_dtype` in `ModuleDKMPalettizerConfig`.
  - Support for `ConvTranspose` modules with `PostTrainingQuantizer` and `LinearQuantizer`.
  - Updates to `GPTQ`.
  - Support for the `Conv2D` layer with per-block quantization in `GPTQ`.
  - Updates to the `QAT` APIs.
- `torch.export` conversion support for `clip`.
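
To make the vector palettization note concrete: with `cluster_dim > 1`, groups of `cluster_dim` consecutive weights are mapped to one lookup-table entry that is itself a vector, instead of each scalar weight being mapped to a scalar entry. Below is a numpy-only sketch of that idea; it is not the coremltools implementation, and the variable names and the toy LUT-seeding step are made up for illustration (real palettization would fit the LUT with k-means).

```python
import numpy as np

rng = np.random.default_rng(0)
cluster_dim = 4   # length of each lookup-table entry (vector palettization)
n_clusters = 16   # LUT size, e.g. 4-bit indices

W = rng.standard_normal((8, 32)).astype(np.float32)

# Group consecutive weights into vectors of length cluster_dim.
vectors = W.reshape(-1, cluster_dim)

# Toy LUT: seed with a random sample of the vectors themselves
# (a real implementation would run k-means here).
lut = vectors[rng.choice(len(vectors), n_clusters, replace=False)]

# Assign each vector to its nearest LUT entry (squared Euclidean distance).
dists = ((vectors[:, None, :] - lut[None, :, :]) ** 2).sum(axis=-1)
indices = dists.argmin(axis=1)  # only these small indices are stored

# Decompression: look the vectors back up and restore the original shape.
W_hat = lut[indices].reshape(W.shape)
```

The storage win is the same as scalar palettization (indices plus a LUT), but each index now covers `cluster_dim` weights, which can preserve local correlations between neighboring weights.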
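
The `enable_per_channel_scale=True` note can likewise be sketched in plain numpy. This is only an illustration of the normalization step, not the coremltools code: each output channel (row) is divided by its own scale so that all channels share one value range before palettization, and the scale is multiplied back in at decompression time.

```python
import numpy as np

rng = np.random.default_rng(1)
# Rows (output channels) with deliberately different magnitudes.
W = rng.standard_normal((8, 32)).astype(np.float32)
W *= rng.uniform(0.1, 10.0, (8, 1)).astype(np.float32)

# One scale per output channel, here the channel's max absolute weight.
scale = np.abs(W).max(axis=1, keepdims=True)

# Normalize: every row now lies in [-1, 1], so a single LUT fits all
# channels without large channels dominating the clustering.
W_norm = W / scale

# ... W_norm would be palettized here ...

# Decompression multiplies the per-channel scale back in.
W_back = W_norm * scale
```

Without this step, channels with small weights would be poorly represented by a LUT fitted mostly to the large-magnitude channels.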
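
For the per-block quantization item: instead of one quantization scale per tensor or per output channel, each contiguous block of input-channel weights gets its own scale, which tracks local weight magnitude more closely. A minimal numpy sketch of symmetric int4 block quantization, with illustrative names and block size (not the `GPTQ` implementation itself):

```python
import numpy as np

rng = np.random.default_rng(2)
block_size = 32
W = rng.standard_normal((16, 64)).astype(np.float32)

# Split each row into blocks of `block_size` weights.
blocks = W.reshape(W.shape[0], -1, block_size)       # (out, n_blocks, block)

# One scale per block, mapping the block's max |w| to the int4 range [-8, 7].
scale = np.abs(blocks).max(axis=2, keepdims=True) / 7.0

# Quantize to 4-bit integers (stored), then dequantize to check the error.
q = np.clip(np.round(blocks / scale), -8, 7).astype(np.int8)
W_hat = (q * scale).reshape(W.shape)
```

Smaller blocks mean more scales to store but a tighter fit to local weight statistics; the per-element error is bounded by half of the block's scale.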