An example of using CoreML from C++ / Python for the Whisper encoder
1. (Done) generate encoder_{modelSize}_fp16.mlmodel with convert_whisper_encoder_to_coreml.py
   (a rough sketch of such a conversion is shown after this list)
2. (Done) move the model to coreml/CoremlEncoder.mlmodel
3. (Done) generate mlmodelc/ and its Objective-C sources in the coreml folder with
   xcrun coremlc compile CoremlEncoder.mlmodel .
   xcrun coremlc generate CoremlEncoder.mlmodel .
4. build and run the C++ example
   make
   ./main
   (it should produce the same result as the PyTorch output)
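For reference, a conversion along the lines of step 1 can be done with coremltools on a traced PyTorch encoder. The sketch below only illustrates that approach and is not the contents of convert_whisper_encoder_to_coreml.py; the input name logmel_data, the (1, 80, 3000) mel shape, and the output filename are assumptions for the Tiny model.

    # Hypothetical sketch: trace the Whisper Tiny encoder and convert it to a CoreML .mlmodel.
    # Assumes the openai-whisper and coremltools packages are installed.
    import torch
    import whisper
    import coremltools as ct
    from coremltools.models.neural_network import quantization_utils

    model = whisper.load_model("tiny")      # Tiny model, as in the note at the end
    encoder = model.encoder.eval()

    # Dummy log-mel input: 80 mel bins x 3000 frames (assumed Tiny encoder input shape)
    mel = torch.randn(1, 80, 3000)
    traced = torch.jit.trace(encoder, mel)

    mlmodel = ct.convert(
        traced,
        convert_to="neuralnetwork",         # .mlmodel format, as used in this example
        inputs=[ct.TensorType(name="logmel_data", shape=(1, 80, 3000))],
    )
    # fp16 weight quantization, matching the _fp16 suffix (requires macOS)
    mlmodel_fp16 = quantization_utils.quantize_weights(mlmodel, nbits=16)
    mlmodel_fp16.save("encoder_tiny_fp16.mlmodel")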
After steps 1-4, run python predict_with_coreml.py.
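A minimal check in the spirit of predict_with_coreml.py could look roughly like the sketch below: run the same dummy mel through the PyTorch encoder and through the CoreML model, then compare the outputs. The actual script may differ; the input name logmel_data and the model path are assumptions.

    # Hypothetical sketch of comparing CoreML output against the PyTorch encoder
    # (CoreML prediction from Python requires macOS).
    import numpy as np
    import torch
    import whisper
    import coremltools as ct

    model = whisper.load_model("tiny")
    encoder = model.encoder.eval()

    mel = torch.randn(1, 80, 3000)
    with torch.no_grad():
        ref = encoder(mel).numpy()          # PyTorch reference output

    mlmodel = ct.models.MLModel("coreml/CoremlEncoder.mlmodel")
    out = mlmodel.predict({"logmel_data": mel.numpy()})
    coreml_out = list(out.values())[0]      # single output tensor assumed

    print("max abs diff vs PyTorch:", np.abs(ref - coreml_out).max())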
Note: the coreml/ folder currently contains the results of steps 1-3 for the Whisper Tiny model.