
WIP: Support exporting pretrained models to mace/ncnn and others via pnnx #527

Closed
Wanted to merge 9 commits.

Conversation

csukuangfj (Collaborator) commented on Aug 9, 2022

The aim of this PR is to support exporting pretrained models so that they can be used by other inference frameworks, such as mace and ncnn,

and possibly others like tnn, mnn, etc.

We can also use the quantization support of those frameworks, e.g., the full int8 quantization support in ncnn.

As the first step, I am using pnnx as the intermediate format.


It requires
csukuangfj/ncnn#1
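A minimal sketch of the intended export path, assuming a pnnx-based workflow: trace the model to TorchScript, then run the `pnnx` command-line tool on the saved file. `TinyEncoder` is a placeholder standing in for the actual pretrained model; the input shape and file names are illustrative only.

```python
# Hypothetical sketch of exporting a model for pnnx/ncnn.
# TinyEncoder is a stand-in for the real pretrained model.
import torch


class TinyEncoder(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(80, 256)

    def forward(self, x):
        # x: (batch, num_frames, num_features)
        return torch.relu(self.linear(x))


model = TinyEncoder()
model.eval()

# Trace with a dummy input so pnnx gets a concrete graph.
x = torch.rand(1, 100, 80)
traced = torch.jit.trace(model, x)
traced.save("encoder.pt")

# Then, outside Python, convert the TorchScript file with pnnx, e.g.:
#   pnnx encoder.pt inputshape=[1,100,80]
# which emits encoder.pnnx.param/.bin (pnnx format) and
# encoder.ncnn.param/.bin (ncnn format).
```

The same `.ncnn.param`/`.ncnn.bin` pair can then be loaded by ncnn for inference or fed to its int8 quantization tooling.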

yaozengwei (Collaborator)

In export.py, if we only save model.state_dict(), we don't need to call convert_scaled_to_non_scaled(model, inplace=True).

csukuangfj (Collaborator, Author)

> In export.py, if we only save model.state_dict(), we don't need to call convert_scaled_to_non_scaled(model, inplace=True).

Yes, you are right.
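A short sketch of why this holds, using a plain `torch.nn.Linear` as a placeholder model: `state_dict()` stores only parameter tensors keyed by name, and the module structure is rebuilt from source code at load time, so a structural rewrite such as `convert_scaled_to_non_scaled` only matters when exporting the executed graph (e.g. via `torch.jit.trace`), not when checkpointing weights.

```python
# Saving a state_dict stores tensors only; the module classes are
# reconstructed from code when loading, so no structural conversion
# of the model is needed for this path.
import torch

model = torch.nn.Linear(4, 2)
torch.save(model.state_dict(), "model.pt")

# Rebuild the module from code, then restore the saved tensors.
restored = torch.nn.Linear(4, 2)
restored.load_state_dict(torch.load("model.pt"))

assert torch.equal(model.weight, restored.weight)
assert torch.equal(model.bias, restored.bias)
```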

csukuangfj (Collaborator, Author)

Closing via #571

csukuangfj closed this on Sep 19, 2022