
Suggestion - Integrate MobileSAM into the pipeline for lightweight and faster inference #1

Open
mdimtiazh opened this issue Jun 28, 2023 · 1 comment

Comments

@mdimtiazh

Reference: https://github.com/ChaoningZhang/MobileSAM

Our project performs on par with the original SAM and keeps exactly the same pipeline as the original SAM, except for a change to the image encoder; therefore, it is easy to integrate into any project.

MobileSAM is around 60 times smaller and around 50 times faster than the original SAM, and it is around 7 times smaller and around 5 times faster than the concurrent FastSAM. The comparison of the whole pipeline is summarized as follows:

[Figures omitted: tables comparing whole-pipeline size and speed for the original SAM, FastSAM, and MobileSAM]
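For reference, a minimal sketch of what the drop-in swap could look like. It assumes the MobileSAM package from the linked repo exposes the same `sam_model_registry` / `SamPredictor` interface as segment-anything (with a `"vit_t"` model type and a `mobile_sam.pt` checkpoint, as its README describes); the checkpoint path, image file, and point prompt below are illustrative only.

```python
# Minimal sketch (untested): swapping MobileSAM in for the original SAM.
import cv2
import numpy as np
import torch
from mobile_sam import sam_model_registry, SamPredictor  # assumed to mirror segment-anything

device = "cuda" if torch.cuda.is_available() else "cpu"

# Build MobileSAM; only the image encoder differs from the original SAM.
mobile_sam = sam_model_registry["vit_t"](checkpoint="./weights/mobile_sam.pt")
mobile_sam.to(device=device)
mobile_sam.eval()

# The predictor interface is unchanged, so existing prompt-based code should work as-is.
predictor = SamPredictor(mobile_sam)
image = cv2.cvtColor(cv2.imread("example.jpg"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# Example: a single foreground point prompt (x, y).
point = np.array([[500, 375]])
label = np.array([1])
masks, scores, logits = predictor.predict(
    point_coords=point,
    point_labels=label,
    multimask_output=True,
)
```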

Best Wishes,

Qiao

@LWHYC
Collaborator

LWHYC commented Jun 28, 2023

Hi Qiao,

Thank you for your interest in our project and the detailed comparison with MobileSAM. We appreciate your suggestion and acknowledge the impressive performance of MobileSAM.

We are planning to conduct more comprehensive comparisons, including MobileSAM, in our future work. This will give us a better understanding of how our project stacks up against other state-of-the-art solutions.

Thanks again for bringing this to our attention. Please stay tuned for our future updates.

Best regards,
Wenhui

@LWHYC LWHYC closed this as completed Jun 28, 2023
@LWHYC LWHYC reopened this Jul 19, 2023