Most performant way of running Llama inference on Mac using ExecuTorch? #8571

Unanswered
manuelcandales asked this question in Q&A
Replies: 3 comments 5 replies

Comment 1 — 0 replies

Comment 2 — 1 reply
@manuelcandales (Collaborator, Author) · Feb 19, 2025

Comment 3 — 4 replies
@manuelcandales (Collaborator, Author) · Feb 20, 2025
@digantdesai
@kimishpatel
@metascroy
Category: Q&A
Labels: None yet
4 participants