
Portkey for Self Hosted LLMs in a VM? #300

Closed
AIAnytime opened this issue Apr 12, 2024 · 2 comments
Labels
bug Something isn't working triage

Comments

@AIAnytime

What Happened?

Is Portkey compatible with a self-hosted LLM running on a VM? I can't use the Mistral API from Mistral Cloud or anywhere else.

What Should Have Happened?

No response

Relevant Code Snippet

No response

Your Twitter/LinkedIn

No response

@AIAnytime AIAnytime added the bug Something isn't working label Apr 12, 2024
@vrushankportkey (Collaborator)

Hi @AIAnytime, yes, you can use Portkey with a self-hosted LLM. Check out this doc: https://portkey.ai/docs/welcome/integration-guides/byollm
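To make the answer concrete, here is a minimal sketch of routing an OpenAI-compatible request through a Portkey gateway to a self-hosted endpoint. The gateway URL, the `x-portkey-*` header names, the VM address, and the model name are all assumptions for illustration; the BYOLLM doc linked above is the authoritative reference.

```python
"""Sketch: pointing a chat-completion request at a self-hosted LLM via a
Portkey gateway. All endpoints and header names below are hypothetical
placeholders -- substitute your own values per the Portkey docs."""
import json
import urllib.request

# Hypothetical local Portkey gateway and self-hosted VM endpoint.
GATEWAY_URL = "http://localhost:8787/v1/chat/completions"
SELF_HOSTED_LLM = "http://my-vm:8080/v1"

def build_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) a request asking the gateway to forward
    traffic to the self-hosted, OpenAI-compatible endpoint."""
    headers = {
        "Content-Type": "application/json",
        # Portkey routing headers (assumed names -- verify in the docs):
        "x-portkey-provider": "openai",           # VM speaks the OpenAI API shape
        "x-portkey-custom-host": SELF_HOSTED_LLM,  # where to forward the call
    }
    body = json.dumps({
        "model": "mistral-7b-instruct",  # whatever model the VM actually serves
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(GATEWAY_URL, data=body, headers=headers,
                                  method="POST")

if __name__ == "__main__":
    req = build_request("Hello from a self-hosted Mistral!")
    # Sending would require a running gateway and VM:
    # resp = urllib.request.urlopen(req)
    print(req.full_url)
```

The same routing can be done with Portkey's own SDKs or an OpenAI client whose base URL is the gateway; the key idea is that the gateway, not your app, holds the forwarding rule to the VM.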

@vrushankportkey (Collaborator)

@AIAnytime, closing this issue as it's resolved. Please feel free to reopen if you have any further questions!
