Ollama for Local Models #36
Hi! I could definitely look into this. Is this <https://github.com/ollama/ollama> the project you're talking about?
Yes, that's it.
Most of these projects host their own server on localhost; LM Studio does too, and lets you pick the port (not sure whether you can with Ollama). Either way, they both expose local API endpoints. Perhaps if I point the OpenAI Proxy setting at the localhost endpoint, it could work without any alterations.
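For reference, Ollama does expose an OpenAI-compatible endpoint on localhost (port 11434 by default), so an OpenAI-style client pointed at a custom base URL may indeed work unchanged. A minimal sketch using the openai Python package; the model name "llama3" is just an example and depends on what you have pulled locally:

```python
# Minimal sketch: pointing an OpenAI-style client at a local Ollama server.
# Assumes Ollama is running on its default port (11434) and that a model
# such as "llama3" has already been pulled locally.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # required by the client library, but ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",  # example model name; use whatever `ollama list` shows
    messages=[{"role": "user", "content": "Hello from a local model!"}],
)
print(response.choices[0].message.content)
```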
Major refactoring to support dynamic construction of the UI menus. This was necessary to support arbitrary model combinations installed via Ollama. Updated translations.
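For anyone curious how the menus can be built dynamically: Ollama's REST API reports which models are installed locally, so a plugin can query it at startup and create one entry per model. A rough sketch under that assumption; the /api/tags endpoint is Ollama's, while the menu-building loop below is purely illustrative:

```python
# Rough sketch: querying a local Ollama server for its installed models,
# e.g. to populate UI menus dynamically. Assumes the default port 11434.
import requests

def list_ollama_models(host="http://localhost:11434"):
    """Return the names of all models installed in the local Ollama instance."""
    tags = requests.get(f"{host}/api/tags", timeout=5).json()
    return [model["name"] for model in tags.get("models", [])]

# Hypothetical usage: one menu entry per installed model.
for name in list_ollama_models():
    print(f"Adding menu entry for {name}")
```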
There we are! Please let me know if this works for you!
Thank you!!!
Wanted to know if it's any hassle to add compatibility with Ollama so the model can be run locally.
Haven't tried, just wondering if anyone has.