Add integration for LocalAI #5256
Comments
Hey, I would like to take on this task and would be willing to contribute.
Description: This PR adds embeddings for LocalAI (https://github.com/go-skynet/LocalAI), a self-hosted OpenAI drop-in replacement. Since LocalAI can reuse OpenAI clients, the implementation mostly follows the OpenAI embeddings, except that when embedding documents it sends plain strings instead of tokens: token support is best-effort depending on the model loaded in LocalAI, and token IDs produced locally can mismatch the model, so sending strings is the safer choice here.

Partly related to: #5256

Dependencies: no new dependencies

Twitter: @mudler_it

Signed-off-by: mudler <mudler@localai.io>
Co-authored-by: Bagatur <baskaryan@gmail.com>
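The string-vs-token decision described above can be sketched as follows. This is a minimal illustration, not the actual PR code; the helper name `build_embedding_payload` is hypothetical, and the payload shape assumes the OpenAI-compatible `/v1/embeddings` request body that LocalAI accepts:

```python
from typing import List


def build_embedding_payload(texts: List[str], model: str) -> dict:
    """Build an OpenAI-compatible /v1/embeddings request body.

    Unlike the upstream OpenAI embeddings path, which may tokenize
    inputs client-side first, we pass raw strings: token IDs produced
    by a local tokenizer can mismatch the model actually loaded in
    LocalAI, so plain strings are the safer input format.
    """
    return {"model": model, "input": texts}


# Example: the body that would be POSTed to <base_url>/v1/embeddings.
payload = build_embedding_payload(["hello world"], "bert-embeddings")
```

The key point is simply that `input` carries strings, never pre-computed token ID lists.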
Hi, @mudler. I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you opened this issue as a feature request to add integration with LocalAI. You mentioned that you are not very experienced with Python and may need assistance with the implementation. Pratham-saraf has expressed interest in taking on the task and contributing to the project.

Before we close this issue, we wanted to check if it is still relevant to the latest version of the LangChain repository. If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your understanding and contribution to the LangChain project. If you have any further questions or need assistance, please let us know.
I would highly appreciate this issue being picked up again! The issue with the current state of affairs is that AzureChatOpenAI and ChatOpenAI both send requests to
We have the same issue... is there some workaround to point it at LocalAI w/o too many code changes?
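One common workaround (a sketch under assumptions, not an official fix) is to redirect the OpenAI-compatible client at a LocalAI deployment via the environment, so downstream code needs no changes. The `OPENAI_API_BASE` variable is honored by the legacy `openai` Python client and by LangChain's OpenAI wrappers; the host and port below are hypothetical and must match your LocalAI server:

```python
import os

# Hypothetical local endpoint; adjust host/port to your LocalAI deployment.
LOCALAI_BASE = "http://localhost:8080/v1"

# Point OpenAI-compatible clients at LocalAI without touching call sites.
os.environ["OPENAI_API_BASE"] = LOCALAI_BASE
os.environ["OPENAI_API_KEY"] = "not-needed"  # LocalAI ignores the key by default


def effective_base_url() -> str:
    """Resolve the base URL an OpenAI-style client would use."""
    return os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
```

Newer client versions use `OPENAI_BASE_URL` or an explicit `base_url=`/`openai_api_base=` constructor argument instead, so check which knob your installed version reads.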
use another backend 😅 |
I'm wondering what openai client version you are using?
Hi @baskaryan, could you please assist with this issue? The user has provided additional context regarding the discrepancy in request endpoints between AzureChatOpenAI, ChatOpenAI, and LocalAI. Thank you! |
Feature request
Integration with LocalAI and with its extended endpoints to download models from the gallery.
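The "extended endpoints" mentioned here refer to LocalAI's model-gallery API, which lets you install models at runtime. As a hedged sketch (the `/models/apply` path and `{"id": ...}` payload follow LocalAI's documented gallery API, but may differ across LocalAI versions; the base URL and model ID are placeholders), a request to it could be constructed like this:

```python
import json
import urllib.request


def apply_model_request(base_url: str, model_id: str) -> urllib.request.Request:
    """Construct (but do not send) a request to LocalAI's model-gallery
    endpoint, which downloads and configures a model from the gallery.
    """
    body = json.dumps({"id": model_id}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/models/apply",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Example: prepare an install request against a local server.
req = apply_model_request("http://localhost:8080", "model-gallery@bert-embeddings")
```

Sending it with `urllib.request.urlopen(req)` would require a running LocalAI instance, so the sketch stops at building the request.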
Motivation
LocalAI is a self-hosted OpenAI drop-in replacement with support for multiple model families: https://github.com/go-skynet/LocalAI
Your contribution
Not a Python guru, so it might take me a few cycles here.