Possible to use it without docker? #6
Comments
good point, I'll have a look as soon as possible.
You can try and copy my Makefile from here: https://github.com/go-skynet/llama-cli/blob/ad18b8305449b54f63383408bc47c246cacdf419/Makefile If you have all the required Go dependencies, you can use it.
@mudler if this Makefile is tested and merged, then we could close this issue? (we could make it a separate PR if you wish)
you should now be able to do 'make build' if you use the latest version and have all the required dependencies (i.e. go, cmake). @mudler I think we can close this item. We would need to update the documentation though, so this issue doesn't re-appear for lack of docs.
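The steps above can be sketched as a short command sequence. This is an illustrative fragment, not a tested recipe: the repository URL, clone target, and binary name are assumptions based on the project name in this thread, and the exact dependencies may differ by version.

```shell
# Sketch of a local (no-Docker) build, assuming go, cmake, and make are installed.
# Repository path and binary name are assumptions; adjust to your checkout.
git clone https://github.com/go-skynet/llama-cli
cd llama-cli
make build          # builds the bundled llama.cpp sources and the Go binary
./llama-cli --help  # verify the resulting binary runs
```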
I don't have experience in GoLang. Currently I have a question answering chatbot developed using LangChain Python and hosted as an AWS Lambda function. You may want to look into the GitHub repo at https://github.com/limcheekin/serverless-flutter-gpt. I want to swap out the OpenAI API dependency for LocalAI if possible. I'd appreciate your advice on whether it is feasible to do so, and please share the steps on how to do it. Thanks in advance.
@limcheekin 👋 feel free to join our Discord channel; however, I think it should be as simple as specifying a different base_url in the OpenAI client you already use, pointing it to where the API runs.
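To illustrate the base_url swap, here is a minimal sketch that builds an OpenAI-compatible chat completion request against a local endpoint using only the standard library. The base URL and model name are assumptions (LocalAI's default port and a placeholder model); replace them with your actual server address and configured model.

```python
# Sketch: pointing an OpenAI-style chat request at a local API server
# instead of api.openai.com. BASE_URL and the model name are assumptions.
import json
import urllib.request

BASE_URL = "http://localhost:8080/v1"  # where the local API is assumed to listen


def build_chat_request(prompt, model="ggml-gpt4all-j"):
    """Build an OpenAI-compatible POST request for /chat/completions."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Sending is left to the caller, e.g. (requires a running server):
#   with urllib.request.urlopen(build_chat_request("Hello")) as resp:
#       print(json.load(resp))
```

If you already use an OpenAI client library, the same idea applies: keep your existing code and change only the base URL it targets, so no Go knowledge is needed on the client side.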
I'm closing this issue now, as building locally should be possible with
Update: binary releases are now available too: https://github.com/go-skynet/LocalAI/releases/tag/v1.15.0
how do we use it?
Would really appreciate some instructions or guidance for getting this working directly, without Docker. I noticed it's using a modified llama.cpp mixed with Go, but I don't have enough knowledge of this stack to build it. I did try building it but got an error about llama.h not being found.