docs: add documentation on how to write LocalAI backends #1057

Open
mudler opened this issue Sep 14, 2023 · 3 comments

mudler (Owner) commented Sep 14, 2023

LocalAI backends are just gRPC servers satisfying an interface. It is already possible to start external gRPC servers and attach them to a LocalAI instance, as explained here: https://localai.io/advanced/index.html#connect-external-backends. However, we are missing documentation on how to build such a backend from scratch, and an end-to-end guide on how to connect one to LocalAI.
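For illustration, here is a minimal sketch of such a backend in Go. The stub import path, message fields, and the Health/Predict signatures below are placeholders standing in for whatever protoc generates from LocalAI's backend proto; consult the proto file in the repository for the actual interface.

// Minimal external-backend sketch: a gRPC server exposing the
// (assumed) Health and Predict RPCs that LocalAI calls into.
package main

import (
	"context"
	"log"
	"net"

	"google.golang.org/grpc"

	pb "example.com/mybackend/proto" // hypothetical stubs generated by protoc
)

// server implements the backend service defined in the proto.
type server struct {
	pb.UnimplementedBackendServer
}

// Health lets LocalAI verify the backend is alive.
func (s *server) Health(ctx context.Context, in *pb.HealthMessage) (*pb.Reply, error) {
	return &pb.Reply{Message: []byte("OK")}, nil
}

// Predict handles one inference request forwarded by LocalAI.
// A real backend would run its model here; this one just echoes.
func (s *server) Predict(ctx context.Context, in *pb.PredictOptions) (*pb.Reply, error) {
	return &pb.Reply{Message: []byte("echo: " + in.Prompt)}, nil
}

func main() {
	lis, err := net.Listen("tcp", "localhost:50051")
	if err != nil {
		log.Fatalf("failed to listen: %v", err)
	}
	s := grpc.NewServer()
	pb.RegisterBackendServer(s, &server{})
	log.Printf("backend listening on %s", lis.Addr())
	if err := s.Serve(lis); err != nil {
		log.Fatalf("serve: %v", err)
	}
}

Once a server like this is listening, LocalAI only needs its name and address to route requests to it (see the --external-grpc-backends flag discussed below).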

@mudler mudler added the enhancement label Sep 14, 2023
@mudler mudler self-assigned this Sep 14, 2023
@mudler mudler added the roadmap label Sep 14, 2023
@localai-bot (Contributor)

⚠️⚠️⚠️⚠️⚠️

Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!

but... I can also be funny or helpful 😸, and I can generally provide good tips on where to look in the documentation or the code, based on what you wrote in the issue.

Don't engage in conversation with me; I don't support replying (yet)!

⚠️⚠️⚠️⚠️⚠️

To build a custom gRPC server from scratch, you can follow the instructions in the official gRPC documentation: https://grpc.io/docs/languages/go/. Once you have built your own gRPC server, you can use it with LocalAI by specifying the --external-grpc-backends parameter as explained here: https://localai.io/advanced/index.html#connect-external-backends.
To connect an external gRPC server to LocalAI, you need to specify the backend name and the server address in the format <BACKEND_NAME>:<BACKEND_URI>. For example, if your gRPC server is running on localhost:50051, you can start LocalAI with the following command:

./local-ai --debug --external-grpc-backends "my-awesome-backend:localhost:50051"

This will make the new backend available for all API endpoints. You can also specify multiple backends in a comma-separated list.
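For instance, two backends could be registered at once like this (the second name and both ports are illustrative):

./local-ai --debug --external-grpc-backends "my-awesome-backend:localhost:50051,another-backend:localhost:50052"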
To test the connection, you can use the curl command to send a request to LocalAI and verify that it is processed by your custom gRPC server. For example:

curl -X POST \
  http://localhost:8080/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "my_model", "prompt": "Once upon a time"}'

This will send a completion request to LocalAI, which will forward it to your custom gRPC server for processing. You can verify that the request is handled correctly by checking the response body and status code.
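You can also sanity-check the gRPC server directly, independently of LocalAI. A minimal Go client sketch, again assuming the hypothetical generated stubs and Health RPC from the server example above:

// Connectivity check: dial the backend and call the assumed Health RPC.
package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"

	pb "example.com/mybackend/proto" // hypothetical stubs generated by protoc
)

func main() {
	conn, err := grpc.Dial("localhost:50051",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	reply, err := pb.NewBackendClient(conn).Health(ctx, &pb.HealthMessage{})
	if err != nil {
		log.Fatalf("health check failed: %v", err)
	}
	log.Printf("backend replied: %s", reply.Message)
}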

A comment from @Aisuko was marked as off-topic.

mudler (Owner, Author) commented Oct 5, 2023

Initial documentation on how to write backends has been added here: https://localai.io/advanced/development/
