How to expose LLMs over a REST endpoint #565
Replies: 1 comment
-
Simplest: https://github.com/langchain-ai/langserve/blob/main/examples/llm/server.py (a minimal sketch is shown below). If you need to support configuration, you can use configurable fields, and if you need to pick up user information from the request itself, you can use a per-request config modifier; rough sketches of both follow as well. Use the examples for reference: https://github.com/langchain-ai/langserve/tree/main?tab=readme-ov-file#examples
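For the simplest case, a minimal sketch in the spirit of the linked examples/llm/server.py, assuming the langchain-openai package is installed (the model name, path, and port here are illustrative, not taken from the thread):

```python
from fastapi import FastAPI
from langchain_openai import ChatOpenAI
from langserve import add_routes

app = FastAPI(title="LLM Server")

# add_routes exposes /openai/invoke, /openai/batch, /openai/stream, and a playground.
add_routes(app, ChatOpenAI(model="gpt-3.5-turbo"), path="/openai")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="localhost", port=8000)
```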
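For the configuration case, a sketch using configurable fields (the field id, name, and description are illustrative assumptions):

```python
from fastapi import FastAPI
from langchain_core.runnables import ConfigurableField
from langchain_openai import ChatOpenAI
from langserve import add_routes

app = FastAPI(title="Configurable LLM Server")

# Mark temperature as configurable so clients can override it per request.
model = ChatOpenAI(temperature=0).configurable_fields(
    temperature=ConfigurableField(
        id="llm_temperature",
        name="LLM Temperature",
        description="Sampling temperature for the chat model",
    )
)

# Callers can then send {"config": {"configurable": {"llm_temperature": 0.9}}}
# alongside their input when invoking the endpoint.
add_routes(app, model, path="/configurable_llm")
```

For picking up user information from the request itself, add_routes accepts a per_req_config_modifier callback; a rough sketch (the header name and config key are assumptions, not from the thread):

```python
from typing import Any, Dict

from fastapi import FastAPI, Request
from langchain_openai import ChatOpenAI
from langserve import add_routes

app = FastAPI(title="Per-user LLM Server")


def modify_config(config: Dict[str, Any], request: Request) -> Dict[str, Any]:
    """Copy a user id from the incoming HTTP headers into the runnable's config."""
    config = dict(config)
    configurable = dict(config.get("configurable", {}))
    configurable["user_id"] = request.headers.get("x-user-id", "anonymous")
    config["configurable"] = configurable
    return config


add_routes(
    app,
    ChatOpenAI(),
    path="/openai",
    per_req_config_modifier=modify_config,
)
```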
-
This module contains the functions to create the language model manager (LLM) instances.
The LLMs are used to interact with the language models (LMs) to generate responses, embeddings, etc.
The module contains the following functions:
* get_langchain_llm - returns a language model manager for the chat model.
* get_embedding_llm - returns a language model manager for the embeddings model.
How can I expose these two LLMs over a REST endpoint using LangServe?
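A rough sketch of wiring these two factories into a LangServe app, assuming get_langchain_llm() returns a chat model (already a Runnable, so it can be routed directly) and get_embedding_llm() returns a LangChain Embeddings instance (not a Runnable, so it is wrapped with RunnableLambda); the module name my_llm_module is a placeholder for wherever those functions live:

```python
from typing import List

from fastapi import FastAPI
from langchain_core.runnables import RunnableLambda
from langserve import add_routes

# Placeholder import for the module described in the question above.
from my_llm_module import get_langchain_llm, get_embedding_llm

app = FastAPI(title="LLM REST API")

# Chat models are Runnables, so they can be exposed directly.
add_routes(app, get_langchain_llm(), path="/chat")

# Embedding models are not Runnables; wrap the embed call in a RunnableLambda.
embedding_model = get_embedding_llm()


def embed(texts: List[str]) -> List[List[float]]:
    return embedding_model.embed_documents(texts)


add_routes(app, RunnableLambda(embed), path="/embeddings")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```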