2 files changed: +10 −2

@@ -10,6 +10,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/).
 ### Added

+- Add host param to the OllamaLLM constructor to properly connect the internal AsyncClient to it. (#125)
 - [Feature] Add id to ToolCall and tool_call_id to ToolCallResult (#119)

 ### Changed
src/llm_agents_from_scratch/llms/ollama

@@ -25,17 +25,24 @@ class OllamaLLM(BaseLLM):
     Integration to `ollama` library for running open source models locally.
     """

-    def __init__(self, model: str, *args: Any, **kwargs: Any) -> None:
+    def __init__(
+        self,
+        model: str,
+        host: str | None = None,
+        *args: Any,
+        **kwargs: Any,
+    ) -> None:
         """Create an OllamaLLM instance.

         Args:
             model (str): The name of the LLM model.
+            host (str | None): Host of the running Ollama service. Defaults to None.
             *args (Any): Additional positional arguments.
             **kwargs (Any): Additional keyword arguments.
         """
         super().__init__(*args, **kwargs)
         self.model = model
-        self._client = AsyncClient()
+        self._client = AsyncClient(host=host)

     async def complete(self, prompt: str, **kwargs: Any) -> CompleteResult:
         """Complete a prompt with an Ollama LLM.