Open access LLMs #8
Replies: 5 comments 2 replies
-
Behind the scenes, this plugin uses LangChain JS to interact with a chat model. So far the only choices in this family are OpenAI and Anthropic, but I imagine more open-access LLMs will become available in the future. https://js.langchain.com/docs/modules/models/chat/integrations
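For reference, a minimal sketch of what that LangChain JS chat-model layer looks like (import paths and option names vary across LangChain JS versions, and the model names here are only examples, not what the plugin necessarily uses):

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { ChatAnthropic } from "@langchain/anthropic";
import { HumanMessage } from "@langchain/core/messages";

// Both providers expose the same chat-model interface,
// so the plugin can swap them behind a single abstraction.
const openai = new ChatOpenAI({ modelName: "gpt-3.5-turbo", temperature: 0 });
const anthropic = new ChatAnthropic({ temperature: 0 });

const reply = await openai.invoke([
  new HumanMessage("Summarize this Zotero item in two sentences."),
]);
console.log(reply.content);
```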
-
It would be good to be able to choose gpt-4 as the model.
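On the LangChain side this would presumably just be a constructor option; a sketch, assuming the ChatOpenAI wrapper shown above (the plugin would still need to expose it as a setting):

```typescript
import { ChatOpenAI } from "@langchain/openai";

// Hypothetical: switching models is only a matter of the name passed to the wrapper.
const gpt4 = new ChatOpenAI({ modelName: "gpt-4", temperature: 0 });
```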
-
Local LLMs seem to be possible now; ChatOllama, for example, is now listed among the integrations: https://js.langchain.com/docs/integrations/chat/ It would be really great if this worked within ARIA in the near future : )
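For anyone who wants to experiment, a rough sketch of how the ChatOllama integration is wired up (assuming a local Ollama server on its default port; the model name is just an example):

```typescript
import { ChatOllama } from "@langchain/community/chat_models/ollama";
import { HumanMessage } from "@langchain/core/messages";

// Talks to a locally running Ollama server (default: http://localhost:11434).
const llm = new ChatOllama({
  baseUrl: "http://localhost:11434",
  model: "llama2",
});

const res = await llm.invoke([new HumanMessage("What is Zotero?")]);
console.log(res.content);
```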
-
Hi @adkiem, ARIA relies heavily on GPT's function calling feature to interact with Zotero APIs. I see that open-source LLMs served through Ollama have started to support similar features, but we will need to measure their performance against GPT-4. To that end, I am curating a dataset to support such functional testing, so we can more easily explore and validate different model configurations.
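To illustrate what would be compared, here is a rough sketch of function/tool calling in recent LangChain JS; the Zotero search tool and its schema below are hypothetical, and the exact binding API differs between LangChain versions:

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Hypothetical Zotero tool: ARIA's real tools and schemas may look different.
const searchZoteroItems = tool(
  async ({ query }) => JSON.stringify({ items: [], query }), // stub implementation
  {
    name: "search_zotero_items",
    description: "Search the user's Zotero library by keyword.",
    schema: z.object({ query: z.string() }),
  }
);

const model = new ChatOpenAI({ modelName: "gpt-4" }).bindTools([searchZoteroItems]);

// The model decides whether to emit a structured tool call instead of plain text.
const msg = await model.invoke("Find my papers about retrieval-augmented generation");
console.log(msg.tool_calls);
```

The thing the dataset would measure is essentially how reliably an open model produces the same kind of structured tool calls on this code path.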
-
LiteLLM, OpenRouter, Ollama. The trinity of local use + semi-local services. |
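Since OpenRouter and a local LiteLLM proxy both expose OpenAI-compatible endpoints, they can in principle be reached through the existing OpenAI wrapper just by overriding the base URL; a sketch (the model id and environment variable are only examples):

```typescript
import { ChatOpenAI } from "@langchain/openai";

// OpenRouter (and a LiteLLM proxy) speak the OpenAI chat API,
// so only the base URL and API key need to change.
const viaOpenRouter = new ChatOpenAI({
  modelName: "meta-llama/llama-3-70b-instruct", // example model id on OpenRouter
  openAIApiKey: process.env.OPENROUTER_API_KEY,
  configuration: { baseURL: "https://openrouter.ai/api/v1" },
});

const answer = await viaOpenRouter.invoke("Hello from ARIA");
console.log(answer.content);
```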
-
In the spirit of research, would it be ideal if the assistant could connect to any other LLM, even a local one?