Description
In the Supported LLM Models section, there is info about supported engines:

> You can use any LLM provider that is supported by LangChain, e.g., `ai21`, `aleph_alpha`, `anthropic`, `anyscale`, `azure`, `cohere`, `huggingface_endpoint`, `huggingface_hub`, `openai`, `self_hosted`, `self_hosted_hugging_face`. Check out the LangChain official documentation for the full list.
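For context, the engine name from that list is what is selected in the `models` section of a NeMo Guardrails `config.yml`. A minimal sketch, assuming the standard layout shown in the NeMo Guardrails README (the model name is illustrative):

```yaml
# config.yml — selects which LangChain-backed engine the guardrails use
models:
  - type: main            # the main LLM used by the guardrails runtime
    engine: openai        # one of the engine names from the list above
    model: gpt-3.5-turbo-instruct
```

The ambiguity raised below is about what exactly the valid values of `engine` are and how they map to LangChain's own provider names.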
The problem is that the documentation doesn't mention any particular page in the LangChain docs. My guess is that it refers to this page: LangChain > Components > LLMs. But there is also a LangChain page dedicated to specific providers: LangChain > Providers.
And while some of the listed names seem to match the LangChain LLM components (e.g. `ai21`, `aleph_alpha`, `anthropic`), others like `azure` or `huggingface_hub`, in my opinion, create an inconsistency in the naming convention. The image below shows what I mean (there is no provider called just `azure`):
So the questions are:
- Which part of the LangChain documentation does NeMo Guardrails refer to?
- Where can I find the actual list of supported LLMs/engines in NeMo Guardrails?