Using Azure OpenAI Deployments #306
-
Thanks for the great application and the update to support different web URLs! I wonder how I can use an Azure OpenAI endpoint. Here is what I need to provide to the AzureOpenAI client in LangChain: openai_api_key = xxx. Because only an OpenAI API key and base URL are available in the settings, how can I make this package work with my company's Azure deployment? Thanks!
Replies: 1 comment 5 replies
-
Hey @xjw1001001, thanks for the kind words!
We're working on adding more providers beyond just the OpenAI-compatible ones, but it will take a bit of time to get this done due to other priorities.
However, there is a temporary workaround for the time being: use a proxy service like LiteLLM's proxy server. It lets you set up any LLM provider, and you would just need to point the Dataline base URL at it (for example `http://localhost:4000/v1`). See the screenshot for how the UI looks. Keep in mind I haven't tested this setup, but LiteLLM has worked for me in the past, so I trust that this should work. Let me know if this helps or if you run into any issues!
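To make the workaround above concrete, here is a minimal sketch of a LiteLLM proxy config routing to an Azure OpenAI deployment. The model alias, deployment name, resource URL, API version, and environment variable are all placeholders, not values from this thread:

```yaml
# config.yaml — minimal LiteLLM proxy config (sketch, untested in this thread)
model_list:
  - model_name: gpt-4o                    # alias Dataline will request; placeholder
    litellm_params:
      model: azure/my-deployment          # "azure/<your Azure deployment name>"; placeholder
      api_base: https://my-resource.openai.azure.com  # placeholder Azure resource URL
      api_key: os.environ/AZURE_API_KEY   # read the key from the environment
      api_version: "2024-02-15-preview"   # example API version; check your resource
```

Then start the proxy with `litellm --config config.yaml --port 4000` and set Dataline's base URL to `http://localhost:4000/v1`.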
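For anyone scripting against the proxy directly rather than through Dataline, the request is just the standard OpenAI chat-completions shape pointed at the proxy's base URL. A stdlib-only sketch; the base URL, model alias, and bearer token are assumptions, and the proxy must actually be running for the request to succeed:

```python
import json
import urllib.request

# Assumed LiteLLM proxy address; adjust host/port to your setup.
BASE_URL = "http://localhost:4000/v1"


def chat_request(model: str, content: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat-completions request for the proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Placeholder token; the proxy decides whether keys are enforced.
            "Authorization": "Bearer sk-anything",
        },
        method="POST",
    )


# Build (but do not send) a request against the assumed model alias.
req = chat_request("gpt-4o", "hello")
```

Sending it is then a plain `urllib.request.urlopen(req)` call once the proxy is up.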