
Using Azure OpenAI Deployments #306

Answered by anthony2261
xjw1001001 asked this question in Q&A

Hey @xjw1001001 , thanks for the kind words!

We're working on adding different providers instead of just the OpenAI compatible ones, but it will take a bit of time to get this done due to other priorities.

However, there is a temporary workaround you could use in the meantime: run a proxy service such as LiteLLM's proxy server. It lets you set up any LLM provider, and you would then just point Dataline's base URL at the proxy (for example http://localhost:4000/v1). See the screenshot for how the UI looks:

Keep in mind I haven't tested this setup but LiteLLM has worked for me in the past so I trust that this should work. Let me know if this helps or if you run into any issues!
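For anyone trying this route, here is a minimal sketch of what the LiteLLM proxy config could look like for an Azure OpenAI deployment. The resource name, deployment name, and API version below are placeholders, not values from this thread — substitute your own:

```yaml
# config.yaml — minimal LiteLLM proxy config for an Azure OpenAI deployment.
# All <angle-bracket> values are placeholders you must fill in.
model_list:
  - model_name: gpt-4o                         # name clients (e.g. Dataline) will request
    litellm_params:
      model: azure/<your-deployment-name>      # "azure/" prefix routes to Azure OpenAI
      api_base: https://<your-resource>.openai.azure.com/
      api_key: os.environ/AZURE_API_KEY        # read the key from an env var
      api_version: "2024-02-15-preview"        # check your Azure portal for the right version
```

Then start the proxy and point Dataline's base URL at it:

```shell
export AZURE_API_KEY=<your-azure-openai-key>
litellm --config config.yaml --port 4000
# Dataline base URL: http://localhost:4000/v1
```

This is untested against Dataline specifically; it just follows LiteLLM's documented Azure config shape, so double-check the keys against the current LiteLLM docs.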

Replies: 1 comment 5 replies

Answer selected by RamiAwar
Category
Q&A
Labels
feature-request (New feature or request), question (Further information is requested)
2 participants