
Azure OpenAI #5

Open
cirezd opened this issue Aug 1, 2023 · 1 comment

Comments


cirezd commented Aug 1, 2023

Hi,

I am using the Azure OpenAI service and am therefore interested in getting this to work with litechain.
I tested this locally and it works without issues. I modified the OpenAIChatChain and added the required environment variables for the init method and the engine parameter. Since there are no contribution guidelines yet, I am wondering what the best way to contribute is. Should an AzureOpenAIChatChain inherit from OpenAIChatChain? If so, this would probably mean that the additional arguments need to be passed to the parent class (similar to here).

Let me know how to move forward and I am happy to contribute.
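The inheritance approach described above can be sketched as follows. This is only an illustration: the parent class here is a simplified stand-in for litechain's real OpenAIChatChain constructor, and the `engine` parameter and `AZURE_OPENAI_ENGINE` environment variable are assumed names, not confirmed API.

```python
import os


class OpenAIChatChain:
    # Simplified stand-in for litechain's OpenAIChatChain; the real
    # constructor takes more parameters than shown here.
    def __init__(self, name, chain_fn, model, **kwargs):
        self.name = name
        self.chain_fn = chain_fn
        self.model = model
        self.extra = kwargs


class AzureOpenAIChatChain(OpenAIChatChain):
    # Hypothetical subclass: resolves Azure-specific settings (here just
    # an illustrative `engine` deployment name) and forwards everything
    # else to the parent constructor.
    def __init__(self, name, chain_fn, model, engine=None, **kwargs):
        engine = engine or os.environ.get("AZURE_OPENAI_ENGINE")
        super().__init__(name, chain_fn, model, engine=engine, **kwargs)


chain = AzureOpenAIChatChain("azure", None, "gpt-35-turbo", engine="my-deployment")
print(chain.extra["engine"])  # my-deployment
```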


etaroza commented Feb 9, 2024

The solution with LiteLLM didn't work for me, I got this error:
'Delta' object has no attribute 'role'

So I wrote my own AzureOpenAIChatStream:

import importlib

from langstream.contrib import OpenAIChatStream
from typing import TypeVar


T = TypeVar("T")
U = TypeVar("U")


class AzureOpenAIChatStream(OpenAIChatStream[T, U]):
    """
    PROBLEM:
    LiteLLMChatStream does not work as documented (https://rogeriochaves.github.io/langstream/docs/llms/lite_llm/).
    It is probably a bug: every call failed with 'Delta' object has no attribute 'role'.

    I created the AzureOpenAIChatStream class to override the original implementation,
    so that the AzureOpenAI client is used instead of the OpenAI client.
    """
    _client_ = None

    @staticmethod
    def client():
        """
        Returns the OpenAI client instance being used to make the LLM calls.
        """

        if not AzureOpenAIChatStream._client_:
            openai = importlib.import_module("openai")
            AzureOpenAIChatStream._client_ = openai.AzureOpenAI()

        return AzureOpenAIChatStream._client_
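The lazy-singleton caching that the override relies on can be checked without Azure credentials. A minimal sketch, where `FakeAzureClient` is a hypothetical stand-in for `openai.AzureOpenAI` used only to show that repeated calls reuse the same instance:

```python
class FakeAzureClient:
    """Stand-in for openai.AzureOpenAI; only illustrates the caching pattern."""
    pass


class LazyClientStream:
    _client_ = None

    @staticmethod
    def client():
        # Create the client on first use and cache it on the class,
        # mirroring AzureOpenAIChatStream.client() above.
        if not LazyClientStream._client_:
            LazyClientStream._client_ = FakeAzureClient()
        return LazyClientStream._client_


a = LazyClientStream.client()
b = LazyClientStream.client()
print(a is b)  # True: the same client instance is reused
```

With the real class, the first call would construct `openai.AzureOpenAI()`, which reads the Azure endpoint, key, and API version from the environment.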
