feat: The OpenAI interface supports the thought process #2392

Merged: 1 commit, Feb 25, 2025
Changes from all commits
14 changes: 11 additions & 3 deletions apps/common/handle/impl/response/openai_to_response.py
@@ -22,8 +22,11 @@ class OpenaiToResponse(BaseToResponse):
def to_block_response(self, chat_id, chat_record_id, content, is_end, completion_tokens, prompt_tokens,
other_params: dict = None,
_status=status.HTTP_200_OK):
if other_params is None:
other_params = {}
data = ChatCompletion(id=chat_record_id, choices=[
BlockChoice(finish_reason='stop', index=0, chat_id=chat_id,
reasoning_content=other_params.get('reasoning_content', ""),
message=ChatCompletionMessage(role='assistant', content=content))],
created=datetime.datetime.now().second, model='', object='chat.completion',
usage=CompletionUsage(completion_tokens=completion_tokens,
@@ -32,11 +35,16 @@ def to_block_response(self, chat_id, chat_record_id, content, is_end, completion
).dict()
return JsonResponse(data=data, status=_status)

def to_stream_chunk_response(self, chat_id, chat_record_id, node_id, up_node_id_list, content, is_end, completion_tokens,
def to_stream_chunk_response(self, chat_id, chat_record_id, node_id, up_node_id_list, content, is_end,
completion_tokens,
prompt_tokens, other_params: dict = None):
if other_params is None:
other_params = {}
chunk = ChatCompletionChunk(id=chat_record_id, model='', object='chat.completion.chunk',
created=datetime.datetime.now().second,choices=[
Choice(delta=ChoiceDelta(content=content, chat_id=chat_id), finish_reason='stop' if is_end else None,
created=datetime.datetime.now().second, choices=[
Choice(delta=ChoiceDelta(content=content, reasoning_content=other_params.get('reasoning_content', ""),
chat_id=chat_id),
finish_reason='stop' if is_end else None,
index=0)],
usage=CompletionUsage(completion_tokens=completion_tokens,
prompt_tokens=prompt_tokens,
Contributor Author

Regularity Checks:

  1. Initialization of other_params:

    • Both methods guard with if other_params is None before assigning an empty dict. This is the idiomatic pattern: a mutable default argument (other_params: dict = {}) would be evaluated once at definition time and shared across calls, so defaulting to None and initializing inside the function avoids leaking state between invocations.
  2. Stream Response (to_stream_chunk_response):

    • The choices list is unchanged apart from one new field on the delta: reasoning_content, which falls back to an empty string when it is absent from other_params.
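The None guard can be seen in isolation in a short, self-contained sketch (the function names here are illustrative, not from the PR):

```python
def bad_append(item, bucket=[]):
    # The [] default is created once, at definition time, and is shared
    # by every call that omits `bucket`.
    bucket.append(item)
    return bucket


def good_append(item, bucket=None):
    # Same pattern as the PR's `if other_params is None` guard:
    # each call that omits `bucket` gets a fresh list.
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket


print(bad_append(1))   # [1]
print(bad_append(2))   # [1, 2]  state leaked from the first call
print(good_append(1))  # [1]
print(good_append(2))  # [2]
```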

Potential Issues:

  1. Type Hinting:

    • Ensure that the type hints in the function signatures match the actual types; for instance, _status should be annotated as int.
    • other_params: dict = None is inconsistent: a parameter that defaults to None should be annotated Optional[Dict[str, Any]] rather than plain dict.
  2. Datetime Module Usage:

    • from datetime import datetime would avoid the repeated datetime.datetime prefix. More importantly, datetime.datetime.now().second yields only the seconds component of the current wall-clock time (0–59), while the OpenAI API defines created as a Unix timestamp; int(time.time()) would match the spec.
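To make the created-field concern concrete, here is a minimal standard-library sketch contrasting the seconds component used in the diff with a real Unix timestamp (variable names are illustrative):

```python
import datetime
import time

# What the diff stores in `created`: just the seconds digit of the clock.
seconds_component = datetime.datetime.now().second   # always in 0..59

# What an OpenAI-style `created` field is expected to hold.
unix_timestamp = int(time.time())                    # seconds since 1970-01-01

print(seconds_component, unix_timestamp)
```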

Optimization Suggestions:

  1. Use Type Annotations for Methods:

    from typing import Any, Dict, Optional, Set

    from django.http import JsonResponse      # assumed imports, matching the
    from rest_framework import status         # framework this project appears to use

    class OpenaiToResponse(BaseToResponse):
        def to_block_response(
                self,
                chat_id: str,
                chat_record_id: str,
                content: str,
                is_end: bool,
                completion_tokens: int,
                prompt_tokens: int,
                other_params: Optional[Dict[str, Any]] = None,  # None guard in the body
                _status: int = status.HTTP_200_OK
        ) -> JsonResponse:
            pass

        def to_stream_chunk_response(
                self,
                chat_id: str,
                chat_record_id: str,
                node_id: str,
                up_node_id_list: Set[str],  # assuming this expects a set of strings
                content: str,
                is_end: bool,
                completion_tokens: int,
                prompt_tokens: int,
                other_params: Optional[Dict[str, Any]] = None
        ) -> None:
            pass
  2. Consider Using Named Tuples or Data Classes:

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ChoiceDelta:
    content: str
    reasoning_content: str

@dataclass
class Choice:
    delta: ChoiceDelta
    finish_reason: Optional[str]

@dataclass
class CompletionUsage:
    completion_tokens: int
    prompt_tokens: int

@dataclass
class CompletionData:
    id: str
    model: str
    object: str
    created: float
    choices: List[Choice]
    usage: CompletionUsage

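A self-contained sanity check of the dataclass suggestion (the field values are made up; asdict mirrors the .dict() call in the diff):

```python
from dataclasses import asdict, dataclass
from typing import List, Optional


@dataclass
class ChoiceDelta:
    content: str
    reasoning_content: str


@dataclass
class Choice:
    delta: ChoiceDelta
    finish_reason: Optional[str]


@dataclass
class CompletionUsage:
    completion_tokens: int
    prompt_tokens: int


@dataclass
class CompletionData:
    id: str
    model: str
    object: str
    created: float
    choices: List[Choice]
    usage: CompletionUsage


chunk = CompletionData(
    id="chat-record-1",
    model="",
    object="chat.completion.chunk",
    created=0.0,
    choices=[Choice(delta=ChoiceDelta(content="Hello", reasoning_content=""),
                    finish_reason=None)],
    usage=CompletionUsage(completion_tokens=1, prompt_tokens=2),
)

print(asdict(chunk)["choices"][0]["delta"]["content"])  # Hello
```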
By implementing these suggestions, the code will become more robust, maintainable, and efficient while adhering to best practices.
