
fix streaming api #1596

Merged
merged 1 commit into master from fix-streaming on Nov 20, 2024
Conversation

@michaeljguarino (Member) commented Nov 20, 2024

troubleshoot and fix publishing to websockets

Test Plan

n/a

Checklist

  • If required, I have updated the Plural documentation accordingly.
  • I have added tests to cover my changes.
  • I have added a meaningful title and summary to convey the impact of this PR to a user.

Summary by CodeRabbit

  • New Features
    • Enhanced debugging capabilities with console logging for stream and chunk data during processing.
  • Bug Fixes
    • Improved error handling for AI completion generation, providing clearer error messages.
  • Chores
    • Added logging statements to assist in monitoring the behavior of the streaming and publishing processes.

@michaeljguarino michaeljguarino added the enhancement New feature or request label Nov 20, 2024
@michaeljguarino michaeljguarino requested a review from a team November 20, 2024 22:07
Contributor

coderabbitai bot commented Nov 20, 2024

Important

Review skipped

Auto incremental reviews are disabled on this repository.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.

Walkthrough

The changes involve modifications to three modules: Console.AI.OpenAI, Console.AI.Stream, and Console.AI.Stream.Exec. Key updates include the addition of IO.inspect() calls for debugging purposes, allowing inspection of stream data and chunks being published. The completion/2 function in the OpenAI module now robustly handles binary string responses, enhancing error messaging. The overall logic and flow of the program remain unchanged, with a focus on improving debugging capabilities and refining error handling.

Changes

| File | Change Summary |
|------|----------------|
| `lib/console/ai/provider/openai.ex` | Modified `new/1` to include `IO.inspect()` for the `stream` field; updated `completion/2` for better binary response handling and error messaging. |
| `lib/console/ai/stream.ex` | Updated `publish` to include `IO.inspect(chunk)` for logging before publishing. |
| `lib/console/ai/stream/exec.ex` | Added `IO.inspect()` after publishing in the `exec` function for debugging; no changes to logic in `handle_openai` or `build_stream`. |
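The binary-response handling described for `completion/2` could be sketched roughly as follows. This is a hypothetical illustration, not the actual module code: the `Sketch` module name, the `handle_response/1` function, and the response shapes are assumptions.

```elixir
# Hypothetical sketch of the binary-response handling described above.
# Module and function names are assumptions for illustration only; the
# real completion/2 in Console.AI.OpenAI differs.
defmodule Console.AI.OpenAI.Sketch do
  # A decoded JSON success payload carries the completion text
  def handle_response({:ok, %{"choices" => [%{"message" => %{"content" => content}} | _]}}),
    do: {:ok, content}

  # A raw binary body usually means the API returned an error message
  # instead of a decoded completion, so surface it in the error tuple
  def handle_response({:ok, body}) when is_binary(body),
    do: {:error, "unexpected OpenAI response: #{body}"}

  def handle_response({:error, reason}),
    do: {:error, "OpenAI request failed: #{inspect(reason)}"}
end

IO.inspect(Console.AI.OpenAI.Sketch.handle_response({:ok, "rate limit exceeded"}))
```

The point of the guard clause is that an unexpected binary body becomes a clear, user-visible error rather than a pattern-match crash.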

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant OpenAI
    participant Stream
    participant Exec

    User->>OpenAI: Request completion
    OpenAI->>OpenAI: Process request
    OpenAI->>OpenAI: IO.inspect(stream)
    OpenAI->>User: Return response
    User->>Stream: Publish chunk
    Stream->>Stream: IO.inspect(chunk)
    Stream->>User: Confirm publish
    Exec->>Stream: Execute stream
    Exec->>Exec: IO.inspect(data)
```

🐇 "In the code where the streams flow,
Debugging logs now brightly glow.
With chunks and responses, we now see,
The inner workings, clear as can be!
Hopping through functions, so neat and spry,
Our code's a garden, watch it fly!" 🌼



@coderabbitai bot (Contributor) left a comment


Actionable comments posted: 5

🧹 Outside diff range and nitpick comments (2)
lib/console/ai/stream.ex (1)

Line range hint 8-18: Consider enhancing error handling and logging.

The publish function would benefit from structured logging and better error handling:

  1. Add logging for failed publishes
  2. Include metadata like chunk size and topic
  3. Consider adding telemetry events for monitoring

Here's a suggested improvement:

```diff
 def publish(%__MODULE__{topic: topic}, chunk) when is_binary(topic) do
+  Logger.metadata(topic: topic, size: byte_size(chunk))
+  :telemetry.execute([:console, :ai, :stream, :publish], %{size: byte_size(chunk)}, %{topic: topic})
   Absinthe.Subscription.publish(
     ConsoleWeb.Endpoint,
-    IO.inspect(chunk),
+    chunk,
     [ai_stream: topic]
-  )
+  ) |> case do
+    :ok -> :ok
+    error ->
+      Logger.warning("Failed to publish AI stream", error: error)
+      error
+  end
 end
-def publish(_, _), do: :ok
+def publish(_, _) do
+  Logger.debug("Skipping publish for invalid stream configuration")
+  :ok
+end
```
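The telemetry event suggested above could be consumed by attaching a handler. The sketch below is hypothetical: the `[:console, :ai, :stream, :publish]` event name, the `size` measurement, and the `topic` metadata key all come from the suggestion, not from the actual codebase.

```elixir
# Hedged sketch: consuming the hypothetical telemetry event from the
# suggestion above. Event name and payload keys are assumptions.
Mix.install([{:telemetry, "~> 1.2"}])

parent = self()

:ok =
  :telemetry.attach(
    "ai-stream-publish-logger",
    [:console, :ai, :stream, :publish],
    # Handlers run synchronously in the process calling :telemetry.execute/3
    fn _event, measurements, metadata, _config ->
      send(parent, {:published, measurements.size, metadata.topic})
    end,
    nil
  )

# Simulate what the suggested publish/2 would emit for a 5-byte chunk
:telemetry.execute(
  [:console, :ai, :stream, :publish],
  %{size: byte_size("hello")},
  %{topic: "ai:stream:123"}
)

receive do
  {:published, size, topic} -> IO.puts("published #{size} bytes to #{topic}")
after
  100 -> raise "handler not invoked"
end
```

Emitting an event and logging in a handler keeps the hot publish path free of logging concerns, which is the usual rationale for telemetry over inline `Logger` calls.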
lib/console/ai/provider/openai.ex (1)

Line range hint 71-82: Consider enhancing streaming error handling and documentation

The streaming implementation could benefit from:

  1. More detailed documentation about streaming behavior and configuration
  2. Enhanced error handling for stream initialization and processing
  3. Configurable timeouts and retry policies for streaming

Consider adding module and function documentation:

```elixir
@moduledoc """
Implements OpenAI API integration with support for both streaming and non-streaming completions.

## Streaming
The module supports streaming responses through WebSockets. Configure streaming by providing
a Stream struct during initialization. Example:

    OpenAI.new(%{stream: Stream.stream(), ...})

## Configuration
- `:stream` - Optional Stream struct for enabling streaming responses
- `:model` - OpenAI model to use (defaults to "gpt-4o-mini")
- `:base_url` - Optional custom API endpoint
"""
```
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between 8bdf9fc and 65672e1.

📒 Files selected for processing (3)
  • lib/console/ai/provider/openai.ex (1 hunks)
  • lib/console/ai/stream.ex (1 hunks)
  • lib/console/ai/stream/exec.ex (1 hunks)
🔇 Additional comments (1)
lib/console/ai/stream.ex (1)

Line range hint 1-24: Verify WebSocket connection handling.

Since this PR aims to fix WebSocket publishing issues, we should verify the connection handling across the codebase.

Let's check for potential WebSocket connection handling issues:

✅ Verification successful

WebSocket implementation appears robust and properly configured

The WebSocket connection handling in the codebase is well-structured with proper error handling:

  • Endpoint correctly configures WebSocket connections with appropriate security settings
  • UserSocket implements proper authentication and context building
  • Error cases are handled appropriately in the socket connection phase
  • Channel implementations include proper error handling for invalid scenarios
  • The AI stream implementation in question integrates correctly with the existing WebSocket infrastructure

The changes in Console.AI.Stream align with the established WebSocket patterns in the codebase, using the same Absinthe.Subscription.publish mechanism that's used elsewhere for real-time updates.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Description: Look for WebSocket connection handling patterns and potential issues

# Check for WebSocket connection configurations
echo "Checking WebSocket configurations..."
rg -l "socket" --type elixir

# Look for other subscription publishing patterns
echo "Checking subscription publishing patterns..."
rg "Absinthe.Subscription.publish" --type elixir -A 2 -B 2

# Check for error handling patterns around WebSocket operations
echo "Checking WebSocket error handling..."
rg "socket.*error|socket.*failure" --type elixir
```

Length of output: 2007


Script:

```shell
#!/bin/bash
# Let's check for WebSocket connection and error handling patterns more thoroughly

# Check socket connection configurations and error handling in endpoint.ex
echo "Checking WebSocket endpoint configuration..."
cat lib/console_web/endpoint.ex

# Check user socket implementation
echo "Checking user socket implementation..."
cat lib/console_web/channels/user_socket.ex

# Check for any error handling in channels
echo "Checking channel implementations for error handling..."
rg "handle_error|handle_info|catch" lib/console_web/channels/

# Check for any WebSocket related configurations
echo "Checking WebSocket configurations..."
cat config/config.exs | rg -A 5 -B 5 "socket"
```

Length of output: 4142

Outdated review comments (resolved):
  • lib/console/ai/stream.ex
  • lib/console/ai/stream/exec.ex
  • lib/console/ai/stream/exec.ex
  • lib/console/ai/provider/openai.ex
troubleshoot and fix publishing to websockets
@michaeljguarino changed the title from "sfix streaming api" to "fix streaming api" Nov 20, 2024
@michaeljguarino michaeljguarino merged commit adffee2 into master Nov 20, 2024
11 checks passed
@michaeljguarino michaeljguarino deleted the fix-streaming branch November 20, 2024 23:45
2 participants