
Conversation


@Pouyanpi Pouyanpi commented Jul 1, 2025

Problem

The main_llm_supports_streaming flag was not being set properly when the main LLM was initialized from config, causing streaming functionality to fail in the develop branch while it worked in v0.14.0.

Root Cause

During LLM initialization refactoring (#1221), the streaming support detection logic was only applied when iterating through config models, but was skipped when the main LLM was initialized directly from config in the new code path.

Fix

  • Added streaming support detection when main LLM is initialized from config
  • Added streaming support detection when LLM is provided via constructor
  • Ensured main_llm_supports_streaming is set correctly in both initialization paths

Test Plan

Added three tests to tests/test_streaming.py that verify main_llm_supports_streaming is correctly set:

  1. test_main_llm_supports_streaming_flag_with_config(): LLM from config with streaming enabled
  2. test_main_llm_supports_streaming_flag_with_constructor(): LLM via constructor with streaming enabled
  3. test_main_llm_supports_streaming_flag_disabled_when_no_streaming(): Streaming disabled

These tests fail before this fix and pass after, preventing future regressions.
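The regression tests might be sketched roughly as follows; the real tests live in tests/test_streaming.py and exercise an actual LLMRails instance, which is replaced here by hypothetical stand-ins:

```python
# Hypothetical shape of the regression tests described above; StubLLM and
# StubRails are illustrative stand-ins, not nemoguardrails classes.

class StubLLM:
    def __init__(self, streaming):
        self.streaming = streaming

class StubRails:
    def __init__(self, llm):
        # Mirrors the fixed behavior: the flag is derived from the LLM
        # during initialization.
        self.main_llm_supports_streaming = bool(getattr(llm, "streaming", False))

def test_main_llm_supports_streaming_flag_with_config():
    rails = StubRails(StubLLM(streaming=True))
    assert rails.main_llm_supports_streaming is True

def test_main_llm_supports_streaming_flag_disabled_when_no_streaming():
    rails = StubRails(StubLLM(streaming=False))
    assert rails.main_llm_supports_streaming is False

test_main_llm_supports_streaming_flag_with_config()
test_main_llm_supports_streaming_flag_disabled_when_no_streaming()
print("ok")
```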

@Pouyanpi Pouyanpi added this to the v0.14.1 milestone Jul 1, 2025
@Pouyanpi Pouyanpi added the bug Something isn't working label Jul 1, 2025
@Pouyanpi Pouyanpi self-assigned this Jul 1, 2025


@Pouyanpi Pouyanpi marked this pull request as draft July 1, 2025 13:50
@Pouyanpi Pouyanpi requested a review from Copilot July 1, 2025 14:36

Copilot AI left a comment


Pull Request Overview

This pull request fixes an issue with the streaming support detection for the main LLM during initialization, ensuring that the streaming flag is properly set whether the LLM is created from configuration or via the constructor. Key changes include updating tests to verify streaming detection, adding a custom streaming provider fixture for testing, and refactoring streaming configuration logic in LLMRails.

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.

  • tests/test_streaming.py: updated streaming-support tests to use custom streaming providers and added tests for constructor-based LLM initialization.
  • nemoguardrails/rails/llm/llmrails.py: introduced a dedicated _configure_main_llm_streaming function and invoked it in multiple initialization paths to properly set the streaming flag.

@codecov-commenter

codecov-commenter commented Jul 1, 2025

Codecov Report

Attention: Patch coverage is 91.66667% with 1 line in your changes missing coverage. Please review.

Project coverage is 69.57%. Comparing base (426d704) to head (728f249).

Files with missing lines:
  • nemoguardrails/rails/llm/llmrails.py: patch 91.66%, 1 line missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #1258      +/-   ##
===========================================
+ Coverage    69.54%   69.57%   +0.03%     
===========================================
  Files          161      161              
  Lines        16016    16023       +7     
===========================================
+ Hits         11138    11148      +10     
+ Misses        4878     4875       -3     
Flags:
  • python: 69.57% <91.66%> (+0.03%) ⬆️


Files with missing lines (coverage):
  • nemoguardrails/rails/llm/llmrails.py: 88.83% <91.66%> (+0.64%) ⬆️

@Pouyanpi Pouyanpi marked this pull request as ready for review July 1, 2025 14:43
@Pouyanpi Pouyanpi requested review from tgasser-nv and trebedea July 1, 2025 14:43

@tgasser-nv tgasser-nv left a comment


Approved - please add a few more tests as requested for coverage


Pouyanpi commented Jul 1, 2025

Thank you @tgasser-nv , I've added the tests and made the suggested changes.

@Pouyanpi Pouyanpi merged commit 9cdad05 into develop Jul 1, 2025
17 checks passed
@Pouyanpi Pouyanpi deleted the fix/streaming-main-llm-init branch July 1, 2025 18:53
Pouyanpi added a commit that referenced this pull request Jul 1, 2025
…on (#1258)

The main_llm_supports_streaming flag was not being set properly when the main LLM was initialized from config, causing streaming functionality to fail after merging #1221 while it worked in v0.14.0.
