fix: Streaming support detection for main LLM initialization #1258
Conversation
Pull Request Overview
This pull request fixes an issue with the streaming support detection for the main LLM during initialization, ensuring that the streaming flag is properly set whether the LLM is created from configuration or via the constructor. Key changes include updating tests to verify streaming detection, adding a custom streaming provider fixture for testing, and refactoring streaming configuration logic in LLMRails.
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description | 
|---|---|
| tests/test_streaming.py | Updated tests on streaming support with custom streaming providers and added tests for constructor-based LLM initialization. | 
| nemoguardrails/rails/llm/llmrails.py | Introduced a dedicated _configure_main_llm_streaming function and updated its invocation in multiple initialization paths to properly set the streaming flag. | 
Codecov Report

Additional details and impacted files:

@@             Coverage Diff             @@
##           develop    #1258      +/-   ##
===========================================
+ Coverage    69.54%   69.57%   +0.03%     
===========================================
  Files          161      161              
  Lines        16016    16023       +7     
===========================================
+ Hits         11138    11148      +10     
+ Misses        4878     4875       -3     
Flags with carried forward coverage won't be shown.
Approved - please add a few more tests as requested for coverage
Thank you @tgasser-nv, I've added the tests and made the suggested changes.
Problem

The `main_llm_supports_streaming` flag was not being set properly when the main LLM was initialized from config, causing streaming functionality to fail on the develop branch while it worked in v0.14.0.

Root Cause

During the LLM initialization refactoring (#1221), the streaming support detection logic was only applied when iterating through config models, but was skipped when the main LLM was initialized directly from config in the new code path.

Fix

Ensure `main_llm_supports_streaming` is properly set in both initialization paths.

Test Plan

Added three tests to tests/test_streaming.py that verify `main_llm_supports_streaming` is correctly set:
- test_main_llm_supports_streaming_flag_with_config(): LLM from config with streaming enabled
- test_main_llm_supports_streaming_flag_with_constructor(): LLM via constructor with streaming enabled
- test_main_llm_supports_streaming_flag_disabled_when_no_streaming(): streaming disabled

These tests fail before this fix and pass after, preventing future regressions.
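The constructor-path regression the tests pin down can be illustrated with a toy stand-in for `LLMRails`. Everything here except the flag name and the test intent is a simplifying assumption; the real tests use the project's fixtures and a custom streaming provider:

```python
class TinyRails:
    """Toy stand-in for LLMRails, used only to illustrate the flag's
    lifecycle. Mirrors the fixed behavior: streaming detection runs in
    every initialization path, including the constructor-based one."""

    def __init__(self, llm, streaming_enabled: bool):
        supports = bool(getattr(llm, "streaming", False))
        self.main_llm_supports_streaming = streaming_enabled and supports


class CustomStreamingLLM:
    """Analogous to the PR's custom streaming provider fixture."""
    streaming = True


def test_flag_set_for_constructor_llm():
    rails = TinyRails(CustomStreamingLLM(), streaming_enabled=True)
    assert rails.main_llm_supports_streaming


def test_flag_disabled_when_no_streaming():
    rails = TinyRails(CustomStreamingLLM(), streaming_enabled=False)
    assert not rails.main_llm_supports_streaming
```

Before the fix, the detection step was skipped on one path, so the first test would fail even though the LLM itself supported streaming.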