
Conversation

Contributor

@SimFG SimFG commented Oct 10, 2025

When using some OpenAI-compatible thinking models, I found that certain errors occur:

2025-10-10 11:22:45,625 - ragas.executor - ERROR - Exception raised in Job[3]: BadRequestError(Error code: 400 - {'error': {'code': 'InvalidParameter', 'message': 'Reasoning model does not support n > 1, logit_bias, logprobs, top_logprobs Request id: xxx', 'param': '', 'type': 'BadRequest'}})
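The error above is the backend rejecting the `n` sampling parameter for reasoning models. A minimal sketch of the workaround pattern this PR implements, with a stand-in wrapper and mock LLM rather than the actual ragas `LangchainLLMWrapper` code (class and method names here are illustrative):

```python
class MockReasoningLLM:
    """Stands in for an OpenAI-compatible reasoning model that rejects n > 1."""

    def generate(self, prompt, **kwargs):
        if kwargs.get("n", 1) > 1:
            raise ValueError("Reasoning model does not support n > 1")
        return [f"completion for: {prompt}"]


class LLMWrapper:
    """Illustrative wrapper: with bypass_n=True, `n` is never forwarded."""

    def __init__(self, llm, bypass_n=False):
        self.llm = llm
        self.bypass_n = bypass_n

    def generate_text(self, prompt, n=1, **kwargs):
        if self.bypass_n:
            # Drop `n` entirely; the backend rejects it.
            return self.llm.generate(prompt, **kwargs)
        return self.llm.generate(prompt, n=n, **kwargs)


llm = MockReasoningLLM()
# With bypass_n=True the request succeeds even though n > 1 was asked for.
assert LLMWrapper(llm, bypass_n=True).generate_text("hi", n=4) == ["completion for: hi"]
# With the default behavior, the backend raises, mirroring the 400 error above.
try:
    LLMWrapper(llm).generate_text("hi", n=4)
except ValueError as e:
    print("without bypass_n:", e)
```

The trade-off is that callers asking for multiple completions silently get one; that is the intended behavior when the backend cannot honor `n` at all.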

@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Oct 10, 2025
Contributor

@anistark anistark left a comment


Thanks for the PR @SimFG

Could you please also add tests covering bypass_n?

Add unit tests that verify:

  • When bypass_n=True, the wrapper doesn't pass n to the underlying LLM
  • When bypass_n=False (default), behavior remains unchanged
  • Both sync (generate_text) and async (agenerate_text) methods work correctly
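The checks requested above could be sketched with `unittest.mock`, spying on the kwargs forwarded to the underlying LLM. This uses a stand-in `Wrapper` class rather than the real `LangchainLLMWrapper`; the actual ragas signatures may differ:

```python
import asyncio
from unittest.mock import AsyncMock, MagicMock


class Wrapper:
    """Stand-in for the wrapper under review (illustrative only)."""

    def __init__(self, llm, bypass_n=False):
        self.llm = llm
        self.bypass_n = bypass_n

    def generate_text(self, prompt, n=1, **kw):
        kwargs = kw if self.bypass_n else {"n": n, **kw}
        return self.llm.generate(prompt, **kwargs)

    async def agenerate_text(self, prompt, n=1, **kw):
        kwargs = kw if self.bypass_n else {"n": n, **kw}
        return await self.llm.agenerate(prompt, **kwargs)


def test_bypass_n_true_drops_n():
    # bypass_n=True: `n` must not reach the underlying LLM.
    llm = MagicMock()
    Wrapper(llm, bypass_n=True).generate_text("p", n=3)
    assert "n" not in llm.generate.call_args.kwargs


def test_default_forwards_n():
    # bypass_n=False (default): `n` is forwarded unchanged.
    llm = MagicMock()
    Wrapper(llm).generate_text("p", n=3)
    assert llm.generate.call_args.kwargs["n"] == 3


def test_async_bypass_n():
    # Same contract for the async path.
    llm = MagicMock()
    llm.agenerate = AsyncMock()
    asyncio.run(Wrapper(llm, bypass_n=True).agenerate_text("p", n=3))
    assert "n" not in llm.agenerate.call_args.kwargs


test_bypass_n_true_drops_n()
test_default_forwards_n()
test_async_bypass_n()
```

Asserting on `call_args.kwargs` keeps the tests about the forwarding contract itself, independent of what the mocked LLM returns.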

SimFG and others added 2 commits October 13, 2025 10:12
Co-authored-by: Ani <5357586+anistark@users.noreply.github.com>
Add comprehensive test cases to verify the behavior of bypass_n parameter in LangchainLLMWrapper. Tests cover both sync and async methods, default behavior, and interaction with multiple completion support.
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:XS This PR changes 0-9 lines, ignoring generated files. labels Oct 13, 2025
@SimFG
Contributor Author

SimFG commented Oct 13, 2025

@anistark Thanks a lot for your time; I have added the relevant unit test cases.

@SimFG
Contributor Author

SimFG commented Oct 14, 2025

I have updated the pull request; thanks for your careful review.

Contributor

@anistark anistark left a comment


Thanks for the changes @SimFG 👏🏼

@anistark anistark merged commit 9deccc8 into explodinggradients:main Oct 14, 2025
9 checks passed
