Conversation

@devin-ai-integration
Contributor

fix: add drop_params support to BedrockCompletion for unsupported parameters

Summary

Fixes GitHub issue #4046 where certain Bedrock models (like openai.gpt-oss-safeguard-120b) don't support the stopSequences field and throw an error: "This model doesn't support the stopSequences field."

This PR adds support for drop_params and additional_drop_params parameters in BedrockCompletion._get_inference_config(), following the same pattern already implemented in the Azure provider. Users can now drop unsupported parameters from the inference config:

llm = LLM(
    model="bedrock/openai.gpt-oss-safeguard-120b",
    drop_params=True,
    additional_drop_params=["stopSequences"]
)

Changes:

  • Modified _get_inference_config() to check for drop_params and additional_drop_params in additional_params and to remove the specified keys from the inference config
  • Added 6 unit tests covering various scenarios (drop enabled/disabled, multiple params, API call verification)
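The drop behavior described in the changes above can be sketched roughly as follows. This is a minimal, hedged illustration, not the actual CrewAI code: the real BedrockCompletion._get_inference_config() may build its config differently, and the function and key names here are assumptions for demonstration only.

```python
# Illustrative sketch only; the real BedrockCompletion method in CrewAI
# may differ in signature, defaults, and config keys.
def get_inference_config(additional_params: dict) -> dict:
    """Build a Bedrock inference config, honoring drop_params settings."""
    config = {
        "maxTokens": additional_params.get("max_tokens", 512),
        "temperature": additional_params.get("temperature", 0.7),
        "topP": additional_params.get("top_p", 1.0),
        "stopSequences": additional_params.get("stop_sequences", []),
    }
    # When drop_params is enabled, strip every key listed in
    # additional_drop_params (AWS API field names, e.g. "stopSequences").
    if additional_params.get("drop_params"):
        for key in additional_params.get("additional_drop_params", []):
            config.pop(key, None)
    return config
```

With drop_params=True and additional_drop_params=["stopSequences"], the returned config simply omits the stopSequences key, so models that reject that field never see it.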

Review & Testing Checklist for Human

  • Test with an actual Bedrock model: The tests mock the boto3 client. Please verify this fix works with the actual openai.gpt-oss-safeguard-120b model, or another model that doesn't support stopSequences
  • Verify agent behavior: Since supports_stop_words() still returns True, the agent executor will still set stop words but they'll be dropped before the API call. Confirm this doesn't cause unexpected behavior in the agent loop
  • Check parameter naming: Users must use AWS API parameter names (e.g., stopSequences, topP) in additional_drop_params, not internal attribute names (e.g., stop_sequences, top_p)
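As a rough illustration of the mocked-client testing approach mentioned in the checklist, one might assert that the dropped field never reaches the boto3 converse() call. The helper below is hypothetical, standing in for the PR's actual drop logic; only MagicMock and its call_args inspection are standard library behavior.

```python
from unittest.mock import MagicMock

# Hypothetical stand-in for the drop logic under test.
def build_inference_config(drop_params: bool, additional_drop_params: list) -> dict:
    config = {"maxTokens": 512, "stopSequences": ["\nObservation:"]}
    if drop_params:
        for key in additional_drop_params:
            config.pop(key, None)  # silently drop unsupported AWS field names
    return config

# Mock the boto3 Bedrock runtime client instead of calling AWS.
client = MagicMock()
client.converse(inferenceConfig=build_inference_config(True, ["stopSequences"]))

# Inspect what the mocked client actually received.
sent = client.converse.call_args.kwargs["inferenceConfig"]
assert "stopSequences" not in sent
assert sent["maxTokens"] == 512
```

This mirrors the checklist concern: the assertion is on the payload handed to the client, which is exactly what a real Bedrock endpoint would receive.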

Notes


This fixes GitHub issue #4046 where certain Bedrock models don't support
the stopSequences field. The fix adds support for drop_params and
additional_drop_params parameters in BedrockCompletion._get_inference_config()
to allow users to drop unsupported parameters from the inference config.

Example usage:
llm = LLM(
    model="bedrock/openai.gpt-oss-safeguard-120b",
    drop_params=True,
    additional_drop_params=["stopSequences"]
)

This follows the same pattern as the Azure provider implementation.

Co-Authored-By: João <joao@crewai.com>
@devin-ai-integration
Contributor Author

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

@devin-ai-integration
Contributor Author

Closing due to inactivity for more than 7 days. Configure here.

