
feat(core): fallback to chat-base when using unrecognized models for chat #19016

Merged
SandyTao520 merged 2 commits into main from st/feat/custom-model-fallback-chat-base on Feb 13, 2026

Conversation

@SandyTao520
Contributor

Summary

This PR updates the Model Config Service so that when an unrecognized model is requested via the CLI (e.g., -m my-custom-model), it automatically inherits the chat-base generation configuration.

Previously, using an unknown model resulted in an empty configuration, stripping away standard interactive chat defaults like temperature: 1, topP: 0.95 and topK: 64.

Details

This change adds an isChatModel property to the ModelConfigKey interface to signify when a configuration request originates from the primary interactive chat session. The resolveAliasChain method has been updated to conditionally apply the chat-base fallback only when isChatModel is true. This ensures that internal background models (e.g., the summarizer) are not inadvertently affected and continue to receive an empty fallback config when encountering unrecognized aliases.
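The behavior described above can be sketched as follows (a minimal, hypothetical reconstruction; the actual ModelConfigService has more machinery, and everything here beyond ModelConfigKey, isChatModel, and chat-base is an illustrative assumption):

```typescript
// Hypothetical sketch of the chat-base fallback described in this PR.
// The real ModelConfigService internals differ.
interface GenerationConfig {
  temperature?: number;
  topP?: number;
  topK?: number;
}

interface ModelConfigKey {
  model: string;
  // True only when the request originates from the primary interactive
  // chat session; background models (e.g., the summarizer) leave it unset.
  isChatModel?: boolean;
}

const aliases: Record<string, GenerationConfig> = {
  'chat-base': { temperature: 1, topP: 0.95, topK: 64 },
};

function resolveAliasChain(model: string, isChatModel?: boolean): GenerationConfig {
  const known = aliases[model];
  if (known !== undefined) {
    return known;
  }
  // Unrecognized model: fall back to chat-base, but only for the
  // interactive chat session.
  if (isChatModel && aliases['chat-base'] !== undefined) {
    return resolveAliasChain('chat-base');
  }
  // Background models keep the previous behavior: an empty config.
  return {};
}

console.log(resolveAliasChain('my-custom-model', true));  // inherits chat-base defaults
console.log(resolveAliasChain('my-custom-model', false)); // empty config, as before
```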

Related Issues

#18007

How to Validate

  1. Run the CLI with an unknown model: npm start -- -m my-custom-model.
  2. Observe that the chat session inherits the default hyperparameters defined in chat-base (e.g., responses should no longer behave as if temperature were pinned to 0).
  3. Try running another command requiring a background tool/summary and verify it executes properly without the chat-base overrides.

Pre-Merge Checklist

  • Updated relevant documentation and README (if needed)
  • Added/updated tests (if needed)
  • Noted breaking changes (if any)
  • Validated on required platforms/methods:
    • MacOS
      • npm run
      • npx
      • Docker
      • Podman
      • Seatbelt
    • Windows
      • npm run
      • npx
      • Docker
    • Linux
      • npm run
      • npx
      • Docker

…chat

Previously, passing an unknown model to the CLI via `-m custom-model` would result in an empty generation configuration, stripping standard interactive chat defaults like `temperature: 1`, `topP: 0.95` and `topK: 64`.

This change updates the model configuration service to fallback to the `chat-base` configuration whenever `-m` provides an unrecognized custom model for the primary interactive chat session. Other internal background model queries (e.g., summarizer) will remain unaffected.
@SandyTao520 SandyTao520 requested a review from a team as a code owner February 13, 2026 18:27
@gemini-cli
Contributor

gemini-cli bot commented Feb 13, 2026

Hi @SandyTao520, thank you so much for your contribution to Gemini CLI! We really appreciate the time and effort you've put into this.

We're making some updates to our contribution process to improve how we track and review changes. Please take a moment to review our recent discussion post: Improving Our Contribution Process & Introducing New Guidelines.

Key Update: Starting January 26, 2026, the Gemini CLI project will require all pull requests to be associated with an existing issue. Any pull requests not linked to an issue by that date will be automatically closed.

Thank you for your understanding and for being a part of our community!

@gemini-code-assist
Contributor

Summary of Changes

Hello @SandyTao520, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a significant improvement to the Model Config Service by implementing a robust fallback mechanism. When an unrecognized model is requested for an interactive chat session through the CLI, it will now seamlessly inherit the chat-base generation configuration. This change addresses the problem of losing default hyperparameters for unknown models, ensuring a consistent and expected user experience, while carefully isolating this behavior to interactive chat contexts to avoid affecting other internal model usages.

Highlights

  • Fallback Mechanism for Unrecognized Models: Implemented a new feature where unrecognized models requested via the CLI for interactive chat sessions will automatically fall back to the chat-base configuration.
  • Introduction of isChatModel Property: Added an isChatModel property to the ModelConfigKey interface to explicitly indicate when a model configuration request originates from the primary interactive chat session.
  • Conditional Fallback Logic: Ensured that the chat-base fallback is applied conditionally, only when isChatModel is true, preventing unintended configuration overrides for internal background models.
  • Resolution of Empty Configuration Issue: Fixed a previous issue where using an unknown model resulted in an empty configuration, stripping away standard interactive chat defaults like temperature and topP.
Changelog
  • packages/core/src/core/client.ts
    • Modified the modelConfigKey creation to explicitly set isChatModel: true when selecting a model for the GeminiClient.
  • packages/core/src/services/modelConfigService.test.ts
    • Added a new describe block for 'fallback behavior' to test the new functionality.
    • Included a test case to verify that an unknown model with isChatModel: true correctly falls back to chat-base configuration.
    • Added a test case to confirm that an unknown model returns an empty config if chat-base is not defined.
    • Included a test case to ensure that the fallback does NOT occur if isChatModel is false, even for an unknown model.
  • packages/core/src/services/modelConfigService.ts
    • Extended the ModelConfigKey interface with an optional isChatModel boolean property, along with a descriptive comment.
    • Updated the getResolvedConfig method to pass the context.isChatModel value to the resolveAliasChain method.
    • Modified the resolveAliasChain method signature to accept an optional isChatModel parameter.
    • Implemented new logic within resolveAliasChain to check for isChatModel and the existence of a chat-base alias, then recursively resolve the chat-base configuration as a fallback for unrecognized models.

@gemini-code-assist gemini-code-assist bot left a comment
Code Review

This pull request introduces a fallback mechanism for unrecognized chat models to inherit the configuration from chat-base. The implementation is mostly correct and well-tested for the primary scenarios. However, I've identified a significant design issue in how the fallback is handled, which prevents overrides on the underlying base model of chat-base from being applied. This could lead to inconsistent behavior and should be addressed to ensure proper inheritance.

Comment on lines +222 to +226
return {
aliasChain: [...fallbackResolution.aliasChain, requestedModel],
baseModel: requestedModel,
resolvedConfig: fallbackResolution.resolvedConfig,
};
Severity: high

This implementation correctly preserves the requested model name, which is good. However, by setting baseModel: requestedModel, it prevents any overrides that target the actual base model of chat-base (e.g., gemini-pro) from being applied. This can lead to surprising behavior where an unknown model doesn't fully inherit the behavior of chat-base.

For example, if chat-base uses gemini-pro and there is an override defined for gemini-pro, that override will not be applied when resolving an unrecognized model via this fallback logic.

To fix this, fallbackResolution.baseModel should be used for override matching, while still ensuring the final resolved model is requestedModel. This likely requires a more significant refactoring, for example, by separating the model used for override resolution from the final resolved model name within internalGetResolvedConfig.
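The gap the review describes can be illustrated with a toy override step (all names and structures here, including Resolution, applyOverrides, and the gemini-pro override, are hypothetical, not the actual internalGetResolvedConfig):

```typescript
// Toy illustration of the override-matching gap flagged in review.
interface Resolution {
  aliasChain: string[];
  baseModel: string;
  resolvedConfig: Record<string, number>;
}

// Overrides are matched by base model name.
const overrides: Record<string, Record<string, number>> = {
  'gemini-pro': { temperature: 0.7 },
};

function applyOverrides(res: Resolution): Resolution {
  const extra = overrides[res.baseModel] ?? {};
  return { ...res, resolvedConfig: { ...res.resolvedConfig, ...extra } };
}

// As merged: the fallback reports the requested (unknown) model as the
// base model, so an override targeting gemini-pro never matches.
const asMerged: Resolution = {
  aliasChain: ['chat-base', 'my-custom-model'],
  baseModel: 'my-custom-model',
  resolvedConfig: { temperature: 1 },
};

// Reviewer's suggestion: match overrides against chat-base's real base
// model, and only report the requested name as the final resolved model.
const asSuggested: Resolution = { ...asMerged, baseModel: 'gemini-pro' };

console.log(applyOverrides(asMerged).resolvedConfig);    // override skipped
console.log(applyOverrides(asSuggested).resolvedConfig); // override applied
```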

@github-actions

github-actions bot commented Feb 13, 2026

Size Change: +578 B (0%)

Total Size: 24.4 MB

Filename Size Change
./bundle/gemini.js 24.4 MB +578 B (0%)
./bundle/sandbox-macos-permissive-open.sb 890 B 0 B
./bundle/sandbox-macos-permissive-proxied.sb 1.31 kB 0 B
./bundle/sandbox-macos-restrictive-open.sb 3.36 kB 0 B
./bundle/sandbox-macos-restrictive-proxied.sb 3.56 kB 0 B
./bundle/sandbox-macos-strict-open.sb 4.82 kB 0 B
./bundle/sandbox-macos-strict-proxied.sb 5.02 kB 0 B

compressed-size-action

@SandyTao520 SandyTao520 added this pull request to the merge queue Feb 13, 2026
Merged via the queue into main with commit e844a57 Feb 13, 2026
26 of 27 checks passed
@SandyTao520 SandyTao520 deleted the st/feat/custom-model-fallback-chat-base branch February 13, 2026 19:12