feat(core): fallback to chat-base when using unrecognized models for chat #19016
SandyTao520 merged 2 commits into main
Conversation
Previously, passing an unknown model to the CLI via `-m custom-model` resulted in an empty generation configuration, stripping standard interactive chat defaults like `temperature: 1`, `topP: 0.95`, and `topK: 64`. This change updates the model configuration service to fall back to the `chat-base` configuration whenever `-m` supplies an unrecognized custom model for the primary interactive chat session. Other internal background model queries (e.g., the summarizer) remain unaffected.
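The fallback described above can be sketched roughly as follows. Only `ModelConfigKey`, `isChatModel`, `resolveAliasChain`, and the `chat-base` defaults (`temperature: 1`, `topP: 0.95`, `topK: 64`) come from this PR; every other name and shape here is a hypothetical illustration, not the actual Gemini CLI implementation.

```typescript
// Sketch of the chat-base fallback, under assumed types and data.
interface ModelConfigKey {
  model: string;
  /** True when the request comes from the primary interactive chat session. */
  isChatModel?: boolean;
}

interface ResolvedModelConfig {
  aliasChain: string[];
  baseModel: string;
  resolvedConfig: Record<string, unknown>;
}

// Illustrative alias table; the real service resolves these differently.
const KNOWN_ALIASES: Record<string, ResolvedModelConfig> = {
  'chat-base': {
    aliasChain: ['chat-base'],
    baseModel: 'gemini-pro',
    resolvedConfig: { temperature: 1, topP: 0.95, topK: 64 },
  },
};

function resolveAliasChain(key: ModelConfigKey): ResolvedModelConfig {
  const known = KNOWN_ALIASES[key.model];
  if (known) return known;

  // Unrecognized model: fall back to chat-base only for interactive chat,
  // so background queries (e.g. the summarizer) keep an empty config.
  if (key.isChatModel) {
    const fallback = KNOWN_ALIASES['chat-base'];
    return {
      aliasChain: [...fallback.aliasChain, key.model],
      baseModel: key.model,
      resolvedConfig: fallback.resolvedConfig,
    };
  }
  return { aliasChain: [key.model], baseModel: key.model, resolvedConfig: {} };
}
```

The key design point is that the branch is gated on `isChatModel`, so a background caller that omits the flag still gets an empty `resolvedConfig` for unknown aliases.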
Hi @SandyTao520, thank you so much for your contribution to Gemini CLI! We really appreciate the time and effort you've put into this. We're making some updates to our contribution process to improve how we track and review changes. Please take a moment to review our recent discussion post: Improving Our Contribution Process & Introducing New Guidelines. Key Update: Starting January 26, 2026, the Gemini CLI project will require all pull requests to be associated with an existing issue. Any pull requests not linked to an issue by that date will be automatically closed. Thank you for your understanding and for being a part of our community!
Summary of Changes

Hello @SandyTao520, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request introduces a significant improvement to the Model Config Service by implementing a robust fallback mechanism: when an unrecognized model is requested for an interactive chat session through the CLI, it will now seamlessly inherit the `chat-base` configuration.

Highlights
Changelog
Activity
Code Review
This pull request introduces a fallback mechanism for unrecognized chat models to inherit the configuration from chat-base. The implementation is mostly correct and well-tested for the primary scenarios. However, I've identified a significant design issue in how the fallback is handled, which prevents overrides on the underlying base model of chat-base from being applied. This could lead to inconsistent behavior and should be addressed to ensure proper inheritance.
```ts
return {
  aliasChain: [...fallbackResolution.aliasChain, requestedModel],
  baseModel: requestedModel,
  resolvedConfig: fallbackResolution.resolvedConfig,
};
```
This implementation correctly preserves the requested model name, which is good. However, by setting `baseModel: requestedModel`, it prevents any overrides that target the actual base model of `chat-base` (e.g., `gemini-pro`) from being applied. This can lead to surprising behavior where an unknown model doesn't fully inherit the behavior of `chat-base`.

For example, if `chat-base` uses `gemini-pro` and there is an override defined for `gemini-pro`, that override will not be applied when resolving an unrecognized model via this fallback logic.

To fix this, `fallbackResolution.baseModel` should be used for override matching, while still ensuring the final resolved model is `requestedModel`. This likely requires a more significant refactoring, for example, by separating the model used for override resolution from the final resolved model name within `internalGetResolvedConfig`.
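The separation the reviewer suggests could look roughly like this. Apart from `fallbackResolution`, `requestedModel`, and `baseModel`, which appear in the diff and comment above, every name here (`Resolution`, `resolvedModel`, `applyFallback`) is an illustrative assumption, not the actual Gemini CLI code:

```typescript
// Sketch: keep a separate field for the model used in override matching
// versus the name reported back to the caller.
interface Resolution {
  aliasChain: string[];
  baseModel: string; // model used for override matching
  resolvedModel: string; // final model name surfaced to the caller
  resolvedConfig: Record<string, unknown>;
}

function applyFallback(
  fallbackResolution: Resolution,
  requestedModel: string,
): Resolution {
  return {
    aliasChain: [...fallbackResolution.aliasChain, requestedModel],
    // Preserve chat-base's underlying base model (e.g. gemini-pro) so that
    // overrides targeting it still match during resolution.
    baseModel: fallbackResolution.baseModel,
    // Surface the user's requested model name as the final resolved model.
    resolvedModel: requestedModel,
    resolvedConfig: fallbackResolution.resolvedConfig,
  };
}
```

With this shape, an override keyed on `gemini-pro` would still match after the fallback, while the CLI continues to display the custom model name the user passed with `-m`.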
Size Change: +578 B (0%). Total Size: 24.4 MB.
Summary
This PR updates the Model Config Service so that when an unrecognized model is requested via the CLI (e.g., `-m my-custom-model`), it automatically inherits the `chat-base` generation configuration. Previously, using an unknown model resulted in an empty configuration, stripping away standard interactive chat defaults like `temperature: 1`, `topP: 0.95` and `topK: 64`.

Details

This change adds an `isChatModel` property to the `ModelConfigKey` interface to signify when a configuration request originates from the primary interactive chat session. The `resolveAliasChain` method has been updated to conditionally apply the `chat-base` fallback only when `isChatModel` is true. This ensures that internal background models (e.g., the summarizer) are not inadvertently affected and continue to receive an empty fallback config when encountering unrecognized aliases.

Related Issues
#18007
How to Validate
- Run `npm start -- -m my-custom-model`.
- Verify the interactive session inherits the `chat-base` defaults (e.g., it shouldn't feel restricted to default 0 temperature, etc.).
- Verify `chat-base` overrides.

Pre-Merge Checklist