add gpt 5.2 models to model meta #166
Conversation
Warning: Rate limit exceeded. @AlemTuzlak has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 0 minutes and 25 seconds before requesting another review. After the wait time has elapsed, a review can be triggered again. We recommend spacing out commits to avoid hitting the rate limit. CodeRabbit enforces hourly rate limits for each developer per organization; paid plans have higher rate limits than the trial, open-source, and free plans. In all cases, further reviews are re-allowed after a brief timeout. Please see our FAQ for further information.

Files selected for processing (1)
Walkthrough

Adds three GPT-5.2 family models (GPT5_2, GPT5_2_PRO, GPT5_2_CHAT) with full metadata and type assertions; updates the chat-model list and extends type mappings for provider options and input modalities. Also adds a changeset entry announcing support for gpt 5.2.
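The walkthrough describes model entries defined as metadata objects with type assertions. A minimal sketch of that pattern follows; the field names and values here are illustrative assumptions, not the actual `ModelMeta` shape from the PR:

```typescript
// Hypothetical stand-in for the ModelMeta contract in model-meta.ts;
// field names and values are assumptions for illustration only.
interface SketchModelMeta {
  name: string;
  features: ReadonlyArray<string>;
  supports: { input: ReadonlyArray<string> };
}

// `as const satisfies` keeps the literal types (needed for per-model
// type safety) while still checking the object against the contract.
const GPT5_2_SKETCH = {
  name: 'gpt-5.2',
  features: ['function_calling', 'structured_outputs'],
  supports: { input: ['text', 'image'] },
} as const satisfies SketchModelMeta;
```

Because the literal types survive, downstream mappings can derive exact option and modality types per model rather than widening everything to `string`.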
Estimated code review effort: 2 (Simple) | ~10 minutes
Pre-merge checks and finishing touches: Passed checks (3 passed)
View your CI Pipeline Execution for commit c885dd6
Nx Cloud last updated this comment at
@tanstack/ai
@tanstack/ai-anthropic
@tanstack/ai-client
@tanstack/ai-devtools-core
@tanstack/ai-gemini
@tanstack/ai-ollama
@tanstack/ai-openai
@tanstack/ai-react
@tanstack/ai-react-ui
@tanstack/ai-solid
@tanstack/ai-solid-ui
@tanstack/ai-svelte
@tanstack/ai-vue
@tanstack/ai-vue-ui
@tanstack/react-ai-devtools
@tanstack/solid-ai-devtools
Actionable comments posted: 1
Nitpick comments (1)
packages/typescript/ai-openai/src/model-meta.ts (1)
135-162: Consider adding `OpenAIToolsOptions` for consistency with `function_calling` support.

The model includes `function_calling` in features but omits `OpenAIToolsOptions` from provider options. Compare with `GPT_5_1_CHAT` (lines 1560-1567), which has the same `function_calling` feature but includes `OpenAIToolsOptions`. If users can call custom tools via `function_calling`, they may need access to the `tool_choice`, `parallel_tool_calls`, and `max_tool_calls` options.

Suggested fix:
```diff
 } as const satisfies ModelMeta<
   OpenAIBaseOptions &
     OpenAIReasoningOptions &
     OpenAIStructuredOutputOptions &
+    OpenAIToolsOptions &
     OpenAIStreamingOptions &
     OpenAIMetadataOptions
 >
```

Also update the type mapping at lines 1769-1773:

```diff
 [GPT5_2_CHAT.name]: OpenAIBaseOptions &
   OpenAIReasoningOptions &
   OpenAIStructuredOutputOptions &
+  OpenAIToolsOptions &
   OpenAIStreamingOptions &
   OpenAIMetadataOptions
```
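The practical effect of the suggested intersection is that tool-related settings become part of the model's options type. A sketch with simplified stand-in interfaces (the real `OpenAIToolsOptions` in text-provider-options.ts carries more fields):

```typescript
// Simplified stand-ins for the option interfaces; the real ones live in
// packages/typescript/ai-openai/src/text/text-provider-options.ts.
interface BaseOptions {
  temperature?: number;
}
interface ToolsOptions {
  tool_choice?: 'auto' | 'none' | 'required';
  parallel_tool_calls?: boolean;
  max_tool_calls?: number;
}

// With ToolsOptions in the intersection, callers can pass tool settings
// without a cast; omitting it would make these properties type errors.
type Gpt52ChatOptions = BaseOptions & ToolsOptions;

const options: Gpt52ChatOptions = {
  temperature: 0.2,
  tool_choice: 'auto',
  parallel_tool_calls: false,
};
```

This is why the review treats the omission as more than cosmetic: without the intersection, `tool_choice` and friends are rejected at compile time even though the model supports function calling.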
Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Files selected for processing (1)

- packages/typescript/ai-openai/src/model-meta.ts (4 hunks)
Additional context used

Path-based instructions (3)
**/*.{ts,tsx}
CodeRabbit inference engine (CLAUDE.md)
**/*.{ts,tsx}: Use tree-shakeable adapter architecture for provider implementations - export specialized adapters (text, embedding, summarize, image) as separate imports from the `/adapters` subpath rather than monolithic adapters
Use Zod for runtime schema validation and type inference, particularly for tool input/output definitions with `toolDefinition()` and Zod schema inference
Implement isomorphic tool system using `toolDefinition()` with `.server()` and `.client()` implementations for dual-environment execution
Use type-safe per-model configuration with provider options typed based on selected model to ensure compile-time safety
Implement stream processing with StreamProcessor for handling chunked responses and support partial JSON parsing for streaming AI responses
Files:
packages/typescript/ai-openai/src/model-meta.ts
**/*.{ts,tsx,js,jsx}
CodeRabbit inference engine (CLAUDE.md)
Use camelCase for function and variable names throughout the codebase
Files:
packages/typescript/ai-openai/src/model-meta.ts
packages/typescript/*/src/model-meta.ts
CodeRabbit inference engine (CLAUDE.md)
Maintain model metadata files that define provider options and capabilities per model for per-model type safety
Files:
packages/typescript/ai-openai/src/model-meta.ts
Learnings (2)

Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/model-meta.ts : Maintain model metadata files that define provider options and capabilities per model for per-model type safety
Applied to files:
packages/typescript/ai-openai/src/model-meta.ts
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to **/*.{ts,tsx} : Use type-safe per-model configuration with provider options typed based on selected model to ensure compile-time safety
Applied to files:
packages/typescript/ai-openai/src/model-meta.ts
Code graph analysis (1)
packages/typescript/ai-openai/src/model-meta.ts (1)
packages/typescript/ai-openai/src/text/text-provider-options.ts (6)
OpenAIBaseOptions (17-125), OpenAIReasoningOptions (136-156), OpenAIStructuredOutputOptions (184-190), OpenAIToolsOptions (192-213), OpenAIStreamingOptions (215-225), OpenAIMetadataOptions (227-235)
Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Preview
- GitHub Check: Test
Additional comments (4)
packages/typescript/ai-openai/src/model-meta.ts (4)
107-133: LGTM! The GPT5_2_PRO model definition is internally consistent. The provider options correctly omit `OpenAIStructuredOutputOptions` since `structured_outputs` is not in the features array.
1648-1652: LGTM! The new models are correctly added to the `OPENAI_CHAT_MODELS` array under the "Frontier models" section.
1762-1778: Type mappings are consistent with model definitions. The provider options type mappings for the three new models correctly mirror their respective `ModelMeta` type parameters. Note that if `OpenAIToolsOptions` is added to `GPT5_2_CHAT` per the earlier suggestion, this mapping should be updated accordingly.
1947-1950: LGTM! The input modalities type mappings correctly reference the `supports.input` arrays from the new model constants.
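The `typeof`-driven modality mapping approved above can be illustrated with a small sketch; the model constants here are placeholders, not the real `GPT5_2` definitions:

```typescript
// Placeholder model constants; the real ones live in model-meta.ts.
const MODEL_A = {
  name: 'model-a',
  supports: { input: ['text', 'image'] },
} as const;

const MODEL_B = {
  name: 'model-b',
  supports: { input: ['text'] },
} as const;

// Deriving each entry from `typeof MODEL.supports.input` means the mapping
// can never drift from the model definition it references.
type InputModalitiesByName = {
  [MODEL_A.name]: typeof MODEL_A.supports.input;
  [MODEL_B.name]: typeof MODEL_B.supports.input;
};

const modalitiesA: InputModalitiesByName['model-a'] = MODEL_A.supports.input;
```

If a modality were added to or removed from a model constant, every lookup through the mapping would update automatically, which is the consistency property the review is checking for.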
Actionable comments posted: 0
Nitpick comments (1)
packages/typescript/ai-openai/src/model-meta.ts (1)
135-162: Consider adding `OpenAIToolsOptions` since `function_calling` is supported.

The model has `function_calling` in features (line 144) but the type assertion omits `OpenAIToolsOptions`. Even with an empty `tools` array, users can still define custom tools and would benefit from options like `tool_choice`, `max_tool_calls`, and `parallel_tool_calls`. For comparison, `GPT_5_1_CHAT` (lines 1560-1567) includes `OpenAIToolsOptions` despite having no built-in tools.

Proposed fix:
```diff
 } as const satisfies ModelMeta<
   OpenAIBaseOptions &
     OpenAIReasoningOptions &
     OpenAIStructuredOutputOptions &
+    OpenAIToolsOptions &
     OpenAIStreamingOptions &
     OpenAIMetadataOptions
 >
```

Also update the corresponding entry in `OpenAIChatModelProviderOptionsByName` (lines 1769-1773) to include `OpenAIToolsOptions`.
Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Files selected for processing (2)

- .changeset/spotty-tables-drum.md (1 hunks)
- packages/typescript/ai-openai/src/model-meta.ts (4 hunks)
Files skipped from review due to trivial changes (1)
- .changeset/spotty-tables-drum.md
Additional context used

Path-based instructions (3)
**/*.{ts,tsx}
CodeRabbit inference engine (CLAUDE.md)
**/*.{ts,tsx}: Use tree-shakeable adapter architecture for provider implementations - export specialized adapters (text, embedding, summarize, image) as separate imports from the `/adapters` subpath rather than monolithic adapters
Use Zod for runtime schema validation and type inference, particularly for tool input/output definitions with `toolDefinition()` and Zod schema inference
Implement isomorphic tool system using `toolDefinition()` with `.server()` and `.client()` implementations for dual-environment execution
Use type-safe per-model configuration with provider options typed based on selected model to ensure compile-time safety
Implement stream processing with StreamProcessor for handling chunked responses and support partial JSON parsing for streaming AI responses
Files:
packages/typescript/ai-openai/src/model-meta.ts
**/*.{ts,tsx,js,jsx}
CodeRabbit inference engine (CLAUDE.md)
Use camelCase for function and variable names throughout the codebase
Files:
packages/typescript/ai-openai/src/model-meta.ts
packages/typescript/*/src/model-meta.ts
CodeRabbit inference engine (CLAUDE.md)
Maintain model metadata files that define provider options and capabilities per model for per-model type safety
Files:
packages/typescript/ai-openai/src/model-meta.ts
Learnings (2)

Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/model-meta.ts : Maintain model metadata files that define provider options and capabilities per model for per-model type safety
Applied to files:
packages/typescript/ai-openai/src/model-meta.ts
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to **/*.{ts,tsx} : Use type-safe per-model configuration with provider options typed based on selected model to ensure compile-time safety
Applied to files:
packages/typescript/ai-openai/src/model-meta.ts
Code graph analysis (1)
packages/typescript/ai-openai/src/model-meta.ts (1)
packages/typescript/ai-openai/src/text/text-provider-options.ts (6)
OpenAIBaseOptions (17-125), OpenAIReasoningOptions (136-156), OpenAIStructuredOutputOptions (184-190), OpenAIToolsOptions (192-213), OpenAIStreamingOptions (215-225), OpenAIMetadataOptions (227-235)
Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Preview
- GitHub Check: Test
Additional comments (5)
packages/typescript/ai-openai/src/model-meta.ts (5)
107-133: LGTM! The PRO variant follows established patterns: higher pricing (~12x), no cached pricing tier, and a type assertion that correctly excludes `OpenAIStructuredOutputOptions` since `structured_outputs` is not in features.
1650-1652: LGTM! The new models are correctly placed at the top of the frontier models section, maintaining the pattern of listing newer models first.
1763-1778: Type mappings are consistent with model definitions. The type entries correctly mirror each model's type assertion. Note: if `OpenAIToolsOptions` is added to `GPT5_2_CHAT`'s model definition as suggested above, this entry (lines 1769-1773) should also be updated accordingly.
1948-1950: LGTM! The input modality mappings correctly use `typeof` to reference each model's `supports.input`, ensuring type-safe consistency with the model definitions.
67-106: Verify PRO model pricing and API compatibility.

GPT-5.2 is priced at $1.75/1M input tokens and $14/1M output tokens, which differs from the pricing comment's suggestion of a 12x premium. More critically, GPT-5.2 pro is available in the Responses API only, not the Chat Completions API. If GPT5_2_PRO is listed in OPENAI_CHAT_MODELS, remove it or ensure it is correctly restricted to Responses API use only. Confirm the PRO variant pricing in the code matches the official API rates.
Skipped due to learnings
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/model-meta.ts : Maintain model metadata files that define provider options and capabilities per model for per-model type safety
* add gpt 5.2 models to model meta
* ci: apply automated fixes
* changeeset
* update cutoff dates
* ci: apply automated fixes
* update cutoff dates

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Changes
Adds gpt5.2 models
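The per-model option mappings this PR extends exist so that a call site keyed on a model name only accepts that model's options. A hedged sketch of the idea; the `chat` function and option shapes below are stand-ins, not the actual @tanstack/ai-openai API:

```typescript
// Stand-in per-model option map; the real mapping is
// OpenAIChatModelProviderOptionsByName in model-meta.ts.
type ProviderOptionsByName = {
  'gpt-5.2': { reasoning_effort?: 'low' | 'medium' | 'high' };
  'gpt-5.2-chat': { verbosity?: 'low' | 'medium' | 'high' };
};

// Generic over the model name, so the options argument is checked
// against that specific model's options type at compile time.
function chat<Name extends keyof ProviderOptionsByName>(
  model: Name,
  options: ProviderOptionsByName[Name],
): string {
  return `${model}:${JSON.stringify(options)}`;
}

const call = chat('gpt-5.2', { reasoning_effort: 'high' });
// chat('gpt-5.2', { verbosity: 'low' }) would fail to type-check.
```

This is the "type-safe per-model configuration" guideline the CodeRabbit learnings refer to: invalid option/model pairings are rejected before the code runs.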
Checklist

- `pnpm run test:pr`

Release Impact
Summary by CodeRabbit
Release Notes
Tip: You can customize this high-level summary in your review settings.