
Conversation

Contributor

@AlemTuzlak AlemTuzlak commented Dec 20, 2025

🎯 Changes

Adds GPT-5.2 models

βœ… Checklist

  • I have followed the steps in the Contributing guide.
  • I have tested this code locally with pnpm run test:pr.

πŸš€ Release Impact

  • This change affects published code, and I have generated a changeset.
  • This change is docs/CI/dev-only (no release).

Summary by CodeRabbit

Release Notes

  • New Features
    • Added support for three new OpenAI GPT‑5.2 models (standard, Pro, and Chat) with full chat availability, extended context windows, updated token limits, and comprehensive input/output modality support.
  • Chores
    • Published a changeset marking a patch release to roll out GPT‑5.2 support.


Contributor

coderabbitai bot commented Dec 20, 2025

Warning

Rate limit exceeded

@AlemTuzlak has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 0 minutes and 25 seconds before requesting another review.

βŒ› How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

πŸ“₯ Commits

Reviewing files that changed from the base of the PR and between 0a13988 and c885dd6.

πŸ“’ Files selected for processing (1)
  • packages/typescript/ai-openai/src/model-meta.ts (4 hunks)

Walkthrough

Adds three GPT-5.2 family models (GPT5_2, GPT5_2_PRO, GPT5_2_CHAT) with full metadata and type assertions; updates the chat-model list and extends the type mappings for provider options and input modalities. Also adds a changeset entry announcing support for GPT-5.2.
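The `as const satisfies ModelMeta<...>` pattern the walkthrough mentions can be sketched in isolation. The `ModelMeta` shape, the option interfaces, and all metadata values below are simplified illustrative stand-ins, not the package's real definitions or the official GPT-5.2 specs:

```typescript
// Simplified stand-in for the package's ModelMeta type; the real one
// lives in @tanstack/ai-openai and carries more fields (pricing, etc.).
interface ModelMeta<TProviderOptions> {
  name: string
  context_window: number
  max_output_tokens: number
  supports: { input: ReadonlyArray<string>; output: ReadonlyArray<string> }
}

// Hypothetical option interfaces standing in for OpenAIBaseOptions etc.
interface BaseOptions { temperature?: number }
interface ReasoningOptions { reasoning_effort?: 'low' | 'medium' | 'high' }

// `as const satisfies` checks the object against ModelMeta while keeping
// the literal types (the exact name string, the modality tuple), which is
// what makes the per-model type mappings possible downstream.
const GPT5_2 = {
  name: 'gpt-5.2',
  context_window: 400_000,    // illustrative value, not the official spec
  max_output_tokens: 128_000, // illustrative value, not the official spec
  supports: { input: ['text', 'image'], output: ['text'] },
} as const satisfies ModelMeta<BaseOptions & ReasoningOptions>

// Inserting the name into the chat-model list keeps it a literal tuple.
const OPENAI_CHAT_MODELS = [GPT5_2.name] as const

console.log(OPENAI_CHAT_MODELS[0])
```

Because the constant keeps its literal type, downstream lookups like `OpenAIChatModelProviderOptionsByName[typeof GPT5_2.name]` can resolve to a model-specific options type at compile time.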

Changes

| Cohort / File(s) | Change Summary |
| --- | --- |
| **Model metadata & types**<br>`packages/typescript/ai-openai/src/model-meta.ts` | Adds GPT5_2, GPT5_2_PRO, GPT5_2_CHAT model metadata objects (name, context_window, max_output_tokens, knowledge_cutoff, supports, pricing) using `as const satisfies ModelMeta<...>`. Inserts their names into OPENAI_CHAT_MODELS. Extends OpenAIChatModelProviderOptionsByName and OpenAIModelInputModalitiesByName with entries for the three models. |
| **Release changeset**<br>`.changeset/spotty-tables-drum.md` | New changeset entry for a patch release of `@tanstack/ai-openai` with description "add support for gpt 5.2 models". No code changes beyond metadata/types. |
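A changeset entry like the one described follows the standard changesets file format: YAML-style front matter naming the package and bump type, then the release note. Sketched from the description in this walkthrough:

```md
---
"@tanstack/ai-openai": patch
---

add support for gpt 5.2 models
```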

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

  • Inspect model-meta.ts additions for consistent structure and correct use of satisfies ModelMeta<...>.
  • Verify OPENAI_CHAT_MODELS insertion order and any runtime assumptions relying on that list.
  • Check extended types (OpenAIChatModelProviderOptionsByName, OpenAIModelInputModalitiesByName) compile and map to the right supports shapes.
  • Quick review of .changeset for accuracy of release note.

Poem

🐰 I found three models in the glade,
GPT5_2 family, neatly laid,
Types aligned and metadata spun,
A rabbit hops β€” the work is done! ✨

Pre-merge checks and finishing touches

βœ… Passed checks (3 passed)
| Check name | Status | Explanation |
| --- | --- | --- |
| Title check | βœ… Passed | The pull request title accurately summarizes the main change: adding GPT 5.2 models to the model metadata configuration. |
| Description check | βœ… Passed | The PR description follows the required template structure with all sections present, but the Changes section lacks detail about what specific models were added and their features. |
| Docstring Coverage | βœ… Passed | No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check. |



nx-cloud bot commented Dec 20, 2025

View your CI Pipeline Execution β†— for commit c885dd6

| Command | Status | Duration | Result |
| --- | --- | --- | --- |
| `nx affected --targets=test:sherif,test:knip,tes...` | βœ… Succeeded | 1m 6s | View β†— |
| `nx run-many --targets=build --exclude=examples/**` | βœ… Succeeded | 37s | View β†— |

☁️ Nx Cloud last updated this comment at 2025-12-20 14:32:49 UTC


pkg-pr-new bot commented Dec 20, 2025

Open in StackBlitz

- `@tanstack/ai`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai@166`
- `@tanstack/ai-anthropic`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-anthropic@166`
- `@tanstack/ai-client`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-client@166`
- `@tanstack/ai-devtools-core`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-devtools-core@166`
- `@tanstack/ai-gemini`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-gemini@166`
- `@tanstack/ai-ollama`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-ollama@166`
- `@tanstack/ai-openai`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-openai@166`
- `@tanstack/ai-react`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-react@166`
- `@tanstack/ai-react-ui`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-react-ui@166`
- `@tanstack/ai-solid`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-solid@166`
- `@tanstack/ai-solid-ui`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-solid-ui@166`
- `@tanstack/ai-svelte`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-svelte@166`
- `@tanstack/ai-vue`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-vue@166`
- `@tanstack/ai-vue-ui`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-vue-ui@166`
- `@tanstack/react-ai-devtools`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/react-ai-devtools@166`
- `@tanstack/solid-ai-devtools`: `npm i https://pkg.pr.new/TanStack/ai/@tanstack/solid-ai-devtools@166`

commit: c885dd6


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
packages/typescript/ai-openai/src/model-meta.ts (1)

135-162: Consider adding OpenAIToolsOptions for consistency with function_calling support.

The model includes function_calling in features but omits OpenAIToolsOptions from provider options. Compare with GPT_5_1_CHAT (lines 1560-1567), which has the same function_calling feature but includes OpenAIToolsOptions.

If users can call custom tools via function_calling, they may need access to tool_choice, parallel_tool_calls, and max_tool_calls options.

πŸ”Ž Suggested fix

```diff
 } as const satisfies ModelMeta<
   OpenAIBaseOptions &
     OpenAIReasoningOptions &
     OpenAIStructuredOutputOptions &
+    OpenAIToolsOptions &
     OpenAIStreamingOptions &
     OpenAIMetadataOptions
 >
```

Also update the type mapping at lines 1769-1773:

```diff
   [GPT5_2_CHAT.name]: OpenAIBaseOptions &
     OpenAIReasoningOptions &
     OpenAIStructuredOutputOptions &
+    OpenAIToolsOptions &
     OpenAIStreamingOptions &
     OpenAIMetadataOptions
```
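The effect of the suggested fix is purely at the type level: adding `OpenAIToolsOptions` to the intersection widens what callers may pass as provider options. A minimal sketch of that mechanism (the interface names mirror the review, but the fields shown are illustrative, not the package's real shapes):

```typescript
// Illustrative stand-ins for the package's option interfaces.
interface OpenAIStructuredOutputOptions {
  response_format?: { type: 'json_schema' }
}
interface OpenAIToolsOptions {
  tool_choice?: 'auto' | 'none' | 'required'
  parallel_tool_calls?: boolean
  max_tool_calls?: number
}

// Without the fix, tool-related fields are rejected at compile time:
// const bad: OpenAIStructuredOutputOptions = { parallel_tool_calls: false } // error

// With OpenAIToolsOptions in the intersection, they type-check:
type ChatProviderOptions = OpenAIStructuredOutputOptions & OpenAIToolsOptions

const opts: ChatProviderOptions = {
  response_format: { type: 'json_schema' },
  tool_choice: 'auto',
  parallel_tool_calls: false,
}

console.log(Object.keys(opts).sort().join(','))
```

An intersection type accepts every member's fields at once, which is why the same addition must be made both in the `satisfies ModelMeta<...>` assertion and in the by-name mapping, or the two would disagree about the model's options.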
πŸ“œ Review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

πŸ“₯ Commits

Reviewing files that changed from the base of the PR and between 254b248 and d691a44.

πŸ“’ Files selected for processing (1)
  • packages/typescript/ai-openai/src/model-meta.ts (4 hunks)
🧰 Additional context used
πŸ““ Path-based instructions (3)
**/*.{ts,tsx}

πŸ“„ CodeRabbit inference engine (CLAUDE.md)

**/*.{ts,tsx}: Use tree-shakeable adapter architecture for provider implementations - export specialized adapters (text, embedding, summarize, image) as separate imports from /adapters subpath rather than monolithic adapters
Use Zod for runtime schema validation and type inference, particularly for tool input/output definitions with toolDefinition() and Zod schema inference
Implement isomorphic tool system using toolDefinition() with .server() and .client() implementations for dual-environment execution
Use type-safe per-model configuration with provider options typed based on selected model to ensure compile-time safety
Implement stream processing with StreamProcessor for handling chunked responses and support partial JSON parsing for streaming AI responses

Files:

  • packages/typescript/ai-openai/src/model-meta.ts
**/*.{ts,tsx,js,jsx}

πŸ“„ CodeRabbit inference engine (CLAUDE.md)

Use camelCase for function and variable names throughout the codebase

Files:

  • packages/typescript/ai-openai/src/model-meta.ts
packages/typescript/*/src/model-meta.ts

πŸ“„ CodeRabbit inference engine (CLAUDE.md)

Maintain model metadata files that define provider options and capabilities per model for per-model type safety

Files:

  • packages/typescript/ai-openai/src/model-meta.ts
🧠 Learnings (2)
πŸ“š Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/model-meta.ts : Maintain model metadata files that define provider options and capabilities per model for per-model type safety

Applied to files:

  • packages/typescript/ai-openai/src/model-meta.ts
πŸ“š Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to **/*.{ts,tsx} : Use type-safe per-model configuration with provider options typed based on selected model to ensure compile-time safety

Applied to files:

  • packages/typescript/ai-openai/src/model-meta.ts
🧬 Code graph analysis (1)
packages/typescript/ai-openai/src/model-meta.ts (1)
packages/typescript/ai-openai/src/text/text-provider-options.ts (6)
  • OpenAIBaseOptions (17-125)
  • OpenAIReasoningOptions (136-156)
  • OpenAIStructuredOutputOptions (184-190)
  • OpenAIToolsOptions (192-213)
  • OpenAIStreamingOptions (215-225)
  • OpenAIMetadataOptions (227-235)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Preview
  • GitHub Check: Test
πŸ”‡ Additional comments (4)
packages/typescript/ai-openai/src/model-meta.ts (4)

107-133: LGTM!

The GPT5_2_PRO model definition is internally consistent. The provider options correctly omit OpenAIStructuredOutputOptions since structured_outputs is not in the features array.


1648-1652: LGTM!

The new models are correctly added to the OPENAI_CHAT_MODELS array under the "Frontier models" section.


1762-1778: Type mappings are consistent with model definitions.

The provider options type mappings for the three new models correctly mirror their respective ModelMeta type parameters. Note that if OpenAIToolsOptions is added to GPT5_2_CHAT per the earlier suggestion, this mapping should be updated accordingly.


1947-1950: LGTM!

The input modalities type mappings correctly reference the supports.input arrays from the new model constants.
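The `typeof`-based mapping this comment approves can be sketched as follows; the model names and modality lists here are illustrative placeholders, not the package's actual values:

```typescript
// Model constants declared `as const` so their fields keep literal types.
const GPT5_2 = {
  name: 'gpt-5.2',
  supports: { input: ['text', 'image'] },
} as const

const GPT5_2_CHAT = {
  name: 'gpt-5.2-chat', // hypothetical identifier for illustration
  supports: { input: ['text', 'image'] },
} as const

// Computed keys with literal-typed names, each mapped to the literal
// tuple type of that model's supports.input. The map cannot drift from
// the model definitions, because it is derived from them via `typeof`.
type InputModalitiesByName = {
  [GPT5_2.name]: typeof GPT5_2.supports.input
  [GPT5_2_CHAT.name]: typeof GPT5_2_CHAT.supports.input
}

const modalities: InputModalitiesByName['gpt-5.2'] = GPT5_2.supports.input
console.log(modalities.join('+'))
```

Computed property names in a type literal are permitted here because each `name` has a literal string type under `as const`.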


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
packages/typescript/ai-openai/src/model-meta.ts (1)

135-162: Consider adding OpenAIToolsOptions since function_calling is supported.

The model has function_calling in features (line 144) but the type assertion omits OpenAIToolsOptions. Even with an empty tools array, users can still define custom tools and would benefit from options like tool_choice, max_tool_calls, and parallel_tool_calls.

For comparison, GPT_5_1_CHAT (lines 1560-1567) includes OpenAIToolsOptions despite having no built-in tools.

πŸ”Ž Proposed fix

```diff
 } as const satisfies ModelMeta<
   OpenAIBaseOptions &
     OpenAIReasoningOptions &
     OpenAIStructuredOutputOptions &
+    OpenAIToolsOptions &
     OpenAIStreamingOptions &
     OpenAIMetadataOptions
 >
```

Also update the corresponding entry in OpenAIChatModelProviderOptionsByName (lines 1769-1773) to include OpenAIToolsOptions.

πŸ“œ Review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

πŸ“₯ Commits

Reviewing files that changed from the base of the PR and between d691a44 and 0a13988.

πŸ“’ Files selected for processing (2)
  • .changeset/spotty-tables-drum.md (1 hunks)
  • packages/typescript/ai-openai/src/model-meta.ts (4 hunks)
βœ… Files skipped from review due to trivial changes (1)
  • .changeset/spotty-tables-drum.md
🧰 Additional context used
πŸ““ Path-based instructions (3)
**/*.{ts,tsx}

πŸ“„ CodeRabbit inference engine (CLAUDE.md)

**/*.{ts,tsx}: Use tree-shakeable adapter architecture for provider implementations - export specialized adapters (text, embedding, summarize, image) as separate imports from /adapters subpath rather than monolithic adapters
Use Zod for runtime schema validation and type inference, particularly for tool input/output definitions with toolDefinition() and Zod schema inference
Implement isomorphic tool system using toolDefinition() with .server() and .client() implementations for dual-environment execution
Use type-safe per-model configuration with provider options typed based on selected model to ensure compile-time safety
Implement stream processing with StreamProcessor for handling chunked responses and support partial JSON parsing for streaming AI responses

Files:

  • packages/typescript/ai-openai/src/model-meta.ts
**/*.{ts,tsx,js,jsx}

πŸ“„ CodeRabbit inference engine (CLAUDE.md)

Use camelCase for function and variable names throughout the codebase

Files:

  • packages/typescript/ai-openai/src/model-meta.ts
packages/typescript/*/src/model-meta.ts

πŸ“„ CodeRabbit inference engine (CLAUDE.md)

Maintain model metadata files that define provider options and capabilities per model for per-model type safety

Files:

  • packages/typescript/ai-openai/src/model-meta.ts
🧠 Learnings (2)
πŸ“š Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/model-meta.ts : Maintain model metadata files that define provider options and capabilities per model for per-model type safety

Applied to files:

  • packages/typescript/ai-openai/src/model-meta.ts
πŸ“š Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to **/*.{ts,tsx} : Use type-safe per-model configuration with provider options typed based on selected model to ensure compile-time safety

Applied to files:

  • packages/typescript/ai-openai/src/model-meta.ts
🧬 Code graph analysis (1)
packages/typescript/ai-openai/src/model-meta.ts (1)
packages/typescript/ai-openai/src/text/text-provider-options.ts (6)
  • OpenAIBaseOptions (17-125)
  • OpenAIReasoningOptions (136-156)
  • OpenAIStructuredOutputOptions (184-190)
  • OpenAIToolsOptions (192-213)
  • OpenAIStreamingOptions (215-225)
  • OpenAIMetadataOptions (227-235)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Preview
  • GitHub Check: Test
πŸ”‡ Additional comments (5)
packages/typescript/ai-openai/src/model-meta.ts (5)

107-133: LGTM!

The PRO variant follows established patterns: higher pricing (~12x), no cached pricing tier, and type assertion correctly excludes OpenAIStructuredOutputOptions since structured_outputs is not in features.


1650-1652: LGTM!

The new models are correctly placed at the top of the frontier models section, maintaining the pattern of listing newer models first.


1763-1778: Type mappings are consistent with model definitions.

The type entries correctly mirror each model's type assertion. Note: If OpenAIToolsOptions is added to GPT5_2_CHAT's model definition as suggested above, this entry (lines 1769-1773) should also be updated accordingly.


1948-1950: LGTM!

The input modality mappings correctly use typeof to reference each model's supports.input, ensuring type-safe consistency with the model definitions.


67-106: Verify PRO model pricing and API compatibility.

GPT-5.2 is priced at $1.75/1M input tokens and $14/1M output tokens, which differs from the pricing comment's suggestion of a 12x premium. More critically, GPT-5.2 pro is available in the Responses API only, not Chat Completions API. If GPT5_2_PRO is listed in OPENAI_CHAT_MODELS, remove it or ensure it's correctly restricted to Responses API use only. Confirm the PRO variant pricing in the code matches the official API rates.

β›” Skipped due to learnings
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/model-meta.ts : Maintain model metadata files that define provider options and capabilities per model for per-model type safety

@AlemTuzlak AlemTuzlak merged commit 2615fd7 into main Dec 20, 2025
6 checks passed
@AlemTuzlak AlemTuzlak deleted the feat/gpt5-2 branch December 20, 2025 17:33
@github-actions github-actions bot mentioned this pull request Dec 20, 2025
LuggaPugga pushed a commit to LuggaPugga/ai that referenced this pull request Jan 5, 2026
* add gpt 5.2 models to model meta

* ci: apply automated fixes

* changeeset

* update cutoff dates

* ci: apply automated fixes

* update cutoff dates

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
