Commit 128493d

Cleanup and addressing comments

1 parent 0da350b commit 128493d

File tree: 5 files changed, +27 −51 lines changed

docs/builtin-tools.md — 17 additions & 24 deletions

````diff
@@ -425,9 +425,8 @@ _(This example is complete, it can be run "as is")_
 The [`MCPServerTool`][pydantic_ai.builtin_tools.MCPServerTool] allows your agent to pass MCP configurations in context,
 so that the agent can offload MCP calls and parsing to the provider.
 
-This tool is useful for models that support passing MCP servers as tools in parameters, so the model handles calls to remote servers by itself.
-
-However, a vast majority of models do not support this feature, in which case can use Pydantic AI's agent-side [MCP support](mcp/client.md).
+This requires the MCP server to live at a public URL the provider can reach and does not support many of the advanced features of Pydantic AI's agent-side [MCP support](mcp/client.md),
+but can result in optimized context use and caching, and faster performance due to the lack of a round-trip back to Pydantic AI.
 
 ### Provider Support
 
@@ -456,8 +455,6 @@ agent = Agent(
         MCPServerTool(
             id='your-mcp-server',
             url='https://api.githubcopilot.com/mcp/',
-            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),
-            allowed_tools=['search_repositories', 'list_commits'],
         )
     ]
 )
@@ -482,8 +479,6 @@ agent = Agent(
         MCPServerTool(
             id='your-mcp-server',
             url='https://api.githubcopilot.com/mcp/',
-            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),
-            allowed_tools=['search_repositories', 'list_commits'],
         )
     ]
 )
@@ -509,11 +504,7 @@ agent = Agent(
     builtin_tools=[
         MCPServerTool(
             id='your-mcp-server',  # required field
-            url='https://api.githubcopilot.com/mcp/',  # optional field, use `url` or `provider_metadata`
-            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),  # optional field
-            allowed_tools=['search_repositories', 'list_commits'],  # optional field
-            description='Your MCP Server',  # optional field
-            headers={'X-CUSTOM-HEADER': 'custom-value'},  # optional field
+            url='https://api.githubcopilot.com/mcp/',  # required field
         )
     ]
 )
@@ -523,7 +514,7 @@ print(result.output)
 #> Here are some examples of my data: Pen, Paper, Pencil.
 ```
 
-For OpenAI Responses, you can use connector ID instead of URL:
+For OpenAI Responses, you can use a connector by specifying a special `x-openai-connector:` URL:
 
 _(This example is complete, it can be run "as is")_
 
@@ -536,12 +527,12 @@ agent = Agent(
     'openai-responses:gpt-4o',
     builtin_tools=[
         MCPServerTool(
-            id='your-mcp-server',  # required field
-            url='x-openai-connector:connector_googlecalendar',  # required field
-            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),  # optional field
-            allowed_tools=['search_repositories', 'list_commits'],  # optional field
-            description='Your MCP Server',  # optional field
-            headers={'X-CUSTOM-HEADER': 'custom-value'},  # optional field
+            id='your-mcp-server',
+            url='x-openai-connector:connector_googlecalendar',
+            authorization_token=os.getenv('GITHUB_ACCESS_TOKEN', 'mock-access-token'),
+            allowed_tools=['search_repositories', 'list_commits'],
+            description='Your MCP Server',
+            headers={'X-CUSTOM-HEADER': 'custom-value'},
         )
     ]
 )
@@ -555,11 +546,13 @@ _(This example is complete, it can be run "as is")_
 
 #### Provider Support
 
-| Parameter           | OpenAI | Anthropic | Notes |
-|---------------------|--------|-----------|---------------------------------------------------------------------------------------------------------------------|
-| `url`               | ✅     | ✅        | For OpenAI Responses, it is possible to use `connector_id` by providing it as `x-openai-connector:<connector_id>` |
-| `allowed_tools`     | ✅     | ✅        | ----------- |
-| `headers`           | ✅     | ❌        | ----------- |
+| Parameter             | OpenAI | Anthropic |
+|-----------------------|--------|-----------|
+| `url`                 | ✅     | ✅        |
+| `allowed_tools`       | ✅     | ✅        |
+| `authorization_token` | ✅     | ✅        |
+| `description`         | ✅     | ❌        |
+| `headers`             | ✅     | ❌        |
 
 ## API Reference
 
````
docs/mcp/overview.md — 2 additions & 1 deletion

```diff
@@ -1,9 +1,10 @@
 # Model Context Protocol (MCP)
 
-Pydantic AI supports [Model Context Protocol (MCP)](https://modelcontextprotocol.io) in two ways:
+Pydantic AI supports [Model Context Protocol (MCP)](https://modelcontextprotocol.io) in three ways:
 
 1. [Agents](../agents.md) act as an MCP Client, connecting to MCP servers to use their tools, [learn more …](client.md)
 2. Agents can be used within MCP servers, [learn more …](server.md)
+3. Agents can pass MCP servers as built-in tools to models, [learn more …](../builtin-tools.md)
 
 ## What is MCP?
 
```
pydantic_ai_slim/pydantic_ai/builtin_tools.py — 4 additions & 12 deletions

```diff
@@ -2,7 +2,7 @@
 
 from abc import ABC
 from dataclasses import dataclass
-from typing import TYPE_CHECKING, Any, Literal
+from typing import TYPE_CHECKING, Literal
 
 from typing_extensions import TypedDict
 
@@ -253,15 +253,15 @@ class MCPServerTool(AbstractBuiltinTool):
     id: str
     """The id of the MCP server to use."""
 
-    authorization_token: str | None = None
-    """Authorization header to use when making requests to the MCP server."""
-
     url: str
     """The URL of the MCP server to use.
 
    For OpenAI Responses, it is possible to use `connector_id` by providing it as `x-openai-connector:<connector_id>`.
    """
 
+    authorization_token: str | None = None
+    """Authorization header to use when making requests to the MCP server."""
+
     description: str | None = None
     """A description of the MCP server."""
 
@@ -284,14 +284,6 @@ class MCPServerTool(AbstractBuiltinTool):
     * OpenAI Responses
     """
 
-    provider_metadata: dict[str, Any] | None = None
-    """Extra data to send to the model.
-
-    Supported by:
-
-    * OpenAI Responses
-    """
-
     kind: str = 'mcp_server'
 
 LIST_TOOLS_KIND: Literal['mcp_server:mcp_list_tools'] = 'mcp_server:mcp_list_tools'
```

pydantic_ai_slim/pydantic_ai/models/anthropic.py — 1 addition & 6 deletions

```diff
@@ -5,7 +5,7 @@
 
 from contextlib import asynccontextmanager
 from dataclasses import dataclass, field
 from datetime import datetime
-from typing import Any, Literal, TypeAlias, cast, overload
+from typing import Any, Literal, cast, overload
 
 from pydantic import TypeAdapter
 from typing_extensions import assert_never
@@ -890,11 +890,6 @@ def _map_mcp_server_use_block(item: BetaMCPToolUseBlock, provider_name: str) ->
     )
 
 
-BetaMCPToolResultBlockContent: TypeAlias = str | list[BetaTextBlock]
-
-mcp_server_result_content_ta: TypeAdapter[BetaMCPToolResultBlockContent] = TypeAdapter(BetaMCPToolResultBlockContent)
-
-
 def _map_mcp_server_result_block(item: BetaMCPToolResultBlock, provider_name: str) -> BuiltinToolReturnPart:
     return BuiltinToolReturnPart(
         provider_name=provider_name,
```

pydantic_ai_slim/pydantic_ai/models/openai.py — 3 additions & 8 deletions

```diff
@@ -110,7 +110,7 @@
 allows this model to be used more easily with other model types (ie, Ollama, Deepseek).
 """
 
-OpenAIResponsesMCPConnectorIdPrefix: Literal['x-openai-connector'] = 'x-openai-connector'
+MCP_SERVER_TOOL_CONNECTOR_URI_SCHEME: Literal['x-openai-connector'] = 'x-openai-connector'
 """
 Prefix for OpenAI connector IDs. OpenAI supports either a URL or a connector ID when passing MCP configuration to a model,
 by using that prefix like `x-openai-connector:<connector-id>` in a URL, you can pass a connector ID to a model.
@@ -1265,16 +1265,11 @@ def _get_builtin_tools(self, model_request_parameters: ModelRequestParameters) -
                 if tool.headers:  # pragma: no branch
                     mcp_tool['headers'] = tool.headers  # pragma: no cover
 
-                url, connector_id = None, None
-                if tool.url.startswith(OpenAIResponsesMCPConnectorIdPrefix):  # pragma: no cover
+                if tool.url.startswith(MCP_SERVER_TOOL_CONNECTOR_URI_SCHEME + ':'):  # pragma: no cover
                     _, connector_id = tool.url.split(':', maxsplit=1)
+                    mcp_tool['connector_id'] = connector_id  # pyright: ignore[reportGeneralTypeIssues]
                 else:
-                    url = tool.url
-
-                if url:
                     mcp_tool['server_url'] = tool.url
-                elif connector_id:  # pragma: no cover
-                    mcp_tool['connector_id'] = connector_id  # pyright: ignore[reportGeneralTypeIssues]
 
                 tools.append(mcp_tool)
             elif isinstance(tool, ImageGenerationTool):  # pragma: no branch
```
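The simplified dispatch can be sketched in isolation: a URL is treated as a connector ID only when it carries the `x-openai-connector:` scheme, otherwise it is sent through as `server_url`. A standalone sketch mirroring the names in the diff (the helper function is hypothetical):

```python
MCP_SERVER_TOOL_CONNECTOR_URI_SCHEME = 'x-openai-connector'


def map_mcp_url(url: str) -> dict[str, str]:
    """Return the request field implied by the MCP tool URL."""
    if url.startswith(MCP_SERVER_TOOL_CONNECTOR_URI_SCHEME + ':'):
        # Everything after the first ':' is the connector ID.
        _, connector_id = url.split(':', maxsplit=1)
        return {'connector_id': connector_id}
    return {'server_url': url}


print(map_mcp_url('x-openai-connector:connector_googlecalendar'))
#> {'connector_id': 'connector_googlecalendar'}
print(map_mcp_url('https://api.githubcopilot.com/mcp/'))
#> {'server_url': 'https://api.githubcopilot.com/mcp/'}
```

Checking for the trailing `':'` (rather than the bare prefix, as the old code did) avoids misclassifying a hypothetical URL that merely begins with the scheme name.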

0 commit comments