Commit f8d4c39

Replace `pydantic_plugin` in `logfire.configure()` with `logfire.instrument_pydantic()` (#453)

alexmojaki authored Sep 27, 2024
1 parent a996819 commit f8d4c39
Showing 18 changed files with 215 additions and 97 deletions.
28 changes: 14 additions & 14 deletions docs/integrations/pydantic.md
@@ -1,6 +1,6 @@
# Pydantic

Logfire has a [Pydantic plugin][pydantic-plugin] to instrument [Pydantic][pydantic] models.
Logfire has a Pydantic plugin to instrument [Pydantic][pydantic] models.
The plugin provides logs and metrics about model validation.

To enable the plugin, do one of the following:
@@ -13,35 +13,36 @@ To enable the plugin, do one of the following:
pydantic_plugin_record = "all"
```

- Use the [`pydantic_plugin`][logfire.configure(pydantic_plugin)] parameter in `logfire.configure`, e.g.:
- Call [`logfire.instrument_pydantic`][logfire.Logfire.instrument_pydantic] with the desired configuration, e.g.:

```py
import logfire

logfire.configure(pydantic_plugin=logfire.PydanticPlugin(record='all'))
logfire.instrument_pydantic() # Defaults to record='all'
```

Note that if you only use the last option then only models defined and imported *after* calling `logfire.configure`
Note that if you only use the last option then only model classes defined and imported *after* calling `logfire.instrument_pydantic`
will be instrumented.

!!! note
Remember to call [`logfire.configure()`][logfire.configure] at some point, whether before or after
calling `logfire.instrument_pydantic` and defining model classes.
Model validations will only start being logged after calling `logfire.configure()`.

## Third party modules

By default, third party modules are not instrumented by the plugin to avoid noise. You can enable instrumentation for those
using the [`include`][logfire.PydanticPlugin.include] configuration.

```py
import logfire

logfire.configure(pydantic_plugin=logfire.PydanticPlugin(record='all', include={'openai'}))
logfire.instrument_pydantic(include={'openai'})
```

You can also disable instrumentation for your own modules using the
[`exclude`][logfire.PydanticPlugin.exclude] configuration.

```py
import logfire

logfire.configure(pydantic_plugin=logfire.PydanticPlugin(record='all', exclude={'app.api.v1'}))
logfire.instrument_pydantic(exclude={'app.api.v1'})
```

## Model configuration
@@ -60,13 +61,13 @@ class Foo(BaseModel, plugin_settings=PluginSettings(logfire={'record': 'failure'

### Record

The [`record`][logfire.integrations.pydantic.LogfireSettings.record] is used to configure what to record.
The [`record`][logfire.integrations.pydantic.LogfireSettings.record] argument is used to configure what to record.
It can be one of the following values:

* `off`: Disable instrumentation. This is the default value.
* `all`: Send traces and metrics for all events.
* `all`: Send traces and metrics for all events. This is the default value for `logfire.instrument_pydantic`.
* `failure`: Send metrics for all validations and traces only for validation failures.
* `metrics`: Send only metrics.
* `off`: Disable instrumentation.

<!--
[Sampling](../usage/sampling.md) can be configured by `trace_sample_rate` key in
@@ -98,4 +99,3 @@ class Foo(
```

[pydantic]: https://docs.pydantic.dev/latest/
[pydantic-plugin]: https://docs.pydantic.dev/latest/concepts/plugins/
5 changes: 3 additions & 2 deletions docs/integrations/third-party/mirascope.md
@@ -32,7 +32,7 @@ This will give you:
<figcaption>Mirascope Anthropic call span and Anthropic span and conversation</figcaption>
</figure>

Since Mirascope is built on top of [Pydantic][pydantic], you can use the [Pydantic plugin][pydantic-plugin] to track additional logs and metrics about model validation, which you can enable using the [`pydantic_plugin`][logfire.configure(pydantic_plugin)] configuration.
Since Mirascope is built on top of [Pydantic][pydantic], you can use the [Pydantic plugin](../pydantic.md) to track additional logs and metrics about model validation.

This can be particularly useful when [extracting structured information][mirascope-extracting-structured-information] using LLMs:

@@ -44,7 +44,8 @@ from mirascope.core import openai, prompt_template
from mirascope.integrations.logfire import with_logfire
from pydantic import BaseModel

logfire.configure(pydantic_plugin=logfire.PydanticPlugin(record="all"))
logfire.configure()
logfire.instrument_pydantic()


class TaskDetails(BaseModel):
5 changes: 3 additions & 2 deletions docs/why-logfire/pydantic.md
@@ -31,7 +31,8 @@ from datetime import date
import logfire
from pydantic import BaseModel

logfire.configure(pydantic_plugin=logfire.PydanticPlugin(record='all')) # (1)!
logfire.configure()
logfire.instrument_pydantic() # (1)!

class User(BaseModel):
name: str
@@ -43,7 +44,7 @@ User(name='Ben', country_code='USA', dob='2000-02-02')
User(name='Charlie', country_code='GBR', dob='1990-03-03')
```

1. This configuration means details about all Pydantic model validations will be recorded. You can also record details about validation failures only, or just metrics; see the [pydantic plugin docs][logfire.PydanticPlugin].
1. This configuration means details about all Pydantic model validations will be recorded. You can also record details about validation failures only, or just metrics; see the [pydantic plugin docs](../integrations/pydantic.md).
2. Since we've enabled the Pydantic Plugin, all Pydantic validations will be recorded in Logfire.

Learn more about the [Pydantic Plugin here](../integrations/pydantic.md).
3 changes: 3 additions & 0 deletions logfire-api/logfire_api/__init__.py
@@ -95,6 +95,8 @@ def decorator(func):
def instrument_fastapi(self, *args, **kwargs) -> ContextManager[None]:
return nullcontext()

def instrument_pydantic(self, *args, **kwargs) -> None: ...

def instrument_pymongo(self, *args, **kwargs) -> None: ...

def instrument_sqlalchemy(self, *args, **kwargs) -> None: ...
@@ -144,6 +146,7 @@ def shutdown(self, *args, **kwargs) -> None: ...
log_slow_async_callbacks = DEFAULT_LOGFIRE_INSTANCE.log_slow_async_callbacks
install_auto_tracing = DEFAULT_LOGFIRE_INSTANCE.install_auto_tracing
instrument = DEFAULT_LOGFIRE_INSTANCE.instrument
instrument_pydantic = DEFAULT_LOGFIRE_INSTANCE.instrument_pydantic
instrument_fastapi = DEFAULT_LOGFIRE_INSTANCE.instrument_fastapi
instrument_openai = DEFAULT_LOGFIRE_INSTANCE.instrument_openai
instrument_anthropic = DEFAULT_LOGFIRE_INSTANCE.instrument_anthropic
3 changes: 2 additions & 1 deletion logfire-api/logfire_api/__init__.pyi
@@ -11,14 +11,15 @@ from .integrations.structlog import LogfireProcessor as StructlogProcessor
from .version import VERSION as VERSION
from logfire.sampling import SamplingOptions as SamplingOptions

__all__ = ['Logfire', 'LogfireSpan', 'LevelName', 'AdvancedOptions', 'ConsoleOptions', 'PydanticPlugin', 'configure', 'span', 'instrument', 'log', 'trace', 'debug', 'notice', 'info', 'warn', 'error', 'exception', 'fatal', 'force_flush', 'log_slow_async_callbacks', 'install_auto_tracing', 'instrument_fastapi', 'instrument_openai', 'instrument_anthropic', 'instrument_asyncpg', 'instrument_httpx', 'instrument_celery', 'instrument_requests', 'instrument_psycopg', 'instrument_django', 'instrument_flask', 'instrument_starlette', 'instrument_aiohttp_client', 'instrument_sqlalchemy', 'instrument_redis', 'instrument_pymongo', 'instrument_mysql', 'instrument_system_metrics', 'AutoTraceModule', 'with_tags', 'with_settings', 'shutdown', 'load_spans_from_file', 'no_auto_trace', 'ScrubMatch', 'ScrubbingOptions', 'VERSION', 'suppress_instrumentation', 'StructlogProcessor', 'LogfireLoggingHandler', 'SamplingOptions', 'MetricsOptions']
__all__ = ['Logfire', 'LogfireSpan', 'LevelName', 'AdvancedOptions', 'ConsoleOptions', 'PydanticPlugin', 'configure', 'span', 'instrument', 'log', 'trace', 'debug', 'notice', 'info', 'warn', 'error', 'exception', 'fatal', 'force_flush', 'log_slow_async_callbacks', 'install_auto_tracing', 'instrument_pydantic', 'instrument_fastapi', 'instrument_openai', 'instrument_anthropic', 'instrument_asyncpg', 'instrument_httpx', 'instrument_celery', 'instrument_requests', 'instrument_psycopg', 'instrument_django', 'instrument_flask', 'instrument_starlette', 'instrument_aiohttp_client', 'instrument_sqlalchemy', 'instrument_redis', 'instrument_pymongo', 'instrument_mysql', 'instrument_system_metrics', 'AutoTraceModule', 'with_tags', 'with_settings', 'shutdown', 'load_spans_from_file', 'no_auto_trace', 'ScrubMatch', 'ScrubbingOptions', 'VERSION', 'suppress_instrumentation', 'StructlogProcessor', 'LogfireLoggingHandler', 'SamplingOptions', 'MetricsOptions']

DEFAULT_LOGFIRE_INSTANCE = Logfire()
span = DEFAULT_LOGFIRE_INSTANCE.span
instrument = DEFAULT_LOGFIRE_INSTANCE.instrument
force_flush = DEFAULT_LOGFIRE_INSTANCE.force_flush
log_slow_async_callbacks = DEFAULT_LOGFIRE_INSTANCE.log_slow_async_callbacks
install_auto_tracing = DEFAULT_LOGFIRE_INSTANCE.install_auto_tracing
instrument_pydantic = DEFAULT_LOGFIRE_INSTANCE.instrument_pydantic
instrument_fastapi = DEFAULT_LOGFIRE_INSTANCE.instrument_fastapi
instrument_openai = DEFAULT_LOGFIRE_INSTANCE.instrument_openai
instrument_anthropic = DEFAULT_LOGFIRE_INSTANCE.instrument_anthropic
16 changes: 8 additions & 8 deletions logfire-api/logfire_api/_internal/config.pyi
@@ -17,7 +17,7 @@ from .metrics import ProxyMeterProvider as ProxyMeterProvider
from .scrubbing import BaseScrubber as BaseScrubber, NOOP_SCRUBBER as NOOP_SCRUBBER, Scrubber as Scrubber, ScrubbingOptions as ScrubbingOptions
from .stack_info import warn_at_user_stacklevel as warn_at_user_stacklevel
from .tracer import PendingSpanProcessor as PendingSpanProcessor, ProxyTracerProvider as ProxyTracerProvider
from .utils import UnexpectedResponse as UnexpectedResponse, ensure_data_dir_exists as ensure_data_dir_exists, get_version as get_version, read_toml_file as read_toml_file, suppress_instrumentation as suppress_instrumentation
from .utils import SeededRandomIdGenerator as SeededRandomIdGenerator, UnexpectedResponse as UnexpectedResponse, ensure_data_dir_exists as ensure_data_dir_exists, read_toml_file as read_toml_file, suppress_instrumentation as suppress_instrumentation
from _typeshed import Incomplete
from dataclasses import dataclass
from functools import cached_property
@@ -59,7 +59,10 @@ class AdvancedOptions:

@dataclass
class PydanticPlugin:
"""Options for the Pydantic plugin."""
"""Options for the Pydantic plugin.
This class is deprecated for external use. Use `logfire.instrument_pydantic()` instead.
"""
record: PydanticPluginRecordValues = ...
include: set[str] = ...
exclude: set[str] = ...
@@ -74,7 +77,7 @@ class MetricsOptions:

class DeprecatedKwargs(TypedDict): ...

def configure(*, send_to_logfire: bool | Literal['if-token-present'] | None = None, token: str | None = None, service_name: str | None = None, service_version: str | None = None, console: ConsoleOptions | Literal[False] | None = None, config_dir: Path | str | None = None, data_dir: Path | str | None = None, additional_span_processors: Sequence[SpanProcessor] | None = None, metrics: MetricsOptions | Literal[False] | None = None, pydantic_plugin: PydanticPlugin | None = None, scrubbing: ScrubbingOptions | Literal[False] | None = None, inspect_arguments: bool | None = None, sampling: SamplingOptions | None = None, advanced: AdvancedOptions | None = None, **deprecated_kwargs: Unpack[DeprecatedKwargs]) -> None:
def configure(*, send_to_logfire: bool | Literal['if-token-present'] | None = None, token: str | None = None, service_name: str | None = None, service_version: str | None = None, console: ConsoleOptions | Literal[False] | None = None, config_dir: Path | str | None = None, data_dir: Path | str | None = None, additional_span_processors: Sequence[SpanProcessor] | None = None, metrics: MetricsOptions | Literal[False] | None = None, scrubbing: ScrubbingOptions | Literal[False] | None = None, inspect_arguments: bool | None = None, sampling: SamplingOptions | None = None, advanced: AdvancedOptions | None = None, **deprecated_kwargs: Unpack[DeprecatedKwargs]) -> None:
"""Configure the logfire SDK.
Args:
@@ -94,8 +97,6 @@ def configure(*, send_to_logfire: bool | Literal['if-token-present'] | None = No
additional_span_processors: Span processors to use in addition to the default processor which exports spans to Logfire's API.
metrics: Set to `False` to disable sending all metrics,
or provide a `MetricsOptions` object to configure metrics, e.g. additional metric readers.
pydantic_plugin: Configuration for the Pydantic plugin. If `None` uses the `LOGFIRE_PYDANTIC_PLUGIN_*` environment
variables, otherwise defaults to `PydanticPlugin(record='off')`.
scrubbing: Options for scrubbing sensitive data. Set to `False` to disable.
inspect_arguments: Whether to enable
[f-string magic](https://logfire.pydantic.dev/docs/guides/onboarding-checklist/add-manual-tracing/#f-strings).
@@ -123,21 +124,20 @@ class _LogfireConfigData:
console: ConsoleOptions | Literal[False] | None
data_dir: Path
additional_span_processors: Sequence[SpanProcessor] | None
pydantic_plugin: PydanticPlugin
scrubbing: ScrubbingOptions | Literal[False]
inspect_arguments: bool
sampling: SamplingOptions
advanced: AdvancedOptions

class LogfireConfig(_LogfireConfigData):
def __init__(self, send_to_logfire: bool | None = None, token: str | None = None, service_name: str | None = None, service_version: str | None = None, console: ConsoleOptions | Literal[False] | None = None, config_dir: Path | None = None, data_dir: Path | None = None, additional_span_processors: Sequence[SpanProcessor] | None = None, metrics: MetricsOptions | Literal[False] | None = None, pydantic_plugin: PydanticPlugin | None = None, scrubbing: ScrubbingOptions | Literal[False] | None = None, inspect_arguments: bool | None = None, sampling: SamplingOptions | None = None, advanced: AdvancedOptions | None = None) -> None:
def __init__(self, send_to_logfire: bool | None = None, token: str | None = None, service_name: str | None = None, service_version: str | None = None, console: ConsoleOptions | Literal[False] | None = None, config_dir: Path | None = None, data_dir: Path | None = None, additional_span_processors: Sequence[SpanProcessor] | None = None, metrics: MetricsOptions | Literal[False] | None = None, scrubbing: ScrubbingOptions | Literal[False] | None = None, inspect_arguments: bool | None = None, sampling: SamplingOptions | None = None, advanced: AdvancedOptions | None = None) -> None:
"""Create a new LogfireConfig.
Users should never need to call this directly, instead use `logfire.configure`.
See `_LogfireConfigData` for parameter documentation.
"""
def configure(self, send_to_logfire: bool | Literal['if-token-present'] | None, token: str | None, service_name: str | None, service_version: str | None, console: ConsoleOptions | Literal[False] | None, config_dir: Path | None, data_dir: Path | None, additional_span_processors: Sequence[SpanProcessor] | None, metrics: MetricsOptions | Literal[False] | None, pydantic_plugin: PydanticPlugin | None, scrubbing: ScrubbingOptions | Literal[False] | None, inspect_arguments: bool | None, sampling: SamplingOptions | None, advanced: AdvancedOptions | None) -> None: ...
def configure(self, send_to_logfire: bool | Literal['if-token-present'] | None, token: str | None, service_name: str | None, service_version: str | None, console: ConsoleOptions | Literal[False] | None, config_dir: Path | None, data_dir: Path | None, additional_span_processors: Sequence[SpanProcessor] | None, metrics: MetricsOptions | Literal[False] | None, scrubbing: ScrubbingOptions | Literal[False] | None, inspect_arguments: bool | None, sampling: SamplingOptions | None, advanced: AdvancedOptions | None) -> None: ...
def initialize(self) -> ProxyTracerProvider:
"""Configure internals to start exporting traces and metrics."""
def force_flush(self, timeout_millis: int = 30000) -> bool:
2 changes: 0 additions & 2 deletions logfire-api/logfire_api/_internal/config_params.pyi
@@ -75,5 +75,3 @@ class ParamManager:
"""
@cached_property
def pydantic_plugin(self): ...

def default_param_manager(): ...
21 changes: 20 additions & 1 deletion logfire-api/logfire_api/_internal/main.pyi
@@ -5,6 +5,7 @@ from . import async_ as async_
from ..version import VERSION as VERSION
from .auto_trace import AutoTraceModule as AutoTraceModule, install_auto_tracing as install_auto_tracing
from .config import GLOBAL_CONFIG as GLOBAL_CONFIG, LogfireConfig as LogfireConfig, OPEN_SPANS as OPEN_SPANS
from .config_params import PydanticPluginRecordValues as PydanticPluginRecordValues
from .constants import ATTRIBUTES_JSON_SCHEMA_KEY as ATTRIBUTES_JSON_SCHEMA_KEY, ATTRIBUTES_LOG_LEVEL_NUM_KEY as ATTRIBUTES_LOG_LEVEL_NUM_KEY, ATTRIBUTES_MESSAGE_KEY as ATTRIBUTES_MESSAGE_KEY, ATTRIBUTES_MESSAGE_TEMPLATE_KEY as ATTRIBUTES_MESSAGE_TEMPLATE_KEY, ATTRIBUTES_SAMPLE_RATE_KEY as ATTRIBUTES_SAMPLE_RATE_KEY, ATTRIBUTES_SPAN_TYPE_KEY as ATTRIBUTES_SPAN_TYPE_KEY, ATTRIBUTES_TAGS_KEY as ATTRIBUTES_TAGS_KEY, ATTRIBUTES_VALIDATION_ERROR_KEY as ATTRIBUTES_VALIDATION_ERROR_KEY, DISABLE_CONSOLE_KEY as DISABLE_CONSOLE_KEY, LEVEL_NUMBERS as LEVEL_NUMBERS, LevelName as LevelName, NULL_ARGS_KEY as NULL_ARGS_KEY, OTLP_MAX_INT_SIZE as OTLP_MAX_INT_SIZE, log_level_attributes as log_level_attributes
from .formatter import logfire_format as logfire_format, logfire_format_with_magic as logfire_format_with_magic
from .instrument import LogfireArgs as LogfireArgs, instrument as instrument
@@ -24,7 +25,7 @@ from .json_schema import JsonSchemaProperties as JsonSchemaProperties, attribute
from .metrics import ProxyMeterProvider as ProxyMeterProvider
from .stack_info import get_user_stack_info as get_user_stack_info
from .tracer import ProxyTracerProvider as ProxyTracerProvider
from .utils import SysExcInfo as SysExcInfo, handle_internal_errors as handle_internal_errors, log_internal_error as log_internal_error, uniquify_sequence as uniquify_sequence
from .utils import SysExcInfo as SysExcInfo, get_version as get_version, handle_internal_errors as handle_internal_errors, log_internal_error as log_internal_error, uniquify_sequence as uniquify_sequence
from django.http import HttpRequest as HttpRequest, HttpResponse as HttpResponse
from fastapi import FastAPI
from flask.app import Flask
@@ -367,6 +368,24 @@ class Logfire:
modules in `sys.modules` (i.e. modules that have already been imported) match the modules to trace.
Set to `'warn'` to issue a warning instead, or `'ignore'` to skip the check.
"""
def instrument_pydantic(self, record: PydanticPluginRecordValues = 'all', include: Iterable[str] = (), exclude: Iterable[str] = ()):
"""Instrument Pydantic model validations.
This must be called before defining and importing the model classes you want to instrument.
See the [Pydantic integration guide](https://logfire.pydantic.dev/docs/integrations/pydantic/) for more info.
Args:
record: The record mode for the Pydantic plugin. It can be one of the following values:
- `all`: Send traces and metrics for all events. This is the default value.
- `failure`: Send metrics for all validations and traces only for validation failures.
- `metrics`: Send only metrics.
- `off`: Disable instrumentation.
include:
By default, third party modules are not instrumented. This option allows you to include specific modules.
exclude:
Exclude specific modules from instrumentation.
"""
def instrument_fastapi(self, app: FastAPI, *, capture_headers: bool = False, request_attributes_mapper: Callable[[Request | WebSocket, dict[str, Any]], dict[str, Any] | None] | None = None, use_opentelemetry_instrumentation: bool = True, excluded_urls: str | Iterable[str] | None = None, record_send_receive: bool = False, **opentelemetry_kwargs: Any) -> ContextManager[None]:
"""Instrument a FastAPI app so that spans and logs are automatically created for each request.