Move TestExporter to avoid requiring pytest #368

Merged: 5 commits, Aug 6, 2024
4 changes: 3 additions & 1 deletion CHANGELOG.md
@@ -1,6 +1,8 @@
# Release Notes

-## [v0.50.0] (2024-08-06)
+## [v0.50.1] (2024-08-06)
+
+(Previously released as `v0.50.0`, then yanked due to https://github.com/pydantic/logfire/issues/367)

* **BREAKING CHANGES:** Separate sending to Logfire from using standard OTEL environment variables by @alexmojaki in https://github.com/pydantic/logfire/pull/351. See https://docs.pydantic.dev/logfire/guides/advanced/alternative_backends/ for details. Highlights:
* `OTEL_EXPORTER_OTLP_ENDPOINT` is no longer just an alternative to `LOGFIRE_BASE_URL`. Setting `OTEL_EXPORTER_OTLP_ENDPOINT`, `OTEL_EXPORTER_OTLP_TRACES_ENDPOINT`, and/or `OTEL_EXPORTER_OTLP_METRICS_ENDPOINT` will set up appropriate exporters *in addition* to sending to Logfire, which must be turned off separately if desired. These are basic exporters relying on OTEL defaults. In particular they don't use our custom retrying logic.
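A rough sketch of the endpoint-selection behaviour this changelog entry describes (the helper name is hypothetical; the real logic lives inside `logfire.configure`, and these exporters are created *in addition* to sending to Logfire):

```python
def extra_exporter_endpoints(env: dict) -> dict:
    """Sketch: which extra OTLP exporters would be set up from env vars.

    Assumes the usual OTEL convention that signal-specific variables take
    precedence over the generic endpoint variable.
    """
    endpoints = {}
    generic = env.get("OTEL_EXPORTER_OTLP_ENDPOINT")
    traces = env.get("OTEL_EXPORTER_OTLP_TRACES_ENDPOINT") or generic
    metrics = env.get("OTEL_EXPORTER_OTLP_METRICS_ENDPOINT") or generic
    if traces:
        endpoints["traces"] = traces
    if metrics:
        endpoints["metrics"] = metrics
    return endpoints
```

Setting only the generic variable yields both a traces and a metrics exporter; disabling the Logfire exporter itself is a separate step.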
2 changes: 1 addition & 1 deletion logfire-api/logfire_api/_internal/config.pyi
@@ -1,6 +1,5 @@
import dataclasses
import requests
-from ..testing import TestExporter as TestExporter
from .auth import DEFAULT_FILE as DEFAULT_FILE, DefaultFile as DefaultFile, is_logged_in as is_logged_in
from .collect_system_info import collect_package_info as collect_package_info
from .config_params import ParamManager as ParamManager, PydanticPluginRecordValues as PydanticPluginRecordValues
@@ -13,6 +12,7 @@ from .exporters.processor_wrapper import MainSpanProcessorWrapper as MainSpanPro
from .exporters.quiet_metrics import QuietMetricExporter as QuietMetricExporter
from .exporters.remove_pending import RemovePendingSpansExporter as RemovePendingSpansExporter
from .exporters.tail_sampling import TailSamplingOptions as TailSamplingOptions, TailSamplingProcessor as TailSamplingProcessor
+from .exporters.test import TestExporter as TestExporter
from .integrations.executors import instrument_executors as instrument_executors
from .metrics import ProxyMeterProvider as ProxyMeterProvider, configure_metrics as configure_metrics
from .scrubbing import BaseScrubber as BaseScrubber, NOOP_SCRUBBER as NOOP_SCRUBBER, ScrubCallback as ScrubCallback, Scrubber as Scrubber, ScrubbingOptions as ScrubbingOptions
2 changes: 1 addition & 1 deletion logfire-api/pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "hatchling.build"

[project]
name = "logfire-api"
-version = "0.50.0"
+version = "0.50.1"
description = "Shim for the Logfire SDK which does nothing unless Logfire is installed"
authors = [
{ name = "Pydantic Team", email = "engineering@pydantic.dev" },
2 changes: 1 addition & 1 deletion logfire/_internal/config.py
@@ -52,7 +52,6 @@
from logfire.exceptions import LogfireConfigError
from logfire.version import VERSION

-from ..testing import TestExporter
from .auth import DEFAULT_FILE, DefaultFile, is_logged_in
from .collect_system_info import collect_package_info
from .config_params import ParamManager, PydanticPluginRecordValues
@@ -75,6 +74,7 @@
from .exporters.quiet_metrics import QuietMetricExporter
from .exporters.remove_pending import RemovePendingSpansExporter
from .exporters.tail_sampling import TailSamplingOptions, TailSamplingProcessor
+from .exporters.test import TestExporter
alexmojaki (Contributor, Author): @Kludex it's not critical, but I don't know why there isn't a stub for `exporters.test` when I run `rye run generate-stubs`. There are other stubs in the exporters folder.

Kludex (Member, Aug 6, 2024): I think it's because the testing module is not in the `__init__.py`.

alexmojaki (Contributor, Author): I'm talking about `logfire._internal.exporters.test`, not `logfire.testing`.

Kludex (Member): Ah, because we don't include private objects.

Kludex (Member): We'd need to add `--include-private` on the stubgen command for it to appear, I think...

alexmojaki (Contributor, Author): But all the other internal exporter files are there.

from .integrations.executors import instrument_executors
from .metrics import ProxyMeterProvider, configure_metrics
from .scrubbing import NOOP_SCRUBBER, BaseScrubber, Scrubber, ScrubbingOptions, ScrubCallback
151 changes: 151 additions & 0 deletions logfire/_internal/exporters/test.py
@@ -0,0 +1,151 @@
from __future__ import annotations

import os
import re
import sys
from collections.abc import Sequence
from pathlib import Path
from typing import Any, Mapping, cast

from opentelemetry import trace
from opentelemetry.sdk.trace import Event, ReadableSpan
from opentelemetry.sdk.trace.export import SpanExporter, SpanExportResult
from opentelemetry.semconv.resource import ResourceAttributes
from opentelemetry.semconv.trace import SpanAttributes

from ..constants import ATTRIBUTES_SPAN_TYPE_KEY, RESOURCE_ATTRIBUTES_PACKAGE_VERSIONS


class TestExporter(SpanExporter):
"""A SpanExporter that stores exported spans in a list for asserting in tests."""

# NOTE: Avoid test discovery by pytest.
__test__ = False

def __init__(self) -> None:
self.exported_spans: list[ReadableSpan] = []

def export(self, spans: Sequence[ReadableSpan]) -> SpanExportResult:
"""Exports a batch of telemetry data."""
self.exported_spans.extend(spans)
return SpanExportResult.SUCCESS

def clear(self) -> None:
"""Clears the collected spans."""
self.exported_spans = []

def exported_spans_as_dict(
self,
fixed_line_number: int | None = 123,
strip_filepaths: bool = True,
include_resources: bool = False,
include_package_versions: bool = False,
include_instrumentation_scope: bool = False,
_include_pending_spans: bool = False,
_strip_function_qualname: bool = True,
) -> list[dict[str, Any]]:
"""The exported spans as a list of dicts.

Args:
fixed_line_number: The line number to use for all spans.
strip_filepaths: Whether to strip the filepaths from the exported spans.
include_resources: Whether to include the resource attributes in the exported spans.
include_package_versions: Whether to include the package versions in the exported spans.
include_instrumentation_scope: Whether to include the instrumentation scope in the exported spans.

Returns:
A list of dicts representing the exported spans.
"""

def process_attribute(name: str, value: Any) -> Any:
if name == 'code.filepath' and strip_filepaths:
try:
return Path(value).name
except ValueError: # pragma: no cover
return value
if name == 'code.lineno' and fixed_line_number is not None:
return fixed_line_number
if name == 'code.function':
if sys.version_info >= (3, 11) and _strip_function_qualname:
return value.split('.')[-1]
if name == ResourceAttributes.PROCESS_PID:
assert value == os.getpid()
return 1234
if name == ResourceAttributes.SERVICE_INSTANCE_ID:
if re.match(r'^[0-9a-f]{32}$', value):
return '0' * 32
return value

def build_attributes(attributes: Mapping[str, Any] | None) -> dict[str, Any] | None:
if attributes is None: # pragma: no branch
return None # pragma: no cover
attributes = {
k: process_attribute(k, v)
for k, v in attributes.items()
if k != RESOURCE_ATTRIBUTES_PACKAGE_VERSIONS or include_package_versions
}
if 'telemetry.sdk.version' in attributes:
attributes['telemetry.sdk.version'] = '0.0.0'
return attributes

def build_event(event: Event) -> dict[str, Any]:
res: dict[str, Any] = {
'name': event.name,
'timestamp': event.timestamp,
}
if event.attributes: # pragma: no branch
res['attributes'] = attributes = dict(event.attributes)
if SpanAttributes.EXCEPTION_STACKTRACE in attributes:
last_line = next( # pragma: no branch
line.strip()
for line in reversed(
cast(str, event.attributes[SpanAttributes.EXCEPTION_STACKTRACE]).split('\n')
)
if line.strip()
)
attributes[SpanAttributes.EXCEPTION_STACKTRACE] = last_line
return res

def build_instrumentation_scope(span: ReadableSpan) -> dict[str, Any]:
if include_instrumentation_scope:
return {'instrumentation_scope': span.instrumentation_scope and span.instrumentation_scope.name}
else:
return {}

def build_span(span: ReadableSpan) -> dict[str, Any]:
context = span.context or trace.INVALID_SPAN_CONTEXT
res: dict[str, Any] = {
'name': span.name,
'context': {
'trace_id': context.trace_id,
'span_id': context.span_id,
'is_remote': context.is_remote,
},
'parent': {
'trace_id': span.parent.trace_id,
'span_id': span.parent.span_id,
'is_remote': span.parent.is_remote,
}
if span.parent
else None,
'start_time': span.start_time,
'end_time': span.end_time,
**build_instrumentation_scope(span),
'attributes': build_attributes(span.attributes),
}
if span.events:
res['events'] = [build_event(event) for event in span.events]
if include_resources:
resource_attributes = build_attributes(span.resource.attributes)
res['resource'] = {
'attributes': resource_attributes,
}
return res

spans = [build_span(span) for span in self.exported_spans]
return [
span
for span in spans
if _include_pending_spans is True
or (span.get('attributes', {}).get(ATTRIBUTES_SPAN_TYPE_KEY, 'span') != 'pending_span')
]
163 changes: 12 additions & 151 deletions logfire/testing.py
@@ -2,166 +2,27 @@

from __future__ import annotations

import os
import random
import re
import sys
from collections.abc import Sequence
from dataclasses import dataclass
from pathlib import Path
from typing import Any, Mapping, cast

import pytest
from opentelemetry import trace
from opentelemetry.sdk.metrics.export import InMemoryMetricReader
from opentelemetry.sdk.trace import Event, ReadableSpan
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, SpanExporter, SpanExportResult
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.sdk.trace.id_generator import IdGenerator
from opentelemetry.semconv.resource import ResourceAttributes
from opentelemetry.semconv.trace import SpanAttributes

import logfire

from ._internal.constants import (
ATTRIBUTES_SPAN_TYPE_KEY,
ONE_SECOND_IN_NANOSECONDS,
RESOURCE_ATTRIBUTES_PACKAGE_VERSIONS,
)


class TestExporter(SpanExporter):
"""A SpanExporter that stores exported spans in a list for asserting in tests."""

# NOTE: Avoid test discovery by pytest.
__test__ = False

def __init__(self) -> None:
self.exported_spans: list[ReadableSpan] = []

def export(self, spans: Sequence[ReadableSpan]) -> SpanExportResult:
"""Exports a batch of telemetry data."""
self.exported_spans.extend(spans)
return SpanExportResult.SUCCESS

def clear(self) -> None:
"""Clears the collected spans."""
self.exported_spans = []

def exported_spans_as_dict(
self,
fixed_line_number: int | None = 123,
strip_filepaths: bool = True,
include_resources: bool = False,
include_package_versions: bool = False,
include_instrumentation_scope: bool = False,
_include_pending_spans: bool = False,
_strip_function_qualname: bool = True,
) -> list[dict[str, Any]]:
"""The exported spans as a list of dicts.

Args:
fixed_line_number: The line number to use for all spans.
strip_filepaths: Whether to strip the filepaths from the exported spans.
include_resources: Whether to include the resource attributes in the exported spans.
include_package_versions: Whether to include the package versions in the exported spans.
include_instrumentation_scope: Whether to include the instrumentation scope in the exported spans.

Returns:
A list of dicts representing the exported spans.
"""

def process_attribute(name: str, value: Any) -> Any:
if name == 'code.filepath' and strip_filepaths:
try:
return Path(value).name
except ValueError: # pragma: no cover
return value
if name == 'code.lineno' and fixed_line_number is not None:
return fixed_line_number
if name == 'code.function':
if sys.version_info >= (3, 11) and _strip_function_qualname:
return value.split('.')[-1]
if name == ResourceAttributes.PROCESS_PID:
assert value == os.getpid()
return 1234
if name == ResourceAttributes.SERVICE_INSTANCE_ID:
if re.match(r'^[0-9a-f]{32}$', value):
return '0' * 32
return value

def build_attributes(attributes: Mapping[str, Any] | None) -> dict[str, Any] | None:
if attributes is None: # pragma: no branch
return None # pragma: no cover
attributes = {
k: process_attribute(k, v)
for k, v in attributes.items()
if k != RESOURCE_ATTRIBUTES_PACKAGE_VERSIONS or include_package_versions
}
if 'telemetry.sdk.version' in attributes:
attributes['telemetry.sdk.version'] = '0.0.0'
return attributes

def build_event(event: Event) -> dict[str, Any]:
res: dict[str, Any] = {
'name': event.name,
'timestamp': event.timestamp,
}
if event.attributes: # pragma: no branch
res['attributes'] = attributes = dict(event.attributes)
if SpanAttributes.EXCEPTION_STACKTRACE in attributes:
last_line = next( # pragma: no branch
line.strip()
for line in reversed(
cast(str, event.attributes[SpanAttributes.EXCEPTION_STACKTRACE]).split('\n')
)
if line.strip()
)
attributes[SpanAttributes.EXCEPTION_STACKTRACE] = last_line
return res

def build_instrumentation_scope(span: ReadableSpan) -> dict[str, Any]:
if include_instrumentation_scope:
return {'instrumentation_scope': span.instrumentation_scope and span.instrumentation_scope.name}
else:
return {}

def build_span(span: ReadableSpan) -> dict[str, Any]:
context = span.context or trace.INVALID_SPAN_CONTEXT
res: dict[str, Any] = {
'name': span.name,
'context': {
'trace_id': context.trace_id,
'span_id': context.span_id,
'is_remote': context.is_remote,
},
'parent': {
'trace_id': span.parent.trace_id,
'span_id': span.parent.span_id,
'is_remote': span.parent.is_remote,
}
if span.parent
else None,
'start_time': span.start_time,
'end_time': span.end_time,
**build_instrumentation_scope(span),
'attributes': build_attributes(span.attributes),
}
if span.events:
res['events'] = [build_event(event) for event in span.events]
if include_resources:
resource_attributes = build_attributes(span.resource.attributes)
res['resource'] = {
'attributes': resource_attributes,
}
return res

spans = [build_span(span) for span in self.exported_spans]
return [
span
for span in spans
if _include_pending_spans is True
or (span.get('attributes', {}).get(ATTRIBUTES_SPAN_TYPE_KEY, 'span') != 'pending_span')
]
from ._internal.constants import ONE_SECOND_IN_NANOSECONDS
from ._internal.exporters.test import TestExporter

__all__ = [
'capfire',
'CaptureLogfire',
'IncrementalIdGenerator',
'SeededRandomIdGenerator',
'TimeGenerator',
'TestExporter',
]
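`logfire.testing` still re-exports `TestExporter`, whose `__test__ = False` attribute exists because pytest otherwise collects any class whose name starts with `Test`. A simplified sketch of that collection rule:

```python
class TestExporter:
    __test__ = False  # opt out of pytest collection, as the real class does

class TestSomething:  # a class pytest *would* collect by default
    pass

def pytest_would_collect(cls) -> bool:
    # Simplified version of pytest's default class-collection rule: the name
    # must match the Test* prefix and __test__ must not be set to a falsy value.
    return cls.__name__.startswith("Test") and getattr(cls, "__test__", True)
```

This is why moving the class out of `logfire.testing` (which imports pytest for the `capfire` fixture) lets `config.py` import it without requiring pytest, while the opt-out attribute travels with the class.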


@dataclass(repr=True)
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "hatchling.build"

[project]
name = "logfire"
-version = "0.50.0"
+version = "0.50.1"
description = "The best Python observability tool! 🪵🔥"
authors = [
{ name = "Pydantic Team", email = "engineering@pydantic.dev" },