
feat(perf): Extract span.status_code tag for HTTP spans #3245

Merged 4 commits into master on Mar 12, 2024

Conversation

@gggritso (Member) commented Mar 8, 2024

We need this to do status code breakdowns for the HTTP module. Note: The HTTP module, as designed, doesn't need to store the full code, but rather just the first digit. We break down by 2xx, 4xx, and 5xx codes. That said, it seems simpler and more robust to just extract all the response codes, since in practice we don't expect more than half a dozen codes to actually come in per span.
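For illustration, the first-digit breakdown mentioned above could be sketched as follows (a hypothetical helper for exposition, not code from this PR or from Relay):

```rust
/// Hypothetical helper: map a full HTTP status code to its broad
/// class ("1xx" through "5xx"), as the module's breakdown would use.
/// Codes outside the valid range yield None.
fn status_class(code: u16) -> Option<String> {
    match code / 100 {
        1..=5 => Some(format!("{}xx", code / 100)),
        _ => None,
    }
}

fn main() {
    assert_eq!(status_class(200).as_deref(), Some("2xx"));
    assert_eq!(status_class(404).as_deref(), Some("4xx"));
    assert_eq!(status_class(503).as_deref(), Some("5xx"));
    assert_eq!(status_class(99), None);
}
```

Extracting the full code instead of only this class is what the PR opts for, since the set of codes actually seen per span is expected to stay small.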

I added a banal test, but this is my first PR in Relay, so please save me from myself 🙏🏻. Let me know what you need in terms of code changes and/or test coverage.

@gggritso gggritso marked this pull request as ready for review March 8, 2024 23:16
@gggritso gggritso requested a review from a team as a code owner March 8, 2024 23:16
@iker-barriocanal (Contributor) left a comment:

LGTM, thanks for contributing! Could we add a changelog entry?

@@ -260,6 +264,10 @@ fn span_metrics() -> impl IntoIterator<Item = MetricSpec> {
Tag::with_key("resource.render_blocking_status")
.from_field("span.sentry_tags.resource.render_blocking_status")
.when(is_resource.clone()),
// HTTP module:
Tag::with_key("span.status_code")
.from_field("span.sentry_tags.status_code")
Contributor:

Do we need the tag in d:spans/exclusive_time_light@millisecond too?

Member:

Good point.

Member Author:

Unsure what you mean; from what I can see, I added the tag to that metric. Did I do something wrong?

Contributor:

The tag is added to two metrics: d:spans/exclusive_time@millisecond and d:spans/exclusive_time_light@millisecond. Do we need the tag in both metrics, or is just one of the two good enough?

Member Author:

Oh, I see! I need it for both: the light metric is only used on the landing page, and the regular one is used on the domain landing pages, where we have to break down by transaction.

Contributor:

Sounds good! :shipit:

// HTTP module:
Tag::with_key("span.status_code")
.from_field("span.sentry_tags.status_code")
.always(),
Member:

Suggested change
.always(),
.when(is_http.clone()),

Member Author:

Updated, thanks!

@@ -260,6 +264,10 @@ fn span_metrics() -> impl IntoIterator<Item = MetricSpec> {
Tag::with_key("resource.render_blocking_status")
.from_field("span.sentry_tags.resource.render_blocking_status")
.when(is_resource.clone()),
// HTTP module:
Tag::with_key("span.status_code")
.from_field("span.sentry_tags.status_code")
Member:

Good point.

// HTTP module:
Tag::with_key("span.status_code")
.from_field("span.sentry_tags.status_code")
.always(),
Member:

Suggested change
.always(),
.when(is_http.clone()),

Member Author:

Done 👍🏻

@Dav1dde (Member) commented Mar 11, 2024

That said, it seems simpler and more robust to just extract all the response codes, since in practice we don't expect more than half a dozen codes to actually come in per span.

Note that this is, in the worst case, a 12x multiplier on the entire cardinality. That might seem low, but every additional tag multiplies the existing cardinality, so if there is no use case for the exact codes, I would consider just going with the broader categories of 1xx, 2xx, 3xx, 4xx, and 5xx.
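The multiplicative effect described here can be made concrete with a small sketch (illustrative only, not Relay code; the series counts are hypothetical):

```rust
// Each tag multiplies cardinality: the worst-case number of distinct
// time series is the existing count times the number of distinct
// values the new tag can take.
fn worst_case_series(existing: u64, distinct_tag_values: u64) -> u64 {
    existing * distinct_tag_values
}

fn main() {
    let existing = 1_000; // hypothetical number of existing time series

    // ~12 distinct status codes seen in practice vs. 5 broad classes:
    assert_eq!(worst_case_series(existing, 12), 12_000);
    assert_eq!(worst_case_series(existing, 5), 5_000);
}
```

This is why a tag that "only" adds a dozen values can still be expensive: the factor applies to every existing combination of the other tags, not to a fixed base.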

@gggritso gggritso changed the title feat(perf): Extract span.status_code tag for HTTP spans in exclusive time metric feat(perf): Extract span.status_code tag for HTTP spans Mar 11, 2024
@gggritso (Member Author):

@iker-barriocanal updated changelog 👍🏻
@Dav1dde fair point, thank you. I'm aware of the cardinality increase, but we're already getting feature requests to filter by the exact response code, so I think falling back to the broader categories would be temporary at best.

@Dav1dde (Member) commented Mar 11, 2024

Fair enough!

@gggritso gggritso requested a review from jjbayer March 11, 2024 14:36
@gggritso gggritso merged commit f56e54b into master Mar 12, 2024
20 checks passed
@gggritso gggritso deleted the feat/perf/turn-on-http-request-status-tag branch March 12, 2024 14:22
4 participants