
[Elastic Search] Fix sanitization for bulk queries #1990

Closed · wants to merge 12 commits
CHANGELOG.md: 2 additions & 0 deletions
@@ -15,6 +15,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
([#2132](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2132))
- `opentelemetry-resource-detector-azure` Changed timeout to 4 seconds due to [timeout bug](https://github.com/open-telemetry/opentelemetry-python/issues/3644)
([#2136](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2136))
- Fix elastic-search instrumentation sanitization to support bulk queries
([#1990](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/1990))

## Version 1.22.0/0.43b0 (2023-12-14)

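For context, an Elasticsearch bulk request carries several documents in a single call, so the body reaching the instrumentation can be a list of serialized (and possibly byte-encoded) JSON documents rather than a single JSON string. A minimal sketch of such a payload, with purely illustrative index and field names:

import json

# Hypothetical bulk payload: each element is one serialized action or document,
# byte-encoded the way a client may hand it to the transport layer.
bulk_body = [
    json.dumps({"index": {"_index": "users"}}).encode("utf-8"),
    json.dumps({"name": "alice", "password": "s3cret"}).encode("utf-8"),
]

The change below teaches sanitize_body to handle byte-encoded bodies and lists of documents such as this one.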
@@ -52,9 +52,15 @@ def _unflatten_dict(d):


def sanitize_body(body) -> str:
    # Bulk payloads may arrive pre-serialized as raw bytes.
    if isinstance(body, bytes):
        body = body.decode("utf8")

    if isinstance(body, str):
        body = json.loads(body)

    # Bulk queries are a list of documents: sanitize each element recursively.
    if isinstance(body, list):
        return str([sanitize_body(elem) for elem in body])

    flatten_body = _flatten_dict(body)

    for key in flatten_body:
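With this change, sanitize_body accepts a dict, a JSON string, UTF-8 bytes, or a list of any of those, and returns a single string that the instrumentation can record. A minimal usage sketch; the import path is assumed to be the instrumentation's utils module, and the query content is illustrative:

import json

# Assumed import location for sanitize_body; adjust to the actual module path.
from opentelemetry.instrumentation.elasticsearch.utils import sanitize_body

# Single query passed as a JSON string.
single = sanitize_body(json.dumps({"query": {"match": {"user": "alice"}}}))

# Bulk query passed as a list of byte-encoded documents; each element is
# sanitized recursively and the results are stringified via str([...]).
bulk = sanitize_body(
    [
        json.dumps({"index": {"_index": "users"}}).encode("utf-8"),
        json.dumps({"name": "alice"}).encode("utf-8"),
    ]
)

Sensitive values are masked by the existing key-based sanitization in the remainder of the function, which is not shown in this hunk.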
@@ -486,3 +486,19 @@ def test_body_sanitization(self, _):
            sanitize_body(json.dumps(sanitization_queries.interval_query)),
            str(sanitization_queries.interval_query_sanitized),
        )
        self.assertEqual(
            sanitize_body(
                [
                    json.dumps(sanitization_queries.filter_query).encode("utf-8"),
                    json.dumps(sanitization_queries.match_query).encode("utf-8"),
                    json.dumps(sanitization_queries.interval_query).encode("utf-8"),
                ]
            ),
            str(
                [
                    str(sanitization_queries.filter_query_sanitized),
                    str(sanitization_queries.match_query_sanitized),
                    str(sanitization_queries.interval_query_sanitized),
                ]
            ),
        )
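Note that the expected value mirrors the new list branch in sanitize_body: each bulk element is sanitized independently, and the overall result is str() applied to the list of per-element sanitized strings, which is exactly what the assertion above builds by hand.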