Update Metrics to latest specification. #3518
Comments
Exponential histogram coming later, I suppose?
Yeah, this is not a complete list. As things get better specified, they'll be added. For exponential histograms, I'd highly recommend doing a parallel experiment right now. While I expect the Aggregator package to undergo some refactorings (and hopefully simplifications), you should be able to prototype an exponential histogram today. I opened a bug around that here: #3550
Related to #3724.
An additional problem I've noticed is that you can use the metric API / SDK to produce metric streams that violate their monotonic properties. For example, take the following async counter:
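A minimal sketch of such a callback, assuming the current stable Java metrics API; the meter name, instrument name, and the decreasing value source are illustrative, not from the original comment:

```java
import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.api.metrics.Meter;
import java.util.concurrent.atomic.AtomicLong;

class NonMonotonicAsyncCounter {
  static void register() {
    Meter meter = GlobalOpenTelemetry.getMeter("example");
    AtomicLong value = new AtomicLong(10);
    // Each collection observes a smaller value than the previous one
    // (10, 5, 0, -5, ...), breaking the counter's monotonic assumption.
    meter
        .counterBuilder("my.async.counter")
        .buildWithCallback(measurement -> measurement.record(value.getAndAdd(-5)));
  }
}
```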
For both cumulative and delta exporters, this will produce monotonic sums that violate their monotonicity: in the delta case the values will be negative, and in the cumulative case the values will decrease. The same thing is true of a sync counter:
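A sketch of the sync-counter variant under the same assumptions (instrument name illustrative):

```java
import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.api.metrics.LongCounter;
import io.opentelemetry.api.metrics.Meter;

class NonMonotonicSyncCounter {
  static void record() {
    Meter meter = GlobalOpenTelemetry.getMeter("example");
    LongCounter counter = meter.counterBuilder("my.sync.counter").build();
    // A negative increment: the resulting sum can decrease (cumulative)
    // or go negative (delta), violating the counter's monotonicity.
    counter.add(-5);
  }
}
```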
The question is what the SDK should do in these cases where the recorded value violates the assumptions of the instrument.
I'm not sure I understand the sync counter bit? The counter should be adding all those non-negative measurements together. For the first problem, we assume good faith from the callback functions, and we should carefully document expectations on them (similar to the existing requirements about good thread management, blocking calls, etc.). I'm not convinced the SDK needs to do anything in its production pathway, but I do think a testing SDK that highlights issues like this should raise exceptions.
Right right, something got crossed in my brain.
I'm not sure I am either, at least not without a convincing argument for what the SDK should do from an export perspective when these scenarios occur. It's worth noting that counters (
Just making sure it's non-negative? Yeah. I do think it's worth having a "test" or "verify" mode for the SDK where more of these assertions come into play for folks writing instrumentation, so it's not just a "log and ask end-users to fix" thing (although we also shouldn't punish end-users if library instrumentation is bad; we SHOULD log and ask-to-fix if they see it in production). For the purposes of tracking here, it should be added. Do you have edit access to update TODOs/link other issues? If so, please feel free to update any "stable-release" blocking things you think we need to do. If not, let me know what you want added here and I'll make sure this represents our "known TODOs prior to stable" and check with @anuraaga and @jkwatson for their list as well.
Thou shalt create an issue if you want it to be remembered. ;)
Moving remaining tracking items to: https://github.com/open-telemetry/opentelemetry-java/projects/3
Created a new issue from my comment here. |
Here's the task list for taking the prototyped ideas in #3333 and submitting them piecemeal into the main branch.
Cleanups
(Optional) Follow-ons