feat(proto): add @opentelemetry/proto package #2691
Conversation
Codecov Report

@@           Coverage Diff           @@
##             main    #2691   +/-  ##
=======================================
  Coverage   93.27%   93.27%
=======================================
  Files         158      158
  Lines        5443     5443
  Branches     1141     1141
=======================================
  Hits         5077     5077
  Misses        366      366
@vmarchaud @legendecas The protoc generator doesn't work with Node 8, so I think we have a couple of possible options:
Does protobufjs work with Node.js 8? It could be an alternative.
I think we could use a Docker image; it's already used for the semantic conventions codegen: https://github.com/open-telemetry/opentelemetry-js/blob/main/scripts/semconv/generate.sh
Not sure, I'll give it a try.
I forgot we already had that dependency. I'm going to try protobufjs first, but if it doesn't work I'll do this.
+1 to protobufjs. It is web-compatible and it can pre-compile proto files to JSON (or even JS) to boost startup times. If I recall correctly, @grpc/proto-loader is already using protobufjs (https://www.npmjs.com/package/@grpc/proto-loader).
Updated to use protobufjs. Added tests.
Looking forward to this! @dyladan please let us know if we can help with anything.
@morigs you already did by reminding me to get off my lazy butt and work on it 😆 Seriously though, reviews would be helpful. I've already done a quick PoC to prove out the concept of using this package in the otlp exporters with good results. Open questions:
Based on the size of the Long package and how it's used here, I'd recommend not using it and instead extracting the couple of functions that are required. This is primarily because everything is namespaced under the "Long" class, which means (depending on your packager) references to it prevent the unused code from being tree-shaken. It's also not clear how the "Long" value would get serialized in the protobuf references. And isn't a JavaScript Number already a 64-bit value? What am I missing about the need for this wrapper class?

Answered my own question: Number is an ES6 object and therefore it's not supported on all browsers -- which comes back to the missing browser support matrix, and I guess versions of Node as well. I would still prefer not to include all of the Long package.
Unfortunately it isn't that simple. If you don't include the long package …

We could potentially try to look at how …

We could potentially export a function which takes the …
If …

Yes, but even …
OK, re-looking at this and assuming that the web packages are "unlikely" to include the protobuf implementations (in preference for http/json), then as long as this is limited to the proto implementations (as it appears to be here) it should be OK. I'm less concerned about the bundle size for Node apps, as these generally don't have page-load-time concerns; they will still have startup concerns (for things like Azure Function apps that start/process/stop on demand), but the overall package size is not quite as critical.
I think that isn't necessarily a correct assumption. This package does all the transformations required for the JSON representation as well, which wouldn't make sense to duplicate elsewhere.
export function toAnyValue(value: unknown): opentelemetry.proto.common.v1.AnyValue {
  return opentelemetry.proto.common.v1.AnyValue.fromObject({
What if value is an object? Shouldn't it be converted into kvlistValue?
Done
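For reference, a minimal sketch of what that object-to-kvlistValue mapping could look like, producing plain objects suitable for fromObject. The helper name and exact handling are illustrative assumptions, not the PR's code:

```ts
// Illustrative sketch only: map JS values onto the AnyValue oneof shape accepted
// by AnyValue.fromObject(), recursing into kvlistValue for plain objects.
function toAnyValueObject(value: unknown): Record<string, unknown> {
  if (typeof value === 'string') return { stringValue: value };
  if (typeof value === 'boolean') return { boolValue: value };
  if (typeof value === 'number') return { doubleValue: value };
  if (Array.isArray(value)) {
    return { arrayValue: { values: value.map(toAnyValueObject) } };
  }
  if (typeof value === 'object' && value !== null) {
    return {
      kvlistValue: {
        values: Object.entries(value).map(([key, v]) => ({
          key,
          value: toAnyValueObject(v),
        })),
      },
    };
  }
  return {}; // leave the oneof unset for undefined/unsupported values
}
```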
@dyladan As I understand it, Long.js requires Browserify to work on the frontend.
rpcImpl: RPCImpl,
startTime: Fixed64,
};
export class MetricsServiceClient {
Both @grpc/grpc-js and grpc need a special proto loader (https://github.com/grpc/grpc-node/tree/master/packages/proto-loader), and I don't find the service definitions generated by protobufjs to be compatible with the ones from @grpc/grpc-js (without cumbersome setups).

Ideally, I find the most compatible way is to convert the proto files to JSON, so that for the web we can populate the protobuf object classes with protobuf.Root.fromJSON, and for gRPC with fromJSON.
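A rough sketch of that approach, assuming a JSON descriptor pre-generated with pbjs; the descriptor file name is hypothetical:

```ts
// Sketch, not the PR's code: hydrate a pre-generated JSON descriptor at runtime
// so neither the browser nor Node has to parse .proto files.
// './otlp.descriptor.json' is a hypothetical output of `pbjs -t json`.
import * as protobuf from 'protobufjs';
import * as descriptor from './otlp.descriptor.json';

const root = protobuf.Root.fromJSON(descriptor as protobuf.INamespace);
const AnyValue = root.lookupType('opentelemetry.proto.common.v1.AnyValue');

const msg = AnyValue.fromObject({ stringValue: 'hello' });
const bytes = AnyValue.encode(msg).finish(); // Uint8Array, ready for an OTLP transport
```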
The proto loader is only needed if you are loading proto definitions from JSON or from .proto files. The generated static code is actually capable of serializing without it and sending with gRPC. Yes, with some cumbersome setup, but this prevents us from having a direct dependency on grpc-js in this package, which is not desirable for the http/json exporters.
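For illustration, a sketch of that "cumbersome setup": serializing with the generated static classes and sending through the generic @grpc/grpc-js client, with no @grpc/proto-loader involved. The '@opentelemetry/proto' entry point and the localhost endpoint are assumptions.

```ts
// Sketch under assumptions: only the generated static code plus the generic client.
import * as grpc from '@grpc/grpc-js';
import { opentelemetry } from '@opentelemetry/proto'; // hypothetical package entry point

const ExportRequest =
  opentelemetry.proto.collector.metrics.v1.ExportMetricsServiceRequest;
const ExportResponse =
  opentelemetry.proto.collector.metrics.v1.ExportMetricsServiceResponse;

const client = new grpc.Client('localhost:4317', grpc.credentials.createInsecure());

client.makeUnaryRequest(
  '/opentelemetry.proto.collector.metrics.v1.MetricsService/Export',
  (req) => Buffer.from(ExportRequest.encode(req).finish()), // serialize via static code
  (buf) => ExportResponse.decode(buf),                      // deserialize via static code
  ExportRequest.fromObject({ resourceMetrics: [] }),
  (err, res) => {
    if (err) console.error('export failed', err);
  }
);
```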
const resource = metricRecords[0].resource;

return opentelemetry.proto.collector.metrics.v1.ExportMetricsServiceRequest.fromObject({
IIUC, we don't need to convert the data to concrete protobuf objects if they are transferred as JSON. These conversion helpers only create data objects with valid shapes according to the proto definitions, like https://github.com/open-telemetry/opentelemetry-js/blob/main/packages/exporter-trace-otlp-http/src/transform.ts#L171. We can still use the types generated by pbts, such as opentelemetry.proto.collector.metrics.v1.IExportMetricsServiceRequest (by adding a leading I).
You're right, we don't, but it doesn't make sense to duplicate all the transformations. We can use the interfaces and generate valid JSON objects, then pass those to fromObject in order to get protobuf; or we can use fromObject and call toJSON at the end to get JSON back out. Either way is fine with me.
Actually, it looks like even the interfaces can't be used directly for the JSON representation of the protobuf. The generated interfaces use enums, which when serialized directly with JSON.stringify are converted to numbers and not the string representation required by the JSON mapping. So in order to map correctly, we would need to either manually fix that (and create our own types, I guess?) or use fromObject then toJSON anyway.
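A small illustration of the enum issue, assuming the pbts-generated Span types; the '@opentelemetry/proto' import path is hypothetical:

```ts
// SpanKind.SPAN_KIND_SERVER === 2 in the OTLP trace proto.
import { opentelemetry } from '@opentelemetry/proto'; // hypothetical package entry point

const Span = opentelemetry.proto.trace.v1.Span;

const plain: opentelemetry.proto.trace.v1.ISpan = {
  name: 'GET /users',
  kind: opentelemetry.proto.trace.v1.Span.SpanKind.SPAN_KIND_SERVER,
};

JSON.stringify(plain);
// → ... "kind":2 ...               (numeric enum value from the plain interface)

JSON.stringify(Span.fromObject(plain).toJSON());
// → ... "kind":"SPAN_KIND_SERVER"  (toJSON applies the string enum names)
```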
I'm looking into ways to reduce the payload size for the browser case. Right now it uses the generated protobuf code to create an in-memory object, then converts that to JSON. The statically generated code is huge though, and we shouldn't need it for the browser at all. I'm looking into a way to have the conversions live without the protobuf code.
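One possible shape of that, sketched under the assumption that the pbts interfaces are exported: a type-only import keeps the generated runtime code out of the browser bundle while the transforms emit plain JSON-shaped objects.

```ts
// Sketch only; '@opentelemetry/proto' is a hypothetical entry point. `import type`
// is erased at compile time, so no generated runtime code lands in the bundle.
import type { opentelemetry } from '@opentelemetry/proto';

type IExportMetricsServiceRequest =
  opentelemetry.proto.collector.metrics.v1.IExportMetricsServiceRequest;

// Transform helpers would build plain JSON-shaped objects instead of protobuf instances.
export function toExportRequest(): IExportMetricsServiceRequest {
  return { resourceMetrics: [] };
}
```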
This is a new package which may eventually be shared among the otlp protobuf/grpc exporters. It uses:
- protobufjs
- long, to support nanosecond timestamps with int precision (see the sketch below)
- hrTime and hexToBuf, in order to avoid a dependency on core

TODO:
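As a sketch of why long is listed above: nanosecond timestamps overflow Number's 53-bit integer precision. The hrTime tuple shape [seconds, nanoseconds] matches the core API; the helper name and exact conversion are illustrative, not the package's code.

```ts
// Sketch only: convert an hrTime tuple into a fixed64-style nanosecond value.
import Long from 'long';

type HrTime = [number, number];

function hrTimeToNanos(hrTime: HrTime): Long {
  const NANOS_PER_SECOND = 1_000_000_000;
  return Long.fromNumber(hrTime[0]).multiply(NANOS_PER_SECOND).add(hrTime[1]);
}

// Example: well above Number.MAX_SAFE_INTEGER, yet exact as a Long.
console.log(hrTimeToNanos([2 ** 31, 1]).toString()); // "2147483648000000001"
```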