
Added tracegen utility #956

Merged

merged 2 commits on Sep 23, 2020

Conversation

jpkrohling
Member

Signed-off-by: Juraci Paixão Kröhling juraci@kroehling.de

Description: Added tracegen

Link to tracking Issue: #955

Testing: Manual tests

Documentation: readme included

@jpkrohling jpkrohling requested a review from a team September 4, 2020 11:51
@jpkrohling
Member Author

@yurishkuro could you take a look at this? The one notable missing thing is the client metrics. I also wanted to check if the inject/extract part looks sane, as I couldn't really get why the original tracegen is doing that.

Member

@yurishkuro left a comment

iirc @tigrannajaryan had another load generator that he used for benchmarking OTLP

@@ -0,0 +1,86 @@
// Copyright (c) 2018 The Jaeger Authors.
Member

since this is not the code from Jaeger verbatim, a better copyright reference is

// Copyright (c) 2020 The OpenTelemetry Authors.
// Copyright (c) 2018 The Jaeger Authors.

Member

This is currently under discussion here open-telemetry/community#475

}

wg := sync.WaitGroup{}
var running uint32 = 1
Member

this one is never used to signal the workers to stop

Member Author

Isn't it used a few lines below?

if c.Duration > 0 {
	time.Sleep(c.Duration)
	atomic.StoreUint32(&running, 0)
}

@tigrannajaryan
Member

iirc @tigrannajaryan had another load generator that he used for benchmarking OTLP

Thanks @yurishkuro

@jpkrohling Here is the generator I used in the OTLP benchmarks; it is a primitive one: https://github.com/tigrannajaryan/exp-otelproto/blob/master/encodings/otlp/generator.go

@jpkrohling we also have a more sophisticated Span data generator in https://github.com/open-telemetry/opentelemetry-collector/blob/master/internal/goldendataset/span_generator.go

Depending on what sort of spans you are looking for, this one may be a better choice, since it can generate a variety of data, which is good for testing (though it may also be slower to generate data; not sure by how much).

@jpkrohling
Member Author

I'm primarily looking to have a tool that can generate a specific number of traces with a predictable outcome. This tool, ideally a binary packaged in a container, would be used by e2e tests to assert that the whole system is working properly.

Load testing would be a secondary goal.

@tigrannajaryan
Member

I'm primarily looking to have a tool that can generate a specific number of traces with a predictable outcome. This tool, ideally a binary packaged in a container, would be used by e2e tests to assert that the whole system is working properly.

Load testing would be a secondary goal.

Are we talking about some other e2e tests that are different from e2e tests that we already have in this repo?

@jpkrohling
Member Author

Potentially, yes. For the Jaeger Operator, we have a different set of e2e tests, especially for the case where we inject a sidecar into a "business application". The same would happen for the OpenTelemetry Operator, and the e2e test would make sure that everything is in the right place and the networking is properly configured. The actual business application is irrelevant, hence something simple like tracegen.

@jpkrohling
Member Author

For clarity: this first iteration of the tracegen would not be used for the case I mentioned above, but I do plan on adding a web interface to it.

@tigrannajaryan tigrannajaryan self-assigned this Sep 9, 2020
Member

@tigrannajaryan left a comment

Blocking until license/copyright issue is clarified.

@@ -0,0 +1,86 @@
// Copyright (c) 2018 The Jaeger Authors.
Member

This is currently under discussion here open-telemetry/community#475

Resolved (outdated) review threads:
tracegen/internal/tracegen/worker.go
tracegen/internal/tracegen/config.go (7 threads)
@jpkrohling
Member Author

Once the copyright part is figured out, I'll address most of the review comments.

@tigrannajaryan
Member

Copyright is clarified, we can move forward.

@jpkrohling
Member Author

PR updated

@jpkrohling
Member Author

CI is failing due to flaky tests.

@tigrannajaryan
Member

@jpkrohling please fix CHANGELOG.md conflict. Can you also add some tests for the new code?

@codecov

codecov bot commented Sep 22, 2020

Codecov Report

Merging #956 into master will increase coverage by 0.87%.
The diff coverage is 78.12%.


@@            Coverage Diff             @@
##           master     #956      +/-   ##
==========================================
+ Coverage   87.97%   88.85%   +0.87%     
==========================================
  Files         251      254       +3     
  Lines       12012    12150     +138     
==========================================
+ Hits        10568    10796     +228     
+ Misses       1109     1006     -103     
- Partials      335      348      +13     
Flag          Coverage Δ
#integration  75.42% <ø> (?)
#unit         87.89% <78.12%> (-0.09%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
tracegen/internal/tracegen/config.go 72.22% <72.22%> (ø)
tracegen/internal/tracegen/worker.go 85.71% <85.71%> (ø)
...eiver/awsxrayreceiver/internal/udppoller/poller.go 97.61% <0.00%> (-2.39%) ⬇️
receiver/k8sclusterreceiver/watcher.go 95.29% <0.00%> (-2.36%) ⬇️
internal/common/testing/container/container.go 72.97% <0.00%> (ø)
receiver/dockerstatsreceiver/docker.go 92.30% <0.00%> (+39.05%) ⬆️
receiver/dockerstatsreceiver/receiver.go 96.82% <0.00%> (+49.20%) ⬆️
receiver/redisreceiver/receiver.go 87.50% <0.00%> (+87.50%) ⬆️
... and 2 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f9082de...6a1a481.

Signed-off-by: Juraci Paixão Kröhling <juraci@kroehling.de>
Signed-off-by: Juraci Paixão Kröhling <juraci@kroehling.de>
@jpkrohling
Member Author

Conflict solved, test added.

@tigrannajaryan tigrannajaryan merged commit 6145a10 into open-telemetry:master Sep 23, 2020
ljmsc referenced this pull request in ljmsc/opentelemetry-collector-contrib Feb 21, 2022
* Consolidate stdout exporter

* Move config to own file and match project standard

* Abstract Exporter into unified struct

* Rename trace part of the exporter

* Update import paths and configuration

* Update tests

* Update InstallNewPipeline to not return traceProvider

It is a registered global, access it that way.

* Update example_test

* Update docs

* Update example to be for whole package

* Update metric output

Closer match the span output.

* Clean up span output

Print as a batch and cleanup marshaling.

* Correct spelling error in doc

* Add Exporters README

* Update Changelog

* Propagate changes to rest of project

* Lint fixes

* Fix example test in metric SDK

* Add disable config options for trace and metric

Co-authored-by: Liz Fong-Jones <lizf@honeycomb.io>
codeboten pushed a commit that referenced this pull request Nov 23, 2022
* [instrumentation/wsgi] fix NonRecordingSpan bug

There was a bug caused by accessing `.kind` on a NonRecordingSpan. Added a test to validate the fix.

Fix #956

* fix lint

* use is_recording

* fix lint

* fix lint

* fix lint
3 participants