
[receiver/kafka] Flaky TestIntegration #23267

Closed
dmitryax opened this issue Jun 10, 2023 · 16 comments
Labels: bug (Something isn't working), flaky test (a test is flaky), priority:p2 (Medium), receiver/kafka, Stale

dmitryax (Member) commented Jun 10, 2023

Seen in https://github.com/open-telemetry/opentelemetry-collector-contrib/actions/runs/5230434021/jobs/9443984920?pr=23239

--- FAIL: TestIntegration (78.01s)
    scraperint.go:121: 
        	Error Trace:	/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/internal/coreinternal/scraperinttest/scraperint.go:121
        	            				/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/kafkametricsreceiver/integration_test.go:78
        	Error:      	Condition never satisfied
        	Test:       	TestIntegration
    scraperint.go:103: resource "map[]": scope "otelcol/kafkametricsreceiver": metric "kafka.brokers": datapoint "map[]": int value doesn't match expected: 1, actual: 0
    scraperint.go:109: full log:
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
        Error scraping metrics
    scraperint.go:117: latest result:
        resourceMetrics:
          - resource: {}
            scopeMetrics:
              - metrics:
                  - description: Number of brokers in the cluster.
                    gauge:
                      dataPoints:
                        - asInt: "0"
                          startTimeUnixNano: "1686407538913389877"
                          timeUnixNano: "1686407594914980439"
                    name: kafka.brokers
                    unit: '{brokers}'
                scope:
                  name: otelcol/kafkametricsreceiver
                  version: latest
FAIL
coverage: 29.1% of statements
FAIL	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/kafkametricsreceiver	78.068s
ok  	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/kafkametricsreceiver/internal/metadata	0.044s	coverage: 0.0% of statements [no tests to run]
FAIL
make[1]: *** [../../Makefile.Common:114: do-integration-tests-with-cover] Error 1
make: *** [Makefile:172: receiver/kafkametricsreceiver] Error 2
make[1]: Leaving directory '/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/kafkametricsreceiver'
Error: Process completed with exit code 2.
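
For context on the failure mode above: "Condition never satisfied" is testify's message when a polling assertion never returns true within its timeout. A minimal sketch of that pattern, using hypothetical helper names rather than the actual scraperinttest code, looks like this:

```go
package kafkametricsreceiver

import (
	"context"
	"testing"
	"time"

	"github.com/stretchr/testify/require"
)

// waitForExpectedMetrics is a hypothetical stand-in for the polling loop in
// internal/coreinternal/scraperinttest. require.Eventually re-runs the
// closure once per tick; if it still returns false when the wait duration
// elapses, the test fails with "Condition never satisfied".
func waitForExpectedMetrics(t *testing.T, scrapeMatches func(context.Context) bool) {
	require.Eventually(t, func() bool {
		// Each false return corresponds to one failed attempt in the log
		// above: either a scrape error or a metric value mismatch.
		return scrapeMatches(context.Background())
	}, time.Minute, time.Second)
}
```

In this run, every attempt either logged Error scraping metrics or scraped kafka.brokers as 0 instead of the expected 1, so the condition never became true.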
dmitryax added the bug (Something isn't working), priority:p2 (Medium), flaky test (a test is flaky), and receiver/kafka labels on Jun 10, 2023
github-actions (Contributor) commented

Pinging code owners for receiver/kafka: @pavolloffay @MovieStoreGuy. See Adding Labels via Comments if you do not have permissions to add labels yourself.


shivanshuraj1333 (Member) commented

I want to take this up; kindly assign. Thanks!

MovieStoreGuy (Contributor) commented

All yours @shivanshu1333

shivanshuraj1333 (Member) commented

@dmitryax I've tried to reproduce this: I ran the mentioned integration test ~20 times from my IDE, and the test passed every time. At the moment I'm unsure how to reproduce it. Any guidance? Thanks!

dmitryax (Member, Author) commented

@shivanshu1333 I don't think it's possible to reproduce it locally. We need to read the code and figure out how this can happen.
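
One plausible mechanism, offered as a sketch under assumptions rather than a confirmed diagnosis: the receiver is built on the Sarama client, and a broker count derived from len(client.Brokers()) reflects the client's cached cluster metadata, which can legitimately be 0 while a freshly started broker container is still registering. A standalone illustration of the underlying call (hypothetical code, not the receiver's actual scraper):

```go
package main

import (
	"fmt"
	"log"

	"github.com/Shopify/sarama" // the Kafka client library the receiver builds on
)

// Minimal sketch: a broker-count gauge like kafka.brokers boils down to
// len(client.Brokers()), which reports the brokers currently present in the
// client's cluster metadata rather than the result of a live query.
func main() {
	client, err := sarama.NewClient([]string{"localhost:9092"}, sarama.NewConfig())
	if err != nil {
		// In the integration test, a failure at this stage would surface
		// as one of the "Error scraping metrics" lines.
		log.Fatalf("connecting to Kafka: %v", err)
	}
	defer client.Close()

	fmt.Printf("kafka.brokers = %d\n", len(client.Brokers()))
}
```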

dmitryax (Member, Author) commented

There are a lot of Error scraping metrics messages in the failing job. You can start by adding more information to that error, so that when it happens next time we have more details to investigate.
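
For illustration, the kind of change being suggested might look like the following hypothetical helper (the receiver's actual logging call sites differ):

```go
package kafkametricsreceiver

import (
	"fmt"

	"go.uber.org/zap"
)

// logScrapeError shows one way to make the bare "Error scraping metrics"
// line actionable: record which sub-scraper failed and the root cause, and
// propagate a wrapped error so the test output carries the same detail.
// The helper and its call shape are illustrative, not the receiver's API.
func logScrapeError(logger *zap.Logger, scraper string, err error) error {
	logger.Error("Error scraping metrics",
		zap.String("scraper", scraper), // e.g. "brokers", "topics", "consumers"
		zap.Error(err),                 // the underlying cause, previously not surfaced
	)
	return fmt.Errorf("%s scraper: %w", scraper, err)
}
```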

MovieStoreGuy (Contributor) commented

How are you going with this, @shivanshu1333?

github-actions bot (Contributor) commented Jan 9, 2024

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label on Jan 9, 2024
djaglowski (Member) commented

Closed by #30218
