Update host metrics receiver to use scraper controller #1949
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master    #1949      +/-   ##
==========================================
+ Coverage   91.92%   92.01%   +0.09%
==========================================
  Files         282      278       -4
  Lines       16797    16746      -51
==========================================
- Hits        15440    15409      -31
+ Misses        933      921      -12
+ Partials      424      416       -8
Continue to review full report at Codecov.
|
@james-bebbington do you think it makes sense to split this PR? Asking since there are changes in …
Yes definitely. This PR actually depends on #1961 (the first commit of this PR, with changes to …)
This PR was marked stale due to lack of activity. It will be closed in 7 days.
This is now unblocked and ready for review again.
@@ -135,7 +144,7 @@ func (s *scraper) getProcessMetadata() ([]*processMetadata, error) {
 	executable, err := getProcessExecutable(handle)
 	if err != nil {
-		errs = append(errs, fmt.Errorf("error reading process name for pid %v: %w", pid, err))
+		errs = append(errs, consumererror.NewPartialScrapeError(fmt.Errorf("error reading process name for pid %v: %w", pid, err), 1))
Didn't think of this during the previous review, but seeing it now, a less verbose version might be

var errs consumererror.ScrapeErrors
errs.Add(fmt.Errorf("error reading process name for pid %v: %w", pid, err), 1)
errs.Add(...)
return errs.Combine()

Or something. Just an idea.
It is a bit ugly, yeah.
Another option might be to just make a fmt-like constructor for PartialScrapeError, so we can do something like

errs = append(errs, consumererror.NewScrapeError(1, "error reading process name for pid %v: %w", pid, err))

WDYT?
In any case I will probably do this in a follow-up PR, as this one is very large already.
Another option might be to just make a fmt-like constructor for PartialScrapeError so we can do something like
Yeah, I think that'd be good. I was worried go vet might not catch fmt string errors, but it looks like it does, from a quick test:
package main

import (
	"fmt"
)

func f(s string, args ...interface{}) string {
	return fmt.Sprintf(s, args...)
}

func main() {
	// fmt.Printf("Hello, playground %s %s", "foo", 5)
	f("hello, playground %s %s", "foo", 3)
}
But I still think it'd be nice to have a struct to accumulate errors into instead of having to do all the appends. Could give it an Addf for formatted strings as well.
Yeah, that seems like a good idea. Note we would also want a way to include regular unexpected errors that don't necessarily make sense to represent as scrape errors.
Moved this discussion to an issue: #2046
OK to merge this?
@tigrannajaryan yes
Updated the host metrics receiver to use the scraper controller, removing around 400 lines of code. There is some "scraper factory" code left over that could perhaps be moved into the receiverhelper (or scraper) library; will think about that for the next phase.
Resolves #918
Sample Output:
Note: in these examples we see quite a lot of errors scraping process data, because the Collector was not run in "Administrative mode" on Windows:
Traces (from http://localhost:55679/debug/tracez?zspanname=scraper%2fprocess%2fMetricsScraped&ztype=2&zsubtype=0):
Metrics (from http://localhost:8888/metrics):