
Declare how to share a receiver across multiple signals with factory options #10059

Closed
atoulme wants to merge 3 commits into main from merge_sharedcomponent

Conversation

@atoulme (Contributor) commented May 1, 2024

Description

Declare how to share a receiver across multiple signals with factory options.

Right now, each component that wants to be reused across signals pulls in a library under internal/sharedcomponent and declares its own map of components. This change moves that map into the factory and uses a factory option to set a consumer for a different signal on the existing receiver.

This removes the need to declare and manage internal/sharedcomponent.
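
For illustration, here is a rough sketch of the sharing pattern this change moves into the factory. All names below (sharedReceivers, loadOrStore, the "otlp" key) are hypothetical and are not the PR's actual API; the point is only that the factory holds one instance per component ID, and each per-signal create call either builds the receiver once or reuses it and attaches another consumer.

```go
package main

import (
	"fmt"
	"sync"
)

// sharedReceivers is a hypothetical per-factory registry: one receiver
// instance per component ID, reused by every signal that references it.
type sharedReceivers[K comparable, V any] struct {
	mu        sync.Mutex
	instances map[K]V
}

func newSharedReceivers[K comparable, V any]() *sharedReceivers[K, V] {
	return &sharedReceivers[K, V]{instances: make(map[K]V)}
}

// loadOrStore returns the instance already stored under id, or builds it
// once via create and remembers it for the next signal.
func (s *sharedReceivers[K, V]) loadOrStore(id K, create func() (V, error)) (V, error) {
	s.mu.Lock()
	defer s.mu.Unlock()
	if inst, ok := s.instances[id]; ok {
		return inst, nil // another signal reuses the same receiver
	}
	inst, err := create()
	if err != nil {
		var zero V
		return zero, err
	}
	s.instances[id] = inst
	return inst, nil
}

type fakeReceiver struct{ consumers int }

func main() {
	recvs := newSharedReceivers[string, *fakeReceiver]()
	for _, signal := range []string{"traces", "metrics", "logs"} {
		r, _ := recvs.loadOrStore("otlp", func() (*fakeReceiver, error) {
			return &fakeReceiver{}, nil // built only on the first call
		})
		r.consumers++ // stand-in for "set this signal's consumer on the shared receiver"
		fmt.Printf("%s -> shared receiver now has %d consumers\n", signal, r.consumers)
	}
}
```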

This PR lacks test coverage and is meant to collect feedback on the feasibility of the approach.

Testing

WIP

Documentation

API docs added.

@atoulme requested review from a team and TylerHelmuth May 1, 2024 07:17
@atoulme force-pushed the merge_sharedcomponent branch from 2e2129a to b91efbd May 1, 2024 07:19
codecov bot commented May 1, 2024

Codecov Report

Attention: Patch coverage is 29.72973% with 52 lines in your changes missing coverage. Please review.

Project coverage is 91.59%. Comparing base (cad2734) to head (5ae1cc8).
Report is 463 commits behind head on main.

Files                  Patch %   Lines
receiver/receiver.go   0.00%     51 Missing and 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main   #10059      +/-   ##
==========================================
- Coverage   91.87%   91.59%   -0.28%     
==========================================
  Files         360      359       -1     
  Lines       16725    16722       -3     
==========================================
- Hits        15366    15317      -49     
- Misses       1021     1069      +48     
+ Partials      338      336       -2     


@atoulme (Contributor, Author) commented May 1, 2024

open-telemetry/opentelemetry-collector-contrib#32809 is open to fix contrib.

@atoulme force-pushed the merge_sharedcomponent branch 6 times, most recently from ab33d83 to 5ae1cc8 May 2, 2024 07:10
@mwear (Member) commented May 2, 2024

There are two behaviors related to status reporting that were implemented in sharedcomponent and need to be preserved in this PR. A shared component represents multiple logical component instances as a single component. For the purposes of status reporting, the shared component must report status for each of the logical components it represents. To complicate matters slightly, there are two ways to report status for a component: from outside the component (via the service's status.Reporter) or from within the component (via component.TelemetrySettings). In both cases, a single call to ReportStatus needs to report status for each component instance. For example, if an OTLP receiver is part of three pipelines, one each for traces, logs, and metrics, three status events are emitted for each call to ReportStatus. With that as background, I'll explain the two special cases that sharedcomponent currently handles.

  1. sharedcomponent currently chains together the TelemetrySettings.ReportStatus functions for each instance of the component. This is how it emits n status events for n logical instances; the chained function is called from within the component (a minimal sketch follows this list).
  2. graph does automatic status reporting for components during startup by calling ReportStatus on the service's status.Reporter, i.e. from outside the component. The graph starts components one by one; this is a problem for the shared component, which should transition all logical instances to Starting before Start is called. The shared component handles this by preemptively reporting its status from within, via its chained TelemetrySettings.ReportStatus method. Shutdown works similarly.
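
A minimal sketch of that chaining, assuming only the func(*component.StatusEvent) shape of TelemetrySettings.ReportStatus that appears in the test quoted later in this thread (chainReportStatus and the package name are illustrative, not the sharedcomponent code):

```go
package sharedexample

import "go.opentelemetry.io/collector/component"

// chainReportStatus composes two ReportStatus functions so that a single call
// fans out to every logical instance registered so far: when instance n+1
// joins the shared component, its ReportStatus is appended to the chain.
func chainReportStatus(prev, next func(*component.StatusEvent)) func(*component.StatusEvent) {
	if prev == nil {
		return next
	}
	return func(ev *component.StatusEvent) {
		prev(ev) // instances registered earlier
		next(ev) // the newly registered instance
	}
}
```

The preemptive reporting described in case 2 then amounts to calling the chained function with a Starting event before graph invokes Start on the shared component.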

The solutions currently part of sharedcomponent are not ideal, IMO, and it'd be great if we can improve upon them in the redesign. Let me know if there is anything I can do to help.

codeboten added a commit to open-telemetry/opentelemetry-collector-contrib that referenced this pull request May 3, 2024
The testbed starts and stops the exporter once per signal, but we know
it's the same exporter underneath.

This PR changes the behavior of testbed to start and stop the exporter
just once. This change is needed to support
open-telemetry/opentelemetry-collector#10059

---------

Co-authored-by: Alex Boten <223565+codeboten@users.noreply.github.com>
Co-authored-by: Yang Song <songy23@users.noreply.github.com>
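
For illustration, a sketch of the "start and stop just once" guard the testbed change above implies; onceExporter is hypothetical and not the testbed's actual code, it only shows how repeated per-signal Start/Shutdown calls can collapse onto a single underlying call.

```go
package main

import (
	"context"
	"fmt"
	"sync"
)

// onceExporter wraps start/shutdown functions so that, no matter how many
// per-signal callers invoke them, the underlying exporter starts and stops once.
type onceExporter struct {
	startOnce sync.Once
	stopOnce  sync.Once
	startErr  error
	stopErr   error
	start     func(context.Context) error
	shutdown  func(context.Context) error
}

func (e *onceExporter) Start(ctx context.Context) error {
	e.startOnce.Do(func() { e.startErr = e.start(ctx) })
	return e.startErr // every caller sees the same result
}

func (e *onceExporter) Shutdown(ctx context.Context) error {
	e.stopOnce.Do(func() { e.stopErr = e.shutdown(ctx) })
	return e.stopErr
}

func main() {
	starts := 0
	exp := &onceExporter{
		start:    func(context.Context) error { starts++; return nil },
		shutdown: func(context.Context) error { return nil },
	}
	// The testbed previously did this once per signal; the wrapper makes the
	// extra calls harmless.
	for _, signal := range []string{"traces", "metrics", "logs"} {
		fmt.Printf("starting for %s: err=%v\n", signal, exp.Start(context.Background()))
	}
	fmt.Println("underlying Start calls:", starts) // prints 1
	_ = exp.Shutdown(context.Background())
}
```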
This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions bot added the Stale label May 17, 2024
@atoulme removed the Stale label May 28, 2024
@atoulme (Contributor, Author) commented May 28, 2024

@mwear can we create an integration test that faithfully checks both scenarios you describe? This way we can make sure this continues to work.

@mwear (Member) commented May 28, 2024

> @mwear can we create an integration test that faithfully checks both scenarios you describe? This way we can make sure this continues to work.

Sure thing. That's an oversight from the original implementation.

This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions bot added the Stale label Jun 12, 2024
@mwear (Member) commented Jun 13, 2024

@atoulme, sorry for the delay. I looked into this today, and the two cases I described are in fact covered by an existing test case:

func TestReportStatusOnStartShutdown(t *testing.T) {
	for _, tc := range []struct {
		name             string
		startErr         error
		shutdownErr      error
		expectedStatuses []component.Status
	}{
		{
			name:        "successful start/stop",
			startErr:    nil,
			shutdownErr: nil,
			expectedStatuses: []component.Status{
				component.StatusStarting,
				component.StatusOK,
				component.StatusStopping,
				component.StatusStopped,
			},
		},
		{
			name:        "start error",
			startErr:    assert.AnError,
			shutdownErr: nil,
			expectedStatuses: []component.Status{
				component.StatusStarting,
				component.StatusPermanentError,
			},
		},
		{
			name:        "shutdown error",
			shutdownErr: assert.AnError,
			expectedStatuses: []component.Status{
				component.StatusStarting,
				component.StatusOK,
				component.StatusStopping,
				component.StatusPermanentError,
			},
		},
	} {
		t.Run(tc.name, func(t *testing.T) {
			reportedStatuses := make(map[*component.InstanceID][]component.Status)
			newStatusFunc := func() func(*component.StatusEvent) {
				instanceID := &component.InstanceID{}
				return func(ev *component.StatusEvent) {
					reportedStatuses[instanceID] = append(reportedStatuses[instanceID], ev.Status())
				}
			}
			base := &baseComponent{}
			if tc.startErr != nil {
				base.StartFunc = func(context.Context, component.Host) error {
					return tc.startErr
				}
			}
			if tc.shutdownErr != nil {
				base.ShutdownFunc = func(context.Context) error {
					return tc.shutdownErr
				}
			}
			comps := NewMap[component.ID, *baseComponent]()
			var comp *Component[*baseComponent]
			var err error
			for i := 0; i < 3; i++ {
				telemetrySettings := newNopTelemetrySettings()
				telemetrySettings.ReportStatus = newStatusFunc()
				if i == 0 {
					base.telemetry = telemetrySettings
				}
				comp, err = comps.LoadOrStore(
					id,
					func() (*baseComponent, error) { return base, nil },
					telemetrySettings,
				)
				require.NoError(t, err)
			}
			err = comp.Start(context.Background(), componenttest.NewNopHost())
			require.Equal(t, tc.startErr, err)
			if tc.startErr == nil {
				comp.telemetry.ReportStatus(component.NewStatusEvent(component.StatusOK))
				err = comp.Shutdown(context.Background())
				require.Equal(t, tc.shutdownErr, err)
			}
			require.Equal(t, 3, len(reportedStatuses))
			for _, actualStatuses := range reportedStatuses {
				require.Equal(t, tc.expectedStatuses, actualStatuses)
			}
		})
	}
}
Let me know if any further explanations are needed.

github-actions bot commented Jul 5, 2024

This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions bot added the Stale label Jul 5, 2024
@mx-psi (Member) commented Jul 18, 2024

@atoulme Does this relate to #10534?

@github-actions bot removed the Stale label Jul 19, 2024
github-actions bot commented Aug 2, 2024

This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions bot added Stale and removed Stale labels Aug 2, 2024
This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions bot added the Stale label Aug 22, 2024
@atoulme (Contributor, Author) commented Aug 22, 2024

> @atoulme Does this relate to #10534?

No, this was a spike to look at how to move away from internal/sharedcomponent.

@atoulme force-pushed the merge_sharedcomponent branch from 5ae1cc8 to 4468a2f August 22, 2024 06:19
@github-actions bot removed the Stale label Aug 23, 2024
This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions bot added the Stale label Sep 11, 2024
@github-actions bot removed the Stale label Sep 19, 2024
github-actions bot commented Oct 6, 2024

This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions bot added the Stale label Oct 6, 2024
Closed as inactive. Feel free to reopen if this PR is still being worked on.

@github-actions bot closed this Oct 20, 2024