[EventHubs] Buffered Producer concurrent sending support #23844

Closed
yunhaoling opened this issue Apr 6, 2022 · 0 comments
yunhaoling commented Apr 6, 2022

Issue

Support concurrent sending in the EventHubProducerClient buffered mode.

Background

There are two parameters in the .NET implementation:

  • maximumConcurrentSends: The number of batches that may be sent concurrently across all partitions.
  • maximumConcurrentSendsPerPartition: The number of batches that may be sent concurrently to each partition.
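
For reference, a minimal sketch of how these knobs might eventually surface on the Python EventHubProducerClient in buffered mode. The commented-out `max_concurrent_sends` / `max_concurrent_sends_per_partition` keyword names are purely hypothetical and do not exist in the current API; the rest follows the preview buffered-producer surface as I understand it:

```python
# Hypothetical sketch only: the commented-out concurrency keywords do not exist today.
from azure.eventhub import EventHubProducerClient, EventData

def on_success(events, partition_id):
    # Called when a buffered batch is successfully sent.
    print(f"sent {len(events)} events to partition {partition_id}")

def on_error(events, partition_id, error):
    # Called when sending a buffered batch fails.
    print(f"failed to send {len(events)} events to partition {partition_id}: {error}")

producer = EventHubProducerClient.from_connection_string(
    conn_str="<CONNECTION STRING>",
    eventhub_name="<EVENT HUB NAME>",
    buffered_mode=True,
    on_success=on_success,
    on_error=on_error,
    # max_concurrent_sends=10,               # hypothetical: batches in flight across all partitions
    # max_concurrent_sends_per_partition=1,  # hypothetical: batches in flight per partition
)

with producer:
    producer.send_event(EventData("hello"))
```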

However, those two parameters were pulled out of scope for the preview release for the following reasons:

  • Our current producer implementation prevents concurrency because we lock the whole producer's send method, which makes maximumConcurrentSendsPerPartition useless.
  • maximumConcurrentSends would also cause confusion if we cannot support concurrent sending per partition producer, e.g. maximumConcurrentSends is set to 10 but we only have 4 partition producers (one per partition).
  • This is an advanced performance configuration; I think only a few people would use it. We could add it incrementally.

To sum up, the concurrency story is an advanced feature that needs more time for investigation and experimentation, so I prefer pulling it out of scope for the first preview.

Originally posted by @yunhaoling in #23748 (comment)

Scope:

  • understand the necessity of this parameter, and whether/how customers could benefit from it.
  • propose a technical solution to support this feature, as the current implementation of EventHubProducer cannot accommodate a concurrent model well (there's a big lock)
    • to enable concurrent sending, there are two options (among others) that I think we could pursue (a rough sketch follows this list):
        1. refactor the EventHubProducer implementation to embrace concurrency with fine-grained locks
        2. or create multiple partition producers
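
As a starting point for option 1, here is a rough, illustrative sketch (not the SDK's actual internals) of fine-grained per-partition locking: batches for different partitions run concurrently on a thread pool, while batches for the same partition stay serialized on that partition's lock. All class and method names below are made up for illustration.

```python
# Illustrative only: fine-grained, per-partition locking for concurrent sends.
# None of these names come from the azure-eventhub implementation.
import threading
from concurrent.futures import ThreadPoolExecutor

class PartitionedSender:
    def __init__(self, send_batch, max_workers=4):
        self._send_batch = send_batch          # callable(partition_id, batch) doing the real send
        self._executor = ThreadPoolExecutor(max_workers=max_workers)
        self._locks = {}                       # partition_id -> lock guarding in-flight sends
        self._locks_guard = threading.Lock()   # protects the lock table itself

    def _lock_for(self, partition_id):
        with self._locks_guard:
            return self._locks.setdefault(partition_id, threading.Lock())

    def send(self, partition_id, batch):
        # Batches for different partitions run concurrently on the pool;
        # batches for the same partition serialize on that partition's lock.
        def _task():
            with self._lock_for(partition_id):
                self._send_batch(partition_id, batch)
        return self._executor.submit(_task)

# Usage sketch:
# sender = PartitionedSender(send_batch=lambda pid, batch: print(pid, batch))
# futures = [sender.send("0", ["a", "b"]), sender.send("1", ["c"])]
# for f in futures:
#     f.result()
```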
@ghost ghost added the needs-triage Workflow: This is a new issue that needs to be triaged to the appropriate team. label Apr 6, 2022
@azure-sdk azure-sdk added Client This issue points to a problem in the data-plane of the library. Event Hubs needs-team-triage Workflow: This issue needs the team to triage. labels Apr 6, 2022
@yunhaoling yunhaoling added Messaging Messaging crew and removed needs-team-triage Workflow: This issue needs the team to triage. labels Apr 6, 2022
@ghost ghost removed the needs-triage Workflow: This is a new issue that needs to be triaged to the appropriate team. label Apr 6, 2022
@yunhaoling yunhaoling changed the title [EventHubs] Buffered Producer concurrent sending [EventHubs] Buffered Producer concurrent sending support Apr 6, 2022
@yunhaoling yunhaoling added this to the Backlog milestone Apr 13, 2022
@kashifkhan kashifkhan modified the milestones: Backlog, [2022] May Apr 26, 2022
@lmazuel lmazuel modified the milestones: [2022] May, [2022] June May 16, 2022
@github-actions github-actions bot locked and limited conversation to collaborators Apr 11, 2023