More efficient way of publishing of batches of messages #364
Changes from all commits: 3295551, 253afa9, f8f3b47, a21fc48, a1fe5d4, 4876214, 738c3c3
@@ -226,6 +226,11 @@ string BasicConsume(

        void BasicPublish(string exchange, string routingKey, bool mandatory,
                          IBasicProperties basicProperties, byte[] body);

        [AmqpMethodDoNotImplement(null)]
        void BasicBatchPublish(string exchange, string routingKey, bool mandatory,
                               IEnumerable<BatchMessage> messages);

        /// <summary>
        /// Configures QoS parameters of the Basic content-class.
        /// </summary>

@@ -552,4 +557,8 @@ void QueueDeclareNoWait(string queue, bool durable,

        /// </summary>
        TimeSpan ContinuationTimeout { get; set; }
    }

    public class BatchMessage{
Contributor: This name is confusing: what is being batched there? How do I send a batch of messages using this thing? It's an implementation detail leaked into the public API (which I think can be avoided).
Contributor (author): A larger view of the process:
Contributor (author): It stores the data that varies between BasicPublish invocations, namely the BasicProperties and the body bytes.
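To make the intended call pattern concrete, here is a hypothetical usage sketch based only on the signatures in this diff; the `channel` variable, exchange name, and message contents are assumptions made for illustration and are not part of the PR:

```csharp
using System.Collections.Generic;
using System.Text;
using RabbitMQ.Client;

public static class BatchPublishUsageExample
{
    // Hypothetical caller of the proposed API; `channel` is an open IModel created elsewhere.
    public static void PublishBatch(IModel channel)
    {
        var messages = new List<BatchMessage>();
        for (int i = 0; i < 1000; i++)
        {
            messages.Add(new BatchMessage
            {
                Body = Encoding.UTF8.GetBytes("message " + i),
                // Left null here; BasicBatchPublish fills it in via CreateBasicProperties().
                basicProperties = null
            });
        }

        // One call frames every message and hands the whole batch to the session as a single command list.
        channel.BasicBatchPublish("my-exchange", "my-routing-key", false, messages);
    }
}
```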
Contributor (author): @michaelklishin, as the owner of the source, please consider this: a way to avoid batching and still gain the performance is to buffer outbound frames and send them on a separate thread. That, however, changes the client's behaviour, so I chose to write batch functionality instead. Plus, this fits many of the use cases for RabbitMQ well. Interestingly, with an outbound frame buffer and a dedicated sending thread, acks also get grouped and sent as a single stream on the socket, which increases read throughput. To achieve this, however, I took a dependency on BufferBlock from TPL Dataflow. I don't believe BufferedStream is useful for writes because it doesn't flush periodically on its own.
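As a rough illustration of that alternative (not the PR's implementation), here is a minimal sketch of an outbound frame buffer drained by a dedicated task. It assumes frames are plain byte arrays written to a raw Stream; the type and member names are invented for the example:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow; // System.Threading.Tasks.Dataflow NuGet package

// Sketch of an outbound frame buffer: publishers post frames, a single background
// task drains whatever has accumulated and writes it to the socket in one burst,
// then flushes. This is the "buffer outbound frames on a separate thread" idea,
// not code from this PR.
public class OutboundFrameBuffer
{
    private readonly BufferBlock<byte[]> _frames = new BufferBlock<byte[]>();
    private readonly Stream _socketStream;

    public OutboundFrameBuffer(Stream socketStream)
    {
        _socketStream = socketStream;
        Task.Run(() => WriteLoopAsync());
    }

    // Called wherever the client would otherwise write a frame directly to the socket.
    public void Enqueue(byte[] frame) => _frames.Post(frame);

    private async Task WriteLoopAsync()
    {
        // Wake up when at least one frame is available, then drain everything buffered so far.
        while (await _frames.OutputAvailableAsync())
        {
            if (_frames.TryReceiveAll(out IList<byte[]> batch))
            {
                foreach (var frame in batch)
                {
                    await _socketStream.WriteAsync(frame, 0, frame.Length);
                }
                // Flush once per drained batch; this is the periodic flush that
                // BufferedStream would not do on its own.
                await _socketStream.FlushAsync();
            }
        }
    }
}
```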
        public byte[] Body { get; set; }
Collaborator: How about making this thing immutable once created? (One possible shape is sketched after this diff.)
        public IBasicProperties basicProperties { get; set; }
    }
}
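One way the immutability suggestion could look, sketched here as an illustration rather than as part of the PR (note that BasicBatchPublish currently assigns basicProperties when it is null, so with this shape the properties would have to be created before constructing the message):

```csharp
using RabbitMQ.Client;

// Hypothetical immutable variant of BatchMessage: both values are set once in the
// constructor and exposed through read-only properties.
public class BatchMessage
{
    public BatchMessage(IBasicProperties basicProperties, byte[] body)
    {
        BasicProperties = basicProperties;
        Body = body;
    }

    public IBasicProperties BasicProperties { get; }

    public byte[] Body { get; }
}
```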
@@ -472,6 +472,19 @@ public void ModelSend(MethodBase method, ContentHeaderBase header, byte[] body)
            Session.Transmit(new Command(method, header, body));
        }
    }

    public void ModelSend(MethodBase method, IEnumerable<BatchMessage> messages)
    {
        if (method.HasContent)
        {
            m_flowControlBlock.WaitOne();
        }
        List<Command> commands = new List<Impl.Command>();
        foreach (var message in messages)
        {
            commands.Add(new Command(method, (ContentHeaderBase)message.basicProperties, message.Body));
        }
        Session.Transmit(commands);
    }

    public virtual void OnBasicAck(BasicAckEventArgs args)
    {
@@ -1233,6 +1246,50 @@ public void BasicPublish(string exchange,
                         body);
    }

    public void BasicBatchPublish(string exchange,
                                  string routingKey,
                                  bool mandatory,
                                  IEnumerable<BatchMessage> messages)
    {
        foreach (var message in messages)
        {
            if (message.basicProperties == null)
            {
                message.basicProperties = CreateBasicProperties();
            }

            if (NextPublishSeqNo > 0)
            {
                lock (m_unconfirmedSet.SyncRoot)
Collaborator: Would it change anything if we acquired all the sequence numbers in one go inside a single lock, instead of reacquiring the lock for each message? (A sketch of that approach follows this method.)
                {
                    if (!m_unconfirmedSet.Contains(NextPublishSeqNo))
                    {
                        m_unconfirmedSet.Add(NextPublishSeqNo);
                    }
                    NextPublishSeqNo++;
                }
            }
        }

        _Private_BasicBatchPublish(exchange,
                                   routingKey,
                                   mandatory,
                                   messages);
    }
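For reference, a self-contained sketch of the reviewer's suggestion above; the field names mirror the ones in this diff, but the class, its collection type, and ReserveForBatch are assumptions made purely for illustration:

```csharp
using System.Collections.Generic;

// Hypothetical illustration: reserve the publish sequence numbers for an entire
// batch under one lock acquisition, rather than taking the lock once per message.
public class ConfirmSequenceTracker
{
    private readonly object m_syncRoot = new object();
    private readonly SortedSet<ulong> m_unconfirmedSet = new SortedSet<ulong>();

    public ulong NextPublishSeqNo { get; private set; } = 1;

    public void ReserveForBatch(int messageCount)
    {
        lock (m_syncRoot)
        {
            for (int i = 0; i < messageCount; i++)
            {
                // Same per-message bookkeeping as the diff, but inside a single lock.
                if (!m_unconfirmedSet.Contains(NextPublishSeqNo))
                {
                    m_unconfirmedSet.Add(NextPublishSeqNo);
                }
                NextPublishSeqNo++;
            }
        }
    }
}
```

The bookkeeping is the same either way; whether it helps depends on how contended the lock is on a busy channel.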
    public void _Private_BasicBatchPublish(
        string @exchange,
        string @routingKey,
        bool @mandatory,
        //bool @immediate,
        IEnumerable<BatchMessage> messages)
    {
        BasicPublish __req = new BasicPublish();
        __req.m_exchange = @exchange;
        __req.m_routingKey = @routingKey;
        __req.m_mandatory = @mandatory;
        //__req.m_immediate = @immediate;
        ModelSend(__req, messages);
    }
    public abstract void BasicQos(uint prefetchSize,
                                  ushort prefetchCount,
                                  bool global);
IEnumerable is a great general-purpose choice from the caller's perspective. The downsides: you have to be aware of multiple enumeration, you cannot get a count or length without LINQ (which allocates), and it forces the library to copy things into internal materialized data structures. If this API is considered lower level, I would lean towards using an array if possible, but as I said, IEnumerable offers more convenience to the caller. Just some food for thought. (Both options are sketched after these comments.)

It's generally OK if the implementation ensures that it only enumerates once. To get a count, it can enumerate into, e.g., a List<T> and inspect the Count property.

Sorry, I see what you're saying: even a ToList() results in allocations internally. This is a valid concern for a hot code path.
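To make the trade-off concrete, here is a hypothetical sketch of the two options discussed above; the helper class, method names, and the `channel` parameter are invented for illustration and are not part of the PR:

```csharp
using System.Collections.Generic;
using RabbitMQ.Client;

public static class BatchPublishOverloads
{
    // Option 1: keep IEnumerable<BatchMessage>, but materialize it exactly once.
    // Count is then available without LINQ, at the cost of one List<T> allocation
    // per call (unless the caller already passed a list or an array).
    public static void PublishEnumerable(IModel channel, string exchange, string routingKey,
                                         IEnumerable<BatchMessage> messages)
    {
        IReadOnlyList<BatchMessage> materialized =
            messages as IReadOnlyList<BatchMessage> ?? new List<BatchMessage>(messages);
        if (materialized.Count == 0)
        {
            return;
        }
        channel.BasicBatchPublish(exchange, routingKey, false, materialized);
    }

    // Option 2: an array parameter makes the single-enumeration guarantee explicit,
    // exposes Length directly, and avoids any extra copy on a hot path.
    public static void PublishArray(IModel channel, string exchange, string routingKey,
                                    BatchMessage[] messages)
    {
        if (messages.Length == 0)
        {
            return;
        }
        channel.BasicBatchPublish(exchange, routingKey, false, messages);
    }
}
```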