
[BUG] [ServiceBus|Azure.Messaging.ServiceBus] SendMessage MessageSizeExceeded in Azure #19053

Closed
Abrissirba opened this issue Feb 26, 2021 · 4 comments · Fixed by #19091
Labels: Client, customer-reported, needs-team-attention, question, Service Bus

Comments


Abrissirba commented Feb 26, 2021

Describe the bug
I have an Azure Function app that uses Azure.Messaging.ServiceBus. I'm trying to send multiple messages in a batch created with CreateMessageBatchAsync. I add messages with TryAddMessage and send the batch whenever it returns false. This works locally, but when I publish to Azure I get a MessageSizeExceeded error when calling SendMessagesAsync.

It looks similar to issue #18038.

Expected behavior
SendMessagesAsync should be able to send the batch.

Actual behavior (include Exception or Stack Trace)
Getting this error in the Azure Function log: The message (id:37495150, size:296449 bytes) is larger than is currently allowed (262144 bytes). (MessageSizeExceeded)

To Reproduce

public async Task<IEnumerable<QueueMessage>> SendMessagesAsync(IEnumerable<QueueMessage> messages)
{
    var failedMessages = new List<QueueMessage>();
    ServiceBusMessageBatch batch = await _sender.CreateMessageBatchAsync();
    try
    {
        for (var i = 0; i < messages.Count(); i++)
        {
            var json = JsonConvert.SerializeObject(messages.ElementAt(i), Formatting.None, new JsonSerializerSettings
            {
                NullValueHandling = NullValueHandling.Ignore
            });
            var serviceBusMessage = new ServiceBusMessage(json);

            if (!batch.TryAddMessage(serviceBusMessage))
            {
                _logger.LogInformation("ServiceBusService, batch full. Messages: {Messages}, Size: {Size}", batch.Count, batch.SizeInBytes);
                // Send the current batch as it is full and create a new one
                await _sender.SendMessagesAsync(batch);

                batch.Dispose();
                batch = await _sender.CreateMessageBatchAsync();

                if (!batch.TryAddMessage(serviceBusMessage))
                {
                    _logger.LogInformation("ServiceBusService, could not send message. MessageId: {MessageId}", serviceBusMessage.MessageId);
                    failedMessages.Add(messages.ElementAt(i));
                }
            }
        }

        _logger.LogInformation("ServiceBusService, send last batch. Messages: {Messages}, Size: {Size}", batch.Count, batch.SizeInBytes);
        // Send the final batch
        await _sender.SendMessagesAsync(batch);
    }
    finally
    {
        batch.Dispose();
    }

    return failedMessages;
}

And here is the log output

2021-02-26 06:00:01.839
UploadMessagesToQueueAsync: SendMessagesAsync: 702
Information
2021-02-26 06:00:01.859
ServiceBusService, batch full. Messages: 440, Size: 262137
Information
2021-02-26 06:00:01.888
at Azure.Messaging.ServiceBus.Amqp.AmqpSender.SendBatchInternalAsync(Func`1 messageFactory, TimeSpan timeout, CancellationToken cancellationToken)
at Azure.Messaging.ServiceBus.Amqp.AmqpSender.SendBatchInternalAsync(Func`1 messageFactory, TimeSpan timeout, CancellationToken cancellationToken)
at Azure.Messaging.ServiceBus.Amqp.AmqpSender.<>c__DisplayClass18_0.<<SendBatchAsync>b__1>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation(Func`2 operation, TransportConnectionScope scope, CancellationToken cancellationToken)
at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation(Func`2 operation, TransportConnectionScope scope, CancellationToken cancellationToken)
at Azure.Messaging.ServiceBus.Amqp.AmqpSender.SendBatchAsync(ServiceBusMessageBatch messageBatch, CancellationToken cancellationToken)
at Azure.Messaging.ServiceBus.ServiceBusSender.SendMessagesAsync(ServiceBusMessageBatch messageBatch, CancellationToken cancellationToken)
at Obo.Messages.ApiClient.Services.ServiceBusService.SendMessagesAsync(IEnumerable`1 messages) in C:\Projects\SXBoendeApp\Backend\Tenants\OBO\OBO.Messages\Obo.Messages.ApiClient\Services\ServiceBusService.cs:line 57
at Obo.Messages.ApiClient.Services.MessagesApiClient.UploadMessagesToQueueAsync(IEnumerable`1 messages, IEnumerable`1 deletedIds) in C:\Projects\SXBoendeApp\Backend\Tenants\OBO\OBO.Messages\Obo.Messages.ApiClient\Services\MessagesApiClient.cs:line 201
at OBO.Messages.Booking.AptusFunction.BookingSync(TimerInfo myTimer, ILogger log, ExecutionContext context) in C:\Projects\SXBoendeApp\Backend\Tenants\OBO\OBO.Messages\FunctionApps\OBO.Messages.FunctionApps.Booking\AptusFunction.cs:line 128
Error
2021-02-26 06:00:01.892
The message (id:7331010, size:296449 bytes) is larger than is currently allowed (262144 bytes). (MessageSizeExceeded)
Error
2021-02-26 06:00:01.893
Executed 'BookingSync' (Failed, Id=64ba3ce0-b7b2-457b-8c46-80376a86c65d, Duration=1892ms)
Error
2021-02-26 06:00:01.899
The message (id:7331010, size:296449 bytes) is larger than is currently allowed (262144 bytes). (MessageSizeExceeded)

Environment:
Azure.Messaging.ServiceBus 7.1.0
.NET Core 3.1
Azure Functions on Windows

@ghost added the needs-triage, customer-reported, and question labels on Feb 26, 2021
@jsquire added the Client, needs-team-attention, and Service Bus labels on Feb 26, 2021
@ghost removed the needs-triage label on Feb 26, 2021
jsquire (Member) commented Feb 26, 2021

Thank you for your feedback. Tagging and routing to the team member best able to assist.

JoshLove-msft (Member) commented

Looks like the issue is that we are not adding overhead for distributed tracing instrumentation, which probably explains why you don't run into it locally. Thanks @jsquire for pointing it out!
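
As a possible interim workaround until the fix lands (the issue header says it is addressed by #19091), you could cap the batch size below the broker limit so that TryAddMessage leaves headroom for the properties added at send time. A minimal sketch, reusing the _sender from the repro above; the 16 KB of headroom is an arbitrary assumption, not a documented value:

    // Hedged workaround sketch, not an official fix: reserve headroom under the
    // 262144-byte limit so that instrumentation properties stamped on the messages
    // at send time do not push the batch over the service's maximum.
    var options = new CreateMessageBatchOptions
    {
        MaxSizeInBytes = 262144 - 16384 // arbitrary 16 KB of headroom (assumption)
    };
    ServiceBusMessageBatch batch = await _sender.CreateMessageBatchAsync(options);

Once a package containing the fix is released, the explicit headroom should no longer be necessary.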

Abrissirba (Author) commented

Any estimate on when a new version of the NuGet package will be released?

JoshLove-msft (Member) commented

Yes, we should have a beta version released next Tuesday.
