
[BUG] Azure.Storage.Blobs.BlobClient.UploadAsync times out when the stream is at a position different than 0 #13877

Closed
andre-ss6 opened this issue Jul 31, 2020 · 5 comments · Fixed by #14301
Labels
  • bug: This issue requires a change to an existing behavior in the product in order to be resolved.
  • Client: This issue points to a problem in the data-plane of the library.
  • customer-reported: Issues that are reported by GitHub users external to the Azure organization.
  • Service Attention: Workflow: This issue is responsible by Azure service team.
  • Storage: Storage Service (Queues, Blobs, Files)
Milestone: Backlog

Comments


andre-ss6 commented Jul 31, 2020

Describe the bug
Apparently similar to #9212 (see comment - I'm opening another issue as that one has already been closed).

If you call BlobContainerClient.UploadBlobAsync(string, Stream) or BlockBlobClient.UploadAsync(Stream, ...) and pass a MemoryStream (I didn't test with any other type of Stream) whose Position is anything other than 0, a timeout occurs.

Expected behavior
These are possible, acceptable behaviors (a sketch of the seekable case follows the list):

If the stream is seekable:

  1. An exception is thrown with an informative message (instead of a timeout, as currently)
  2. The stream is uploaded from Position until the end

And if Position == Length:

  1. The position is reset to 0 before uploading
  2. The method returns immediately with a null Response<T>.Value

If the stream is not seekable:

  1. If no more data is coming from the stream, a timeout exception is acceptable, but a more informative message would help, ideally noting that the timeout occurred while waiting for data from the provided stream. As it stands, the error looks like a timeout while uploading to or communicating with the Azure servers. I'm not sure whether that change would belong in this library or deeper down the dependency chain (maybe HttpClient?).
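For concreteness, the seekable-stream behaviors above (throw early with a clear message, or upload from Position to the end) could look roughly like the caller-side sketch below. `UploadFromCurrentPositionAsync` is a hypothetical helper written only to illustrate the suggestion; it is not an SDK API, and the SDK team may well implement the validation differently inside the client.

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Azure;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    static class BlobUploadExtensions
    {
        // Hypothetical helper: validates the stream position up front and uploads
        // only the bytes from Position to the end, instead of letting the call
        // stall waiting for bytes that will never arrive.
        public static async Task<Response<BlobContentInfo>> UploadFromCurrentPositionAsync(
            this BlobContainerClient container, string blobName, Stream content)
        {
            if (!content.CanSeek)
                throw new ArgumentException("Stream must be seekable for this helper.", nameof(content));

            if (content.Position == content.Length)
                throw new ArgumentException(
                    "Stream is positioned at its end; there is nothing left to upload.", nameof(content));

            // Copy the remaining bytes so the SDK sees a stream whose Position is 0
            // and whose Length matches the data actually available.
            var remaining = new MemoryStream();
            await content.CopyToAsync(remaining);
            remaining.Position = 0;

            return await container.UploadBlobAsync(blobName, remaining);
        }
    }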

Actual behavior

System.Threading.Tasks.TaskCanceledException: The operation was canceled.
 ---> System.IO.IOException: Unable to read data from the transport connection: The I/O operation has been aborted because of either a thread exit or an application request..
 ---> System.Net.Sockets.SocketException (995): The I/O operation has been aborted because of either a thread exit or an application request.
   --- End of inner exception stack trace ---
   at System.Net.Sockets.Socket.AwaitableSocketAsyncEventArgs.ThrowException(SocketError error, CancellationToken cancellationToken)
   at System.Net.Sockets.Socket.AwaitableSocketAsyncEventArgs.GetResult(Int16 token)
   at System.Net.Security.SslStream.<FillBufferAsync>g__InternalFillBufferAsync|215_0[TReadAdapter](TReadAdapter adap, ValueTask`1 task, Int32 min, Int32 initial)
   at System.Net.Security.SslStream.ReadAsyncInternal[TReadAdapter](TReadAdapter adapter, Memory`1 buffer)
   at System.Net.Http.HttpConnection.FillAsync()
   at System.Net.Http.HttpConnection.ReadNextResponseHeaderLineAsync(Boolean foldedHeadersAllowed)
   at System.Net.Http.HttpConnection.SendAsyncCore(HttpRequestMessage request, CancellationToken cancellationToken)
   --- End of inner exception stack trace ---
   at System.Net.Http.HttpConnection.SendAsyncCore(HttpRequestMessage request, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnectionPool.SendWithNtConnectionAuthAsync(HttpConnection connection, HttpRequestMessage request, Boolean doRequestAuth, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnectionPool.SendWithRetryAsync(HttpRequestMessage request, Boolean doRequestAuth, CancellationToken cancellationToken)
   at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
   at System.Net.Http.HttpClient.FinishSendAsyncUnbuffered(Task`1 sendTask, HttpRequestMessage request, CancellationTokenSource cts, Boolean disposeCts)
   at Azure.Core.Pipeline.HttpClientTransport.ProcessAsync(HttpMessage message)
   at Azure.Core.Pipeline.RequestActivityPolicy.ProcessNextAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean isAsync)
   at Azure.Core.Pipeline.RequestActivityPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean isAsync)
   at Azure.Core.Pipeline.ResponseBodyPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)
   at Azure.Core.Pipeline.ResponseBodyPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline)
   at Azure.Core.Pipeline.LoggingPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)
   at Azure.Core.Pipeline.LoggingPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline)
   at Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline)
   at Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline)
   at Azure.Core.Pipeline.RetryPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)
   at Azure.Core.Pipeline.RetryPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)
   at Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline)
   at Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline)
   at Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline)
   at Azure.Storage.Blobs.BlobRestClient.BlockBlob.UploadAsync(ClientDiagnostics clientDiagnostics, HttpPipeline pipeline, Uri resourceUri, Stream body, Int64 contentLength, String version, Nullable`1 timeout, Byte[] transactionalContentHash, String blobContentType, String blobContentEncoding, String blobContentLanguage, Byte[] blobContentHash, String blobCacheControl, IDictionary`2 metadata, String leaseId, String blobContentDisposition, String encryptionKey, String encryptionKeySha256, Nullable`1 encryptionAlgorithm, String encryptionScope, Nullable`1 tier, Nullable`1 ifModifiedSince, Nullable`1 ifUnmodifiedSince, Nullable`1 ifMatch, Nullable`1 ifNoneMatch, String requestId, Boolean async, String operationName, CancellationToken cancellationToken)
   at Azure.Storage.Blobs.Specialized.BlockBlobClient.UploadInternal(Stream content, BlobHttpHeaders blobHttpHeaders, IDictionary`2 metadata, BlobRequestConditions conditions, Nullable`1 accessTier, IProgress`1 progressHandler, String operationName, Boolean async, CancellationToken cancellationToken)
   at Azure.Storage.Blobs.PartitionedUploader.UploadAsync(Stream content, BlobHttpHeaders blobHttpHeaders, IDictionary`2 metadata, BlobRequestConditions conditions, IProgress`1 progressHandler, Nullable`1 accessTier, CancellationToken cancellationToken)
   at Azure.Storage.Blobs.Specialized.BlockBlobClient.UploadAsync(Stream content, BlobHttpHeaders httpHeaders, IDictionary`2 metadata, BlobRequestConditions conditions, Nullable`1 accessTier, IProgress`1 progressHandler, CancellationToken cancellationToken)
   at [our code]

To Reproduce

        // Requires: using System.IO; using System.Text; using System.Threading.Tasks; using Azure.Storage.Blobs;
        static async Task UploadBlob(string blobName)
        {
            string connectionString = "";   // storage account connection string
            string containerName = "";      // existing container name
            BlobContainerClient containerClient = new BlobContainerClient(connectionString, containerName);

            var content = new MemoryStream(Encoding.UTF8.GetBytes("Tester"));
            content.Position = 2; // any position other than 0 reproduces the timeout
            _ = await containerClient.UploadBlobAsync(blobName, content);
        }
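As a caller-side workaround in the meantime (a sketch against the repro above; it relies only on documented Stream and BlobContainerClient behavior), rewinding the stream before the call avoids the hang:

    // Same setup as the repro above.
    var content = new MemoryStream(Encoding.UTF8.GetBytes("Tester"));
    content.Position = 2;

    // Workaround: rewind before uploading (or, if only the remaining bytes are
    // wanted, copy them into a fresh MemoryStream positioned at 0 first).
    content.Position = 0;

    _ = await containerClient.UploadBlobAsync(blobName, content);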

Environment:

  • Azure.Storage.Blobs Version="12.4.4"
  • dotnet --info:
.NET SDK (reflecting any global.json):
 Version:   5.0.100-preview.8.20363.2
 Commit:    48ed6f10db

Runtime Environment:
 OS Name:     Windows
 OS Version:  10.0.19041
 OS Platform: Windows
 RID:         win10-x64
 Base Path:   C:\Program Files\dotnet\sdk\5.0.100-preview.8.20363.2\

Host (useful for support):
  Version: 5.0.0-preview.8.20361.2
  Commit:  f37dd6fc85

.NET SDKs installed:
  2.1.602 [C:\Program Files\dotnet\sdk]
  2.1.700 [C:\Program Files\dotnet\sdk]
  2.2.102 [C:\Program Files\dotnet\sdk]
  3.1.302 [C:\Program Files\dotnet\sdk]
  5.0.100-preview.8.20363.2 [C:\Program Files\dotnet\sdk]

.NET runtimes installed:
  Microsoft.AspNetCore.All 2.1.2 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.All]
  Microsoft.AspNetCore.All 2.1.9 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.All]
  Microsoft.AspNetCore.All 2.1.11 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.All]
  Microsoft.AspNetCore.All 2.1.20 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.All]
  Microsoft.AspNetCore.All 2.2.1 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.All]
  Microsoft.AspNetCore.App 2.1.2 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
  Microsoft.AspNetCore.App 2.1.9 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
  Microsoft.AspNetCore.App 2.1.11 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
  Microsoft.AspNetCore.App 2.1.20 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
  Microsoft.AspNetCore.App 2.2.1 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
  Microsoft.AspNetCore.App 3.1.6 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
  Microsoft.AspNetCore.App 5.0.0-preview.8.20360.11 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
  Microsoft.NETCore.App 2.1.9 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  Microsoft.NETCore.App 2.1.11 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  Microsoft.NETCore.App 2.1.20 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  Microsoft.NETCore.App 2.2.1 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  Microsoft.NETCore.App 3.1.6 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  Microsoft.NETCore.App 5.0.0-preview.8.20361.2 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
  Microsoft.WindowsDesktop.App 3.1.6 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App]
  Microsoft.WindowsDesktop.App 5.0.0-preview.8.20361.2 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App]
  • IDE and version: Visual Studio 16.6.4

I might want to fix this 😊 (btw, why is this repo in the gigabytes? 😐)

@ghost ghost added needs-triage Workflow: This is a new issue that needs to be triaged to the appropriate team. customer-reported Issues that are reported by GitHub users external to the Azure organization. question The issue doesn't require a change to the product in order to be resolved. Most issues start as that labels Jul 31, 2020
@jsquire jsquire added Client This issue points to a problem in the data-plane of the library. needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team Service Attention Workflow: This issue is responsible by Azure service team. Storage Storage Service (Queues, Blobs, Files) labels Jul 31, 2020
@ghost ghost removed the needs-triage Workflow: This is a new issue that needs to be triaged to the appropriate team. label Jul 31, 2020

ghost commented Jul 31, 2020

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @xgithubtriage.


jsquire commented Jul 31, 2020

Tagging and routing to the team best able to assist.


amnguye commented Aug 3, 2020

Hi,

I believe this is something @seanmcc-msft is working on with OpenRead and OpenWrite, so this might be a duplicate of #9607?

andre-ss6 (Author) commented

Hmm, I'm sorry if I'm misunderstanding, but isn't that issue about creating two new APIs? This issue is about a bug in the current APIs. I'm not requesting any new features, but rather a correction to current behavior. Again, sorry if I'm misunderstanding.

seanmcc-msft (Member) commented

Hi @andre-ss6, I think you are right, this might be a bug in the current API. We'll take a look and investigate!

@amnguye amnguye added bug This issue requires a change to an existing behavior in the product in order to be resolved. and removed question The issue doesn't require a change to the product in order to be resolved. Most issues start as that labels Aug 3, 2020
@jsquire jsquire removed the needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team label Aug 10, 2020
@jsquire jsquire added this to the Backlog milestone Aug 10, 2020
@github-actions github-actions bot locked and limited conversation to collaborators Mar 28, 2023