
FileStream.Windows useAsync WriteAsync calls blocking apis #25905

Closed
benaadams opened this issue Apr 16, 2018 · 19 comments

@benaadams
Member

e.g. SetEndOfFile in the async path accounts for 50% of the time, and it doesn't need to be called (WriteFile will extend the file).

SetEndOfFile is also a blocking api itself; so expanding the file with it blocks the thread for as much time (or more) as the async path saves by not blocking.
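For reference, the pattern being measured is roughly the following (a minimal sketch; the file path, payload size, and iteration count are illustrative, not from the repro):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class Repro
{
    static async Task Main()
    {
        byte[] data = new byte[1024]; // illustrative payload

        // useAsync: true opens the handle for overlapped I/O, so WriteAsync
        // takes the path that (per the profile) calls SetEndOfFile to grow
        // the file before each WriteFile.
        using (var fs = new FileStream("test.bin", FileMode.Append, FileAccess.Write,
                   FileShare.None, bufferSize: 4096 * 4, useAsync: true))
        {
            for (int i = 0; i < 100_000; i++)
                await fs.WriteAsync(data, 0, data.Length);
        }
    }
}
```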


Seen in electricessence/AsyncFileWriter#1 (Windows 10; netcoreapp2.0 and netcoreapp2.1)

Non-async (just for FileStream.Write)

AsyncFileWriterTester>dotnet run -c Release
100,000 bounded capacity.
Total Time: 2.3680831 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:02.6254878
------------------------

10,000 bounded capacity.
Total Time: 2.0399536 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:02.3390593
------------------------

1,000 bounded capacity.
Total Time: 2.1499109 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:02.3699012
------------------------

100 bounded capacity.
Total Time: 2.0656293 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:03.0224013
------------------------

Press ENTER to continue.

Making the change electricessence/AsyncFileWriter@752f7ef

useAsync = true and fs.Write( => await fs.WriteAsync(

Causes a dramatic slow down:

AsyncFileWriterTester>dotnet run -c Release
100,000 bounded capacity.
Total Time: 4.1238278 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:14.3291113
------------------------

10,000 bounded capacity.
Total Time: 4.370184 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:19.3693952
------------------------

1,000 bounded capacity.
Total Time: 4.1362524 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:19.0104628
------------------------

100 bounded capacity.
Total Time: 4.2608508 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:20.4615956
------------------------

Press ENTER to continue.

Seems to use a lot of threads, though maybe that's Channels
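For clarity, the change above amounts to roughly this (a sketch; variable names are illustrative):

```csharp
// Before: synchronous handle, blocking write
using (var fs = new FileStream(filePath, FileMode.Append, FileAccess.Write,
           FileShare.None, bufferSize: 4096 * 4, useAsync: false))
{
    fs.Write(bytes, 0, bytes.Length);
}

// After: overlapped handle, awaited write
using (var fs = new FileStream(filePath, FileMode.Append, FileAccess.Write,
           FileShare.None, bufferSize: 4096 * 4, useAsync: true))
{
    await fs.WriteAsync(bytes, 0, bytes.Length);
}
```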

@benaadams
Member Author

useAsync = false and await fs.WriteAsync

AsyncFileWriterTester>dotnet run -c Release
100,000 bounded capacity.
Total Time: 4.732445 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:20.8928633
------------------------

10,000 bounded capacity.
Total Time: 4.5554379 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:22.0848356
------------------------

1,000 bounded capacity.
Total Time: 4.7196995 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:21.6726637
------------------------

100 bounded capacity.
Total Time: 4.4949652 seconds
Total Bytes: 115,888,890
Total Wait Time: 00:00:21.4277106
------------------------

Press ENTER to continue.

@benaadams
Member Author

useAsync = false and await fs.WriteAsync

netcoreapp2.0
[profiler screenshot]

netcoreapp2.1
[profiler screenshot]

[profiler screenshot]

@benaadams
Member Author

benaadams commented Apr 16, 2018

netcoreapp2.1 useAsync = true

[profiler screenshot]

I was expecting FileStreamCompletionSource in the allocations from WriteAsyncInternalCore but don't see it? Is it going via the Stream base class?

@stephentoub
Member

stephentoub commented Apr 16, 2018

Is this using a FileStream or is it using something derived from FileStream?

@stephentoub
Member

stephentoub commented Apr 16, 2018

async FileStream Writing is x2 worse in perf than non-async

It's got more overhead and scales better; that's the tradeoff.

We could look at using an IValueTaskSource to back the new ValueTask-returning overload, but I'm afraid that ship may have sailed at this point, as it's an observable change and we're locking down the 2.1 release.
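For context, backing a ValueTask-returning overload with an IValueTaskSource means reusing one completion object across calls instead of allocating a Task per write. A minimal sketch using ManualResetValueTaskSourceCore&lt;T&gt; (which shipped later, in .NET Core 3.0; this is illustrative, not the actual FileStream implementation):

```csharp
using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Sources;

// Reusable completion source: one instance can back many sequential
// ValueTask<int> operations without a Task allocation per call.
sealed class PooledWriteSource : IValueTaskSource<int>
{
    private ManualResetValueTaskSourceCore<int> _core;

    // Hand out a ValueTask tied to the current version of the core.
    public ValueTask<int> AsValueTask() => new ValueTask<int>(this, _core.Version);

    // Called when the underlying (e.g. overlapped) operation finishes.
    public void Complete(int bytesWritten) => _core.SetResult(bytesWritten);

    public int GetResult(short token)
    {
        int result = _core.GetResult(token);
        _core.Reset(); // ready for reuse by the next operation
        return result;
    }

    public ValueTaskSourceStatus GetStatus(short token) => _core.GetStatus(token);

    public void OnCompleted(Action<object?> continuation, object? state, short token,
        ValueTaskSourceOnCompletedFlags flags)
        => _core.OnCompleted(continuation, state, token, flags);
}
```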

@electricessence

electricessence commented Apr 16, 2018

https://stackoverflow.com/questions/49827208/ado-async-all-the-way-down-the-tubes/49837693#49837693

Iterate 1,000,000 times .Write(bytes)

Total Elapsed Time: 0.1750803 seconds

Iterate 1,000,000 times .WriteAsync(bytes)

Total Elapsed Time: 3.1865426 seconds

@electricessence

electricessence commented Apr 16, 2018

And to be clear, the times Ben listed are not correct. The performance is much worse with async.

@stephentoub:

It's got more overhead and scales better; that's the tradeoff.

That's what I've been thinking: when quick-iterating, it's just not worth the async overhead per write, especially if you've already got your data queued up.
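One mitigation along those lines is to batch queued data into a single buffer and pay the awaited-write overhead once per batch rather than once per item (a sketch; the type and method names are illustrative, not from AsyncFileWriter):

```csharp
using System.IO;
using System.Threading.Channels;
using System.Threading.Tasks;

static class BatchedWriter
{
    // Drain whatever is currently queued, coalesce it in memory,
    // then issue one awaited write for the whole batch.
    public static async Task WriteBatchedAsync(FileStream fs, ChannelReader<byte[]> reader)
    {
        using var batch = new MemoryStream();
        while (await reader.WaitToReadAsync())
        {
            batch.SetLength(0);
            while (reader.TryRead(out var chunk))
                batch.Write(chunk, 0, chunk.Length); // synchronous, in-memory

            // One awaited write per batch instead of one per queued chunk.
            await fs.WriteAsync(batch.GetBuffer(), 0, (int)batch.Length);
        }
    }
}
```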

@benaadams
Member Author

Is this using a FileStream or is it using something derived from FileStream?

FileStream directly:

using (var fs = new FileStream(FilePath, FileMode.Append, 
                      FileAccess.Write, FileShareMode, bufferSize: 4096 * 4, useAsync: true))

The overloads weren't looking right, as they didn't go through ReadOnlyMemory<byte>. Built self-contained and copied over the coreclr test wrapper, and the allocations look more like expected; but I had built incorrectly for useAsync for the profile.

[profiler screenshot]

@benaadams
Member Author

benaadams commented Apr 16, 2018

Closing for now; was concerned it wasn't taking the overlapped path, but that seems not to be the case; it was a build mix-up, probably from switching runtimes.

Will open a new one if I get a smaller repro for perf.

@benaadams benaadams reopened this Apr 16, 2018
@benaadams
Member Author

benaadams commented Apr 16, 2018

So WriteCore only calls VerifyOSHandlePosition() whereas WriteAsyncInternalCore makes far more syscalls (including VerifyOSHandlePosition())

With SetEndOfFile looking like it takes the same time as WriteFileNative
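Roughly, the two paths compare like this (reconstructed from the profile, not the actual source):

```
Blocking path (WriteCore):
    VerifyOSHandlePosition()
    WriteFile(...)                  // extends the file as needed

Overlapped path (WriteAsyncInternalCore):
    VerifyOSHandlePosition()
    ... additional length/position syscalls ...
    SetEndOfFile(...)               // blocking, comparable in cost to the
                                    // write itself, and redundant: WriteFile
                                    // already extends the file
    WriteFile(...) (overlapped)
```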

async FileStream, async WriteAsync

[profiler screenshot]

@benaadams
Member Author

Non-async FileStream, blocking Write
[profiler screenshot]

@stephentoub
Member

@benaadams, I don't understand what the action item is meant to be on this issue.

@benaadams
Member Author

benaadams commented Apr 16, 2018

I don't understand what the action item is meant to be on this issue.

With this one weird trick... dotnet/coreclr#17581?

@benaadams changed the title from "async FileStream Writing is x2 worse in perf than non-async" to "async FileStream Writing is considerably worse in perf than non-async" Apr 16, 2018
@benaadams
Member Author

Pre
[profiler screenshot]

Post
[profiler screenshot]

Pre

netcoreapp2.1\win-x64\publish>AsyncFileWriterTester.exe
100,000 bounded capacity.
Total Time: 9.2379032 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:00:51.9054476
------------------------

10,000 bounded capacity.
Total Time: 10.4082898 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:01:07.2506591
------------------------

1,000 bounded capacity.
Total Time: 10.5997765 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:01:10.7627665
------------------------

500 bounded capacity.
Total Time: 10.0306642 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:01:06.5325461
------------------------

100 bounded capacity.
Total Time: 10.7193698 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:01:08.3325643
------------------------

Press ENTER to continue.

Post

netcoreapp2.1\win-x64\publish>AsyncFileWriterTester.exe
100,000 bounded capacity.
Total Time: 2.2788458 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:00:04.6358343
------------------------

10,000 bounded capacity.
Total Time: 2.0491854 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:00:05.0332742
------------------------

1,000 bounded capacity.
Total Time: 1.8854354 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:00:03.7733625
------------------------

500 bounded capacity.
Total Time: 1.880664 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:00:03.7166907
------------------------

100 bounded capacity.
Total Time: 1.9010611 seconds
Total Bytes: 116,888,890
Aggregate Waiting: 00:00:03.8017758
------------------------

Press ENTER to continue.

@benaadams
Member Author

There's a second behaviour difference. WriteAsync will issue a Flush+Write if the data would overfill the buffer, whereas Write will issue a Flush and then copy into the cleared buffer (if under buffer size; otherwise a Write); so an extra Write may be creeping in. That's a more involved item to look at at some point in the future.
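Sketched out, the two buffering strategies described above behave like this (pseudocode, not the actual FileStream source):

```
Write(data) where data would overfill the internal buffer:
    Flush()                     // drain the buffer to the device
    copy data into buffer       // if data fits in the (now empty) buffer;
                                // otherwise write data directly

WriteAsync(data) where data would overfill the internal buffer:
    Flush() + Write(data)       // always issues the write directly,
                                // so an extra device write can creep in
```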

@electricessence

electricessence commented Apr 16, 2018

@benaadams Make sure you are using the latest code. Your previous code was not completing all the writes; it just cut short and made the times look strange. They are much worse. (I added an assertion that compares the total bytes written to the total bytes of the file.) A successful benchmark has to have all the bytes written.

@benaadams
Member Author

I am using the latest code; I'm also using a very fast SSD, so it would be much worse without it.

It looks like the main cause of slow down for async vs blocking is the number of extra file interactions that occur in that path; rather than it being async.

@benaadams
Member Author

e.g. SetEndOfFile in the async path is 50% of the time and it doesn't need to be called. It is also a blocking api itself, so expanding the file with it blocks the thread for as much time (or more) as the async path saves by not blocking.

@benaadams changed the title from "async FileStream Writing is considerably worse in perf than non-async" to "FileStream.Windows useAsync WriteAsync calls blocking apis" Apr 16, 2018
@msftgits msftgits transferred this issue from dotnet/corefx Jan 31, 2020
@msftgits msftgits added this to the Future milestone Jan 31, 2020
@maryamariyan added the untriaged label Feb 23, 2020
@JeremyKuhne removed the untriaged label Mar 3, 2020
@adamsitnik adamsitnik modified the milestones: Future, 6.0.0 Mar 5, 2021
@ghost added the in-pr label Mar 5, 2021
@ghost removed the in-pr label Mar 17, 2021
@adamsitnik
Member

Fixed by #49975; please expect a blog post with all the details and perf numbers in April (before we ship preview 4).

@ghost ghost locked as resolved and limited conversation to collaborators Apr 22, 2021