Conversation
Codecov Report
@@ Coverage Diff @@
## main #3280 +/- ##
==========================================
- Coverage 31.46% 31.45% -0.02%
==========================================
Files 307 307
Lines 37219 37239 +20
==========================================
Hits 11712 11712
- Misses 25507 25527 +20
Do we have tests for these changes?
Please document how you observed that the pre-cache step is successfully shortening the runtime of the first coverage iteration. It would also be interesting to test the error path for when … Finally, if there isn't already an issue for caching other debuggable modules in the …
Using the Rust example binary from onefuzz-samples, I skipped pre-populating the cache by commenting out this line (so no cache is pre-computed). This means that before this PR we spend ~13 seconds per iteration, since we don't store a cache across inputs. With pre-populating the cache, the elapsed time for the first run is 1.2 seconds: from ~13 seconds down to 1.2, over 10 times faster.
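
Since the thread hinges on what the pre-cache step buys, here is a minimal, hypothetical Rust sketch of the idea. These are assumed names for illustration, not onefuzz's actual types or API: `ModuleCache`, `DebugInfo`, `load_debug_info`, `record_coverage`, and a 50 ms sleep standing in for debug-info parsing. Commenting out the `prepopulate` call mirrors the "no pre-computed cache" experiment above.

```rust
// Hypothetical sketch only: pre-populating a per-module debug-info cache so
// the first coverage iteration no longer pays the parsing cost.

use std::collections::HashMap;
use std::path::{Path, PathBuf};
use std::thread::sleep;
use std::time::{Duration, Instant};

/// Placeholder for parsed debug info (symbols, line tables, breakpoint sites).
struct DebugInfo;

/// Stand-in for the expensive step (e.g. parsing PDB/DWARF and computing
/// basic-block offsets); this is what dominates the first iteration.
fn load_debug_info(_module: &Path) -> DebugInfo {
    sleep(Duration::from_millis(50));
    DebugInfo
}

#[derive(Default)]
struct ModuleCache {
    entries: HashMap<PathBuf, DebugInfo>,
}

impl ModuleCache {
    /// The step discussed in this PR, sketched: load debug info for the
    /// target binary once, before the first input is ever executed.
    fn prepopulate(&mut self, target: &Path) {
        self.entries
            .entry(target.to_path_buf())
            .or_insert_with(|| load_debug_info(target));
    }

    /// Per-input lookup: a hit is cheap, a miss pays the full parsing cost.
    fn get_or_load(&mut self, module: &Path) -> &DebugInfo {
        self.entries
            .entry(module.to_path_buf())
            .or_insert_with(|| load_debug_info(module))
    }
}

/// Hypothetical per-input coverage recording that reuses the shared cache.
fn record_coverage(cache: &mut ModuleCache, target: &Path, _input: &Path) {
    let _info = cache.get_or_load(target);
    // ... run the target under the recorder and collect hit blocks ...
}

fn main() {
    let target = Path::new("fuzz_target.exe");
    let inputs = [Path::new("input-0"), Path::new("input-1")];

    let mut cache = ModuleCache::default();
    // Comment this line out to reproduce the "no pre-computed cache" timing:
    // the first iteration then absorbs the debug-info parsing cost.
    cache.prepopulate(target);

    for (i, input) in inputs.iter().enumerate() {
        let start = Instant::now();
        record_coverage(&mut cache, target, input);
        println!("iteration {i}: {:?}", start.elapsed());
    }
}
```

With `prepopulate` in place, both iterations report roughly the same small elapsed time; without it, the first iteration absorbs the parsing cost, which is the shape of the ~13 s vs. 1.2 s difference observed above.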
Summary of the Pull Request
What is this about?
Closes #3217