System.OutOfMemoryException when checking an entire hard drive #13

Open
wdormann opened this issue Jul 24, 2018 · 3 comments
@wdormann

It seems reasonable to do a scan of the executable code on an entire system. To do this, I ran:
Get-PESecurity -Directory 'C:' -Recursive > pesecurity.log

Eventually the script started throwing errors:

Exception calling "ReadAllBytes" with "1" argument(s): "Exception of type 'System.OutOfMemoryException' was thrown."
At C:\Users\test_user\Get-PESecurity.psm1:399 char:5
+     $FileByteArray = [IO.File]::ReadAllBytes($CurrentFile)
+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : OutOfMemoryException

I'm not familiar enough with PowerShell to know where the problem is, but just watching the memory usage of PowerShell during a recursive directory scan, it mostly only increases. Ideally, I'd think, as it crawls through the directories it would analyze one binary at a time, output the results, and release any allocated memory/objects for each one, roughly the pattern sketched below. The current behavior never actually gets to the point of outputting any results, because it runs out of memory before it gets that far.
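For illustration, something like this is what I have in mind (Test-PEFile is just a hypothetical stand-in for whatever per-file analysis the module already does, and the CSV path is arbitrary):

    Get-ChildItem -Path 'C:\' -Recurse -Include '*.exe','*.dll' -ErrorAction SilentlyContinue |
    ForEach-Object {
        # hypothetical per-file check standing in for the module's analysis
        $Result = Test-PEFile -File $_.FullName
        # write the row out immediately instead of adding it to an in-memory table
        $Result | Export-Csv -Path 'pesecurity.csv' -Append -NoTypeInformation
        # $Result falls out of scope here, so its memory can be reclaimed
    }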

@egru
Member

egru commented Jul 24, 2018

You're correct. It scans the files one by one, adds the info to a data table, and holds all of that in memory until it's done. Not the most efficient thing in the world. I can try to refactor it so it's more efficient in its file handling; I honestly never needed to scan that many files, so it's never come up before. However, the error you're getting indicates that it may be trying to read a file that's too big. That can probably be fixed rather easily by using FileStreams instead of reading everything in at once, along the lines of the sketch below.
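As a rough sketch (the 4096-byte size is just an illustrative guess at how much header data is needed):

    # Read only the first chunk of the file through a FileStream instead of
    # [IO.File]::ReadAllBytes, which loads the entire file into memory.
    $HeaderSize = 4096
    $Stream = [IO.File]::OpenRead($CurrentFile)
    try {
        $Length = [Math]::Min([long]$HeaderSize, $Stream.Length)
        $Buffer = New-Object byte[] $Length
        $BytesRead = $Stream.Read($Buffer, 0, $Buffer.Length)
        # parse the PE headers out of $Buffer rather than a whole-file byte array
    }
    finally {
        $Stream.Dispose()
    }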

@egru egru self-assigned this Jul 24, 2018
@aaronhudon
Contributor

See my PR #15

@eabase

eabase commented Oct 25, 2020

I guess this issue can be closed if the PR was merged?
Alternatively, provide an option to output the table line by line to a log file, so nothing needs to be kept in memory, roughly like the sketch below.
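For example, assuming the module were changed to emit one result object per file to the pipeline instead of returning the whole table at the end (the CSV path is arbitrary):

    Get-PESecurity -Directory 'C:' -Recursive |
    ForEach-Object {
        # append each row as soon as it is produced, so the full table
        # never has to be held in memory
        $_ | Export-Csv -Path 'pesecurity.csv' -Append -NoTypeInformation
    }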
