JohnnySlalom-1035 asked YoungYang-MSFT answered

PowerShell script consumes all memory and crashes server

I have a PowerShell script that gets a list of files in a folder and all its subfolders, then reads them one by one, checking each file for certain invalid values. This script ran fine for many years on a Windows Server 2008 machine with 8 GB of memory. Sorry, I don't remember the version of PS that was running then, but I assume it was 3.0.
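The script is essentially this pattern (a simplified sketch -- the folder path and the invalid-value check here are placeholders, not the real values):

```powershell
# Simplified sketch; 'D:\Data' and the pattern are placeholders
$files = Get-ChildItem -Path 'D:\Data' -Recurse -File   # collects the whole file list up front
foreach ($file in $files) {
    $content = Get-Content -Path $file.FullName -Raw    # read the entire file as one string
    if ($content -match 'INVALID_VALUE') {              # placeholder for the real check
        Write-Output "Invalid value found in $($file.FullName)"
    }
}
```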

This server was recently upgraded to Windows Server 2012 R2 with an additional 8 GB of memory, and is now running PS version 4.0. Ever since the upgrade, the script crashes the server by eating more and more memory until it is all exhausted. There are usually 300,000+ files to examine each time the script runs, ranging in size from a few KB up to around 300 MB each. This is the warning I see in Event Viewer after the crash and reboot:

Windows successfully diagnosed a low virtual memory condition. The following programs consumed the most virtual memory: powershell.exe (5964) consumed 66568437760 bytes, splunkd.exe (1900) consumed 263499776 bytes, and svchost.exe (1852) consumed 83922944 bytes.

What could be the difference between PS 3.0 and 4.0 that could cause this, or is it something with the OS?

As for these old OS versions, this is a legacy server that will be going away by the end of the year, but I have to keep nursing it and keep it alive until then. We only upgraded to 2012 as Win 2008 is considered a big security risk and the rest of the old software won't run on anything above Win 2012.

Thanks for any help with this issue.

windows-server-powershell

RichMatheisen-8856 answered

Since you've changed both the O/S and PowerShell it's kinda hard to say!

However, if you're creating an array of 300K files and then processing the contents of the array, simply piping the results of (I assume) Get-ChildItem into ForEach-Object (if that's what you're using) might cure the problem.

OTOH, if you're using something like ForEach ($file in $array) {...} you'd have to change it to $array | ForEach-Object {...} and either change the existing references to the $file variable to $_, or make the first statement in the ForEach-Object block something like $file = $_, allowing you to avoid changing possibly multiple lines of code.
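In other words, something like this (a sketch -- the path and the processing step are placeholders):

```powershell
# Before: builds the entire 300K-item array in memory first
$array = Get-ChildItem -Path 'D:\Data' -Recurse -File
ForEach ($file in $array) {
    # ... process $file ...
}

# After: streams one file object at a time down the pipeline
Get-ChildItem -Path 'D:\Data' -Recurse -File | ForEach-Object {
    $file = $_    # lets the rest of the block keep using $file unchanged
    # ... process $file ...
}
```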


JohnnySlalom-1035 answered

Thanks for the reply, but I got this fixed.

The first thing I tried was [System.GC]::Collect(). I read a lot about this and was amazed at all the differing information/opinions on whether to use it or not. In the loop, I kept a counter of files read and called Collect every 1000 files. Unfortunately, this had no effect at all.
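Roughly what I tried (a sketch; the loop body stands in for the real file check):

```powershell
$counter = 0
foreach ($file in $files) {
    # ... read the file and check for invalid values ...
    $counter++
    if ($counter % 1000 -eq 0) {
        [System.GC]::Collect()   # force a garbage collection every 1000 files
    }
}
```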

I then just upgraded PowerShell to v5.1 from this link:
https://download.microsoft.com/download/6/F/5/6F5FF66C-6775-42B0-86C4-47D41F2DA187/Win8.1AndW2K12R2-KB3191564-x64.msu

Now everything works as before. I have to assume it was something to do with PS v4, perhaps even a memory leak, as it works perfectly in v3 and v5.1.


RichMatheisen-8856 answered

If you'd accumulated 300K file objects in an array and then begun processing them, doing a garbage collection wouldn't have much of an effect. Doing a garbage collection after releasing the array (e.g. $array = $null, or $array = @()) might be beneficial, but only if there's a lot going on in the script after you're done with the array. Also possible is removing each item from the array as soon as it's processed in the loop. After removing, say, 1000 items from the array you'd release some memory -- but at the expense of resizing the array repeatedly, and that would also consume memory and significant time.
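That is, a collection only helps once the big reference is actually released, e.g.:

```powershell
# ... once the script is done with the array ...
$array = $null            # or: $array = @() -- drop the only reference to it
[System.GC]::Collect()    # the 300K file objects are now eligible for collection
```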

That's one of the reasons I suggested piping the list of files directly into the loop rather than storing the results in an array. There's no large amount of storage consumed by the array.

I was going to suggest upgrading to release 5.1, but you didn't mention if there was any dependence on a particular version of PowerShell so I assumed(!) that there might be.


YoungYang-MSFT answered

Hi, given that this post has been quiet for a while, this is a quick follow-up. Has your question been solved? If so, please mark the relevant reply as the answer so that users with the same question can find it and get help.
:)
