
Strange issue with PowerShell using up huge amounts of memory.


I've got a very weird issue happening. I've got a PowerShell script that loads a list of servers from a CSV file and then uses Start-Job to create a sub-thread for each server to monitor performance counters. The problem is that the main thread that starts the jobs keeps using more and more memory. The system pages nearly all of it out, but within about 30 hours the server has to be restarted because all of its memory is used up.

Here is the launch code. 

 

$MyPath = $MyInvocation.MyCommand.Path | Split-Path
. $MyPath'\!PerfLogger_Include.ps1'
try { get-job | remove-job -force } catch {}

Createfolder($MinFolder)

# Get all the computers from the monitor list.
$ServersToMonitor = import-csv $MyPath'\MonitorList.csv'

# Launch all threads.
$ServersToMonitor | % {
    if ($_.Name -ne $Null) {
        $silent = start-job -FilePath .\PerfLogger-Thread.ps1 -ArgumentList $_.Name.tostring(), $_.ServerType.tostring(), $MyPath, $_.MinutesToSample, $_.SamplesPerMin
        #start-job -ScriptBlock {PerfLoggerThread($_.Name,$_.ServerType)
        add-content $LogFile "$(Date)-Starting thread for $_.Name.tostring() - Launcher"
    }
}

# Keep shell open while jobs are running.
# get-job | wait-job    #### Seems to create a memory leak.
$loop = "Indefinitly"
do {
    start-sleep -m 1440
} while ($loop -ne $null)


This is the thread script that makes up the sub-jobs.

Param([string]$Computer,[string]$ServerType,$MyPath,[int]$Minutes,[int]$SamplesPerMin)
. $MyPath'\!PerfLogger_Include.ps1'

#Get Counters to monitor
add-content $LogFile "$(Date)-Getting Counters for $Computer"
[int]$Samples = $Minutes * $SamplesPerMin
[int]$SamplesInterval = ($Minutes * 60)/($Samples)
$myCounters = GetPerfCounters -computer $Computer -ServerType $ServerType

do {
    # Gather performance data from the machine (SampleInterval sets the time between samples; MaxSamples sets the total samples to take before returning).
    try {
        $perf = Get-Counter -Counter $myCounters -ComputerName $Computer -SampleInterval $SamplesInterval -MaxSamples $Samples
    } catch {
        add-content $LogFile "$(Date)-!!!!!!!!!! PerfData from $Computer was not returned. $ERROR"
    }

    # Add performance counters to log file.
    $StoreLogs = $MinFolder + '\' + $Computer + "-%%$(Get-Date -f yyyy-MM-dd-HH)%%"
    Createfolder($StoreLogs)
    try {
        #add-content $LogFile "Saving PerfData for $Computer"
        $perf | export-counter -path $StoreLogs\$Computer'-'$(Get-Date -f yyyy-MM-dd-HH-m-s).blg -FileFormat blg -force
    } catch {
        add-content $LogFile "$(Date)-!!!!!!!!!! Could Not save Perfdata for $Computer. $ERROR"
    }
} while ($i -ne 1)


The main script is launched as a scheduled task at server boot. Once the threads are up it starts out using about 130 MB of RAM, but it keeps growing over time; the last time I shut it down it was at 7 GB. I can't figure out for the life of me why it does this, because it isn't doing anything. I thought perhaps the get-job | wait-job was the culprit, so I replaced it with an infinite loop with a one-day sleep. That didn't help; it's still the same issue.
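In case it matters, one thing I've been meaning to try is draining the job output inside that keep-alive loop instead of only sleeping, on the guess that the parent session is buffering everything the child jobs write. This is just a rough, untested sketch (the five-minute interval is arbitrary):

# Untested variant of the keep-alive loop: pull output out of the child
# jobs each pass and throw it away, and clean up any jobs that have
# finished, instead of just sleeping.
do {
    # Receive (and discard) whatever the jobs have written so far.
    Get-Job | Receive-Job -ErrorAction SilentlyContinue | Out-Null

    # Remove jobs that are no longer running so their state can be released.
    Get-Job -State Completed | Remove-Job -Force
    Get-Job -State Failed    | Remove-Job -Force

    Start-Sleep -Seconds 300
} while ($true)

I don't know yet whether that actually stops the growth; it's only a guess at where the memory is going.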

I'd just launch the Start-Job calls and let the launcher close, but it seems that if I do that, all of the job threads are closed along with the launching script.
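The only other idea I've been toying with (untested, and it assumes PerfLogger-Thread.ps1 sits in $MyPath and takes the same positional parameters as above) is to skip Start-Job entirely and launch each monitor as its own hidden powershell.exe process, so nothing depends on the launching script staying open:

# Untested sketch: launch each monitor as a detached powershell.exe
# process instead of a background job, so the launcher can exit afterwards.
$ServersToMonitor | ForEach-Object {
    if ($_.Name -ne $null) {
        $argList = @(
            '-NoProfile'
            '-File', "$MyPath\PerfLogger-Thread.ps1"
            $_.Name, $_.ServerType, $MyPath, $_.MinutesToSample, $_.SamplesPerMin
        )
        Start-Process -FilePath 'powershell.exe' -ArgumentList $argList -WindowStyle Hidden
    }
}

I haven't tested whether the thread script behaves any differently when it isn't running as a job, though.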

 

Does anyone know why this is happening, or whether there is a way to keep the job threads active after the launching script has closed?
