PowerShell Community Management Pack in action - Performance Collection Rule

Graham Davies

Technical Product Manager – SCOM products

Welcome to the second part of a series of posts on using the PowerShell Community Management Pack for SCOM to enhance your SCOM monitoring.

In the first part of this series, I walked through using a PowerShell script to check the number of files in a remote share and then tweaked the code to allow for the use of a Run As Account.

Pre-requisites

If you want to follow through the series, then please make sure you have completed Part 1 - PowerShell Community Management Pack in Action - Monitors and Run As Accounts - SquaredUp DS.

Let's go

This is the script that we'll be using.

param([string]$Arguments)

$ScomAPI = New-Object -ComObject "MOM.ScriptAPI"
$PropertyBag = $ScomAPI.CreatePropertyBag()

$RemoteFolderPath = "\\SCOM-MS2\ImageStore"
$State = "Unknown"
$FileCount = 0
$Threshold = 5
$whoami = whoami.exe

#####
#
# This script needs permissions set on Everyone = Read on the share, otherwise it
# will return a file count of 0. Test-Path will validate that the share exists but
# won't have permissions to count files (and won't error).
#
#####

Try {
    If (Test-Path $RemoteFolderPath) {
        $FileCount = (Get-ChildItem $RemoteFolderPath -File | Measure-Object).Count
        if ($FileCount -lt $Threshold) {
            $State = "UnderThreshold"
        }
        else {
            $State = "OverThreshold"
        }
    }
}
finally {
    # Properties of our alert
    $ScomAPI.LogScriptEvent("RemoteFolderCheck.ps1", 9999, 2, "Run As = " + $whoami + ", Remote Folder Path = " + $RemoteFolderPath + ", State = " + $State + ", FileCount = " + $FileCount + ", Threshold = " + $Threshold)

    $PropertyBag.AddValue("FileCount", $FileCount)
    $PropertyBag.AddValue("Threshold", $Threshold)
    $PropertyBag.AddValue("RemoteFolderPath", $RemoteFolderPath)
    $PropertyBag.AddValue("WhoamI", $whoami)

    # Property Bag for Health State
    $PropertyBag.AddValue("State", $State)

    # Return $PropertyBag to SCOM
    $PropertyBag
}
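
If you want to sanity-check the script before wiring it into SCOM, you can run it directly on a machine that can reach the share. A minimal sketch, assuming the script above is saved locally as RemoteFolderCheck.ps1:

.\RemoteFolderCheck.ps1

# To see the property bag contents while testing, you can temporarily replace
# the script's last line with the MOM.ScriptAPI echo:
#     $ScomAPI.Return($PropertyBag)
# Take it back out afterwards - in the real workflow, SCOM consumes the
# property bag object that the script already emits.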

It is important that the script we use for the Performance Collection Rule is identical to the script we used for the monitor, with identical parameters. SCOM will detect that it is the same script and run it just once, using the output for both the monitor and the performance collection rule (this is Cookdown in action).

Cookdown allows SCOM to scale without impacting the monitored servers: the script runs just once and its output feeds all of the workflows that share it. Remember, though - the script and all of its input parameters must be identical.

I'll start in the Authoring tab of the SCOM console, under Authoring > Management Pack Objects > Rules. Right-click Rules and select Create a new rule > PowerShell Script Performance Collection Rule (Community), as per the screenshot below.

And make sure to save it in the same Management Pack as our previously created monitor.

I've given the performance collection rule a name, ensured the category is "Performance Collection" and targeted the same class as in part 1. You must target the same class if you want Cookdown to work. In my example, I'm using Windows Server 2016 and above Operating System.
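
Once the rule and the monitor are both saved, a quick sanity check from the OperationsManager PowerShell module can confirm that the two workflows share a target class. This is a sketch only - the display names are placeholders for whatever you called your own monitor and rule:

Import-Module OperationsManager

# Placeholder display names - substitute the names of your own workflows
$rule    = Get-SCOMRule    -DisplayName "ImageStore File Count Collection Rule"
$monitor = Get-SCOMMonitor -DisplayName "ImageStore File Count Monitor"

# Cookdown requires both workflows to target the same class
$rule.Target.Id -eq $monitor.Target.Id

If that comparison returns False, Cookdown is off the table before you even look at the script.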

I have also disabled the rule and will enable it via an override later.

I have set a schedule of every 15 minutes. This has to be the same schedule as the monitor for Cookdown to work.

Then copy and paste the same script into the script window and make sure you give the script the same name as the monitor script.

In the next window (the Performance Mapper), I map the FileCount value from the script's property bag to a performance counter.
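
For reference, a typical mapping looks like the below. The Object and Counter names are labels I've chosen for illustration; the Value field uses the standard SCOM syntax for reading a value out of the property bag:

Object:   RemoteFolderCheck
Counter:  FileCount
Instance: (left empty)
Value:    $Data/Property[@Name='FileCount']$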

Result

We get alerts:

We get Performance Data:

Cookdown

What we should see in the Operations Manager event log is just one event every 15 minutes: the script running once per interval to feed both the monitor workflow and the performance collection workflow at the same time.
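
An easy way to check, assuming you kept the script's event ID of 9999, is to query the Operations Manager log on the monitored server:

# Pull the most recent events written by the script's LogScriptEvent call
Get-WinEvent -FilterHashtable @{ LogName = 'Operations Manager'; Id = 9999 } -MaxEvents 10 |
    Select-Object TimeCreated, Message

With Cookdown working, you should see a single event per 15-minute interval.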

But! Urghh. Something has gone horribly wrong. We can see the script running twice. Why?

The answer is something I have stated repeatedly: the parameters for the script must be identical. If you look at the events, you can see that one of the parameters is not the same - the account that SCOM is using to run the script:

Monitor

Rule

And we have now hit one of the limitations of SCOM and the PowerShell Community Management Pack.

Options:

  1. Accept that Cookdown won't work. This is the best option here: the script is lightweight, runs against a single instance, and only runs every 15 minutes, so there won't be a performance impact on the monitored server.
  2. Use Visual Studio \ the Visual Studio Authoring Extensions and get into the weeds of authoring.
    1. This is a steep learning curve and outside the scope of this blog but do contact me if you'd like an example. If you are targeting multi-instance objects then you'll need to use Cookdown \ Visual Studio to prevent impacting the monitored server.
    2. Another advantage of Visual Studio is that we only have the script in Visual Studio once in the underlying data source. With the PowerShell Community Management Pack, we have to copy and paste the same script into multiple rules and monitors which can be much harder to maintain.

We'll go with option 1 for now.

Cookdown ...

As a quick aside, I exported the Management Pack and deleted the Run As Property on the monitor (just from the monitor - I didn't delete the Run As Profile from the Management Pack).
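
If you want to try the same experiment, the export and re-import can be done with the OperationsManager module. A sketch, assuming an unsealed management pack named "PowerShell.Community.Demo" (substitute your own MP name), with the Run As reference removed from the monitor's data source in the exported XML:

Import-Module OperationsManager

# Export the unsealed management pack to XML (the MP name is a placeholder)
$mp = Get-SCOMManagementPack -Name "PowerShell.Community.Demo"
Export-SCOMManagementPack -ManagementPack $mp -Path "C:\Temp"

# Edit C:\Temp\PowerShell.Community.Demo.xml to remove the RunAs reference from
# the monitor's data source, then re-import the edited pack
Import-SCOMManagementPack -FullName "C:\Temp\PowerShell.Community.Demo.xml"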

And notice that the Run As account specified in the alert matches the Run As Account in the Operations Manager event log event.

Moral of the story

The PowerShell Community Management Pack supports Cookdown but every single parameter of the rule(s) \ monitor(s) that use the script must be identical.
