IIS Log Management with SCOM and SquaredUp

We'd like to introduce you to Peter West. Peter is a Server Administrator at DFDS and the newest member of our highly esteemed guest blogger team who, after posting a killer article on our Community Answers platform, has written an extended version for us as a guest blog post. Take it away Peter...


I’ve always found it a little strange that Microsoft's Internet Information Services doesn’t contain any inbuilt method for log management. It’s an omission which I’ve seen cause numerous issues over the years – so much so that whenever we see an IIS Server running low on disk space, the first thing we do is check for an accumulation of logs.

I’m relatively new to the world of SCOM, but having now spent a couple of years getting to know the product, and having more recently become familiar with SquaredUp, it seemed to be a sensible idea to improve things in this area. This blog post gives a bit of background to a couple of scripts that I’ve built to simplify the processes relating to IIS log folders and files.

Log File reporting

I needed to create a mechanism by which I could report on the amount of disk space being utilised by the IIS log folders, but I also needed it to be flexible enough to cover the differing number of websites that we see on our servers.

The resulting script, which I’ve named Get-IISLogFolderMetrics will enumerate the websites on the Server where it executes before building the path to where the log files exist. It will then gather further data such as the total size of the logs, the age of the oldest log file and the amount of free space on the volume on which the logs exist.

The above data is gathered for each of the websites on the Server before being returned in the required format. For Data on Demand this is typically CSV, which is the default return type if none is specified as a parameter.
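The core of that approach can be sketched roughly as follows. This is a simplified illustration, not the actual Get-IISLogFolderMetrics script; it assumes the WebAdministration module that ships with IIS, and the property and column names are my own.

```powershell
# Simplified sketch of the metrics-gathering approach (illustrative only;
# not the actual Get-IISLogFolderMetrics script).
Import-Module WebAdministration

$results = foreach ($site in Get-Website) {
    # Each site writes its logs to <logFile.directory>\W3SVC<site id>
    $root   = [Environment]::ExpandEnvironmentVariables($site.logFile.directory)
    $logDir = Join-Path $root "W3SVC$($site.Id)"
    if (-not (Test-Path $logDir)) { continue }   # site may never have logged

    $logs   = Get-ChildItem $logDir -Filter '*.log' -File
    $volume = Get-PSDrive -Name (Split-Path $logDir -Qualifier).TrimEnd(':')

    [PSCustomObject]@{
        SiteName      = $site.Name
        LogPath       = $logDir
        LogSizeMB     = [math]::Round(($logs | Measure-Object Length -Sum).Sum / 1MB, 2)
        OldestLogDays = if ($logs) {
                            ((Get-Date) - ($logs |
                                Sort-Object LastWriteTime |
                                Select-Object -First 1).LastWriteTime).Days
                        } else { 0 }
        FreeSpaceGB   = [math]::Round($volume.Free / 1GB, 2)
    }
}

# Data on Demand expects CSV by default
$results | ConvertTo-Csv -NoTypeInformation
```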

The creation of the Task that utilises the attached PowerShell script was made a lot simpler thanks to the Community PowerShell Management Pack. I used this to define a new task with the name ‘List IIS Log File Metrics (Data On Demand)’ and provided the task name, a description and the script itself.

Once that was done it was incredibly easy to use SquaredUp to navigate to a Server hosting an instance of IIS and choose the IIS perspective. A couple of clicks later I’d added a new ‘On-Demand Task (Grid)’ widget and populated it with the required configuration. I had considered walking through that configuration here, but it really was simple: just a case of choosing the appropriate Task from the drop-down list. Once done, all instances of IIS report the relevant information as part of the perspective.

So, there we go. With a simple and relatively short script we now have great visibility of the space being occupied by these logs. But what do we do about them when they begin to accumulate?


Log File cleanup

I had grown tired over the years of having to RDP to Servers in order to delete an accumulation of IIS log files, so writing a script to automate the process seemed a bit of a no-brainer. It also became evident that doing so and then leveraging SquaredUp would mean potentially anyone in the team could handle the task, allowing us to focus on other issues.

I therefore put together a simple script which will enumerate the websites on a specific Server, identify the log path for each, and remove logs older than a specified threshold, which defaults to 30 days.
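The cleanup logic can be sketched along these lines. Again this is an illustration rather than the actual Clean-IISLogFiles script, and it assumes the WebAdministration module; the `-WhatIf` switch is there so a first run only reports what would be deleted.

```powershell
# Illustrative sketch of the cleanup approach (not the actual
# Clean-IISLogFiles script). Removes per-site log files older than a
# threshold, defaulting to 30 days.
param([int]$RetentionDays = 30)

Import-Module WebAdministration
$cutoff = (Get-Date).AddDays(-$RetentionDays)

foreach ($site in Get-Website) {
    $root   = [Environment]::ExpandEnvironmentVariables($site.logFile.directory)
    $logDir = Join-Path $root "W3SVC$($site.Id)"
    if (-not (Test-Path $logDir)) { continue }   # handle a missing log folder

    Get-ChildItem $logDir -Filter '*.log' -File |
        Where-Object { $_.LastWriteTime -lt $cutoff } |
        Remove-Item -WhatIf   # remove -WhatIf once happy with the output
}
```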

However, we also wanted a method by which those responsible for the Servers could have some control. Perhaps they had a specific website whose logs they wanted retained for a longer period; with this in mind we came up with a suitable solution.

The script will look in the base path containing the log sub-folders for the presence of a JSON file named ‘LogRetentionConfiguration.json’. If the file is found, the values held within are used for the respective log sub-folders. If no value is given for a site, the usual default of 30 days still applies. An example JSON file would look something like this.

{
   "LogRetentionTimes":[
      {
         "W3SVC1":{
            "LogRetentionTime":"20"
         }
      },
      {
         "W3SVC2":{
            "LogRetentionTime":"10"
         }
      }
   ]
}
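Resolving the per-site override could look something like the sketch below. The file name matches the post; the base path and helper function name are my own assumptions, and the property lookup simply mirrors the shape of the example JSON above.

```powershell
# Sketch of resolving per-site retention overrides (illustrative; the base
# path and function name are assumptions, not taken from the actual script).
$basePath   = 'C:\inetpub\logs\LogFiles'   # assumed default IIS log root
$configFile = Join-Path $basePath 'LogRetentionConfiguration.json'
$default    = 30

$overrides = @{}
if (Test-Path $configFile) {
    $config = Get-Content $configFile -Raw | ConvertFrom-Json
    foreach ($entry in $config.LogRetentionTimes) {
        # Each array entry holds a single site-named property, e.g. "W3SVC1"
        $siteFolder = $entry.PSObject.Properties.Name
        $overrides[$siteFolder] = [int]$entry.$siteFolder.LogRetentionTime
    }
}

# Retention for a given log sub-folder, falling back to the 30-day default
function Get-RetentionDays([string]$SiteFolder) {
    if ($overrides.ContainsKey($SiteFolder)) { $overrides[$SiteFolder] }
    else { $default }
}
```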

Again, the Community Powershell Management Pack made it easy to create the script, named Clean-IISLogFiles, as a task in SCOM. It was then simple to edit the IIS perspective in SquaredUp so that the Task could be created as an Action.

And that was really all there was to it. Now, when an IIS Server is viewed in SquaredUp, we can select the IIS perspective to view the amount of space being utilised by log files and, if a cleanup is needed, trigger one by clicking the appropriate button at the top of the display.

Clicking the button executes the script and a summary is shown that gives a breakdown of the space that has been recovered by the process.

What Next?

I haven’t progressed much beyond this yet, but there is no reason why the task couldn’t also be attached to a Unit Monitor as a recovery task. By doing this, any space occupied by IIS logs could be reclaimed automatically whenever a Server runs low on disk space.

Neither of the scripts is especially complex, but a bit of thought has to be given to ensuring they complete successfully. Initial revisions didn’t consider that the IIS log folder might not exist, so over time the scripts have been extended a little to handle these kinds of circumstances. I’m sure there is more that could be done with them. If anyone has any suggestions then please do get in touch.