Logstash and IIS

Note: If you are also using Kibana as your front end, you will need to add a MimeType of “application/json” for the extension .json to IIS.

We are pushing all of our logs into Elasticsearch using Logstash. IIS was the most painful part of the process so I am writing up a few gotchas for Logstash 1.3.3 and IIS in general.

The process is relatively straightforward on paper:

  1. Logstash monitors the IIS log and pushes new entries into the pipeline
  2. Use a grok filter to split out the fields in the IIS log line (more on this below)
  3. Push the result into Elasticsearch

Firstly, there is a bug in the Logstash file input on Windows (it doesn’t handle identically named files in different directories), which results in partial entries being read. To remedy this you need to get IIS to generate a single log file per server (the default is one per website). Once that is done we can read the IIS logs with a file input.
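A minimal sketch of such a file input (the log path is an assumption — match it to your IIS logging directory; note that Logstash on Windows expects forward slashes in paths):

```
input {
  file {
    # One log file per server (configured in IIS), so the Windows
    # duplicate-filename bug is avoided
    path => "C:/inetpub/logs/LogFiles/u_ex*.log"
    type => "iis"
  }
}
```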

Once we have IIS log lines pumping through the veins of Logstash, we need to break down each line into its component fields. To do this we use the Logstash grok filter. In IIS the default logging format is W3C, but you are able to select the fields you want output. The grok pattern needs to cover the default fields plus [bytes sent] so we can see bandwidth usage. The Heroku Grok Debugger is a lifesaver for debugging the grok string (paste an entry from your log into it, then paste your grok pattern in).
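A grok filter along these lines handles the default W3C fields plus sc-bytes; the field labels are my own choices, so check the pattern against the `#Fields:` header at the top of your log file and adjust to match:

```
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:serverip} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientip} %{NOTSPACE:useragent} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:win32status} %{NUMBER:bytessent} %{NUMBER:timetaken}"]
  }
}
```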

Below is the complete IIS configuration for Logstash. There are a few other filters we use to enrich the event sent to Logstash, as well as a conditional to remove IIS log comments.
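A sketch of the full pipeline under the same assumptions (the log path and Elasticsearch host are placeholders; the 1.3.x `elasticsearch` output takes a single `host`):

```
input {
  file {
    path => "C:/inetpub/logs/LogFiles/u_ex*.log"
    type => "iis"
  }
}

filter {
  if [type] == "iis" {
    # IIS writes W3C comment/header lines starting with '#'; drop them
    if [message] =~ /^#/ {
      drop { }
    }
    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:serverip} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientip} %{NOTSPACE:useragent} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:win32status} %{NUMBER:bytessent} %{NUMBER:timetaken}"]
    }
    # IIS logs timestamps in UTC
    date {
      match => ["log_timestamp", "YYYY-MM-dd HH:mm:ss"]
      timezone => "Etc/UTC"
    }
  }
}

output {
  elasticsearch {
    host => "localhost"  # placeholder
  }
}
```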


8 thoughts on “Logstash and IIS”

  1. timezone => “Etc/UCT” should probably be timezone => “Etc/UTC”

  2. adammills says:

    Thanks for reading! UCT and UTC are equivalent.

  3. Adam, did you consider using ETW rather than using the traditional IIS log files as the input? Any reasons for choosing one over the other?

  4. Jen says:

I’m sending IIS logs to Logstash through nxlog. I’m receiving the complete message, but Logstash is showing _grokparsefailure.
    Here is my grok. I have been working on this for hours; any help will be appreciated.


    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IP:client} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IP:clienthost} %{NOTSPACE:useragent} %{NOTSPACE:referer} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:bytes:int}" }
    }


    2015-04-10 18:08:30 GET /iis-85.png - 80 - Mozilla/5.0+(Macintosh;+Intel+Mac+OS+X+10_9_3)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/32.0.1700.107+Safari/537.36 http://saiis4.mia.ucloud.int/ 200 0 0 203

  5. kiquenet says:

    Does using the PerformanceCounter class for ASP.NET performance counters add overhead? Which gives more valuable information: ASP.NET performance counters or the IIS log? In any case, ASP.NET performance counters may not be easy to use with Logstash and Kibana.

    output {
    elasticsearch {
    host => ""  # is it "hosts" now?
