Protocol Data Inputs

It must have been about a year ago now that I was talking with a Data Scientist at a Splunk Live event about some of the quite advanced use cases he was trying to achieve with Splunk. That conversation seeded some ideas in my mind; they fermented for a while as I toyed with designs, and over the last couple of months I’ve chipped away at creating a new Splunk App, Protocol Data Inputs (PDI).

So what is this all about? Well, to put it quite simply, it is a Modular Input for receiving data via a number of different protocols, with some pretty cool bells and whistles.

So let’s break down some of …

» Continue reading

Decoding IIS Logs

Everyone (just about) knows that there is a table of status codes that HTTP/1.1 defines. However, IIS gives you two more status codes in the log files. The HTTP/1.1 status is stored in sc_status (and it is automagically decoded for you in Splunk 6). There is also an extended code called sc_substatus and a Win32 error code. How can you really decode these, especially since the sc_win32_status seems to have really large numbers?

Let’s start with the sc_status and sc_substatus codes. These are normally written together as a decimal number. So, for instance, 401.1 means an sc_status of 401 and an sc_substatus of 1. The sc_status codes follow a pattern: 1xx are informational, 2xx indicate success, 3xx indicate redirection, and …
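To make the combination concrete, here is a minimal Python sketch (the sample values and field handling are illustrative only, not taken from the post) that joins sc_status and sc_substatus into the familiar dotted form and classifies the status by its first digit:

# Sketch: combine IIS sc_status / sc_substatus and classify the HTTP status.
STATUS_CLASSES = {
    "1": "informational",
    "2": "success",
    "3": "redirection",
    "4": "client error",
    "5": "server error",
}

def decode_status(sc_status, sc_substatus, sc_win32_status):
    """Return the dotted status (e.g. 401.1) plus a coarse classification."""
    return {
        "status": f"{sc_status}.{sc_substatus}",
        "class": STATUS_CLASSES.get(str(sc_status)[0], "unknown"),
        # sc_win32_status is a Win32 error code; the very large values are
        # typically HRESULT-style codes logged as unsigned 32-bit integers.
        "win32_error": int(sc_win32_status),
    }

print(decode_status(401, 1, 2148074254))
# {'status': '401.1', 'class': 'client error', 'win32_error': 2148074254}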

» Continue reading

Getting data from your REST APIs into Splunk

Overview

More and more products, services and platforms these days are exposing their data and functionality via RESTful APIs.

REST has really emerged over previous architectural approaches as the de facto standard for building and exposing web APIs, enabling third parties to hook into your data and functionality. It is simple, lightweight, platform independent, language interoperable, and re-uses HTTP constructs. All good gravy. And of course, Splunk has its own REST API also.

The Data Potential

I see a world of data out there available via REST that can be brought into Splunk, correlated and enriched against your existing data, or used for entirely new use cases that you might conceive of once you see what is available and …
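As a rough sketch of the general idea, polling a REST endpoint and emitting events for Splunk to index can be as small as the Python below. The endpoint URL, token and response shape are hypothetical placeholders, and the loop simply stands in for what a scripted or modular input would do:

# Sketch: poll a REST API and write events for Splunk to index.
# The URL, token and "items" field are hypothetical placeholders.
import json
import sys
import time
import urllib.request

API_URL = "https://api.example.com/v1/metrics"
API_TOKEN = "REPLACE_ME"

def poll_once():
    req = urllib.request.Request(
        API_URL,
        headers={"Authorization": "Bearer " + API_TOKEN},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        payload = json.loads(resp.read().decode("utf-8"))
    # One JSON event per line; a scripted input's stdout is indexed by Splunk.
    for record in payload.get("items", []):
        sys.stdout.write(json.dumps(record) + "\n")
    sys.stdout.flush()

if __name__ == "__main__":
    while True:
        poll_once()
        time.sleep(60)  # polling interval in seconds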

» Continue reading

Indexing data into Splunk Remotely

Data can reside anywhere, and Splunk recognizes that fact by providing the concept of forwarders. The Splunk Forwarder collects data locally and sends it to a central Splunk indexer, which may reside in a remote location. One of the great advantages of this approach is that forwarders maintain an internal index of where they left off when sending data. If for some reason the Splunk indexer has to be taken offline, the forwarder can resume its task after the indexer is brought back up. Another advantage of forwarders is that they can load balance delivery across multiple indexers. Even a Splunk Light Forwarder (a forwarder that consumes minimal CPU resources and network bandwidth) can participate in an auto
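As a point of reference, load-balanced forwarding is configured on the forwarder side in outputs.conf; a minimal sketch, with placeholder host names, looks something like this:

# outputs.conf on the forwarder (host names are placeholders)
[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
# The forwarder automatically load balances across the listed indexers.
server = indexer1.example.com:9997, indexer2.example.com:9997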

» Continue reading