Custom Message Handling and HEC Timestamps with the Kafka Modular Input
Custom Message Handling
If you are a follower of any of my Modular Inputs on Splunkbase, you may have noticed that I employ a similar design pattern across all of my offerings: the ability to declaratively plug in your own parameterizable custom message handler to act upon the raw received data in some manner before it is output to Splunk for indexing. This affords many benefits:
- Many of my Modular Inputs are very cross-cutting in terms of the numerous potential types and formats of data they will encounter once they are let loose in the wild. I can’t think of every data scenario. An extensibility design allows the user and community to customize the …
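To make the pattern concrete, here is a rough Python sketch of a declaratively pluggable message handler. The names (`MessageHandler`, `JSONFieldExtractor`, `load_handler`) are invented for this illustration and are not the actual Kafka Modular Input API, which is Java-based; the point is the mechanism: a handler class named in configuration, instantiated with parameters, and given each raw message before output.

```python
# Hypothetical illustration of a declaratively pluggable message handler.
# Names here are invented for this sketch, not the real Kafka Modular Input API.
import importlib
import json


class MessageHandler:
    """Base class: subclasses transform raw messages before output to Splunk."""

    def __init__(self, **params):
        self.params = params  # parameterizable via declarative config

    def handle(self, raw_message):
        return raw_message  # default: pass through unchanged


class JSONFieldExtractor(MessageHandler):
    """Example handler: keep only the configured fields of a JSON message."""

    def handle(self, raw_message):
        doc = json.loads(raw_message)
        wanted = self.params.get("fields", [])
        return json.dumps({k: doc[k] for k in wanted if k in doc})


def load_handler(dotted_name, **params):
    """Instantiate a handler from a declarative 'module.ClassName' string."""
    module_name, class_name = dotted_name.rsplit(".", 1)
    cls = getattr(importlib.import_module(module_name), class_name)
    return cls(**params)
```

Because the handler is resolved from a string, users can ship their own subclass alongside the input and name it in configuration, without touching the core code.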
Achieving scale with the Kafka Modular Input
A hot topic in my inbox over recent months has been how to achieve scalability with the Kafka Modular Input, primarily in terms of message throughput. I get a lot of emails from users and our own internal Splunk team about this, so rather than continuing to dish out the same replies, I thought I’d just pen a short blog to share some tips and tricks.
So let’s start off with this simple scenario:
- a single instance of Splunk 6.3
- downloaded and installed the freely available Kafka Modular Input from Splunkbase
These are the scaling steps that I would try in order.
Enable HTTP Event Collector output
With the recent release of Splunk 6.3, …
Scheduled Export of Indexed Data
I’m really enjoying playing with all the new Developer hooks in Splunk 6.3 such as the HTTP Event Collector and the Modular Alerts framework. My mind is veritably fizzing with ideas for new and innovative ways to get data into Splunk and build compelling new Apps.
When 6.3 was released at our recent Splunk Conference, I also released a new Modular Alert for sending SMS alerts using Twilio, which is very useful in its own right but also a really nice, simple example for developers to reference when creating their own Modular Alerts.
But after getting under the hood of the Modular Alerts framework, I also got to thinking about other ways to utilise Modular Alerts to fulfill other use …
Turbo charging Modular Inputs with the HEC (HTTP Event Collector) Input
HTTP Event Collector (HEC)
Splunk 6.3 introduces a new high-performance data input option for developers to send event data directly to Splunk over HTTP(S). This is called the HTTP Event Collector (HEC).
In a nutshell, the key features of HEC are:
- Send data to Splunk via HTTP/HTTPS
- Token-based authentication
- JSON payload grammar
- Acknowledgment of sent events
- Support for sending batches of events
- Keep-alive connections
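The feature list above maps onto a very small HTTP request. Here is a minimal Python sketch of what an HEC submission looks like: the `/services/collector/event` endpoint and the `Splunk <token>` Authorization scheme follow the HEC documentation, while the host, port, and token values are placeholders you would replace with your own.

```python
# Minimal sketch of a Splunk HTTP Event Collector (HEC) submission.
# Host, port, and token are placeholders; the endpoint path and the
# "Splunk <token>" Authorization scheme follow the HEC documentation.
import json
import urllib.request


def build_hec_request(host, port, token, events, use_ssl=True):
    """Build a urllib Request for one or more HEC events.

    Batching (one of HEC's key features) is just newline-concatenated
    JSON event objects sent in a single POST body.
    """
    scheme = "https" if use_ssl else "http"
    url = f"{scheme}://{host}:{port}/services/collector/event"
    body = "\n".join(
        json.dumps({"event": e, "sourcetype": "_json"}) for e in events
    ).encode("utf-8")
    headers = {
        "Authorization": f"Splunk {token}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")


# req = build_hec_request("localhost", 8088, "my-hec-token",
#                         [{"msg": "hello"}, {"msg": "world"}])
# urllib.request.urlopen(req)  # fires the batch at Splunk
```

Note how token-based authentication, the JSON payload grammar, and batching from the list above all show up directly in the request construction.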
A typical use case for HEC would be a developer wanting to send application events to Splunk directly from their code in a manner that is highly performant and scalable, alleviating the need to write to a file that is monitored by a Universal Forwarder.
But I have another use case …
SMS Alerting from Splunk with Twilio
With the release of Splunk 6.3 comes an exciting new feature called Modular Alerts.
Historically, the alerting actions in Splunk have been limited to Email and RSS, and if you wanted to perform some custom alerting functionality then you could execute a Custom Script.
Whilst many Splunk Ninjas over the years have accomplished all sorts of amazing Kung Fu by wrangling with custom alerting scripts, they are ultimately not the most optimal approach for users and developers:
- manual setup
- no configuration interface
- need file system access
- loosely coupled to Splunk
- no common development or packaging standard
So what if you want more alerting actions that you can plug in and present as first-class alerting actions in your Splunk instance?
Protocol Data Inputs
It must have been about a year ago now that I was talking with a Data Scientist at a Splunk Live event about some of the quite advanced use cases he was trying to achieve with Splunk. That conversation seeded some ideas in my mind; they fermented for a while as I toyed with designs, and over the last couple of months I’ve chipped away at creating a new Splunk App, Protocol Data Inputs (PDI).
So what is this all about? Well, to put it quite simply, it is a Modular Input for receiving data via a number of different protocols, with some pretty cool bells and whistles.
So let’s break down some of …
What are Splunk Apps and Add-ons?
If you have ever uploaded a contribution to Splunk Apps you’ll have seen the following option: But what does this really mean? What is the difference between an App and an Add-on? Both are packaged and uploaded to Splunk Apps as SPL files, and to install them in your Splunk instance you simply untar the SPL file into etc/apps. But the content and purpose of Apps and Add-ons certainly differ from one another.
An Add-on is typically a single component that you can develop and re-use across a number of different use cases. It is usually not specific to any single use case. It also won’t contain a navigable user interface. You cannot open an Add-on from …
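Since an SPL file is just a gzipped tar archive, the install step described above can be sketched in a few lines of Python; the file and directory paths below are placeholders for a real Splunk installation.

```python
# Sketch of installing a Splunk App/Add-on: an SPL package is a gzipped tar
# archive, so installation is simply extracting it under etc/apps.
# Paths below are placeholders for a real Splunk installation.
import tarfile


def extract_spl(spl_path, apps_dir):
    """Untar an SPL package into the Splunk apps directory."""
    with tarfile.open(spl_path, "r:gz") as spl:
        spl.extractall(apps_dir)


# extract_spl("my_addon.spl", "/opt/splunk/etc/apps")
```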
Reflections on a Splunk developer’s journey: Part 2
Why should you develop ?
In “Reflections on a Splunk developer’s journey: Part 1” I shared some of my experiences of developing and supporting Splunk Community Apps and Add-ons over the years.
But WHY did I choose to develop, and WHY should you choose to develop and start your foray into the Splunk developer ecosystem?
Well the reasons for developing are going to be different for everyone depending on your motives. You might be a business or you might just be an individual community collaborator.
The reasons I started developing were that I discovered Splunkbase (now Apps / Answers) and realized that it was a great forum for collaborating and getting involved with the “Big Data” community to use …
Reflections on a Splunk developer’s journey: Part 1
It seems like only yesterday
…that I was writing my first Splunk App. It was the openness and extensibility of the Splunk platform that attracted me to this pursuit in the first place, and when I discovered the thriving community on Splunkbase (now called Splunk Apps / Answers), I just had to contribute. 12,000+ downloads across 9 different freely available community offerings later, I am feeling somewhat reflective. So in this two-part blog series I want to share with you some of my lessons learned from developing and supporting Splunk community Apps/Add-ons (part 1) and then some musings on why you should consider developing Splunk Apps/Add-ons yourself and contributing to the Splunk developer ecosystem (part 2).
Some lessons learned…
Command Modular Input Use Case Series
Modular Inputs and Scripted Inputs provide a great way to develop custom programs to collect and index virtually any kind of data that you can set your mind to.
But on whatever platform you have deployed Splunk, you will also have a whole bevy of other inputs just waiting for you to tap into to get that data into Splunk. These would be the various programs that come with the platform and those that you have installed on it yourself.
This is actually why I created the Command Modular Input that I introduced in a recent blog: a means to leverage, as simply as possible, the power of your existing system programs and get this data into …
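To make the idea concrete, here is a rough Python sketch of the core loop such an input performs: run an existing system program on an interval and stream its output so Splunk can index it. This is an illustration of the concept only, not the actual Command Modular Input implementation.

```python
# Rough sketch of the idea behind the Command Modular Input: periodically run
# an existing system program and emit its output for indexing.
# An illustration of the concept, not the actual implementation.
import shlex
import subprocess
import sys
import time


def run_command(command):
    """Run one system program and return its stdout as text."""
    result = subprocess.run(
        shlex.split(command), capture_output=True, text=True, check=True
    )
    return result.stdout


def poll(command, interval_secs, iterations, out=sys.stdout):
    """Emit the command's output on an interval, as a Modular Input might."""
    for _ in range(iterations):
        out.write(run_command(command))
        out.flush()
        time.sleep(interval_secs)


# poll("vmstat", 60, 10)  # index ten minutes of vmstat output, for example
```

A real Modular Input would of course run indefinitely, honour a configured polling interval per stanza, and wrap each emission in Splunk's streaming event protocol; the sketch just shows the "tap into existing programs" core.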