Scheduled Export of Indexed Data
I’m really enjoying playing with all the new developer hooks in Splunk 6.3, such as the HTTP Event Collector and the Modular Alerts framework. My mind is veritably fizzing with ideas for new and innovative ways to get data into Splunk and build compelling new Apps.
When 6.3 was released at our recent Splunk Conference, I also released a new Modular Alert for sending SMS alerts using Twilio. It is very useful in its own right, but it is also a nice, simple example that developers can reference to create their own Modular Alerts.
But getting under the hood of the Modular Alerts framework also got me thinking about other ways to utilise Modular Alerts to fulfill other use …
Turbo-charging Modular Inputs with the HTTP Event Collector (HEC) Input
HTTP Event Collector (HEC)
Splunk 6.3 introduces a new high-performance data input option for developers to send event data directly to Splunk over HTTP(S). This is called the HTTP Event Collector (HEC).
In a nutshell, the key features of HEC are:
- Send data to Splunk via HTTP/HTTPS
- Token based authentication
- JSON payload grammar
- Acknowledgment of sent events
- Support for sending batches of events
- Keep alive connections
A typical use case for HEC is a developer who wants to send application events to Splunk directly from their code in a manner that is highly performant and scalable, without having to write to a file that is monitored by a Universal Forwarder.
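To make the features above concrete, here is a minimal Python sketch of talking to HEC. The endpoint path (`/services/collector/event`) and the `Authorization: Splunk <token>` header follow the HEC protocol; the URL, token, sourcetype, and index values are hypothetical placeholders, not from any real deployment.

```python
import json
import urllib.request

# Hypothetical values for illustration.
HEC_URL = "https://localhost:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def build_hec_payload(event, sourcetype="myapp:event", index="main"):
    """Build the JSON body that HEC expects for a single event."""
    return json.dumps({"event": event, "sourcetype": sourcetype, "index": index})

def build_hec_batch(events):
    """Batching: multiple event objects concatenated into one request body."""
    return "".join(build_hec_payload(e) for e in events)

def send_to_hec(body):
    """POST a payload (single event or batch) with token-based authentication."""
    req = urllib.request.Request(
        HEC_URL,
        data=body.encode("utf-8"),
        headers={"Authorization": "Splunk " + HEC_TOKEN,
                 "Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)  # HTTP 200 on success
```

Note that batching is just concatenation of event objects in one HTTP request, which is where much of the performance win over per-event requests comes from.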
But I have another use case …
SMS Alerting from Splunk with Twilio
With the release of Splunk 6.3 comes an exciting new feature called Modular Alerts.
Historically, the alerting actions in Splunk have been limited to email and RSS; if you wanted to perform some custom alerting functionality, you could execute a custom script.
Whilst many Splunk Ninjas over the years have accomplished all sorts of amazing Kung Fu by wrangling custom alerting scripts, they are ultimately not an optimal approach for users and developers:
- manual setup
- no configuration interface
- need file system access
- loosely coupled to Splunk
- no common development or packaging standard
So what if you want more alerting actions that you can plug in and present as first-class alerting actions in your Splunk instance?
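To give a feel for what a Modular Alert looks like under the hood, here is a minimal sketch of the Python side of an alert action. When an alert fires, Splunk invokes the script with `--execute` and writes a JSON payload to stdin; the field names below (`configuration`, `results_file`, `search_name`) come from that payload. The function names and the commented-out SMS step are my own hypothetical scaffolding, not the Twilio alert's actual code.

```python
import json
import sys

# Splunk invokes a modular alert script as:  python myalert.py --execute
# and writes a JSON payload describing the fired alert to stdin.

def parse_alert_payload(raw):
    """Pull out the fields an alert action typically needs from the
    JSON payload Splunk passes on stdin."""
    payload = json.loads(raw)
    return {
        "config": payload.get("configuration", {}),   # user settings from the alert UI
        "results_file": payload.get("results_file"),  # gzipped CSV of the search results
        "search_name": payload.get("search_name"),    # the saved search that fired
    }

def execute(stdin=sys.stdin):
    """Entry point a real script would call when '--execute' is in sys.argv."""
    info = parse_alert_payload(stdin.read())
    # A Twilio-style action would POST info["config"] values
    # (e.g. destination number, message body) to the SMS provider's
    # REST API here.
    return info
```

Because the configuration arrives as structured JSON rather than positional arguments, the alert's settings can be driven by a proper configuration UI, which addresses several of the custom-script pain points listed above.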
Protocol Data Inputs
It must have been about a year ago now that I was talking with a Data Scientist at a Splunk Live event about some of the quite advanced use cases he was trying to achieve with Splunk. That conversation seeded some ideas in my mind; they fermented for a while as I toyed with designs, and over the last couple of months I’ve chipped away at creating a new Splunk App, Protocol Data Inputs (PDI).
So what is this all about? Well, to put it quite simply, it is a Modular Input for receiving data via a number of different protocols, with some pretty cool bells and whistles.
So let’s break down some of …
What are Splunk Apps and Add-ons?
If you have ever uploaded a contribution to Splunk Apps, you’ll have seen the option to classify your submission as an App or an Add-on. But what does this really mean? What is the difference between an App and an Add-on? Both are packaged and uploaded to Splunk Apps as SPL files, and to install either in your Splunk instance you simply untar the SPL file into etc/apps. But the content and purpose of Apps and Add-ons certainly differ from one another.
An Add-on is typically a single component that you can develop and re-use across a number of different use cases. It is usually not specific to any one single use case. It also won’t contain a navigable user interface. You cannot open an Add-on from …
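Since an SPL file is just a gzipped tarball, the untar-into-etc/apps install step described above can be scripted. A minimal sketch (the paths and function name are hypothetical):

```python
import os
import tarfile

def install_spl(spl_path, splunk_home):
    """Extract a packaged App/Add-on (an .spl is just a gzipped tarball)
    into $SPLUNK_HOME/etc/apps."""
    apps_dir = os.path.join(splunk_home, "etc", "apps")
    with tarfile.open(spl_path, "r:gz") as spl:
        spl.extractall(path=apps_dir)

# Hypothetical usage:
# install_spl("protocol_data_inputs.spl", "/opt/splunk")
```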
Reflections on a Splunk developer’s journey: Part 2
Why should you develop ?
In “Reflections on a Splunk developer’s journey: Part 1” I shared some of my experiences of developing and supporting Splunk Community Apps and Add-ons over the years.
But WHY did I choose to develop, and WHY should you choose to develop and start your foray into the Splunk developer ecosystem?
Well the reasons for developing are going to be different for everyone depending on your motives. You might be a business or you might just be an individual community collaborator.
The reasons I started developing were because I discovered Splunkbase (now Apps / Answers) and realized that it was a great forum for collaborating and getting involved with the “Big Data” community to use …
Reflections on a Splunk developer’s journey: Part 1
It seems like only yesterday
…that I was writing my first Splunk App. It was the openness and extensibility of the Splunk platform that attracted me to this pursuit in the first place, and when I discovered the thriving community on Splunkbase (now called Splunk Apps / Answers), I just had to contribute. 12,000+ downloads across 9 different freely available community offerings later, I am feeling somewhat reflective. So in this 2 part blog series I want to share with you some of my lessons learned from developing and supporting Splunk community Apps/Add-ons (part 1) and then some musings on why you should consider developing Splunk Apps/Add-ons yourself and contribute to the Splunk developer ecosystem (part 2).
Some lessons learned…
Command Modular Input Use Case Series
Modular Inputs and Scripted Inputs provide a great way to develop custom programs to collect and index virtually any kind of data that you can set your mind to.
But on whatever platform you have deployed Splunk, you will also have a whole bevy of other inputs just waiting for you to tap into to get that data into Splunk. These are the various programs that come with the platform and those that you have installed on it.
This is actually why I created the Command Modular Input that I introduced in a recent blog: a means to leverage, as simply as possible, the power of your existing system programs and get this data into …
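The core idea — run an existing system program on a schedule and capture its output for indexing — boils down to something like this sketch. The function name is mine for illustration, not the Command Modular Input's actual API.

```python
import subprocess

def run_command(cmd):
    """Run an existing system program and capture its stdout, the way a
    command-wrapping modular input would before handing the output to Splunk.
    `cmd` is an argument list, e.g. ["vmstat"] or ["df", "-h"]."""
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

# Hypothetical usage: collect the output of `df -h` each polling interval.
# print(run_command(["df", "-h"]))
```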
I tend to travel quite a bit in my role at Splunk. The other day I was wondering to myself how far I had traveled in the last week, the last month, the last year. It just so happens that I am a Foursquare user, not because I like to hoard mayorships across the globe; rather, I tend to use Foursquare checkins to help me remember where I have been. Now you get where I am going with this, because “where have I been” actually means “a lot of cool location metadata” that I can have fun with.
I was looking around online for a simple tool that could hook into Foursquare to tell me how …
A Developer’s Smorgasbord
First bite of the Cherry(py)
I didn’t always work at Splunk. In fact, many moons ago I used to be a Splunk customer. At the time we were simply looking for a means to better consolidate our enterprise’s numerous sources of log data into a centralized repository. A colleague of mine mentioned this product called Splunk, and hence the journey began. Like many, this started with getting some log files indexed into Splunk and creating some trivial searches and Simple XML dashboards. This very quickly led to more data sources and more elaborate dashboards. Then the bloke sitting next to me saw what I was doing and wanted in on the action, then the adjacent team and then the …