Send data to Splunk via an authenticated TCP Input
So my latest headbump is about sending text or binary data to Splunk over raw TCP and authenticating access to that TCP input. Simple to accomplish with PDI.
Set up a PDI stanza to listen for TCP requests
PDI has many options, but for this simple example you only need to choose the protocol (TCP) and a port number.
Declare a custom handler to authenticate the received data
You can see this above in the Custom Data Handler section. I have declared the handler and the authentication token that the handler should use via a JSON properties …
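To make this concrete, here is a minimal client-side sketch in Python. The host, port, and token values are placeholders, and the token-prefix convention is purely an assumption for illustration: however your custom handler expects to receive the token, the client simply needs to send it along with the payload over the raw TCP connection.

```python
import socket

# Placeholder values: match these to your PDI stanza and handler config
PDI_HOST = "localhost"
PDI_PORT = 9998               # the TCP port your PDI stanza listens on
AUTH_TOKEN = "mysecrettoken"  # the token declared in the handler's JSON properties

def send_event(payload):
    """Open a TCP connection and send a token-prefixed event to PDI."""
    with socket.create_connection((PDI_HOST, PDI_PORT)) as sock:
        # Assumed convention: the custom handler strips a leading
        # "token|" prefix and discards events whose token doesn't match.
        sock.sendall("{0}|{1}\n".format(AUTH_TOKEN, payload).encode("utf-8"))

send_event("hello splunk, this event is authenticated")
```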
Sending binary data to Splunk and preprocessing it
A while ago I released an App on Splunkbase called Protocol Data Inputs (PDI) that allows you to send text or binary data to Splunk via many different protocols and dynamically apply preprocessors to act on this data prior to indexing in Splunk. You can read more about it here.
I thought I’d just share this interesting use case that I was fiddling around with today. What if I wanted to send compressed data (which is a binary payload) to Splunk and index it? Well, this is very trivial to accomplish with PDI.
Choose your protocol and binary data payload
PDI supports many different protocols, but for the purposes of this example I just rolled a …
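As a client-side illustration, here is a minimal Python sketch that gzip-compresses a few events and streams the resulting binary payload to the PDI TCP port. The host and port are placeholders; the matching decompression would be performed by a preprocessing handler declared on the PDI side, before the data is indexed.

```python
import gzip
import socket

PDI_HOST = "localhost"  # placeholder host/port for the PDI TCP stanza
PDI_PORT = 9998

def send_compressed(events):
    """Gzip the raw event text and send the binary payload over TCP.
    A decompression handler on the PDI side would inflate the bytes
    back to text before Splunk indexes them."""
    compressed = gzip.compress(events.encode("utf-8"))
    with socket.create_connection((PDI_HOST, PDI_PORT)) as sock:
        sock.sendall(compressed)

send_compressed("event one\nevent two\nevent three\n")
```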
Custom Message Handling and HEC Timestamps with the Kafka Modular Input
Custom Message Handling
If you are a follower of any of my Modular Inputs on Splunkbase, you may have noticed that I employ a similar design pattern across all of my offerings: the ability to declaratively plug in your own parameterizable custom message handler to act upon the raw received data in some manner before it gets output to Splunk for indexing. This affords many benefits:
- Many of my Modular Inputs are very cross-cutting in terms of the numerous potential types and formats of data they will encounter once they are let loose in the wild. I can’t think of every data scenario. An extensible design allows the user and community to customize the …
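To illustrate the pattern itself: the real handlers for these Modular Inputs are implemented against each input's own handler interface, so all of the names below are hypothetical, but the essence of a pluggable, parameterizable message handler boils down to something like this sketch.

```python
# Illustrative sketch of the pluggable-handler design pattern.
# All class and parameter names here are hypothetical.

class AbstractMessageHandler:
    """Base contract: receive raw bytes, return the text to index."""
    def __init__(self, **params):
        self.params = params  # parameterized via declarative config

    def handle(self, raw):
        raise NotImplementedError

class CSVToKVHandler(AbstractMessageHandler):
    """Example custom handler: turn CSV rows into key=value events."""
    def handle(self, raw):
        fields = self.params["fields"].split(",")
        rows = raw.decode("utf-8").strip().splitlines()
        return "\n".join(
            " ".join("{0}={1}".format(k, v) for k, v in zip(fields, row.split(",")))
            for row in rows
        )

# The input instantiates whichever handler class (and parameters) the
# user declared in their input configuration:
handler = CSVToKVHandler(fields="host,status,latency")
print(handler.handle(b"web01,200,0.12\nweb02,500,1.40"))
```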
Achieving scale with the Kafka Modular Input
A hot topic in my inbox over recent months has been how to achieve scalability with the Kafka Modular Input, primarily in terms of message throughput. I get a lot of emails from users and our own internal Splunk team about this, so rather than continuing to dish out the same replies, I thought I’d just pen a short blog to share some tips and tricks.
So let’s start off with this simple scenario:
- a single instance of Splunk 6.3
- downloaded and installed the freely available Kafka Modular Input from Splunkbase
These are the scaling steps that I would try in order.
Enable HTTP Event Collector output
With the recent release of Splunk 6.3, …
Scheduled Export of Indexed Data
I’m really enjoying playing with all the new Developer hooks in Splunk 6.3 such as the HTTP Event Collector and the Modular Alerts framework. My mind is veritably fizzing with ideas for new and innovative ways to get data into Splunk and build compelling new Apps.
When 6.3 was released at our recent Splunk Conference, I also released a new Modular Alert for sending SMS alerts using Twilio, which is very useful in its own right but is also a really nice, simple example for developers to reference to create their own Modular Alerts.
But getting under the hood of the Modular Alerts framework also got me thinking about other ways to utilise Modular Alerts to fulfill other use …
Turbo charging Modular Inputs with the HEC (HTTP Event Collector) Input
HTTP Event Collector (HEC)
Splunk 6.3 introduces a new high-performance data input option for developers to send event data directly to Splunk over HTTP(S). This is called the HTTP Event Collector (HEC).
In a nutshell, the key features of HEC are:
- Send data to Splunk via HTTP/HTTPS
- Token-based authentication
- JSON payload grammar
- Acknowledgment of sent events
- Support for sending batches of events
- Keep-alive connections
A typical use case for HEC would be a developer wanting to send application events to Splunk directly from their code in a manner that is highly performant and scalable, and that avoids having to write to a file monitored by a Universal Forwarder.
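For example, here is a minimal Python sketch that sends a small batch of events to HEC in a single request. The /services/collector endpoint, the Authorization: Splunk <token> header, and batching by concatenating JSON event objects are all part of the HEC contract; the host, token, and sourcetype values are placeholders.

```python
import json
import requests

HEC_URL = "https://localhost:8088/services/collector"  # default HEC endpoint
HEC_TOKEN = "your-hec-token-here"                      # token created when enabling HEC

def send_batch(events):
    """Send a batch of events in one HTTP request. HEC accepts multiple
    JSON event objects concatenated in a single request body."""
    payload = "".join(
        json.dumps({"event": e, "sourcetype": "myapp"}) for e in events
    )
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": "Splunk {0}".format(HEC_TOKEN)},
        data=payload,
        verify=False,  # default installs use a self-signed cert; verify in production
    )
    resp.raise_for_status()

send_batch([
    {"msg": "user login", "user": "jane"},
    {"msg": "user logout", "user": "jane"},
])
```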
But I have another use case …
SMS Alerting from Splunk with Twilio
With the release of Splunk 6.3 comes an exciting new feature called Modular Alerts.
Historically, the alerting actions in Splunk have been limited to Email and RSS; if you wanted to perform some custom alerting functionality, you could execute a Custom Script.
Whilst many Splunk Ninjas over the years have accomplished all sorts of amazing Kung Fu by wrangling with custom alerting scripts, they are ultimately not the optimal approach for users and developers:
- manual setup
- no configuration interface
- need file system access
- loosely coupled to Splunk
- no common development or packaging standard
So what if you want more alerting actions that you can plug in and present as first-class alerting actions in your Splunk instance?
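This is exactly what Modular Alerts provide. As a taste of how lightweight the developer side is, here is a minimal sketch of an alert action script: Splunk invokes it with --execute and hands it a JSON payload on stdin describing the alert that fired and the user's configured parameters. Treat this as an illustrative skeleton rather than a complete alert action.

```python
#!/usr/bin/env python
# Minimal sketch of a Modular Alert script (e.g. bin/myalert.py).
import json
import sys

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "--execute":
        # The payload carries keys such as "search_name", "configuration"
        # (your param.* settings), and the path to the alert's results.
        payload = json.loads(sys.stdin.read())
        config = payload.get("configuration", {})
        search = payload.get("search_name", "unknown")
        # Perform the custom alerting action here (send an SMS, call an
        # API, etc.). Writes to stderr end up in splunkd.log.
        sys.stderr.write("INFO Alert action fired for search '%s'\n" % search)
        sys.exit(0)
    else:
        sys.stderr.write("FATAL Unsupported execution mode\n")
        sys.exit(1)
```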
Protocol Data Inputs
It must have been about a year ago now that I was talking with a Data Scientist at a Splunk Live event about some of the quite advanced use cases he was trying to achieve with Splunk. That conversation seeded some ideas in my mind; they fermented for a while as I toyed with designs, and over the last couple of months I’ve chipped away at creating a new Splunk App, Protocol Data Inputs (PDI).
So what is this all about? Well, to put it quite simply, it is a Modular Input for receiving data via a number of different protocols, with some pretty cool bells and whistles.
So let’s break down some of …
What are Splunk Apps and Add-ons?
If you have ever uploaded a contribution to Splunk Apps, you’ll have seen the option to classify your submission as either an App or an Add-on. But what does this really mean? What is the difference between an App and an Add-on? Both are packaged and uploaded to Splunk Apps as SPL files, and to install them in your Splunk instance you simply untar the SPL file into etc/apps. But the content and purpose of Apps and Add-ons certainly differ from one another.
An Add-on is typically a single component that you can develop that can be re-used across a number of different use cases. It is usually not specific to any one single use case. It also won’t contain a navigable user interface. You cannot open an Add-on from …
Reflections on a Splunk developer’s journey: Part 2
Why should you develop ?
In “Reflections on a Splunk developer’s journey: Part 1” I shared some of my experiences of developing and supporting Splunk Community Apps and Add-ons over the years.
But WHY did I choose to develop, and WHY should you choose to develop and start your foray into the Splunk developer ecosystem?
Well, the reasons for developing are going to be different for everyone depending on your motives. You might be a business or you might just be an individual community collaborator.
The reasons I started developing were because I discovered Splunkbase (now Apps / Answers) and realized that it was a great forum for collaborating and getting involved with the “Big Data” community to use …