Talk to Splunk with Amazon Alexa

What do you think the future experience of interacting with your data is going to be like? Is it going to be logging in by way of a user interface and then using your mouse/keyboard/gestures to view and interact with something on a display panel, or is it going to be more like simply talking with another person?

Introducing the “Talk to Splunk with Amazon Alexa” App

This is a Splunk App that enables your Splunk instance to interface with Amazon Alexa by way of a custom Alexa skill, thereby provisioning a natural language interface for Splunk.

You can then use an Alexa device such as Amazon’s Echo, Tap, or Dot, or another 3rd-party hardware device to tell …
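To make that concrete, here is a purely illustrative sketch (not the app’s actual code) of mapping a spoken intent to a Splunk search and running it via Splunk’s REST search API; the intent names, searches, host, and credentials are all hypothetical:

```python
# Illustrative only: one way a custom Alexa skill intent could be mapped
# to a Splunk search and executed via Splunk's REST search API.
import requests

INTENT_TO_SEARCH = {
    "TopSourcetypes": "search index=_internal | top sourcetype",
    "ErrorCount": "search index=_internal log_level=ERROR | stats count",
}

def handle_intent(intent_name, host="localhost"):
    """Run the Splunk search mapped to a spoken intent; return the job sid."""
    resp = requests.post(
        f"https://{host}:8089/services/search/jobs",
        data={"search": INTENT_TO_SEARCH[intent_name], "output_mode": "json"},
        auth=("admin", "changeme"),  # placeholder credentials
        verify=False,                # dev instance with a self-signed cert
    )
    resp.raise_for_status()
    return resp.json()["sid"]
```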

» Continue reading

Send data to Splunk via an authenticated TCP Input

Wow, my second blog in 24 hrs about Protocol Data Inputs (PDI), but sometimes you just get infected with ideas and have to roll with it.

So my latest headbump is about sending text or binary data to Splunk over raw TCP and authenticating access to that TCP input. Simple to accomplish with PDI.

Setup a PDI stanza to listen for TCP requests

PDI has many options, but for this simple example you only need to choose the protocol (TCP) and a port number.


[Screenshot: the PDI TCP input stanza configuration]

Declare a custom handler to authenticate the received data

You can see this above in the Custom Data Handler section. I have declared the handler and the authentication token that the handler should use via a JSON properties …
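For the client side, a minimal sketch of sending token-prefixed data to that TCP port might look like this; the first-line-token convention is hypothetical, since the real format is whatever your custom handler parses:

```python
# A minimal client-side sketch: send token-prefixed data to the PDI TCP port.
import socket

TOKEN = "my-shared-secret"  # must match the token declared in the handler's JSON properties
PAYLOAD = b"hello from an authenticated TCP client\n"

with socket.create_connection(("splunk-host", 9999)) as sock:  # host/port from your PDI stanza
    sock.sendall(TOKEN.encode("utf-8") + b"\n" + PAYLOAD)
```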

» Continue reading

Sending binary data to Splunk and preprocessing it

A while ago I released an App on Splunkbase called Protocol Data Inputs (PDI) that allows you to send text or binary data to Splunk via many different protocols and dynamically apply preprocessors to act on this data prior to indexing in Splunk. You can read more about it here.

I thought I’d just share this interesting use case that I was fiddling around with today. What if I wanted to send compressed data (which is a binary payload) to Splunk and index it? Well, this is trivial to accomplish with PDI.
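As a rough illustration of the client side, you could gzip an event and write the raw bytes straight to the PDI TCP port, leaving a server-side preprocessor to inflate it before indexing; the host, port, and payload below are hypothetical:

```python
# Rough illustration: gzip an event client-side and ship the raw bytes to
# a PDI TCP input; a server-side handler would decompress before indexing.
import gzip
import socket

event = b'2016-07-30 15:31:08 level=INFO msg="compressed payload test"\n'
compressed = gzip.compress(event)

with socket.create_connection(("splunk-host", 9998)) as sock:
    sock.sendall(compressed)
```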

Choose your protocol and binary data payload

PDI supports many different protocols, but for the purposes of this example I just rolled a …

» Continue reading

Custom Message Handling and HEC Timestamps with the Kafka Modular Input

Custom Message Handling

If you are a follower of any of my Modular Inputs on Splunkbase, you may have noticed that I employ a similar design pattern across all of my offerings: the ability to declaratively plug in your own parameterizable custom message handler to act upon the raw received data in some manner before it gets output to Splunk for indexing (sketched in code after the list below). This affords many benefits:

  • Many of my Modular Inputs are very cross-cutting in terms of the numerous potential types and formats of data they will encounter once they are let loose in the wild. I can’t think of every data scenario. An extensibility design allows the user and community to be able to customize the …
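To make the pattern concrete, here is its shape sketched in Python for illustration; the real modular inputs implement this in Java, and the class and parameter names below are hypothetical:

```python
# Illustrative sketch of the pluggable handler pattern (the real modular
# inputs are Java; these class/parameter names are hypothetical).
import json

class AbstractMessageHandler:
    def __init__(self, **params):
        # params arrive declaratively from the input's configuration
        self.params = params

    def handle(self, raw_bytes):
        """Transform raw received data into the text Splunk will index."""
        raise NotImplementedError

class JSONWrapperHandler(AbstractMessageHandler):
    """Example custom handler: wrap each raw message in a JSON envelope."""
    def handle(self, raw_bytes):
        return json.dumps({
            "body": raw_bytes.decode("utf-8", errors="replace"),
            "source": self.params.get("source", "kafka"),
        })
```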
» Continue reading

Achieving scale with the Kafka Modular Input

A hot topic in my inbox over recent months has been how to achieve scalability with the Kafka Modular Input, primarily in terms of message throughput. I get a lot of emails from users and our own internal Splunk team about this, so rather than continuing to dish out the same replies, I thought I’d just pen a short blog to share some tips and tricks.

So let’s start off with this simple scenario:

  • a single instance of Splunk 6.3
  • the freely available Kafka Modular Input downloaded and installed from Splunkbase

These are the scaling steps that I would try in order.

Enable HTTP Event Collector output

With the recent release of Splunk 6.3, …
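To give a feel for what that switch buys you, here is a rough sketch of an HEC send; the host and token are placeholders:

```python
# Sketch of what HEC output means on the wire: each event leaves the modular
# input as a token-authenticated HTTP POST instead of travelling over the
# STDOUT pipe to splunkd.
import requests

def send_via_hec(event, host="splunk-host", token="00000000-0000-0000-0000-000000000000"):
    resp = requests.post(
        f"https://{host}:8088/services/collector",
        headers={"Authorization": f"Splunk {token}"},
        json={"event": event, "sourcetype": "kafka"},
        verify=False,  # dev instance with a self-signed cert
    )
    resp.raise_for_status()
```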

» Continue reading

Scheduled Export of Indexed Data

I’m really enjoying playing with all the new Developer hooks in Splunk 6.3 such as the HTTP Event Collector and the Modular Alerts framework. My mind is veritably fizzing with ideas for new and innovative ways to get data into Splunk and build compelling new Apps.

When 6.3 was released at our recent Splunk Conference, I also released a new Modular Alert for sending SMS alerts using Twilio, which is very useful in its own right but is also a really nice, simple example for developers to reference when creating their own Modular Alerts.

But getting under the hood of the Modular Alerts framework also got me thinking about other ways to utilise Modular Alerts to fulfill other use …

» Continue reading

Turbo charging Modular Inputs with the HEC (HTTP Event Collector) Input

HTTP Event Collector (HEC)

Splunk 6.3 introduces a new high-performance data input option for developers to send event data directly to Splunk over HTTP(S). This is called the HTTP Event Collector (HEC).

In a nutshell, the key features of HEC are:

  • Send data to Splunk via HTTP/HTTPS
  • Token based authentication
  • JSON payload grammar
  • Acknowledgment of sent events
  • Support for sending batches of events
  • Keep alive connections

A typical use case for HEC is a developer wanting to send application events to Splunk directly from their code in a manner that is highly performant and scalable, and that alleviates having to write to a file monitored by a Universal Forwarder.
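For instance, batching (one of the features listed above) amounts to concatenating event JSON objects into a single request body; a rough sketch, with placeholder host and token:

```python
# A minimal batching sketch: HEC accepts multiple events per request by
# stacking their JSON objects into one POST body.
import json
import requests

events = [{"event": f"app event {i}", "sourcetype": "myapp"} for i in range(100)]
body = "".join(json.dumps(e) for e in events)

resp = requests.post(
    "https://splunk-host:8088/services/collector",
    headers={"Authorization": "Splunk <your-hec-token>"},
    data=body,
    verify=False,  # dev instance with a self-signed cert
)
resp.raise_for_status()
```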

But I have another use case …

» Continue reading

SMS Alerting from Splunk with Twilio

Modular Alerts

With the release of Splunk 6.3 comes an exciting new feature called Modular Alerts.

Historically, the alerting actions in Splunk have been limited to email and RSS, and if you wanted to perform some custom alerting functionality you could execute a custom script.

Whilst many Splunk Ninjas over the years have accomplished all sorts of amazing Kung Fu by wrangling with custom alerting scripts, they are ultimately not the optimal approach for users and developers:

  • manual setup
  • no configuration interface
  • need file system access
  • loosely coupled to Splunk
  • no common development or packaging standard

So what if you want more alerting actions that you can plug in and present as first-class alerting actions in your Splunk instance?
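For developers curious what a Modular Alert actually looks like under the hood, here is a minimal sketch of the execution model; the parameter names and the Twilio hand-off are hypothetical placeholders, not the app’s actual code:

```python
# A minimal sketch of the execution model: Splunk invokes the alert action
# script with --execute and passes a JSON payload on stdin.
import json
import sys

if __name__ == "__main__" and "--execute" in sys.argv:
    payload = json.loads(sys.stdin.read())
    config = payload.get("configuration", {})  # parameters from the alert action UI
    to_number = config.get("to")
    message = config.get("message", "Splunk alert fired")
    # ...hand the message off to the Twilio REST API here...
    sys.stderr.write(f"INFO Sending SMS to {to_number}: {message}\n")
```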

Well …

» Continue reading

Protocol Data Inputs

It must have been about a year ago now that I was talking with a Data Scientist at a Splunk Live event about some of the quite advanced use cases he was trying to achieve with Splunk. That conversation seeded some ideas in my mind; they fermented for a while as I toyed with designs, and over the last couple of months I’ve chipped away at creating a new Splunk App: Protocol Data Inputs (PDI).

So what is this all about? Well, to put it quite simply, it is a Modular Input for receiving data via a number of different protocols, with some pretty cool bells and whistles.


So let’s break down some of …

» Continue reading

What are Splunk Apps and Add-Ons?

If you have ever uploaded a contribution to Splunk Apps you’ll have seen the following option: [Screenshot: the App / Add-on selection option]. But what does this really mean? What is the difference between an App and an Add-on? Both are packaged and uploaded to Splunk Apps as SPL files, and to install them in your Splunk instance you simply untar the SPL file into etc/apps. But the content and purpose of Apps and Add-ons certainly differ from one another.

Add-ons

An Add-on is typically a single component that you can develop and re-use across a number of different use cases. It is usually not specific to any one single use case. It also won’t contain a navigable user interface. You cannot open an Add-on from …

» Continue reading