Custom Message Handling and HEC Timestamps with the Kafka Modular Input

Custom Message Handling

If you follow any of my Modular Inputs on Splunkbase, you may have noticed that I employ a similar design pattern across all of my offerings: the ability to declaratively plug in your own parameterizable custom message handler to act upon the raw received data in some manner before it gets output to Splunk for indexing. This affords many benefits:

  • Many of my Modular Inputs are very cross-cutting in terms of the numerous potential types and formats of data they will encounter once they are let loose in the wild. I can’t think of every data scenario, so an extensibility design allows the user and community to customize the …
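To make that pattern concrete, here is a minimal sketch of what a pluggable, parameterizable handler can look like. This is a Python illustration of the general design only, not the Modular Input's actual handler interface, and the class and parameter names are hypothetical.

```python
# Illustrative only: a hypothetical pluggable message handler, not the
# Kafka Modular Input's real interface.
class BaseMessageHandler:
    def __init__(self, **params):
        # handler parameters supplied declaratively, e.g. alongside the input stanza
        self.params = params

    def handle(self, raw_message):
        # default behaviour: pass the raw message through untouched
        return raw_message


class CSVToKVHandler(BaseMessageHandler):
    """Example handler: turn a CSV payload into key=value pairs."""

    def handle(self, raw_message):
        fields = self.params.get("fields", "").split(",")
        values = raw_message.split(",")
        return " ".join(f"{k}={v}" for k, v in zip(fields, values))


# usage sketch
handler = CSVToKVHandler(fields="host,metric,value")
print(handler.handle("web01,cpu,42"))  # host=web01 metric=cpu value=42
```

The point of the pattern is that all format-specific logic lives in the handler, so new data scenarios can be supported without touching the core input.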
» Continue reading

Splunk at Dynatrace PERFORM

This week, Splunk will be participating at Dynatrace PERFORM – the annual event for Dynatrace APM users. Not only is Dynatrace the largest APM vendor by market share, but we also know that many people are getting value by connecting Dynatrace APM with Splunk. The Dynatrace APM App for Splunk has nearly 1,500 downloads!

We’ll be at Dynatrace PERFORM largely to share with attendees what exactly Splunk is and how it can complement the capabilities found in Dynatrace’s products.

In addition to the “lightning talks” we’ll be delivering throughout Wednesday and Thursday, I’m honored to be on a New Stack panel on Thursday October 15 at 4:30pm, along with representatives from AWS, NGINX, Ansible and NodeSource.

Bringing Dynatrace and Splunk together provides a complete view of your applications.

» Continue reading

Achieving scale with the Kafka Modular Input

A hot topic in my inbox over recent months has been how to achieve scalability with the Kafka Modular Input, primarily in terms of message throughput. I get a lot of emails from users and our own internal Splunk team about this, so rather than continuing to dish out the same replies, I thought I’d pen a short blog to share some tips and tricks.

So let’s start off with this simple scenario:

  • a single instance of Splunk 6.3
  • the freely available Kafka Modular Input from Splunkbase, downloaded and installed

These are the scaling steps that I would try in order.

Enable HTTP Event Collector output

With the recent release of Splunk 6.3, …
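The idea behind this step is to have the Modular Input deliver its messages over HEC, where many Kafka messages can be batched into a single HTTP request instead of being written out one at a time. A minimal sketch of that batching pattern follows; the URL, token, and sourcetype are placeholders.

```python
import json
import requests

# Placeholders: substitute your own HEC endpoint and token.
HEC_URL = "https://splunk.example.com:8088/services/collector"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"


def send_batch(events, sourcetype="kafka:message"):
    """POST a batch of events in a single HEC request.

    HEC accepts multiple JSON event objects concatenated in one body,
    which is far cheaper than one HTTP round trip per Kafka message.
    """
    body = "".join(
        json.dumps({"event": e, "sourcetype": sourcetype}) for e in events
    )
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": "Splunk " + HEC_TOKEN},
        data=body,
        verify=False,  # self-signed certs are common on test instances
    )
    resp.raise_for_status()


# usage sketch: flush consumed Kafka messages in chunks
send_batch(["message one", "message two", "message three"])
```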

» Continue reading

Scheduled Export of Indexed Data

I’m really enjoying playing with all the new Developer hooks in Splunk 6.3 such as the HTTP Event Collector and the Modular Alerts framework. My mind is veritably fizzing with ideas for new and innovative ways to get data into Splunk and build compelling new Apps.

When 6.3 was released at our recent Splunk Conference, I also released a new Modular Alert for sending SMS alerts using Twilio, which is very useful in its own right but is also a really nice, simple example for developers to reference when creating their own Modular Alerts.

But getting under the hood of the Modular Alerts framework also got me thinking about other ways to utilise Modular Alerts to fulfill other use …
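One such use, hinted at by this post's title, is a modular alert attached to a scheduled search that exports the triggering search's results. The sketch below is a rough illustration only: it assumes the standard custom alert action invocation (the script is called with --execute and a JSON payload on stdin), and the payload field names and destination path should be treated as assumptions to verify against your Splunk version's documentation.

```python
import gzip
import json
import shutil
import sys


def export_results(payload, dest="/tmp/export.csv"):
    """Copy the triggering search's results out of Splunk.

    The payload's results_file field (assumed name) points at a gzipped
    CSV of the scheduled search's results.
    """
    results_file = payload["results_file"]
    with gzip.open(results_file, "rb") as src, open(dest, "wb") as out:
        shutil.copyfileobj(src, out)


if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "--execute":
        export_results(json.load(sys.stdin))
```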

» Continue reading

Notes From Splunk .conf 2015 Day Two

The Search party last night was a blast, but today it was back to business. And Day 2 of the global Splunk user group, .conf2015, was another excellent day.

I started with some good mates from the industry analyst community, talking Splunk IT Service Intelligence (ITSI) over breakfast. I gained intriguing insights into our customers and our market, and came away with all sorts of possible new use cases for ITSI.

But as Steve Jobs said, innovation sometimes means saying ‘no’ to a thousand good ideas, so for now we are going to focus on fulfilling the enormous early demand from our customers for POCs. Still, we are always looking for new ideas from our customers and partners (and analysts too!), …

» Continue reading

Turbo charging Modular Inputs with the HEC (HTTP Event Collector) Input

HTTP Event Collector (HEC)

Splunk 6.3 introduces a new high-performance data input option for developers to send event data directly to Splunk over HTTP(S). This is called the HTTP Event Collector (HEC).

In a nutshell, the key features of HEC are:

  • Send data to Splunk via HTTP/HTTPS
  • Token-based authentication
  • JSON payload grammar
  • Acknowledgment of sent events
  • Support for sending batches of events
  • Keep-alive connections

A typical use case for HEC would be a developer wanting to send application events to Splunk directly from their code in a manner that is highly performant and scalable, and that avoids having to write to a file monitored by a Universal Forwarder.
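As a hedged sketch of that use case (the URL, token, and sourcetype are placeholders, and a requests Session is just one way to get keep-alive connections):

```python
import json
import requests

# Placeholders: substitute your own Splunk host and HEC token.
HEC_URL = "https://splunk.example.com:8088/services/collector"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

# A Session re-uses the underlying TCP connection (keep-alive), which
# matters when an application logs events frequently.
session = requests.Session()
session.headers.update({"Authorization": "Splunk " + HEC_TOKEN})


def log_event(event, sourcetype="my_app:event"):
    """Send one application event straight to Splunk over HTTPS."""
    resp = session.post(
        HEC_URL,
        data=json.dumps({"event": event, "sourcetype": sourcetype}),
        verify=False,  # adjust for your certificate setup
    )
    resp.raise_for_status()


log_event({"action": "user_login", "user": "jane", "status": "ok"})
```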

But I have another use case …

» Continue reading

SMS Alerting from Splunk with Twilio

Modular Alerts

With the release of Splunk 6.3 comes an exciting new feature called Modular Alerts.

Historically, the alerting actions in Splunk have been limited to email and RSS; if you wanted to perform some custom alerting functionality, you could execute a Custom Script.

Whilst many Splunk Ninjas over the years have accomplished all sorts of amazing Kung Fu by wrangling custom alerting scripts, they are ultimately not the optimal approach for users and developers:

  • manual setup
  • no configuration interface
  • need file system access
  • loosely coupled to Splunk
  • no common development or packaging standard

So what if you want more alerting actions that you can plug in and present as first-class alerting actions in your Splunk instance?

Well …
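Here is a rough, hedged sketch of the general shape a modular alert action script takes; it is not the actual Twilio Modular Alert source. It assumes the standard custom alert action invocation (--execute with a JSON payload on stdin), uses Twilio's public Messages REST endpoint, and treats every credential, number, and configuration field name as a placeholder.

```python
import json
import sys

import requests


def send_sms(payload):
    """Send an SMS via Twilio when the alert fires."""
    cfg = payload["configuration"]  # alert action parameters (assumed names)
    account_sid = cfg["account_sid"]
    auth_token = cfg["auth_token"]
    requests.post(
        "https://api.twilio.com/2010-04-01/Accounts/%s/Messages.json" % account_sid,
        auth=(account_sid, auth_token),  # HTTP basic auth, per Twilio's REST API
        data={
            "From": cfg["from_number"],
            "To": cfg["to_number"],
            "Body": "Splunk alert fired: " + payload.get("search_name", "unknown"),
        },
    ).raise_for_status()


if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "--execute":
        send_sms(json.load(sys.stdin))
```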

» Continue reading

Using The SplunkJS Stack – Part 1

I’ve recently helped a customer integrate the SplunkJS stack into their own custom web application. I wanted to spread the knowledge so others could learn as well.

What is the SplunkJS stack, you ask? The SplunkJS stack is a component of the Splunk Web Framework that allows web developers to create apps in their own development environment, with the ability to access and manipulate Splunk data. This gives you greater flexibility over the look and feel of your app, including the use of third-party visualization tools like D3 and Keylines.

This blog post will be a three-part series in which I cover the following topics in detail:

  • Authentication to Splunk using a local proxy or CORS (covered in …

» Continue reading

Collecting docker logs and stats with Splunk

I’m working at Splunk, but these are my personal thoughts. I obviously have some knowledge about Splunk, but you should not consider this an official Splunk manual. Everything I did here, I did only for my personal needs and in my free time.

You cannot really feel safe about the services you run if you don’t monitor them. There are plenty of great tools that allow you to monitor your docker environments, like cadvisor and various cloud solutions. I did not want to use a cloud solution, because it could also upload sensitive information, like environment variables, where I could keep passwords for AWS backups. So I wanted to use something like cadvisor, but with historical information and …
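For flavour, here is a small sketch using the Docker SDK for Python to grab one stats snapshot and a short log tail for each running container. Where the data is shipped afterwards (HEC, a file, a forwarder-monitored directory) is left open, and note the package has been published as both docker-py and docker over the years.

```python
import docker  # pip install docker

client = docker.from_env()

for container in client.containers.list():
    stats = container.stats(stream=False)  # single JSON stats snapshot
    logs = container.logs(tail=10)         # last 10 log lines, as bytes
    print(
        container.name,
        stats["memory_stats"].get("usage"),
        logs.decode("utf-8", errors="replace"),
    )
```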

» Continue reading

Raise a Glass to Splunk Apptitude Winners

With the grand prize of $100,000 being awarded in the Fraud and Insider Threat category, it was only appropriate to announce the winners at Black Hat 2015 – one of the largest security conferences in the world. Not all of the winners could make it on such short notice – they were coming from all over the globe – but one even sent a video from the peaks of the Swiss Alps.

We received a great mix of submissions from customers, partners, and even some Splunk newbies. This really was a great showing of the breadth and varied experience of our users and developer community, as well as the creativity that can only come from such a varied range of experience and locations. …

» Continue reading