Kaufland DevSummit2016 – Splunk for DevOps – Faster Insights, better code

The first DevSummit event was recently hosted by Kaufland with 200 people attending for the day to hear presentations about the “World of API”, discuss the latest best practice developments and build ideas in a hackathon. One highlight was the keynote from Markus Andrezak on how technology, business and innovation play together.

Of course, a team of Splunkers (big thanks to my colleagues Mark and Henning) wouldn’t miss such an event and got involved with a booth as well as a presentation. It was amazing to have so many fruitful discussions about how to make data more easily accessible and usable for business, development and operations teams. In the morning, Joern Wanke from the Kaufland Omnichannel team presented on how …

» Continue reading

Easily Create Mod Inputs Using Splunk Add-on Builder 2.0 – Part IV

Add-on Builder 2.0 provides capabilities to build modular inputs without writing any code. In this post however, we focus on using an advanced feature of Splunk’s Add-on Builder 2.0 to write custom python while taking advantage of its powerful helper functions.

NB: Future versions of Add-on Builder will obviate the need for some of the techniques mentioned below, most notably techniques in step #6 & step #8.

There is a veritable cornucopia of useful resources for building modular inputs at docs.splunk.com, dev.splunk.com, blogs.splunk.com, and more. This post certainly isn’t meant to replace those. No no, this post will simply walk you through leveraging Splunk Add-on Builder 2.0 to create custom code to query an API.
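
To give a flavor of what that custom code looks like, here is a minimal, hypothetical sketch in the style of the input module that Add-on Builder generates. The parameter name, the endpoint and the exact helper signatures (helper.get_arg, helper.send_http_request, helper.new_event, ew.write_event) should be treated as illustrative rather than authoritative; adapt them to the stub the builder creates for your input.

```python
# Hypothetical sketch in the style of an Add-on Builder generated input module.
# Helper method names/signatures are assumptions based on the generated stub;
# verify against the code Add-on Builder produces for your add-on.

def collect_events(helper, ew):
    # Read a parameter defined in the Add-on Builder UI
    # ('api_endpoint' is a made-up parameter name).
    endpoint = helper.get_arg('api_endpoint')

    # Query the API via the builder's HTTP helper.
    response = helper.send_http_request(endpoint, "GET", timeout=30, use_proxy=False)
    response.raise_for_status()

    # Emit one Splunk event per record returned by the API.
    for record in response.json():
        event = helper.new_event(
            data=str(record),
            sourcetype=helper.get_sourcetype(),
            index=helper.get_output_index(),
        )
        ew.write_event(event)
```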

In this post we will create a …

» Continue reading

Docker 1.13 with improved Splunk Logging Driver

The evolution of Splunk and Docker continues! In the early days (2015) of Splunk and Docker, we recommended using the native syslog logging driver in Docker Engine. In February 2016, Docker 1.10 shipped with the first version of the Splunk Logging Driver, which we contributed. Since that first release we have seen huge adoption. After reviewing feedback and thinking about what is needed for Splunk environments with Docker, we’ve added a bunch of new features!
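
As a quick illustration of wiring a container to the Splunk Logging Driver, here is a small sketch using the Docker SDK for Python (the docker package). The HEC URL, token and image are placeholders, and the options shown (splunk-url, splunk-token, splunk-format) are only a subset of the driver options; check the Docker documentation for the full list available in 1.13.

```python
# Sketch only: start a container with the Splunk logging driver via the Docker
# SDK for Python. URL, token and image are placeholders.
import docker
from docker.types import LogConfig

client = docker.from_env()

log_config = LogConfig(
    type="splunk",
    config={
        "splunk-url": "https://splunk.example.com:8088",          # HEC endpoint (placeholder)
        "splunk-token": "00000000-0000-0000-0000-000000000000",   # HEC token (placeholder)
        "splunk-format": "json",                                  # one of the newer format options
    },
)

# Roughly equivalent to `docker run --log-driver=splunk --log-opt ...`
client.containers.run("nginx:latest", detach=True, log_config=log_config)
```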

When I wrote this blog post, Docker 1.13 was still in Release Candidate stage. If …

» Continue reading

Announcing new AWS Lambda Blueprints for Splunk

Splunk and Amazon Web Services (AWS) are continuously collaborating to drive customer success by leveraging both the agility of AWS and the visibility provided by Splunk. To support that goal, we’re happy to announce new AWS Lambda blueprints to easily stream valuable logs, events and alerts from over 15 AWS services into Splunk to help customers gain critical security and operational insights.
With a point-and-click setup, you can use these blueprints to have Splunk ingest data from AWS services such as Kinesis Stream, CloudWatch Logs, DynamoDB Stream and IoT for further data processing & analytics, in addition to logging AWS Lambda itself for instrumentation & troubleshooting.

Once a Lambda blueprint is configured, events are automatically forwarded by Lambda to Splunk in near real time.
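
To make the forwarding pattern concrete, here is a minimal, hypothetical sketch of a Lambda handler that posts incoming records to a Splunk HTTP Event Collector (HEC) endpoint. It is not the blueprint code itself (the published blueprints use Node.js); the environment variable names and the aws:lambda sourcetype are placeholders.

```python
# Illustrative only -- not the published blueprint code. Posts incoming Lambda
# records to Splunk HEC. SPLUNK_HEC_URL / SPLUNK_HEC_TOKEN are made-up env vars.
import json
import os
import urllib.request

def lambda_handler(event, context):
    url = os.environ["SPLUNK_HEC_URL"]      # e.g. https://host:8088/services/collector/event
    token = os.environ["SPLUNK_HEC_TOKEN"]

    # Wrap each incoming record in the HEC event envelope; HEC accepts
    # concatenated JSON event objects in a single request body.
    records = event.get("Records", [event])
    payload = "".join(
        json.dumps({"event": record, "sourcetype": "aws:lambda"}) for record in records
    )

    request = urllib.request.Request(
        url,
        data=payload.encode("utf-8"),
        headers={
            "Authorization": "Splunk " + token,
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        return {"status": response.status}
```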

» Continue reading

Splunk Challenge 2016 – Catch ’em all at Nanyang Polytechnic!

Splunk Challenge 2016, the annual Splunk challenge that many NYP students have been waiting for, is here! Today, the students will pit the analytics skills they have learned using Splunk against each other as they compete for a chance to take home some great prizes.

Unlike past years, when the students were tasked with looking into business and IT operations data, this year the lecturer suggested analyzing “Pokemon” data for the challenge. As the market leader in the data analytics space, this is not only important, it also speaks to our core values of keeping what we do fun and innovative, so that we will not only be able to …

» Continue reading

Personal Dev/Test Licenses give you the freedom to explore

Do you have a new use case to validate? Untapped data sources to investigate? Wouldn’t it be great to explore how Splunk might help other parts of your organization? All without impacting your production systems and license usage…

Free Personal Dev/Test Licenses

At .conf2016 in September, CEO Doug Merritt was clear that we want to make it easier for you to use Splunk across your business. Enforced metering is gone. And exploring new use cases should be hassle-free.

So now any employee of a Splunk Enterprise or Splunk Cloud customer can get a free, personalized Splunk Enterprise Dev/Test software license. Each license is valid for up to 50 GB of daily data ingestion and a six-month renewable term, giving you ample power and time to …

» Continue reading

Event Calendar Custom Visualization

A while back, I wrote a blog post about using a custom calendar visualization in Simple XML dashboards. To accomplish this, I used a technique sometimes referred to as escape hatching JavaScript into Simple XML. While this works okay for a developer, the technique does not lend itself well to the end user.

Splunk Custom Visualizations

Splunk 6.4 introduced reusable custom visualizations, which allow a developer to package up a visualization and integrate it into Splunk just like the native visualizations. This also addresses the limitation mentioned above – meaning any end user can use the visualization without mucking around with the Simple XML.

So, revisiting the older escape hatch calendar technique, I thought it would be a good …

» Continue reading

Splunking Kafka At Scale

At Splunk, we love data and we’re not picky about how you get it to us. We’re all about being open, flexible and scaling to meet your needs. We realize that not everybody has the need or desire to install the Universal Forwarder to send data to Splunk. That’s why we created the HTTP Event Collector. This has opened the door to getting a cornucopia of new data sources into Splunk, reliably and at scale.

We’re seeing more customers in Major Accounts looking to integrate their Pub/Sub message brokers with Splunk. Kafka is the most popular message broker that we’re seeing out there but Google Cloud Pub/Sub is starting to make some noise. I’ve been asked multiple times for guidance …
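
As a rough sketch of the pattern (not the full guidance from the post), the snippet below consumes from a Kafka topic and batches events to HEC. It assumes the kafka-python and requests packages; the broker, topic, URL and token values are placeholders.

```python
# Rough sketch (not the post's full architecture): read a Kafka topic and batch
# events to Splunk's HTTP Event Collector. Assumes the kafka-python and requests
# packages; broker, topic, URL and token are placeholders.
import json
import requests
from kafka import KafkaConsumer

HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"   # placeholder HEC token

consumer = KafkaConsumer(
    "my-topic",                                       # placeholder topic name
    bootstrap_servers=["kafka.example.com:9092"],
    value_deserializer=lambda value: value.decode("utf-8"),
)

batch = []
for message in consumer:
    batch.append({"event": message.value, "sourcetype": "kafka:my-topic"})
    if len(batch) >= 100:                             # flush in batches of 100 events
        body = "".join(json.dumps(item) for item in batch)
        requests.post(
            HEC_URL,
            data=body,
            headers={"Authorization": "Splunk " + HEC_TOKEN},
        ).raise_for_status()
        batch = []
```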

» Continue reading

How to: Splunk Analytics for Hadoop on Amazon EMR

Using Amazon EMR and Splunk Analytics for Hadoop to explore, analyze and visualize machine data

Machine data can take many forms and comes from a variety of sources: system logs, application logs, service and system metrics, sensor data, etc. In this step-by-step guide, you will learn how to build a big data solution for fast, interactive analysis of data stored in Amazon S3 or Hadoop. This hands-on guide is useful for solution architects, data analysts and developers.

This guide will walk you through the following steps (a rough code sketch of the first step follows the list):

  1. Set up an EMR cluster
  2. Set up a Splunk Analytics for Hadoop node
  3. Connect to data in your S3 buckets
  4. Explore, visualize and report on your data
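
As a rough illustration of the first step, here is a hedged boto3 sketch that creates a small EMR cluster; the guide itself may use the AWS console, and the region, release label, instance types, counts and key pair name below are placeholders to adjust for your own environment.

```python
# Rough sketch of step 1 only: create a small EMR cluster with boto3. Region,
# release label, instance types/counts and key pair name are placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="splunk-analytics-for-hadoop",
    ReleaseLabel="emr-5.2.0",                # pick a current EMR release
    Applications=[{"Name": "Hadoop"}],
    Instances={
        "MasterInstanceType": "m4.large",
        "SlaveInstanceType": "m4.large",
        "InstanceCount": 3,
        "KeepJobFlowAliveWhenNoSteps": True,
        "Ec2KeyName": "my-key-pair",         # placeholder EC2 key pair
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)

print("Cluster id:", response["JobFlowId"])
```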

You will need:

  1. An Amazon EMR Cluster
  2. A Splunk Analytics for Hadoop Instance
  3. Amazon

» Continue reading

Creating McAfee ePO Alert and ARF Actions with Add-On Builder

One of the best things about Splunk is the passionate user community. As a group, the community writes amazing Splunk searches, crafts beautiful dashboards, answers thousands of questions, and shares apps and add-ons with the world.

Building high-quality add-ons is perhaps one of the more daunting ways to contribute. Since the recently updated Splunk Add-on Builder 2.0 was released, however, it’s never been easier to build, test, validate and package add-ons for sharing on SplunkBase.

Technical Add-Ons, aka TAs, are specialized Splunk apps that make it easy for Splunk to ingest data, extract and calculate field values, and normalize field names against the Common Information Model (CIM). Since the release of version 6.3, Splunk Enterprise also supports TAs for …

» Continue reading