Docker 1.13 with improved Splunk Logging Driver

The evolution of Splunk and Docker continues! In the early days of Splunk and Docker (2015), we recommended using the native syslog logging driver in Docker Engine. In February 2016, Docker 1.10 came out and we contributed the first version of the Splunk Logging Driver to it. Since that first release we have seen huge adoption. After reviewing feedback and thinking about what Splunk environments with Docker need, we’ve added a bunch of new features!

When I wrote this blog post, Docker 1.13 was still in the Release Candidate stage. If …

» Continue reading

Announcing new AWS Lambda Blueprints for Splunk

Splunk and Amazon Web Services (AWS) are continuously collaborating to drive customer success by leveraging both the agility of AWS and the visibility provided by Splunk. To support that goal, we’re happy to announce new AWS Lambda blueprints to easily stream valuable logs, events and alerts from over 15 AWS services into Splunk to help customers gain critical security and operational insights.
With a point-and-click setup, you can use these blueprints to have Splunk ingest data from AWS services such as Kinesis Streams, CloudWatch Logs, DynamoDB Streams and IoT for further data processing & analytics, in addition to logging AWS Lambda itself for instrumentation & troubleshooting.

Once a Lambda blueprint is configured, events are automatically forwarded by Lambda to Splunk in near real-time.
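
To make that flow concrete, here is a rough, hypothetical Python sketch of the forwarding pattern (this is not the actual blueprint code); the HEC URL, token and field mapping below are placeholders you would replace with your own:

```python
# Hypothetical sketch of the forwarding pattern, not the published blueprint code.
# SPLUNK_HEC_URL and SPLUNK_HEC_TOKEN are placeholders for your own HEC endpoint and token.
import base64
import gzip
import json
import urllib.request

SPLUNK_HEC_URL = "https://splunk.example.com:8088/services/collector/event"
SPLUNK_HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def lambda_handler(event, context):
    # CloudWatch Logs subscriptions deliver a base64-encoded, gzipped JSON payload
    payload = json.loads(
        gzip.decompress(base64.b64decode(event["awslogs"]["data"])).decode("utf-8")
    )

    for log_event in payload.get("logEvents", []):
        body = json.dumps({
            "time": log_event["timestamp"] / 1000.0,   # HEC expects epoch seconds
            "source": payload.get("logGroup", "cloudwatch"),
            "event": log_event["message"],
        }).encode("utf-8")

        req = urllib.request.Request(
            SPLUNK_HEC_URL,
            data=body,
            headers={"Authorization": "Splunk " + SPLUNK_HEC_TOKEN,
                     "Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)  # in practice, batch events and handle errors/retries

    return {"forwarded": len(payload.get("logEvents", []))}
```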

» Continue reading

Splunk Challenge 2016 – Catch ’em all at Nanyang Polytechnic!

Splunk Challenge 2016, the annual Splunk challenge that many NYP students have been waiting for, is here! Today, the students will be pitting their Splunk analytics skills against each other as they compete for a chance to take home some great prizes.

Unlike past years, when students were tasked with looking into business and IT operations data, this year the lecturer suggested analyzing “Pokemon” data for the challenge. As the market leader in the data analytics space, it is important to us, and core to our values, to keep what we are doing fun and innovative so that we will not only be able to …

» Continue reading

Personal Dev/Test Licenses give you the freedom to explore

Do you have a new use case to validate? Untapped data sources to investigate? Wouldn’t it be great to explore how Splunk might help other parts of your organization? All without impacting your production systems and license usage…

Free Personal Dev/Test Licenses

At .conf2016 in September, CEO Doug Merritt was clear that we want to make it easier for you to use Splunk across your business. Enforced metering is gone. And exploring new use cases should be hassle-free.

So now any Splunk Enterprise or Splunk Cloud customer employee can get a free personalized Splunk Enterprise Dev/Test software license. Each license is valid for up to 50 GB daily data ingestion and a six-month renewable term, giving you ample power and time to …

» Continue reading

Event Calendar Custom Visualization

A while back, I wrote a blog post about using a custom calendar visualization in Simple XML dashboards. To accomplish this, I used a technique sometimes referred to as “escape hatching” JavaScript into Simple XML. While this works okay for a developer, the technique does not lend itself well to the end user.

Splunk Custom Visualizations

Splunk 6.4 introduced reusable custom visualizations, which allow a developer to package up a visualization and integrate it into Splunk just like the native visualizations. This also addresses the limitation mentioned above: any end user can use the visualization without mucking around with the Simple XML.

So, revisiting the older escape hatch calendar technique, I thought it would be a good …

» Continue reading

Splunking Kafka At Scale

At Splunk, we love data and we’re not picky about how you get it to us. We’re all about being open, flexible and scaling to meet your needs. We realize that not everybody has the need or desire to install the Universal Forwarder to send data to Splunk. That’s why we created the HTTP Event Collector. This has opened the door to getting a cornucopia of new data sources into Splunk, reliably and at scale.

We’re seeing more customers in Major Accounts looking to integrate their Pub/Sub message brokers with Splunk. Kafka is the most popular message broker that we’re seeing out there, but Google Cloud Pub/Sub is starting to make some noise. I’ve been asked multiple times for guidance …
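
As a rough illustration of that pattern, here is a minimal, hypothetical Python sketch (the broker address, topic name, HEC URL and token are placeholders; it assumes the kafka-python and requests libraries) that consumes from a Kafka topic and posts each message to the HTTP Event Collector:

```python
# Minimal sketch of a Kafka -> Splunk HEC consumer; not a production-ready pipeline.
# Broker, topic, HEC URL and token are placeholders for your own environment.
import requests                      # pip install requests
from kafka import KafkaConsumer      # pip install kafka-python

HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

consumer = KafkaConsumer(
    "my-topic",
    bootstrap_servers=["kafka.example.com:9092"],
    value_deserializer=lambda v: v.decode("utf-8"),
)

session = requests.Session()
session.headers.update({"Authorization": "Splunk " + HEC_TOKEN})

for message in consumer:
    # One HEC event per Kafka message; real deployments would batch and retry.
    resp = session.post(
        HEC_URL,
        json={"event": message.value, "sourcetype": "kafka:my-topic"},
        verify=False,  # assumes a self-signed HEC certificate; verify properly in production
    )
    resp.raise_for_status()
```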

» Continue reading

How to: Splunk Analytics for Hadoop on Amazon EMR

Using Amazon EMR and Splunk Analytics for Hadoop to explore, analyze and visualize machine data

Machine data can take many forms and comes from a variety of sources: system logs, application logs, service and system metrics, sensor data, etc. In this step-by-step guide, you will learn how to build a big data solution for fast, interactive analysis of data stored in Amazon S3 or Hadoop. This hands-on guide is useful for solution architects, data analysts and developers.

This guide will see you:

  1. Set up an EMR cluster (a quick boto3 sketch follows this list)
  2. Set up a Splunk Analytics for Hadoop node
  3. Connect to data in your S3 buckets
  4. Explore, visualize and report on your data
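
As a minimal, hypothetical sketch of step 1 (the cluster name, EMR release label, instance types and IAM roles are placeholders, not values from the guide), an EMR cluster can be launched with boto3:

```python
# Hypothetical boto3 sketch for launching a small EMR cluster (step 1);
# name, release label, instance types and IAM roles are placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="splunk-analytics-for-hadoop-demo",
    ReleaseLabel="emr-5.2.0",
    Applications=[{"Name": "Hadoop"}],
    Instances={
        "MasterInstanceType": "m4.large",
        "SlaveInstanceType": "m4.large",
        "InstanceCount": 3,
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)

print("Cluster ID:", response["JobFlowId"])
```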

You will need:

  1. An Amazon EMR Cluster
  2. A Splunk Analytics for Hadoop Instance
  3. Amazon
» Continue reading

Creating McAfee ePO Alert and ARF Actions with Add-On Builder

One of the best things about Splunk is the passionate user community. As a group, the community writes amazing Splunk searches, crafts beautiful dashboards, answers thousands of questions, and shares apps and add-ons with the world.

Building high quality add-ons is perhaps one of the more daunting ways to contribute. Since the release of the recently updated Splunk Add-on Builder 2.0, however, it’s never been easier to build, test, validate and package add-ons for sharing on SplunkBase.

Technical Add-Ons, aka TAs, are specialized Splunk apps that make it easy for Splunk to ingest data, extract and calculate field values, and normalize field names against the Common Information Model (CIM). Since the release of version 6.3, Splunk Enterprise also supports TAs for …

» Continue reading

Important information for customers using Splunk Enterprise 6.2 or earlier

Do you use SSL to secure Splunk Enterprise? Are you still using Splunk Enterprise version 6.2 or earlier? If you answered yes to both of these questions, please read on.

Securing communication with your Splunk instance can be essential in today’s digital environment, especially if it is collecting sensitive information. If communication to and from your Splunk instance can be easily intercepted (e.g., public access to Splunk Web, forwarders outside the firewall), then it should be encrypted using SSL. Additionally, security functionality is constantly being enhanced to combat the evolving threat landscape, so you should stay on as current a version of Splunk as possible.

You may have heard that the OpenSSL Software Foundation will cease support for OpenSSL version 1.0.1 as …

» Continue reading

Building add-ons just got 2.0 times easier

Are you trying to build ES Adaptive Response actions or alert actions and need some help? Are you trying to validate your add-on to see if it is ready to submit for certification? Are you grappling with your add-on setup page and credential encryption? If you are, check out Splunk Add-on Builder 2.0.

Below is a brief overview of what’s new in Add-on Builder 2.0:

  • You can now leverage the easy-to-use, step-by-step workflow in Add-on Builder to create alert actions and ES adaptive response actions. No need to deal with .conf files and Python; let the tool do the work for you.

  • The validation process has been enhanced to include App Certification readiness. This validation process can also be performed on apps and add-ons
» Continue reading