My Experience Building a Splunk Application

I joined Splunk a couple of weeks ago, and my first challenge was to learn everything I could about how to build Splunk applications. The best way to do that is to write your own application, and that is exactly what I did.

The application I wrote has two parts. The first is a very simple scripted input for Firebase; the second, built with the Splunk Web Framework, shows objects and their routes on Google Maps using either real-time data or playback of historical data.

I hope that my experience can give you some thoughts about how you can extend Splunk for your needs.…

» Continue reading

Updated RSS Input (Java Version)

Last year, I put out a Java version of an RSS input program that was built on included open source code to parse RSS. It used the beta version of the Splunk Java SDK to check for duplicates, making sure the same RSS link had not already been indexed into Splunk within a reasonable time period. With the GA release of the Splunk Java SDK, I updated the contents on Splunkbase to include the GA Splunk Java SDK jar file and also used a more efficient way to check for a duplicate entry. You can download the distribution on Splunkbase.

To recap, the distribution uses a scripted input to index the contents of configurable RSS feeds every configurable N seconds. You can …
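The Java version checks for duplicates by querying Splunk through the SDK. As a simpler illustration of the same idea, here is a minimal Python sketch that dedupes RSS links against a local checkpoint file before emitting events; the file name and item shape are assumptions for illustration, not the actual implementation.

```python
import json
import os

CHECKPOINT = "rss_seen.json"  # hypothetical checkpoint file name

def load_seen(path=CHECKPOINT):
    """Load the set of RSS links already emitted to Splunk."""
    if os.path.exists(path):
        with open(path) as f:
            return set(json.load(f))
    return set()

def dedup(items, seen):
    """Return only items whose link has not been seen yet,
    updating the seen set in place."""
    fresh = []
    for item in items:
        if item["link"] not in seen:
            seen.add(item["link"])
            fresh.append(item)
    return fresh
```

A scripted input built this way would load the checkpoint, print only the fresh items to standard output, and write the updated set back before exiting.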

» Continue reading

How’s Traffic?

The title of this post may lead many of you to assume that I am referring to network traffic. However, today’s topic is monitoring vehicular traffic incidents, or what most of us simply call accidents. I found a feed that lists recent incidents for a known US city when the city is used as the last part of the URL. The information returned gives the jam factor (how crowded the roads are), the severity of the incident, and its location. Armed with this information, I created a Splunk app around it and put it on Splunkbase for you to use. Instructions are provided on which text file to update to add or delete the cities you …
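The post does not name the feed provider, so the base URL and field names below are hypothetical; this is only a sketch of the shape such a scripted input could take, with the city appended as the last part of the URL and each incident rendered as a key=value line for Splunk to index.

```python
# Sketch of a scripted input for a traffic incident feed. The base
# URL and JSON field names are hypothetical stand-ins; adapt them to
# the real API, which the post does not name.
BASE_URL = "https://example.com/traffic/incidents/"  # hypothetical

def city_url(city):
    """The city is used as the last part of the URL."""
    return BASE_URL + city.lower().replace(" ", "%20")

def format_event(incident):
    """Render one incident as a key=value line for Splunk to index."""
    fields = ("jam_factor", "severity", "location")
    return " ".join('%s="%s"' % (k, incident[k]) for k in fields if k in incident)

# A real __main__ would fetch city_url(city) with urllib.request and
# print format_event(...) for each incident in the response.
```

Because each line is self-describing key=value text, Splunk's automatic field extraction picks up jam_factor, severity, and location with no extra configuration.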

» Continue reading

Analyzing Flurry Data

Have a mobile app that sends data to Flurry? Would you like to do some custom analysis on that data? Splunk to the rescue!

The new Flurry App for Splunk provides a scripted input that automatically extracts events from an existing Flurry account.…

» Continue reading

ZeroMQ as a Splunk Input

Occasionally, people ask me how to get a message queue technology such as JMS to deliver its messages into Splunk. I point them to the approach I put up on Splunkbase, where a JMS listener is called by Splunk as a scripted input and dequeues messages that are put on queues of interest. Obviously, after a message is dequeued, it is meant to go into Splunk in this case; no other business application would have subsequent access to the same message on the same queue. Therefore, if you want to use a pure messaging system that is not part of your application to send time series messages to Splunk, this is not the approach you should be …

» Continue reading

Identifying Phishing Sites in Your Events

Recently, I thought I had been caught in a phishing scheme: I created an account on an e-commerce site to check out, and as soon as I clicked the checkout button, it asked me to log onto a well-known site. It turned out that the original site was badly implemented and should have told users that it is affiliated with the other site. Nevertheless, I went to PhishTank to make sure that no one had complained about the original e-commerce site.

This got me thinking: since phishing occurs all too often, there must be a way for corporations to verify that their users are not going to phishing sites and, if they are, to know about …
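One way to do this kind of verification is to check the URLs appearing in indexed events against a blocklist built from a feed such as PhishTank's. The sketch below is a simplification under assumed formats (one URL or hostname per blocklist line, whitespace-separated event text), not the actual app.

```python
# Sketch: flag events whose URLs point at a blocklisted phishing host.
# The blocklist format (one URL or bare hostname per line) is a
# simplification of what a real feed like PhishTank provides.
from urllib.parse import urlparse

def load_blocklist(lines):
    """Normalize blocklist entries to bare hostnames."""
    return {urlparse(line.strip()).netloc or line.strip()
            for line in lines if line.strip()}

def is_phishing(url, blocklist):
    """True if the URL's host is on the blocklist."""
    return urlparse(url).netloc in blocklist

def flag_events(events, blocklist):
    """Yield (event, url) pairs for events containing a bad URL."""
    for event in events:
        for word in event.split():
            if word.startswith("http") and is_phishing(word, blocklist):
                yield event, word
                break
```

In Splunk itself, the natural equivalent is a lookup table of known phishing hosts applied to a URL field at search time.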

» Continue reading

Asking Vendors to Make Log Events Accessible

In my last blog entry, I wrote about asking vendors to make their log event formats follow industry best practices. If the log events reside in files or can be broadcast on network ports, it is quite easy to access them with technologies such as Splunk Universal Forwarders. However, if the log events are buried deep within the application, device, or system that created them, then there is one more issue to address in getting to the events: having an accessible transport mechanism, with examples of its usage.

By transport, I obviously am not referring to some futuristic vehicle transportation.

What I am talking about is a way for one computer process to …

» Continue reading

My Data Takes Me Back to HD Videos

Last month I wrote about indexing video feeds, and Vimeo was the site I featured for HD videos. The idea was to use the Vimeo REST API to gather all the metadata about your favorite Vimeo HD video channels and then index it into Splunk for historical lookup, or simply to have it available as a one-stop dashboard where you can not only view the indexed information but also use a workflow action to actually view the video.


Click on Show Video

Then something changed in the REST API I was calling from Python: I started getting one huge line per channel instead of nicely formatted XML. My code had logic to skip …
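A proper XML parser does not care whether a response is pretty-printed or arrives as one huge line, which is one way to make the input robust to this kind of change. The element names below are hypothetical stand-ins, since the post does not show the Vimeo response shape.

```python
# ElementTree parses a one-line payload exactly the same as a
# pretty-printed one. Element names here are hypothetical stand-ins
# for the Vimeo channel response.
import xml.etree.ElementTree as ET

def parse_videos(payload):
    """Extract (title, url) pairs from a channel payload,
    formatted or not."""
    root = ET.fromstring(payload)
    return [(v.findtext("title"), v.findtext("url"))
            for v in root.iter("video")]
```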

» Continue reading

RSS Inputs and Also the Splunk Java SDK

Over the years, some of you may have downloaded from Splunkbase my reference implementation for using a scripted input to index RSS feeds, or read about the topic. The idea is that this input is very low in daily volume (possibly KBs/day as opposed to MBs/day), but it presents many different correlation opportunities from the same Splunk console. It was originally written in Python and used a publicly available open source library to download and parse the RSS feed. The issues I have heard over time are that some people are not allowed to install Python on a forwarder machine, have a version of Python that may not work with the library, or simply have issues with the …

» Continue reading

Indexing Feeds

We often talk about indexing the output of a program or script in Splunk as a universal way to index any type of text data, going beyond monitoring log files. For those of you who may be new to Splunk, the idea behind a scripted input is that every N seconds (configurable), Splunk or a forwarder calls a user-provided script or program, written in any language, to gather data, and the standard output of that script or program is then indexed by Splunk. This is the basis for many of my contributions on Splunkbase, such as indexing RSS feeds. Logically, this is a “pull” approach in that data is accessed by Splunk on a …
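The pull model described above can be sketched in a few lines. This is a minimal illustration, not one of the Splunkbase apps: the script gathers some data (here, just the local clock as a stand-in for a real feed) and prints it to standard output for Splunk to index.

```python
# Minimal scripted input: Splunk runs this every N seconds (the
# interval set in inputs.conf) and indexes whatever it prints to
# standard output. The local clock stands in for a real data source.
import time

def gather_events():
    """Return a list of timestamped key=value event lines."""
    now = time.strftime("%Y-%m-%d %H:%M:%S")
    return ['%s event="heartbeat"' % now]

if __name__ == "__main__":
    for line in gather_events():
        print(line)
```

On the Splunk side, a `[script://...]` stanza in inputs.conf points at the script, and its `interval` setting is the configurable N seconds.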

» Continue reading