Remote Images Retrieval With Splunk Using Custom Command “getimage.py”
Every once in a while my customers ask for functionality that is not natively supported by Splunk. Out of the box, Splunk is a very capable platform; however, there are certain tasks it is not designed for. But that never stops a Splunker from finding a solution! The use case I discuss in this blog is an example of that: the customer owns a large chain of pharmacies across the country, and the bulk of the stores' transactions end up in a Hadoop data lake. The customer wants to use Hunk/Splunk to visualize and analyze the massive amount of information collected, which is something Hunk can do easily. The challenge came about when I was asked whether Splunk could show …
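The actual getimage.py isn't shown here, but the general shape of a legacy Splunk custom search command is simple: the command receives search results as CSV on stdin and writes modified results as CSV to stdout. The sketch below illustrates that protocol with hypothetical field and URL names (`item_id`, `images.example.com` are assumptions, not from the original command):

```python
import csv
import sys

# Hypothetical sketch of a legacy-style Splunk custom search command.
# Legacy commands receive search results as CSV on stdin and emit the
# (possibly enriched) results as CSV on stdout; Splunk wires up the pipes.

IMAGE_BASE_URL = "http://images.example.com"  # assumed image store

def add_image_urls(rows):
    """Attach an image_url field to each result row based on its item_id."""
    out = []
    for row in rows:
        enriched = dict(row)
        item_id = row.get("item_id", "")
        enriched["image_url"] = f"{IMAGE_BASE_URL}/{item_id}.png" if item_id else ""
        out.append(enriched)
    return out

def main(stdin=sys.stdin, stdout=sys.stdout):
    """Read results as CSV, enrich them, and write CSV back out."""
    rows = add_image_urls(csv.DictReader(stdin))
    fields = list(rows[0].keys()) if rows else []
    writer = csv.DictWriter(stdout, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)
```

A real command would fetch or validate the images rather than just construct URLs, but the stdin/stdout CSV contract is the part Splunk cares about.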
Where’s That Command – Converting a Field’s Hexadecimal Value to Binary
When looking through Splunk’s Search Reference Manual, you’ll find a ton of search commands with their syntax, descriptions, and examples. After all, if Splunk is the platform for machine data, there needs to be an extensive list of commands, functions, and references to guide Splunkers through the Search Processing Language (SPL). So you would think we had everything covered, right? Well, almost….
I have a couple of great customers from the Houston, Texas area to thank for this. Gabe and Andrew (you know who you are) are not only strong Splunkers but also regulars at the Splunk Houston User Group (SHUG) meetings, and they are always looking for ways to expand their use of Splunk as well …
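The conversion itself is straightforward once you see it: each hexadecimal digit expands to exactly four binary digits. Here is a minimal sketch of that logic in Python (the function name is mine; the actual add-on's SPL command is not shown here):

```python
def hex_to_binary(hex_value):
    """Convert a hexadecimal field value (e.g. '0x1A' or '1A') to a binary
    string. Each hex digit maps to 4 bits, so the result is zero-padded to
    4 bits per digit to preserve leading zeros from the original value."""
    digits = hex_value.lower()
    if digits.startswith("0x"):
        digits = digits[2:]
    return bin(int(digits, 16))[2:].zfill(4 * len(digits))
```

For example, `hex_to_binary("1A")` yields `"00011010"`: `1` → `0001`, `A` → `1010`. A custom search command would apply this per event to the field in question.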
An Hour of Code with Splunk
The Hour of Code is a global effort to educate children in more than 180 countries with as little as one hour of computer science. Held as part of Computer Science Education Week (December 7-13), the most recent Hour of Code included more than 198,473 events around the world. And this year, several Splunkers taught sessions in events across the country.
Here in the Seattle area, Shakeel Mohamed, one of our engineers, taught sessions on Lightbot and Minecraft at Rainier View Elementary School, and I had the pleasure of teaching an hour on log/time-series data and how to mine it with Splunk to approximately 150 students at Ingraham High School. The courses are a challenging mix of students …
Splunk in Space: Splunking Satellite Data in the Cloud
This year a team of Splunkers attended the ESA App Camp 2015 in lovely Frascati, Italy. The topic of this year’s challenge was:
“There are thousands of ways to enrich apps with data from space – what’s yours?”
The Splunk team featured Robert Fujara and Philipp Drieger alongside camp participants Claire Crotty and Anthony Thomas. Together the team created a mobile web app that accessed a Splunk Cloud instance to analyze geolocation-based satellite data and inform users about different environmental indicators across Europe. Users can input their preferences for a living environment, and based on different indicators they then receive recommendations on which city or region would suit them best.
The key data sources for this project…
Splunk Logging Driver for Docker
With Splunk 6.3 we introduced the HTTP Event Collector, which offers a simple, high-volume way to send events from applications directly to Splunk Enterprise and Splunk Cloud for analysis. The HTTP Event Collector makes it possible to cover more log-collection scenarios, including Docker. Previously, I blogged about using the Splunk Universal Forwarder to collect logs from Docker containers.
Today, following up on Docker’s press release, we’re announcing early availability, in the Docker experimental branch, of a new log driver for Splunk. The driver uses the HTTP Event Collector to allow forwarder-less collection of your Docker logs. If you are not yet familiar with the Event Collector, check out this blog post.
You can get the new Splunk Logging …
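Conceptually, a log driver like this wraps each container log line in the JSON envelope that the HTTP Event Collector endpoint (`/services/collector`) accepts. The envelope keys below (`time`, `event`, `source`, `sourcetype`, `host`) are standard HEC metadata fields, but the exact fields the Docker driver emits may differ; this is a sketch of the general shape, not the driver's implementation:

```python
import json
import time

def build_hec_event(line, source="docker", sourcetype="docker-log",
                    host=None, ts=None):
    """Wrap a raw log line in the JSON envelope accepted by the HTTP
    Event Collector. Default source/sourcetype values here are
    illustrative, not what the Docker log driver actually sends."""
    payload = {
        "time": ts if ts is not None else time.time(),
        "source": source,
        "sourcetype": sourcetype,
        "event": line,
    }
    if host:
        payload["host"] = host
    return json.dumps(payload)
```

Each such payload is then POSTed to the collector endpoint with the HEC token in the `Authorization` header, which is what makes forwarder-less collection possible.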
Using Splunk Archive Bucket Reader with Pig
This is part II in a series of posts about how to use the Splunk Archive Bucket Reader. For information about installing the app and using it to obtain jar files, please see the first post in this series.
In this post I want to show how to use Pig to read archived Splunk data. Unlike Hive, Pig cannot be directly configured to use InputFormat classes. However, Pig provides a Java interface—LoadFunc—that makes it reasonably easy to use an arbitrary InputFormat with just a small amount of Java code. A LoadFunc is provided with Splunk Archive Bucket Reader: com.splunk.journal.hive.JournalLoadFunc. If you would prefer to write your own, you can find more information here.
Whereas Hive closely resembles a …
HTTP Event Collect: a Python Class
(Hi all–welcome to the first of what will be a series of technical blog posts from members of the SplunkTrust, our Community MVP program. We’re very proud to have such a fantastic group of community MVPs, and are excited to see what you’ll do with what you learn from them over the coming months and years.
–rachel perkins, Sr. Director, Splunk Community)
Happy Holidays everyone!
I tend to make new code this time of year. So, I decided to make a Python class after a lovely Thanksgiving with the family.
There is a lot of great content on the HTTP Event Collector thanks to Glenn Block and …
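The author's class isn't reproduced here, but a minimal HEC client takes surprisingly little code. The sketch below uses only the standard library; the endpoint path (`/services/collector`) and the `Authorization: Splunk <token>` header are the documented HEC conventions, while the class and method names are my own, not the API of any Splunk-provided library:

```python
import json
import urllib.request

class HttpEventCollector:
    """Minimal sketch of an HTTP Event Collector client. Class and
    method names are illustrative, not a Splunk-supplied API."""

    def __init__(self, token, server, port=8088):
        self.token = token
        self.url = f"https://{server}:{port}/services/collector"

    def build_request(self, event, sourcetype="_json"):
        """Build (but do not send) the POST request for a single event."""
        body = json.dumps({"event": event, "sourcetype": sourcetype})
        return urllib.request.Request(
            self.url,
            data=body.encode("utf-8"),
            headers={"Authorization": f"Splunk {self.token}",
                     "Content-Type": "application/json"},
            method="POST",
        )

    def send(self, event, **kwargs):
        """POST one event to the collector; returns the HTTP status."""
        with urllib.request.urlopen(self.build_request(event, **kwargs)) as resp:
            return resp.status
```

A production class would add batching, retries, and TLS certificate handling, but this is the core of the protocol: one JSON body, one token header, one POST.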
Splunk Archive Bucket Reader and Hive
This year was my first .conf, and it was an amazingly fun experience! During the keynote, we announced a number of new Hunk features, one of which was the Splunk Archive Bucket Reader. This tool allows you to read Splunk raw data journal files using any Hadoop application that allows the user to configure which InputFormat implementation is used. In particular, if you are using Hunk archiving to copy your indexes onto HDFS, you can now query and analyze the archived data from those indexes using whatever your organization’s favorite Hadoop applications are (e.g. Hive, Pig, Spark). This will hopefully be the first of a series of posts showing in detail how to integrate with these systems. This post is …
Wait, what – a YouTube video for my app!?
At Splunkbase we are constantly striving to improve the experience for our users – whether it’s the app-discovery process for a Splunk admin/user, or the app-submission and management experience for our developers. We’ve been busy making changes over the last few months, and I thought this would be a good time to cover some of the more important changes we’ve made recently.
There was a lot of backend engineering work done to spruce up the infrastructure, the API, and search results relevancy – changes that are not always apparent to an end-user of Splunkbase. However, in this post I will talk about some user-facing features we recently added with the goal of improving the experience for our developer community. These features will allow you to …
Send JSON objects to HTTP Event Collector using our .NET Logging Library
Recently we shipped a bunch of logging libraries at the same time our new HTTP Event Collector hit the streets: http://blogs.splunk.com/2015/10/06/http-event-collector-your-direct-event-pipe-to-splunk-6-3/
One of the questions I’ve heard from customers using the libraries is, “Can I send JSON objects with the .NET logging library?”
Yes, you can. To do it, you need to use our Splunk.Logging.Common library, which our other loggers depend on. Interfaces like TraceListener were designed for sending strings, not objects.
For example, TraceSource has a TraceData method that accepts objects and appears as though it should work. However (at least based on my testing), the objects are serialized to strings and then passed on as such to the listeners. Thus, by the time we get it we …