Protocol Data Inputs
It must have been about a year ago now that I was talking with a Data Scientist at a Splunk Live event about some of the quite advanced use cases he was trying to achieve with Splunk. That conversation seeded some ideas in my mind; they fermented for a while as I toyed with designs, and over the last couple of months I’ve chipped away at creating a new Splunk App, Protocol Data Inputs (PDI).
So what is this all about? Well, to put it quite simply, it is a Modular Input for receiving data via a number of different protocols, with some pretty cool bells and whistles.
So let’s break down some of …
Mobile Analytics with Storm (Part 2)
In the previous article “Mobile Analytics with Storm“, we discussed how to configure the logging library for mobile apps to send stacktrace messages to Storm via REST API. To make this logging library more usable and robust, mobile app developers are now able to send invaluable stacktrace messages via TCP (through the Network Inputs option). The configuration steps are incredibly simple and are summarized using the diagram shown below:
- Click “Network data” to enable Storm to receive data via TCP
- Click “Authorize your IP address” so that Storm receives data only from authorized IP address(es). Please take note of the “IP/Port combination” in “Send data to” – we
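Once Storm shows you that IP/Port combination, a logging library can push a stacktrace message with a plain TCP socket write. A minimal sketch, assuming a newline-delimited event format; the host and port are placeholders for the values Storm displays in “Send data to”:

```java
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class StormTcpSender {

    // Placeholder values: use the "IP/Port combination" shown in
    // Storm's "Send data to" panel for your project.
    private final String host;
    private final int port;

    public StormTcpSender(String host, int port) {
        this.host = host;
        this.port = port;
    }

    /** Opens a TCP connection and writes one newline-terminated event. */
    public void send(String event) throws Exception {
        try (Socket socket = new Socket(host, port);
             Writer out = new OutputStreamWriter(
                     socket.getOutputStream(), StandardCharsets.UTF_8)) {
            out.write(event);
            out.write("\n");
            out.flush();
        }
    }
}
```

A real mobile logging library would batch events and reconnect on failure; this just shows the wire-level idea.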
Modular Inputs Tools
And so it is with software. Languages, libraries, frameworks are just tools that make it easier for us to accomplish some task.
With the release of Splunk 5 came a great new feature called Modular Inputs.
Modular Inputs extend the Splunk framework to define a custom input capability. In many respects you can think of them as your old friend the “scripted input”, but elevated to first-class citizen status in the Splunk Manager. Splunk treats your custom …
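To make that first-class status concrete: a modular input declares itself to Splunk by printing an XML scheme when invoked with the `--scheme` argument, and Splunk uses that scheme to render a configuration page in the Manager. A minimal sketch; the input title and the `port` argument are illustrative placeholders, not from any real app:

```java
public class MyModularInput {

    /**
     * Builds the XML scheme Splunk reads when the script is invoked
     * with --scheme. The title and "port" argument are illustrative.
     */
    public static String buildScheme() {
        return "<scheme>\n"
             + "  <title>My Protocol Input</title>\n"
             + "  <description>Receives events over a custom protocol</description>\n"
             + "  <streaming_mode>xml</streaming_mode>\n"
             + "  <endpoint>\n"
             + "    <args>\n"
             + "      <arg name=\"port\">\n"
             + "        <title>Port</title>\n"
             + "        <description>TCP port to listen on</description>\n"
             + "      </arg>\n"
             + "    </args>\n"
             + "  </endpoint>\n"
             + "</scheme>";
    }

    public static void main(String[] args) {
        if (args.length > 0 && "--scheme".equals(args[0])) {
            // Splunk introspects the input by running it with --scheme.
            System.out.println(buildScheme());
        }
        // A real input would otherwise read its runtime configuration
        // from stdin and start streaming events back to Splunk.
    }
}
```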
Splunk components for Apache Camel
The developer feedback was great, and no feedback is better than when an audience member gets inspired to go and create a new set of Splunk components for another enterprise Java framework, in this case Apache Camel.
Similar to Spring Integration, Apache Camel is an open-source integration framework based on Enterprise Integration Patterns. The programming semantics with which a developer builds an integration solution differ between the two frameworks, and for this reason a developer may prefer one framework over the other, but the high-level approach is the same: a development framework that …
Mobile Analytics with Splunk
Spring Integration Splunk Adaptors Webinar
With the introduction of our various programming language SDKs (Java, JavaScript, Python, PHP, Ruby) for the Splunk REST API, we have significantly lowered the barrier to entry for developers wanting to build big data apps and integrations on top of the Splunk platform. Developers can now choose their preferred development language and, right out of the blocks, focus on coding their core business logic without having to worry about the lower-level semantics of REST; the SDKs make this easy.
And that is, after all, why we build tools and frameworks in the first place: to make it simpler for you to perform some task and get to that point of productivity faster.
Building upon this ideal, if an SDK makes …
Mobile Analytics with Storm
Splunk, Java and “The Internet of Things”
Spending some time in Asia this week has only further reminded me of how many machine data generating devices permeate our modern lives. Mobile devices, CCTV cameras, car computers, hotel smart TVs, traffic controllers, payment terminals, public transport tap-and-go passes, Wi-Fi access points, automated laser and light shows, etc. All connected systems that generate massive amounts of data. And this is just what you can see on the surface. There are a myriad of embedded systems and controllers running quietly under the covers, churning away, powering the very lifeblood of the city.
A recent article on theserverside.com had some interesting quotes that got my data-wrangling adrenaline flowing:
“….there are over three billion embedded devices out there powered by Java, and
SpringOne – Building Big Data Apps with Splunk and Java
In mid-October I will be attending the SpringOne event in Washington DC to talk about Splunk, building Big Data applications on top of the Splunk platform, and showcasing our developer platform offerings, with a particular emphasis on the Java SDK.
This is a Java technology oriented event with a focus on leading Java/JVM technologies, frameworks, languages, and applications such as SpringSource, Groovy and Tomcat. And to be frank, it is THE event to be at if you have an interest in these areas. On one hand I’ll be there representing Splunk, but I’ll also be there as a fan of many of the speakers.
Splunk is a Gold sponsor of this event, and in addition …
I’m losing my memory
Your phone rings at 3am. A frantic Level 1 support staffer is panicking. The web storefront has gone down. You dig around and quickly find out that the ESB has crashed and the warm standby instance failed to kick in. The logs reveal that the JVM terminated because it ran out of heap memory. How could this have happened? We tested it, right?
The dreaded java.lang.OutOfMemoryError is by far the most common recurring problem I’ve seen in JVM-based applications throughout my career.
So let’s just go over a few things you failed to do.
- You failed to use “Splunk for JMX” to monitor your JVM heap and proactively alert you when a heap usage threshold was breached.
- You failed
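For context on the first point: the heap figures that JMX exposes, and that a monitoring tool such as “Splunk for JMX” can poll and alert on, are also available in-process via the platform MemoryMXBean. A minimal sketch; the 80% threshold is an arbitrary example, not a recommendation:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapWatcher {

    /** Returns current heap usage as a fraction of the heap ceiling. */
    public static double heapUsedRatio() {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();
        long max = heap.getMax();
        // getMax() can be -1 if no limit is defined; fall back to committed.
        long ceiling = (max > 0) ? max : heap.getCommitted();
        return (double) heap.getUsed() / ceiling;
    }

    public static void main(String[] args) {
        double ratio = heapUsedRatio();
        // Arbitrary example threshold; in practice the alert should come
        // from your monitoring tool, not from the application itself.
        if (ratio > 0.80) {
            System.err.printf("WARN: heap usage at %.0f%%%n", ratio * 100);
        } else {
            System.out.printf("Heap usage: %.0f%%%n", ratio * 100);
        }
    }
}
```

This is the same `java.lang:type=Memory` data a remote JMX poller reads; sampling it on a schedule and alerting on a threshold is the proactive monitoring the post is arguing for.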