One source, many use cases: How to deliver value right away by addressing different IT challenges with Splunk
At a recent #Splunk4Rookies event in Paris, we invited people to think about what kind of information they could get from a single piece of raw data to address different needs.
Here at Splunk we work hard to ensure you get the maximum value from your data.
We used the Prism example from (my blog hero) Matt Davies. You can see his take on this issue here: http://blogs.splunk.com/2015/06/22/bigdatasuccess/
I would like to share with you some ideas on how to promote Splunk internally by getting lots of value from your machine data. First things first, you need to be aligned with the company strategy. So let me introduce a scenario for this first blog post, and we'll identify quick wins that address issues that could prevent a company from achieving its goals.
Let’s take a real example from one of our top customers, which offers bank transaction services to companies. Their mission statement is to deliver secure and innovative payment terminals and payment as a service. Their business strategy for the next few years is to expand globally and offer innovative payment services.
To reach this goal, they built a strategic partnership with a global cloud provider so they can accelerate their growth and expansion into new countries. They also put a lot of effort into developing new payment solutions through different channels (mobile, accessories, etc.).
Our customer identified some IT challenges they’ll need to address to be successful in these initiatives:
- Secure cloud applications and infrastructures
- Monitor payments through all the channels
- Cope with intense time pressure to deliver new features and expand
- Consolidate data from all the countries to have a global view of the business in real time
It seems that Splunk can help them address many of these IT challenges, which directly affect the core business and the strategy set by the company stakeholders.
Below is an anonymized piece of raw machine data coming from a bank transaction application log. Each entry represents a customer transaction from a point of sale:
Here we have a custom log where a bank transaction is composed of several events that we have to group by their transaction ID. Fortunately, because we are working with Splunk, we don’t have to define a schema for this exotic log structure before sending it in – thank you, “schema on the fly”!
So let’s send our logs to Splunk and see what’s happening …
As predicted, each line is an event and we have to group them by transaction ID to identify each bank transaction. Let’s do that.
But first, let’s configure the transaction ID extraction since Splunk does not yet understand this custom data …
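For readers who prefer configuration files to the field extractor UI, the extraction boils down to a named-group regex in props.conf. This is only a sketch: the sourcetype name, the field name transactionID and the key=value layout of the raw log are assumptions on my part, so adapt the regex to your actual events:

```
# props.conf – hypothetical sourcetype and log layout, adjust to your data
[bank:transactions]
EXTRACT-txnid = transactionID=(?<transactionID>\S+)
```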
A few clicks later, my transaction ID is extracted, and I can group events and start thinking about building some cool dashboards. Wow!
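Grouping the events then takes a single search command. A minimal sketch, assuming the extracted field is called transactionID and the data sits in a hypothetical index named payments:

```
index=payments sourcetype=bank:transactions
| transaction transactionID
| table _time, transactionID, duration, eventcount
```

The transaction command stitches every event sharing the same transactionID into one multi-line event, and adds duration and eventcount fields for free – handy building blocks for the dashboards to come.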
Next week, based on this kind of machine data, I will show you how quickly you can build dashboards that address a subset of those critical challenges:
- Transaction Performance Management: which KPIs would the transaction application manager want to see to measure performance?
- Business Performance: what is my current revenue, and how does it compare to yesterday’s?
- Security: what could be relevant for a security analyst?
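As a teaser, here is the kind of search each dashboard could start from. These are sketches only: field names such as amount, status and terminalID are assumptions about the log, not something from the real data.

For transaction performance, the average transaction duration over time:

```
index=payments sourcetype=bank:transactions
| transaction transactionID
| timechart avg(duration) AS avg_duration
```

For business performance, hourly revenue with yesterday overlaid for comparison:

```
index=payments sourcetype=bank:transactions status=OK
| timechart span=1h sum(amount) AS revenue
| timewrap 1d
```

And for security, terminals with an unusual number of failed transactions:

```
index=payments sourcetype=bank:transactions status=FAILED
| stats count BY terminalID
| where count > 10
```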
If you have ideas, questions or feedback, tweet me @1rom1