Operational Intelligence – Manufactured in Germany | SplunkLive! Germany events 2016

Hello,

Spring has only just begun and yet we have already finished our SplunkLive! 2016 tour across Germany. We began in Munich, with further stops in Frankfurt and Hamburg. It was a fantastic tour and we achieved our goal of bringing Splunk Ninjas together to learn from one another how to achieve operational intelligence. We heard how a botnet uses hacked e-mail mailboxes for malicious activities, saw how Splunk sends out Excel sheets to individuals across a business, and learned what can be done with 10 billion events and machine learning for business application monitoring.

Let’s get started on the highlights with Datev, the fourth-largest German software company and Computerwoche’s second-best telecommunications company to work for, which spoke about improving its service quality and optimizing the handling of incidents. Andreas Jahnke, Monitoring Manager, explained how Datev uses the value of machine data in real time. The company processes the payroll of 11 million individuals each month, and over 164,000 organisations in Germany use its finance software.

As you can imagine, in such a sensitive field it is key that they continuously improve their service, ensuring resilience against factors such as performance and security issues. They also need to optimize their troubleshooting times, and by gaining end-to-end visibility into their services they have reduced both their mean time to repair (MTTR) and mean time to identify (MTTI).

UniCredit’s Splunk deployment and usage spans multiple regions and departments. Markus Sprunck, Senior IT Architect at UniCredit Business Integrated Solutions in Munich, introduced the audience to how the application landscape of payments in a bank works. Considering the complexity of this environment, it is all the more impressive how he built a simplified view of the most important indicators for business process monitoring based on log data. UniCredit has also developed a number of its own insights, with high-level overviews and custom drill-downs to the raw log event level.
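To illustrate the general idea of deriving business process indicators from raw log events, here is a minimal Python sketch. The log format, step names and error-rate metric are invented for illustration; UniCredit's actual monitoring is built in Splunk on its own log data:

```python
from collections import Counter

# Hypothetical payment-processing log lines: "timestamp step status"
LOG_LINES = [
    "2016-03-01T10:00:01 validate OK",
    "2016-03-01T10:00:02 route OK",
    "2016-03-01T10:00:03 settle FAIL",
    "2016-03-01T10:00:04 validate OK",
    "2016-03-01T10:00:05 settle OK",
]

def process_indicators(lines):
    """Aggregate raw log events into per-step error rates -- the kind of
    high-level indicator a business process dashboard would surface."""
    totals, failures = Counter(), Counter()
    for line in lines:
        _, step, status = line.split()
        totals[step] += 1
        if status != "OK":
            failures[step] += 1
    return {step: failures[step] / totals[step] for step in totals}

print(process_indicators(LOG_LINES))
```

From such per-step indicators, a dashboard can then drill down to the underlying raw events when a rate looks wrong, much as the talk described.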

Volker Kassen from Helvetia Insurance reported on its use cases, highlighting a clear mission statement: “We need customer applications that are reliable and available.” This means Helvetia needs to monitor customer applications, receive alerts if specific conditions are met, support troubleshooting and provide regular reports for capacity management across multiple departments and countries. As a result, Helvetia has broken down many silos – Volker reported that in the past there were individual tools that provided value on their own but could never deliver end-to-end visibility. Data was also being collected multiple times in multiple tools, and whenever something happened, root cause analysis was a painstaking process. He gave insights into how Helvetia uses machine data for business analytics, with Splunk regularly providing insight to the sales team. As the team preferred nicely formatted Microsoft Excel files, Volker created his own Splunk technology add-on that lets him send these out directly from Splunk. He gave the work back to the Splunk community so everyone can benefit.
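Volker's add-on produces native Excel files; as a rough, stdlib-only illustration of the underlying idea – rendering search results as a spreadsheet attachment on an outgoing e-mail – here is a hypothetical Python sketch. It uses CSV (which Excel opens) instead of a real .xlsx file, the field names are invented, and the message is built but never sent:

```python
import csv
import io
from email.message import EmailMessage

def results_to_csv(results):
    """Render Splunk-style search results (a list of dicts) as CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(results[0]))
    writer.writeheader()
    writer.writerows(results)
    return buf.getvalue()

def build_report_mail(results, sender, recipient):
    """Attach the rendered report to an e-mail message (not sent here;
    a real alert action would hand this to an SMTP server)."""
    msg = EmailMessage()
    msg["From"], msg["To"] = sender, recipient
    msg["Subject"] = "Weekly sales report"
    msg.set_content("Attached: the latest search results.")
    msg.add_attachment(results_to_csv(results).encode("utf-8"),
                       maintype="text", subtype="csv",
                       filename="report.csv")
    return msg
```

A scheduled job could run this after each saved search, which is roughly the workflow the add-on automates inside Splunk.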

Andre Pietsch and Niklas Netz from Otto IT presented on how they carry out anomaly detection with a machine learning algorithm across their end-to-end business item monitoring. Otto has more than 54,000 employees and a strong international presence, with many brands successful in multichannel e-commerce, finance and services. They have over 10 billion events and are processing them with different solutions, such as Splunk IT Service Intelligence, the Machine Learning Toolkit app and TensorFlow. Niklas will also write his bachelor’s thesis in cooperation with Splunk, Otto and LC Systems at the Hamburg University of Applied Sciences.
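A minimal version of anomaly detection on event counts can be sketched with a trailing-window z-score: flag any point that deviates too many standard deviations from the recent past. This is an illustrative toy, not Otto's actual algorithm (they use the Splunk tooling and TensorFlow mentioned above), and the window and threshold values are arbitrary:

```python
from statistics import mean, stdev

def zscore_anomalies(counts, window=24, threshold=3.0):
    """Return indices of points deviating more than `threshold` standard
    deviations from the mean of the trailing `window` points."""
    anomalies = []
    for i in range(window, len(counts)):
        history = counts[i - window:i]
        mu, sigma = mean(history), stdev(history)
        # Skip flat history (sigma == 0) to avoid division by zero.
        if sigma and abs(counts[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies
```

On, say, hourly event counts for one business item, a sudden spike or drop would show up as an index in the returned list and could trigger an alert.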

We also got super excited by the presentation from Fabian Bock, CEO at Mail.De – a very popular e-mail service with free and premium e-mail hosting made in Germany. He explained how critical collecting and analyzing machine data is to their business, the challenges unstructured data presents and why traditional approaches did not work for them. Mail.De uses the machine data in Splunk in various ways: it starts with developers proactively looking for errors and the support desk finding “lost” e-mails, and extends all the way to company management measuring the business success of projects such as marketing initiatives. He also gave us insights into the activity pattern a mailbox hijacked by a popular botnet exhibits. Mail.De also monitors the adoption and operation of IPv6 to ensure everything is working fine. The monitoring capability spans their full offering – FAX (it’s still used!) activity and response times, SMS messages for account activations, load balancers, PayPal payments for premium services and more. Overall, he told the audience that by analyzing machine data with Splunk, Mail.De has optimized its services, with all departments now benefiting from speaking a common language and sharing the same goal.
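One simple fingerprint of a hijacked mailbox is a sudden fan-out: a single account mailing an unusually large number of distinct recipients. The following hypothetical sketch flags such accounts from an outbound send log; the log shape, addresses and threshold are invented for illustration and are not Mail.De's actual detection logic:

```python
from collections import defaultdict

def suspicious_accounts(send_log, max_recipients=100):
    """Flag accounts that mail more than `max_recipients` distinct
    addresses in the observed period -- one typical sign of a
    botnet-hijacked mailbox sending spam."""
    recipients = defaultdict(set)
    for sender, rcpt in send_log:
        recipients[sender].add(rcpt)
    return {s for s, r in recipients.items() if len(r) > max_recipients}
```

A real deployment would compare against each account's own baseline rather than a fixed threshold, but the principle – spotting a behavioural outlier in the send pattern – is the same.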

Keen to share your Splunk story and the challenges you have solved? Join our annual user conference, .conf2016, this September at the Walt Disney World Swan and Dolphin Resort in Florida. The call for papers is now open!

Happy Splunking,
Matthias