From API to easy street within minutes
30? 20? …15? It all depends on how well you know your third-party API. The point is that polling data from third-party APIs is easier than ever. CIM mapping is now a fun experience.
Want to find out more about what I mean? Read the rest of this blog and explore what’s new in Add-on Builder 2.1.0.
REST Connect… and with checkpointing
Interestingly, this blog happens to address a problem I faced back on my very first project at Splunk. When I first started at Splunk as a Sales Engineer, I worked on building a prototype of the ServiceNow Add-on. Writing Python, scripted inputs vs. modular inputs, conf files, setup.xml, packaging, best practices, password encryption, proxy handling and even checkpointing… the list goes …
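Checkpointing is the part of a polling input that remembers how far you got, so restarts don't re-ingest old events. A minimal, framework-free sketch of the idea (the function and store names here are hypothetical, not Add-on Builder's actual API):

```python
# Illustrative checkpointing for a polling input: persist the timestamp of the
# last event seen, and on each run keep only newer events. Hypothetical names,
# not the code Add-on Builder generates.

def poll_since(fetch, checkpoint_store, key="last_ts"):
    """Fetch events newer than the saved checkpoint, then advance it."""
    last_ts = checkpoint_store.get(key, 0)
    events = [e for e in fetch() if e["ts"] > last_ts]
    if events:
        checkpoint_store[key] = max(e["ts"] for e in events)
    return events

# Simulated third-party API returning timestamped events.
api = lambda: [{"ts": 1, "msg": "a"}, {"ts": 2, "msg": "b"}, {"ts": 3, "msg": "c"}]

store = {}
first = poll_since(api, store)   # all three events, checkpoint advances to 3
second = poll_since(api, store)  # second poll sees nothing new
```

In a real add-on the checkpoint store would be Splunk's checkpoint directory or KV store rather than an in-memory dict, but the logic is the same.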
Using machine learning for anomaly detection research
Over the last few years I have had many discussions about anomaly detection in Splunk. So it was really great to hear about a thesis dedicated to this topic, and I think it's worth sharing with the wider community. Thanks in advance to its author, Niklas Netz!
Obviously anomaly detection is an important topic across all of Splunk's core use case areas, but each one has different requirements and data, so unfortunately there is not always an easy button. In IT Operations you want to detect system outages before they actually occur and proactively keep your dependent services up and running to meet your business needs. In Security you want to detect anomalous behavior of entities as potential indicators of breaches …
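To make the "no easy button" point concrete: the simplest statistical baseline flags points far from the mean, similar in spirit to Splunk's `outlier` and `anomalydetection` commands, but the right threshold is entirely use-case dependent. A minimal sketch:

```python
# Standard-deviation-based outlier detection: flag values more than
# `threshold` standard deviations from the mean. The threshold (and whether
# this approach fits at all) depends on the use case and the data.
import statistics

def zscore_anomalies(values, threshold=2.5):
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [v for v in values if abs(v - mean) / stdev > threshold]

cpu = [42, 40, 41, 43, 39, 41, 40, 42, 95]  # one obvious spike
print(zscore_anomalies(cpu))  # → [95]
```

Seasonal IT metrics or rare security events usually need more than a global z-score, which is exactly why dedicated research like this thesis is valuable.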
Getting Cloud Native with Splunk and Cloud Foundry
Enterprises are moving to microservices architectures, continuous delivery practices, and embracing DevOps culture. This is the foundation of a modern, “cloud-native” business. At Pivotal, we help companies make this transformation with our Pivotal Cloud Foundry product.
Our customers want to extend the utility of Splunk to include their new cloud-native apps running on Cloud Foundry. To this end, we’ve been working up an integration between these two products. This post reviews our progress so far, and concludes with an invite to our private beta program.
What is Pivotal Cloud Foundry?
Pivotal Cloud Foundry is a platform, based on open source software, for deploying and operating applications. …
Visual link analysis with Splunk and Gephi
As cyber-security risks and attacks have surged in recent years, identity fraud has become all too familiar to the common, unsuspecting user. You might wonder, “why don’t we have the capabilities to eliminate these incidents of fraud completely?” The reality is that fraud is difficult to characterize: identifying whether fraudulent behavior occurred at all often requires extensive contextual information about what was happening before, during, and after the event of concern. Cyber-security analysts therefore require a host of tools to monitor and investigate fraudulent behavior; tools capable of dealing with large amounts of disparate data sets. It would be great for these security analysts to have a platform to be able to …
Kaufland DevSummit2016 – Splunk for DevOps – Faster Insights, better code
The first DevSummit event was recently hosted by Kaufland with 200 people attending for the day to hear presentations about the “World of API”, discuss the latest best practice developments and build ideas in a hackathon. One highlight was the keynote from Markus Andrezak on how technology, business and innovation play together.
Of course, a team of Splunkers (big thanks to my colleagues Mark and Henning) wouldn’t miss such an event and got involved with a booth as well as a presentation. It was amazing to have so many fruitful discussions about how to make data more easily accessible and usable for business, development and operations teams. In the morning Joern Wanke from the Kaufland Omnichannel team presented on how …
Easily Create Mod Inputs Using Splunk Add-on Builder 2.0 – Part IV
Add-on Builder 2.0 provides capabilities to build modular inputs without writing any code. In this post however, we focus on using an advanced feature of Splunk’s Add-on Builder 2.0 to write custom python while taking advantage of its powerful helper functions.
There is a veritable cornucopia of useful resources for building modular inputs at docs.splunk.com, dev.splunk.com, blogs.splunk.com, and more. This post certainly isn’t meant to replace those. No no, this post will simply walk you through leveraging Splunk Add-on Builder 2.0 to create custom code to query an API.
In this post we will create a …
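Stripped of the framework scaffolding, the custom-code path boils down to: read the input arguments, call the API, and write each returned record as an event. A framework-free sketch of that loop (in a generated add-on, Add-on Builder's helper functions wrap the argument handling, logging, and event writing):

```python
# The core loop of a custom modular input: query an API and emit one event
# per record. The fetch function and event-writer here are stand-ins for
# the real API client and Splunk's event writer.
import json

def collect_events(fetch, write_event, sourcetype="myapi:events"):
    """Query the API and emit one event per returned record."""
    for record in fetch():
        write_event({"sourcetype": sourcetype, "data": json.dumps(record)})

# Stub API and in-memory event writer for illustration.
fake_api = lambda: [{"id": 1, "status": "ok"}, {"id": 2, "status": "error"}]
out = []
collect_events(fake_api, out.append)
```

The value of Add-on Builder 2.0 is that everything around this loop — argument validation, credential storage, logging, packaging — is generated for you.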
Docker 1.13 with improved Splunk Logging Driver
The evolution of Splunk and Docker continues! In the early days (2015) of Splunk and Docker we recommended using the native syslog logging driver in Docker Engine. In February of 2016, Docker 1.10 came out and we contributed the first version of the Splunk Logging Driver in Docker 1.10. Since that first release we have seen huge adoption. After reviewing feedback and thinking about what is needed for Splunk environments with Docker, we’ve added a bunch of new features!
- Skip verification for HTTP Event Collector endpoint availability
- Support for raw and JSON formats
- Performance improvements
- Retry logic
- Gzip compression
- Unit test code coverage
When I wrote this blog post, Docker 1.13 was still in Release Candidate stage. If …
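The new capabilities surface as `--log-opt` flags on the Splunk logging driver. A sketch of what that looks like (the URL and token are placeholders for your own HTTP Event Collector endpoint and token):

```
docker run --log-driver=splunk \
    --log-opt splunk-url=https://splunk.example.com:8088 \
    --log-opt splunk-token=00000000-0000-0000-0000-000000000000 \
    --log-opt splunk-format=json \
    --log-opt splunk-gzip=true \
    --log-opt splunk-verify-connection=false \
    your-image
```

Here `splunk-format` selects the raw/JSON/inline formats, `splunk-gzip` enables compression, and `splunk-verify-connection=false` skips the endpoint availability check at container start.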
Announcing new AWS Lambda Blueprints for Splunk
Splunk and Amazon Web Services (AWS) are continuously collaborating to drive customer success by leveraging both the agility of AWS, and the visibility provided by Splunk. To support that goal, we’re happy to announce new AWS Lambda blueprints to easily stream valuable logs, events and alerts from over 15 AWS services into Splunk to help customers gain critical security and operational insights.
With a point-and-click setup, you can use these blueprints to have Splunk ingest data from AWS services such as Kinesis Stream, CloudWatch Logs, DynamoDB Stream and IoT for further data processing & analytics in addition to logging AWS Lambda itself for instrumentation & troubleshooting.
Once the Lambda blueprint is configured, events are automatically forwarded in near real-time by Lambda to Splunk …
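Under the hood, forwarding an event to Splunk means wrapping the payload in the HTTP Event Collector (HEC) JSON envelope and POSTing it to the collector endpoint. A minimal sketch of that step (the field values are illustrative; the blueprints handle batching, retries, and configuration for you):

```python
# Wrap a log payload in the HEC JSON envelope, as a Lambda function would
# before POSTing it to Splunk's /services/collector/event endpoint.
import json

def to_hec_event(payload, source="lambda", sourcetype="aws:cloudwatchlogs"):
    """Build the HEC envelope for one event."""
    return json.dumps({"event": payload, "source": source, "sourcetype": sourcetype})

body = to_hec_event({"logGroup": "/aws/lambda/demo", "message": "task timed out"})
# POST `body` to https://<splunk-host>:8088/services/collector/event with the
# header "Authorization: Splunk <HEC token>" (network call omitted here).
```

This is the same HEC envelope the Docker logging driver and other HEC clients use, which is what makes these integrations composable.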
Splunk Challenge 2016 – Catch ’em all at Nanyang Polytechnic!
Splunk Challenge 2016, the annual Splunk challenge that many NYP students have been waiting for, is here! Today, the students will be pitting the analytics skills they have learned using Splunk against each other as they compete for a chance to take home some great prizes.
Unlike past years, where the students were tasked to look into business and IT operations data, this year the idea of analyzing “Pokemon” data was suggested by the lecturer for the challenge. As the market leader in the data analytics space, it is not only important but also central to our core values to keep what we are doing fun and innovative, so that we will not only be able to …
Personal Dev/Test Licenses give you the freedom to explore
Do you have a new use case to validate? Untapped data sources to investigate? Wouldn’t it be great to explore how Splunk might help other parts of your organization? All without impacting your production systems and license usage…
At .conf2016 in September, CEO Doug Merritt was clear that we want to make it easier for you to use Splunk across your business. Enforced metering is gone. And exploring new use cases should be hassle-free.
So now any employee of a Splunk Enterprise or Splunk Cloud customer can get a free, personalized Splunk Enterprise Dev/Test software license. Each license is valid for up to 50 GB of daily data ingestion over a six-month renewable term, giving you ample power and time to …