Smart AnSwerS #78
Hey there community and welcome to the 78th installment of Smart AnSwerS.
Things have been ramping up around Splunk HQ with conf2016 just around the corner! The Splunk education team is starting off strong with Splunk University beginning tomorrow and running through Monday, while the rest of the conference staff are working hard to put the final touches in place to ensure a smooth and awesome experience for all attendees. I’m looking forward to running into familiar faces and meeting new ones! I’ll be hanging out at the Splunk Answers booth at least half of the time during the conference, so if you happen to be exploring the source=*Pavilion, feel free to stop by and say hello. Safe travels …
Splunk Docs: let us make an example of you
The Splunk doc team wants to improve our search command examples, and we need your help. Share your expertise! The best examples will be added to the Splunk documentation. If you submit a winning example, you will earn undying fame because we will credit you right in the docs.
Here are the search commands that would benefit from better, real-world examples.
- abstract – Has only one basic example now.
- addinfo – Has only one basic example now.
- collect – This advanced command needs a great example.
- delete – Are there other use case examples for this command besides what is there now?
- foreach – Users find this complicated and hard to use, but this …
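To give a sense of the kind of real-world example the doc team is after, here is a hedged sketch of what a `collect` submission might look like (the index and source names are illustrative, not from the docs): summarizing daily license usage into a summary index so later reports don’t have to rescan _internal.

```spl
index=_internal source=*license_usage.log type=Usage
| stats sum(b) as bytes by st
| eval GB=round(bytes/1024/1024/1024, 3)
| collect index=summary source=license_summary
```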
Moving from LDAP to SAML authentication
An often asked question when configuring SAML is: how do you ensure users can still access the knowledge objects and saved searches they created before migrating to SAML? Do you need a script that migrates the users’ knowledge objects? As is often the case, the answer isn’t simple: it depends on the authentication mechanism used prior to SAML.
When moving from LDAP to SAML, if the same LDAP server is configured as the backend authentication database on the Identity Provider (ADFS, Okta, Ping, …), then the users and the groups they belong to remain the same.
In that case, moving from LDAP to SAML while retaining previously created knowledge objects is straightforward and can be achieved …
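As a hedged sketch of where that mapping lives (the stanza name, URLs, and group names below are illustrative, not from the original post), a SAML setup in authentication.conf maps IdP groups to the same Splunk roles the LDAP groups had, so users keep role-based access to their knowledge objects:

```ini
# authentication.conf -- illustrative sketch only
[authentication]
authType = SAML
authSettings = saml_stanza

[saml_stanza]
entityId = splunkACS
idpSSOUrl = https://idp.example.com/sso
# certificate and signing settings omitted

# Map IdP groups to the roles previously granted via LDAP,
# so users keep access to their existing knowledge objects
[roleMap_SAML]
admin = splunk_admins
user = splunk_users
```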
Managing your Ingestion with the search bar
Many of our cloud customers have asked me how to better manage their data, e.g. determine volume by sourcetype, or volume by forwarder. This is typically available via the Distributed Management Console, but in some cases, a person’s role prevents them from getting full access to it. In the article below, I will guide you through several searches aimed at letting anyone dive a bit deeper into their Splunk Cloud service.
Below are a few searches I find helpful:
Total Ingestion Volume over time
index=_internal source="/opt/splunk/var/log/splunk/license_usage.log" type="RolloverSummary" | eval GB=b/1024/1024/1024 | timechart span=1d sum(GB) as GB
Be sure to double check your time range selector here; I usually search over the past 7 days. If you want to look …
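To break ingestion down by sourcetype, one of the questions mentioned above, a similar search against the per-event license usage entries can help. This is a common pattern rather than a search from the original article, so treat it as a sketch (`st` is the sourcetype field in type=Usage events):

```spl
index=_internal source=*license_usage.log type="Usage"
| stats sum(b) as bytes by st
| eval GB=round(bytes/1024/1024/1024, 2)
| fields st GB
| sort - GB
```

Swapping `by st` for `by h` gives the equivalent breakdown by host, which is a quick proxy for volume by forwarder.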
Configuring PingIdentity PingFederate (Ping) Security Assertion Markup Language (SAML) Single Sign On (SSO) with Splunk Cloud
There are now a few blog postings on SAML configurations for Splunk> Cloud: for Okta, Azure, and ADFS. Ping is similar in complexity to the Identity Provider (IdP) ADFS, and can be a bit tricky depending on your implementation. The intent of this guide is to help you along your way to integrating Splunk> Cloud with PingFederate.
My role is Cloud Services Advisory Engineer on the Customer Adoption and Success Team (CAST) at Splunk>. My focus is assisting our customers with their experience of our Splunk> Cloud service. With the 6.4.x version of Splunk> Cloud, which this posting covers, the SAML configuration definitely works quite well, but it is not the most user friendly …
Dell EMC Splunking It Up at #splunkconf16
The following is a guest blog post from Cory Minton, Principal Systems Engineer, Dell EMC…
Grab your hoodies, your witty black t-shirts, and maybe your capes…it’s time for another exciting Splunk .conf2016, the annual Splunk User Conference taking place at the Walt Disney Swan and Dolphin Resort September 26-29, 2016. All of us at EMC are excited to be sponsoring .conf for the third year in a row, and this year our presence will be bigger and better than ever before. Dell EMC will host two technical sessions this year; we’ll have more than 20 Dell EMC Splunk Ninjas running around learning, a large booth in the partner pavilion demonstrating our technology solutions, and we are pleased to have been …
Detect IoT anomalies and geospatial patterns for logistics insights
In part 1 of this blog series we spoke about how to turn sensor data into logistics insights. In this part we outline one approach for anomaly detection and enrich our sensor data with location information to discover geospatial patterns.
Anomalies? Find them with a few lines of SPL.
Anomaly detection can be tricky, and implementations vary from simple thresholding and baselining to highly sophisticated approaches based on machine learning. In this example we leveraged the Splunk Machine Learning Toolkit to detect numeric outliers, using a sliding window to check the time series against multiples of the standard deviation.
Here is what the SPL looks like:
| timechart span=1s avg(ax) as avx avg(ay) as
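The full search is truncated above; as a hedged sketch of the sliding-window standard-deviation approach it describes (the window size, the 3-sigma multiplier, and the `ax` field name are assumptions for illustration), the outlier check itself might look like:

```spl
... | streamstats window=100 avg(ax) as mean_ax stdev(ax) as stdev_ax
| eval lowerBound=mean_ax-3*stdev_ax, upperBound=mean_ax+3*stdev_ax
| eval isOutlier=if(ax<lowerBound OR ax>upperBound, 1, 0)
| where isOutlier=1
```

`streamstats` computes a rolling mean and standard deviation over the last 100 events, and any point outside three standard deviations of that rolling baseline is flagged.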
Smart AnSwerS #77
Hey there community and welcome to the 77th installment of Smart AnSwerS.
Applications for the 2016 – 2017 SplunkTrust cohort were submitted a month ago, and the current membership reviewed and ranked all of them individually within the past several weeks. The rankings have been gathered to finalize who will be a SplunkTrustee and inducted at .conf2016. The Splunk community has greatly benefited from the contributions of all the applicants through various means, and we can’t thank them enough for sharing their Splunk clue with other users to learn and grow. Best of luck to everyone!
Check out this week’s featured Splunk Answers posts:
Ever wonder which dashboards are being used and which users are using them?…
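As a hedged sketch of one common approach to that question (not necessarily the accepted answer in the linked post), dashboard views can be pulled from the UI access logs in _internal:

```spl
index=_internal sourcetype=splunkd_ui_access
| rex field=uri "/app/(?<app>[^/]+)/(?<dashboard>[^/?\s]+)"
| search dashboard=* dashboard!=search
| stats count by user app dashboard
| sort - count
```

The `rex` extracts the app and view name from the requested URI, and the `stats` then counts views per user and dashboard.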
I can’t make my time range picker pick my time field.
When you are working with Hadoop using Hunk, or with Splunk where the time field you want is not _time, you may want a dashboard time picker to operate on some other time field. The same problem arises when the current _time field is not the time field you want to use for the current search.
Here is a solution you might use to make time selections work in every case including in panels.
| inputlookup SampleData.csv | eval _time=strptime(claim_filing_date,"%Y-%m-%d") | sort - _time | addinfo | where _time>=info_min_time AND (_time<=info_max_time OR info_max_time="+Infinity")
Let’s break this search down into its parts.
| inputlookup SampleData.csv
This is an example of …
You Bet Your Sweet SaaS, AWS will be at #splunkconf16!
The following is a guest blog post by David Potes, AWS Solutions Architect:
The end of September is one of my favorite times of the year, and not just because it’s finally summer in San Francisco. It’s the time we attend Splunk .conf to talk about all of the things you can do with Amazon Web Services (AWS) and Splunk.
Here are a few .conf sessions highlighting the strong partnership between AWS and Splunk:
- If you’re looking to learn how Adobe built a security monitoring system across hundreds of accounts, there’s a session for you.
- Be sure to also check out how Experian migrated and monitored their 3-tier web application on AWS as well as how a university research department