High Performance syslogging for Splunk using syslog-ng – Part 2

As I mentioned in part one of this blog, I managed a sizable deployment of Splunk/syslog servers (2.5 TB/day). I had 8 syslog-ng engines in 3 geographically separate data centers: Hong Kong, London, and St. Louis. Each group of syslog-ng servers was load balanced with F5, and each group sent traffic to its own regional indexers. Some of the syslog servers processed upwards of 40,000 EPS (burst traffic). The recommendations I am about to describe here are what worked for me; your mileage may vary, of course. I tried optimizing the syslog-ng engines to get as much performance as possible out of them. If you feel, however, that it is overkill, or if you don’t have the manpower to …

» Continue reading
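
As a rough illustration of the kind of setup described in this excerpt, below is a minimal syslog-ng configuration sketch for a high-volume collector that writes per-host files for Splunk to pick up. The port, paths, buffer sizes, and tuning values are assumptions for illustration only, not the tuning from this deployment; the full post covers the actual recommendations.

# Minimal sketch of a high-volume syslog-ng collector feeding Splunk.
# All ports, paths, and tuning values are illustrative assumptions,
# not the configuration from the deployment described in the post.
options {
    flush-lines(1000);        # batch writes to disk
    log-fifo-size(200000);    # larger output queue to absorb bursts
    time-reopen(10);
    keep-hostname(yes);
};

source s_network {
    udp(ip(0.0.0.0) port(514) so-rcvbuf(16777216));    # large kernel receive buffer
    tcp(ip(0.0.0.0) port(514) max-connections(1000));
};

destination d_splunk_files {
    # One directory per sending host; a Splunk forwarder monitors this tree.
    file("/data/syslog/${HOST}/${YEAR}-${MONTH}-${DAY}.log" create-dirs(yes));
};

log { source(s_network); destination(d_splunk_files); };

A Splunk Universal Forwarder (or indexer) would then monitor /data/syslog with a file input, which keeps network listening and indexing decoupled.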

High Performance syslogging for Splunk using syslog-ng – Part 1

Today I am going to discuss a subject that I consider extremely critical to any successful Splunk deployment: what is the best method of capturing syslog events into Splunk? As you probably already know, there is no lack of articles on the topic of syslog on the Internet, which is fantastic because it enriches the knowledge of our community. This blog is broken into two parts. In part one, I will cover three scenarios of implementing syslog with Splunk. In part two, I will share my own experience running a large Splunk/syslog environment and what you can do to increase performance and ease management.

When given the choice between using a syslog agent (e.g., http://sflanders.net/2013/10/25/syslog-agents-windows/ ) or a UF (Universal …

» Continue reading
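
The excerpt does not spell out the three scenarios, but to give a flavor of the trade-offs involved, here is a sketch of two common inputs.conf approaches: Splunk listening on a syslog port directly versus a forwarder monitoring files written by a dedicated syslog server. The stanzas, paths, and sourcetype values are assumptions for illustration, not taken from the post.

# Illustrative inputs.conf sketch (assumed paths and sourcetypes).

# Scenario: Splunk (indexer or heavy forwarder) listens on the network directly.
[udp://514]
sourcetype = syslog
connection_host = ip

# Scenario: a syslog server (e.g. syslog-ng) writes per-host files,
# and a Universal Forwarder monitors the directory tree.
[monitor:///data/syslog/*/*.log]
sourcetype = syslog
host_segment = 3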

Remote Image Retrieval With Splunk Using Custom Command “getimage.py”

Every once in a while my customers ask for functionality that is not natively supported by Splunk. Out of the box, Splunk is a very capable platform; however, there are certain tasks Splunk is not designed for. But that never stops a Splunker from finding a solution! The use case I am about to discuss in this blog is an example of that. The customer owns a large chain of pharmacies across the country, and the bulk of the stores’ transactions end up in a Hadoop data lake. The customer wants to use Hunk/Splunk to visualize and analyze the massive amount of information collected, which is something Hunk can do easily. The challenge came about when I was asked if Splunk could show …

» Continue reading
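
The excerpt stops before the command itself, but as a rough idea of the shape such a command can take, here is a minimal hypothetical sketch of a generating search command that fetches an image from a URL and returns it as a base64-encoded field. The class name, option, and field names are my own assumptions, and it uses the splunklib searchcommands framework; it is not the actual getimage.py from the post.

#!/usr/bin/env python
# Hypothetical sketch of a "getimage"-style custom search command.
# Option and field names are assumptions for illustration;
# this is not the getimage.py discussed in the post.
import base64
import sys
import time
import urllib2  # Python 2, as bundled with Splunk at the time

from splunklib.searchcommands import dispatch, GeneratingCommand, Configuration, Option


@Configuration()
class GetImageCommand(GeneratingCommand):
    """Fetch a remote image and emit it as a base64-encoded event field."""

    url = Option(require=True)  # e.g. | getimage url="http://example.com/store123.png"

    def generate(self):
        try:
            data = urllib2.urlopen(self.url, timeout=10).read()
            yield {'_time': time.time(), 'url': self.url, 'image_b64': base64.b64encode(data)}
        except Exception as e:
            yield {'_time': time.time(), 'url': self.url, 'error': str(e)}


if __name__ == '__main__':
    dispatch(GetImageCommand, sys.argv, sys.stdin, sys.stdout, __name__)

The base64 field could then be rendered in a dashboard, for example by embedding it as a data URI in an HTML panel or a custom visualization.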