HTTP Event Collector: a Python Class

(Hi all – welcome to the first of what will be a series of technical blog posts from members of the SplunkTrust, our Community MVP program. We’re very proud to have such a fantastic group of community MVPs, and are excited to see what you’ll do with what you learn from them over the coming months and years.
–rachel perkins, Sr. Director, Splunk Community)


Happy Holidays everyone!

I am George Starcher, one of the members of the SplunkTrust.

I tend to write new code this time of year, so I decided to build a Python class after a lovely Thanksgiving with the family.
There is a lot of great content on the HTTP Event Collector, thanks to Glenn Block and his development team. However, I found there isn’t a Python logger already built for it.

That motivated me to write a Python class you can leverage in your own Python code. It can be downloaded from my git repository.

I encourage you to also vote up a Python Logger from the Splunk development team over at the SplunkDev User Voice page. That will get a fully supported logger made.

About the Class

The class allows you to declare an http_event_collector object. The default behavior is to use SSL and port 8088; you can override these if you need to for your environment. You can then submit events individually using the sendEvent method, which immediately sends the single-event JSON payload to the collector. The more efficient way is to use the batchEvent method to accumulate events and finish your code with a flushBatch method call. The class will also auto-flush for you if your accumulated events come close to the default maximum payload size the HTTP Event Collector accepts.
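The batch/auto-flush idea can be pictured with a small sketch. This is a hypothetical stand-in, not the actual class: the real flushBatch POSTs the buffered events to the collector, and the threshold value here is illustrative.

```python
import json

class BatchBufferSketch:
    """Illustrative sketch of batchEvent/flushBatch auto-flush logic.
    Hypothetical stand-in for the real class, which POSTs the buffer
    to the HTTP Event Collector instead of counting flushes locally."""

    def __init__(self, max_bytes=100000):
        self.max_bytes = max_bytes   # flush threshold (illustrative value)
        self.buffer = ""             # accumulated event JSON
        self.flush_count = 0         # stands in for HTTP POSTs

    def batch_event(self, payload):
        event = json.dumps(payload)
        # auto-flush when the next event would push us past the limit
        if len(self.buffer) + len(event) > self.max_bytes:
            self.flush_batch()
        self.buffer += event

    def flush_batch(self):
        if self.buffer:
            self.flush_count += 1    # the real class sends the POST here
            self.buffer = ""
```

The point of the accumulate-then-flush pattern is fewer HTTP round trips: many events travel in one POST, and the size check keeps any single POST under the collector's limit.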

Using the Class

Let’s collect some crime reports. Check out the sample usage of our new class below; it is also in the git repo. I hope you enjoy!

  1. We import the new class.
  2. We generate the key/value pairs in a JSON payload with our commitCrime function.
  3. We create an http_event_collector object called testevent, providing the collector token and hostname.
  4. Note that by default we don’t have to specify that the HTTP Event Collector uses SSL on port 8088. The class will let you override the defaults.
  5. Next we make the base JSON payload for our sample Python code. This is where we add the normal event metadata fields using the update method.
  6. From there we make 5 individual events and immediately send them using the sendEvent method. You can see how we even add a few extra fields to the JSON coming from our commitCrime function.
  7. We also demonstrate batch event submission by calling batchEvent followed by a flushBatch before we are done.
from splunk_http_event_collector import http_event_collector
import random
import json

def commitCrime():
    # list of sample values
    suspects = ['Miss Scarlett','Professor Plum','Miss Peacock','Mr. Green','Colonel Mustard','Mrs. White']
    weapons = ['candlestick','knife','lead pipe','revolver','rope','wrench']
    rooms = ['kitchen','ballroom','conservatory','dining room','cellar','billiard room','library','lounge','hall','study']

    killer = random.choice(suspects)
    weapon = random.choice(weapons)
    location = random.choice(rooms)
    victims = [suspect for suspect in suspects if suspect != killer]
    victim = random.choice(victims)

    return {"killer":killer, "weapon":weapon, "location":location, "victim":victim}

# Create event collector object, default SSL and HTTP Event Collector Port
http_event_collector_key = "B02336E2-EEC2-48FF-9FA8-267B553A0C6B"
http_event_collector_host = "localhost"

testevent = http_event_collector(http_event_collector_key, http_event_collector_host)

# Start event payload and add the metadata information (values are illustrative)
payload = {}
payload.update({"index":"main"})
payload.update({"sourcetype":"crime"})
payload.update({"source":"witness"})
payload.update({"host":"localhost"})

# Report 5 crimes individually
for i in range(5):
    event = commitCrime()
    event.update({"reported":"true"})  # illustrative extra field added to the event JSON
    payload.update({"event":event})
    testevent.sendEvent(payload)

# Report 50 crimes in a batch
for i in range(50):
    event = commitCrime()
    payload.update({"event":event})
    testevent.batchEvent(payload)
testevent.flushBatch()
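Under the hood, each send is an HTTP(S) POST to the collector’s /services/collector endpoint with a "Splunk &lt;token&gt;" Authorization header; those are the documented HTTP Event Collector conventions. Here is a minimal sketch of building such a request with the standard library (the function name and defaults are my own, not part of the class):

```python
import json
from urllib.request import Request

def build_hec_request(token, host, payload, port=8088, use_ssl=True):
    # Build (but do not send) the POST request a sender would issue.
    scheme = "https" if use_ssl else "http"
    url = "%s://%s:%d/services/collector" % (scheme, host, port)
    headers = {"Authorization": "Splunk " + token}
    return Request(url, data=json.dumps(payload).encode("utf-8"), headers=headers)

req = build_hec_request("B02336E2-EEC2-48FF-9FA8-267B553A0C6B", "localhost",
                        {"host": "localhost", "event": {"weapon": "rope"}})
```

Passing the request to urllib.request.urlopen would perform the actual POST; the class wraps this plumbing so you never have to touch it.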
