The Splunk App for Stream – Tracking Open Ports for Security and Compliance – Part 2

In Part 1 of this post we used the Splunk App for Stream to find open ports on your networked systems.  (Hint: Follow the ACK packets.)  This post looks at how to keep track of those open ports and how to detect when a NEW port starts listening.

 

Of course, Splunk is an extensible tool that gives you the ability to solve problems like this in a number of different ways.  The method I’ve chosen for this case is the Splunk Key Value Store, a new feature in Splunk 6.2 that lets you read and write data within a Splunk app, allowing you to maintain state in that application.  Think of storing user metadata, or caching the results of a Splunk search or of a query against an external data store.  In this case, think of maintaining a list of host IP addresses, the open ports on each IP, and when each port was last seen.  These three data points alone are enough for us to discover a NEW listening port.

 

The Splunk Key Value Store, or KV Store, works like a CSV lookup, but it does so much more.  You can do Create-Read-Update-Delete (CRUD) operations on individual records within a collection, you can define field acceleration to improve search performance, and you have the option of enforcing data types when writing data.  Plus, the KV Store is built to perform well when handling larger data sets with frequent lookups.
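
Type enforcement and field acceleration both live in collections.conf.  As a point of reference, here’s a minimal sketch of what that could look like for the collection we’re about to create; these settings are optional, I’m not using them in the bare-bones config below, and the ip_port_accel name is just something I made up:

[port_lookup]
enforceTypes = true
field.dest_ip = string
field.dest_port = number
accelerated_fields.ip_port_accel = {"dest_ip": 1, "dest_port": 1}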

 

KV Store data is managed per app within Splunk.  I recommend creating a new bare bones app just for testing things out.
  1. On the home page of Splunk Web on your Search Head, click the gear icon next to Apps.
  2. Click Create App.
  3. On the Add new page, fill out the properties of the new app:
    1. For Name, enter “Port Status”.
    2. For Folder name, enter “port_status”.
    3. For Template, select “barebones”.
  4. Click Save.
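
If you’d rather skip the UI, you can lay down roughly the same skeleton on disk by creating the app directory and an app.conf by hand and then restarting Splunk.  The label and description below are just placeholders:

$SPLUNK_HOME/etc/apps/port_status/default/app.conf
[ui]
is_visible = true
label = Port Status

[launcher]
description = Track open ports for security and compliance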

 

Now create a couple of config files for the KV Store for this app.

 

$SPLUNK_HOME/etc/apps/port_status/local/collections.conf
[port_lookup]

 

$SPLUNK_HOME/etc/apps/port_status/local/transforms.conf
[kvstore_lookup]
external_type = kvstore
collection = port_lookup
fields_list = _key, dest_ip, dest_port, _time

 

Now restart Splunk so the new collection and lookup definition take effect.

You can manage the data in your KV Store with three search commands:

     inputlookup – reads records from a KV Store collection into the search pipeline
     outputlookup – writes search results from the search pipeline into a specific KV Store collection
     lookup – matches event data from earlier in the search pipeline against records in a KV Store collection (see the example below)
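
For example, here’s a quick sketch of using the lookup command to enrich live Stream traffic with the last-seen time stored in the collection.  The last_seen field name is my own; everything else comes from the config above:

sourcetype=stream:tcp dest_ip=10.0.0.0/8 ack_packets_in!=0
| lookup kvstore_lookup dest_ip, dest_port OUTPUT _time AS last_seen
| table dest_ip dest_port last_seen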

 

To put data into the KV Store, prepare it as a table and then pipe it to the outputlookup command.  It’s worth pointing out that the list of ports you check against doesn’t have to be limited to the open ports detected by a port scanner as outlined above.  You could just as easily populate the KV Store with a table of approved ports for each IP address according to your network policy.  In fact, why not create two lookups, one to maintain port status and one to maintain a list of whitelisted or otherwise acceptable ports?  Follow these same procedures to determine whether a newly detected listening port is on your list of acceptable services.
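
If you go the two-lookup route, the second collection is defined the same way as the first; just add another stanza to each of the config files above and restart.  The names here (approved_ports, approved_ports_lookup, approved_ports.csv) are placeholders I’m using for illustration:

$SPLUNK_HOME/etc/apps/port_status/local/collections.conf
[approved_ports]

$SPLUNK_HOME/etc/apps/port_status/local/transforms.conf
[approved_ports_lookup]
external_type = kvstore
collection = approved_ports
fields_list = _key, dest_ip, dest_port

Then seed it from a CSV lookup file of policy-approved ports (or enter the rows by hand):

| inputlookup approved_ports.csv
| outputlookup approved_ports_lookup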

 

Here’s how to populate the KV Store with the scanner data.  Basically, run a search to find open ports and pipe it out to a table:

 

sourcetype=stream:tcp src_ip=10.100.0.11 dest_ip=10.0.0.0/8 ack_packets_in!=0
| transaction dest_ip dest_port mvlist=t
| stats last(timestamp) as time_last_seen by dest_port, dest_ip
| eval _time=strptime(time_last_seen, "%Y-%m-%dT%H:%M:%S.%fZ")
| table dest_ip dest_port _time

 

[Screenshot: the resulting table of dest_ip, dest_port, and _time]

 

If you’re satisfied with the way things look, go ahead and append the following to that search to pipe the data into the KV Store:

 

| outputlookup kvstore_lookup
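
Keep in mind that outputlookup without append=true replaces whatever is already in the collection, which is what we want for this initial baseline.  If you later want to add records without wiping the existing ones (as the workflow action at the end of this post does), append instead:

| outputlookup kvstore_lookup append=true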

 

Want to take a look at the data in the KV Store?  Use the inputlookup command:
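
| inputlookup kvstore_lookup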

 

[Screenshot: results of the inputlookup search]

 

Once the KV Store is populated with your baseline data, either from an acceptable use policy or from port scanner output, you can start running searches to compare current state versus desired or previous state.

 

Using the same search technique as above, we can look for ACK packets being sent to IP addresses and find dest_ip/dest_port combinations that have NOT been seen before.  The following search does that, and it also creates a new field named “starttime” whose value gets passed to the workflow action.

 

sourcetype=stream:tcp src_ip=10.100.0.11 dest_ip=10.0.0.0/8 ack_packets_in!=0
| convert mktime(_time) as starttime
| transaction dest_ip dest_port mvlist=t
| search NOT [ | inputlookup kvstore_lookup earliest=0]
| table dest_ip dest_port _time

 

Save this search as a dashboard panel and call it something like “New Listening Ports”.
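
If you prefer keeping things in config files, the panel can be backed by a saved report, which is just a stanza in savedsearches.conf.  Here’s a minimal sketch using the search above; naming the stanza after the panel is my choice, not a requirement:

$SPLUNK_HOME/etc/apps/port_status/local/savedsearches.conf
[New Listening Ports]
search = sourcetype=stream:tcp src_ip=10.100.0.11 dest_ip=10.0.0.0/8 ack_packets_in!=0 | convert mktime(_time) as starttime | transaction dest_ip dest_port mvlist=t | search NOT [ | inputlookup kvstore_lookup earliest=0] | table dest_ip dest_port _time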

 

[Screenshot: the New Listening Ports dashboard panel]

 

Now, if you drill down on this row of data, you’ll be brought to the event itself.  I’ve created a workflow action so that I can add this to my list of known/acceptable ports for this host with a few mouse clicks.

 

[Screenshot: the workflow action in the event menu]

 

Wanna be cool too?  Use these files:

 

$SPLUNK_HOME/etc/apps/port_scanner/local/eventtypes.conf
[scanner_traffic]
search = sourcetype=stream:tcp src_ip=10.100.0.11 dest_ip=10.0.0.0/8 ack_packets_in!=0

 

$SPLUNK_HOME/etc/apps/port_scanner/local/workflow_actions.conf
[add_ip_port_to_kvstore]
display_location = both
eventtypes = scanner_traffic
fields = *
label = Add $dest_ip$:$dest_port$ to Known/Acceptable List
search.app = port_status
search.preserve_timerange = 0
search.search_string = sourcetype=stream:tcp src_ip=10.100.0.11 dest_ip=$dest_ip$ dest_port=$dest_port$ ack_packets_in!=0 now=$starttime$ earliest=-1m latest=+1m | table dest_ip dest_port _time | outputlookup kvstore_lookup append=true
search.target = blank
type = search

 

This was an example of a single use case for Stream.  For more inspiration, check out the Stream Examples app at https://apps.splunk.com/app/1840/. And if you STILL don’t have any ideas about how to use Stream, take a look at the Stream app that I’ve built at  https://github.com/berthayes/stream_app. This app includes the Port Status searches detailed in this blog post, as well as a bunch of other searches and dashboards that I find useful.

 

This blog post started out with a single, focused use case: tracking open ports to comply with NERC-CIP guidelines.  Although it’s possible to solve this problem in many different ways, my solution used the Splunk App for Stream.  Just like Splunk itself, Stream is an elegant set of tools for taking raw data and doing anything you want with it; in this case, the raw data comes straight off of your network.  Once you’re handy with the tools, the only limit is your imagination.