Announcing new AWS Lambda Blueprints for Splunk
Splunk and Amazon Web Services (AWS) continuously collaborate to drive customer success by combining the agility of AWS with the visibility provided by Splunk. To support that goal, we’re happy to announce new AWS Lambda blueprints that make it easy to stream valuable logs, events and alerts from over 15 AWS services into Splunk, helping customers gain critical security and operational insights.
With a point-and-click setup, you can use these blueprints to have Splunk ingest data from AWS services such as Kinesis Streams, CloudWatch Logs, DynamoDB Streams and AWS IoT for further data processing and analytics, in addition to logging AWS Lambda itself for instrumentation and troubleshooting.
Once a Lambda blueprint is configured, Lambda automatically forwards events in near real-time to the Splunk HTTP Event Collector without your having to manage a single intermediary server, queue or storage layer. This offers you an easy, fast and cost-effective data ingestion mechanism from AWS services into Splunk Enterprise or Splunk Cloud.
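To make the delivery mechanism concrete, here is a minimal sketch in Python of how a function might assemble a request for the HTTP Event Collector’s JSON event endpoint. The URL and token values below are placeholders, not real credentials, and this is an illustration of the HEC request shape rather than the blueprints’ actual (Node.js) implementation:

```python
import json

# Hypothetical HEC endpoint and token -- replace with your own values.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def build_hec_request(event, source="aws:lambda", sourcetype="aws:lambda"):
    """Build the headers and JSON body for a Splunk HEC event endpoint call."""
    headers = {
        # HEC authenticates with a token-based Authorization header.
        "Authorization": "Splunk " + HEC_TOKEN,
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "event": event,
        "source": source,
        "sourcetype": sourcetype,
    })
    return headers, body

headers, body = build_hec_request({"message": "hello from Lambda"})
```

In a real function you would POST `body` with `headers` to `HEC_URL` over HTTPS; the blueprints handle batching and retries for you.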
Below is a list of the various blueprints immediately available to you. In addition to updating the original splunk-logging Lambda blueprint, released at re:Invent 2015 for the generic use case of logging events from and via AWS Lambda, we have also added new purpose-built blueprints for specific AWS services, making it a simple plug-and-play process to deliver streaming data from each of those services to Splunk.
As part of this release, we’re also leveraging the latest AWS Lambda features, such as the recently announced support for environment variables, to help you configure these blueprints with minimal code changes. Simply set your Lambda function’s environment variables to define your Splunk destination, specify the function trigger(s) or the AWS event source, and watch your events stream into Splunk in near real-time.
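As an illustration of the environment-variable approach, the sketch below reads the Splunk destination from the function’s environment. The variable names follow the blueprints’ convention, but verify them against the code of the blueprint you deploy:

```python
import os

# Simulate the Lambda environment for local testing; in a real function these
# are set in the Lambda console under "Environment variables".
os.environ["SPLUNK_HEC_URL"] = "https://splunk.example.com:8088/services/collector"
os.environ["SPLUNK_HEC_TOKEN"] = "my-token"

def get_splunk_config():
    """Read the Splunk destination from the function's environment variables."""
    return {
        "url": os.environ["SPLUNK_HEC_URL"],
        "token": os.environ["SPLUNK_HEC_TOKEN"],
    }

config = get_splunk_config()
```

Because the destination lives in configuration rather than code, you can point the same function at a different Splunk deployment without editing or redeploying the blueprint.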
To get started using these blueprints, visit the AWS Lambda Management Console.
To learn more, go to http://dev.splunk.com/view/event-collector/SP-CAAAE6Y.
I. Stream AWS Kinesis Stream events to Splunk using splunk-kinesis-stream-processor Lambda blueprint
The splunk-kinesis-stream-processor blueprint can be used to automatically poll an Amazon Kinesis stream, parse new records and forward them to Splunk.
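Kinesis delivers each record to Lambda with its payload base64-encoded. The parsing step the blueprint performs can be sketched as follows (the sample event below is a simulated, simplified version of what Lambda actually receives):

```python
import base64
import json

def parse_kinesis_records(event):
    """Base64-decode the payload of each Kinesis record in a Lambda event."""
    records = []
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"]).decode("utf-8")
        records.append(payload)
    return records

# A simulated Kinesis event, shaped like the one Lambda passes to the handler.
sample_event = {
    "Records": [
        {"kinesis": {"data": base64.b64encode(b'{"level":"info"}').decode("ascii")}}
    ]
}
decoded = parse_kinesis_records(sample_event)
```

Each decoded payload can then be wrapped in an HEC event and forwarded to Splunk.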
You need to grant AWS Lambda permission to poll your Amazon Kinesis stream. You grant these permissions through an IAM role (execution role) that AWS Lambda assumes to poll the stream and execute the Lambda function on your behalf, and you specify this role when you create your Lambda function. To simplify the process, you can use the predefined AWSLambdaKinesisExecutionRole policy when creating that IAM role for the Lambda function.
To learn more about how Amazon Kinesis integrates with Lambda, see Using AWS Lambda with Amazon Kinesis.
II. Stream AWS CloudWatch Logs to Splunk using splunk-cloudwatch-logs-processor Lambda blueprint
The splunk-cloudwatch-logs-processor blueprint can be used to receive a real-time feed of log events from CloudWatch Logs and forward them to Splunk. The Lambda blueprint takes care of decompressing and decoding the data before sending it to Splunk.
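The decompress-and-decode step can be sketched in Python: CloudWatch Logs delivers a base64-encoded, gzip-compressed JSON payload under the `awslogs.data` key, which the function must unpack before forwarding the individual log events. The sample envelope below is simulated:

```python
import base64
import gzip
import json

def decode_cloudwatch_logs(event):
    """Base64-decode and gunzip the CloudWatch Logs payload, then parse JSON."""
    compressed = base64.b64decode(event["awslogs"]["data"])
    payload = json.loads(gzip.decompress(compressed))
    return payload["logEvents"]

# Simulate the envelope that CloudWatch Logs delivers to Lambda.
raw = json.dumps({
    "logGroup": "/aws/lambda/example",
    "logEvents": [{"id": "1", "timestamp": 0, "message": "hello"}],
}).encode("utf-8")
sample_event = {
    "awslogs": {"data": base64.b64encode(gzip.compress(raw)).decode("ascii")}
}
log_events = decode_cloudwatch_logs(sample_event)
```

The blueprint performs this unpacking for you, so each log event arrives in Splunk as a searchable, uncompressed message.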
You need to grant CloudWatch Logs permission to execute your function. When you configure the Lambda trigger in the AWS Console, Lambda adds the necessary permissions for Amazon CloudWatch Logs to invoke your function for the log group you specify.
To complete the event source mapping, you need to set up a CloudWatch Logs subscription filter, which enables the real-time feed of log events from CloudWatch Logs and delivers it to Lambda. To learn more about Amazon CloudWatch Logs subscriptions, see Real-time Processing of Log Data with Subscriptions.
III. Stream AWS DynamoDB Stream events to Splunk using splunk-dynamodb-stream-processor Lambda blueprint
The splunk-dynamodb-stream-processor blueprint is used to respond to updates made to a DynamoDB table and forward that activity to Splunk.
You need to create an Amazon DynamoDB Stream for your table. For more info, see Capturing Table Activity with DynamoDB Streams.
You need to grant AWS Lambda permission to poll your DynamoDB stream. You grant these permissions through an IAM role (execution role) that AWS Lambda assumes to poll the stream and execute the Lambda function on your behalf, and you specify this role when you create your Lambda function. To simplify the process, you can use the predefined AWSLambdaDynamoDBExecutionRole policy when creating that IAM role for the Lambda function.
To learn more about how DynamoDB integrates with Lambda, see Using AWS Lambda with Amazon DynamoDB.
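DynamoDB stream records describe table activity using DynamoDB’s typed attribute format (for example `{"S": "shipped"}` for a string), which is usually flattened to plain values before indexing in Splunk. A minimal sketch of that step, using a simulated stream record and handling only scalar string-typed attributes:

```python
def flatten_dynamodb_image(image):
    """Collapse DynamoDB's typed attribute format ({"S": "x"}, etc.) to plain
    values. This sketch keeps the raw value for any type tag; a fuller version
    would convert "N" attributes to numbers and recurse into maps and lists."""
    out = {}
    for key, typed in image.items():
        # Each attribute is a single-entry dict: {type_tag: value}.
        ((_type_tag, value),) = typed.items()
        out[key] = value
    return out

# A simulated stream record for an item update (NewImage uses typed JSON).
sample_record = {
    "eventName": "MODIFY",
    "dynamodb": {"NewImage": {"id": {"S": "42"}, "status": {"S": "shipped"}}},
}
item = flatten_dynamodb_image(sample_record["dynamodb"]["NewImage"])
```

The `eventName` field (INSERT, MODIFY or REMOVE) tells you what kind of table activity the record represents, which is useful context to forward alongside the item itself.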
IV. Stream AWS IoT events to Splunk using splunk-iot-processor Lambda blueprint
The splunk-iot-processor blueprint is used to create a Lambda function that responds to and processes MQTT messages that have triggered an AWS IoT rule. These messages are typically sent by an IoT device or an AWS IoT button.
When configuring the Lambda blueprint from the AWS Console, you can create a new IoT rule as part of setting AWS IoT as the trigger for your Lambda function; Lambda then adds the necessary permissions for AWS IoT to invoke your function. Alternatively, from the AWS IoT Console, you can create (or reuse) an IoT rule and add an action to invoke your Lambda function, as explained in Creating a Lambda Rule in the IoT Rule Tutorial.
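When an IoT rule invokes Lambda, the matched (JSON) MQTT payload arrives as the handler’s event, so no decompression or decoding is needed. The sketch below simulates that flow; the rule’s SQL statement and the `_forwarded_by` annotation are illustrative, not part of the blueprint:

```python
def handler(event, context):
    """Sketch of an IoT-triggered handler: the IoT rule delivers the matched
    MQTT payload directly as the event; annotate it before forwarding."""
    enriched = dict(event)
    enriched["_forwarded_by"] = "splunk-iot-processor-sketch"  # illustrative field
    return enriched

# Simulate an MQTT message from a device matched by a hypothetical IoT rule
# such as: SELECT * FROM 'sensors/+/telemetry'
message = {"device": "sensor-1", "temperature": 21.5}
result = handler(message, None)
```

In the real blueprint the enriched event would be sent on to the HTTP Event Collector rather than returned.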
With serverless computing growing in popularity and streaming data volumes continuing to surge, we hope you’ll find these blueprints very useful. We can’t wait to see how you’ll use them as part of your own big data ingestion pipelines. To further help you analyze and make sense of all this data from AWS services, make sure to check out the recently released Splunk App for AWS 5.0.