Compute@Edge log streaming: Amazon Kinesis Data Streams
Last updated 2022-05-10
Fastly's Real-Time Log Streaming feature for Compute@Edge services can send log files to Amazon Kinesis Data Streams. Amazon Kinesis Data Streams (KDS) is a real-time data streaming service that can continuously capture data from a variety of sources.
NOTE
Fastly does not provide direct support for third-party services. Read Fastly's Terms of Service for more information.
How Amazon Kinesis Data Streams works with Fastly log streaming
Amazon KDS ingests data records into a stream. Each stream comprises one or more shards. A shard represents a fixed amount of processing capacity, and the total processing capacity of a stream is determined by its number of shards. The number of shards can be increased or decreased over the lifetime of a stream. This matters because the Fastly Kinesis logging endpoint monitors the number of shards and attempts to distribute log data records uniformly across the available shards. When the number of shards for a stream changes, the Fastly Kinesis logging endpoint automatically adjusts in response. The goal is to make the best use of the stream's throughput capacity while minimizing the configuration overhead required of our customers.
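To make that behavior concrete, the sketch below is a simplified, hypothetical model of the adjustment described above (round-robin assignment over the currently known shard count). It is an illustration only, not Fastly's actual implementation and not the Kinesis API.

```rust
// Simplified model of the behavior described above: spread records evenly
// across the shards currently known for the stream, and adjust when the
// stream is resharded. Illustrative only; not Fastly's implementation.
struct StreamModel {
    shard_count: usize, // refreshed periodically, e.g. after a ListShards call
    next: usize,
}

impl StreamModel {
    fn new(shard_count: usize) -> Self {
        Self { shard_count: shard_count.max(1), next: 0 }
    }

    // Round-robin: each new record goes to the next shard in turn.
    fn assign_shard(&mut self) -> usize {
        let shard = self.next % self.shard_count;
        self.next = self.next.wrapping_add(1);
        shard
    }

    // Called when the stream's shard count changes.
    fn update_shard_count(&mut self, new_count: usize) {
        self.shard_count = new_count.max(1);
    }
}

fn main() {
    let mut stream = StreamModel::new(2);
    let first: Vec<usize> = (0..6).map(|_| stream.assign_shard()).collect();
    println!("{:?}", first); // [0, 1, 0, 1, 0, 1]

    stream.update_shard_count(3); // the stream was scaled up
    let second: Vec<usize> = (0..6).map(|_| stream.assign_shard()).collect();
    println!("{:?}", second); // [0, 1, 2, 0, 1, 2]
}
```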
If the log volume exceeds the throughput capacity of the stream, Amazon KDS returns errors to our system indicating that the stream is being throttled, which may prevent some logs from being delivered. AWS CloudWatch provides a Kinesis Data Streams metric, WriteProvisionedThroughputExceeded, that can be used to monitor for throttling so that stream capacity can be adjusted as necessary.
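As a rough sizing guide, the sketch below estimates how many shards a given log volume needs, assuming the standard provisioned-mode write limits of 1 MiB/s and 1,000 records/s per shard. Check the Kinesis Developer Guide for the current limits before relying on these numbers.

```rust
// Back-of-the-envelope shard sizing for a provisioned-mode stream, assuming
// per-shard write limits of 1 MiB/s and 1,000 records/s.
fn shards_needed(records_per_sec: f64, avg_record_kib: f64) -> u64 {
    let by_bandwidth = (records_per_sec * avg_record_kib / 1024.0).ceil() as u64;
    let by_record_count = (records_per_sec / 1000.0).ceil() as u64;
    by_bandwidth.max(by_record_count).max(1)
}

fn main() {
    // Example: 5,000 log lines/s averaging 0.5 KiB each needs ~2.44 MiB/s
    // (3 shards by bandwidth) but 5 shards by record count, so 5 shards.
    println!("{}", shards_needed(5_000.0, 0.5)); // prints 5
}
```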
TIP
For more information about working with Amazon KDS and understanding the capacity limits, refer to the Kinesis Developer Guide.
Prerequisites
Before adding Amazon KDS as a logging endpoint for Fastly Compute@Edge services, we recommend creating Identity and Access Management (IAM) credentials in your AWS account specifically for Fastly. The recommended way to do this is to create an AWS IAM role, which lets you grant Fastly temporary credentials. For more information, see Creating an AWS IAM Role for Fastly Logging. Alternatively, create an IAM user and grant it the kinesis:PutRecords and kinesis:ListShards permissions for the logging stream. For more information, see Amazon's guidance on understanding and getting your AWS credentials.
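As a hypothetical starting point, an IAM policy attached to that role or user might look like the following. The region, account ID, and stream name shown here are placeholders; scope the Resource to the ARN of your own logging stream.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kinesis:PutRecords",
        "kinesis:ListShards"
      ],
      "Resource": "arn:aws:kinesis:us-east-1:111122223333:stream/my-fastly-log-stream"
    }
  ]
}
```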
Adding Amazon Kinesis as a logging endpoint
After you've registered for an AWS account and created your IAM role or IAM user, follow these instructions to add Amazon KDS as a logging endpoint:
- Review the information in our guide to setting up remote log streaming for Compute@Edge. Additionally, our developer documentation provides more information about logging with Compute@Edge code written in Rust, AssemblyScript, and JavaScript.
- Click the Amazon Kinesis Data Streams Create endpoint button. The Create an Amazon Kinesis Data Streams endpoint page appears.
- Fill out the Create an Amazon Kinesis Data Streams endpoint fields as follows:
- In the Name field, enter the endpoint name you specified in your Compute@Edge code. For example, in our Rust code example, the name is my_endpoint_name (a minimal sketch of this code appears after these steps).
- In the Access method field, select either User Credentials or IAM Role.
- If you select User Credentials, enter the access key and secret key associated with the IAM user you created in your AWS account specifically for Fastly. See Amazon's documentation on security credentials for more information.
NOTE
Password management software may mistakenly treat the Secret Key field as a password field because of the way your web browser works. As a result, that software may try to auto-fill the field with your Fastly account password. If this happens, the AWS integration with Fastly services won't work and you will need to enter the secret key manually instead.
- If you select IAM Role, enter the Amazon Resource Name (ARN) for the IAM role granting Fastly access to KDS. For more information, see Creating an AWS IAM Role for Fastly Logging.
- In the Stream name field, enter the name of the Kinesis stream to which log data will be sent.
- From the Region menu, select the region to stream logs to. This must match the region where you created your Kinesis stream.
- Click the Create button to create the new logging endpoint.
- Click the Activate button to deploy your configuration changes.
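For reference, here is a minimal sketch of the kind of Compute@Edge Rust code the Name field refers to, assuming the fastly crate's log::Endpoint API. The log line format and request fields are placeholders; the name passed to Endpoint::from_name must match the endpoint name you configure in the Name field. See the developer documentation linked above for authoritative examples.

```rust
use fastly::http::StatusCode;
use fastly::log::Endpoint;
use fastly::{Error, Request, Response};
use std::io::Write;

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    // Open the logging endpoint configured in the Fastly control panel.
    // The name must match the Name field above ("my_endpoint_name").
    let mut endpoint = Endpoint::from_name("my_endpoint_name");

    // Write one log record per request; the format here is just an example.
    writeln!(
        endpoint,
        "path={} client_ip={:?}",
        req.get_path(),
        req.get_client_ip_addr()
    )?;

    Ok(Response::from_status(StatusCode::OK))
}
```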