Compute@Edge log streaming: Microsoft Azure Blob Storage

Fastly's Real-Time Log Streaming feature for Compute@Edge services can send log files to Microsoft Azure Blob Storage (Blob Storage). Blob Storage is a static file storage service used to store arbitrarily large amounts of unstructured data and serve them to users over HTTP and HTTPS.

Prerequisites

Before adding Blob Storage as a logging endpoint for Fastly Compute@Edge services, create an Azure storage account in the Azure portal. For help creating the account, see Microsoft's account creation documentation.

We recommend creating a Shared Access Signature (SAS) token specifically for Fastly. For more information, see Microsoft's shared access signatures (SAS) documentation, paying particular attention to the Account SAS URI examples.

Here is an example of a SAS token that provides write permissions to a blob:

sv=2018-04-05&ss=b&st=2018-04-29T22%3A18%3A26Z&sr=b&se=2020-04-30T02%3A23%3A26Z&sp=w&sig=Z%2FRHIX5Xcg0Mq2rqI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D

The following table breaks down each part of the example token and explains how it contributes to the SAS:

Element | Example                     | Description
sv      | sv=2018-04-05               | The storage services version.
ss      | ss=b                        | The signed services field. This is required and must be b for Blob Storage.
st      | st=2018-04-29T22%3A18%3A26Z | The start time of the token, specified in UTC.
sr      | sr=b                        | The storage resources the token can access. We require blob (b).
se      | se=2020-04-30T02%3A23%3A26Z | The expiry time of the token, specified in UTC. Be sure to update your token before it expires or the logging functionality will stop working.
sp      | sp=w                        | The permissions granted by the SAS token. We require write (w).
sig     | sig=Z%2FRHIX5Xcg0Mq2...     | The signature that authorizes access to the blob.
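For completeness, the sig value is an HMAC-SHA256 over a newline-delimited "string to sign", keyed with your storage account key and then Base64-encoded. Below is a minimal Rust sketch, assuming the account SAS string-to-sign layout from Microsoft's documentation for pre-2020-12-06 service versions and the hmac, sha2, and base64 crates; in practice you would normally generate the token in the Azure portal or with the Azure CLI rather than compute it yourself.

use base64::{engine::general_purpose::STANDARD, Engine as _};
use hmac::{Hmac, Mac};
use sha2::Sha256;

/// Compute the `sig` for an account SAS. `account_key` is the Base64-encoded
/// storage account key from the Azure portal; the other arguments are the raw
/// (not URL-encoded) values of the corresponding SAS query parameters.
fn account_sas_sig(
    account_name: &str,
    account_key: &str,
    permissions: &str,    // sp, e.g. "w"
    services: &str,       // ss, e.g. "b"
    resource_types: &str, // srt, e.g. "co"
    start: &str,          // st, e.g. "2018-04-29T22:18:26Z"
    expiry: &str,         // se, e.g. "2020-04-30T02:23:26Z"
    version: &str,        // sv, e.g. "2018-04-05"
) -> String {
    // Account SAS string-to-sign for pre-2020-12-06 service versions; the two
    // empty fields are the optional signed IP range and protocol.
    let string_to_sign = format!(
        "{account_name}\n{permissions}\n{services}\n{resource_types}\n{start}\n{expiry}\n\n\n{version}\n"
    );
    let key = STANDARD.decode(account_key).expect("account key must be Base64");
    let mut mac =
        Hmac::<Sha256>::new_from_slice(&key).expect("HMAC-SHA256 accepts any key length");
    mac.update(string_to_sign.as_bytes());
    // The Base64 result still needs URL encoding (e.g. / becomes %2F) before
    // it is placed in the sig query parameter, as in the example token above.
    STANDARD.encode(mac.finalize().into_bytes())
}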

Adding Blob Storage as a logging endpoint

After you've created an Azure storage account and a SAS token, follow these instructions to add Blob Storage as a logging endpoint:

  1. Review the information in our Setting Up Remote Log Streaming guide.
  2. Click the Azure Blob Storage Create endpoint button. The Create a Microsoft Azure Blob Storage endpoint page appears.
  3. Fill out the Create a Microsoft Azure Blob Storage endpoint fields as follows:
    • In the Name field, enter the name you specified in your Compute@Edge code. For example, in our Rust code example, the name is my_endpoint_name (see the sketch after these steps).
    • In the Storage account name field, enter the unique Azure namespace in which your data objects will be stored.
    • In the Container field, enter the name of the Blob Storage container to store logs in. See Microsoft's Blob storage page for more information.
    • In the SAS token field, enter the token associated with the container.
    • In the Maximum bytes field, optionally enter the maximum file size in bytes.
    • In the Period field, optionally enter an interval (in seconds) to control how frequently your log files are rotated. This value defaults to 3600 seconds.
    • In the Timestamp format field, optionally enter a timestamp format for log files. The default is an strftime-compatible string. Our guide on changing where log files are written provides more information.
  4. Click the Advanced options link of the Create a Microsoft Azure Blob Storage endpoint page and decide which of the optional fields to change, if any.
  5. Fill out the Advanced options of the Create a Microsoft Azure Blob Storage endpoint page as follows:
    • In the Path field, optionally enter the path within the container in which to store the files. The path must end with a trailing slash. If this field is left empty, the files will be saved in the container's root path. Our guide on changing where log files are written provides more information.
    • In the PGP public key field, optionally enter a PGP public key that Fastly will use to encrypt your log files before writing them to disk. You will only be able to read the contents by decrypting them with your private key. The PGP key should be in PEM (Privacy-Enhanced Mail) format. See our guide on log encryption for more information.
    • In the Select a log line format area, select the log line format for your log messages. Our guide on changing log line formats provides more information.
    • In the Compression field, optionally select the compression format you want applied to the log files. Our guide on changing log compression options provides more information.
  6. Click the Create button to create the new logging endpoint.
  7. Click the Activate button to deploy your configuration changes.
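For reference, here is a minimal Rust sketch of the Compute@Edge side, assuming the fastly crate and an endpoint named my_endpoint_name to match the Name field in step 3:

use fastly::log::Endpoint;
use fastly::{Error, Request, Response};
use std::io::Write;

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    // The string passed to from_name() must match the Name field configured
    // on the Blob Storage logging endpoint.
    let mut endpoint = Endpoint::from_name("my_endpoint_name");
    writeln!(endpoint, "served {} to a client", req.get_path())?;
    Ok(Response::new())
}

Anything written to the endpoint is buffered and delivered to your container according to the Period, Path, and Timestamp format settings above.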

Ingesting data for Azure Data Explorer

Azure Data Explorer is a data exploration service for log and telemetry data. To ingest your data correctly, Data Explorer requires your logs to be formatted as comma-separated values (CSV). When creating your logging endpoint:

  • Set the Log format to a CSV string (e.g., %H,%{time.start.sec}V,%{regsub(req.http.User-Agent, {"""}, {""""})}V).
  • Select Blank in the Select a log line format area of the Advanced options.

Our guide on changing log line formats provides more information.
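If you build the log line in your Compute@Edge code instead, the same rules apply: fields are comma-separated and any double quotes inside a field are doubled, which is what the regsub call above does. Here is a minimal Rust sketch, again assuming an endpoint named my_endpoint_name:

use fastly::log::Endpoint;
use fastly::{Error, Request, Response};
use std::io::Write;
use std::time::{SystemTime, UNIX_EPOCH};

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    let mut endpoint = Endpoint::from_name("my_endpoint_name");
    // Escape embedded double quotes by doubling them, per CSV quoting rules.
    let user_agent = req
        .get_header_str("User-Agent")
        .unwrap_or("")
        .replace('"', "\"\"");
    let protocol = format!("{:?}", req.get_version()); // e.g. "HTTP/1.1"
    let start = SystemTime::now().duration_since(UNIX_EPOCH)?.as_secs();
    // Fields: protocol, request start time in epoch seconds, quoted User-Agent.
    writeln!(endpoint, "{},{},\"{}\"", protocol, start, user_agent)?;
    Ok(Response::new())
}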
