Log streaming: Microsoft Azure Blob Storage

Fastly's Real-Time Log Streaming feature can send log files to Microsoft Azure Blob Storage (Blob Storage). Blob Storage is an object storage service used to store arbitrarily large amounts of unstructured data and serve them to users over HTTP and HTTPS.

NOTE

Fastly does not provide direct support for third-party services. Read Fastly's Terms of Service for more information.

Prerequisites

Before adding Blob Storage as a logging endpoint for Fastly services, create an Azure storage account in the Azure portal. For help creating the account, see Microsoft's account creation documentation.

We recommend creating a Shared Access Signature (SAS) token specifically for Fastly. For more information, see Microsoft's shared access signatures (SAS) documentation, paying particular attention to the Account SAS URI examples.

Here is an example of a SAS token that provides write permissions to a blob:

sv=2018-04-05&ss=b&st=2018-04-29T22%3A18%3A26Z&sr=b&se=2020-04-30T02%3A23%3A26Z&sp=w&sig=Z%2FRHIX5Xcg0Mq2rqI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D

The following table breaks down each part of the token to show how it contributes to the SAS:

| Element | Example | Description |
|---------|---------|-------------|
| sv | sv=2018-04-05 | Storage services version. |
| ss | ss=b | The signed services field. This is required and should be b for Blob Storage. |
| st | st=2018-04-29T22%3A18%3A26Z | The start time of the token, specified in UTC. |
| sr | sr=b | The storage resources this token can access. We require blob (b). |
| se | se=2020-04-30T02%3A23%3A26Z | The expiry time of the token, specified in UTC. Be sure to update your token before it expires or the logging functionality will stop working. |
| sp | sp=w | The permissions granted by the SAS token. We require write (w). |
| sig | sig=Z%2FRHIX5Xcg0Mq2... | The signature that authorizes access to the blob. |
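
If you prefer to generate the token programmatically rather than in the Azure portal, here is a minimal sketch using the azure-storage-blob Python package. The account name and key are placeholders:

```python
# A sketch of generating a write-only account SAS token like the example above.
# Assumes the azure-storage-blob package; the account name and key are placeholders.
from datetime import datetime, timedelta
from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

sas_token = generate_account_sas(
    account_name="mystorageaccount",                 # placeholder storage account
    account_key="YOUR_ACCOUNT_KEY",                  # placeholder account key
    resource_types=ResourceTypes(object=True),       # blob (object) resources
    permission=AccountSasPermissions(write=True),    # sp=w
    start=datetime.utcnow(),                         # st=...
    expiry=datetime.utcnow() + timedelta(days=365),  # se=...
)
print(sas_token)  # a query string like the example above
```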

Adding Blob Storage as a logging endpoint

After you've registered for an Azure account and created a SAS token, follow these instructions to add Blob Storage as a logging endpoint (a programmatic alternative using the Fastly API is sketched after these steps):

  1. Review the information in our guide to setting up remote log streaming.
  2. Click the Azure Blob Storage Create endpoint button. The Create a Microsoft Azure Blob Storage endpoint page appears.

  3. Fill out the Create a Microsoft Azure Blob Storage endpoint fields as follows:

    • In the Name field, enter a human-readable name for the endpoint.
    • In the Placement area, select where the logging call should be placed in the generated VCL. Valid values are Format Version Default, waf_debug (waf_debug_log), and None. Read our guide on changing log placement for more information.
    • In the Log format field, enter a comma-separated value (CSV) string to use for log formatting. See Ingesting data for Azure Data Explorer below for more information.
    • In the Storage account name field, enter the unique Azure namespace in which your data objects will be stored.
    • In the Container field, enter the name of the Blob Storage container to store logs in. See Microsoft's Blob storage page for more information.
    • In the SAS token field, enter the token associated with the container.
    TIP

    Ensure you update your token before it expires; otherwise, the logging functionality will not work.

    • In the Maximum bytes field, optionally enter the maximum file size in bytes.
    • In the Period field, optionally enter an interval (in seconds) to control how frequently your log files are rotated. Rotation finalizes one file object and starts a new one; previously created file objects are never removed. This value defaults to 3600 seconds.
    • In the Timestamp format field, optionally enter a timestamp format for log files. The default is a strftime-compatible string. Our guide on changing where log files are written provides more information.
  4. Click the Advanced options link of the Create a Microsoft Azure Blob Storage endpoint page and decide which of the optional fields to change, if any.

  5. Fill out the Advanced options of the Create a Microsoft Azure Blob Storage endpoint page as follows:

    • In the Path field, optionally enter the path within the container to store the files. The path must end with a trailing slash. If this field is left empty, the files will be saved in the container's root path. Our guide on changing where log files are written provides more information.
    • In the PGP public key field, optionally enter a PGP public key that Fastly will use to encrypt your log files before writing them to disk. You will only be able to read the contents by decrypting them with your private key. The PGP key should be in PEM (Privacy-Enhanced Mail) format. Read our guide on log encryption for more information.
    • In the Select a log line format area, select the log line format for your log messages. Our guide on changing log line formats provides more information.
    • In the Compression field, optionally select the compression format you want applied to the log files. Our guide on changing log compression options provides more information.
  6. Click the Create button to create the new logging endpoint.
  7. Click the Activate button to deploy your configuration changes.
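
If you manage your configuration programmatically, a rough equivalent using the Fastly API's Azure Blob Storage logging endpoint might look like the sketch below. The token, service ID, version, and all field values are placeholders; consult the Fastly API reference for the full set of fields:

```python
# A sketch of creating the same logging endpoint via the Fastly API.
# All identifiers and values below are placeholders.
import requests

FASTLY_TOKEN = "YOUR_FASTLY_API_TOKEN"  # placeholder API token
SERVICE_ID = "YOUR_SERVICE_ID"          # placeholder service ID
VERSION = 1                             # an editable (draft) service version

resp = requests.post(
    f"https://api.fastly.com/service/{SERVICE_ID}/version/{VERSION}/logging/azureblob",
    headers={"Fastly-Key": FASTLY_TOKEN, "Accept": "application/json"},
    data={
        "name": "azure-blob-logs",              # endpoint name
        "account_name": "mystorageaccount",     # Storage account name
        "container": "fastly-logs",             # Blob Storage container
        "sas_token": "sv=2018-04-05&ss=b&...",  # your SAS token (elided here)
        "period": 3600,                         # rotate log files hourly (default)
    },
)
resp.raise_for_status()
print(resp.json())
```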
NOTE

Although Fastly continuously streams logs into Azure Blob Storage, the storage portal and API do not make files available for access until after their upload is complete.
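
Once uploads complete, you can confirm that log files are arriving with a short listing script. This sketch also uses the azure-storage-blob package; the account URL, container name, and token are placeholders, and note that listing requires a SAS with list (l) permission, not just write:

```python
# List finished log files in the container, using a SAS token as the credential.
from azure.storage.blob import ContainerClient

container = ContainerClient(
    account_url="https://mystorageaccount.blob.core.windows.net",  # placeholder
    container_name="fastly-logs",                                  # placeholder
    credential="sv=2018-04-05&ss=b&...",   # a SAS token with list permission
)
for blob in container.list_blobs():
    print(blob.name, blob.size)
```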

Ingesting data for Azure Data Explorer

Azure Data Explorer is a data exploration service for log and telemetry data. To ingest your data correctly, Data Explorer requires your logs to be formatted as comma-separated values (CSVs). When creating your logging endpoint:

  • Set the Log format to a CSV string (e.g., %H,%{time.start.sec}V,%{regsub(req.http.User-Agent, {"""}, {""""})}V).
  • Select Blank when you Select a log line format in the Advanced options.

Our guide on changing log line formats provides more information.
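
The regsub in the format above doubles any double quotes inside the User-Agent header, which is how CSV escapes quotes within a quoted field. As a quick illustration (assuming the User-Agent value is also wrapped in double quotes in your format string), standard CSV parsing recovers the original value:

```python
# A hypothetical log line produced by the format above, assuming a
# User-Agent of: Mozilla/5.0 ("X11"; Linux)
import csv
import io

line = 'HTTP/1.1,1588213406,"Mozilla/5.0 (""X11""; Linux)"'
row = next(csv.reader(io.StringIO(line)))
print(row)  # ['HTTP/1.1', '1588213406', 'Mozilla/5.0 ("X11"; Linux)']
```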
