Compute@Edge log streaming: Google BigQuery

Fastly's Real-Time Log Streaming feature for Compute@Edge services can send log files to BigQuery, Google's managed enterprise data warehouse.

Prerequisites

Before adding BigQuery as a logging endpoint for Fastly Compute@Edge services, you will need to:

  • Create a Google Cloud service account for Fastly logging.
  • Obtain the service account's private key and client email.
  • Enable the BigQuery API.
  • Create a BigQuery dataset.
  • Add a BigQuery table to the dataset.

Each of these steps is described below.

Creating a service account

BigQuery uses service accounts for third-party application authentication. To create a new service account, follow the instructions in the Google Cloud documentation. Keep the following in mind when creating the service account:

  • The service account must be assigned the BigQuery Data Editor role to write to the table you use for Fastly logging. See BigQuery Roles for details about the default permissions assigned to the BigQuery Data Editor role.


  • Set the key type to JSON when creating the service's private key pair.

If you elect to use Google service account impersonation to avoid storing keys with Fastly, you may use this same service account for that purpose. Our guide to creating a Google IAM role provides further details on this limited availability feature.

Obtaining the private key and client email

When you create the BigQuery service account, a JSON file automatically downloads to your computer. This file contains the credentials for your BigQuery service account. Open the file and make a note of the values of the private_key and client_email fields.
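
For reference, the downloaded file is a JSON document along these lines (a representative sketch with placeholder values; the exact set of fields Google includes may vary):

{
  "type": "service_account",
  "project_id": "my-gcp-project",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "fastly-logging@my-gcp-project.iam.gserviceaccount.com",
  "client_id": "..."
}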

Enabling the BigQuery API

To send your Fastly logs to your BigQuery table, you'll need to enable the BigQuery API in the Google Cloud Platform API Manager.

Creating the BigQuery dataset

After you've enabled the BigQuery API, follow these instructions to create a BigQuery dataset:

  1. Open the BigQuery page in the Cloud Console.
  2. In the Explorer panel, select the project where you want to create the dataset.
  3. In the details panel, click Create dataset.
  4. In the Dataset ID field, enter a name for the dataset (e.g., fastly_bigquery).
  5. Click the Create dataset button.

Adding a BigQuery table

After you've created the BigQuery dataset, you'll need to add a BigQuery table. There are four ways of creating the schema for the table:

  • Edit the schema using the BigQuery web interface.
  • Edit the schema using the text field in the BigQuery web interface.
  • Use an existing table.
  • Set the table to automatically detect the schema.

Follow these instructions to add a BigQuery table:

  1. Open the BigQuery page in the Cloud Console.
  2. In the Explorer panel, expand your project and select the BigQuery dataset you created previously.
  3. In the details panel, click Create table. The Create table dialog appears.
  4. In the Source section, select Empty table from the Create table from menu.
  5. In the Table name field, enter a name for the table (e.g., logs).
  6. In the Schema section, use the interface to add fields and complete the schema. See the example schema section below for details.
  7. Click the Create table button.

Adding BigQuery as a logging endpoint

Follow these instructions to add BigQuery as a logging endpoint. As part of the configuration, you can elect to configure Google IAM role-based service account impersonation to avoid storing secrets with Fastly. Read our guide on creating a Google IAM role for more information on this limited availability feature.

  1. Review the information in our Setting Up Remote Log Streaming guide.

  2. Click the Google BigQuery Create endpoint button. The Create a BigQuery endpoint page appears.
  3. Fill out the Create a BigQuery endpoint fields as follows:
    • In the Name field, enter the name you specified in your Compute@Edge code. For example, in our Rust code example, the name is my_endpoint_name; see the sketch following these steps.
    • In the Access Method area, select how Fastly will access Google resources for purposes of log delivery. Select either User Credentials or IAM Role.
    • If you selected User Credentials, enter the following fields:
      • In the Email field, enter the client_email address associated with the BigQuery service account.
      • In the Secret key field, enter the value of the private_key associated with your BigQuery service account.
      • In the Project ID field, enter the ID of your Google Cloud Platform project.
      • In the Dataset field, enter the name of your BigQuery dataset.
      • In the Table field, enter the name of your BigQuery table.
      • In the Template field, optionally enter a strftime-compatible string to use as the template suffix for your table (e.g., %Y%m%d for a date-based table suffix).
    • If you selected IAM Role, enter the following fields:
      • In the Service Account Name field, enter the service account email address you selected when configuring Google IAM service account impersonation.
      • In the Project ID field, enter the ID of your Google Cloud Platform project.
      • In the Dataset field, enter the name of your BigQuery dataset.
      • In the Table field, enter the name of your BigQuery table.
      • In the Template field, optionally enter a strftime-compatible string to use as the template suffix for your table (e.g., %Y%m%d for a date-based table suffix).
  4. Click the Create button to create the new logging endpoint.
  5. Click the Activate button to deploy your configuration changes.
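
Once the endpoint is active, your Compute@Edge program writes log records to it by name. Here's a minimal Rust sketch (a hedged illustration, assuming the fastly and serde_json crates; my_endpoint_name is the hypothetical endpoint name used in this guide, and only a subset of the example fields is logged):

use fastly::log::Endpoint;
use fastly::{Error, Request, Response};
use serde_json::json;
use std::io::Write;

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    // Open the logging endpoint by the name configured in the Fastly web interface.
    let mut endpoint = Endpoint::from_name("my_endpoint_name");

    // Build a JSON object whose keys map onto fields in the BigQuery table
    // schema. Fields omitted here are left NULL, assuming the schema marks
    // them NULLABLE (the default).
    let record = json!({
        "client_ip": req
            .get_client_ip_addr()
            .map(|ip| ip.to_string())
            .unwrap_or_default(),
        "host": req.get_url().host_str().unwrap_or("").to_string(),
        "url": req.get_url().to_string(),
        "request_method": req.get_method().as_str(),
    });

    // Each line written to the endpoint becomes one log record.
    writeln!(endpoint, "{}", record)?;

    Ok(Response::from_body("Hello from Compute@Edge!"))
}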

Data sent to BigQuery must be serialized as a JSON object, and every field in the JSON object must map to a field in your table's schema. The JSON can have nested data in it (e.g., the value for a key in your object can be another object). Here's an example format string for sending data to BigQuery:

{
  "client_ip": "127.0.0.1",
  "timestamp": "2022-05-17 15:09:24.037547 UTC",
  "geo_country": "USA",
  "geo_city": "boston",
  "host": "curiously-selected-polecat.edgecompute.app",
  "url": "https://curiously-selected-polecat.edgecompute.app/",
  "request_method": "GET",
  "request_protocol": "https",
  "request_referer": "",
  "request_user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.0.0 Safari/537.36",
  "response_status": "200",
  "response_reason": "OK",
  "response_body_size": "1234",
  "fastly_server": "IAD"
}
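
Nested values work the same way, provided the schema declares a matching RECORD field. As a hypothetical sketch (the geo field below is not part of the example schema in this guide):

{
  "client_ip": "127.0.0.1",
  "geo": {
    "country": "USA",
    "city": "boston"
  }
}

Here geo would be defined in the table schema as a RECORD field containing country and city STRING subfields.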

Example schema

The BigQuery schema for the example format shown above would look like this:

client_ip:STRING,timestamp:TIMESTAMP,geo_country:STRING,geo_city:STRING,host:STRING,url:STRING,request_method:STRING,request_protocol:STRING,request_referer:STRING,request_user_agent:STRING,response_status:STRING,response_reason:STRING,response_body_size:STRING,fastly_server:STRING
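
If you prefer to supply the schema as JSON (for example, by pasting it into the text field in the BigQuery web interface), an equivalent definition would look like this:

[
  {"name": "client_ip", "type": "STRING"},
  {"name": "timestamp", "type": "TIMESTAMP"},
  {"name": "geo_country", "type": "STRING"},
  {"name": "geo_city", "type": "STRING"},
  {"name": "host", "type": "STRING"},
  {"name": "url", "type": "STRING"},
  {"name": "request_method", "type": "STRING"},
  {"name": "request_protocol", "type": "STRING"},
  {"name": "request_referer", "type": "STRING"},
  {"name": "request_user_agent", "type": "STRING"},
  {"name": "response_status", "type": "STRING"},
  {"name": "response_reason", "type": "STRING"},
  {"name": "response_body_size", "type": "STRING"},
  {"name": "fastly_server", "type": "STRING"}
]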