Compute@Edge log streaming: Elasticsearch

Fastly's Real-Time Log Streaming feature for Compute@Edge services can send log files to Elasticsearch. Elasticsearch is a distributed, RESTful search and analytics engine.

Prerequisites

Before adding Elasticsearch as a logging endpoint for Fastly Compute@Edge services, ensure Elasticsearch is running on a remote server. You'll need the endpoint URL, including the port logs should be sent to (make sure it can receive traffic from Fastly), and the name of the index logs will be written to. For more information on setting up Elasticsearch, see the Elasticsearch setup documentation.

Required privileges

We send data using the Elasticsearch Bulk API via the index action. When using basic authentication, ensure the authenticating user's role is granted the index privileges required for the index action.
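The Bulk API accepts newline-delimited JSON in which each document is preceded by an action line naming the target index. The sketch below builds such a payload by hand to illustrate the index action; the index name and document fields are placeholders, not the exact shape Fastly sends.

```rust
// Build a minimal Elasticsearch Bulk API payload using the `index` action.
// Each log line becomes two NDJSON rows: an action line naming the target
// index, then the document itself. Index name and fields are illustrative.
fn build_bulk_payload(index: &str, log_lines: &[&str]) -> String {
    let mut body = String::new();
    for line in log_lines {
        // Action line: tells Elasticsearch which index to write into.
        body.push_str(&format!("{{\"index\":{{\"_index\":\"{}\"}}}}\n", index));
        // Document line: the log message wrapped in a JSON object.
        body.push_str(&format!("{{\"message\":\"{}\"}}\n", line));
    }
    body
}

fn main() {
    let payload = build_bulk_payload("fastly-logs", &["request served"]);
    print!("{}", payload);
}
```

A user granted the index privilege on the target index can submit such a payload; without it, Elasticsearch rejects the individual index actions in the bulk response.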

Adding Elasticsearch as a logging endpoint

Follow these instructions to add Elasticsearch as a logging endpoint:

  1. Review the information in our Setting Up Remote Log Streaming guide.

  2. Click the Elasticsearch Create endpoint button. The Create an Elasticsearch endpoint page appears.
  3. Fill out the Create an Elasticsearch endpoint fields as follows:
    • In the Name field, enter the name you specified in your Compute@Edge code. For example, in our Rust code example, the name is my_endpoint_name.
    • In the URL field, enter the Elasticsearch endpoint URL, including the port logs should be sent to. Logs must be sent over HTTPS to a port that can receive incoming TCP traffic from Fastly.
    • In the Index field, enter the name of the Elasticsearch index to send logs to. The index must follow the Elasticsearch index format rules. We support strftime interpolated variables inside braces prefixed with a pound symbol. For example, #{%F} will interpolate as YYYY-MM-DD with today's date.
    • In the Pipeline field, optionally enter the ID of the Elasticsearch ingest pipeline that applies pre-processing transformations before indexing (for example, my_pipeline_id).
    • In the Maximum logs field, optionally enter the maximum number of logs to include in each batch; if zero, no batch-size limit is applied.
    • In the Maximum bytes field, optionally enter the maximum size of each log batch, in bytes.
    • In the BasicAuth user field, optionally enter your basic authentication username.
    • In the BasicAuth password field, optionally enter your basic authentication password.
    • In the TLS hostname field, optionally enter a hostname to verify the server's certificate. This should be one of the Subject Alternative Name (SAN) fields for the certificate. Common Names (CN) are not supported.
    • In the TLS CA certificate field, optionally copy and paste the certification authority (CA) certificate used to verify that the logging server's certificate is valid. The certificate you upload must be in PEM format. You only need to provide this if the server's certificate is not signed by a well-known certification authority.
    • In the TLS client certificate field, optionally copy and paste the TLS client certificate used to authenticate to the logging server. The TLS client certificate you upload must be in PEM format and must be accompanied by a TLS client key. A TLS client certificate allows the logging server to verify that Fastly is performing the connection.
    • In the TLS client key field, optionally copy and paste the TLS client key used to authenticate to the logging server. The TLS client key you upload must be in PEM format and must be accompanied by a TLS client certificate. A TLS client key allows the logging server to verify that Fastly is performing the connection.
  4. Click the Create button to create the new logging endpoint.
  5. Click the Activate button to deploy your configuration changes.
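The index-name interpolation described in the Index field above can be sketched as follows. This toy version handles only the %F specifier and takes the formatted date as an argument (Rust's standard library has no strftime); Fastly's actual implementation supports the full strftime set and evaluates the date at delivery time.

```rust
// Illustrative sketch of index-name interpolation: strftime specifiers
// inside `#{...}` are expanded when logs are delivered. Only `%F`
// (YYYY-MM-DD) is handled here, with the date string supplied by the
// caller; this is not Fastly's real implementation.
fn interpolate_index(template: &str, date_yyyy_mm_dd: &str) -> String {
    template.replace("#{%F}", date_yyyy_mm_dd)
}

fn main() {
    // An index template of "logs-#{%F}" delivered on 2023-05-01
    // resolves to the daily index "logs-2023-05-01".
    println!("{}", interpolate_index("logs-#{%F}", "2023-05-01"));
}
```

Daily-rolling index names like this are a common Elasticsearch pattern, since they keep individual indices small and make retention policies easy to apply.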