
Creating and customizing a robots.txt file

Once you've created a CNAME DNS record to direct your domain's traffic to Fastly, you can create a custom response that Fastly serves as the robots.txt file for your website. To create and configure your robots.txt file via the Fastly web interface, follow these steps:

  1. Log in to the Fastly web interface and click the Configure link.
  2. From the service menu, select the appropriate service.
  3. Click the Edit configuration button and then select Clone active. The service version page appears.
  4. Click the Content tab. The Content page appears.


  5. Click the Create response button. The Create a new synthetic response page appears.


  6. Fill out the Create a new synthetic response fields as follows:

    • Leave the Status menu set at its default 200 OK.
    • In the MIME Type field, type text/plain.
    • In the Response field, type at least one User-agent line and at least one Disallow line. For example, the lines User-agent: *, Disallow: /tmp/*, and Disallow: /foo.html tell all user agents (via the User-agent: * line) that they are not allowed to crawl anything in the /tmp/ directory or the /foo.html file (via the Disallow: /tmp/* and Disallow: /foo.html lines, respectively).
    • In the Description field, type an appropriate name. For example robots.txt.
  7. Click the Create button. Your new response appears in the list of responses.

  8. Click the Attach a condition link to the right of the newly created response. The Create a new condition window appears.


  9. Fill out the Create a new condition fields as follows:

    • From the Type menu, select the desired condition (for example, Request).
    • In the Name field, type a meaningful name for your condition (e.g., robots.txt).
    • In the Apply if field, type the logical expression to execute in VCL to determine if the condition resolves as true or false. In this case, the logical expression would be the location of your robots.txt file (e.g., req.url ~ "^/robots.txt").
    • Leave the Priority set to 10.
  10. Click the Save and apply to button.

  11. Click the Activate button to deploy your configuration changes.
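The response and condition you configured above are compiled into your service's generated VCL. As a rough illustration, the result behaves something like the following sketch (the internal status code and exact structure are illustrative; the VCL Fastly actually generates will differ):

```
# Sketch only: a request condition plus a synthetic response,
# expressed as hand-written VCL. The 900 status is an arbitrary
# internal marker used to route the request to vcl_error; it is
# never seen by clients.
sub vcl_recv {
  if (req.url ~ "^/robots.txt") {
    error 900;
  }
}

sub vcl_error {
  if (obj.status == 900) {
    set obj.status = 200;
    set obj.response = "OK";
    set obj.http.Content-Type = "text/plain";
    synthetic {"User-agent: *
Disallow: /tmp/*
Disallow: /foo.html
"};
    return(deliver);
  }
}
```

Because the condition matches in vcl_recv, requests for /robots.txt are answered directly from Fastly and never reach your origin.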

Why can't I customize my robots.txt file with global.prod.fastly.net?

Adding the .global.prod.fastly.net suffix to your domain (for example, www.example.com.global.prod.fastly.net) in a browser or in a cURL command lets you test how your production site will perform using Fastly's services.

To prevent Google from accidentally crawling this test URL, we provide an internal robots.txt file that instructs Google's web crawlers to ignore all pages for all hostnames that end in .prod.fastly.net.
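Based on that description, the internal file is effectively a standard disallow-everything robots.txt, along the lines of the following (shown for illustration; it may not be the verbatim file Fastly serves):

```
User-agent: *
Disallow: /
```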


This internal robots.txt file cannot be customized via the Fastly web interface until after you have set the CNAME DNS record for your domain to point to global.prod.fastly.net.

