Creating and customizing a robots.txt file

Once you've created a CNAME DNS record to direct your domain's traffic to Fastly, you can create a custom response in the Fastly application that will be served as your website's robots.txt file. To create and configure your robots.txt file via the Fastly application, follow the steps below.

  1. Log in to the Fastly application and click the Configure tab (the wrench icon).

    the configure tab

  2. From the Service menu, select the appropriate service and then click the blue Configure button. The main controls for your selected service appear.

  3. Click Content from the section list on the left.

  4. In the Responses area at the bottom of the page, click the New button. The New Response window appears.

    a robots response dialog

  5. Fill out the New Response controls as follows:

    • In the Name field, type an appropriate name (for example, robots.txt).
    • Leave the Status menu set to its default, 200 OK.
    • In the MIME Type field, type text/plain.
    • In the text editor area at the bottom of the window, specify at least one User-agent line and at least one Disallow line. For instance, the example shown above tells all user agents (via the User-agent: * line) that they are not allowed to crawl anything in the /tmp/ directory or the /foo.html file (via the Disallow: /tmp/* and Disallow: /foo.html lines, respectively).
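
A response body matching the example above would look like the following (the /tmp/* and /foo.html paths are placeholders; substitute the paths you want to block):

```
User-agent: *
Disallow: /tmp/*
Disallow: /foo.html
```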
  6. Click the Create button. Your new response appears in the list of responses.

  7. Click the gear icon to the right of the new response you just created and select Request Conditions from the menu.

    a robots request condition

    The New Condition window appears.

    a req.url condition for robots

  8. Fill out the fields of the New Condition window as follows:

    • In the Name field, type a meaningful name for your condition (e.g., robots.txt).
    • In the Apply If field, type the logical expression that VCL will evaluate to determine whether the condition is true or false. In this case, the expression matches the location of your robots.txt file (e.g., req.url ~ "^/robots.txt").
    • Leave the Priority set to 10.
  9. Click Create to create the new condition.
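
Before deploying, you can sanity-check the pieces of this configuration locally. The sketch below uses only Python's standard library; note that urllib.robotparser implements plain prefix matching and does not understand the "*" path wildcard, so the /tmp/ rule is written here in its prefix form (crawlers that support wildcards treat Disallow: /tmp/* the same way):

```python
import re
from urllib.robotparser import RobotFileParser

# The request condition uses the same logic as the Apply If field:
# req.url ~ "^/robots.txt" is a regular-expression match in VCL.
condition = re.compile(r"^/robots.txt")
assert condition.search("/robots.txt")
assert not condition.search("/blog/robots.txt")

# Check the response body against Python's robots.txt parser.
rules = """\
User-agent: *
Disallow: /tmp/
Disallow: /foo.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "/tmp/cache.html"))  # False
print(parser.can_fetch("Googlebot", "/foo.html"))        # False
print(parser.can_fetch("Googlebot", "/index.html"))      # True
```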

Why can't I customize my robots.txt file with global.prod.fastly.net?

Appending .global.prod.fastly.net to your domain (for example, www.example.com.global.prod.fastly.net) in a browser or a cURL command lets you test how your production site will perform using Fastly's services.

To prevent Google from accidentally crawling this test URL, we provide an internal robots.txt file that instructs Google's web crawlers to ignore all pages on any hostname ending in .prod.fastly.net.

a default robots.txt file
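
The internal file is a standard "disallow everything" robots.txt, along these lines (shown for illustration; the exact file Fastly serves may differ):

```
User-agent: *
Disallow: /
```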

This internal robots.txt file cannot be customized via the Fastly UI until you have set the CNAME DNS record for your domain to point to global.prod.fastly.net.
