Creating and customizing a robots.txt file

The robots.txt file tells web robots how to crawl webpages on your website. You can use Fastly's web interface to create and configure a robots.txt file. If you follow the instructions in this guide, Fastly will serve the robots.txt file from cache so the requests won't hit your origin.
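
A robots.txt file is just plain text made up of User-agent and Disallow directives. As a quick illustration (the path below is only an example), a minimal file might look like this:

    User-agent: *
    Disallow: /private/

Here, User-agent: * addresses all crawlers, and Disallow: /private/ asks them not to crawl anything under the /private/ directory.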

Creating a robots.txt file

To create and configure your robots.txt file, follow the steps below:

  1. Log in to the Fastly web interface.
  2. From the Home page, select the appropriate service. You can use the search box to search by ID, name, or domain.
  3. Click the Edit configuration button and then select the option to clone the active version. The Domains page appears.
  4. Click the Content link. The Content page appears.
  5. Click the robots.txt switch to enable the robots.txt response.

    (Screenshot: the robots.txt quick config)

  6. In the TXT Response field, customize the response for the robots.txt file (a sample response appears after these steps).

  7. Click the Save button to save the response.

  8. Click the Activate button to deploy your configuration changes.
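
For example, you might enter something like the following in the TXT Response field (the paths here are placeholders; adjust them for your own site):

    User-agent: *
    Disallow: /tmp/*
    Disallow: /foo.html

This tells all user agents not to crawl anything in the /tmp/ directory or the /foo.html file.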

Manually creating and customizing a robots.txt file

If you need more control over the robots.txt response than the quick config provides, you can follow the steps below to manually create the synthetic response and condition:

  1. Log in to the Fastly web interface.
  2. From the Home page, select the appropriate service. You can use the search box to search by ID, name, or domain.
  3. Click the Edit configuration button and then select the option to clone the active version. The Domains page appears.
  4. Click the Content link. The Content page appears.
  5. Click the Set up advanced response button. The Create a synthetic response page appears.

    (Screenshot: a synthetic response dialog)

  6. Fill out the Create a synthetic response fields as follows:

    • In the Name field, enter an appropriate name (for example, robots.txt).
    • Leave the Status menu set to its default, 200 OK.
    • In the MIME Type field, enter text/plain.
    • In the Response field, enter at least one User-agent line and at least one Disallow line. For example, a response containing User-agent: *, Disallow: /tmp/*, and Disallow: /foo.html tells all user agents that they may not crawl anything in the /tmp/ directory or the /foo.html file. (A sketch of roughly equivalent custom VCL appears after these steps.)
  7. Click the Create button. Your new response appears in the list of responses.

  8. Click the Attach a condition link to the right of the newly created response. The Create a new condition window appears.

    (Screenshot: a req.url condition for robots)

  9. Fill out the Create a condition fields as follows:

    • From the Type menu, select the desired condition (for example, Request).
    • In the Name field, enter a meaningful name for your condition (e.g., Robots).
    • In the Apply if field, enter the logical expression to execute in VCL to determine whether the condition resolves as true or false. In this case, the expression should match the location of your robots.txt file (e.g., req.url.path == "/robots.txt").
  10. Click the Save button.

  11. Click the Activate button to deploy your configuration changes.
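
If you prefer to manage your service with custom VCL, the response and condition created above correspond roughly to the sketch below. This is an illustration only, not the exact VCL Fastly generates for you; the status code 601 is simply an arbitrary marker used here to route the request to vcl_error.

    sub vcl_recv {
      # Serve robots.txt as a synthetic response instead of going to origin.
      if (req.url.path == "/robots.txt") {
        error 601;
      }
    }

    sub vcl_error {
      if (obj.status == 601) {
        set obj.status = 200;
        set obj.response = "OK";
        set obj.http.Content-Type = "text/plain";
        # The body lines are deliberately not indented so each directive
        # starts at the beginning of a line in the served file.
        synthetic {"User-agent: *
    Disallow: /tmp/*
    Disallow: /foo.html
    "};
        return(deliver);
      }
    }

The request is intercepted in vcl_recv and answered in vcl_error, which is roughly the pattern Fastly's generated VCL uses for synthetic responses.
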
NOTE

For an in-depth explanation of creating custom responses, check out our Responses Tutorial.
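
Once your changes are activated and your domain points at Fastly, you can check the result with curl (www.example.com below is a placeholder for your own domain):

    curl -i https://www.example.com/robots.txt

The response body should be the text you configured, confirming that the synthetic response, rather than your origin, is answering the request.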

Why can't I customize my robots.txt file with global.prod.fastly.net?

You can test how your production site will perform on Fastly's services by adding .global.prod.fastly.net to your domain (for example, www.example.com.global.prod.fastly.net) in a browser or in a curl command.
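
For example (assuming www.example.com is a domain on your service; plain HTTP is used here because the test hostname may not be covered by your TLS certificate):

    curl -I http://www.example.com.global.prod.fastly.net/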

To prevent Google from accidentally crawling this test URL, we provide an internal robots.txt file that instructs Google's web crawlers to ignore all pages for any hostname ending in .prod.fastly.net.
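
A robots.txt file that blocks all crawling generally looks like the following (an illustration only; the exact contents of Fastly's internal file may differ):

    User-agent: *
    Disallow: /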

(Screenshot: a default robots.txt file)

This internal robots.txt file cannot be customized via the Fastly web interface until after you have set the CNAME DNS record for your domain to point to global.prod.fastly.net.
