    Segmented Caching

      Last updated September 27, 2019

    Fastly’s Segmented Caching feature allows you to cache files of any size. It works by breaking files into 1 MB segments in Fastly’s cache, then recombining or splitting those segments to respond to byte-range requests of arbitrary size from clients. Once enabled, Segmented Caching improves performance for range requests and allows Fastly to efficiently cache files of any size. If Segmented Caching is not enabled, requests for files larger than 2 GB (or larger than 5 GB with Streaming Miss) will result in errors.

    How Segmented Caching works

    When an end user makes a range request for a file with Segmented Caching enabled and a cache miss occurs (that is, at least part of the range is not cached), Fastly makes the appropriate range requests back to origin. Segmented Caching then ensures that only the portions of the file the end user requested (subject to rounding based on object size) are cached, rather than the entire file. On a partial cache hit, the cached portions are served from cache and the missing pieces are fetched from origin; for example, a request for the first 2 MB of a file whose first 1 MB segment is already cached results in only the remaining bytes being fetched from origin. (A request for an entire file is treated as a byte-range request from byte 0 to the end of the file.)

    Once Fastly has all of the objects necessary to respond to an end user’s request, Segmented Caching assembles the response by concatenating the objects or pulling portions of them. The requests back to origin, also called “inner requests,” have a true value for fastly.segmented_caching.is_inner_req, and requests from end users, also called “outer requests,” have a true value for fastly.segmented_caching.is_outer_req. If a request is made for an object without Segmented Caching enabled, both variables have a value of false.
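    As a purely illustrative sketch (the header name and the choice of vcl_deliver are assumptions, not taken from Fastly’s documentation), you could check the outer-request variable in custom VCL to tag responses sent back to end users:

      # Hypothetical debug snippet for vcl_deliver: mark responses to
      # end-user (outer) requests handled by Segmented Caching.
      if (fastly.segmented_caching.is_outer_req) {
        set resp.http.X-Segmented-Caching = "outer";
      }

    Because both variables are false when Segmented Caching is not enabled for an object, the header only appears on responses where the feature actually applied.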

    Limitations and considerations

    This feature has the following limitations and considerations you should take into account:

    Enabling Segmented Caching

    Use the following steps to enable Segmented Caching.

    1. Determine which files should use Segmented Caching.

    2. From the service menu, select the appropriate service.
    3. Click the Edit configuration button and then select Clone active. The Domains page appears.
    4. Click the VCL Snippets link. The VCL Snippets page appears.
    5. Click the Create your first VCL snippet button. The Create a VCL snippet page appears.
    6. In the Name field, type an appropriate name (e.g., Enable segmented caching).
    7. From the Type (placement of the snippets) controls, select within subroutine.
    8. From the Select subroutine menu, select recv (vcl_recv).
    9. In the VCL field, add a VCL snippet that sets the req.enable_segmented_caching VCL variable to true in vcl_recv. For example, to ensure proper caching of the large files you've identified that contain MPEG-2-compressed video data, you could add this VCL snippet in vcl_recv:

      # my custom code to enable Segmented Caching
      if (req.url.ext == "ts") {
        set req.enable_segmented_caching = true;
      }

      This snippet tells Fastly to look for requests for files with the ts extension and enable Segmented Caching for those files. (An alternative, path-based variation is sketched after these steps.)

    10. Click Create to create the snippet.
    11. Click the Activate button to deploy your configuration changes.
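    The condition used to enable the feature can be anything expressible in VCL. As a hypothetical variation (the /downloads/ path prefix is an assumption for illustration, not from Fastly’s documentation), you could enable Segmented Caching for everything under a directory of large files instead of matching on a file extension:

      # Illustrative alternative: enable Segmented Caching by URL path prefix
      if (req.url.path ~ "^/downloads/") {
        set req.enable_segmented_caching = true;
      }

    As with the extension-based snippet above, this belongs in vcl_recv and only affects requests that match the condition.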