Dynamic Files allows you to define a variety of configurations that help search engines discover and index your content.
Robots.txt Configuration
Robots.txt Configuration allows you to define how the CDN delivers the robots.txt file.
The robots.txt file instructs web crawlers and search engine robots how to crawl pages on your website. Put simply, the text file tells all crawlers, or a list of specific ones, which pages of your website they are allowed or disallowed to crawl and index.
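For example, a minimal robots.txt might block one crawler entirely while allowing everything except a single directory for all others (the user agent, path, and domain below are illustrative, not defaults):

    User-agent: BadBot
    Disallow: /

    User-agent: *
    Disallow: /private/
    Allow: /

    Sitemap: https://www.example.com/sitemap.xml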
Robots.txt Cache Control Header
This is the Cache-Control header the CDN will deliver with the robots.txt file.
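As an illustrative value (not a default), setting the header to "public, max-age=86400" tells browsers and intermediate caches that they may store the robots.txt response for 24 hours, so the CDN would include the following header in its response:

    Cache-Control: public, max-age=86400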
Robots.txt File
Here you can paste the contents of the Robots.txt file you’d like the CDN to deliver instead of the default.
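After saving, you can confirm the CDN is returning your custom content and Cache-Control header with a quick request (the hostname below is a placeholder for your own CDN URL):

    curl -i https://cdn.example.com/robots.txt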
Flash Cross Domain File
This allows you to enable and configure the crossdomain.xml file the CDN delivers.
It is a cross-domain policy file that grants a web client, such as Adobe Flash Player or Adobe Reader, permission to handle data across domains.
CDN Generated crossdomain.xml
This text area allows you to copy and paste your crossdomain.xml code directly.
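For reference, a typical crossdomain.xml that permits requests only from subdomains of your own site over HTTPS looks like the following (the domain is illustrative; adjust the policy to match your needs):

    <?xml version="1.0"?>
    <!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
    <cross-domain-policy>
        <allow-access-from domain="*.example.com" secure="true"/>
    </cross-domain-policy>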