# This robots.txt file requests that search engines and other
# automated web agents do not try to index the files in this
# directory. This file is required if you use Revive Adserver
# without virtual domains (i.e. you use a single URL to run
# both the admin interface and the delivery engine), and have
# the web root set to this directory.

User-agent: *
Disallow: /