## Enable robots.txt rules for all crawlers
User-agent: *
## Do not crawl development and backup files
Disallow: /*.sql$
Disallow: /*.tar$
Disallow: /*.tgz$

User-agent: MJ12bot
Disallow: /

User-agent: Vagabondo
Disallow: /

User-agent: BaiDuSpider
Disallow: /

User-agent: Exabot
Disallow: /

User-agent: Yandex
Disallow: /

User-agent: BSpider
Disallow: /

## Sitemaps
Sitemap: https://www.snel.com/sitemap_index.xml
Sitemap: https://www.snel.com/post-sitemap.xml
Sitemap: https://www.snel.com/page-sitemap.xml
Sitemap: https://www.snel.com/kb-sitemap.xml
Sitemap: https://www.snel.com/kbtopic-sitemap.xml