// by seankriegler.com

Robots.txt Generator

Build a valid robots.txt for your site. Add allow/disallow rules per crawler and reference your sitemap.

// Where It Goes

Place it at the root of your site, e.g. https://yourdomain.com/robots.txt. Crawlers only check that exact location, and you need one file per (sub)domain.
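As a minimal sketch (yourdomain.com stands in for your own domain), a permissive file served at the root could look like:

```
# Served at https://yourdomain.com/robots.txt
User-agent: *
Disallow:
```

An empty Disallow value blocks nothing, so all crawlers may fetch every path.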

// User-agent

* targets all crawlers. Name a specific bot, such as Googlebot, Bingbot, or GPTBot, to give it its own rule group.
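For example, you can combine a catch-all group with a per-bot group; the /admin/ path here is hypothetical:

```
# Rules for every crawler
User-agent: *
Disallow: /admin/

# Stricter rules for one named bot
User-agent: GPTBot
Disallow: /
```

A crawler follows the most specific group that matches its name and ignores the rest.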

// Allow / Disallow

Disallow blocks crawlers from the listed paths. Allow re-permits a sub-path inside a disallowed parent.
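A sketch of the Allow-inside-Disallow pattern, with hypothetical paths:

```
User-agent: *
# Block the whole /private/ directory...
Disallow: /private/
# ...but re-permit one page inside it
Allow: /private/public-page.html
```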

// Sitemap

The Sitemap directive helps search engines discover your URLs faster. Use the full absolute URL of your sitemap.xml, not a relative path.
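The directive goes on its own line and can appear anywhere in the file; yourdomain.com is a placeholder here:

```
Sitemap: https://yourdomain.com/sitemap.xml
```

You can list multiple Sitemap lines if your site splits its sitemap into several files.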