Robots.txt Generator
Configure robots.txt for your site with a few clicks. Choose which bots to allow and which paths to block.
Allowed bots
Disallow paths
What is robots.txt and how does it affect SEO?
The robots.txt file is a plain-text file placed in the site root that tells search engine crawlers which parts of the site they may crawl and which to skip. It is defined by the Robots Exclusion Protocol (REP), standardized in RFC 9309.
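A minimal example (the domain and paths are placeholders): each record starts with a User-agent line naming the crawler it applies to, followed by Disallow/Allow rules.

```
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Allow: /
```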
Robots.txt is NOT a security measure — it only asks well-behaved crawlers not to fetch certain URLs, and malicious bots simply ignore it. To actually restrict access, use authentication and server-side access controls.
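How a well-behaved crawler honors these rules can be sketched with Python's standard urllib.robotparser; the rules and URLs below are made up for illustration:

```python
from urllib import robotparser

# Hypothetical robots.txt content; a real crawler fetches it from /robots.txt.
rules = """\
User-agent: *
Disallow: /admin/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A polite crawler checks before every request; a malicious one skips this step.
print(rp.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))    # True
```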
A properly configured robots.txt helps manage crawl budget: by blocking pages that don't need to be crawled (admin panels, API endpoints, duplicate URLs), you let Googlebot focus its resources on the pages that matter. Note that Disallow prevents crawling, not indexing — a blocked URL can still appear in search results if other pages link to it; use a noindex meta tag to keep a page out of the index.
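For example, a site might keep crawlers out of its back office, machine-only endpoints, and duplicate listing URLs (the paths here are illustrative; the * wildcard is supported by major crawlers such as Googlebot):

```
User-agent: *
Disallow: /admin/
Disallow: /api/
# Listings that differ only by a sort parameter are duplicates:
Disallow: /*?sort=
```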
Always add a Sitemap directive to robots.txt — it helps search engines discover and index all your public pages faster.
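The Sitemap directive sits outside the User-agent groups and takes an absolute URL (the URL below is a placeholder):

```
Sitemap: https://example.com/sitemap.xml
```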