Robots.txt Checker

Fetch and analyze any website's robots.txt file to ensure proper indexing and identify SEO issues

Uses the AllOrigins API to work around browser cross-origin (CORS) restrictions when fetching robots.txt files
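Browsers block direct cross-origin fetches of another site's robots.txt, so the request is relayed through a proxy. A minimal sketch, assuming the public AllOrigins `raw` endpoint; the function names are illustrative, not the tool's actual API:

```javascript
// Build the robots.txt URL for a user-entered domain like
// "example.com" or "www.example.com" (scheme and paths stripped).
function buildRobotsUrl(domain) {
  const host = domain.replace(/^https?:\/\//, "").replace(/\/.*$/, "");
  return `https://${host}/robots.txt`;
}

// Wrap the target URL in the AllOrigins "raw" endpoint, which relays
// the response and sidesteps the browser's CORS restriction.
function buildProxyUrl(domain) {
  return "https://api.allorigins.win/raw?url=" +
         encodeURIComponent(buildRobotsUrl(domain));
}

// Usage (network call shown for context):
// const res = await fetch(buildProxyUrl("google.com"));
// const robotsTxt = await res.text();
```

Note that the proxy only relays the file; it does not add any security beyond what HTTPS already provides.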

Analyze Robots.txt

Please enter a valid domain name (e.g., example.com or www.example.com)
Analysis completed successfully!

Fetching robots.txt file...

https://google.com/robots.txt
Found

Raw Content

Analyze a domain to view robots.txt content

Analysis

SEO Analysis

We found 12 directives for 3 user agents in this robots.txt file.

  • File is properly formatted
  • No critical resources blocked
  • Crawl-delay directive detected
  • Sitemap references included
  • Sensitive directories are blocked
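The directive and user-agent counts reported above can be produced by a small line-by-line parser. A sketch under the assumption that the tool tallies `Allow`, `Disallow`, `Sitemap`, and `Crawl-delay` lines; names are illustrative:

```javascript
// Parse a robots.txt string into counts the analysis can report on.
// Comments (after "#") and blank lines are ignored, per the robots
// conventions; field names are case-insensitive.
function parseRobots(text) {
  const result = { userAgents: [], directives: 0, sitemaps: [], crawlDelay: null };
  for (const raw of text.split("\n")) {
    const line = raw.split("#")[0].trim();
    if (!line) continue;
    const idx = line.indexOf(":");
    if (idx === -1) continue;
    const field = line.slice(0, idx).trim().toLowerCase();
    const value = line.slice(idx + 1).trim();
    if (field === "user-agent") {
      if (!result.userAgents.includes(value)) result.userAgents.push(value);
    } else if (field === "sitemap") {
      result.sitemaps.push(value);
      result.directives++;
    } else if (field === "crawl-delay") {
      result.crawlDelay = Number(value);
      result.directives++;
    } else if (field === "allow" || field === "disallow") {
      result.directives++;
    }
  }
  return result;
}
```

A summary line like "12 directives for 3 user agents" then falls out of `result.directives` and `result.userAgents.length`.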

Recommendations

  • Add a sitemap reference if one is missing
  • Avoid blocking CSS and JavaScript files; search engines need them to render pages
  • Make sure important pages are not disallowed
  • Verify the file with Google Search Console's robots.txt report
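Checks like these can be implemented as simple heuristics over the Disallow rules. A self-contained sketch (illustrative rules, not the tool's exact logic):

```javascript
// Flag common robots.txt issues: missing sitemap, blocked CSS/JS
// assets, and a blanket "Disallow: /" that hides the whole site.
function checkRobots(text) {
  const lines = text.split("\n")
    .map(l => l.split("#")[0].trim())
    .filter(Boolean);
  const disallows = lines
    .filter(l => /^disallow\s*:/i.test(l))
    .map(l => l.replace(/^disallow\s*:/i, "").trim())
    .filter(Boolean);

  const warnings = [];
  if (!lines.some(l => /^sitemap\s*:/i.test(l)))
    warnings.push("missing sitemap reference");
  if (disallows.some(p => /\.(css|js)$|\/(css|js)(\/|$)/i.test(p)))
    warnings.push("CSS/JS blocked");
  if (disallows.includes("/"))
    warnings.push("entire site disallowed");
  return warnings;
}
```

These are advisory only; Google Search Console remains the authoritative check for how Googlebot actually interprets the file.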