Robots & Sitemap Inspector
Decode your crawler rules. Ensure your sitemap is visible and that you aren't accidentally blocking the crawlers your traffic depends on.
What are robots.txt and sitemaps?
Robots.txt is a gatekeeper: a plain-text file at your site's root that tells search engine bots which paths they may and may not crawl. A sitemap is a map: an XML file that tells bots where to find your content. Together, they shape how your site is crawled and indexed.
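For illustration, a minimal robots.txt might look like the following. The domain, paths, and bot names here are placeholders, not recommendations:

```
# Let all crawlers in, but keep them out of /admin/
User-agent: *
Disallow: /admin/

# Declare the sitemap so bots can find your content
Sitemap: https://example.com/sitemap.xml
```

A single character matters here: `Disallow: /` blocks an entire site, while an empty `Disallow:` blocks nothing. Mistakes of exactly this kind are what an inspector is meant to catch.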
Why use this inspector?
- Prevent Accidents: Catch rules that unintentionally block Google or AI crawlers.
- Verify Visibility: Confirm your sitemap is declared in robots.txt and reachable by crawlers (a minimal programmatic check is sketched below).
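
If you want to reproduce these checks yourself, Python's standard library ships a robots.txt parser. Below is a minimal sketch under placeholder assumptions: `example.com` and the listed user agents stand in for your own site and the bots you care about.

```python
# Minimal sketch of the kind of check an inspector performs,
# using only the Python standard library.
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"  # placeholder: substitute your own site

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetches and parses the live robots.txt

# Check whether common crawlers may fetch the homepage
for bot in ("Googlebot", "Bingbot", "GPTBot"):
    allowed = parser.can_fetch(bot, f"{SITE}/")
    print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")

# Confirm a sitemap is declared (Python 3.8+); returns None if absent
sitemaps = parser.site_maps()
print("Sitemaps:", sitemaps or "none declared in robots.txt")
```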
