Disallow Directive Test

What is it?

Check whether your robots.txt file is instructing search engine crawlers to avoid parts of your website. The disallow directive is used in robots.txt to tell search engines not to crawl a specific file, page, or directory. Note that disallow only blocks crawling; a disallowed page can still appear in search results if other sites link to it.
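
For example, a minimal robots.txt that blocks all crawlers from one directory and one page while leaving the rest of the site crawlable might look like this (the paths are illustrative):

```
User-agent: *
Disallow: /private/          # block every crawler from the /private/ directory
Disallow: /draft-page.html   # block a single page
```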

How do I fix it?

If you want your website to be crawled by search engines, use the disallow directive in your robots.txt file carefully: every path you disallow becomes invisible to crawlers, so an overly broad rule (such as Disallow: /) can block your entire site.
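
One way to verify your rules is with Python's standard-library robots.txt parser, which reports whether a given crawler may fetch a given URL. This is a minimal sketch; the example.com domain and paths are placeholders for your own site:

```python
import urllib.robotparser

# Fetch and parse the site's robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# can_fetch() returns False when the user agent is disallowed
# from crawling the URL under the parsed rules.
for url in ("https://example.com/", "https://example.com/private/page.html"):
    status = "allowed" if rp.can_fetch("Googlebot", url) else "disallowed"
    print(url, "->", status)
```

Running this against your live robots.txt quickly shows whether a rule is blocking more of your site than you intended.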
