
Disallow Directive Test

What is it?

Check whether your robots.txt file is instructing search engine crawlers to avoid parts of your website. The disallow directive is used in robots.txt to tell search engines not to crawl a file, page, or directory.
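For example, a minimal robots.txt might look like the following (the paths shown are illustrative, not recommendations):

```text
# Rules apply to all crawlers
User-agent: *

# Block crawling of these directories
Disallow: /admin/
Disallow: /tmp/

# Everything else remains crawlable
Allow: /
```

A blank `Disallow:` line (no path) means nothing is blocked, while `Disallow: /` blocks the entire site for the matched user agent.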

How do I fix it?

If you want your website to be crawled by search engines, use the disallow directive in your robots.txt file carefully: compliant crawlers will skip any path you disallow, so an overly broad rule can hide important pages from search results.
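You can verify how a given set of disallow rules affects a specific URL with Python's standard-library `urllib.robotparser`. This is a sketch using hypothetical rules and an example domain, not your site's actual robots.txt:

```python
from urllib import robotparser

# Hypothetical robots.txt rules for illustration; in practice you would
# fetch https://yoursite.com/robots.txt instead.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) returns True if the URL may be crawled.
print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False: disallowed
print(parser.can_fetch("*", "https://example.com/blog/post"))       # True: not blocked
```

Running this against your real robots.txt (via `parser.set_url(...)` and `parser.read()`) is a quick way to confirm that important pages are not accidentally disallowed.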
© SEO Site Checkup 2009-2021 • All rights reserved