Check whether your robots.txt file is instructing search engine crawlers to avoid parts of your website. The Disallow directive in robots.txt tells search engines not to crawl a given file, page, or directory, which usually keeps that content out of search results.
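For instance, a robots.txt file like the one below (a minimal illustration, not taken from your site) tells every crawler to stay away from the entire site, so none of your pages get crawled:

  User-agent: *
  Disallow: /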
How do I fix it?
If you want your website to be crawled by search engines, use the Disallow directive in your robots.txt file carefully, blocking only the files, pages, or directories that you genuinely want to keep out of search results.
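As a sketch (the /admin/ path and sitemap URL are placeholder assumptions), a robots.txt file that keeps the site crawlable while excluding only a private area might look like this:

  User-agent: *
  Disallow: /admin/
  Sitemap: https://www.example.com/sitemap.xml

Everything not listed under a Disallow rule remains crawlable by default, so you only need entries for the content you want to exclude.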