Check whether your robots.txt file is instructing search engine crawlers to avoid parts of your website. The Disallow directive in robots.txt tells search engines not to crawl a file, page, or directory, which can prevent that content from appearing in search results.
How do I fix it?
If you want your website to be crawled by search engines, use the Disallow directive in your robots.txt file carefully: only block paths that you genuinely do not want crawled, and make sure no rule accidentally covers pages you want indexed.
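For illustration, here is a minimal robots.txt sketch (the directory names are hypothetical examples, not recommendations for your site). It blocks crawling of a private area for all crawlers while leaving the rest of the site open:

```
# Applies to all crawlers
User-agent: *
# Block crawling of this hypothetical directory
Disallow: /private/
# Everything else remains crawlable
```

A common mistake is a bare `Disallow: /`, which blocks the entire site; an empty `Disallow:` value, by contrast, blocks nothing.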