How to remove the Crawling Error "Restricted by robots.txt" for Blogger Blog URLs with Labels/Tags

This type of error means Google tried to crawl your URLs but could not access them. This is normal: Blogger's default robots.txt is designed to prevent the label/tag pages of your posts from being indexed. However, if you are not comfortable with it and would like to remove this error, you can switch your blog to 'No Archiving'. Just sign in to your blog, go to Settings -> Archiving, and choose 'No Archive'.
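For reference, Blogger's default robots.txt typically looks something like the sketch below (the exact sitemap URL will match your own blog; `example.blogspot.com` is a placeholder). Label/tag pages live under `/search/label/...`, so the `Disallow: /search` rule is what causes the "Restricted by robots.txt" report:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```

In other words, the error is Google telling you it obeyed this rule, not that something is broken on your blog.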
©2011 Tutorials4Share.blogspot.com
About Me
- Tutorials4Share
- I am a software developer with roughly 5 years of experience developing end-to-end solutions in C#, Java, C/C++, PHP, and HTML/CSS/JavaScript. At the moment, I am enrolled in the Professional Doctorate in Engineering program in Software Technology at Eindhoven University of Technology. My areas of particular interest include software design, data structures and algorithms, problem solving, software security, embedded systems, machine learning, and data science.