Google’s John Mueller responded to a question on LinkedIn about an unsupported noindex directive in the robots.txt file of his own personal website. He explained the pros and cons of search engine support for the directive and offered insights into Google’s internal discussions about supporting it.
John Mueller’s Robots.txt
Mueller’s robots.txt has been a topic of conversation for the past week because of the odd, non-standard directives he used within it.
It was almost inevitable that Mueller’s robots.txt would be scrutinized and go viral in the search marketing community.
Noindex Directive
Everything that’s in a robots.txt is called a directive. A directive is a request to a web crawler; a crawler that honors robots.txt is expected to obey it, but compliance is ultimately voluntary.
There are standards for how to write a robots.txt directive, and anything that doesn’t conform to those standards is likely to be ignored. A non-standard directive in Mueller’s robots.txt caught the eye of someone who decided to ask John Mueller via LinkedIn whether Google supported it.
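For reference, a standards-conformant robots.txt is made up of simple field-and-value lines, as in this minimal sketch (the domain and paths are placeholders, not taken from Mueller’s file):

    User-agent: *
    Disallow: /private/
    Allow: /private/public-page.html
    Sitemap: https://example.com/sitemap.xml

Crawlers parse the fields they recognize and, per the standard, simply skip any lines they don’t.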
It’s a good question because it’s easy to assume that if a Googler is using it then maybe Google supports it.
The non-standard directive was noindex. Noindex is part of the meta robots standard but not the robots.txt standard. And Mueller didn’t have just one instance of the noindex directive; he had 5,506 of them.
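To illustrate the difference (the path below is hypothetical, not quoted from Mueller’s file), a noindex rule in robots.txt takes the same field-and-value form as a real directive, while the officially supported ways to request noindex are a meta robots tag or an X-Robots-Tag HTTP header:

    Noindex: /example-page/    (non-standard in robots.txt)

    <meta name="robots" content="noindex">    (supported: meta robots tag in a page’s HTML)
    X-Robots-Tag: noindex    (supported: HTTP response header)

Google announced in 2019 that Googlebot would stop honoring unsupported robots.txt rules such as noindex, so the first form has no effect on Google’s indexing.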
The SEO specialist who asked the question, Mahek Giri, wrote:
“In John Mueller’s robots.txt file,