Google Retires Unsupported Rules in Robots.txt


In one of its latest SEO updates, Google announced that it is open-sourcing its production robots.txt parser. The company also said that this work will help future Search open-sourcing projects.

Google considered whether to include handlers for other rules, such as crawl-delay, in the released code. Instead, the internet draft the company published provides an extensible base for rules that are not part of the standard: if a crawler wants to support its own line, like “unicorns: allowed”, it can.
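To illustrate the idea, here is a small sketch using Python's standard-library `urllib.robotparser` rather than Google's open-sourced C++ parser; the rules and URLs are made up. A standards-based parser simply ignores lines it does not recognize, such as “Unicorns: allowed”, while the standard rules still apply:

```python
# Sketch with Python's stdlib robots.txt parser (not Google's C++ one):
# unknown lines like "Unicorns: allowed" are silently ignored.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Unicorns: allowed
""".splitlines()

rp = RobotFileParser()
rp.modified()  # mark the rules as loaded so can_fetch() evaluates them
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/private/page"))  # blocked
print(rp.can_fetch("*", "https://example.com/public/page"))   # allowed
```

The non-standard “Unicorns” line neither breaks parsing nor affects the outcome; only the recognized Disallow rule is enforced.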

To demonstrate this in practice, Google included support for a very common non-standard line, sitemap, in the open-source robots.txt parser.
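Python's standard-library parser accepts the same non-standard sitemap line, which gives a rough feel for how such a rule rides alongside the standard ones (the URLs here are placeholders; `site_maps()` requires Python 3.8+):

```python
# Sketch: the non-standard "Sitemap:" line is parsed alongside
# the standard rules and exposed via site_maps().
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```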

While open-sourcing the parser library, Google analyzed how robots.txt rules are actually used. In particular, the company focused on rules not supported by the internet draft, such as crawl-delay, nofollow, and noindex.

Because these rules were never documented by Google, their usage in relation to Googlebot is naturally very low. Digging further, Google found that their usage was contradicted by other rules in all but 0.001% of all robots.txt files on the internet.

Google is retiring all code that handles unsupported and unpublished rules, such as noindex, on September 1, 2019. If you have been using the noindex directive in your robots.txt file to control crawling, you can choose from the options below:

  • noindex in robots meta tags: Supported both in HTTP response headers and in HTML, the noindex directive is the most effective way to remove URLs from the index when crawling is allowed.
  • 404 and 410 HTTP status codes: Both status codes mean that the page does not exist, so Google drops such URLs from its index once they are crawled and processed.
  • Disallow in robots.txt: Search engines can only index pages they know about, so blocking a page from being crawled usually means its content won't be indexed.
  • Search Console Remove URL tool: With this tool, you can quickly and easily remove a URL temporarily from Google's search results.
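The first two alternatives above can be sketched with Python's standard library; the paths and server are hypothetical. A retired URL answers 410 Gone, and a live page carries the noindex directive in the X-Robots-Tag response header (the HTTP equivalent of `<meta name="robots" content="noindex">` in the page's HTML):

```python
# Sketch: serve 410 Gone for a removed URL, and X-Robots-Tag: noindex
# for a live page that should stay out of the index.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

GONE_PATHS = {"/old-page"}  # hypothetical permanently removed URL

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in GONE_PATHS:
            self.send_response(410)  # 410 Gone: page no longer exists
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("X-Robots-Tag", "noindex")  # keep out of index
            self.end_headers()

    def log_message(self, *args):  # keep the example's output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

try:
    urllib.request.urlopen(f"{base}/old-page")
    status = 200
except urllib.error.HTTPError as err:
    status = err.code

with urllib.request.urlopen(f"{base}/live-page") as resp:
    robots_header = resp.headers["X-Robots-Tag"]

server.shutdown()
print(status, robots_header)  # 410 noindex
```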

This latest Google SEO update on robots.txt files should help you plan your optimization activities.