WordPress 5.3 fixes a feature that helps keep sites from appearing in search results

A WordPress feature that is commonly used to keep search engines from including development and staging sites in their search results has been fixed in version 5.3.

Hugh Lashbrooke, Community Manager for WordPress, announced that a minor release of WordPress, version 5.2.3, is in the release candidate stage and will be available soon. Lashbrooke also discussed proposals for WordPress privacy improvements and a Core notification system.

The announcement included a brief, nondescript mention of WordPress 5.3 but didn’t provide any details. Francesca Marano, WordPress Community Manager for SiteGround, recently posted about the schedule and scope of WordPress 5.3, but it too was brief, stating only that its focus will be “polishing current interactions and [making] the UIs more user friendly.”

While researching WordPress 5.3, I noticed a tweet from Jono Alderson, Special Ops at Yoast, about an under-the-radar fix in 5.3 that neither Lashbrooke nor Marano had mentioned. It’s a fix that impacts search and will be a welcome update for webmasters and SEOs alike.

Preventing search engines from displaying a WordPress site

WordPress has an option to discourage search engines from indexing a site. The purpose of this feature is to keep development and staging versions of a site from getting into a search engine’s results. Until this fix, the feature only updated the site’s robots.txt file with Disallow: /.
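For illustration, with the option enabled the generated robots.txt contained little more than a blanket disallow rule (the exact output can vary slightly by version):

    User-agent: *
    Disallow: /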

The Disallow rule tells search engine crawlers like Googlebot not to crawl the site, but that rule alone is not enough to keep Google from including it in search results. As Joost de Valk, Founder of Yoast, points out:

If a link points to a page, domain or wherever, Google follows that link. If the robots.txt on that domain prevents indexing of that page by a search engine, it’ll still show the URL in the results if it can gather from other variables that it might be worth looking at.

In the past, some webmasters thought they could prevent their site from appearing in search results by also adding noindex to robots.txt. However, that method was officially debunked in July 2019 with an update to the Robots Exclusion Protocol and Google’s declaration that it was removing support for noindex in robots.txt.
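For reference, the unofficial directive that some webmasters added to robots.txt looked roughly like this; following Google’s July 2019 announcement, it is no longer honored:

    User-agent: *
    Noindex: /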

The correct way to keep search engines from including a page in their search results is to include <meta name="robots" content="noindex,nofollow"> in the page’s HTML. That’s what the fix in WordPress 5.3 does. When an administrator checks the “Discourage search engines from indexing this site” option in the Settings, WordPress will now add the noindex,nofollow metadata in addition to adding Disallow: / to the robots.txt file.
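In practice, that means a staging site with the option enabled now serves a page head along these lines (a simplified sketch, not WordPress’s exact output):

    <head>
      <meta name="robots" content="noindex,nofollow">
      ...
    </head>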

It should be noted that an even stronger signal for keeping a site out of search results is the HTTP header X-Robots-Tag: noindex, nofollow, but WordPress Core is unable to add it. Instead, it needs to be added manually at the web server level. Perhaps the most reliable way to keep a site from being included in SERPs is to password protect it. Hosting providers like WP Engine include password protection for all of their development and staging environments.
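As a sketch of that manual approach, on Apache the header can be set with the mod_headers module, and Nginx has an equivalent add_header directive; both examples below apply the header site-wide:

    # Apache (.htaccess or virtual host), requires mod_headers
    Header set X-Robots-Tag "noindex, nofollow"

    # Nginx (inside the server block)
    add_header X-Robots-Tag "noindex, nofollow";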

Jon is the founder and Managing Editor of Coywolf. He is a serial entrepreneur with over 25 years of experience in web development, SaaS, internet strategy, and digital marketing. Follow @henshaw
