Google: URLs Excluded By Robots.txt Aren’t Removed Until URLs Are Individually Reprocessed

Google’s John Mueller posted a clarification on how and when Google processes the exclusion rules you add to your robots.txt. The change does not take effect when Google first discovers the updated robots.txt; instead, the updated robots.txt has to be processed first, and then each affected URL has to be individually reprocessed by Google Search.

Both steps have to happen. First, Google needs to pick up the changes in your robots.txt, and then it needs to reprocess the affected URLs, on a URL-by-URL basis, before anything changes in Google Search. This can be quick or slow, depending on how soon Google reprocesses the specific URLs.
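To make the per-URL nature of this concrete, here is a minimal sketch using Python’s standard-library robots.txt parser (the domain, paths, and rules are hypothetical). A Disallow rule is something each URL gets checked against individually; there is no single site-wide switch:

# Minimal sketch: robots.txt rules are evaluated on a per-URL basis.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Each URL is checked individually against the rules; nothing
# happens "site-wide" in one step.
for url in ["https://example.com/", "https://example.com/private/page"]:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "disallowed"
    print(url, "->", verdict)

Google’s real pipeline is far more involved, of course; the sketch only illustrates why changes show up gradually, as each URL is recrawled and re-evaluated.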

John Mueller posted this on Mastodon saying, “the thing to keep in mind is that it’s not about when we see the change in the robots.txt, it’s about when we would have wanted to reprocess the URL. If I disallow: / today, and Google sees it tomorrow, it doesn’t change all of the URLs into robotted tomorrow, it only starts doing that on a per-URL basis then. It’s like when you 404 a whole site, the whole site doesn’t drop out, but instead it happens on a per-url basis.”

Here is a screenshot of that conversation, just so you have the context:

[Screenshot: Google Mastodon conversation]
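As a rough illustration of Mueller’s point, here is a toy timeline (all recrawl intervals are made up) in which Google sees a site-wide Disallow: / on day 0, but each URL only flips to disallowed when that URL is individually reprocessed:

# Toy model of per-URL propagation (all numbers hypothetical): a
# site-wide "Disallow: /" only takes effect for a given URL when
# that URL is next reprocessed, not when the robots.txt is fetched.
next_recrawl_day = {
    "/": 1,                   # homepage, recrawled often
    "/popular-page": 3,       # recrawled every few days
    "/archive/old-post": 60,  # rarely crawled; can take months
}

for url, day in sorted(next_recrawl_day.items(), key=lambda kv: kv[1]):
    print(f"day {day:>2}: {url} reprocessed -> now treated as disallowed")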

In fact, many ranking algorithms work this way, which is why a Google update can take about two weeks to fully roll out: that is roughly how long it takes Google to reprocess most of the important URLs on the web. Some URLs may take months to be reprocessed.

Forum discussion at Twitter.



Source link: Seroundtable.com
