
Google On When Robots.txt Is Unreachable, Other Pages' Reachability Matters

There is an interesting conversation on LinkedIn about what happens when a robots.txt file serves a 503 for two months while the rest of the site remains available. Gary Illyes from Google said that when other pages on the site are reachable and available, that makes a big difference, but when those other pages are not, then “you’re out of luck,” he wrote.

Note that he specified the home page and other “important” pages as needing to be available…

The thread was posted by Carlos Sánchez Donate on LinkedIn, where he asked, “what would happen if the robots.txt is 503 for 2 months and the rest of the site is available?”

Gary Illyes from Google responded:

I’m not sure if we need to add more nuance to it; see last sentence. One aspect that’s left out usually is whether our crawlers can reach consistently the homepage (or some other important pages? don’t remember) while the robotstxt is unreachable. If it is, then the site might be in an ok, albeit limbo state, but still served. If we get errors for the important page too, you’re out of luck. With robotstxt http errors you really just want to focus on fixing the reachability as soon as possible.

The question was whether Google's documentation needs more clarification on how robots.txt 5xx errors are handled.

This is a super interesting thread, so I recommend you scan through it if the topic interests you. Of course, most of you would say to just fix the 5xx errors and not worry about this. But many SEOs like to wonder about the what-if situations.
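
If you want to sanity-check this on your own site, here is a minimal sketch (a hypothetical example, not something from the thread) using Python's requests library to print the HTTP status of robots.txt alongside the home page and another important page. The site URL and paths are placeholders:

# Minimal sketch: report the HTTP status of robots.txt and a few
# "important" pages, since Gary's advice hinges on whether those
# pages stay reachable while robots.txt is erroring out.
# Assumes the third-party `requests` library; URLs are hypothetical.
import requests

SITE = "https://www.example.com"            # hypothetical site
PATHS = ["/robots.txt", "/", "/products/"]  # robots.txt plus important pages

for path in PATHS:
    url = SITE + path
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
        status = str(resp.status_code)
    except requests.RequestException as exc:
        status = f"unreachable ({exc.__class__.__name__})"
    print(f"{url}: {status}")

A 5xx or connection error on robots.txt is the thing Gary says to focus on fixing as soon as possible; healthy statuses on the home page and other important pages are what put you in the “ok, albeit limbo” state he describes.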

Here is a screenshot of this conversation, but again, there is a lot more there, so check it out:

[Screenshot: LinkedIn discussion]

Forum discussion at LinkedIn.



Source link: Seroundtable.com
