SEO News

Google Says SEOs Can Help Shape Policies & Decisions On AI Bots

Google’s John Mueller said that SEOs are in a great position in this new era of AI bots: they understand how crawlers work and how the controls work, so they can help their clients decide on their AI policies and decisions.

John Mueller wrote on LinkedIn: “This intersection of AI & SEO puts you all (technical SEOs!) into a great place to help shape policies / decisions for & with your clients.” He added, “You know how these control mechanisms work, you can choose to use them, and help folks to decide what makes sense for them.”

I like how he worded this next line, saying, “The robots.txt gives you a lot of control (over the reasonable crawlers / uses — for unreasonable ones, you might need to dig deeper into, or use a CDN/hoster that lets you block them by request type), you can even make your robots.txt disallow all by default if you want.” Note that he said “a lot of control,” not “full control.” Because, no, it does not give you full control. In some cases, if you want to block AI Overviews, you need to block all of Google Search. There are other AI bots and crawlers unrelated to Googlebot. And then there are the countless up-and-coming AI engines with bots all over the place.
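To make that concrete, here is a minimal robots.txt sketch of the “disallow all by default” approach Mueller describes, opening the door only to the crawlers you choose. The tokens shown (Googlebot, GPTBot, Google-Extended, CCBot) are real, documented user-agent tokens, but which ones to allow or block is each site’s policy call, not a recommendation:

# Default policy: disallow everything for any crawler not listed below
User-agent: *
Disallow: /

# Allow traditional search crawling
User-agent: Googlebot
Allow: /

# Explicitly block some AI crawlers (illustrative selection)
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

Keep in mind that a crawler follows only the most specific group that matches its token, so the Googlebot group above overrides the catch-all disallow for Googlebot.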

John wrote more; here is the full set of his comments:

This intersection of AI & SEO puts you all (technical SEOs!) into a great place to help shape policies / decisions for & with your clients. You know how these control mechanisms work, you can choose to use them, and help folks to decide what makes sense for them.

The robots.txt gives you a lot of control (over the reasonable crawlers / uses — for unreasonable ones, you might need to dig deeper into, or use a CDN/hoster that lets you block them by request type), you can even make your robots.txt disallow all by default if you want. Help the person running the site to make a decision (this is the hard part), and implement it properly (you definitely know how to do this).

These new systems access the web in a way similar to search engines, which you (I assume) know how it works & how to guide it. The controls are similar (sometimes the same) to those for search engines, which you know how they work & can use thoughtfully. What these new systems do with the data is sometimes very different, but it’s learnable (also, it changes quickly). You know what you want from search engines (“why do SEO? XYZ is why”), you can extrapolate from there if the new systems give you something comparable, and use that to decide how you interact with them. You’re (as a technical SEO in particular) in a good position to help make these decisions, and you’re definitely the right person to implement them. (And of course, your clean technical SEO foundation will make anything that these new systems do easier, crawling, internal links, clean URLs, clean HTML, etc — if you choose to go down that route.)

And finally, you hopefully have a lot of practice saying “it depends”, which is the basis of all technical decision making.
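As for the “unreasonable” crawlers that ignore robots.txt, Mueller’s suggestion is to block them at the CDN or hosting layer. As one illustration of what that can look like, here is a minimal nginx sketch (placed inside the http {} context) that refuses requests whose User-Agent matches assumed AI-crawler tokens; note that determined bots can spoof their User-Agent, which is why managed CDN bot controls go further than this:

# Map known AI-crawler User-Agent substrings to a blocked flag
# (token list is illustrative, not exhaustive)
map $http_user_agent $is_blocked_bot {
    default          0;
    ~*GPTBot         1;
    ~*CCBot          1;
    ~*PerplexityBot  1;
}

server {
    listen 80;
    server_name example.com;

    location / {
        if ($is_blocked_bot) {
            return 403;  # refuse the request outright
        }
        # ... normal site handling ...
    }
}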

Are clients coming to you and asking how to deal with this?

Forum discussion at LinkedIn.

Note: This was pre-written and scheduled to be posted today; I am currently offline for Rosh Hashanah.



Source link: Seroundtable.com
