ChatGPT is one of the hottest topics among digital marketers, content creators and search engine optimization (SEO) experts, and opinions about it are sharply divided.
ChatGPT is a large language model (LLM) trained on information collected from across the web, which forms the basis of the answers it provides. Some have ethical concerns because ChatGPT does not attribute or reference the sources behind its summaries, which blurs the lines around copyright.
Content creators and digital publishers can opt out of having their content crawled or indexed using the Robots Exclusion Protocol, via a robots.txt file. Common Crawl data refers to online content that the Common Crawl bot, CCBot, has already crawled.
Although digital content can't be removed from existing Common Crawl datasets, it is possible to block content from being crawled going forward using the Robots Exclusion Protocol. Without such a block, content can be collected and downloaded without the publisher's consent.
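For publishers who want to opt out, a minimal sketch of the robots.txt directive looks like this (CCBot is Common Crawl's documented user-agent; the file must sit at the root of your domain):

```txt
# Block Common Crawl's bot from crawling the entire site
User-agent: CCBot
Disallow: /
```

Keep in mind this only prevents future crawls; anything already in Common Crawl's datasets stays there.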
The reality is that artificial intelligence (AI) is being implemented in various ways, and OpenAI continues to improve the model behind ChatGPT. In fact, Microsoft is planning to implement a newer, faster version of the technology known as GPT-4.
OpenAI has also released a paid version of ChatGPT, which costs $20 per month. The paid version offers general access even at peak times, faster response times and prioritized access to new improvements and features. This option is currently only available in the U.S.
More SEO News You Can Use
Bing Crawl System Gets a Revamp To Improve Efficiency: Bing has revamped its crawl system to make it more efficient by relying on the “lastmod” tag in XML sitemaps. This tag indicates when a page’s content was last modified, which helps Bing decide which pages need to be recrawled and reindexed, and how often. It’s important to note that if “lastmod” values are set incorrectly, crawling and indexing efficiency can suffer. It’s recommended that publishers update sitemaps regularly so that search engines can easily access and index webpages. Improvements are already being implemented and are expected to roll out in June 2023.
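To make the “lastmod” advice concrete, here is a minimal sketch of a sitemap entry with a properly set value (the domain, path and timestamp are placeholders; the W3C datetime format shown is what the sitemaps protocol expects):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/sample-post</loc>
    <!-- lastmod should reflect the last significant change to the page,
         not the time the sitemap itself was generated -->
    <lastmod>2023-02-06T14:30:00+00:00</lastmod>
  </url>
</urlset>
```

Setting “lastmod” to the sitemap-generation time on every page is exactly the kind of incorrect value that undermines crawl efficiency.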
Yandex Data Leak: What You Need To Know: Dan Taylor, an expert in Russian search engine optimization, spoke out on the recent Yandex data leak that took place on January 27, 2023. This was, however, not the first such incident: in 2015, an ex-employee of Yandex, one of Russia’s most significant search engines, attempted to sell Yandex code for approximately $30,000 on the black market. The recent leak released information about 17,800 ranking factors pertaining to various Yandex products such as Search, Mail, Maps, Metrika, Cloud and Disk. Yandex responded, saying that these factors are outdated and do not relate to the current version of the search engine. How many of the leaked ranking factors are still relevant is not yet determined, but for more detailed information, visit Taylor’s full article here on Search Engine Journal. Alternatively, Search Engine Land also delves into the inner workings of the code.
Yelp Removes Paid Review Groups and Lead Generators: Yelp has announced that it will be cracking down on social media spam lead generators, fake reviews and incentivized business review rings. In its 2022 Trust and Safety Report, Yelp explains how it is proactively searching for these groups and will take action by identifying the IP addresses of the groups and the connected users who help set up paid reviews. Yelp is working with social media platforms — Facebook, Instagram, LinkedIn and Twitter — to help break these groups up. Since the report’s release, Yelp has already closed more than 77,000 accounts and rejected 32,800 business pages for behavior that goes against Yelp’s policies. Content removed in 2022 included photos, reviews and review up-voting, to name a few. You can view Yelp’s Trust and Safety Report 2022 here or visit Search Engine Journal for more information.
Google Gives Insider SEO Tips for News Articles: Google Search Advocate John Mueller recently shared search engine optimization (SEO) tips in the most recent Google Office Hours session. Mueller, joined by Analyst Gary Illyes, answered questions about using lastmod properly as well as the pros of using separate sitemaps. Mueller suggests that news publishers should ideally ensure that the date of the most significant change in content is reflected in the lastmod field, though they can also use the date of the last comment if that is more relevant to the page. Illyes then went on to clarify that it’s much simpler to maintain two separate sitemaps, one for news and one for general content, than to combine them into one. News sites could opt for one sitemap with duplicate URLs; however, it’s not recommended. Removing URLs that are older than 30 days from your news sitemap is also recommended. These tips will help to optimize news websites and boost visibility and engagement. You can catch the full Google Search Central podcast here.
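To illustrate the two-sitemap setup, a dedicated news sitemap (kept separate from the general sitemap) might look like the sketch below, using Google’s news sitemap extension; the publication name, URL and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://example.com/news/breaking-story</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2023-02-10T08:00:00+00:00</news:publication_date>
      <news:title>Breaking Story Headline</news:title>
    </news:news>
  </url>
</urlset>
```

Per the 30-day guideline, URLs older than 30 days would be dropped from this file while still remaining in the general sitemap.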
New LinkedIn SEO Feature – Add an SEO Title and Description: LinkedIn now allows content creators and users to add custom search engine optimization (SEO) titles and meta descriptions to LinkedIn articles. This can be added to published articles and those yet to be published. Users have 60 characters to play with for SEO titles and 160 characters for descriptions. The description will replace the formerly used first lines of the article to summarize the content. This is a great way to optimize discoverability and search visibility without implementing complex SEO. Users will find this feature under settings in the Publishing menu drop-down. And in more exciting news, LinkedIn also announced that users will soon be able to schedule posts.
Editor’s Note: “SEO News You Can Use” is a weekly blog post posted every Monday morning only on SEOblog.com, rounding up all the top SEO news from around the world. Our goal is to make SEOblog.com a one-stop-shop for everyone looking for SEO news, education and for hiring an SEO expert with our comprehensive SEO agency directory.