
Diggity Marketing SEO News Roundup – August 2020


If you prefer to live life on the cutting edge, this roundup is for you. Several of SEO’s most innovative minds have published pieces this month, and we’ve got them covered here so you won’t miss out.

It starts with some chunky guides. You’ll learn how to get the perfect length for a blog post in any niche, some advice on doing a 100K launch with no ads, plus a neat method you can use to scale keyword research using Python (don’t worry, you can copy the code out of the article).

After that, we’ve got two case studies for you. Learn what the data says about how to discover any website’s traffic, and whether Google is effective at crawling tabbed content.

Google dominates our news section this month. You’ll catch up with announcements about a new paid GMB profile, the death of the structured data testing tool, what Google revealed during their recent congressional hearing, and what was up with all the recent rank fluctuations.

How Long Should A Blog Post Be to Win the SEO Game

https://surferseo.com/blog/how-long-should-a-blog-post-be/

Marta Szyndlar of Surfer brings us this new assessment of which blog lengths work best for SEO. There have been other studies on this question, and the article links to several of them (HubSpot, Neil Patel, and Backlinko) in its first section. However, Surfer takes a different approach to the question altogether.

Marta argues that a single “ideal length” is worthless when applied to most types of content. This is because there’s evidence that Google prefers different lengths depending on the topic. It’s not just the algorithm, either. Sometimes searchers want fast answers, and they’ll bounce from a long-winded article.

[Image: Surfer SEO content length comparison. Source: surferseo.com]

The article quotes a personal experiment by Matthew Woodward, where he culled more than 20,000 words from an article after realizing his own content was many times the length of competitors. That action alone took his result from the 7th page to the 1st.

The question becomes: How do you find the sweet spot? This guide recommends a simple competitor analysis process where you average out the right length from the top 10-50 results. You’re going to want to use tools to do it, but the guide covers the manual method first.

That method involves clicking on each result, copying all of its content (including any comments, because Google considers those part of the count too), recording the word counts in a sheet, and then averaging them.

 

Watch out. It’s possible to go wrong even with these simple instructions if you pick the wrong results. The guide reminds you to match intent when choosing competitors. Don’t average results with a different format or purpose, or results that are obvious outliers.
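If you’d rather script that manual averaging, here’s a minimal sketch of the idea in Python. It assumes the requests and beautifulsoup4 packages and a hand-picked list of competitor URLs (the URLs below are placeholders): it fetches each page, counts the visible words, drops obvious outliers, and averages the rest.

```python
# Minimal sketch of the "average your competitors" method described above.
# Assumes: pip install requests beautifulsoup4
import statistics

import requests
from bs4 import BeautifulSoup

# Hand-picked top results that match your target intent (placeholder URLs).
competitor_urls = [
    "https://example.com/competitor-post-1",
    "https://example.com/competitor-post-2",
    "https://example.com/competitor-post-3",
]

def word_count(url: str) -> int:
    """Fetch a page and count the words in its reader-visible text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Drop scripts and styles so only visible text is counted.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

counts = [word_count(url) for url in competitor_urls]

# Crude outlier filter: ignore anything more than twice the median length.
median = statistics.median(counts)
filtered = [c for c in counts if c <= 2 * median]

print(f"Word counts: {counts}")
print(f"Target length: ~{int(statistics.mean(filtered))} words")
```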

If you are a Surfer user, the following sections detail how you can use the various tools in the suite to do this work. You can sweep up all this information from the SERPs page, or just enter each URL into another tool to see the ideal length.

Our next piece moves out of content with some advice on how to promote just about any product without using ads.

How to do a $100k Launch with No Ads in A Single Month

https://twitter.com/Charles_SEO/status/1269368908588281856

This great tweet thread by Charles Floate covers the methods he used to generate more than $100k over the 31 days that followed the launch of a new product. He did it without paying for a single ad.

The product, in this case, was an eBook. Not everyone has the kind of reputation Charles does. His credibility certainly played a role in driving sales, but the process he used has implications for many types of products—notably how he generated hype.

[Image: Charles Floate’s $100k launch tweet]

After optimizing his marketplace and landing page, he ran a non-ad campaign that included the following:

  • Generating social proof reviews by distributing free copies to influencers and then leveraging the reviews/grateful comments as they came in
  • Offering the book at a discounted price for one month
  • Releasing older paid content for free to promote the new developments coming with the new eBook
  • Filling outreach emails with free information to reinforce the value of the paid product
  • Recruiting affiliates experienced with eBook sales

All of these tactics can be used alone or combined with the rest. None of them even has a price tag if you handle them yourself. The closest thing to an ad buy here is working with affiliates, and the affiliates only get paid when you do.

The next piece coming up will also appeal to the DIY’ers out there. It’s going to show you how to combine Python and a free Google account to scale keyword research.

Using Python & Google Sheets to Scale Keyword Research for Local SEO

https://ardentgrowth.com/using-python-and-google-sheets-for-local-seo-keyword-research/

Skyler Reeves of Ardent Growth brings us this new process for saving a lot of time on keyword research. It should be said that this guide may come across as a little complex for newer SEOs, but the process it describes could be valuable for a professional agency.

The guide covers the theory and explanation of the process, along with how it can be used to collect better data, build more accurate predictions, and even automate the tricky parts.

[Image: Estimated catch-up rate chart. Source: ardentgrowth.com]

It starts with a Google Sheet filled with standard crawl data (quickly pulled from an audit tool like Screaming Frog). In addition to information like URLs, categories, links, and the number of live sessions, you’ll be looking at the existing keywords and how they compare to the best ones.

The guide doesn’t cover how to find the best keywords—assuming you understand that already. It skips right to assessing your competitors by recording the following data from the top 3 competing pages for each of your target URLs:

  • The content type (blog post, service page, landing page)
  • The number of do-follow referring domains
  • An estimate of the page’s link velocity over the past 12 months
  • The PA and DA of the URL

With this data in hand, you can apply a series of provided formulas to rapidly determine what pages need your attention, and how much potential they have for improvement.
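The exact formulas ship with the guide’s template, but the general shape of the scoring (comparing each of your URLs against the averages of its top three competitors) can be sketched in a few lines of Python. The column names and weights below are hypothetical illustrations, not Skyler’s actual formulas.

```python
# Hypothetical sketch of competitor-gap scoring; the real formulas ship with
# the guide's Google Sheet template. Assumes: pip install pandas
import pandas as pd

# One row per target URL, with averages taken from its top 3 competing pages.
data = pd.DataFrame({
    "url": ["/plumbing-repair", "/water-heaters", "/blog/diy-fixes"],
    "our_refdomains": [4, 12, 1],
    "competitor_avg_refdomains": [18, 10, 25],
    "our_da": [32, 32, 32],
    "competitor_avg_da": [45, 28, 51],
})

# Gap between our page and the competitor average on each metric.
data["link_gap"] = data["competitor_avg_refdomains"] - data["our_refdomains"]
data["authority_gap"] = data["competitor_avg_da"] - data["our_da"]

# Simple weighted "effort to catch up" score: smaller means an easier win.
data["catch_up_score"] = 0.7 * data["link_gap"] + 0.3 * data["authority_gap"]

print(data.sort_values("catch_up_score")[["url", "catch_up_score"]])
```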

[Image: Python and Google Sheets keyword research workflow]

Once the sheet is set up (which will take some time and elbow grease), it can be repeated in about 5 minutes. A whole website can be analyzed—and the recommendations justified with data—in an impressively short time.

Sometimes the best research is the data you lift from people already doing the right thing. Competitor analysis is the subject of the first case study we’ll be covering. It’s going to show you how to find out how much traffic any website gets.

Find Out How Much Traffic ANY Website Gets: 3-Step Analysis (With TEMPLATE)

https://www.robbierichards.com/seo/how-much-traffic-website-gets/

This first case study isn’t a study in itself, but a process for performing mini studies when you need to know a competitor or research target’s traffic.

Robbie Richards takes us through a process that can help you determine:

  • Which of your competitor’s channels drive the most traffic?
  • Which subdomains get the most visits?
  • Which pages/posts pull in the most organic traffic?

This process uses the SEMRush tool, but understanding how it works may give you the insight you need to do this same research using the tools you already have at hand. The data is recorded in a spreadsheet template that is provided for download.


First, Robbie argues, you need to analyze website traffic based on how you monetize your own site. For example, if you rely on AdSense/ad revenue, you need to drive ad impressions, so you should focus on the top traffic-driving results.

For an eCommerce store, you need to look at product category subfolders to see where they’re getting traffic with commercial intent. For an affiliate site, you’ll need to dissect traffic by keyword modifiers like “best,” “alternative,” “top.”
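As a rough illustration of that dissection step, here’s a short Python sketch that splits an exported keyword list by affiliate-style modifiers and sums traffic by subfolder. The file name and column names are assumptions for a generic keyword export, not a specific SEMRush format.

```python
# Rough sketch: slice a competitor keyword export by intent modifier and subfolder.
# The CSV name and columns (keyword, url, traffic) are assumptions, not a
# specific tool's export format. Assumes: pip install pandas
from urllib.parse import urlparse

import pandas as pd

keywords = pd.read_csv("competitor_keywords.csv")  # columns: keyword, url, traffic

# Affiliate-style modifiers mentioned in the guide, plus a couple of common ones.
pattern = r"\b(?:best|alternative|top|review|vs)\b"
keywords["is_commercial"] = keywords["keyword"].str.lower().str.contains(pattern, regex=True)

# Traffic with commercial intent vs. everything else.
print(keywords.groupby("is_commercial")["traffic"].sum())

# Traffic grouped by top-level subfolder (useful for eCommerce category analysis).
keywords["subfolder"] = keywords["url"].map(
    lambda u: "/" + urlparse(u).path.strip("/").split("/")[0]
)
print(keywords.groupby("subfolder")["traffic"].sum().sort_values(ascending=False))
```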

Once you know what information you want to track, the guide details a 3-step process you can use to:

  • Check global traffic data to understand visits and engagement
  • Find out how much organic traffic a website gets (at the subfolder, page & keyword levels)
  • Find out how much paid traffic there is

Each step is covered using a different SEMRush tool, but the article closes with some alternatives. Tools like SimilarWeb, Alexa, and Ahrefs have many of the same functions. You’ll even get some screenshots showing you where to find that data in each tool.

Our next case study is also concerned with traffic, but more specifically, with the effect that tabbed content has on it.

SEO Split-Testing Lessons from SearchPilot: Bringing Content Out of Tabs

https://www.searchpilot.com/resources/case-studies/seo-split-test-lessons-bringing-content-out-of-tabs/

Emily Potter of SearchPilot brings us this quick case study on the issue of tabbed content.

Tabbed content is content that is only partially revealed until a visitor clicks on something like “read more.” Accordions and drop-downs are other examples of the style. In the past, SEOs have had trouble pinning down Google on the subject.

Many SEOs prefer this type of content because it makes for much cleaner pages. It’s almost necessary for the mobile versions of many sites.

However, there are some lingering doubts about whether Google is effective at crawling content that’s been tabbed. Some employees, such as Gary Illyes, have (vaguely) suggested that it won’t interfere with crawling. What does the data say?

[Image: SearchPilot split-test tweet]

In a series of tests, tabs were removed from product descriptions. The effect was tested on both accordion content and a set of four tabs.

The results were surprisingly conclusive. In the cases where the tabs were removed, there was a 12% uplift in live sessions. That change happened within less than a month.

The researchers noted that the effect was even more pronounced for mobile versions. It’s important to note that this was only one experiment with a specific type of website, but it is also easy to test for yourself on your own sites.
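If you want a quick sanity check on your own pages before committing to a full split test, a minimal sketch like the one below (Python with requests; the URL and phrase are placeholders) tells you whether text hidden inside a tab is at least present in the server-rendered HTML that a crawler fetches first.

```python
# Quick check: is text hidden behind a tab or accordion present in the raw HTML?
# This only confirms the text is server-rendered; it says nothing about how
# Google weighs it, which is what a proper split test measures.
# Assumes: pip install requests
import requests

url = "https://example.com/product-page"   # placeholder: a page with tabbed content
hidden_phrase = "full ingredient list"     # placeholder: text you know sits inside a tab

html = requests.get(url, timeout=10).text

if hidden_phrase.lower() in html.lower():
    print("Phrase found in the initial HTML; crawlers can see it without a click.")
else:
    print("Phrase NOT in the initial HTML; it may only load via JavaScript on click.")
```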

Now, we’re ready to move on to the latest news. This month, it’s all about Google, starting with its announcement of a new paid tier of GMB.

Google offers ‘upgraded’ GMB profile with Google Guaranteed badge for $50 per month

https://searchengineland.com/google-offering-upgraded-gmb-profile-with-google-guaranteed-badge-for-50-per-month-338064

Google recently announced the launch of a subscription service for GMB. It adds a Google Guaranteed badge of authenticity (certifying that Google considers the business to be real and in good standing) for the cost of $50 a month.

This badge has existed for a couple of years, but it was limited to Local Services Ads. The new upgrade allows businesses to list their services as guaranteed even if they’re not running ads. Here’s how the badge currently appears in the ad program:

[Image: Google Guaranteed badge on a Local Services ad]

The form may change before full implementation. Google has already taken criticism from advertisers who feel that the company is positioning itself as a competitor to its own ad-buying clients.

As new programs are introduced, a lot of the old ones are going away. Google recently announced it was closing the structured data testing tool permanently.

Google Shutters Structured Data Testing Tool

https://www.searchenginejournal.com/google-rich-results-structured-data-test-tool/373903/

The structured data testing tool is officially being retired in the next couple of months, Google has announced. If you’ve relied on this tool, don’t worry. All of the functions and many more new ones will be part of the Rich Results tool.

The Rich Results tool is not new, but it has picked up many features since it was first introduced. At that time, it only recognized recipes, jobs, movies, and courses. Now, it can adequately assess all types of structured data.

[Image: The Rich Results Test tool]

It includes features that help you discover all the search feature enhancements a markup is eligible for. It will also provide you with both the mobile and desktop renders of a given result, and provide you with the option to test either a specific snippet or an entire URL.

Google was happy to share the last two bits of news with us, but the next story had to be coaxed out by a congressional committee. Let’s look at what we’ve learned from the released document so far.

US Congress Investigation Suggests Google Uses Clicks & User Data In Search

https://twitter.com/randfish/status/1288980563219513345

Rand Fishkin has some thoughts on the latest findings from the recent congressional hearing. In this thread, he posted many of the docs and provided some commentary. The documents provide some rare official confirmation (if any SEOs were still waiting for that) of the fact that Google:

  • Maintains its own secret, DA-like authority score
  • Uses user signals such as clicks to help determine which results deserve to rank
  • Tracks the percentage of clicks that go to its own properties vs. other websites

Internal documents also seemed to reveal that Google expressed early concern about the possibility of competition in verticals like travel and local search. They openly discussed the significant competitive advantage that their search data gave them.

[Image: Rand Fishkin’s tweet on the congressional documents]

While it is news that this information is now public and confirmed, for most SEOs the substance isn’t news at all. Google has vaguely denied many similar accusations, but the trend lines have always pointed toward further consolidation of searches onto its own services.

We’ve only seen the beginning of this process, as many other tech companies face accusations of monopoly. These reports may form the origins of a reform movement, and SEOs should watch it carefully to see where it goes.

Speaking of seeing where things go, many of us experienced a massive ranking disruption on August 10th. A few days out, we finally have an idea of what happened. 

That Massive Google Update Was Glitch & Bug – Search Results Back To Normal

https://www.seroundtable.com/google-update-was-glitch-bug-29923.html

If you saw your ranking data dip and dive all over the place earlier this month, we now know why. It wasn’t an update (as many expected), but a massive glitch. 


Google’s John Mueller reached out after several SEOs reported bizarre fluctuations. According to John, the issue had been noticed and fixed, though no additional details were offered.

We’re still waiting to learn what happened, but no one seems to be having problems at this point. Across the board, most signals seem to have returned to normal.

 

Got Questions or Comments?

Join the discussion here on Facebook.


Matt is the founder of Diggity Marketing, LeadSpring, The Search Initiative, The Affiliate Lab, and the Chiang Mai SEO Conference. He actually does SEO too.

 




