
Rank Ranger SEO Data – 4 Key Insights


What are the key insights that you should be extracting from your Rank Ranger SEO data?

That’s what we’re covering today with an SEO who is also a backgammon enthusiast. He’s the author of “Data-Driven SEO with Python” and the founder of Artios. A warm welcome to the In Search SEO podcast, Andreas Voniatis.

In this episode, Andreas shares four key insights to extract from Rank Ranger SEO data:

  • Keyword clustering
  • Quick wins
  • Competitor analysis
  • Schema

Four Key Insights to Extract From Rank Ranger SEO Data

Andreas: Thank you for having me, David.

David: Thanks so much for joining us. You can find Andreas over at artios.io. So today you’re sharing four key insights to extract from Rank Ranger SEO data. Starting off with number one, keyword clustering.     

1. Keyword Clustering

A: Yes, the beauty of keyword clustering is that if you’re trying to group keywords by search intent, you can use the Rank Ranger API to get the search results for all of your keywords at scale. That’s really great, and it saves a lot of manual work downloading all those CSV sheets. Then what you do is use a bit of Python to compare the search results for similarity. Now, if two keywords’ search results are similar, then you know they have the same search intent. If they’re dissimilar, then you know those two keywords have different search intents and therefore should be mapped onto different web pages. And the code is all in my book, “Data-Driven SEO with Python.”
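To make the idea concrete, here is a minimal sketch of that SERP-overlap comparison, assuming you have already pulled the top ranking URLs for each keyword (via the Rank Ranger API or a CSV export). The keywords, URLs, and the similarity threshold below are illustrative assumptions, not a reproduction of Andreas’s exact method.

```python
from itertools import combinations

# Assume top ranking URLs per keyword, already fetched (e.g. via the
# Rank Ranger API or a CSV export). The data here is purely illustrative.
serps = {
    "trench coats":        ["site-a.com/w", "site-b.com/x", "site-c.com/y"],
    "ladies trench coats": ["site-a.com/w", "site-b.com/x", "site-d.com/z"],
    "mens trench coats":   ["site-e.com/m", "site-f.com/n", "site-g.com/o"],
}

def serp_similarity(urls_a, urls_b):
    """Jaccard overlap of two result sets: 1.0 = identical, 0.0 = disjoint."""
    a, b = set(urls_a), set(urls_b)
    return len(a & b) / len(a | b)

# Keywords whose SERPs overlap above a threshold are assumed to share intent
# and can map to the same page; the 0.3 cut-off is an assumption to tune.
THRESHOLD = 0.3
for kw1, kw2 in combinations(serps, 2):
    score = serp_similarity(serps[kw1], serps[kw2])
    verdict = "same intent" if score >= THRESHOLD else "different intent"
    print(f"{kw1!r} vs {kw2!r}: {score:.2f} -> {verdict}")
```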

D: Talking about the phrase a bit of Python, for SEOs that are uncomfortable with what you’ve just shared there, how would you summarize your use of Python? And also, can you talk a little bit about how you go about determining the similarities between different keyword phrases?

A: It’s probably best to answer your second question first because we just talked about keyword clustering. Regarding the similarity metric, I was inspired by how genetic scientists use string methods to compare DNA strings for gene sequencing. So I thought, what if we were to encode search results for keywords like DNA? And then we can compare the DNA to see if they’re similar or not. And that’s how I went about constructing the code for comparing the search results for keywords. Now, in terms of the similarity, if someone is searching for trench coats, and the search results are the same for ladies’ trench coats, then there’s a good chance that the page for ladies’ trench coats should also be optimized for trench coats per se. Whereas you might find that the search intent for trench coats and men’s trench coats are dissimilar because most of the search results for trench coats were for ladies’ trench coats as opposed to men’s trench coats. Hopefully, that answers your question.
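The DNA analogy can also be sketched with Python’s standard library: encode each SERP as an ordered sequence of ranking domains and compare the sequences, which rewards results appearing in the same order rather than treating the SERP as an unordered set. This is an illustrative stand-in for the string-comparison approach described in the book; the domains below are made up.

```python
from difflib import SequenceMatcher

# Encode each SERP as an ordered "sequence" of ranking domains, loosely
# analogous to a DNA string. The domains below are illustrative.
serp_a = ["burberry.com", "asos.com", "johnlewis.com", "next.co.uk"]
serp_b = ["burberry.com", "asos.com", "zalando.co.uk", "next.co.uk"]

# SequenceMatcher rewards results appearing in the same order, which a plain
# set overlap ignores - closer in spirit to sequence alignment.
ratio = SequenceMatcher(None, serp_a, serp_b).ratio()
print(f"ordered SERP similarity: {ratio:.2f}")
```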

D: It does. But as always, it actually encourages me to dig deeper or follow the thread from a slightly different angle. I actually used to work in SEO for luxury fashion as well, so I kind of experienced the fashion e-commerce SEO that you alluded to there in relation to your trench coats example. I think e-commerce SEOs sometimes struggle with which page, the male or the female version, to optimize for the core item of clothing, or whether to create some kind of unisex page and then have that as a funnel towards the male or the female version. You mentioned that trench coats by itself tends to be more commonly searched for by female searchers. Is that something that you advise SEOs to look at and try to determine?

A: Yeah, so I would look at the data. The example I was giving you was just one I plucked out from 10 years ago, so the answer would be to look at the data; maybe things have changed. Let’s take another example. If you had looked at the search results for ‘braces’ ten to twelve years ago, it would have been a mix of dental braces and trouser braces. I’m pretty sure that now it’s almost exclusively dental braces. User behavior, or user search patterns, change over time. That’s why we’ve got to be data-driven these days. Look at the data, don’t be beholden to SEO anecdotes. I’m not saying don’t follow best practices, but the data should be your first reference before anything else.

D: And your step number two is to look for quick wins. This obviously relates to the keyword information that you’ve uncovered in step one.   

2. Look for Quick Wins

A: One of the things I do is use the Rank Ranger API to grab the ranking data for the last week. If you take the average over that week, you know whether the results are stable. If a page is hanging around page two, it’s really simple: that effectively qualifies as a quick win. Just by using the Rank Ranger API and systemising, or automating, that reporting using their filters, you can go over and above the SEO strategy that you’ve set for the next six months. There are things you could do to improve and get results sooner rather than later.
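A minimal pandas sketch of that quick-win filter, assuming you have exported the last seven days of rank tracking (keyword, date, rank, landing page) to a CSV, whether via the Rank Ranger API or a download. The file name, column names, stability cut-off, and page-two window are assumptions.

```python
import pandas as pd

# Assume a week's worth of rank-tracking rows exported to a CSV,
# e.g. via the Rank Ranger API. Column names here are assumptions.
df = pd.read_csv("rankings_last_7_days.csv")  # columns: keyword, date, rank, landing_page

# Average the week so one-day wobbles don't trigger false positives,
# and check stability via the standard deviation.
weekly = (df.groupby("keyword")
            .agg(avg_rank=("rank", "mean"),
                 rank_std=("rank", "std"),
                 landing_page=("landing_page", "last")))

# "Hanging around page two" = stable average position 11-20.
quick_wins = weekly[weekly["avg_rank"].between(11, 20) & (weekly["rank_std"] < 3)]
print(quick_wins.sort_values("avg_rank"))
```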

D: Do you also try to determine the authority and relevance of the pages that are ranking above this page that you’ve identified? Because it’s all well and good identifying a page that’s perhaps on page two but if it’s exceptionally difficult to move past the competitors, then it might not be worthwhile working on that.

A: Yes, 100%. The chances are that if you’re on page two, you probably have sufficient authority; it’s just the user experience side that’s probably lacking. But yes, I would qualify the keywords. If you were to take two quick-win keywords, you might find that the median domain authority for one keyword’s SERP is much lower than the other’s. Just like you, I know which one I’d prioritize.
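Comparing two quick-win keywords by the median authority of their SERPs can be sketched in a few lines, assuming you already have an authority score for each ranking domain from whichever metric you use. The keywords and scores below are made up for illustration.

```python
from statistics import median

# Illustrative authority scores for the domains ranking on two quick-win
# keywords, from whichever authority metric you use. Values are made up.
serp_authority = {
    "ladies trench coats": [72, 68, 65, 80, 59, 61, 70, 66, 74, 63],
    "beige trench coats":  [41, 38, 55, 47, 33, 50, 44, 36, 42, 39],
}

# The keyword whose SERP has the lower median authority is the easier target.
for keyword, scores in sorted(serp_authority.items(),
                              key=lambda kv: median(kv[1])):
    print(f"{keyword}: median DA {median(scores)}")
```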

D: And the third insight that you extract from Rank Ranger SEO data is competitor analysis.   

3. Competitor Analysis

A: Yes, and what I love about data-driven SEO is that we’re in an industry that is data rich. The outcomes of Google’s algorithm are in the public domain via the search engine results, and that’s all extractable using the Rank Ranger API. The inputs, which would explain the variation in rankings, are also in the public domain. You can get data on how your competitors are structuring their content, what that content is, and features of the content, such as word count. All of that is also in the public domain. So the missing link is your data science, which is to do the mathematical modeling in Python, correlate what’s working and what’s not working, and establish to what statistical significance. And machine learning models help do this at scale.
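A minimal sketch of that modeling step: correlate one on-page feature (here, word count) with rank position and report its statistical significance. The data is illustrative and the method is a simple rank correlation; the book works with fuller machine-learning models.

```python
import pandas as pd
from scipy.stats import spearmanr

# Illustrative crawl of one SERP: rank position plus one content feature per
# ranking URL. In practice you'd build this from SERP + crawl data at scale.
data = pd.DataFrame({
    "position":   [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    "word_count": [480, 520, 610, 700, 650, 890, 940, 1100, 1020, 1250],
})

# Spearman correlation handles rank data without assuming linearity.
rho, p_value = spearmanr(data["position"], data["word_count"])
print(f"rho={rho:.2f}, p={p_value:.3f}")
# A significant positive rho here would suggest that, for this SERP,
# shorter pages tend to sit higher - "less is more", as in e-commerce.
```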

D: In general, what trends are you currently seeing that constitute a successful competitor page? Are you seeing increased word length? Are you seeing certain elements within the page that are more likely to appear on a page that ranks highly?

A: Yes, it depends on the sector. But if we just take the e-commerce SEO sector, one of the things that I’ve noticed seems to be a predictor of rank, and can explain the difference in position, is the number of products you offer on a page. It seems that the more you offer, the higher you rank. With word length, less is more in the e-commerce sector, whereas for something a bit more B2B or service-led it tends to be the opposite: the higher your word count, the higher you will rank. And if we take readability, again, I’m not anti-best-practice but I’m anti-anecdote. There’s a supposed best practice that the more readable your content, the higher you rank. Well, if you look at the data in technical industries such as accounting, law, and blockchain, it’s actually the opposite. The less readable your copy is, the more it’s deemed that you know what you’re talking about, and that seems to satisfy Google’s users who are searching on those queries in those search spaces. Honestly, don’t take best practice for granted, be data-driven about it. But those are the trends that I’ve identified having analyzed a number of SERPs.

D: Going back to what you said to begin with there, are you saying that it might be an idea to consider increasing the number of standard product snippets that e-commerce stores have on category pages, maybe from 10 to 20 or something like that?

A: Yeah, 100%. Again, it probably depends on the market. But I’ve seen, for example, in the furniture space, that the machine learning model identified a cutoff point: stores offering only three or four typical products for a product category seemed to perform much worse than other stores that were offering at least 12 sofas.

D: Interesting. And your fourth key insight to extract from Rank Ranger is schema.       

4. Schema

A: Yeah, the beauty of SERP data from the Rank Ranger API is that you get all kinds of rich information that can tell you a lot about the search intent behind keywords. For example, if you’re seeing People Also Ask results, that’s a clue that you should be using schema to mark up your FAQ content. If it’s e-commerce, you can use schema to show the number of reviews you’re getting for your products. Nothing is new in that sense. The value add of data-driven SEO is to do this at scale, and with the Rank Ranger API, that’s a major piece of the puzzle taken care of.
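That SERP-feature check can be sketched as a simple flag: if a keyword’s results include a People Also Ask block, queue its landing page for FAQ markup; if they include review snippets, queue review markup. The dictionary keys and feature labels below are assumptions for illustration, not the Rank Ranger API’s actual schema.

```python
# Assume SERP-feature data per keyword, e.g. parsed from a rank-tracker export.
# The keyword list and feature labels are assumptions, not the API's schema.
serp_features = {
    "how to clean a trench coat": ["people_also_ask", "organic"],
    "ladies trench coats":        ["shopping", "organic", "reviews"],
    "trench coat sizing guide":   ["people_also_ask", "featured_snippet"],
}

faq_candidates = [kw for kw, features in serp_features.items()
                  if "people_also_ask" in features]
review_candidates = [kw for kw, features in serp_features.items()
                     if "reviews" in features]

print("Mark up FAQ schema for:", faq_candidates)
print("Mark up review schema for:", review_candidates)
```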

The Pareto Pickle – Frequency Count of the Keyword Modifiers on Your Pages

D: Great thoughts. Let’s finish off with the Pareto Pickle. Pareto says that you can get 80% of your results from 20% of your efforts. What’s one SEO activity that you would recommend that provides incredible results for modest levels of effort?

A: Well, for modest levels of effort, assuming no serious budget is involved, I would say either data-driven digital PR, although that does require some budget, or, for zero budget and just your time, you could look into Google Search Console for each of your pages. You can look at the queries and do a frequency count of the modifiers per page, and then optimize your page titles accordingly. And the great thing is you can do this at scale. So assuming you know Python, it’s going to be a very modest effort to get all the data you need using the Google Search Console API. My book also details how to extract that data without being restricted to 1,000 rows. It’s a really modest effort, yet it can have a transformative effect on your SEO traffic.
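The modifier count itself takes only a few lines once the page/query data is out of Search Console. This sketch assumes you have exported one row per page/query pair to a CSV (the API or a bulk export will do); the file name, column names, and stop-word list are assumptions.

```python
from collections import Counter
import pandas as pd

# Assume a Search Console export (or API pull) with one row per page/query
# pair. File and column names are assumptions.
gsc = pd.read_csv("gsc_page_queries.csv")  # columns: page, query, clicks, impressions

STOPWORDS = {"the", "a", "for", "to", "of", "in"}  # trim or extend as needed

for page, group in gsc.groupby("page"):
    # Split every query into words and count them, ignoring stop-words.
    words = Counter(w for q in group["query"]
                      for w in q.lower().split() if w not in STOPWORDS)
    # The most frequent terms beyond the head term are the modifiers
    # worth testing in that page's title.
    print(page, words.most_common(5))
```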

D: Interesting. So a lot of great data inside Google Search Console that you can use to help optimize page titles and things like that. Are you a fan of using AI to help create that content as well?

A: 100%. I actually built a neural network model and trained it on just over a million data points of meta titles and descriptions. Using that, I was able to generate sensible meta descriptions. I showed them to a few people just to do the Mom test and they thought they were pretty good. Obviously, the Google Terms of Service do state that a human must edit content, but I think that applies to article body content. For meta descriptions and titles, that’s pretty much fair game.

D: And I must ask about the option you mentioned that does cost a bit of money, i.e., data-driven digital PR. What would be an example of a data-driven digital PR approach that is very effective nowadays?

A: Obviously, the PR landscape has changed massively in the last 15 years. Certainly, link acquisition has. In terms of how we service our clients, we produce data-driven content, the kind of content that influencers want to link to. And because it’s proprietary data, it’s the kind of thing that accrues links over time. Obviously, that’s a lot more scalable, and the efforts that go into it are perhaps not so modest. You have to understand your audience and their burning questions, and you have to understand the influencers and what they tend to write about. It doesn’t sound modest at first, but when you look back over the last 12 months at how that piece of content performed in terms of acquiring links over time, you probably end up acquiring links for about $50 per link. Some won’t be worth much, and then there’s the Pareto 10% that have DAs in the 70s, 80s, or 90s. You can’t have it all, but it’s pretty transformative. And on the whole, it’s a modest effort for what you get.

D: Understood. It’s not an insignificant amount of time, effort, and cost to begin with, but looking at it over the long term, it’s a lot of link volume for the money.

I’ve been your host, David Bain. You can find Andreas Voniatis over at artios.io. Andreas, thank you so much for being on the In Search SEO podcast.

A: Thank you for having me. It’s always a pleasure.

D: And thank you for listening. Check out all the previous episodes and sign up for a free trial of the Rank Ranger platform over at rankranger.com.

About The Author

The In Search SEO Podcast

In Search is a weekly SEO podcast featuring some of the biggest names in the search marketing industry.

Tune in to hear pure SEO insights with a ton of personality!

New episodes are released each Tuesday!


