A couple of months ago, we looked at the relationship between review site rankings on a business’s local SERP/SEO and the number of reviews of the business on those sites. We found a significant positive relationship between the number of reviews and how highly those sites ranked in local search results.
Most of the analysis that time around centered on automobile dealers around the country and where their Facebook and DealerRater presences ranked in Google search results targeting that dealer. This time around we expanded that analysis to multiple review sites and multiple industries and found that the relationship between review volume and local search ranking varies wildly by domain and industry. We also dug a little deeper into the data to try to estimate the value of adding reviews on these sites over time, and we found that new reviews are valuable in two ways. First, new reviews help review sites rise on search engine results pages, and second, the more reviews that site acquires, the better its chances are of staying at the top of the results page as well.
Reviews and SEO across sites and industries
First, let’s look at the relationship between review volume and domain ranking in local search for that same set of automobile dealers. (Note: None of these analyses include Google, since the Google review presence is usually anchored on the right-hand side of the page.)
The Facebook and DealerRater lines here match up pretty well to the data we presented before. We also see a correlation between review volume and domain rank for the other domains, but it is notable that the apparent impact of additional reviews varies a bit by source. For instance, for four of these domains the average rank of the review site for a location with no reviews is between 8.5 and 10. Having 100 reviews on DealerRater brings the expected rank of that domain down to the top half of the first page, whereas for cars.com and Facebook, we would still expect 100 reviews to leave that site below the fold when someone is searching for that location. Edmunds.com is even worse. It seems no matter how many reviews you get, Google is determined to pin your Edmunds presence to the top of the 2nd page.
This data would lead us to hypothesize that, on average, an additional review on DealerRater is worth considerably more to a car dealer than an additional review on one of these other sites. But before we explore that hypothesis a little more, let's look at similar data for a few other industries. Next, hospitals:
These are the review site domains that most commonly showed up when we googled over 1,000 US hospitals. Again we see the expected directional relationship: more reviews generally means a better SERP/SEO ranking. However, none of these curves are as steep as the steepest curves for auto dealers. It's also very interesting to note how much Google seems to value a healthgrades.com page regardless of whether there are any reviews on it.
And here is the data for the Self-Storage Unit industry. There isn’t as much breadth in this industry, as there aren’t as many review sites with high volume, but it is very interesting to note that in the storage industry, Facebook has a very strong correlation between review volume and SERP/SEO rank.
All of this is very interesting, but it raises several questions. Most notably, what makes the SERP/SEO ranking of particular review sites seem to be so responsive to review volume in particular industries? And is there actually causation here or does something else explain why some of these correlations are so strong?
Review volume impact on local SEO over time
Let's address the causation question by looking at some more dynamic data, specifically at how these rankings and volumes change over time. This is still a long way from a controlled experiment, but it would be more compelling if we could show that as review volumes rise for a particular location on a particular site, the SERP/SEO ranking of that location tends to improve.
Over the last couple of months we gathered SERP/SEO data once a week for several thousand US auto dealers. We then looked at the rankings for major review sites over time and how those changes correlated with total review volume and with changes in review volume. To model this, we fit a Markov Chain that predicted the probability of any weekly SERP/SEO ranking for a review site based upon the domain, that site’s ranking the previous week, the total number of reviews for that location on that site, and whether the number of reviews went up or not.
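To make the modeling step a little more concrete, here is a minimal sketch (in Python, for illustration) of how transition probabilities for a Markov Chain like this can be estimated from weekly observations. The tuple fields below are hypothetical stand-ins; our actual model also conditions on the domain and the total review count.

```python
from collections import defaultdict

def fit_transition_model(observations):
    """Estimate P(rank_next | rank_prev, got_new_review) from weekly
    observations. Each observation is an illustrative tuple
    (rank_prev, got_new_review, rank_next)."""
    counts = defaultdict(lambda: defaultdict(int))
    for rank_prev, got_new_review, rank_next in observations:
        counts[(rank_prev, got_new_review)][rank_next] += 1
    # Normalize the counts into conditional probabilities per state.
    model = {}
    for state, nxt in counts.items():
        total = sum(nxt.values())
        model[state] = {r: c / total for r, c in nxt.items()}
    return model

# Toy data: sites at rank 5 that got a new review tended to move up.
obs = [(5, True, 4), (5, True, 4), (5, True, 5), (5, False, 5), (5, False, 6)]
model = fit_transition_model(obs)
print(model[(5, True)])   # → {4: 0.666..., 5: 0.333...}
```

With enough weekly snapshots per (domain, rank, review-volume) state, these empirical transition probabilities are what let us ask "given everything else equal, did a new review shift the distribution of next week's rank?"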
The first thing we wanted to measure was simple: does getting new reviews positively impact your search engine rank? According to our data, the answer would appear to be yes. In the graph below we plot the predicted impact of getting a new review on one review site according to our model.
According to our data, after we normalize for domain, rank, and total number of reviews prior, review sites that got at least one new review in a given week tended to be placed higher the following week than sites that did not get new reviews. Obviously this impact is much higher when you have no reviews or very few reviews (an average improvement of 1/3 of a spot for sites getting their first review!), and it levels off pretty quickly once you have around a dozen reviews.
Our model spit out one other interesting insight. It found that review volume is important not just for getting a site ranked highly on SERP, but for keeping it there as well. Review site rankings drift from week to week, and our Markov Chain model captures that drift. But what the model also found is that for review sites with a high volume of reviews, regardless of where they ranked the week before, they tended to drift more towards the top of the page (or were more likely to stay there) than review sites with very few reviews.
This graph plots how much an auto dealer’s review volume will impact the drift of that ranking on average. In other words, if you have no reviews, your review site page will lose one spot every three weeks, on average, relative to the norm. If you have 50+ reviews, it will gain 1 spot every 5 weeks on average, relative to the norm. You might ask, “how can I gain a spot if I am already at the top?” Well, links that are in the top spot tend to lose that spot about 20% of the time. If that site has 50+ reviews, it will be much less likely to do so.
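As a quick worked example of what these drift rates imply (using the numbers above, and assuming for simplicity that the drift accumulates linearly week over week):

```python
def expected_drift(weeks, drift_per_week):
    """Expected change in rank relative to the norm after `weeks`,
    given a constant weekly drift rate (negative = moving up the page)."""
    return weeks * drift_per_week

# Rates from the model above: no reviews lose ~1 spot every 3 weeks,
# 50+ reviews gain ~1 spot every 5 weeks, both relative to the norm.
no_reviews = expected_drift(15, +1 / 3)    # ≈ +5 spots (worse)
many_reviews = expected_drift(15, -1 / 5)  # ≈ -3 spots (better)
print(no_reviews - many_reviews)           # ≈ 8-spot gap after 15 weeks
```

In other words, over a quarter, the gap between a zero-review presence and a 50+-review presence on the same site can amount to most of a results page.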
Hopefully this analysis shines a light on the value of generating a healthy review volume on the review sites you want your customers to find on search engines, and makes it clear that those reviews are valuable not just because they help those sites climb to the top of search engine results, but because they help those sites stay there as well. Be aware, though, that these effects can vary considerably from domain to domain, and the most responsive domains may also vary from industry to industry.
At Reputation, our clientele includes both businesses and individuals, and for businesses, one of the primary components of online reputation is the corpus of online reviews written about that business. As a result, we’ve spent a lot of time assessing how online reviews impact a company’s reputation and ultimately, the bottom line.
Online reviews can impact your company and its reputation in a number of ways. At the most granular level, each individual review can impact whether a potential future customer walks through your door and even anchor what that future customer is likely to think of you. These reviews also aggregate into a high-level summary of your company (e.g. an overall star rating and review count) which each review site (e.g. Google, Facebook, Cars) will use to rank you internally within their site and may be all that many prospective customers ever see about you.
Over the next few months we will dig into various components of these interactions, but for now we will start at an even higher level, looking at how these various ratings and reviews impact how your business is represented on popular search engines such as Google and Bing.
As the online reputation space matures, our clients ask more and more about the impact of online reviews on other metrics. These questions are normally packaged up under the topic of Return on Investment (ROI), which is a little nebulous: it could mean anything from store-level foot traffic to more effective marketing spend, greater visibility, or a concrete impact on sales. Over the years we have been able to make cases for most of these, but the most fun one so far has been reverse engineering what is going through Google's mind: online reputation's impact on a location's local search engine visibility. Our hypothesis is simple: we think that Google boosts the Search Engine Results Page (SERP) ranking of review sites based on their review volume and ratings.
A small housekeeping note: we define local SERP as what shows up when you do a local search, on your mobile phone or on the web, targeting a specific area.
We are starting with a data set that includes about 20k reviews from DealerRater and Facebook.
Let's assume that different sites are treated differently by Google, and start by looking at the same data segmented by the source of the reviews:
Since we are dealing with only two factors here, we can look at a scatterplot to get a first level sense of interactions:
A quick analysis of these graphs reveals two interesting interactions:
- The number of reviews has a clear impact on the SERP Rank (rightmost column, top graph)
- Rating has a correlation with Rank, but not one that appears to be very strong (leftmost column, middle graph)
Let's run an ordinary linear regression and see if we can get an overall sense of the importance of the two factors. There are arguably better ways of doing this, but it will help confirm our suspicions about the relative importance of these factors:
```r
fit <- lm(Rank ~ Rating + log(review_link_text + 1), data = source_data)
summary(fit)
```
```
Call:
lm(formula = Rank ~ Rating + log(review_link_text + 1), data = source_data)

Residuals:
    Min      1Q  Median      3Q     Max
-7.3197 -2.4955 -0.8922  1.6404 22.2808

Coefficients:
                           Estimate Std. Error t value Pr(>|t|)
(Intercept)                 8.47824    0.06445 131.546  < 2e-16 ***
Rating                      0.53441    0.07849   6.809 9.98e-12 ***
log(review_link_text + 1)  -0.99965    0.01278 -78.191  < 2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 3.668 on 48297 degrees of freedom
Multiple R-squared:  0.1157, Adjusted R-squared:  0.1156
F-statistic:  3158 on 2 and 48297 DF,  p-value: < 2.2e-16
```
Using a relative importance metric calculation (via R's relaimpo package), we get:
```
Relative importance metrics:

                                 lmg        last      first      pratt
Rating                    0.01974774 0.007525053 0.03137841 0.01543049
log(review_link_text + 1) 0.98025226 0.992474947 0.96862159 0.98456951
```
This confirms what we saw in the graphs: there is a very strong correlation between the review count and the Rank. Rating matters too, but its impact is small and needs more analysis. We will look at it in more detail in another blog post and come up with some strategies around it.
Let’s explore the relationship between quantity and SEO visibility a little more and see if we can get a better picture by using a quantile binning of the review count:
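Quantile binning just splits locations into equal-population buckets by review count, so each bucket's average rank is estimated from the same number of locations. Here is a minimal pure-Python sketch of the idea (a stand-in for, e.g., pandas `qcut` or R's quantile cuts):

```python
def quantile_bins(values, n_bins):
    """Assign each value to one of n_bins equal-population bins,
    ordered from smallest values (bin 0) to largest (bin n_bins-1)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    bins = [0] * len(values)
    for pos, i in enumerate(order):
        # Each bin holds len(values) / n_bins items (cap the last bin).
        bins[i] = min(pos * n_bins // len(values), n_bins - 1)
    return bins

# Illustrative review counts for eight locations, split into 4 bins;
# within each bin you would then average the observed SERP rank.
counts = [0, 2, 5, 9, 14, 30, 55, 120]
print(quantile_bins(counts, 4))  # → [0, 0, 1, 1, 2, 2, 3, 3]
```

Plotting the mean rank per bin is what produces the chart discussed below.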
It is pretty clear that having a baseline volume of reviews has a huge impact on your SEO visibility – the first 10 reviews that a location gets can boost a location up from hovering around the bottom of the first page or second page to clearly in the top half of the first page.
With search engine rankings, what we ultimately care about is how many people click on the links, since click-through rate (CTR) is a decent approximation of how many users are looking at your locations specifically. Moz.com has published interesting research on organic CTR by position, and since all we really care about is approximating the CTR, that should be more than enough: https://moz.com/blog/google-organic-click-through-rates-in-2014
If we take this CTR data into account, we can see that having 50 reviews can increase the expected click through rate by 266% compared to a baseline location.
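To show the shape of this calculation, here is a sketch that combines the regression coefficients from the fit above with an assumed CTR-by-position curve. The geometric curve and the 4.0 star rating are illustrative assumptions, not Moz's published figures, so the lift it produces differs from the 266% reported here.

```python
import math

# Coefficients from the lm() fit above.
INTERCEPT, B_RATING, B_LOG_REVIEWS = 8.47824, 0.53441, -0.99965

def predicted_rank(rating, n_reviews):
    """Predicted SERP rank from the linear model above."""
    return INTERCEPT + B_RATING * rating + B_LOG_REVIEWS * math.log(n_reviews + 1)

def ctr(rank):
    """Illustrative CTR-by-position curve (NOT Moz's actual numbers):
    CTR decays roughly geometrically as you move down the page."""
    return 0.30 * (0.65 ** (rank - 1))

base = ctr(predicted_rank(4.0, 0))      # zero-review baseline
with_50 = ctr(predicted_rank(4.0, 50))  # same location with 50 reviews
print(f"CTR lift from 50 reviews: {with_50 / base - 1:.0%}")
```

The exact lift depends entirely on the CTR curve you plug in; the mechanism is the interesting part: a roughly 4-spot predicted rank improvement compounds into a multiplicative CTR gain.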
Now this leaves many open questions. For example, we have identified a clear correlation between the number of reviews and the SEO ranking, but certainly not a causal effect. In future posts on this topic, we will be looking at the following questions:
- Can we identify some likelihood of causality here?
- How does the industry you are in change this?
- Does Google treat different review sites differently?
- What are other factors that affect this?
- How long does it take for review volume to have an impact on search rank?
Reputation.com was founded in 2006 with the mission of protecting an individual's online reputation. We created the online reputation management industry, and for the last decade we have led the way in building and innovating in this space. In order to deliver on our promise of monitoring and managing the online reputations of our clients, we have had to make sense of the constantly expanding Internet and World Wide Web. And to do so, we have had to be on the leading edge of another rapidly growing industry: data science.
Over the last few years, our company has found a new market with our business product, and with it, we’ve opened up a new vein of interesting data sets and problems. When I joined Reputation.com, I felt it was important to share some of the discoveries we’ve made as it relates to online reputation. With one of the largest data sets of reviews for large companies in the world, we have had the opportunity to understand just how incredibly impactful reviews are (in most cases – in some cases, they have no impact).
This blog is really an opportunity for the data science team to share and engage in a dialogue with others about some of our cool discoveries, and to let others get to know more about the tools (both infrastructure and process) that we use. In the next few months, we are looking to publish some research on the impact of online reputation on local SEO, handling analytics with MongoDB databases, scaling out NLP with Spark, and some thoughts on how to build out analytics infrastructure. Let me know if you have any suggestions for future posts; I can be reached at ajohnson at reputation.