At Reputation, our clientele includes both businesses and individuals, and for businesses, one of the primary components of online reputation is the corpus of online reviews written about that business. As a result, we’ve spent a lot of time assessing how online reviews impact a company’s reputation and ultimately, the bottom line.
Online reviews can impact your company and its reputation in a number of ways. At the most granular level, each individual review can affect whether a potential customer walks through your door and can even anchor what that customer is likely to think of you. These reviews also aggregate into a high-level summary of your company (e.g. an overall star rating and review count), which each review site (e.g. Google, Facebook, Cars) uses to rank you internally and which may be all that many prospective customers ever see about you.
Over the next few months we will dig into various components of these interactions, but for now we will start at an even higher level, looking at how these various ratings and reviews impact how your business is represented on popular search engines such as Google and Bing.
As the online reputation space matures, our clients ask more and more about the impact of online reviews on other metrics. These questions are normally packaged up under the topic of Return on Investment (ROI), which is a little nebulous: it could mean anything from store-level foot traffic, to increasing the effectiveness of marketing spend, to creating visibility, to showing a concrete impact on sales. Over the years we have been able to make cases for most of these, but the most fun one so far has been reverse engineering what is going through Google’s mind: online reputation’s impact on a location’s local search engine visibility. Our hypothesis is simple: we think that Google boosts review sites’ positions on the Search Engine Results Page (SERP) based on their review volume and ratings.
A small housekeeping note – we define Local SERP as what shows up when you do a local search, either on your mobile phone or on the web, looking at a specific area.
We are starting with a data set that includes about 20k reviews from Dealer Rater and Facebook.
Let’s assume that different sites are treated differently by Google, so let’s start by looking at the same data segmented by the source of reviews:
Since we are dealing with only two factors here, we can look at a scatterplot to get a first level sense of interactions:
A quick look at these graphs reveals two interesting interactions:
- The number of reviews has a clear impact on SERP Rank (rightmost column, top graph)
- Rating correlates with Rank, but the relationship does not appear to be very strong (leftmost column, middle graph)
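As a rough sketch of what this first-pass look involves, a pairwise view can be built with a few lines of pandas. The data below is simulated (our dataset is internal), and the assumed rank/review relationship is purely illustrative:

```python
import numpy as np
import pandas as pd

# Illustrative only: simulate a toy dataset with the same three fields the
# post examines (Rating, review count, SERP Rank). The real data is
# Reputation's internal review/SERP data and is not reproduced here.
rng = np.random.default_rng(42)
n = 1000
review_count = rng.integers(0, 200, size=n)
rating = np.clip(rng.normal(4.2, 0.5, size=n), 1, 5)
# Assumed shape: rank improves (gets smaller) as review volume grows.
rank = np.clip(np.round(9.0 - 1.2 * np.log1p(review_count)
                        + rng.normal(0, 2.0, size=n)), 1, 20)

df = pd.DataFrame({"Rating": rating, "review_count": review_count,
                   "Rank": rank})

# pd.plotting.scatter_matrix(df) would draw the pairwise grid described
# above; the correlation matrix gives the same first-level read numerically.
print(df.corr().round(2))
```

On data shaped like this, the review_count/Rank correlation comes out strongly negative (more reviews, better rank), mirroring the first bullet above.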
Let’s run an ordinary linear regression to get an overall sense of the importance of the two factors. There are arguably better ways of doing this, but it will help confirm our suspicions about the relative importance of these factors:
fit <- lm(Rank ~ Rating + log(review_link_text + 1), data = source_data)
summary(fit)

Call:
lm(formula = Rank ~ Rating + log(review_link_text + 1), data = source_data)

Residuals:
    Min      1Q  Median      3Q     Max
-7.3197 -2.4955 -0.8922  1.6404 22.2808

Coefficients:
                           Estimate Std. Error t value Pr(>|t|)
(Intercept)                 8.47824    0.06445 131.546  < 2e-16 ***
Rating                      0.53441    0.07849   6.809 9.98e-12 ***
log(review_link_text + 1)  -0.99965    0.01278 -78.191  < 2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 3.668 on 48297 degrees of freedom
Multiple R-squared: 0.1157, Adjusted R-squared: 0.1156
F-statistic: 3158 on 2 and 48297 DF, p-value: < 2.2e-16
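For readers who don’t work in R, the same model form can be sketched with plain NumPy least squares. Everything below is simulated, with coefficients chosen to loosely echo the fit above; `review_link_text` is the review-count field from our data:

```python
import numpy as np

# Simulated stand-in for the real ~48k location observations.
rng = np.random.default_rng(0)
n = 5000
review_link_text = rng.integers(0, 300, size=n)  # review count per location
rating = np.clip(rng.normal(4.0, 0.6, size=n), 1, 5)
rank = (8.5 + 0.5 * rating - 1.0 * np.log1p(review_link_text)
        + rng.normal(0, 3.7, size=n))

# Same design as the R formula Rank ~ Rating + log(review_link_text + 1):
# an intercept column, Rating, and log(review count + 1).
X = np.column_stack([np.ones(n), rating, np.log1p(review_link_text)])
coef, *_ = np.linalg.lstsq(X, rank, rcond=None)
intercept, b_rating, b_log_reviews = coef
print(f"intercept={intercept:.2f}, Rating={b_rating:.2f}, "
      f"log(reviews+1)={b_log_reviews:.2f}")
```

The key reading is the signs: a positive Rating coefficient and a negative coefficient on log review count, i.e. more reviews push the rank toward position 1.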
Using a relative importance metric calculation (the lmg, last, first, and pratt metrics, e.g. from R’s relaimpo package), we get:
Relative importance metrics:
lmg last first pratt
Rating 0.01974774 0.007525053 0.03137841 0.01543049
log(review_link_text + 1) 0.98025226 0.992474947 0.96862159 0.98456951
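To make the table above less of a black box: “first” is each predictor’s R² when it enters the model alone, and “last” is the R² it adds when it enters after everything else (lmg averages over orderings, pratt weights by coefficients). A minimal sketch of “first” and “last” on simulated data with an assumed data-generating process:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
log_reviews = np.log1p(rng.integers(0, 300, size=n))
rating = np.clip(rng.normal(4.0, 0.6, size=n), 1, 5)
# Assumed process loosely mirroring the fitted coefficients above.
rank = 8.5 + 0.5 * rating - 1.0 * log_reviews + rng.normal(0, 3.7, size=n)

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_full = r_squared(np.column_stack([rating, log_reviews]), rank)
first = {"Rating": r_squared(rating[:, None], rank),
         "log_reviews": r_squared(log_reviews[:, None], rank)}
last = {"Rating": r2_full - first["log_reviews"],
        "log_reviews": r2_full - first["Rating"]}
print("first:", first)
print("last: ", last)
```

Under this setup the log review count dominates Rating on both metrics, the same pattern the relaimpo table shows.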
This confirms what we saw in the graphs – there is a very strong correlation between review count and Rank. Rating also matters – it has a small impact, but one that needs more analysis. We will look at it in more detail in another blog post and come up with some strategies there.
Let’s explore the relationship between quantity and SEO visibility a little more and see if we can get a better picture by using a quantile binning of the review count:
It is pretty clear that having a baseline volume of reviews has a huge impact on your SEO visibility – the first 10 reviews a location gets can boost it from hovering around the bottom of the first page (or on the second page) to clearly within the top half of the first page.
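The binning step above can be sketched with pandas’ qcut, which splits locations into equal-sized review-count buckets. As before, the data and the rank/review relationship here are simulated for illustration:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 4000
reviews = rng.integers(0, 300, size=n)
# Assumed relationship: rank improves with the log of review volume.
rank = np.clip(np.round(9.0 - 1.0 * np.log1p(reviews)
                        + rng.normal(0, 2.5, size=n)), 1, 20)
df = pd.DataFrame({"reviews": reviews, "Rank": rank})

# Quantile binning: five equal-sized buckets of review count, then the
# typical SERP rank inside each bucket.
df["review_bin"] = pd.qcut(df["reviews"], q=5, duplicates="drop")
summary = (df.groupby("review_bin", observed=True)["Rank"]
             .agg(["median", "mean", "count"]))
print(summary)
```

The per-bucket medians make the “baseline volume” effect easy to see at a glance: the lowest-review bucket sits several positions worse than the highest one.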
With search engine rankings, we are generally most interested in how many people click on the links (a decent approximation of how many users are looking specifically at your locations). Companies like Moz.com have done interesting research on this, and since all we really care about is approximating the click-through rate (CTR), their data should be more than enough: https://moz.com/blog/google-organic-click-through-rates-in-2014
If we take this CTR data into account, we can see that having 50 reviews can increase the expected click-through rate by 266% compared to a baseline location.
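To show the mechanics of that calculation without reproducing Moz’s table, here is a sketch with placeholder CTR-by-position numbers – the values below are made up for illustration, so substitute the real CTR curve from the linked study:

```python
# Placeholder CTR-by-position table -- illustrative values only, NOT the
# Moz figures; plug in the real curve from the study linked above.
ctr_by_position = {1: 0.31, 2: 0.14, 3: 0.10, 4: 0.07, 5: 0.06,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def ctr_uplift(old_rank: int, new_rank: int) -> float:
    """Relative change in expected CTR when moving between SERP positions."""
    return ctr_by_position[new_rank] / ctr_by_position[old_rank] - 1

# e.g. a jump from position 8 to position 3 under these placeholder numbers:
print(f"{ctr_uplift(8, 3):.0%}")
```

Because the CTR curve is so top-heavy, even a few positions of improvement translate into a multiple of the baseline click volume, which is why the review-driven rank gains above matter commercially.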
Now this leaves many open questions – for example, we have identified a clear correlation between the number of reviews and SEO ranking, but certainly not a causal effect. In future posts on this topic, we will look at the following questions:
- Can we identify some likelihood of causality here?
- How does the industry you are in change this?
- Does Google treat different review sites differently?
- What other factors affect this?
- How long does it take for review volume to have an impact on search rank?