Do Google Places Listings Really Matter?
Posted on April 1st, 2011
I’ve noticed that in many cases, if you have great organic rankings for your local terms, the additional visuals of a Google Places listing (map marker, logo image, address/phone info, plus maybe star ratings, review snippets, and review counts) do not matter at all.
I have a certain LONG-term client (5 years) and have seen all the various iterations of Google Maps and Places mixed into the search results, from 10-packs to 7-packs to today’s blended local organic listings. This client has enjoyed #1 organic rankings for their primary local terms through most of that time (with maybe the odd short-term dip to #2). But here’s the kicker – they are located just outside the city limits. That tends to restrict a number of things as far as Google Places listings and rankings go.
Here are the various changes to Google results they’ve witnessed over the past 5 years:
- Before Google started showing map listings in the SERPS, they were #1 organic.
- When the 10-pack local listings rolled out they were #1 organic but below the 10-pack and map.
- For a while (most of 2009 and the early part of 2010) we were able to trick Google into thinking they were in the city, and they showed at #1 in the 7-pack and again at #1 organic below the map.
- Later Google caught on to our trick and they got kicked out of the 7-pack, but they still showed at #1 organic below the map.
- The new blended local organic results now have them at #1, above any of the local listings that get the bonus map markers and such.
What changes do you think we saw in website traffic and lead volumes (we track email leads as goal conversions in Analytics) through each of those significant shifts?
Answer = none.
By none I mean nothing that could by any means be attributed to Places listings vs. no Places listings. What we did see was year-over-year growth in traffic and leads (look at the peaks in the image below – this is a very seasonal business). We attribute that to the growth of the internet and search in general as a way to find local businesses.
You would think there should have been a noticeable difference between appearing in the 7-pack, well above the organic results, and appearing only in the organic listings below the map. But that simply is not the case.
Now, the peak for 2010 was not much different from 2009. Could that peak have been higher if the business had been located within the city limits (or if we had managed to keep convincing Google it was) and we had continued to show up in the local 7-pack? To sanity-check that, I looked up their primary local term in Google Insights for historical traffic trends. You’ll see 2009 and 2010 at about the same levels there as well.
One Small Difference
Primary Service Term with No Location Qualifier
If you search certain broad terms like dentist, pizza, plumber, etc., Google recognizes it as the type of query that may have local intent and will mix map results in with the broad organic results. Without a true local presence in the city, we now miss out on that traffic. But it is small – puny, even.
For the year-plus that we had the (sort of) spoofed local presence and high map rankings, the volume of traffic from the singular terms was about 1/10th of that from searches that included the same word plus the city name before or after it. Then there are the other iterations that use city names with state or province abbreviations, as well as those that use ‘in’ city-name, ‘near’ city-name, etc. Combine all the other search terms for their various related services, with a city name in the phrase, and the singular broad terms amount to closer to 1/100th of their local traffic.
Another mark of the insignificance of the single-word terms: conversion rates (email leads) were a mere fraction as well. Terms that included the city name converted at 8% to 15%. Single-word searches on their two primary service-type variations converted at 0.39% and 0.64% respectively.
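To put those ratios in perspective, here is a rough back-of-the-envelope sketch. The visit total is a hypothetical illustration figure, not the client’s actual data; the 1/100th traffic share and the conversion rates are the figures quoted above (using the low end of the city-term range):

```python
# Back-of-the-envelope lead comparison: city-qualified terms vs. broad
# single-word terms, using the ratios and conversion rates from the post.
# NOTE: local_visits is a made-up illustration number, not real client data.

local_visits = 10_000              # visits from terms including the city name
broad_visits = local_visits / 100  # broad terms were ~1/100th of local traffic

local_conv = 0.08                  # city-name terms converted at 8%-15%; low end
broad_conv = 0.0039                # one broad term converted at 0.39%

local_leads = local_visits * local_conv   # 800 leads
broad_leads = broad_visits * broad_conv   # 0.39 of a lead

print(f"Leads from city-qualified terms: {local_leads:.0f}")
print(f"Leads from broad single-word terms: {broad_leads:.2f}")
```

With any plausible visit total, the broad terms contribute a small fraction of one percent of the leads, which is why losing them barely registers.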
Goodbye, single-phrase terms with pseudo-local intent – we don’t really miss you.
Hold On, Not So Fast!
I have other clients ranking well organically for areas where they don’t get local map listings – they see pretty much the same thing. But for some industries the pull of the map is a little more noticeable.
People have slightly different intents when searching certain local businesses. Those industries where need tends to be urgent – “I need it now and I don’t care who it is, just get it done, NOW” – see a slightly different trend.
I’ve worked with a number of plumbers in various cities and they all tell me the same thing. Searchers are looking for phone numbers and simply start “dialing for dollars”. They have a leaky pipe that’s about to ruin their flooring and they want a fix now. The first plumber to answer the phone gets the business, not the guy who can’t answer his cell while elbow-deep in a toilet. Callers can’t even be bothered to leave a message on voicemail; they hang up and call the next plumber on the list.
Google Maps/Places listings show phone numbers quite prominently, so in those urgent cases the Places listing in the search results matters more. But that can be overcome with good organic rankings too: simply include a phone number in your title tags and/or meta description tags.
Reviews Can Matter, Sometimes
Blended local search results sometimes display snippets of reviews right there on page one. They can have an impact in extreme cases. By extreme, I mean a glaringly negative review showing as the review snippet – right there on page 1.
From the perspective of what a user sees in the search results, things like positive review snippets, review counts, rating stars, etc. have little impact on click-throughs to the website and phone calls. It primarily comes down to rankings. But one bad review, for all to see, can have a dramatic impact – not on traffic to the site, but on lead volumes, both phone calls and emails.
A client in New York City had that dilemma with a false negative review that Google pulled from Yelp. The bad review showed on page one of Google search for close to two months, and during that period lead volume dropped by 75% (phone calls saw a bigger impact than email), yet there was no discernible change in total traffic to the website.
It seems that rankings still pulled the clicks as they usually do, but people chose not to contact them – obviously because of the review they saw in the search results. Based on the content of that review, we could tell it was either an intentional fake from a competitor or a mistake on the reviewer’s part, mixing them up with some other business.
We posted a business owner response to that review, similar to what this dentist did, and a couple of weeks later Yelp ended up placing the bad one behind its review filter, and Google then scraped a different review to show as the snippet. Lead volumes are shooting back up to normal levels. Needless to say, my client is breathing a sigh of relief.
In general terms, organic rankings trump all, and the extra fluff of review snippets, star ratings, and review counts is just that: fluff (unless it’s glaringly negative). I wonder, then, at the mass-user level, what people actually think of those extras.
- Is it merely a little side bonus that might, for some users in some cases, maybe, maybe, maybe influence a decision?
- Is there a lack of trust for user reviews that are easy to fake and have little to no quality control?
- Do they ultimately put more trust in the ranking algorithm, regardless of stars and reviews?