Blind Study Finds YP Local Results Beat Google, Bing, Yahoo
A “blind” study commissioned by directory publisher and local ad network YP, performed by Crowdflower, found that YP’s local search results were better overall than those of search engines. This was measured as a function of user satisfaction rating. Thousands of queries were compared over a year-long period.
I can hear the skepticism: “It’s a sponsored study” and “How can they be better than Google’s local results?” Here’s what the study says about the methodology:
YP contracted CrowdFlower, an independent third-party, to employ an innovative and impartial methodology for the relevance measurement of local search results on YP and other sites. For over a year, CrowdFlower worked with YP to test approximately 13,000 local search query results per month to track user satisfaction (as a proxy for search relevance) of YP.com local search results and those of other popular local search engines.
The study matches YP local search technology against other popular local search engines: Bing local (bing.com/local), Google Maps (maps.google.com), and Yahoo! Local (local.yahoo.com). They were chosen for the study because of their high user reach and ability to offer search results across all local business categories. The search queries in the study span a wide range of local business categories. Sites that focus on a narrow set of vertical categories, such as restaurant- or entertainment-heavy sites, were not included because their scope does not match the breadth of this study.
In the evaluation task, search queries are presented to contributors who are asked to rate their satisfaction with the relevance of the results. To eliminate possible bias, the search results are presented in unbranded lists that show only basic listing information such as business name, address, and phone number.
I was told that this study was originally intended for the purpose of internal benchmarking. It wasn’t undertaken for marketing or PR purposes. Only in the end did the company authorize Crowdflower to release it.
As the discussion above mentions, users were not exposed to the brand of the search site. Above is an image of the type of screen study participants saw in every case, regardless of the source.
As the screen indicates, users in each case were asked to perform a query and then rate the result on a scale from “invalid search” to “perfect.” Each query had both a keyword and a location (city, state, zip). The distribution of query types is illustrated by the chart below:
Crowdflower explains the query selection process further:
The majority of the queries (approximately 65%) are randomly selected from user searches on the YP.com site. The remaining queries are sampled from the most commonly searched categories (High Traffic), YP’s top monetized categories (Top Monetized), and a control set (Monitoring) that remains constant month over month.
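To make the sampling mix concrete, here is a minimal sketch of how a monthly query sample along those lines could be assembled. The function name, pool names, and the split of the non-random remainder are assumptions for illustration; the study only specifies the ~65% random share, the three named pools, and a monitoring set that stays constant month over month.

```python
import random

def build_monthly_sample(random_pool, high_traffic, top_monetized,
                         monitoring, total=13000):
    """Hypothetical sketch of the study's stratified query sampling.

    ~65% of queries are drawn at random from user searches; the rest
    come from high-traffic categories, top monetized categories, and
    a fixed monitoring (control) set reused every month.
    """
    n_random = int(total * 0.65)
    remainder = total - n_random - len(monitoring)

    sample = random.sample(random_pool, min(n_random, len(random_pool)))
    # How the remainder splits between the two category pools is an
    # assumption; an even split is used here for illustration.
    sample += random.sample(high_traffic,
                            min(remainder // 2, len(high_traffic)))
    sample += random.sample(top_monetized,
                            min(remainder - remainder // 2, len(top_monetized)))
    sample += list(monitoring)  # control set, unchanged month over month
    return sample
```

Keeping the monitoring set constant is what lets the study track relevance trends over time: any month-to-month movement on those queries reflects changes in the engines, not changes in the sample.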
If there’s any “bias” in the study, it would be in the query selection process; the queries were not a purely random sample. However, query selection alone doesn’t explain the results.
Crowdflower reported that in November 2012, YP search results “were found more relevant compared to Bing local, Google Maps, and Yahoo! Local. Users were satisfied with YP.com search results 86% of the time, compared with 83% for Google Maps, 82% for Yahoo! Local, and 74% for Bing local.”
There are other charts like the above showing a comparison of user satisfaction and performance by business name query and business category. In each case YP delivered higher local search relevance/satisfaction overall than Google, Yahoo and Bing in that order. The following is a more detailed breakdown of categories where YP performed better and where its results were comparable to the competitors.
Microsoft often claims that its results are better than Google’s but that Google’s brand strength prevents people from recognizing it. The accuracy of that claim is debated. This may well be a comparable situation: YP may genuinely have done a better job of satisfying user expectations than its search engine competitors, yet Google’s brand makes that hard for people to accept.
I’m especially curious what you think of these findings: do you believe them, or do you remain skeptical?
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.