
Crowd-sourced online reviews help fill restaurant seats, study finds

By Sarah Yang

For anyone who has wondered whether crowd-sourced online reviews make a dent in a business’s bottom line, the answer is an unequivocal yes, according to a new study by UC Berkeley economists.

UC Berkeley economists found that online reviews can boost a restaurant’s business.

Researchers analyzed restaurant ratings on Yelp.com and found that, on Yelp’s scale of 1 to 5 stars, a half-star rating increase translates into a 19 percentage point greater likelihood that an eatery’s seats will be full during peak dining times. The study, published this month in the Economic Journal, found that the increase is independent of changes in price or in food and service quality.

“This is the first study to link online consumer reviews with the popularity of restaurants,” said study lead author Michael Anderson, assistant professor of agricultural and resource economics. “We show that social media sites and forums play an increasingly important role in how consumers judge the quality of goods and services.”

Anderson and study co-author Jeremy Magruder, assistant professor of agricultural and resource economics, analyzed 148,000 Yelp reviews for 328 restaurants in the San Francisco Bay Area.

Because Yelp rounds a business’s average rating to the nearest half-star (a 3.74 average displays as 3.5 stars, while a 3.76 bumps the restaurant up to 4 stars), the researchers were able to isolate the impact of the displayed rating from the restaurant’s underlying quality.
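Yelp does not publish its rounding code, but the mechanism the researchers exploited can be illustrated with a short Python sketch; the function below is our own illustration of nearest-half-star rounding, not Yelp’s actual implementation.

    def displayed_stars(avg_rating):
        # Round an average rating to the nearest half-star,
        # which is what a Yelp listing displays.
        return round(avg_rating * 2) / 2

    print(displayed_stars(3.74))  # 3.5 (just below the 3.75 cutoff)
    print(displayed_stars(3.76))  # 4.0 (just above it)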

The study found that moving from 3 stars to 3.5 stars increases a restaurant’s chance of selling out during prime dining times from 13 percent to 34 percent, and that moving from 3.5 stars to 4 stars raises that chance by another 19 percentage points. These changes occur even though restaurant quality is held constant.
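That comparison follows regression-discontinuity logic: restaurants whose true averages sit just below and just above a rounding cutoff are essentially identical in quality, so any jump in sellout rates can be attributed to the displayed star. The sketch below uses invented numbers purely to illustrate the idea; the actual study paired Yelp averages with restaurant reservation-availability data and a fuller econometric model.

    # Hypothetical (average rating, sold out?) observations near the
    # 3.75 cutoff; the real data came from reservation availability.
    observations = [
        (3.71, False), (3.73, False), (3.74, True),  # shown as 3.5 stars
        (3.76, True),  (3.77, True),  (3.79, True),  # shown as 4.0 stars
    ]

    CUTOFF = 3.75
    below = [sold for avg, sold in observations if avg < CUTOFF]
    above = [sold for avg, sold in observations if avg > CUTOFF]

    # Quality is effectively held constant across the cutoff, so the
    # difference in sellout rates reflects the displayed rating alone.
    print(f"sellout rate at 3.5 stars: {sum(below) / len(below):.2f}")
    print(f"sellout rate at 4.0 stars: {sum(above) / len(above):.2f}")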

Not surprisingly, the economists found that crowd-sourced reviews have a bigger impact when little other information is available for judging a restaurant’s quality. When the researchers parsed the data further, they found that restaurants rated in popular guidebooks or newspaper rankings did not see a statistically significant effect from their Yelp ratings.

“If a restaurant has a Michelin star or it appears in the San Francisco Chronicle’s list of Top 100 Restaurants in the Bay Area, the Yelp star becomes irrelevant,” said Magruder. “Those restaurants are relatively famous, and consumers already know them. For restaurants that did not appear in those established reviews, we actually saw a 27 percentage point greater likelihood of filled seats during peak dining times with a half-star rating increase on Yelp.”

Could these findings lead to manipulation of the rating system for profit?

“We considered that possibility, and our study indicates that so far, such manipulation is under control,” said Anderson. “There are enough reviews available that it would be difficult to generate enough fake positive reviews to drown out the bad ones. There is also an element of self-policing: customers who visit a restaurant on the strength of a fake positive review, only to be disappointed, can submit bad reviews of their own. It would be hard for a business owner to sustain the false positives over time.”

The researchers are now looking to expand their analysis to sites such as Amazon.com, Tripadvisor.com and Netflix.com.

The Giannini Foundation of Agricultural Economics helped support this research.