Online doctor ratings don’t predict actual performance

(Reuters Health) - Patients’ online ratings of physicians often don’t match up with the scientific data about the doctors’ quality of care, according to a new study.

“When you’re trying to choose a restaurant online, it’s OK if you get a bad recommendation, but the stakes are high here,” coauthor Dr. Timothy Daskivich of Cedars-Sinai Medical Center in Los Angeles said in a phone interview. “Online rating companies should be clear about what their ratings measure and don’t measure so patients understand the scores.”

Using the five most popular online platforms - Healthgrades, Vitals, Yelp, RateMDs and UCompareHealth - researchers looked at consumer ratings for 78 medical and surgical specialists. Next, they compared the consumer ratings to the doctors’ specialty-specific performance scores, which were based on how closely the doctors adhered to medical guidelines, certain outcomes of their patients’ hospital stays, and the cost of the care they provided.

Patients’ ratings of individual doctors tended to be consistent across all five platforms - but there was no significant association between consumer ratings and objective measures of the quality and value of care doctors provided.

Overall, the authors found, less than a third of the doctors with the lowest performance scores had consumer ratings in the lowest category. For some specialties, the disparity between consumer ratings and actual performance was even greater.

Writing in the Journal of the American Medical Informatics Association, the researchers cite a 2013 survey in which 81 percent of patients said they would visit a doctor based on positive reviews, and 77 percent said they wouldn’t see a doctor based on negative reviews.

“Those ratings don’t tell the entire story of how good a doctor is,” Daskivich said.

Consumer ratings do, however, “seem to explain some of the service-related aspects, such as staff friendliness, time spent with patients, empathy and ability to answer questions, which are all important for effective delivery of health care,” he added.

There needs to be a way to pair consumer ratings with quality and performance measures, the study authors say. Some of these measures are reported on websites such as Physician Compare, which is run by the Centers for Medicare and Medicaid Services, but that information can be difficult to understand, they add.

“Ideally, a scoreboard should be available for each provider, which combines both the expert and the patient opinions and has intuitive explanations of technical ratings, such as readmission rates” at hospitals, said Dr. Vagelis Hristidis of the University of California, Riverside. Hristidis, who wasn’t involved with this study, researches online ratings and health insurance ratings - and how those relate to a doctor’s performance scores.

“We are working on an automatic classification tool that extracts the factors rated by each text-based patient review,” he told Reuters Health by email. “For example, does a review talk about wait time or about medical skills?”

In addition to online sources, the study authors recommend that patients ask their primary care doctors for referrals to specialists. “Another doctor knows who they trust, and if it’s local, they likely personally know the doctor and quality of care provided,” Daskivich said.

SOURCE: Journal of the American Medical Informatics Association, online September 8, 2017.