(Reuters Health) - Physician ratings on popular patient-review websites did not match formal patient surveys about the same doctors, or other measures of quality of care, in a new study.
"This is really the first study of its kind with legitimate measures of quality of care," said Bradley M. Gray, a researcher for the American Board of Internal Medicine (ABIM) in Philadelphia and coauthor of the new research letter.
Online star ratings did not appear to steer patients toward better or worse doctors, Gray said, but they may not be very useful to patients either.
"All we can say is that we didn't find a big bias in terms of extremism," Gray said. "We also didn't find huge differences in the quality of doctors who did and did not have reviews."
For the study, the researchers included 1,299 doctors who had completed a professional education module from ABIM to improve their practice in diabetes or hypertension treatment.
The study team looked at patient surveys and medical records to assess the patients' clinical outcomes and their experiences, both validated measures of the quality of care.
Gray's team then searched for the doctors' names, specialties and cities using Google and extracted ratings from eight free doctor-rating sites, including Healthgrades, UCompareHealthCare, Vitals and Avvo.
About 60 percent of the doctors in the study had been rated online, and each doctor had an average of between five and six patient ratings on the websites.
Doctors' website ratings mostly did not match the clinical quality measures, according to the results in JAMA Internal Medicine.
The only area with a small association was patient experience. For example, among doctors with an online rating of one star out of a possible five, 79 percent of patients who answered the formal survey rated their overall quality of care as "very good." That compared to 82 percent of surveyed patients when the doctor's online rating was five out of five stars.
There is no one correct way to measure quality, which is always somewhat subjective, according to Dr. David A. Hanauer, who researches clinical medical informatics at the University of Michigan Medical School in Ann Arbor. He was not involved in the study.
"What is important to you might be different from what is important to me," Hanauer told Reuters Health by email. "For some people it might be years of experience, or bedside manner, or types of insurance accepted, or wait times in the clinic, or being board-certified, and the list goes on."
Dr. Naomi Bardach at the University of California San Francisco noted in an email, "The quality measures they use are based only on the experiences of patients with two specific illnesses (diabetes and hypertension), and only 25 patients in each practice."
So the study has some strengths but the findings "do not provide strong evidence that online ratings should be disregarded," she told Reuters Health.
Crowd-sourced patient observations of care may contain wisdom in some cases, and have correlated with hospital infection and death rates in past studies, she said.
"The number of ratings is generally not large enough to support an accurate rating," Gray said.
"It's also possible that the kind of people that respond to websites are not representative of the population in general," he told Reuters Health. "Personally I've never gone onto a website and written a review."
Websites near the top of a Google search did seem to correlate more with patient experience scores, he noted.
"There is no danger or cost to looking at the reviews except for time spent at the computer," Bardach said.
"Prospective patients should not avoid looking at them," she said. "But the ratings are likely not perfect, since the quality of 'crowd sourcing' is always going to be based on who exactly is in the crowd and how similar their story is to yours."
The stars on the website ratings don't seem to mean much, but a lot of information on the websites is valuable, Gray said.
"For example they tell you whether a doc is board certified or about hospital privileges," Gray said. It's a rigorous process to earn and to maintain board certification, which has been found to be related to quality of care, he said.
"It's not hurting anything to look at the stars; the only way it might hurt is if it's distracting people from looking at other sources of information," he said.
Gray said the website www.certificationmatters.org is a good source of board certification information.
SOURCE: JAMA Internal Medicine, online December 1, 2014.