A new study published in the Journal of Neurosurgery: Spine examines the methodology behind online surgeon scorecards.
The study authors examined the methodology of the Consumers' Checkbook and ProPublica rating websites as applied to neurosurgeons and orthopedic spine, hip and knee surgeons, looking for inaccuracies among surgeons at top-ranked hospitals. The study queried 510 surgeons: 401 orthopedic surgeons and 109 neurosurgeons.
The researchers found:
1. Twenty-eight percent of the surgeons had data on Consumers' Checkbook, and 56 percent had data on ProPublica.
2. Among the surgeons at top-ranked programs who had data, 17 percent had high complication rates and 13 percent had lower case volumes than other surgeons.
3. Seventy-nine percent of the physicians at top-ranked programs received three out of a possible five stars.
4. There was no significant correlation between the stars a surgeon received from Consumers' Checkbook and that surgeon's adjusted complication rate on ProPublica. The measures reported on the sites were "proxies of complication occurrence."
5. The study authors concluded that both surgeon rating sites "have significant methodological issues."
"Neither site assessed complication occurrence, but rather readmission or prolonged length of stay. Risk adjustment was nonexistent," the study authors reported. The rating algorithm methodology isn't peer reviewed or subject to "appropriate scientific scrutiny."