
EHR usability issues may contribute to patient harm, JAMA study shows

The authors caution that, due to a conservative approach in the way the data was analyzed, the actual numbers are likely underestimated.

Jeff Lagasse, Editor

The usability of electronic health records may be associated with some safety events in which patients were possibly harmed, according to a new study published in the Journal of the American Medical Association.

And while the authors made no specific financial claims, clinical quality has increasingly been tied to reimbursement as the healthcare industry shifts from fee-for-service to a value-based model, with the federal government limiting reimbursement for underperformance in categories such as 30-day readmission rates and, yes, patient harm events.

[Also: Doctors say EHRs fall short for value-based care]

The analysis found that patient safety reports mentioning a specific EHR vendor or product contained language suggesting the EHR may have contributed to patient harm, though from 2013 to 2016 these issues represented less than 1 percent of patient harm events.

Researchers analyzed 1.735 million free-text patient safety reports from 571 healthcare facilities in Pennsylvania and another East Coast healthcare system. Only reports that mentioned one of the top five electronic health record vendors or products, and that were classified as "reaching the patient with possible harm," were included.

The researchers found that of the reported safety events, 1,956 of them, or 0.11 percent, specifically mentioned one of the included EHR vendors or products and were reported as possible patient harm. And 557 of them, or 0.03 percent, used words that strongly suggested EHR usability contributed to possible patient harm.

Of those latter incidents, 468 reached the patient and could possibly have needed monitoring to preclude harm; 80 could possibly have caused temporary harm; seven could possibly have caused permanent harm; and two might have needed intervention to prevent a fatality.

The authors caution that, because of the conservative way the data was analyzed, the actual numbers are likely higher than reported. That's partly because the data came mostly from Pennsylvania; researchers expect that if the analysis were scaled to cover the entire nation, the numbers would rise.

Another reason the numbers may not reflect the whole story is that the study only examined reports specifically mentioning the name of a vendor or product; clinicians often leave EHR vendor names out of patient safety reports.

And then there's the fact that patient safety events are typically underreported, in some cases by five- or tenfold, according to the authors.

Twitter: @JELagasse
Email the writer: jeff.lagasse@himssmedia.com