
Patients less likely to take advice from AI doctors that know their names

While the technology can be hugely beneficial, providers have to find the sweet spot when it comes to patients' comfort levels.

Jeff Lagasse, Editor

Photo: David Sacks/Getty Images

Artificial intelligence has long been seen as a means by which healthcare can become easier, more streamlined and more cost-effective, but it turns out that patients' trust in AI technology only goes so far. New findings from Penn State and the University of California show that patients are less likely to take advice from an AI physician when it knows their name and medical history.

On the flip side, patients want to be on a first-name basis with their human doctors.

That was the top takeaway after the research team studied 295 participants, pairing each with either a human physician, an AI-assisted physician or an AI chatbot. 

When the fully AI-driven physician used patients' first names and referred to their medical history, patients were more likely to consider the chatbot intrusive and less likely to follow its medical advice. When it came to real, flesh-and-blood doctors, however, patients expected those doctors to differentiate them from other patients.

WHAT'S THE IMPACT?

The findings offer further evidence that machines walk a fine line in serving as doctors, and should give providers pause about what types of AI technologies to implement in their practices. The authors hypothesize that since machines can't feel or experience, patients are inclined to be more resistant.

Machines do have some advantages as medical providers, though. Like a family doctor who has treated a patient for a long time, computer systems could – hypothetically – know a patient's complete medical history. In comparison, seeing a new doctor or a specialist who knows only your latest lab tests might be a more common experience.

As medical providers look for cost-effective ways to deliver better care, AI medical services may offer one alternative. But AI doctors must provide care and advice that patients are willing to accept, the team said. Many patients don't feel comfortable with the technology, or don't feel that the AI recognizes their uniqueness as a patient. And when the technology does recognize that uniqueness, it can come across as intrusive.

In a perplexing finding, about 78% of the participants in the experimental condition that featured a human doctor believed they were interacting with an AI doctor. One tentative explanation: People may have become more accustomed to online health platforms during the pandemic and may have expected a richer interaction from a human doctor than the chat-based exchange delivered.

In the future, the researchers expect more investigations into the roles that authenticity and machines' ability to engage in back-and-forth questioning may play in developing better rapport with patients.

THE LARGER TREND

Artificial intelligence has the ability to make physicians' lives easier, according to experts and former clinicians.

For example, AI has the ability to make medicine keyboard-free, a futuristic goal that would be welcomed by physicians who spend most of their time with a patient in front of a computer. The technology would require a visual interface or dictation.

The technology also holds the potential to improve administrative processes such as revenue cycle management.

Hiring data provided by Optum360 illustrates the extent to which administrative costs have grown. Physician hiring has increased since 1970, but not nearly to the extent of administrative hires, which have grown 3,000% over that time.

The potential to mitigate waste with AI is joined by an overall positive sentiment toward the technology among healthcare professionals. According to Optum's data, 97% of those in the industry trust AI to handle administrative or clinical applications, while 85% are currently implementing or developing some kind of AI strategy. More than half, 55%, expect AI to achieve positive ROI in fewer than three years.

On average, organizations are investing $39.7 million in AI implementation over the next five years. Already, almost one-third of health plans, providers and employers are automating processes such as administrative tasks or customer service, and 56% of health plans are using the technology to combat fraud, waste and abuse. Thirty-nine percent of providers are using it to personalize care recommendations.
 

Twitter: @JELagasse
Email the writer: jeff.lagasse@himssmedia.com