What it takes to involve clinical staff in AI
BOSTON – When asked about the evolution of artificial intelligence and when emotional AI will become an integral part of the healthcare fabric, Dr. Patrick Thomas, director of new digital technology in pediatric surgery at the University of Nebraska Medical Center College of Medicine, said medical training needs to be revamped.
“We need to prepare clinical students for the real world they’re going into,” he said at the HIMSS AI Health Conference on Thursday.
As moderator of a panel on how to help healthcare professionals trust and embrace AI – one highlighting the importance of governance, keeping humans in the loop and shared responsibility between the developers and providers of AI medical products – Thomas asked his panelists how they address clinicians' doubts and hesitancy.
Joining Thomas on the panel were Dr. Sonya Makhni, medical director of the Mayo Clinic Platform; Dr. Peter Bonis, chief medical officer of Wolters Kluwer Health; and Dr. Antoine Keller of Ochsner Lafayette General Hospital.
Increasing clinical resources
Using large language models to ease the heavy cognitive burdens clinicians carry remains challenging, from bias and data quality to cost.

Bonis noted that the cost of deploying these systems can end up exceeding the cost of building the underlying infrastructure.
Keller added that healthcare generates more information than clinical staff can absorb.

"We don't have enough staff to deliver" the right clinical decisions in a timely manner, he said. While health systems focus on risk-based governance – building safeguards to address concerns about AI use – giving clinicians a level of comfort with the technology so they can embrace it is important.
Keller, a heart surgeon, explained how Louisiana-based Ochsner Health provides community health partners with an AI-enhanced tool called Heart Sense to drive interventions.
By conducting assessments in underserved communities with low-cost technology, the AI tool “increases the workforce geometrically,” he said.
Improving access to communities in need
Using the AI-enabled heart monitor not only improves Ochsner's practice, it also focuses attention on the patients who need it most.

Thomas asked what it means to bring AI into care when a healthcare organization has no data scientists on staff, and how Ochsner's community health partners come to understand the AI tool and how it is used.
Overcoming reluctance in the communities they serve takes a lot of hand-holding, Keller said.
“You have to be there and be aware of the obstacles or issues that people have using technology,” he explained.
However, those who find the technology in areas where there is a shortage of medical providers are grateful for it, he said.
A key diagnostic finding – a heart murmur – is required before a patient can have an aortic valve replacement. Using the AI-driven diagnostic tool, the health system has found that 25% of people over 60 in its communities have pathologic murmurs – which can be treated with surgical intervention.
“The prevalence data shows that there is a large undiagnosed population,” Keller said.
Using AI to understand which patients are at risk is a huge opportunity to treat them before they develop an irreversible problem.
But acceptance depends on having an educated workforce, he said.
Presenting the information "visually – with something concrete" makes it easy to understand, even for those with limited formal education, he said.
Stakes and shared responsibility
"The standards are very high for medical care," Bonis acknowledged, noting that keeping a human in medical decision-making is a guiding principle for developing AI that is reliable under many different conditions.

While front-line clinicians aren't interested in the "sausage-making" behind AI, "from a user's lens, awareness is important," he said.
For Makhni, the question is how to distill that body of knowledge for clinicians, she said.

The Mayo Clinic Platform is working directly with AI developers and the Mayo Clinic to examine how to deploy clinical AI in a way that is user-centric – "and communicates that information transparently to empower the end user."
Such multidisciplinary analysis can determine whether the AI developer has tried to detect bias, for example, and that information can be relayed to doctors in a way they understand.
"We meet [developers] where they are on their journey," she said, "but the goal is to represent the principles of safety, equity and fairness."
Thinking about the digital divide and asking clinical staff about their concerns is important to delivering safe AI applications, and the burden cannot fall solely on users, she said.

"Sometimes it also has to fall to the solution provider," she said. "We have to have a shared responsibility."

While healthcare won't solve every AI problem quickly, "we can go in a metered fashion," she added.
Andrea Fox is the senior editor of Healthcare IT News.
Email: afox@himss.org
Healthcare IT News is a publication of HIMSS Media.