During one of the many Live Collaboration Panels at MUSC’s 2022 Innovation Week, an interesting discussion ensued, reflecting a common debate in healthcare: How does artificial intelligence (AI) fit in?
In a conversation last week with several clinicians and key members of the Clemson-MUSC AI Hub — formed in 2021 — at the Gazes Cardiac Research Institute, it quickly became clear that AI is gaining traction throughout the healthcare world. But equally evident was the fact that some skepticism remains in the mainstream about how best to use it.
For congenital cardiologist J. Hamilton Becker, MD, assistant professor of pediatrics, artificial intelligence remains a huge untapped resource.
“Artificial intelligence is an umbrella term,” he said in an interview shortly after the formation of the Clemson-MUSC AI Hub last year. “We are taking advantage of data science and mining those giant databases with appropriately applied machine learning methods.”
Becker has been using AI in his work for several years, contributing to a number of AI and biomedical projects ranging from congenital heart disease to diabetic eye disease.
“I feel very strongly about AI education. The goal is to teach clinicians how to understand and use AI. We’re not asking people to learn how to code, we simply want them to learn how AI can work with them,” Becker said.
At Gazes, the discussion quickly turned to artificial intelligence and bias. Some clinicians believe that the most elegant aspect of AI is that it removes unintended biases: computers, which have no opinions of their own, do the data processing and leave the treatment decisions to the doctors.
“When two clinicians disagree about something, AI can help uncover unknown biases and dispel others,” said Paul Haider, associate professor of public health sciences at MUSC. “Artificial intelligence only looks at the data and makes decisions based on that alone.”
However, others argued that those AI programs were written by humans, whose unintended biases almost certainly find their way into them.
“Trustworthiness is a key word we should be focusing on here,” said Brian Dean, PhD, chair of the computer science department at Clemson University. “Because the AI system has become less of an intelligent sensor that provides input to the medical decision-making process and more of a colleague. So we have to be very careful because, after all, the AI has been trained based on the opinion of a human expert, which is a biased opinion.”
Dean agreed that artificial intelligence is a very valuable tool in the medical field but cautioned that it must be used wisely.
Jihad Obaid, co-director of the Center for Biomedical Informatics at MUSC, agrees. “If you use it as a decision aid, not a decision maker, then AI can be a real asset,” he said.
Regardless of the differences of opinion in the room, panelists agreed that AI has unlimited potential for researchers and clinicians alike.
“When it comes to AI in healthcare, it’s tempting to talk about the hype, and all the big things it can do,” Becker said. “But the fact of the matter is that there are a lot of easy and smart projects where AI can make a huge difference, and we just need more people on board.”
According to MUSC President Lisa K. Salahuddin, PhD, MUSC is already using artificial intelligence to develop technologies that can help diagnose and treat a range of conditions, including cancer, Alzheimer’s disease, substance abuse, child abuse, epilepsy, aphasia, and inflammatory skin and heart diseases.
Becker said that clinicians interested in applying AI in their research or practice should look to the AI Hub, as it provides a range of resources, including funding. During this year’s Innovation Week, the Clemson-MUSC AI Hub awarded $100,000 in grants to five worthwhile projects.
“We want people to know about this,” he said. “I know there are a lot of people out there who could really use our help. We want to accelerate the adoption of AI for those who care about that.”