Dr. Matthew Hitchcock, a family physician in Chattanooga, Tennessee, has an AI helper.
It records patient visits on his smartphone and summarizes them for treatment plans and billing. He does some light editing of what the AI produces and finishes his daily patient visit documentation in about 20 minutes.
Dr. Hitchcock used to spend up to two hours typing out these medical notes after his four children had gone to bed. “That’s a thing of the past,” he said. “It’s pretty awesome.”
ChatGPT-style artificial intelligence is coming to healthcare, and the grand vision of what it could bring is inspiring. Every doctor, enthusiasts predict, will have a superintelligent assistant that will make suggestions to improve care.
But first come more everyday applications of artificial intelligence. A key goal will be to ease the crushing burden of digital paperwork doctors must produce, typing lengthy notes into electronic health records required for treatment, billing and administrative purposes.
For now, the new AI in healthcare will be less of a genius partner than a tireless writer.
From leaders at major medical centers to general practitioners, there is optimism that healthcare will benefit from the latest advances in generative AI – technology that can produce everything from poetry to computer programs, often with human-level fluency.
But medicine, doctors point out, is not a wide-open field for experimentation. AI’s tendency to occasionally create fabrications, or so-called hallucinations, might be amusing elsewhere, but not in the high-stakes healthcare industry.
That, they say, makes generative AI very different from the AI algorithms the Food and Drug Administration has already approved for specific applications, such as scanning medical images for cell clusters or subtle patterns that suggest the presence of lung or breast cancer. Doctors also use chatbots to communicate more effectively with some patients.
Doctors and medical researchers say regulatory uncertainty, patient safety concerns and litigation will slow the adoption of generative AI in healthcare, especially its use in diagnosis and treatment plans.
The doctors who have been trialling the new technology say its performance has improved significantly over the past year. And the medical note software is designed so that doctors can review the AI-generated summaries against the words spoken during a patient’s visit, making the output verifiable and fostering trust.
“At this stage, we need to choose our use cases carefully,” said Dr. John Halamka, president of Mayo Clinic Platform, which oversees the health system’s adoption of artificial intelligence. “Reducing the documentation burden would be a huge win in itself.”
Recent surveys show that doctors and nurses report high levels of burnout, leading many to leave the profession. High on the complaint list, especially among GPs, is the time spent documenting electronic patient records. That work often extends into the evenings, toil after office hours that doctors call “pajama time.”
Generative AI, experts say, appears to be a promising weapon to combat physicians’ workload crisis.
“This technology is improving rapidly at a time when healthcare needs help,” said Dr. Adam Landman, chief information officer of Mass General Brigham, which includes Massachusetts General Hospital and Brigham and Women’s Hospital in Boston.
For years, doctors have used a variety of documentation aids, including speech recognition software and human transcribers. But the latest AI does much more: it summarizes, organizes and tags the conversation between doctor and patient.
Companies developing this type of technology include Abridge, Ambience Healthcare, Augmedix, Nuance, part of Microsoft, and Suki.
Ten physicians at the University of Kansas Medical Center have used generative AI software in the past two months, said Dr. Gregory Ator, an ear, nose, and throat specialist and the center’s chief medical informatician. The medical center plans to eventually make the software available to its 2,200 doctors.
But the Kansas health system is refraining from using generative AI in diagnosis, concerned that its recommendations could be unreliable and that its reasoning lacks transparency. “In medicine, we cannot tolerate hallucinations,” said Dr. Ator. “And we don’t like black boxes.”
The University of Pittsburgh Medical Center has been a test bed for Abridge, a start-up led and co-founded by Dr. Shivdev Rao, a practicing cardiologist who was also an executive in the venture arm of the medical center.
Abridge was founded in 2018 when large language models, the technology engine for generative AI, emerged. The technology, said Dr. Rao, opened a door to an automated solution to the healthcare administrative overload he saw around him, even for his own father.
“My father took early retirement,” Dr. Rao said. “He just couldn’t type fast enough.”
Today, Abridge software is used by more than 1,000 physicians in the University of Pittsburgh medical system.
Dr. Michelle Thompson, a family physician in Hermitage, Pennsylvania, who specializes in lifestyle and integrative care, said the software freed up nearly two hours a day. Now she has time to take a yoga class, or linger over a family dinner.
Another benefit is that the patient visit experience is improved, said Dr. Thompson. There’s no more typing, taking notes, or other distractions. She simply asks patients for permission to record their conversation on her phone.
“AI has enabled me as a physician to be 100 percent present for my patients,” she said.
The AI tool, Dr. Thompson added, has also helped patients become more involved in their own care. Immediately after a visit, the patient receives a summary, accessible through the online portal of the University of Pittsburgh medical system.
The software translates all medical terminology into plain English at about a fourth-grade reading level. It also provides a recording of the visit with “medical moments” color-coded for medications, procedures and diagnoses. The patient can click on a colored tag and listen to part of the conversation.
Studies show that patients forget up to 80 percent of what doctors and nurses say during visits. The recorded and AI-generated summary of the visit, said Dr. Thompson, is a resource her patients can return to for reminders to take medications, exercise, or schedule follow-up visits.
After the appointment, doctors receive a clinical note summary to review. There are links back to the transcript of the doctor-patient conversation so that the work of the AI can be checked and verified. “That really helped me build confidence in the AI,” said Dr. Thompson.
In Tennessee, Dr. Hitchcock, who also uses Abridge software, has read reports from ChatGPT scoring high marks on standard medical tests and heard predictions that digital physicians will improve care and solve staff shortages.
Dr. Hitchcock has tried ChatGPT and is impressed. But he would never think of loading a patient record into the chatbot and asking for a diagnosis, for legal, regulatory and practical reasons. For now, he’s thankful to have his evenings off, no longer wrapped up in the tedious digital documentation required by America’s health care system.
And he sees no technological solution for the shortage of health care workers. “AI isn’t going to solve that any time soon,” said Dr. Hitchcock, who is looking for another physician for his four-physician practice.