Listen Better, See Deeper
Combining Medical Attentiveness with Artificial Intelligence
By John Halamka, M.D., president, Mayo Clinic Platform, and Paul Cerrato, senior research analyst and communications specialist, Mayo Clinic Platform.
Embracing an “ecology of attention” will significantly improve patient care, according to Mark Kissler, M.D., at the University of Colorado.1 Kissler and his colleagues point out that clinicians spend much of their time multitasking and navigating around interruptions. While such juggling acts are often unavoidable, it’s important to occasionally step back and ask: Is this the best use of my time? Equally important: Do the distractions cause “lapses in judgement, insensitivity to changing clinical conditions and medication errors”?1 If so, there are practical solutions that can help refocus our attention.
Kissler et al. offer several recommendations. First, we need to recognize the difference between reachability and deep interpersonal availability. Most clinicians want to be reachable to help solve problems within their scope of practice, but spreading oneself too thin can jeopardize the quality of one's communication. Designing the physical spaces in which we interact with patients and colleagues is another area where we can build better attentiveness. For many years, the business and tech worlds believed that bullpens and shared office space would foster communication, but we are now beginning to realize that all the distractions may impede creativity and productivity. Finally, the University of Colorado team emphasizes the need to build attentiveness into one’s organizational culture: “Provide clinicians with the tools and language to prioritize attention in their daily practice.” That can be accomplished by developing a culture that encourages staffers to listen with curiosity, communicate with empathy, and remain open to others' perspectives, even when those perspectives contradict our understanding of the facts.
Of course, as every clinician knows, even the most attentive listener can still miss things. The medical interview can only uncover so much, necessitating a careful physical exam and diagnostic testing when appropriate. While imaging studies have always been a part of the diagnostic process, machine learning has taken these procedures to a new level, with companies like Zebra Medical, GE, Siemens, and AIDOC introducing useful services. AIDOC, for instance, has created a suite of services that combines three layers: an algorithmic layer, a product layer, and a clinically viable solution layer. All three are integrated directly into the radiologist's workflow. According to AIDOC, the platform reduces “turnaround time and increases quality and efficiency by flagging acute anomalies in real-time. Radiologists benefit from state-of-the-art deep learning technology that is ‘always-on,’ running behind the scenes and freeing them to focus on the diagnosis itself.”
Surveys suggest a need for an always-on service to help radiologists cope with the unrealistic workload that many face daily. One study concluded that, “Based on 255 uninterrupted eight-hour workdays per year, radiologists need to review one image every three to four seconds to meet workload demands.” This hectic pace likely contributes to misdiagnoses and loss of life. The imaging-based diagnosis of lung cancer is one of the most challenging issues to contend with: it is estimated that misinterpreted chest X-rays are responsible for 90% of presumed errors in diagnosing pulmonary tumors.2 Mounting evidence suggests that ML-enhanced analysis of imaging data may catch the disease at a much earlier stage, reduce hospital length of stay and health care costs, and save lives. For example, a prospective, randomized clinical trial that evaluated AIDOC-assisted CT scanning for the diagnosis of intracranial hemorrhage (ICH) found that the algorithms gave clinicians an earlier heads-up. Specifically, the researchers looked at 620 consecutive head CT scans and collected the turnaround times (TAT) for positive ICH findings, i.e., how long it took from completion of the CT scan to reporting the findings to the clinicians who needed them to make a treatment decision.3 Wismuller and Stockmaster compared the TAT of CT results that were flagged in radiologists’ worklists to that of CT results that were not. When radiologists were alerted to the potentially dangerous findings early on, TAT was 73 ± 143 minutes, compared to 132 ± 193 minutes when they were left in the dark.
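To get a feel for what the quoted workload figure implies, a back-of-the-envelope calculation can invert it: given the stated work schedule and a per-image review time, how many images per year would a radiologist be expected to interpret? The 3.5-second figure below is simply the midpoint of the quoted three-to-four-second range; the annual volume is our derived illustration, not a number from the study.

```python
# Back-of-the-envelope sketch: invert the quoted per-image review time
# to estimate the implied annual image volume. The 3.5 s figure is the
# midpoint of the quoted 3-4 seconds; the result is illustrative only.

WORKDAYS_PER_YEAR = 255      # uninterrupted workdays, per the quoted study
HOURS_PER_DAY = 8
SECONDS_PER_IMAGE = 3.5      # midpoint of "one image every three to four seconds"

available_seconds = WORKDAYS_PER_YEAR * HOURS_PER_DAY * 3600
images_per_year = available_seconds / SECONDS_PER_IMAGE

print(f"Implied annual volume: {images_per_year:,.0f} images")
```

At that pace, a single radiologist would be reviewing on the order of two million images a year, which makes clear why an always-on triage layer is attractive.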
These prospective results were supported by a retrospective analysis of a much larger data set. A study presented at the 2019 Society of Photo-Optical Instrumentation Engineers conference analyzed over 7,000 head CT scans from urban academic and trauma centers. Using convolutional neural networks, AIDOC achieved a specificity of 99%, a sensitivity of 95%, and an overall accuracy of 98% in diagnosing intracranial bleeds when compared to ground truth from expert neuroradiologists.4
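For readers less familiar with how these three metrics relate, the sketch below computes them from a confusion matrix. The counts are hypothetical, chosen only so the resulting metrics match the figures reported above; they are not data from the cited study.

```python
# Hypothetical confusion-matrix counts (NOT from the cited study), chosen
# so the three derived metrics match the reported 95% / 99% / 98%.
tp, fn = 1662, 88       # true bleeds: correctly flagged / missed
tn, fp = 5198, 52       # normal scans: correctly cleared / falsely flagged

sensitivity = tp / (tp + fn)                 # share of true bleeds detected
specificity = tn / (tn + fp)                 # share of normal scans cleared
accuracy = (tp + tn) / (tp + tn + fp + fn)   # share of all scans called correctly

print(f"sensitivity={sensitivity:.0%} specificity={specificity:.0%} accuracy={accuracy:.0%}")
```

Note that accuracy depends on how many of the scans actually contain bleeds, which is why all three numbers are reported rather than accuracy alone.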
Reports like these certainly don't imply that machine learning-enhanced algorithms will someday replace physicians; high-quality patient care will always require clinicians who are empathetic listeners. Nor do they suggest that AI will replace experienced radiologists. But they do suggest that those who ignore digital medicine innovations will eventually be replaced by those willing to combine traditional approaches with emerging digital techniques that augment human decision-making.
1. Kissler MJ, Kissler K, Burden M. Toward a medical “ecology of attention.” N Engl J Med. 2021;384:299-301.
2. Del Ciello A, Franchi P, Contegiacomo A, et al. Missed lung cancer: when, where, and why? Diagn Interv Radiol. 2017;23:118-126.
3. Wismuller A, Stockmaster L. A Prospective Randomized Clinical Trial for Measuring Radiology Study Reporting Time on Artificial Intelligence-Based Detection of Intracranial Hemorrhage in Emergent Care Head CT. Presentation at SPIE Medical Imaging 2020 Conference, Houston, TX, February 15-20, 2020. https://arxiv.org/pdf/2002.12515.pdf
4. Ojeda P, Zawaideh M, Mossa-Basha M, et al. The utility of deep learning: evaluation of a convolutional neural network for detection of intracranial bleeds on non-contrast head computed tomography studies. Proceedings of SPIE Medical Imaging 2019. https://www.spiedigitallibrary.org/conference-proceedings-of-spie/10949/2513167/The-utility-of-deep-learning--evaluation-of-a-convolutional/10.1117/12.2513167.short