AHA24 Scientific Sessions Daily News - Monday

hospital when pressures in the heart start to increase,” he said. “These pressures don’t just increase two or three days prior to hospitalization; it takes four, five, six weeks. If you have an implantable sensor, we can track pressures and adjust medications, but maybe 1% of the heart failure population has a sensor implanted. This noninvasive device can provide similar information that can prevent patients from ending up in the hospital, prevent them from having symptoms and help them lead a more normal daily life at home.”

Artificial intelligence can improve echocardiographic workflow

A single-center randomized crossover study found that a novel artificial intelligence (AI)-based analysis tool can streamline the daily echocardiography workflow, with improved measurements and more patients examined compared to conventional manual echocardiography for cardiovascular risk assessment.

AI assistance reduced the time per exam to 13.0 minutes versus 14.3 minutes for manual exams (p<0.001) and increased the number of daily exams from 14.1 to 16.7 (p=0.003), with less sonographer fatigue (p=0.039), 3.4-fold more echocardiographic parameters analyzed per exam (85 versus 25, p<0.001) and improved echocardiographic image quality (p<0.001).

“Over 90% of the AI’s initial values were clinically acceptable and used in clinical practice,” said Nobuyuki Kagiyama, MD, PhD, associate professor of cardiology at Juntendo University School of Medicine in Tokyo, Japan. “AI can enhance efficiency in the echo lab, easing a boring, repetitive task like screening echocardiograms, so sonographers and cardiologists can spend more time on the detailed evaluation of more severe patients who really need more intensive care and attention.”

Improving echocardiography workflow is of particular interest in Japan, which has about one-third the U.S. population but performs about 1.3 times more echocardiograms, commonly for routine cardiovascular risk screening.

AI-ECHO randomized four experienced sonographers performing screening echocardiography over 38 days, on a daily basis, to AI-based automatic echocardiography analysis (19 AI days) or conventional procedures (19 non-AI days). Both AI and non-AI echocardiograms were reviewed by expert cardiologists, who finalized all reports for clinical use. The primary endpoint was examination efficiency, defined as the time per examination and the number of exams performed per day. Secondary endpoints included the number of parameters analyzed and image quality.

AI days allowed sonographers to focus on image acquisition and quality, Kagiyama said, resulting in an overall improvement in image quality compared to non-AI days. Because the AI was handling image analysis, sonographers could, and did, concentrate on acquiring higher-quality images, knowing they would not have to spend time later evaluating the images themselves.

“This software is already approved by the FDA and the Pharmaceuticals and Medical Devices Agency in Japan for some uses, but AI-ECHO pushed it beyond what is approved today,” Kagiyama said.

“This real-world randomized trial demonstrates how AI-based automatic analysis can significantly improve the efficiency of screening echocardiography by reducing exam time while maintaining image quality and reducing sonographer fatigue.”
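For readers curious about the shape of such a day-level efficiency comparison, the short Python sketch below compares hypothetical per-day exam times and exam counts between AI and non-AI days. The simulated values, the unpaired t-test and the data layout are illustrative assumptions only, not the AI-ECHO analysis; the article reports 13.0 versus 14.3 minutes per exam and 16.7 versus 14.1 exams per day.

# Minimal sketch (not the AI-ECHO analysis code): compare hypothetical
# day-level efficiency summaries between 19 AI days and 19 non-AI days.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical day-level data: minutes per exam and exams per day
ai_minutes     = rng.normal(13.0, 1.0, size=19)
non_ai_minutes = rng.normal(14.3, 1.0, size=19)
ai_exams       = rng.normal(16.7, 1.5, size=19)
non_ai_exams   = rng.normal(14.1, 1.5, size=19)

for label, a, b in [("minutes per exam", ai_minutes, non_ai_minutes),
                    ("exams per day", ai_exams, non_ai_exams)]:
    # Unpaired comparison of day-level means (test choice is an assumption)
    t, p = stats.ttest_ind(a, b)
    print(f"{label}: AI {a.mean():.1f} vs non-AI {b.mean():.1f}, p={p:.3g}")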
AI model can accurately interpret echocardiogram findings across multiple metrics

Transthoracic echocardiography (TTE) is a key tool for cardiovascular evaluation, but manual reporting can be slow, and interpretation is subject to intra-reviewer variability. A novel AI tool, PanEcho, is the first view-agnostic, multitask AI model that automates TTE interpretation across views and acquisitions for all key echocardiographic metrics and findings. An initial validation study showed a median area under the receiver operating characteristic curve (AUC) of 0.91 across 18 classification tasks. Key findings include an AUC of 0.99 for detecting severe aortic stenosis, 0.98 for moderate-severe left ventricular (LV) systolic dysfunction and 0.95 for moderate-severe LV dilation.

“To our knowledge, this is the first AI model to provide comprehensive echocardiogram interpretation from multiview echocardiography,” said Gregory Holste, MSE, graduate student at the Yale School of Medicine Cardiovascular Data Science (CorDS) Lab. “Current AI applications in echocardiography have been limited to single views and single pathologies for outcomes. And interpretation is nearly real time.”

PanEcho was developed using 1.23 million echocardiographic videos from 33,927 TTE studies performed at a New England health system between January 2016 and June 2022. The model can perform 39 TTE reporting tasks spanning the full spectrum of myocardial and valvular structure and function from parasternal, apical and subcostal views, including B-mode and color Doppler videos. The model was evaluated on a distinct New England health system cohort and two cohorts in California. Researchers assessed off-the-shelf diagnostic performance and PanEcho’s ability to function as a foundational model that can be fine-tuned for specific domains.

The model estimated continuous metrics with a median normalized mean absolute error (MAE) of 0.13 across 21 routine echocardiographic tasks. LV ejection fraction (EF) was estimated with 4.4% MAE and LV internal diameter with 3.8 mm MAE. PanEcho can identify which views are most informative for each task. The model also transferred LVEF estimation to novel pediatric populations with superior performance compared to existing approaches: 3.9% MAE versus 4.5% MAE for the next-best approach.

“We see strong predictive performance even in very simplified acquisition of just five videos from key views,” said principal investigator Rohan Khera, MD, MS, CorDS Director. “Such applications to simpler acquisitions could broaden the efficient, expert-level interpretation of PanEcho even to point-of-care ultrasound, especially suited to low-resource settings. The next step is prospective validation in a real-world clinical workflow.”
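As a rough illustration of how such summary figures can be assembled, the Python sketch below computes a median AUC over classification tasks and a median normalized MAE over continuous tasks from per-task predictions. The task names, toy data and normalization choice are assumptions for illustration, not PanEcho’s actual pipeline.

# Minimal sketch (not PanEcho's code): summarize per-task performance as a
# median AUC over classification tasks and a median normalized MAE over
# continuous tasks, using hypothetical placeholder data.
import numpy as np
from sklearn.metrics import roc_auc_score, mean_absolute_error

# Hypothetical outputs per task: {task_name: (y_true, y_pred)}
classification_tasks = {
    "severe_aortic_stenosis": (np.array([0, 1, 0, 1, 1]),
                               np.array([0.1, 0.9, 0.3, 0.8, 0.7])),
    "mod_severe_lv_dysfunction": (np.array([1, 0, 0, 1, 0]),
                                  np.array([0.7, 0.2, 0.4, 0.9, 0.1])),
}
continuous_tasks = {
    "lvef_percent": (np.array([55.0, 40.0, 62.0]), np.array([52.0, 44.0, 60.0])),
    "lv_internal_diameter_mm": (np.array([48.0, 55.0, 42.0]), np.array([50.0, 52.0, 45.0])),
}

aucs = [roc_auc_score(y, p) for y, p in classification_tasks.values()]
print(f"median AUC: {np.median(aucs):.2f}")

# Normalized MAE: MAE divided by the spread of the reference values so tasks
# with different units can be pooled (this normalization is an assumption).
n_maes = [mean_absolute_error(y, p) / (y.max() - y.min())
          for y, p in continuous_tasks.values()]
print(f"median normalized MAE: {np.median(n_maes):.2f}")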

RkJQdWJsaXNoZXIy MjI2NjI=