Fully Automated Echocardiogram Interpretation in Clinical Practice

Jeffrey Zhang et al. Circulation. 2018 Oct 16;138(16):1623-1635. doi: 10.1161/CIRCULATIONAHA.118.034338.

Abstract

Background: Automated cardiac image interpretation has the potential to transform clinical practice in multiple ways, including enabling serial assessment of cardiac function by nonexperts in primary care and rural settings. We hypothesized that advances in computer vision could enable building a fully automated, scalable analysis pipeline for echocardiogram interpretation, including (1) view identification, (2) image segmentation, (3) quantification of structure and function, and (4) disease detection.

Methods: Using 14 035 echocardiograms spanning a 10-year period, we trained and evaluated convolutional neural network models for multiple tasks, including automated identification of 23 viewpoints and segmentation of cardiac chambers across 5 common views. The segmentation output was used to quantify chamber volumes and left ventricular mass, determine ejection fraction, and facilitate automated determination of longitudinal strain through speckle tracking. Results were evaluated through comparison to manual segmentation and measurements from 8666 echocardiograms obtained during the routine clinical workflow. Finally, we developed models to detect 3 diseases: hypertrophic cardiomyopathy, cardiac amyloid, and pulmonary arterial hypertension.
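
A minimal sketch of the kind of convolutional view classifier this step describes is shown below, assuming a TensorFlow/Keras environment; the input resolution, layer sizes, and names are illustrative assumptions and do not reproduce the network trained in the study.

```python
# Sketch of a CNN view classifier for echocardiogram still frames.
# Layer choices, input size (assumed 128x128 grayscale), and names are
# illustrative assumptions, not the architecture used in the study.
from tensorflow.keras import layers, models

NUM_VIEWS = 23  # the study distinguishes 23 echocardiographic viewpoints

def build_view_classifier(input_shape=(128, 128, 1), num_views=NUM_VIEWS):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_views, activation="softmax"),  # one probability per view
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Training would then call model.fit(frames, view_labels, ...) on labeled still frames.
```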

Results: Convolutional neural networks accurately identified views (eg, 96% for parasternal long axis), including flagging partially obscured cardiac chambers, and enabled the segmentation of individual cardiac chambers. The resulting cardiac structure measurements agreed with study report values (eg, median absolute deviations of 15% to 17% of observed values for left ventricular mass, left ventricular diastolic volume, and left atrial volume). In terms of function, we computed automated ejection fraction and longitudinal strain measurements (within 2 cohorts), which agreed with commercial software-derived values (for ejection fraction, median absolute deviation=9.7% of observed, N=6407 studies; for strain, median absolute deviation=7.5%, n=419, and 9.0%, n=110) and demonstrated applicability to serial monitoring of patients with breast cancer for trastuzumab cardiotoxicity. Overall, we found automated measurements to be comparable or superior to manual measurements across 11 internal consistency metrics (eg, the correlation of left atrial and ventricular volumes). Finally, we trained convolutional neural networks to detect hypertrophic cardiomyopathy, cardiac amyloidosis, and pulmonary arterial hypertension with C statistics of 0.93, 0.87, and 0.85, respectively.
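
One plausible reading of the agreement metric quoted here (the median absolute deviation expressed as a percentage of the observed value) is easy to write down; the numpy sketch below uses placeholder arrays and an assumed helper name, and is not the study's analysis code.

```python
# Median absolute deviation between automated and manual measurements,
# expressed as a percentage of the manual (observed) values.
# This is one plausible reading of the reported metric; arrays are placeholders.
import numpy as np

def median_absolute_deviation_pct(automated, manual):
    automated = np.asarray(automated, dtype=float)
    manual = np.asarray(manual, dtype=float)
    relative_error = np.abs(automated - manual) / np.abs(manual)
    return 100.0 * np.median(relative_error)

# Placeholder example: three paired LV mass measurements (grams)
print(median_absolute_deviation_pct([150.0, 180.0, 210.0], [140.0, 200.0, 190.0]))
```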

Conclusions: Our pipeline lays the groundwork for using automated interpretation to support serial patient tracking and scalable analysis of millions of echocardiograms archived within healthcare systems.

Keywords: diagnosis; echocardiography; machine learning.

Figures

Figure 1.
Workflow for fully automated echocardiogram interpretation. The number of echocardiograms used for each step is indicated. Only a subset of these had measurements for cardiac structure or function, and far fewer had measurements for longitudinal strain. For disease detection, the slash separates the number of studies of cases and controls, respectively, used to train the model. *For image segmentation, this number represents how many manually traced still images were used for training. Echo indicates echocardiogram; HCM, hypertrophic cardiomyopathy; and PAH, pulmonary arterial hypertension.
Figure 2.
Convolutional neural networks successfully discriminate echocardiographic views. A, t-Distributed Stochastic Neighbor Embedding (t-SNE) visualization of view classification. t-SNE is an algorithm used to visualize high-dimensional data in lower dimensions. It depicts the successful grouping of test images corresponding to 23 different echocardiographic views. Echocardiographic still images indicate the distinct clustering of images of A4c views without occlusions and those with occlusion of the left atrium. B, Confusion matrix demonstrating successful and unsuccessful view classifications within the test data set. Numbers along the diagonal represent successful classifications, whereas off-diagonal entries are misclassifications. A2c indicates apical 2-chamber; A3c, apical 3-chamber; A4c, apical 4-chamber; echo, echocardiogram; LV, left ventricular; and PLAX, parasternal long axis.
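
The two visualizations this caption describes (a t-SNE embedding of image features and a confusion matrix of predicted versus true views) can be sketched with scikit-learn; the feature matrix, labels, and predictions below are synthetic placeholders rather than study data.

```python
# Sketch of the Figure 2 visualizations: t-SNE of image features (A) and a
# 23x23 confusion matrix of predicted vs. true views (B). All inputs are
# synthetic placeholders standing in for CNN features and classifier output.
import numpy as np
from sklearn.manifold import TSNE
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 64))          # placeholder image feature vectors
y_true = rng.integers(0, 23, size=500)         # true view labels (23 views)
y_pred = y_true.copy()                         # placeholder predictions

embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)
cm = confusion_matrix(y_true, y_pred, labels=np.arange(23))
print(embedding.shape, cm.shape)               # (500, 2) (23, 23)
```
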
Figure 3.
Convolutional neural networks successfully segment cardiac chambers. We used the U-net algorithm to derive segmentation models for 5 views: A2c, A3c, A4c (left: top, middle, and bottom, respectively), parasternal short axis at the level of the papillary muscle (right, middle), and PLAX (right, bottom). For each view, the trio of images, from left to right, corresponds to the original image, the manually traced image used in training (ground truth), and the automated segmented image (determined as part of the cross-validation process). A2c indicates apical 2-chamber; A3c, apical 3-chamber; A4c, apical 4-chamber; CNN, convolutional neural network; and PLAX, parasternal long axis.
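
A compact sketch of a U-Net-style encoder/decoder for this kind of per-pixel chamber segmentation is shown below, assuming TensorFlow/Keras; the depth, filter counts, and input size are illustrative assumptions rather than the configuration used in the study.

```python
# Compact U-Net-style model: a downsampling encoder, an upsampling decoder,
# and skip connections between matching resolutions. Hyperparameters are
# illustrative assumptions, not the study's configuration.
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x

def build_unet(input_shape=(256, 256, 1), num_classes=2):
    inputs = layers.Input(shape=input_shape)

    # Encoder: convolve, then downsample, keeping feature maps for skip connections
    c1 = conv_block(inputs, 32); p1 = layers.MaxPooling2D()(c1)
    c2 = conv_block(p1, 64);     p2 = layers.MaxPooling2D()(c2)
    c3 = conv_block(p2, 128)

    # Decoder: upsample and concatenate the matching encoder features
    u2 = layers.Concatenate()([layers.UpSampling2D()(c3), c2]); c4 = conv_block(u2, 64)
    u1 = layers.Concatenate()([layers.UpSampling2D()(c4), c1]); c5 = conv_block(u1, 32)

    # Per-pixel class probabilities (e.g., chamber vs. background)
    outputs = layers.Conv2D(num_classes, 1, activation="softmax")(c5)
    return Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```
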
Figure 4.
Automated segmentation results in accurate cardiac structure measurements in real-world conditions. A and B, Bland-Altman plots comparing automated and manual (derived during standard clinical workflow) measurements for indexed left ventricular end-diastolic volume (LVEDV) from 8457 echocardiograms (A) and left atrial volume from 4800 studies (B). Orange, red, and blue dashed lines delineate the central 50%, 75%, and 95% of patients as judged by differences between automated and manual measurements. The solid gray line indicates the median. C and D, Automated measurements reveal a difference in left atrial volumes between patients with HCM and matched controls (C) and a difference in left ventricular mass between patients with cardiac amyloidosis and matched controls (D). HCM indicates hypertrophic cardiomyopathy.
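
A Bland-Altman comparison of this kind can be sketched with numpy and matplotlib; the paired measurements below are synthetic placeholders, and the percentile bands mirror the 50%, 75%, and 95% delimiting lines described in the caption.

```python
# Sketch of a Bland-Altman plot: difference between paired automated and manual
# measurements against their mean, with a median line and percentile bands.
# The data here are synthetic placeholders, not study measurements.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
manual = rng.normal(60, 15, size=500)            # placeholder manual values
automated = manual + rng.normal(0, 6, size=500)  # placeholder automated values

mean_pair = (automated + manual) / 2.0
diff = automated - manual

fig, ax = plt.subplots()
ax.scatter(mean_pair, diff, s=5, alpha=0.4)
ax.axhline(np.median(diff), color="gray")        # median difference (solid line)
# Central 50%, 75%, and 95% of differences, as in the figure's dashed lines
for lo, hi, color in [(25, 75, "orange"), (12.5, 87.5, "red"), (2.5, 97.5, "blue")]:
    for q in (lo, hi):
        ax.axhline(np.percentile(diff, q), color=color, linestyle="--")
ax.set_xlabel("Mean of automated and manual measurements")
ax.set_ylabel("Automated minus manual")
plt.show()
```
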
Figure 5.
An automated computer vision pipeline accurately assesses cardiac function. A and B, Bland-Altman plots comparing automated and manual ejection fraction estimates for 6417 individual echocardiograms (A) and automated and manual global longitudinal strain (GLS) for 418 echocardiograms (B). Delimiting lines are as in Figure 4A.
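
The ejection fraction itself reduces to a ratio of end-diastolic and end-systolic volumes, EF = (EDV - ESV) / EDV; the sketch below uses placeholder volumes and leaves aside how the pipeline converts 2D segmentations into volume estimates.

```python
# Ejection fraction from end-diastolic and end-systolic volumes.
# The volumes here are placeholders; deriving them from 2D segmentations
# (as the pipeline does) is not reproduced in this sketch.
def ejection_fraction(edv_ml, esv_ml):
    """Return ejection fraction (%) from EDV and ESV in milliliters."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

print(ejection_fraction(edv_ml=120.0, esv_ml=50.0))  # about 58.3%
```
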
Figure 6.
Automated strain measurements enable quantitative trajectories for patients with breast cancer treated with cardiotoxic chemotherapies. Automated strain values were computed for 9421 apical videos from 152 patients with breast cancer undergoing serial echocardiographic monitoring during chemotherapy, and individual plots were generated for each patient. A, A 58-year-old woman who received trastuzumab therapy only. Each colored dot represents an individual echocardiogram, and a smoothing spline was fit to the data. Ejection fractions from the published echocardiographic reports are shown. Vertical blue dashed lines represent initiation and cessation of trastuzumab therapy; a horizontal dashed line at a longitudinal strain of 16% indicates a commonly used threshold for abnormal strain. B, Automated strain measurements confirm the more severe toxicity that occurs when combining trastuzumab/pertuzumab with anthracyclines. Violin plot showing median longitudinal strain values for patients pretreated (red) or not pretreated (blue) with neoadjuvant doxorubicin/cyclophosphamide before therapy with trastuzumab (or pertuzumab).
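
The per-patient trajectory in panel A (a smoothing spline fit to longitudinal strain over time, with a reference line at 16%) can be sketched with scipy and matplotlib; the dates and strain values below are synthetic placeholders, and the smoothing factor is an assumption.

```python
# Sketch of a single-patient strain trajectory: strain values over time with a
# smoothing spline and a dashed reference line at 16%. Data are synthetic.
import numpy as np
import matplotlib.pyplot as plt
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)
days = np.sort(rng.uniform(0, 720, size=40))          # placeholder study dates
strain = 18 - 3 * np.exp(-((days - 300) / 150) ** 2) + rng.normal(0, 0.8, 40)

spline = UnivariateSpline(days, strain, s=len(days))  # smoothing factor assumed

fig, ax = plt.subplots()
ax.scatter(days, strain, s=10)                        # one dot per echocardiogram
ax.plot(days, spline(days), color="black")            # smoothing spline fit
ax.axhline(16, linestyle="--", color="gray")          # abnormal-strain threshold
ax.set_xlabel("Days from first echocardiogram")
ax.set_ylabel("Longitudinal strain (%)")
plt.show()
```
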
Figure 7.
CNNs enable detection of abnormal myocardial diseases. A through C, Receiver operating characteristic curves for hypertrophic cardiomyopathy (A), cardiac amyloid (B), and pulmonary arterial hypertension (C) detection. D, Relationship of probability of amyloid with left ventricular mass. Blue line indicates linear regression fit, with 95% CI indicated by gray shaded area. AUROC indicates area under the receiver operating characteristic curve; CNN, convolutional neural network; HCM, hypertrophic cardiomyopathy; and PAH, pulmonary arterial hypertension.
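
The evaluation summarized in panels A through C (receiver operating characteristic curves and their C statistics) can be sketched with scikit-learn; the labels and scores below are synthetic placeholders standing in for disease status and model probabilities.

```python
# Sketch of a disease-detection evaluation: ROC curve points and the C statistic
# (area under the ROC curve). Labels and scores are synthetic placeholders.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(4)
y_true = rng.integers(0, 2, size=400)         # 1 = disease (e.g., HCM), 0 = control
y_score = np.clip(0.4 * y_true + rng.normal(0.3, 0.2, size=400), 0, 1)  # placeholder probabilities

fpr, tpr, thresholds = roc_curve(y_true, y_score)
print("C statistic (AUROC):", roc_auc_score(y_true, y_score))
```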
