Abstract
The purpose of this study was to determine the diagnostic accuracy of an iPhone for evaluation of the coronary arteries on coronary CT angiography (CTA) in comparison to a standard clinical workstation. Fifty coronary CTA exams were selected to include a range of normal and abnormal cases, including both coronary artery disease (CAD) of varying severity and coronary artery anomalies. Two cardiac radiologists reviewed each exam on a standard clinical workstation initially and then on an iPhone 6 Plus after a washout period. Coronary stenosis was evaluated on a 4-point scale, and the presence of coronary anomalies was recorded. Two additional cardiac radiologists reviewed all cases in consensus on the standard workstation, and these results were used as the reference standard. When reader results were compared to the reference standard, there was no significant difference in agreement for per-vessel stenosis scores using either the iPhone or the standard clinical workstation. The intraobserver intertechnology agreement on a per-vessel basis for obstructive CAD was 97.4% (299/307, kappa = 0.777) and 97.5% (317/325, kappa = 0.804) for the two readers, respectively. All cases of coronary anomalies were identified by both readers regardless of the device used. Coronary CTA examinations can be interpreted on a smartphone with diagnostic accuracy comparable to a standard workstation. A 3D visualization app on the iPhone may facilitate urgent coronary CTA review when a workstation is not available.
Keywords: Coronary CT angiography, iPhone, Smartphone, Coronary artery disease, Mobile
Introduction
Coronary artery disease (CAD) is among the leading causes of mortality in the USA, responsible for about one in seven deaths [1]. Over seven million individuals presented to the ED with chest pain in 2011 [2]. Coronary CTA provides rapid and cost-effective evaluation of CAD in such patients [3]. The high negative predictive value of coronary CTA ensures the safe discharge of patients and obviates the need for invasive coronary angiography [4]. In a busy ED environment with high patient turnover, accurate and prompt review of coronary CTA exams is necessary.
It is currently recommended that coronary CTA exams be interpreted on a dedicated workstation with 3D post-processing capabilities [5]. As smartphone devices become increasingly ubiquitous and more powerful, they may potentially become an alternative way to view diagnostic imaging. Modern smartphones are internet-capable with the ability to access hospital data (including radiology exams) through cloud computing. The iPhone is one of the most popular smartphones currently available.
Interpretation of medical imaging on mobile devices requires high-fidelity image display. A study on remote reading of pathology slides showed a strong correlation between image quality and diagnostic confidence [6]. In radiology, images should be reviewed for diagnosis in the same format in which they are stored. Unlike plain film and mammography data, CT images with lossless compression (512 × 512 pixel matrix) can be fully displayed on the iPhone without the use of zoom. Simultaneous display of real-time clinical and dynamic radiologic images on a mobile device is also possible [7].
Prior studies have examined the use of the iPhone in the diagnosis of aortic injury and acute appendicitis [8, 9]. Tablet computers with larger screens than the iPhone have been evaluated for the diagnosis of pulmonary embolism and for the evaluation of brain CT and spinal MRI [10–12]. We recently demonstrated that the diagnostic accuracy of coronary CTA is comparable between the iPad and a standard workstation [13]. It is not yet established whether the iPhone is sufficient for reviewing coronary CTA exams. In this study, we test the hypothesis that the iPhone is comparable to a standard workstation with regard to diagnostic performance for coronary artery disease and coronary anomalies.
Materials and Methods
This retrospective study was approved by the hospital’s institutional review board. The study cohort consisted of 50 patients with coronary CTA exams performed in the ED, selected to represent a range of CAD severity and coronary anomalies. We used the same coronary CTA cases and acquisition parameters as those in our previous study [13]. A radiology technologist anonymized and then imported the test cases from the hospital PACS into the syngo.via server (Siemens Medical Solutions). Patient cases were anonymized in compliance with the Health Insurance Portability and Accountability Act.
Coronary CTA Analysis
Two cardiac-trained imagers (C.T.L. and S.L.Z. with 3 and 5 years of experience, respectively) participated in the multi-device case reviews as test readers. Reference diagnoses were determined on the standard workstation by two experienced cardiac imagers (L.C.C. and E.K.F.) in consensus, serving as the reference readers. Initial case interpretations were performed on the Picture Archiving and Communication System (Ultravisual, Emageon, Inc.) workstation, which allowed for interactive double-oblique reformatting and creation of maximum intensity projection (MIP) images. Curved multiplanar reconstruction (MPR) sequences were generated using post-processing software by CT technologists on a dedicated workstation (syngo.via, Siemens Healthcare) and separately exported to the PACS server as per clinical routine. No patient health information or clinical history was provided to reviewers. A 4-point Likert scale was used in the grading of CAD severity: 0 for no stenosis, 1 for <50% stenosis, 2 for 50–69% stenosis, and 3 for ≥70% stenosis (Fig. 1). Nonobstructive CAD was defined as stenosis <50% and obstructive CAD was defined as stenosis ≥50%. A stenosis score was provided for each of the following coronary vessels: right coronary artery (RCA), posterior descending artery (PDA), left main (LM), left anterior descending (LAD), diagonal branches, left circumflex (LCx), and obtuse marginal (OM) branches. The highest stenosis score for each patient was recorded. Presence or absence of a coronary anomaly or aneurysm was determined. A washout period of at least 3 months elapsed between the reviews on the workstation and the smartphone.
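The grading scheme above maps directly to a small scoring function. The following Python sketch illustrates the 4-point scale and the obstructive/nonobstructive dichotomization; the function names are ours for illustration, not part of the study software.

```python
def stenosis_score(percent_stenosis: float) -> int:
    """Map percent stenosis to the study's 4-point Likert scale."""
    if percent_stenosis <= 0:
        return 0  # no stenosis
    elif percent_stenosis < 50:
        return 1  # mild, nonobstructive CAD (<50%)
    elif percent_stenosis < 70:
        return 2  # moderate stenosis (50-69%)
    else:
        return 3  # severe stenosis (>=70%)

def is_obstructive(percent_stenosis: float) -> bool:
    """Obstructive CAD was defined as stenosis >= 50%."""
    return percent_stenosis >= 50

# Example: a 60% stenosis scores 2 on the Likert scale and counts as obstructive
assert stenosis_score(60) == 2 and is_obstructive(60)
```

Per patient, the highest per-vessel score was recorded, so a patient's overall classification is simply the maximum of the vessel scores.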
Mobile interpretations were performed on the iPhone 6 Plus (Apple Inc., USA), one of the most widely used smartphones in the USA. This model features a 5.5-in. display (diagonal viewing area) with a spatial resolution of 1920 × 1080 pixels at 401 pixels per inch. Maximal brightness of the device is 500 cd/m2. Neither reader had previous experience interpreting radiological exams on the iPhone. A standard wireless local area network (802.11n) with encryption was used. One reader performed the readings in a typical well-lit home office using residential Wi-Fi, while the other reader reviewed the cases in a radiology reading room with minimized ambient light. Data transmission over a cellular network was not tested.
Full DICOM datasets for coronary CTA exams were displayed using the Siemens syngo.via WebViewer app (Siemens Medical Solutions). Access into the app required a hospital-assigned username and password. Thin-section (0.75 mm) axial plane images and reconstructions in multiple cardiac phases were immediately available to view. Post-processing capabilities of the app included oblique MPR, MIP, and volume rendering technique (VRT). Curved MPR sequences created by CT technologists as part of clinical routine were exported to the server, as the mobile app lacked the capability to create these sequences on-the-fly.
Statistical Analysis
Data were collected and stored in a spreadsheet using Microsoft Excel (Microsoft, Redmond, USA). Statistical analysis was performed using Stata (version 13.1, StataCorp, College Station, TX). The frequency of concordant readings on a per-vessel basis between test and reference readers was reported in percentages. The agreement between readers (interobserver agreement) and between devices (intertechnology agreement) was measured using the Cohen's kappa statistic. Strength of agreement was defined for the following κ values: ≤0.20 poor agreement, 0.21–0.40 fair agreement, 0.41–0.60 moderate agreement, 0.61–0.80 good agreement, and 0.81–1.00 very good agreement. We performed multivariable logistic regression analysis using device and reader as independent variables to determine their influence on obstructive CAD diagnosis (dependent variable). A p value <0.05 was considered statistically significant.
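As a minimal sketch of how the Cohen's kappa statistic used in this analysis is computed (chance-corrected agreement between two sets of ratings), the Python below implements the standard unweighted kappa. The rating vectors are hypothetical toy data, not study results.

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Unweighted Cohen's kappa: (p_o - p_e) / (1 - p_e)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of identical ratings
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from the raters' marginal frequencies
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical per-vessel stenosis scores (0-3) from two reading sessions
workstation = [0, 1, 2, 3, 0, 1, 2, 0, 1, 3]
iphone      = [0, 1, 2, 3, 0, 1, 1, 0, 1, 3]
kappa = cohen_kappa(workstation, iphone)  # ~0.86, "very good" on the scale above
```

Note that kappa can be substantially lower than raw percent agreement when one category dominates, which is why both measures are reported in the Results.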
Results
Reference results for the 50 cases are given in Table 1. The prevalence of CAD in the study cohort was 56%. Out of 28 patients with CAD, 15 had at most mild stenosis, 7 had at most moderate stenosis, and 6 had severe stenosis. Six patients had anomalous RCA with interarterial course and one patient had multiple coronary artery aneurysms.
Table 1.

| Diagnosis | Coronary CTA cases (n = 50) |
|---|---|
| Highest stenosis score | |
| 0 = No CAD | 22 |
| 1 = Mild CAD | 15 |
| 2 = Moderate CAD | 7 |
| 3 = Severe CAD | 6 |
| Coronary anomaly | 7 |
| Coronary aneurysm | 1 |
The frequency of concordant scores between the multi-device and reference readers is listed in Table 2. The coronary vessel with the lowest agreement rates was the LAD, at 72–80%. Both test readers achieved 100% accuracy in diagnosing coronary anomalies and aneurysms.
Table 2.
| Coronary segment | Reader 1, iPhone (%) | Reader 1, workstation (%) | Reader 2, iPhone (%) | Reader 2, workstation (%) |
|---|---|---|---|---|
| Left main | 96.0 | 90.0 | 92.0 | 96.0 |
| Left anterior descending | 80.0 | 80.0 | 76.0 | 72.0 |
| Diagonal branches | 95.3 | 95.2 | 90.5 | 85.4 |
| Circumflex | 92.0 | 92.0 | 90.0 | 88.0 |
| Obtuse marginal branches | 95.0 | 100.0 | 100.0 | 91.9 |
| Right coronary artery | 89.6 | 95.8 | 89.6 | 83.3 |
| Posterior descending artery | 91.7 | 86.4 | 85.7 | 83.3 |
| Maximal per-patient stenosis severity | 76.0 | 82.0 | 76.0 | 70.0 |
| Coronary anomalies | 100.0 | 100.0 | 100.0 | 100.0 |
Intraobserver intertechnology (workstation versus iPhone) and interobserver (reference versus workstation/iPhone) kappa values evaluating agreement in CAD scoring (4-point scale) are listed in Table 3. Intertechnology agreement was good to very good, with kappa values of 0.777 and 0.804.
Table 3.
| Comparison | Reader 1, κ | Reader 1, p value | Reader 2, κ | Reader 2, p value |
|---|---|---|---|---|
| Intraobserver: workstation versus iPhone | 0.777 | <0.01 | 0.804 | <0.01 |
| Test reader versus reference: workstation | 0.651 | <0.01 | 0.765 | <0.01 |
| Test reader versus reference: iPhone | 0.717 | <0.01 | 0.758 | <0.01 |
Device and reader were entered into a multivariable logistic regression model, which showed no statistically significant effect of either variable on the diagnosis of obstructive CAD relative to the reference scores. Therefore, the choice of device was not associated with a difference in diagnostic accuracy for CAD.
On a per-artery level, intertechnology agreement for obstructive CAD was high for both readers, at 97.4% (299/307) and 97.5% (317/325), respectively. The two test readers had a combined 16 arteries with discrepant readings: 7 LAD arteries, 2 each of the LM, LCx, diagonal, and OM arteries, and 1 RCA.
Discussion
Our results showed similarly high levels of agreement for CAD scores between our test readers and reference readers, independent of whether an iPhone or standard workstation was used. Intraobserver intertechnology agreement was higher for each test reader than their corresponding interobserver agreement with the reference reader. Intraobserver and interobserver agreement values were similar to those observed by Nicol et al. in a study examining 64-slice MDCT [14].
Each test reader had 8 discrepancies on a per-vessel basis, corresponding to discrepancy rates of 2.6% and 2.5%, respectively. Some of the discrepancies may potentially be attributed to the "satisfaction of search" phenomenon, which occurred in this study when the detection of a vessel with obstructive CAD led to the failure to detect obstructive disease in other vessels. For example, despite the 8 discrepant vessel readings for reader 1, only 1 patient had a different reading in the overall presence of obstructive CAD (no obstructive CAD on the workstation and obstructive CAD on the iPhone).
Agreement rates between devices and between scorers were lowest for the LAD, possibly in part due to the higher incidence of LAD disease in our study population. Given the similarly low LAD agreement on the workstation and iPhone, it is likely that these discrepancies are related to the intrinsic subjectivity in stenosis scoring of the LAD. The presence of noneccentric calcified lesions and mixed plaque may also decrease the likelihood of agreement between readers [14].
The prevalence of obstructive CAD in our study was 26% (13 of 50), which was comparable to a prior study (25%) by LaBounty et al. on the diagnostic accuracy of coronary CTA on the iPhone [15]. That study utilized an older device (iPhone 3G) with DICOM display software limited to axial images only, whereas current software can produce real-time MPR and 3D post-processing images. Perhaps due to these differences, intraobserver intertechnology agreement was slightly better in our study (97.4% and kappa = 0.777, 97.5% and kappa = 0.804) compared to LaBounty's study (92% and kappa = 0.75). Similarly, Park et al. showed very good agreement for obstructive CAD (kappa = 0.89) between in-house radiologists using a dedicated workstation and a blinded cardiac radiologist using a smartphone [16].
Using the smartphone as an alternative diagnostic device would grant on-call radiologists greater freedom. They may step away from the workstation to perform other tasks, such as a bedside ultrasound or a fluoroscopic procedure, and remain immediately accessible for imaging consultation. Although the study did not control for lighting conditions, the ability to review cases in a well-lit environment without impairing diagnostic accuracy would further free the radiologist from the dark environment of a typical reading room. After a mobile imaging review, the radiologist should return to the workstation to verify that the mobile interpretation is concordant with the workstation interpretation and to issue a final report in PACS.
Compared to previous iPhone models, the iPhone 6 Plus has the largest screen, with an area of 4.8 in. by 2.7 in. While a smartphone screen is much smaller than a diagnostic imaging monitor, the phone is also held closer to the user, which increases the effective viewing area. On this smaller screen, judicious use of the zoom and pan functions is essential to properly visualize the small coronary arteries. Smartphones with larger screens are on the market and may potentially be more effective in the diagnosis of CAD; however, the tradeoff is that a bulkier device might not be as portable as a smaller one.
Security concerns should be addressed before considering the use of a smartphone for diagnostic imaging review. In our study, no patient health information was stored on the device, and the app required a login to the server whenever the session timeout period elapsed. Screen lock, a standard security feature on smartphones, provides an additional layer of protection. Nonetheless, it would be good practice to close the mobile app and lock the phone once image reviews are completed.
Once an imaging diagnosis is made, the smartphone can also be used to immediately communicate critical findings. We routinely discuss clinically significant radiology results with emergency physicians over the phone. Additionally, representative radiological images can be sent via encrypted email or SMS messaging between two smartphone devices. A pilot study showed excellent diagnostic accuracy when comparing smartphone captures of plain films with the original radiographs [17]. Conceivably, incorporation of screen-sharing functionality could enable a "virtual consultation" in which both devices share control of the PACS display. Virtual consults can provide clinicians timely access to imaging review, particularly when traditional consultation with an imaging specialist is difficult [18].
In this study, there was no difficulty in diagnosing coronary artery anomalies and aneurysms while using the iPhone. The qualitative features of anomalous and aneurysmal coronaries can be readily appreciated on MPR, MIP, and VRT reconstructions (Fig. 2). Transmission of representative CTA images on a mobile device would have great value in communicating with referring clinicians and patients.
Several limitations to our study should be considered. Navigating radiologic studies on any new device presents a learning curve that must be overcome for competency. The user must be proficient at manipulating the imaging views to make an accurate diagnosis. Both iPhone readers in this study had previous experience using the same DICOM viewing app on an iPad; a complete novice to diagnostic imaging applications on the smartphone may therefore yield different results. The cardiac imager's proficiency in cardiac CT may also be important, as both iPhone readers were board-certified radiologists who had reviewed over 1000 cardiac CT scans. We attempted to minimize potential recall bias by incorporating a washout period of at least 3 months between the use of either device. We did not evaluate non-coronary findings that may affect clinical management, such as pulmonary embolism or aortic aneurysm; a prior study showed that aortic emergencies can be accurately diagnosed on a handheld DICOM viewer [8]. For the mobile phone to be an effective extension of the clinical workstation, further validation of these results is necessary.
Conclusion
Coronary CTA examinations can be interpreted on a smartphone with diagnostic accuracy comparable to a standard workstation. A 3D visualization app on the iPhone may facilitate urgent coronary CTA review when a workstation is not available.
CAD, coronary artery disease; CTA, computed tomographic angiography; ED, emergency department; MIP, maximum intensity projection; MPR, multiplanar reconstruction; VRT, volume rendering technique; 3D, 3-dimensional.
Compliance with Ethical Standards
Conflict of Interest
Dr. Elliot K. Fishman discloses the following relationships: Siemens Medical Systems, research grant support, and HIP Graphics, co-founder. The other authors have no competing interests or disclosures to declare.
IRB Statement
This research was approved by the institutional review board.
Contributor Information
Cheng Ting Lin, Phone: 410-614-6170, Email: clin97@jhmi.edu.
Stefan Loy Zimmerman, Phone: 410-340-6649, Email: stefan.zimmerman@jhmi.edu.
Linda C. Chu, Phone: 410-340-6649, Email: lchu1@jhmi.edu
John Eng, Phone: 410-340-6649, Email: jeng@jhmi.edu.
Elliot K. Fishman, Phone: 410-340-6649, Email: efishman@jhmi.edu
References
1. Writing Group Members, Mozaffarian D, Benjamin EJ, et al. Heart disease and stroke statistics-2016 update: a report from the American Heart Association. Circulation. 2016;133:e38–360. doi: 10.1161/CIR.0000000000000350.
2. Centers for Disease Control and Prevention: National Hospital Ambulatory Medical Care Survey: 2011 Emergency Department Summary Tables. 2011.
3. Ladapo JA, Jaffer FA, Hoffmann U, et al. Clinical outcomes and cost-effectiveness of coronary computed tomography angiography in the evaluation of patients with chest pain. Journal of the American College of Cardiology. 2009;54:2409–2422. doi: 10.1016/j.jacc.2009.10.012.
4. Hamon M, Lepage O, Malagutti P, et al. Diagnostic performance of 16- and 64-section spiral CT for coronary artery bypass graft assessment: meta-analysis. Radiology. 2008;247:679–686. doi: 10.1148/radiol.2473071132.
5. Leipsic J, Abbara S, Achenbach S, et al. SCCT guidelines for the interpretation and reporting of coronary CT angiography: a report of the Society of Cardiovascular Computed Tomography Guidelines Committee. Journal of Cardiovascular Computed Tomography. 2014;8:342–358. doi: 10.1016/j.jcct.2014.07.003.
6. Fontelo P, Liu F, Yagi Y. Evaluation of a smartphone for telepathology: lessons learned. J Pathol Inform. 2015;6:35. doi: 10.4103/2153-3539.158912.
7. Lee Y, Kim C, Choi HJ, Kang B, Oh J, Lim TH. A feasibility study of telementoring for identifying the appendix using smartphone-based telesonography. Journal of Digital Imaging. 2017;30:148–155. doi: 10.1007/s10278-016-9921-x.
8. Choudhri AF, Norton PT, Carr TM 3rd, Stone JR, Hagspiel KD, Dake MD. Diagnosis and treatment planning of acute aortic emergencies using a handheld DICOM viewer. Emergency Radiology. 2013;20:267–272. doi: 10.1007/s10140-013-1118-8.
9. Choudhri AF, Carr TM 3rd, Ho CP, Stone JR, Gay SB, Lambert DL. Handheld device review of abdominal CT for the evaluation of acute appendicitis. Journal of Digital Imaging. 2012;25:492–496. doi: 10.1007/s10278-011-9431-9.
10. Johnson PT, Zimmerman SL, Heath D, et al. The iPad as a mobile device for CT display and interpretation: diagnostic accuracy for identification of pulmonary embolism. Emergency Radiology. 2012;19:323–327. doi: 10.1007/s10140-012-1037-0.
11. McNulty JP, Ryan JT, Evanoff MG, Rainford LA. Flexible image evaluation: iPad versus secondary-class monitors for review of MR spinal emergency cases, a comparative study. Academic Radiology. 2012;19:1023–1028. doi: 10.1016/j.acra.2012.02.021.
12. Mc Laughlin P, Neill SO, Fanning N, et al. Emergency CT brain: preliminary interpretation with a tablet device: image quality and diagnostic performance of the Apple iPad. Emergency Radiology. 2012;19:127–133. doi: 10.1007/s10140-011-1011-2.
13. Zimmerman SL, Lin CT, Chu LC, Eng J, Fishman EK. Remote reading of coronary CTA exams using a tablet computer: utility for stenosis assessment and identification of coronary anomalies. Emergency Radiology. 2016;23:255–261. doi: 10.1007/s10140-016-1399-9.
14. Nicol ED, Stirrup J, Roughton M, Padley SP, Rubens MB. 64-channel cardiac computed tomography: intraobserver and interobserver variability (part 1): coronary angiography. Journal of Computer Assisted Tomography. 2009;33:161–168. doi: 10.1097/RCT.0b013e31817c423e.
15. LaBounty TM, Kim RJ, Lin FY, Budoff MJ, Weinsaft JW, Min JK. Diagnostic accuracy of coronary computed tomography angiography as interpreted on a mobile handheld phone device. JACC Cardiovascular Imaging. 2010;3:482–490. doi: 10.1016/j.jcmg.2009.11.018.
16. Park JH, Kim YK, Kim B, et al. Diagnostic performance of smartphone reading of the coronary CT angiography in patients with acute chest pain at ED. Am J Emerg Med. 2016;34:1794–1798. doi: 10.1016/j.ajem.2016.06.009.
17. Licurse MY, Kim SH, Kim W, Ruutiainen AT, Cook TS. Comparison of diagnostic accuracy of plain film radiographs between original film and smartphone capture: a pilot study. Journal of Digital Imaging. 2015;28:646–653. doi: 10.1007/s10278-015-9783-7.
18. Rosenkrantz AB, Sherwin J, Prithiani CP, Ostrow D, Recht MP. Technology-assisted virtual consultation for medical imaging. Journal of the American College of Radiology. 2016;13:995–1002. doi: 10.1016/j.jacr.2016.02.029.