Abstract
Malaria remains a major burden on global health, causing about half a million deaths every year. The objective of this work is to develop a fast, automated, smartphone-supported malaria diagnostic system. Our proposed system is the first to use both image processing and deep learning methods on a smartphone to detect malaria parasites in thick blood smears. The underlying detection algorithm is based on an iterative method for parasite candidate screening and a convolutional neural network (CNN) model for feature extraction and classification. The system runs on Android phones and can process blood smear images taken by the smartphone camera when attached to the eyepiece of a microscope. We tested the system on 50 normal patients and 150 abnormal patients. The accuracies of the system at the patch and patient levels are 97% and 78%, respectively; the corresponding AUC values are 98% and 85%. Our system could aid malaria diagnosis in resource-limited regions without depending on extensive diagnostic expertise or expensive diagnostic equipment.
1 Introduction
Malaria remains a major burden on global health, causing millions of deaths every year in more than 90 countries and territories. According to the World Health Organization's (WHO) malaria report in 2018, about 219 million malaria cases were detected worldwide in 2017, causing approximately 435,000 deaths [1]. Malaria is caused by Plasmodium parasites that are transmitted through the bites of infected female Anopheles mosquitoes. An estimated 9 out of 10 malaria deaths occur in sub-Saharan Africa; most deaths occur among children, with a child dying from the disease almost every minute [1]. Microscopy examination of stained thick and thin blood smears is currently considered the gold standard for malaria diagnosis [2, 3]. Thick blood smears are used to detect the presence of malaria parasites in a drop of blood, whereas thin blood smears allow differentiating parasite species and development stages. Microscopy examination is low-cost and widely available, but it is time-consuming. Moreover, the effectiveness of microscopy diagnosis depends on the parasitologist's expertise [4]. In situations with poor quality control, inaccurate results can lead to misdiagnosis or inappropriate treatment [4]. Thus, a fast and efficient automated diagnosis system is essential to malaria control.
The development of small camera-equipped microscopic devices, such as smartphones, has offered a new way for malaria diagnosis in resource-poor areas, using image processing and machine learning techniques [5]. Previous work has focused on the design and development of mobile devices for capturing images to replace current microscopes [6,7,8,9,10,11,12], also in combination with image processing [13,14,15,16,17]. However, so far, most of the work has concentrated on thin blood smears, and only the system in [16] was developed for parasite detection in thick blood smears.
In this paper, we propose a fast, low-cost, automated system for diagnosing malaria in thick smears. Our system is the first that can process thick blood smears on smartphones using image processing and deep learning methods. We implemented the system as a smartphone application (app), which runs on Android phones and can detect parasites in a thick blood smear image within ten seconds. Our system aims to aid the clinical diagnosis of malaria in resource-limited areas by addressing pending issues such as accessibility, cost, speed, and accuracy. Compared to the work in [16], we apply deep learning techniques for parasite detection and achieve more accurate results on more patients, including both normal and abnormal patients.
The paper is structured as follows: Sect. 2 describes the image processing and analysis methods for our proposed system; Sect. 3 presents our smartphone tool for automated malaria diagnosis; Sect. 4 shows the experimental results on 200 patients; and Sect. 5 concludes the paper with the discussion and conclusion.
2 Image Processing and Deep Learning Methods
For our automated malaria diagnosis in thick smear images, we split the problem into two sub-problems: white blood cell (WBC) detection and parasite detection. We first detect WBCs and remove them from the image so that they do not distract our subsequent parasite counting method. This also provides the WBC count, which is an essential part of the standard protocol for diagnosing malaria in thick smears. The second stage, parasite detection, consists of a screening step using image-processing methods and a classification step using deep learning methods. Figure 1 shows the flowchart of our automated malaria diagnosis system.
2.1 WBC Detection
Based on a histogram analysis of thick blood smears, we assume that both the nuclei of parasites and WBCs have lower intensities than the background due to their staining. To avoid confusing WBCs with parasites, we first filter out WBCs before performing parasite candidate screening. For WBC detection, we first convert a thick smear RGB image into a grayscale image. Then, we threshold the grayscale image using Otsu’s method [18]. After this, we apply morphological operations to separate out WBCs. We can count potentially touching WBCs as separate cells by considering the typical expected size of a white blood cell. Before we screen for parasites in the next stage, we set all pixels of detected WBCs to zero.
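The Otsu thresholding step at the core of WBC detection can be sketched in a few lines. The NumPy implementation below is an illustrative sketch only; the app itself uses the OpenCV4Android SDK, and the synthetic image in the usage note is made up for demonstration:

```python
import numpy as np

def otsu_threshold(gray):
    """Compute the Otsu threshold [18] of a uint8 grayscale image by
    maximizing the between-class variance over all candidate thresholds."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                  # normalized histogram
    omega = np.cumsum(p)                   # class-0 probability up to t
    mu = np.cumsum(p * np.arange(256))     # cumulative mean up to t
    mu_t = mu[-1]                          # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0.0   # degenerate thresholds score zero
    return int(np.argmax(sigma_b))
```

Pixels at or below the returned threshold correspond to the stained dark objects (WBC nuclei and parasite nuclei); the subsequent morphological separation and size-based counting are omitted here.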
2.2 Parasite Detection
Parasite detection in thick blood smear images involves parasite candidate screening and classification. We identify parasite candidates using our proposed Iterative Global Minimum Screening (IGMS) method and perform classification by a customized Convolutional Neural Network (CNN) classifier.
IGMS identifies parasite candidates by localizing the minimum non-zero intensity pixel values in a grayscale image. If only one pixel is localized, a circular region centered at this pixel location, with a pre-defined radius of 22 pixels (the average parasite radius), is cropped from the original RGB image and considered a parasite candidate. If several pixels with the same minimum intensity are localized, a circular candidate region is extracted for each pixel whose distance to at least one of the other pixels is larger than 22 pixels. Once a parasite candidate is selected, the intensity values inside this region of the grayscale image are replaced by zeros to guarantee the convergence of the IGMS method. To further reduce the runtime of parasite candidate screening, the original thick blood smear image is downsampled by a factor of two in each dimension for localizing minimum intensities, while the candidates are always cropped from the original RGB image. The IGMS screening procedure stops when the number of parasite candidates reaches a given number. In our experiments, we identify 400 parasite candidates for each image to cover the true parasites as much as possible while still providing an acceptable runtime. Using this number, experiments on our dataset of 200 patients show that we can achieve a sensitivity above 97% on both image-level and patient-level. Each parasite candidate is a 44 × 44 × 3 RGB image patch, with pixels outside the circular region set to zero.
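The core IGMS loop can be sketched as follows. This NumPy version is an illustrative simplification: it operates on the full-resolution grayscale image and omits the two-fold downsampling and the cropping of the 44 × 44 × 3 RGB patches described above:

```python
import numpy as np

def igms_candidates(gray, n_candidates=400, radius=22):
    """Iterative Global Minimum Screening (sketch): repeatedly locate the
    darkest remaining pixel, record it as a candidate center, and suppress
    a disc of the average parasite radius around it so the loop converges."""
    work = gray.astype(float).copy()
    work[work == 0] = np.inf            # zeroed pixels (e.g. removed WBCs) are ignored
    h, w = work.shape
    yy, xx = np.mgrid[0:h, 0:w]
    centers = []
    for _ in range(n_candidates):
        if not np.isfinite(work).any():
            break                       # whole image suppressed
        y, x = np.unravel_index(np.argmin(work), work.shape)
        centers.append((int(y), int(x)))
        # suppress the candidate region before the next iteration
        work[(yy - y) ** 2 + (xx - x) ** 2 <= radius ** 2] = np.inf
    return centers
```

Each returned center would then be used to crop a circular candidate patch from the original RGB image for classification.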
Once the parasite candidates are identified, we use a CNN model to classify them either as true parasites or background. In this work, we customize a CNN model consisting of three convolutional layers, three batch normalization layers, three max-pooling layers, two fully-connected layers and a softmax classification layer, as shown in Fig. 2. The batch normalization layers allow a higher learning rate and reduce sensitivity to parameter initialization; each is followed by a rectified linear unit (ReLU) as the activation function.
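Because the exact kernel sizes and channel counts appear only in Fig. 2, the sketch below assumes 3 × 3 kernels and 32/64/128 channels purely to illustrate how the 44 × 44 × 3 input patch shrinks through three conv + max-pooling blocks before the fully-connected layers:

```python
def trace_shapes(input_hw=44, channels=(3, 32, 64, 128), kernel=3, pool=2):
    """Trace tensor shapes through conv+BN+ReLU+max-pool blocks.
    Kernel size and channel counts are illustrative assumptions,
    not the exact values from the paper's Fig. 2."""
    h = input_hw
    shapes = [(h, h, channels[0])]
    for c in channels[1:]:
        h = h - kernel + 1    # 'valid' convolution shrinks each side by kernel-1
        h = h // pool         # max-pooling halves the spatial size (floor)
        shapes.append((h, h, c))
    flat = shapes[-1][0] * shapes[-1][1] * shapes[-1][2]  # flattened FC input
    return shapes, flat
```

Under these assumptions the patch shrinks 44 → 21 → 9 → 3 spatially, giving a 1152-dimensional vector feeding the fully-connected layers.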
Fig. 2. Architecture of the customized CNN model for parasite classification. The numbers below the green dotted line represent the convolutional kernel sizes and the sizes of the max-pooling regions. The hidden layers include three fully-connected layers and two dropout layers with a dropout ratio of 0.5. The output softmax layer computes the probabilities of the input patch being either a parasite or non-parasite. (Color figure online)
3 Smartphone Tool
Based on the image processing algorithms and deep learning methods for WBC and parasite detection, we develop a smartphone-supported automated system to diagnose malaria in thick blood smear images. We implement the system as an Android app. When using this app, the camera of the smartphone is attached to the eyepiece of a microscope, while the user adjusts the microscope to find target fields in the blood smear and takes pictures with the app. The algorithms in the app then process these images directly on the phone. The app records the automatic parasite counts, along with patient and smear metadata, and saves them in a local database on the smartphone, where they can be used to monitor disease severity, drug effectiveness, and other parameters. We implemented an embedded camera function to preview and capture the image seen through the microscope. The user operates the optical zoom of the microscope to bring the image into focus and to enlarge it. The app provides options to adjust the white balance and the image color under different lighting conditions. Once the image is taken, the app presents it to the user for review. When the user accepts the image, the app processes it, counts and records the infected cells and parasites, and displays the results in the user interface. Typically, users take several images until they have acquired enough data to meet the requirements of their protocols, which usually involve counting a minimum number of white blood cells. The app aggregates the parasite counts across all images. We implemented the algorithms for WBC and parasite detection using the OpenCV4Android SDK library.
After the image acquisition and processing stage, the app presents a series of input forms for the user to fill in the information associated with the current patient and smear. This information is saved in the local database of the app, which we built with the SQLite API provided by Android. The app offers a user interface to the database where the user can view the data and images of previous smears, allowing hospital staff to monitor the condition of patients.
Since malaria is a disease that is widespread in different areas around the world, the app aims to support several languages to accommodate users in different countries. With English being the default language, the app currently also supports Thai and simplified Chinese. We are working on adding support for other languages. This app, called NLM Malaria Screener, is available in the Google Play™ store (https://play.google.com/store). Figure 3 shows our smartphone-supported automated malaria diagnosis system and a screenshot with detected parasites in a thick blood smear image.
4 Experimental Setup and Results
We acquired Giemsa-stained thick blood smear images from 150 patients infected with P. falciparum and from 50 normal patients at Chittagong Medical College Hospital, Bangladesh. The images were acquired using an Android smartphone and its camera. They were captured with 100x magnification in RGB color space with a 3024 × 4032 pixel resolution. An expert microscopist manually annotated each image at the Mahidol-Oxford Tropical Medicine Research Unit (MORU), Bangkok, Thailand. We archived all images and their annotations at the U.S. National Library of Medicine (IRB#12972). In this work, we use 2967 thick blood smear images from all 200 patients, including 1819 images from the 150 patients infected with parasites. We evaluate the performance of our automated malaria diagnosis system with five-fold cross evaluation, splitting the dataset into training sets and test sets on patient-level. To achieve a better performance, we use a balanced training set with an equal number of positive and negative patches. We do so by cropping the positive patches from the manually annotated images while generating the negative patches based on IGMS.
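Constructing the balanced training set amounts to subsampling the majority class so that positive and negative patches are equally represented. The helper below is a minimal sketch under our own naming and sampling choices, not the paper's exact procedure (positives would be patches cropped around manual annotations, negatives would be IGMS candidates that match no annotation):

```python
import random

def balanced_training_set(positive_patches, negative_patches, seed=0):
    """Build a balanced, shuffled training set by subsampling the
    majority class to the size of the minority class (sketch)."""
    rng = random.Random(seed)           # fixed seed for reproducibility
    n = min(len(positive_patches), len(negative_patches))
    pos = rng.sample(list(positive_patches), n)
    neg = rng.sample(list(negative_patches), n)
    data = [(p, 1) for p in pos] + [(q, 0) for q in neg]
    rng.shuffle(data)
    return data
```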
4.1 Patch-Level Five-Fold Cross Evaluation
We perform the five-fold training, validation, and testing of our customized CNN classifier on an Intel(R) Xeon(R) CPU E3-1245 (Dual CPU, 3.5 GHz, 16 GB). Table 1 shows the mean performance of our automated malaria diagnosis system across five folds, using a threshold of 0.7 for the CNN classifier, in terms of accuracy, F-score, AUC, sensitivity, specificity, precision, and negative predictive value, which are 96.89%, 81.80%, 98.48%, 90.82%, 97.43%, 74.84% and 99.17%, respectively. The left-hand side of Fig. 4 shows the corresponding ROC curves. Note that the performance on patient-level is generally lower because computing the parasitemia for a patient involves classifying multiple parasite candidate patches.
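All of the reported patch-level metrics derive from the confusion-matrix counts of the CNN classifier at the chosen threshold. A minimal sketch of the definitions (the counts in the test are made-up illustrative numbers, not the paper's results):

```python
def patch_metrics(tp, fp, tn, fn):
    """Standard classification metrics from confusion-matrix counts."""
    acc  = (tp + tn) / (tp + fp + tn + fn)   # accuracy
    sens = tp / (tp + fn)                    # sensitivity (recall)
    spec = tn / (tn + fp)                    # specificity
    prec = tp / (tp + fp)                    # precision (PPV)
    npv  = tn / (tn + fn)                    # negative predictive value
    f1   = 2 * prec * sens / (prec + sens)   # F-score
    return {"accuracy": acc, "f_score": f1, "sensitivity": sens,
            "specificity": spec, "precision": prec, "npv": npv}
```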
4.2 Patient-Level Five-Fold Cross Evaluation
For each run of our patient-level cross evaluation, we train on a set of 90 infected patients and 30 normal patients, validate on a set of 30 infected patients and 10 normal patients, and test on a set of 30 infected patients and 10 normal patients. The image set for each patient contains on average 12 images.
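The split above amounts to partitioning patient IDs, not individual images, into five folds, each holding 30 infected and 10 normal patients; per run, three folds train, one validates, and one tests. A minimal sketch assuming a simple shuffled round-robin assignment (the exact fold-assignment procedure is our own illustration):

```python
import random

def patient_folds(infected_ids, normal_ids, n_folds=5, seed=0):
    """Partition patients (never images) into folds, so no patient
    contributes images to both training and test data (sketch)."""
    rng = random.Random(seed)
    infected = list(infected_ids)
    normal = list(normal_ids)
    rng.shuffle(infected)
    rng.shuffle(normal)
    # round-robin keeps the infected/normal ratio identical in every fold
    return [infected[k::n_folds] + normal[k::n_folds] for k in range(n_folds)]
```

With 150 infected and 50 normal patient IDs, every fold contains 30 infected and 10 normal patients, matching the 90/30/30 and 30/10/10 train/validation/test splits.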
For the five-fold cross evaluation on patient-level, we obtain an average AUC value of 84.90% with a standard deviation of 4.21%. The right-hand side of Fig. 4 shows the ROC curve for the evaluation on patient-level. The average accuracy, precision, sensitivity, and specificity values we obtain on patient-level are 78.00%, 90.42%, 79.33%, and 74.00%, respectively. For a specificity of 80%, the average accuracy, precision, and sensitivity values on patient-level are 77.50%, 91.90%, and 76.67%, respectively.
5 Discussion and Conclusion
We present the first smartphone-based system exploiting deep learning for detecting malaria parasites in thick blood smear images. The idea is to develop a fast, low-cost, automated, smartphone-supported tool to diagnose malaria in resource-limited malaria-prone regions, using image pre-processing and deep learning methods. For five-fold cross evaluation on patch-level and patient-level, we achieve average AUC values of 98.48% and 84.90%, respectively, and average accuracy values of 96.89% and 78.00%, respectively. The input patch size of the CNN model can influence the experimental results. We evaluated the CNN classifier performance using three different patch sizes, 36 × 36, 44 × 44, and 52 × 52, and obtained the best performance with an input size of 44 × 44. When testing on a Samsung Galaxy S6 with an Exynos 7 Octa 7420 Processor and Android 7.0, our system can diagnose a thick blood smear image within ten seconds, demonstrating that powerful deep learning methods for malaria screening can run on a resource-limited mobile platform.
We have also applied object detection networks, such as Faster R-CNN [19] and YOLO [20], to detect parasites in thick blood smears. However, these object detection networks need to be adapted to work well for very small objects like parasites, with an average size of 44 × 44 pixels in an image of 4032 × 3024 pixels; otherwise, they would result in many more false negatives compared to our customized CNN classifier.
In conclusion, we have developed a fast and low-cost diagnostic application for smartphones that can be used in resource-limited regions without the need for specific malaria expertise. Future work will use multi-scale information to improve the classification performance and will test the stability of our app under diverse slide preparation methods and protocols.
References
WHO: World malaria report 2018 (2018)
WHO: Guidelines for the treatment of malaria. 3rd edn. World Health Organization (2015)
Makhija, K.S., Maloney, S., Norton, R.: The utility of serial blood film testing for the diagnosis of malaria. Pathology 47(1), 68–70 (2015)
WHO: Malaria microscopy quality assurance manual. World Health Organization (2016)
Poostchi, M., Silamut, K., Maude, R.J., Jaeger, S., Thoma, G.: Image analysis and machine learning for detecting malaria. Transl. Res. 194, 36–55 (2018)
Breslauer, D.N., Maamari, R.N., Switz, N.A., Lam, W.A., Fletcher, D.A.: Mobile phone based clinical microscopy for global health applications. PLoS ONE 4(7), 1–7 (2009)
Tuijn, C.J., Li, J.: Data and image transfer using mobile phones to strengthen microscopy-based diagnostic services in low and middle income country laboratories. PLoS ONE 6(12), e28348 (2011)
Skandarajah, A., Reber, C.D., Switz, N.A., Fletcher, D.A.: Quantitative imaging with a mobile phone microscope. PLoS ONE 9(5), e96906 (2014)
Pirnstill, C.W., Coté, G.L.: Malaria diagnosis using a mobile phone polarized microscope. Sci. Rep. 5, 1–13 (2015)
Coulibaly, J.T., et al.: Evaluation of malaria diagnoses using a handheld light microscope in a community-based setting in rural Côte d’Ivoire. Am. J. Trop. Med. Hyg. 95(4), 831–834 (2016)
Kaewkamnerd, S., Uthaipibull, C., Intarapanich, A., Pannarut, M., Chaotheing, S., Tongsima, S.: An automatic device for detection and classification of malaria parasite species in thick blood film. BMC Bioinform. 13(Suppl 17), S18 (2012)
Quinn, J.A., Nakasi, R., Mugagga, P.K.B., Byanyima, P., Lubega, W., Andama, A.: Deep convolutional neural networks for microscopy-based point of care diagnostics. In: International Conference on Machine Learning for Health Care, Los Angeles, CA, pp. 1–12 (2016)
Cesario, M., Lundon, M., Luz, S., Masoodian, M., Rogers, B.: Mobile support for diagnosis of communicable diseases in remote locations. In: 13th International Conference of the NZ Chapter of the ACM’s Special Interest Group on Human-Computer Interaction – CHINZ 2012, Dunedin, New Zealand, pp. 25–28 (2012)
Dallet, C., Kareem, S., Kale, I.: Real time blood image processing application for malaria diagnosis using mobile phones. In: IEEE International Symposium on Circuits and Systems, Melbourne VIC, Australia, pp. 2405–2408 (2014)
Rosado, L., Da Costa, J.M.C., Elias, D., Cardoso, J.S.: Automated detection of malaria parasites on thick blood smears via mobile devices. Procedia Comput. Sci. 90, 138–144 (2016)
Rosado, L., Correia da Costa, J.M., Elias, D., Cardoso, J.S.: Mobile-based analysis of malaria-infected thin blood smears: automated species and life cycle stage determination. Sensors 17(10), 2167 (2017)
Eysenbach, G., Ofli, F., Chen, S., Kevin, G., Oliveira, A.D.: The malaria system microapp: a new, mobile device-based tool for malaria diagnosis. JMIR Res. Protoc. 6(4), e70 (2017)
Otsu, N.: A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 9(1), 62–66 (1979)
Ren, S., He, K., Girshick, R., Sun, J.: Faster R-CNN: towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 39(6), 1137–1149 (2017)
Redmon, J., Divvala, S., Girshick, R., Farhadi, A.: You only look once: unified, real-time object detection. https://arxiv.org/pdf/1506.02640.pdf. Accessed 01 Apr 2019
Acknowledgment
We would like to thank Dr. Md. A. Hossain for supporting our data acquisition at Chittagong Medical Hospital, Bangladesh. This research is supported by the Intramural Research Program of the National Institutes of Health, National Library of Medicine, and Lister Hill National Center for Biomedical Communications. Mahidol-Oxford Tropical Medicine Research Unit is funded by the Wellcome Trust of Great Britain. This research is also supported by the National Basic Research Program of China under No. 61671049 and the National Key R&D Plan of China under No. 2017YFB1400100.
© 2019 Springer Nature Switzerland AG
Cite this paper
Yang, F., Yu, H., Silamut, K., Maude, R.J., Jaeger, S., Antani, S. (2019). Smartphone-Supported Malaria Diagnosis Based on Deep Learning. In: Suk, HI., Liu, M., Yan, P., Lian, C. (eds) Machine Learning in Medical Imaging. MLMI 2019. Lecture Notes in Computer Science(), vol 11861. Springer, Cham. https://doi.org/10.1007/978-3-030-32692-0_9
Print ISBN: 978-3-030-32691-3
Online ISBN: 978-3-030-32692-0