Analyzing Facial and Eye Movements to Screen for Alzheimer’s Disease

Uiseo Nam, Kunyoung Lee, Hyunwoong Ko, Jun-Young Lee and Eui Chul Lee

1 Department of Computer Science, Graduate School, Sangmyung University, Seoul 03016, Korea
2 Interdisciplinary Program in Cognitive Science, Seoul National University, Seoul 08826, Korea
3 Department of Psychiatry, Seoul National University College of Medicine & SMG-SNU Boramae Medical Center, Seoul 03080, Korea
4 Department of Human-Centered Artificial Intelligence, Sangmyung University, Seoul 03016, Korea
* Authors to whom correspondence should be addressed.
Sensors 2020, 20(18), 5349; https://doi.org/10.3390/s20185349
Submission received: 10 August 2020 / Revised: 14 September 2020 / Accepted: 16 September 2020 / Published: 18 September 2020
(This article belongs to the Section Biomedical Sensors)

Abstract

Brain disease can be screened using eye movements. Degenerative brain disorders change eye movement because they affect not only memory and cognition but also the cranial nervous system involved in eye movement. We compared the facial and eye movement patterns of patients with mild Alzheimer’s disease and cognitively normal people to analyze the neurological signs of dementia. After detecting the facial landmarks, the coordinate values for the movements were extracted. We used Spearman’s correlation coefficient to examine associations between horizontal and vertical facial and eye movements. We analyzed the correlation between facial and eye movements without using special eye-tracking equipment or complex conditions in order to measure the behavioral aspect of the natural human gaze. As a result, we found differences between patients with Alzheimer’s disease and cognitively normal people. Patients suffering from Alzheimer’s disease tended to move their face and eyes simultaneously in the vertical direction, whereas the cognitively normal people did not, as confirmed by a Mann–Whitney–Wilcoxon test. Our findings suggest that objective and accurate measurement of facial and eye movements can be used to screen such patients quickly. The use of camera-based testing for the early detection of patients showing signs of neurodegeneration can have a significant impact on the public care of dementia.

1. Introduction

In the 21st century, mortality and fertility rates have been declining while medical and living standards have been improving; as a result, we are facing a population aging problem [1]. Forms of dementia such as Alzheimer’s disease are mostly caused by aging and degenerative brain changes, and their incidence and social costs have increased rapidly since 2010 [2]. As the disease progresses, brain function deteriorates, making it difficult for patients to live independently: most experience self-loss and become unable to participate in everyday experiences [3]. Because dementia patients are dependent on caregivers, early detection is very important; it can reduce the caregiver’s burden and social costs. Therefore, there is a pressing need for a method that can detect dementia easily and quickly.
Recently, an increasing number of studies have shown that emotions or mental states can be recognized by analyzing eye movements [4,5,6]. Although subtle changes in eye-movement patterns are difficult to identify, they can reveal the emotion or mental state that the brain is processing, because the eyes are structurally linked to the brain. Dementia is commonly known as a degenerative brain disease that causes brain damage, cognitive impairment, and behavioral and psychological symptoms [7,8]. In addition, many patients exhibit visual neurological symptoms and signs [9]. However, it is very difficult to evaluate the visual acuity of dementia patients, as doing so depends on the pathological proficiency of the diagnosing doctor [10]. Therefore, in this study, we analyzed the facial and eye movement patterns of patients with Alzheimer’s disease and cognitively normal people while they watched videos, in order to identify signs of dementia.
Previous research on the subject can be categorized into two groups: studies that used eye-feature-based methods and studies that used facial-feature-based methods. One method for measuring cognitive responses and brain function that uses the human eye is the pupillary response. Another method is gaze analysis [11,12,13]. When degenerative brain change makes it impossible for a stimulus that enters the retina to be processed, the pupil cannot respond to that stimulus [14]. As such, a method that uses pupil dilation and fixation duration has been proposed for analyzing the human mental state and the recognition of emotion [15].
Several studies have examined the pupillary response. One study analyzed the symptoms of post-traumatic stress disorder (PTSD) by measuring changes in the pupil size and fixation duration of Iraq war veterans viewing traumatic and neutral images. When viewing trauma-related stimuli, higher PTSD levels were associated with vigilance rather than avoidance, but this was not found in those with lower PTSD levels [16]. Another study suggested that early markers of Alzheimer’s disease could be identified in pupil responses by analyzing the correlation between pupil dilation during cognitive tasks and cognitive decline. Granholm and colleagues found that, in patients with mild cognitive impairment (MCI), the pupil dilated according to the demands of tasks requiring cognitive function. However, they also found that pupil size and performance tended to decrease dramatically when tasks were demanding and exceeded the patient’s cognitive abilities [17].
Other studies of psychiatric disorders measured only gaze without examining the pupil. Weeks et al. performed experiments on patients with social anxiety disorder and confirmed that gaze avoidance is a characteristic response in people with anxiety. They tracked the participants’ eyes while they watched videos designed to trigger social anxiety; the gaze avoidance of participants with social anxiety disorder was proportional to their anxiety and fear of the stimuli [18]. In another study, MCI or dementia patients watched videos under given conditions, and it proved possible to infer cognitive function by recording and analyzing the participants’ eye movements and observing changes in their gaze patterns [19]. Another study developed a model that detects symptoms of depression using features such as head pose, facial expression, eye gaze, and audio. In this model, feature points are specified by extracting landmarks from facial regions, and expression and pose features are derived from the distances between feature points and the velocity of their movement. The extracted features are classified using support vector machines and neural networks [20].
However, there are problems in the existing studies due to the complexity of the experiment, the restricted environment, and the gaze analysis method. For example, participants completed subjective assessments before and after the experiment and performed a task that met the conditions within a restricted environment. This approach increases the complexity of the experiment. Dementia patients have a decline in attention and concentration. Moreover, they often experience vigilance or anxiety. Negative emotions can also affect the patient’s cognitive function. Therefore, experiments that involve complex tasks can intensify the symptoms, potentially affecting the results. This makes it difficult to make accurate diagnoses and to implement models that can be used with actual patients.
Another problem with the gaze analysis method is that of determining how to define the gaze. We consider that the gaze cannot be defined only by the direction or velocity of the eye movements, because the gaze reflects human emotions or mental states; that is, the behavioral aspects of the natural gaze should include not only the eyes but also the facial movements. Thus, in this study, we analyzed the correlation between the facial and eye movements of patients with Alzheimer’s disease while they were watching videos. Based on these results, a method is provided that enables swift and accurate analysis of eye movements of patients with Alzheimer’s disease.

2. Materials and Methods

2.1. Participants

The experiment was conducted with elderly people over 60 years old who were divided into two groups. One group, the mild Alzheimer’s disease group, consisted of 17 patients with mild Alzheimer’s disease. The diagnosis of probable Alzheimer’s disease was made according to the criteria of the National Institute of Neurological and Communicative Disorders and Stroke and the Alzheimer’s Disease and Related Disorders Association (NINCDS-ADRDA) [21]. The exclusion criteria were as follows: any depressive symptoms; impaired physical mobility that might influence the study process; a history of neurological disease (e.g., head trauma or stroke) or major psychiatric disease according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) [REF]; auditory or visual difficulties that could disrupt the experimental procedure; refusal to give consent; and inability to properly complete the test as judged by an examiner. Exclusion criteria were determined by a board-certified psychiatrist (J.Y.L.). The severity of dementia was rated using the Clinical Dementia Rating (CDR) scale, and the CDR scores of the patient group were 0.5 (very mild) or 1 (mild) [22]. The other group, the cognitively normal group, consisted of 17 healthy participants without dementia or other mental disorders. Information about the two groups (Alzheimer’s disease and normal cognition) can be found in Table 1. In the table, education refers to the participant’s number of years of education. The Mini-Mental State Examination (MMSE) score has a maximum of 30. The difference in MMSE scores between the two groups was significant (normal cognition > Alzheimer’s disease, p < 0.001, regardless of age or sex).

2.2. Ethical Considerations

All participants gave their informed consent before taking part in the study. The study was conducted in accordance with the Declaration of Helsinki. It was approved by the Institutional Review Board of SMG-SNU Boramae Medical Center (IRB No.30-2017-63).

2.3. Experimental Configuration

The video used in this experiment was about 22 min long and consisted of six different short films that evoked the six basic emotions (happiness, sadness, fear, anger, surprise, and disgust). Participants watched each short film for about 1 min and then rested for 2 min before watching the next one. There was no difference in individual responses to the six short films. The face video data used in this study were obtained by recording each participant’s face while they watched the video. A Canon EOS 70D camera (Canon Inc., Tokyo, Japan) was used to record at 30 frames per second with a 1920 × 1080 resolution. As shown in Figure 1, the distance between the camera and the subject was set to 1 m. The experiment was conducted individually in a separate room. In addition, for the consistency of the experimental environment, we isolated the participants from factors such as light, motion, and background that could have disturbed their concentration.

2.4. Image Analysis

To acquire the eye and facial movement data, the facial and eye coordinate values were first extracted from each participant’s video. For the extraction, we used the method shown in Figure 2, which is taken from the facial behavior analysis toolkit OpenFace 2.0 [23]. The OpenFace 2.0 toolkit provides an open-source library for detecting faces, extracting facial landmarks based on facial features, and estimating eye gaze and head pose; it also recognizes facial action units. In this study, only the head pose and eye gaze estimation were used [24,25,26,27]. The extracted values are output in a comma-separated values (.csv) file along with the processed image. The gaze value is a vector that reflects the amount of change in the movement of the facial landmark feature points [25,26,27]. Figure 3 presents the coordinate axes used in this study. The facial coordinate axes were Head-α, Head-β, and Head-γ, which represent the rotation of the facial movements about the 3D axes, as shown in Figure 3a. For the eyes, the left-eye and right-eye coordinate axes were LEye-X, LEye-Y, REye-X, and REye-Y, which are the movement coordinate values on the 2D axes, as shown in Figure 3b.
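As a concrete illustration of this step, the sketch below loads an OpenFace 2.0 output file and gathers the head rotation and per-eye gaze values into one table. It is a minimal sketch, not the authors’ pipeline: the standard OpenFace 2.0 column names (pose_Rx/pose_Ry/pose_Rz for head rotation and gaze_0_*/gaze_1_* for the two eyes) are assumed, and the mapping of those columns to the paper’s Head-α/β/γ and LEye/REye axes, as well as the file name, is illustrative.

```python
# Minimal sketch (not the authors' code): collect head-pose and gaze values
# from an OpenFace 2.0 CSV. Column names follow the standard OpenFace output;
# verify them against the CSV produced by your OpenFace build.
import pandas as pd

def load_movements(csv_path: str) -> pd.DataFrame:
    df = pd.read_csv(csv_path)
    df.columns = df.columns.str.strip()      # some OpenFace versions pad column names with spaces
    df = df[df["success"] == 1]              # keep frames where the landmark fit succeeded
    return pd.DataFrame({
        "head_alpha": df["pose_Rx"],         # assumed mapping: pitch -> Head-alpha (vertical)
        "head_beta":  df["pose_Ry"],         # assumed mapping: yaw   -> Head-beta (horizontal)
        "head_gamma": df["pose_Rz"],         # assumed mapping: roll  -> Head-gamma
        "leye_x": df["gaze_0_x"], "leye_y": df["gaze_0_y"],   # assumed left-eye gaze vector
        "reye_x": df["gaze_1_x"], "reye_y": df["gaze_1_y"],   # assumed right-eye gaze vector
    })

# Hypothetical file name, for illustration only.
movements = load_movements("processed/participant_01.csv")
```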

2.5. Statistical Analysis

2.5.1. Spearman’s Correlation Coefficients of Facial and Eye Movements

Using the extracted coordinate values, Spearman’s correlation coefficients were calculated to analyze the correlations between the vertical and horizontal facial and eye movements [28,29]. Pairs of coordinate axes were used to identify the correlation in each direction. For the horizontal direction (Figure 4a), the eye movements used the X-axis; since the face uses rotation axes, the horizontal facial values were obtained from the Head-β axis. For the vertical direction (Figure 4b), the eye movements used the Y-axis and the facial values were obtained from the Head-α rotation axis. Thus, four types of correlation coefficients were obtained for each group, corresponding to combinations 1 to 4 in Figure 4.
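The following sketch shows how these four coefficients could be computed for a single participant with SciPy, using the illustrative column names from the previous sketch; it is an assumption-laden example rather than the authors’ implementation.

```python
# Minimal sketch: Spearman's correlation between facial and eye movements for one
# participant, following the axis pairs of Figure 4 (Head-beta vs. eye X horizontally,
# Head-alpha vs. eye Y vertically). Column names are the illustrative ones used above.
from scipy.stats import spearmanr

def correlation_pairs(m) -> dict:
    rho_h_left,  _ = spearmanr(m["head_beta"],  m["leye_x"])   # pair 1: horizontal, left eye
    rho_h_right, _ = spearmanr(m["head_beta"],  m["reye_x"])   # pair 2: horizontal, right eye
    rho_v_left,  _ = spearmanr(m["head_alpha"], m["leye_y"])   # pair 3: vertical, left eye
    rho_v_right, _ = spearmanr(m["head_alpha"], m["reye_y"])   # pair 4: vertical, right eye
    return {"horizontal_left": rho_h_left, "horizontal_right": rho_h_right,
            "vertical_left": rho_v_left, "vertical_right": rho_v_right}
```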

2.5.2. Mann–Whitney–Wilcoxon Test of the Spearman’s Correlation Coefficient

We used the Mann–Whitney–Wilcoxon (MWW) test to determine if the two groups had different or equal correlations [30]. Before performing the MWW test, we conducted the Kolmogorov–Smirnov (KS) test to examine the normality of the four types of correlation coefficients obtained from the two groups [31].
The KS test showed that the distribution of the correlation coefficients was non-normal in both the Alzheimer’s disease and normal groups. Based on this result, the MWW test was used to test the difference between the two independent groups. All tests were performed at the 5% significance level (i.e., differences were considered significant at p < 0.05).
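A sketch of this two-step procedure is given below, assuming the per-participant correlation coefficients of each group are available as plain lists; it uses the Lilliefors variant of the KS test from statsmodels, which may differ in detail from the authors’ implementation.

```python
# Minimal sketch: Lilliefors (KS) normality check followed by the Mann-Whitney-
# Wilcoxon test comparing the per-participant correlation coefficients of the
# AD and cognitively normal groups. ad_rhos / normal_rhos are assumed inputs.
from scipy.stats import mannwhitneyu
from statsmodels.stats.diagnostic import lilliefors

def compare_groups(ad_rhos, normal_rhos, alpha: float = 0.05) -> dict:
    _, p_ad = lilliefors(ad_rhos)        # H0: the sample is normally distributed
    _, p_nc = lilliefors(normal_rhos)
    both_normal = (p_ad >= alpha) and (p_nc >= alpha)
    # The non-parametric MWW test is appropriate when the distributions are non-normal.
    u_stat, p_value = mannwhitneyu(ad_rhos, normal_rhos, alternative="two-sided")
    return {"both_normal": both_normal, "U": u_stat, "p": p_value}
```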

3. Results

We found that the gaze variance in the Alzheimer’s disease group was significantly greater in all directions than in the normal group. One of the symptoms that comes with cognitive decline is a decrease in concentration, which causes frequent eye movements and facial movements. The degree of dispersion can be seen in Figure 5.
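For reference, a simple way to quantify the dispersion summarized in Figure 5 is the per-participant variance of the movement values along each axis; the sketch below uses the illustrative column names introduced earlier and is not the authors’ exact computation.

```python
# Minimal sketch: sample variance of movement along each facial and eye axis for
# one participant, as a measure of the dispersion compared between the two groups.
import numpy as np

AXES = ["head_alpha", "head_beta", "head_gamma",
        "leye_x", "leye_y", "reye_x", "reye_y"]

def movement_variances(movements) -> dict:
    # movements: DataFrame produced by load_movements() above
    return {axis: float(np.var(movements[axis], ddof=1)) for axis in AXES}
```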
Figure 6 displays some of the extracted facial landmarks. The Alzheimer’s patients made many facial movements, and the directions of their face and eyes coincided in the vertical direction. The normal group had few facial movements, and we did not find any pattern of facial and eye movements in the vertical direction.
The horizontal correlation coefficients were not significantly different between the two groups and were negative in both; that is, when the face moved horizontally, the eyes tended to move in the direction opposite to the face. The distribution of these correlation coefficients is shown in Figure 7a.
As shown in Figure 7b, the correlation coefficients for the vertical direction differed between the two groups. In the Alzheimer’s disease group, the vertical correlation coefficients were close to 1, indicating a strong positive correlation. In the normal group, the vertical correlation coefficients had a wider distribution than in the Alzheimer’s disease group, and there was no consistency of gaze within the group. In conclusion, when the face of an Alzheimer’s patient moved up and down, the eyes tended to move in the same direction as the face, whereas in the normal group the facial movement did not correlate with the direction of the eyes.
Table 2 shows the statistics of the correlation coefficients obtained using data from all participants; “direction” refers to the left or right eye in each pair of facial and eye axes (see Figure 4).
The MWW test results verifying the significance of the difference between the two groups are also shown in Figure 7; for the p-values and annotations of the MWW test, refer to Table 2.

4. Discussion and Conclusions

4.1. Variance for Movement in All Directions

Comparing the degree of variation in movement in each direction between patients with Alzheimer’s disease and healthy participants, we found that the degree of variation was higher in the Alzheimer’s disease group. That is, the face and eyes moved more often, in multiple directions, in patients with Alzheimer’s disease. One type of abnormal eye movement caused by dementia is distraction [32]. Patients with Alzheimer’s disease have difficulty looking upwards and have poor visual fixation. Moreover, during a visual search, they often cannot find the target object, and their gaze dwells on other objects for long periods because it lacks a specific focus [33].
When cognitive ability deteriorates, the ability to concentrate attention on a detected target object decreases, and the amount of eye movement increases. Researchers argue that these variations in gaze in Alzheimer’s patients are due to damage to the frontal and parietal lobes; deficits in these areas are known to be related to dysfunctional attention in the course of Alzheimer’s disease, leading to deficits in the initiation and suppression of saccades and smooth pursuit [34]. Patients with Alzheimer’s disease may show a higher degree of facial movement in order to compensate for these problems. Accordingly, the symptoms displayed by the Alzheimer’s disease group under existing eye-movement screening methods may be a clinical manifestation of reduced attention. In this respect, our results suggest that the abnormal eye movements measured in the current study can serve as a marker for screening for Alzheimer’s disease.

4.2. Correlation in Vertical Direction between Face and Eye Movements

Dementia is commonly referred to as a symptom associated with the cognitive impairment of a neurodegenerative disease. Abnormal eye movements appear in most types of dementias [35]. We found that the vertical gaze of Alzheimer’s patients differed from the gaze of the normal group, using only the camera’s records. These patients had a statistically significantly higher correlation coefficient between eyes and face in the vertical direction, despite a higher degree of change in face and eye movement.
Vertical eye movement paralysis refers to abnormal saccadic eye movement in the vertical direction, which is related to the results of this study. Deficits in the movement of visually guided vertical saccades in Alzheimer’s disease may be related to the attentional impairment, rather than to the ocular motor nerve impairment. Our study’s results indicate that the face and eyes may move relatively simultaneously due to reduced vertical smooth eye movement [36].
Dementia is difficult to diagnose accurately because the boundaries of its symptoms are difficult to quantify. In existing tests for abnormal eye movement, the patient is asked to move the eyes according to a given condition, and a score is given when the movement is executed properly. During the test, the patient uses only eye movements, which allows the examiner to determine whether the eye movements exceed the normal range relative to the cognitively normal group [37]. However, even in the cognitively normal group, there are cases where results fall outside the normal range due to aging; eye movements are very subtle, and an abnormality becomes apparent only when the normal range is greatly exceeded. Hence, early diagnosis is difficult [38].

4.3. Limitations and Future Directions

Given the normality test process and the amount of data, appropriate statistical methods were applied to obtain results sufficient to demonstrate the difference between the two groups participating in the experiment. However, more data are needed before such analysis can be applied to existing dementia diagnostic methods. In addition, different types of features need to be analyzed to identify the specific disease among the neurodegenerative diseases that cause dementia.
The analyses used in the current study were the correlation and variance along each direction of facial and eye movement. It is important to identify dementia accurately; although patients’ general symptoms may be similar, the degree of damage and its progression in certain areas of the brain depend on the disease [38]. To overcome these limitations, future studies will explore ways to collect additional data from groups with Alzheimer’s disease and cognitively normal groups. Furthermore, we are looking for a method that can classify the different eye and facial movement features associated with each disease. We are investigating approaches that use the feature points from the landmark extraction method employed in the current research, and we are also considering ways to classify various fine-grained features using machine learning.

4.4. Conclusions

Alzheimer’s disease has severe cognitive symptoms, and early detection can have a significant impact on patient care. Therefore, it is very important to detect and treat the symptoms of dementia promptly and efficiently. Our study analyzed the differences in gaze between patients with Alzheimer’s disease and cognitively normal people in an intuitive way, and the findings suggest that these differences can be used to screen for Alzheimer’s disease symptoms quickly using only camera recordings. In addition, we propose a method that enables objective and accurate measurement of eye movements by determining the correlation between facial and eye movements. The ultimate goal of our study is to provide a foundation for the development of automated tools for analyzing the symptoms of dementia patients. Further symptom analysis and research should increase the accuracy and significance of symptom diagnosis and support building an automated system, based on the measured data, that incorporates deep learning.

Author Contributions

Conceptualization, J.-Y.L. and E.C.L.; methodology, U.N. and K.L.; formal analysis, U.N. and K.L.; investigation, K.L. and H.K.; data curation, U.N., K.L., and H.K.; writing—original draft preparation, U.N. and H.K.; writing—review and editing, J.-Y.L. and E.C.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Industrial Strategic Technology Development Program (No. 10073159) funded by the Ministry of Trade, Industry & Energy (MI, Korea). This work was also supported by the National Research Foundation (NRF) of Korea grant funded by the Korean government (Ministry of Science and ICT) (NRF-2019R1A2C4070681).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Powell, J.L. The power of global aging. Ageing Int. 2010, 35, 1–14.
2. Wimo, A.; Guerchet, M.; Ali, G.C.; Wu, Y.T.; Prina, A.M.; Winblad, B.; Jönsson, L.; Liu, Z.; Prince, M. The worldwide costs of dementia 2015 and comparisons with 2010. Alzheimer’s Dement. 2017, 13, 1–7.
3. Cohen-Mansfield, J.; Golander, H.; Arnheim, G. Self-identity in older persons suffering from dementia: Preliminary results. Soc. Sci. Med. 2000, 51, 381–394.
4. Lu, Y.; Zheng, W.L.; Li, B.; Lu, B.L. Combining Eye Movements and EEG to Enhance Emotion Recognition. In Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence, Buenos Aires, Argentina, 25–31 July 2015.
5. Schurgin, M.W.; Nelson, J.; Iida, S.; Ohira, H.; Chiao, J.Y.; Franconeri, S.L. Eye movements during emotion recognition in faces. J. Vis. 2014, 14, 14.
6. Kojima, T.; Matsushima, E.; Ohta, K.; Toru, M.; Han, Y.H.; Shen, Y.C.; Moussaoui, D.; David, I.; Sato, K.; Kathmann, N.; et al. Stability of exploratory eye movements as a marker of schizophrenia—A WHO multi-center study. Schizophr. Res. 2001, 52, 203–213.
7. Rosen, H.J.; Gorno-Tempini, M.L.; Goldman, W.P.; Perry, R.J.; Schuff, N.; Weiner, M.; Feiwell, R.; Kramer, J.H.; Miller, B.L. Patterns of brain atrophy in frontotemporal dementia and semantic dementia. Neurology 2002, 58, 198–208.
8. Aarsland, D. Cognitive impairment in Parkinson’s disease and dementia with Lewy bodies. Parkinsonism Relat. Disord. 2016, 22, S144–S148.
9. Brahm, K.D.; Wilgenburg, H.M.; Kirby, J.; Ingalla, S.; Chang, C.Y.; Goodrich, G.L. Visual impairment and dysfunction in combat-injured servicemembers with traumatic brain injury. Optom. Vis. Sci. 2009, 86, 817–825.
10. Downs, M.G. The role of general practice and the primary care team in dementia diagnosis and management. Int. J. Geriatr. Psychiatry 1996, 11, 937–942.
11. Hess, E.H.; Polt, J.M. Pupil size in relation to mental activity during simple problem-solving. Science 1964, 143, 1190–1192.
12. Iqbal, S.T.; Zheng, X.S.; Bailey, B.P. Task-Evoked Pupillary Response to Mental Workload in Human-Computer Interaction. In Proceedings of the CHI ’04 Extended Abstracts on Human Factors in Computing Systems, Vienna, Austria, 24–29 April 2004; pp. 1477–1480.
13. Marshall, S.P. Identifying cognitive state from eye metrics. Aviat. Space Environ. Med. 2007, 78, B165–B175.
14. Adoni, A.; McNett, M. The pupillary response in traumatic brain injury: A guide for trauma nurses. J. Trauma Nurs. 2007, 14, 191–196.
15. Kuchinke, L.; Trapp, S.; Jacobs, A.M.; Leder, H. Pupillary responses in art appreciation: Effects of aesthetic emotions. Psychol. Aesthet. Creat. Arts 2009, 3, 156.
16. Kimble, M.O.; Fleming, K.; Bandy, C.; Kim, J.; Zambetti, A. Eye tracking and visual attention to threatening stimuli in veterans of the Iraq war. J. Anxiety Disord. 2010, 24, 293–299.
17. Granholm, E.L.; Panizzon, M.S.; Elman, J.A.; Jak, A.J.; Hauger, R.L.; Bondi, M.W.; Lyons, M.J.; Franz, C.E.; Kremen, W.S. Pupillary responses as a biomarker of early risk for Alzheimer’s disease. J. Alzheimer’s Dis. 2017, 56, 1419–1428.
18. Weeks, J.W.; Howell, A.N.; Srivastav, A.; Goldin, P.R. “Fear guides the eyes of the beholder”: Assessing gaze avoidance in social anxiety disorder via covert eye tracking of dynamic social stimuli. J. Anxiety Disord. 2019, 65, 56–63.
19. Zhang, Y.; Wilcockson, T.; Kim, K.I.; Crawford, T.; Gellersen, H.; Sawyer, P. Monitoring dementia with automatic eye movements analysis. Intell. Decis. Technol. 2016, 57, 299–309.
20. Dham, S.; Sharma, A.; Dhall, A. Depression scale recognition from audio, visual and text analysis. arXiv 2017, arXiv:1709.05865.
21. McKhann, G.; Drachman, D.; Folstein, M.; Katzman, R.; Price, D.; Stadlan, E.M. Clinical diagnosis of Alzheimer’s disease: Report of the NINCDS-ADRDA Work Group under the auspices of Department of Health and Human Services Task Force on Alzheimer’s Disease. Neurology 1984, 34, 939–944.
22. Morris, J.C. The Clinical Dementia Rating (CDR): Current version and scoring rules. Neurology 1993, 43, 2412–2414.
23. Baltrušaitis, T.; Zadeh, A.; Lim, Y.C.; Morency, L.P. OpenFace 2.0: Facial Behavior Analysis Toolkit. In Proceedings of the 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), Xi’an, China, 15–19 May 2018.
24. Cao, Z.; Simon, T.; Wei, S.E.; Sheikh, Y. Realtime Multi-Person 2d Pose Estimation Using Part Affinity Fields. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017), San Juan, Puerto Rico, 22–25 July 2017; pp. 7291–7299.
25. Zadeh, A.; Lim, Y.C.; Baltrušaitis, T.; Morency, L.P. Convolutional Experts Constrained Local Model for 3D Facial Landmark Detection. In Proceedings of the IEEE International Conference on Computer Vision Workshops (ICCVW), Venice, Italy, 22–29 October 2017; pp. 2519–2528.
26. Baltrusaitis, T.; Robinson, P.; Morency, L.P. Constrained Local Neural Fields for Robust Facial Landmark Detection in the Wild. In Proceedings of the IEEE International Conference on Computer Vision Workshops (ICCVW), Sydney, Australia, 1–8 December 2013; pp. 354–361.
27. Wood, E.; Baltrusaitis, T.; Zhang, X.; Sugano, Y.; Robinson, P.; Bulling, A. Rendering of Eyes for Eye-Shape Registration and Gaze Estimation. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 7–13 December 2015; pp. 3756–3764.
28. Mashiko, T.; Umeda, T.; Nakaji, S.; Sugawara, K. Position related analysis of the appearance of and relationship between post-match physical and mental fatigue in university rugby football players. Br. J. Sports Med. 2004, 38, 617–621.
29. Xiao, C.; Ye, J.; Esteves, R.M.; Rong, C. Using Spearman’s correlation coefficients for exploratory data analysis on big dataset. Concurr. Comput. Pract. Exp. 2016, 28, 3866–3878.
30. Martínez-Murcia, F.J.; Górriz, J.M.; Ramirez, J.; Puntonet, C.G.; Salas-Gonzalez, D.; Alzheimer’s Disease Neuroimaging Initiative. Computer aided diagnosis tool for Alzheimer’s disease based on Mann–Whitney–Wilcoxon U-test. Expert Syst. Appl. 2012, 39, 9676–9685.
31. Lilliefors, H.W. On the Kolmogorov-Smirnov test for normality with mean and variance unknown. J. Am. Stat. Assoc. 1967, 62, 399–402.
32. Jin, Z.; Reeves, A. Attentional release in the saccadic gap effect. Vis. Res. 2009, 49, 2045–2055.
33. Moser, A.; Kömpf, D.; Olschinka, J. Eye movement dysfunction in dementia of the Alzheimer type. Dementia 1995, 6, 264–268.
34. Garbutt, S.; Matlin, A.; Hellmuth, J.; Schenk, A.K.; Johnson, J.K.; Rosen, H.; Dean, D.; Kramer, J.; Neuhaus, J.; Miller, B.L.; et al. Oculomotor function in frontotemporal lobar degeneration, related disorders and Alzheimer’s disease. Brain 2008, 131, 1268–1281.
35. Antoniades, C.A.; Kennard, C. Ocular motor abnormalities in neurodegenerative disorders. Eye 2015, 29, 200–207.
36. Scinto, L.F.; Daffner, K.R.; Castro, L.; Weintraub, S.; Vavrik, M.; Mesulam, M.M. Impairment of spatially directed attention in patients with probable Alzheimer’s disease as measured by eye movements. Arch. Neurol. 1994, 51, 682–688.
37. Mosimann, U.P.; Müri, R.M.; Burn, D.J.; Felblinger, J.; O’Brien, J.T.; McKeith, I.G. Saccadic eye movement changes in Parkinson’s disease dementia and dementia with Lewy bodies. Brain 2005, 128, 1267–1276.
38. Williams, D.R.; Lees, A.J. Progressive supranuclear palsy: Clinicopathological concepts and diagnostic challenges. Lancet Neurol. 2009, 8, 270–279.
Figure 1. Example of the experimental environment.
Figure 2. Facial and eye movement extraction by OpenFace 2.0 [23].
Figure 3. Coordinate axes: (a) face and (b) eye.
Figure 4. A pair of axes was used to obtain the correlation coefficient: (a) horizontal and (b) vertical.
Figure 5. Variance for movement in all directions.
Figure 6. Extracted facial landmarks: (a) normal group and (b) AD group.
Figure 7. Correlation coefficients represented by the boxplots: (a) horizontal and (b) vertical.
Table 1. Characteristics of the research participants.

Variable        AD (n = 17)                  Normal (n = 17)
                Mean     SD      %           Mean     SD      %
Age             77.23    6.79    -           74       6.53    -
Education 1     9.7      3.94    -           10.94    5.06    -
Gender 2        -        -       47/53       -        -       41/59
MMSE            20.12    5.28    -           27       2.85    -

1 Years of education; 2 male/female. AD = Alzheimer’s disease. Normal = cognitively normal. SD = standard deviation.
Table 2. Results of Spearman’s correlation.

                      Horizontal (a)                              Vertical (b)
Direction       Left                  Right                 Left                  Right
Group           AD        Normal      AD        Normal      AD        Normal      AD        Normal
Median          −0.669    −0.54       −0.716    −0.616      0.785     0.673       0.788     0.69
IQR             0.135     0.515       0.118     0.309       0.092     0.156       0.06      0.163
Mean ± SD       −0.639    −0.467      −0.62     −0.557      0.787     0.681       0.783     0.68
                ± 0.162   ± 0.219     ± 0.25    ± 0.223     ± 0.08    ± 0.132     ± 0.157   ± 0.096
MWW U (Sig)     91 (0.067)            120 (0.408)           213 (0.019 *)         210 (0.025 *)

* 0.01 < p ≤ 0.05. AD = Alzheimer’s disease. Normal = cognitively normal. SD = standard deviation.
