Abstract
Face recognition and the classification of a subject's emotions are important areas of current research in psychology, with many potential uses and applications. Correctly recognizing a subject's emotion leads to a better understanding of their behavior. However, the methods in use today often suffer from a multitude of disadvantages. We have therefore set out to create a solution that is modular and invariant to ambient light conditions, which in the past posed the biggest problem. We achieve this by combining several data sources (localization of the eye pupil) with data from sensors that complement one another, diminishing their individual disadvantages and reinforcing confidence in the result. These sensors measure two human physiological properties: pulse (heart rate sensor) and skin response (GSR). To verify the proposed solution, we conducted a simple experiment in which various video clips were shown to 50 participants. The experiment showed that by combining the Affdex SDK with the above sensors we achieved a higher classification success rate (90.79%) than with the Affdex SDK alone (85.04%).
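The abstract describes combining per-emotion estimates from a face-analysis SDK with complementary heart-rate and GSR sensor readings so that the modalities reinforce one another. A minimal sketch of one common way to do this is late fusion, i.e. a weighted average of per-emotion confidence scores from each modality. The emotion labels, score dictionaries, and weights below are illustrative assumptions, not the authors' actual algorithm.

```python
# Late-fusion sketch: combine per-emotion confidence scores from three
# modalities (face SDK, heart-rate-derived, GSR-derived) by weighted
# averaging, then classify as the emotion with the highest fused score.
# All names and weights are hypothetical.

EMOTIONS = ["joy", "anger", "fear", "sadness", "surprise", "neutral"]

def fuse_scores(face, pulse, gsr, weights=(0.6, 0.2, 0.2)):
    """Return (predicted_label, fused_scores) from three score dicts."""
    w_face, w_pulse, w_gsr = weights
    fused = {}
    for emotion in EMOTIONS:
        fused[emotion] = (w_face * face.get(emotion, 0.0)
                          + w_pulse * pulse.get(emotion, 0.0)
                          + w_gsr * gsr.get(emotion, 0.0))
    return max(fused, key=fused.get), fused

# Example: the face channel is ambiguous between joy and neutral,
# but the physiological channels tip the decision toward joy.
face = {"joy": 0.70, "neutral": 0.30}
pulse = {"joy": 0.55, "fear": 0.45}
gsr = {"joy": 0.60, "surprise": 0.40}
label, fused = fuse_scores(face, pulse, gsr)
# fused["joy"] = 0.6*0.70 + 0.2*0.55 + 0.2*0.60 = 0.65
```

Giving the face channel the largest weight reflects that it is the primary classifier here, with the sensors acting as a corrective signal; in practice such weights would be tuned on labeled data.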
Acknowledgments
This paper was created with the financial support of the following projects:
1. Research and Innovation for the project Fake news on the Internet - identification, content analysis, emotions (code: NFP313010T527).
2. The project UGA: Gathering data on understanding of study materials based upon the students' pupil movement (code: VII/9/2019).
3. The project KEGA 036UKF-4/2019, Adaptation of the learning process using sensor networks and the Internet of Things.
© 2019 Springer Nature Switzerland AG
Cite this paper
Magdin, M., Kohútek, M., Koprda, Š., Balogh, Z. (2019). EmoSens – The Proposal of System for Recognition of Emotion with SDK Affectiva and Various Sensors. In: Huang, DS., Bevilacqua, V., Premaratne, P. (eds) Intelligent Computing Theories and Application. ICIC 2019. Lecture Notes in Computer Science(), vol 11643. Springer, Cham. https://doi.org/10.1007/978-3-030-26763-6_39
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-26762-9
Online ISBN: 978-3-030-26763-6
eBook Packages: Computer Science (R0)