Global Electroencephalography Synchronization as a New Indicator for Tracking Emotional Changes of a Group of Individuals during Video Watching
2017 Dec 1;11:577. doi: 10.3389/fnhum.2017.00577. eCollection 2017.

Chang-Hee Han et al. Front Hum Neurosci.

Abstract

In the present study, we investigated whether global electroencephalography (EEG) synchronization can serve as a promising new index for tracking emotional arousal changes of a group of individuals during video watching. Global field synchronization (GFS), an index known to correlate with human cognitive processes, was evaluated; this index quantified the global temporal synchronization among multichannel EEG data recorded from a group of participants (n = 25) during the playback of two short video clips. The two video clips were each about 5 min long and were designed to evoke negative (fearful) or positive (happy) emotion, respectively. Another group of participants (n = 37) was asked to select the two most emotionally arousing (most touching or most fearful) scenes in each clip. The results of this questionnaire survey were used as the ground truth to evaluate whether GFS could detect the emotional highlights of both video clips. The emotional highlights estimated using the grand-averaged GFS waveforms of the first group were also compared with those obtained from galvanic skin response (GSR), photoplethysmography (PPG), and multimedia content analysis (MCA), which are conventional methods for estimating temporal changes in emotional arousal during video playback. We found that beta-band GFS values decreased during periods of high emotional arousal, regardless of the type of emotional stimulus. Moreover, the emotional highlights estimated from the GFS waveforms coincided best with those identified by the questionnaire survey. These findings suggest that GFS might be applicable as a new index for tracking emotional arousal changes of a group of individuals during video watching, and could be used to evaluate or edit movies, TV commercials, and other broadcast products.
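The abstract does not spell out how GFS is computed. Under the standard definition (Koenig et al.), the complex Fourier coefficients of all channels at a given frequency form a 2-D point cloud (real vs. imaginary part), and GFS is the normalized difference of the two eigenvalues of that cloud's covariance matrix: values near 1 indicate a common phase across channels, values near 0 indicate scattered phases. A minimal NumPy sketch under that assumption (the function name, parameters, and beta-band limits are illustrative, not taken from the paper):

```python
import numpy as np

def gfs(eeg, fs, band=(13.0, 30.0)):
    """Band-averaged global field synchronization for one EEG epoch.

    eeg  : (n_channels, n_samples) array
    fs   : sampling rate in Hz
    band : frequency band of interest (here beta, 13-30 Hz)

    For each frequency bin, the complex Fourier coefficients of all
    channels form a 2-D point cloud; with eigenvalues l1 >= l2 of its
    2x2 covariance matrix, GFS = (l1 - l2) / (l1 + l2).
    """
    n_ch, n_samp = eeg.shape
    # Remove per-channel mean before the FFT to drop the DC offset.
    spec = np.fft.rfft(eeg - eeg.mean(axis=1, keepdims=True), axis=1)
    freqs = np.fft.rfftfreq(n_samp, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])

    vals = []
    for coeffs in spec[:, mask].T:  # one frequency bin at a time
        pts = np.column_stack([coeffs.real, coeffs.imag])  # (n_ch, 2)
        cov = np.cov(pts, rowvar=False)
        l1, l2 = sorted(np.linalg.eigvalsh(cov), reverse=True)
        vals.append((l1 - l2) / (l1 + l2))
    return float(np.mean(vals))
```

To obtain a GFS waveform over a video clip, as in the grand-averaged plots described here, this would be applied to consecutive sliding windows of the recording and averaged across participants.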

Keywords: affective brain-computer interface (aBCI); electroencephalography (EEG); global field synchronization (GFS); neurocinematics; passive brain-computer interface.


Figures

Figure 1. A schematic diagram of our experimental paradigm.

Figure 2. Overall procedure of data analysis.

Figure 3. Grand averaged GFS waveforms: (A) negative clip; (B) positive clip. N1, N2, P1, and P2 represent time periods during which the GFS values dropped below the lower horizontal line. Additional time periods during which the GFS values suddenly dropped near the lower horizontal line (a–h) are also marked as gray areas.

Figure 4. The two most impressive scenes in each clip as identified in the questionnaires: (A) negative clip; (B) positive clip. N1, N2, P1, and P2 represent time periods during which the GFS values dropped below the lower horizontal line. Additional time periods during which the GFS values suddenly dropped near the lower horizontal line (a–h) are also marked as gray areas.

Figure 5. Grand averaged waveforms of MCA, GSR, PPG, and questionnaire results: (A) negative clip; (B) positive clip. The first, second, and third rows show the MCA, GSR, and PPG waveforms, respectively; the fourth row shows the questionnaire survey results.
