Automatic emotion perception using eye movement information for E-Healthcare systems.
Abstract
Detecting the emotional state of adolescents is vital for promoting rehabilitation therapy within an E-Healthcare system. Focusing on a novel approach for a sensor-based E-Healthcare system, we propose an emotion perception algorithm based on eye movement information, collecting and analyzing electrooculography (EOG) signals and eye movement video synchronously. Specifically, we first extract time-frequency eye movement features by applying the short-time Fourier transform (STFT) to the raw multi-channel EOG signals. Subsequently, to integrate time-domain eye movement features (i.e., saccade duration, fixation duration, and pupil diameter), we investigate two fusion strategies: feature-level fusion (FLF) and decision-level fusion (DLF). Recognition experiments were performed on three emotional states: positive, neutral, and negative. The average accuracies are 88.64% for the FLF method and 88.35% for the DLF method with the max rule. Experimental results reveal that eye movement information can effectively reflect the emotional state of adolescents, which provides a promising tool to improve the performance of the E-Healthcare system.
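The pipeline described in the abstract (STFT-based time-frequency features from multi-channel EOG, fused with time-domain eye movement features at either the feature or the decision level) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the sampling rate, channel count, STFT window length, SVM classifier, and synthetic data are all placeholders chosen for demonstration.

```python
# Minimal sketch of the described pipeline, NOT the paper's implementation.
# Sampling rate, channel count, window length, the SVM classifier, and the
# synthetic data below are all illustrative assumptions.
import numpy as np
from scipy.signal import stft
from sklearn.svm import SVC

FS = 250          # assumed EOG sampling rate (Hz)
N_CHANNELS = 4    # assumed number of EOG channels

def stft_features(eog, fs=FS):
    """Time-frequency features: per-channel mean STFT magnitude spectrum."""
    feats = []
    for ch in eog:                                # eog: (channels, samples)
        _, _, Z = stft(ch, fs=fs, nperseg=fs)     # 1 s analysis window
        feats.append(np.abs(Z).mean(axis=1))      # average over time frames
    return np.concatenate(feats)

def feature_level_fusion(eog, time_domain):
    """FLF: concatenate time-frequency features with time-domain features
    (e.g., saccade duration, fixation duration, pupil diameter)."""
    return np.concatenate([stft_features(eog), time_domain])

def decision_level_fusion_max(probas_a, probas_b):
    """DLF with the max rule: element-wise maximum of two classifiers'
    class probabilities, then argmax over the three emotion classes."""
    return np.argmax(np.maximum(probas_a, probas_b), axis=1)

# Toy run on random data; labels 0/1/2 stand for positive/neutral/negative.
rng = np.random.default_rng(0)
X = np.stack([feature_level_fusion(rng.standard_normal((N_CHANNELS, 5 * FS)),
                                   rng.standard_normal(3))
              for _ in range(30)])
y = rng.integers(0, 3, size=30)
clf = SVC(probability=True).fit(X, y)
print(clf.predict(X[:5]))
```

In a full system, decision_level_fusion_max would instead combine the probability outputs of two separate classifiers, one trained on the time-frequency features and one on the time-domain features, rather than the single feature-level-fused classifier shown in the toy run.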
Citation
Wang, Y., Lv, Z., Zheng, Y. (2018) 'Automatic emotion perception using eye movement information for E-Healthcare systems.' Sensors, 18(9), 2826. doi: 10.3390/s18092826.
Publisher
MDPI
Journal
Sensors
DOI
10.3390/s18092826
Type
Article
Language
en