dc.contributor.author: Yang Wang
dc.contributor.author: Zhao Lv
dc.contributor.author: Yongjun Zheng
dc.date.accessioned: 2018-10-11T09:13:42Z
dc.date.available: 2018-10-11T09:13:42Z
dc.date.issued: 2018-08-31
dc.identifier.citation: Wang, Y., Lv, Z., Zheng, Y. (2018) 'Automatic emotion perception using eye movement information for E-Healthcare systems', Sensors, 18(9), 2826. doi: 10.3390/s18092826.
dc.identifier.doi: 10.3390/s18092826
dc.identifier.uri: http://hdl.handle.net/10545/623027
dc.description.abstract: Detecting the emotional state of adolescents is vital for promoting rehabilitation therapy within an E-Healthcare system. Focusing on a novel approach for a sensor-based E-Healthcare system, we propose an emotion perception algorithm based on eye movement information, collecting and analyzing electrooculography (EOG) signals and eye movement video synchronously. Specifically, we extract time-frequency eye movement features by first applying the short-time Fourier transform (STFT) to the raw multi-channel EOG signals. Subsequently, to integrate time-domain eye movement features (i.e., saccade duration, fixation duration, and pupil diameter), we investigate two feature fusion strategies: feature-level fusion (FLF) and decision-level fusion (DLF). Recognition experiments were performed on three emotional states: positive, neutral, and negative. The average accuracies are 88.64% (the FLF method) and 88.35% (the DLF with maximal rule method), respectively. The experimental results reveal that eye movement information can effectively reflect the emotional state of adolescents, providing a promising tool for improving the performance of E-Healthcare systems.
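As a rough illustration of the pipeline the abstract describes (STFT-based time-frequency features from multi-channel EOG, then feature-level or decision-level fusion), the following Python sketch may be useful. It is a minimal sketch under assumed settings: the sampling rate, window length, function names, and example probability vectors are illustrative assumptions, not the authors' published implementation.

```python
import numpy as np
from scipy.signal import stft

FS = 250  # assumed EOG sampling rate in Hz (not stated in this record)

def stft_features(eog, fs=FS):
    """Time-frequency features from multi-channel EOG via the STFT.
    eog has shape (n_channels, n_samples); returns one flat vector of
    mean spectral magnitude per channel and frequency bin."""
    feats = []
    for channel in eog:
        # 1 s Hann windows are an assumption, not the paper's setting
        _, _, Zxx = stft(channel, fs=fs, nperseg=fs)
        feats.append(np.abs(Zxx).mean(axis=1))  # average magnitude over time
    return np.concatenate(feats)

def feature_level_fusion(tf_vec, time_vec):
    """FLF: concatenate the time-frequency vector with the time-domain
    features (saccade duration, fixation duration, pupil diameter) and
    train a single classifier on the joint vector."""
    return np.concatenate([tf_vec, time_vec])

def decision_level_fusion_max(probas):
    """DLF with the maximal rule: each modality's classifier emits
    per-class probabilities; keep the per-class maximum across modalities
    and predict the class with the highest fused score."""
    fused = np.max(np.stack(probas), axis=0)
    return int(np.argmax(fused))  # index into (negative, neutral, positive)

# Hypothetical two-modality vote over the three emotional states
p_eog = np.array([0.2, 0.3, 0.5])    # assumed EOG classifier output
p_video = np.array([0.1, 0.7, 0.2])  # assumed video classifier output
print(decision_level_fusion_max([p_eog, p_video]))  # -> 1 (neutral)
```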
dc.description.sponsorship: Anhui Provincial Natural Science Research Project of Colleges and Universities Fund under Grant KJ2018A0008, Open Fund for Discipline Construction of the Institute of Physical Science and Information Technology at Anhui University, and National Natural Science Fund of China under Grant 61401002.
dc.language.iso: en
dc.publisher: MDPI
dc.subject: emotion recognition
dc.subject: eye movement
dc.subject: adolescence
dc.subject: healthcare
dc.title: Automatic emotion perception using eye movement information for E-Healthcare systems.
dc.type: Article
dc.contributor.department: Anhui University
dc.contributor.department: University of Derby
dc.identifier.journal: Sensors
refterms.dateFOA: 2019-02-28T17:36:18Z


Files in this item

Name: sensors-18-02826-v2.pdf
Size: 4.294 MB
Format: PDF
