Simple item record

dc.contributor.author: Afendi Tengku Mohd
dc.contributor.author: Kurugollu, Fatih
dc.contributor.author: Crookes, Danny
dc.contributor.author: Bouridane, Ahmed
dc.contributor.author: Farid, Mohsen
dc.date.accessioned: 2019-03-18T13:12:43Z
dc.date.available: 2019-03-18T13:12:43Z
dc.date.issued: 2018-09-17
dc.identifier.citation: Zulcaffle, T.M.A., Kurugollu, F., Crookes, D., Bouridane, A. and Farid, M., (2019). 'Frontal View Gait Recognition With Fusion of Depth Features From a Time of Flight Camera'. IEEE Transactions on Information Forensics and Security, 14(4), pp. 1067-1082. DOI: 10.1109/TIFS.2018.2870594
dc.identifier.issn: 1556-6013
dc.identifier.doi: 10.1109/TIFS.2018.2870594
dc.identifier.uri: http://hdl.handle.net/10545/623612
dc.description.abstract: Frontal view gait recognition for people identification has been carried out using single RGB, stereo RGB, Kinect 1.0 and Doppler radar. However, existing methods based on these camera technologies suffer from several problems. Therefore, we propose a four-part method for frontal view gait recognition based on fusion of multiple features acquired from a Time of Flight (ToF) camera. We have developed a gait data set captured by a ToF camera. The data set includes two sessions recorded seven months apart, with 46 and 33 subjects respectively, each with six walks with five covariates. The four-part method includes: a new human silhouette extraction algorithm that reduces the multiple reflection problem experienced by ToF cameras; a frame selection method based on a new gait cycle detection algorithm; four new gait image representations; and a novel fusion classifier. Rigorous experiments are carried out to compare the proposed method with state-of-the-art methods. The results show distinct improvements over recognition rates for all covariates. The proposed method outperforms all major existing approaches for all covariates and results in 66.1% and 81.0% Rank 1 and Rank 5 recognition rates respectively in overall covariates, compared with a best state-of-the-art method performance of 35.7% and 57.7%.
dc.description.sponsorship: N/A
dc.language.iso: en
dc.publisher: IEEE
dc.relation.url: https://ieeexplore.ieee.org/abstract/document/8466800
dc.rights: © 20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.subject: Gait Recognition
dc.subject: Frontal View
dc.subject: Time of Flight Camera
dc.subject: Fusion of features
dc.subject: depth gait data set
dc.title: Frontal view gait recognition with fusion of depth features from a time of flight camera
dc.type: Article
dc.contributor.department: Queen's University, Belfast
dc.contributor.department: University of Derby
dc.contributor.department: Northumbria University
dc.identifier.journal: IEEE Transactions on Information Forensics and Security
dcterms.dateAccepted: 2018-09-02
refterms.dateFOA: 2019-03-18T13:12:43Z
dc.author.detail: 785317


Files in this item

Name: ieeetifsReview2D finalDannyChe ...
Size: 1.157 MB
Format: PDF
Description: pre-print
