
Please use this identifier to cite or link to this item: http://idr.iitbbs.ac.in/jspui/handle/2008/2158
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Bhattacharyya A. (en_US)
dc.contributor.author: Saini R. (en_US)
dc.contributor.author: Roy P.P. (en_US)
dc.contributor.author: Dogra D.P. (en_US)
dc.contributor.author: Kar S. (en_US)
dc.date.accessioned: 2020-01-16T05:55:36Z
dc.date.available: 2020-01-16T05:55:36Z
dc.date.issued: 2019
dc.identifier.citation: 1 (en_US)
dc.identifier.uri: http://dx.doi.org/10.1007/s00500-018-3446-9
dc.identifier.uri: http://10.10.32.48:8080/jspui/handle/2008/2158
dc.description.abstract: Recently, recognition of gender from facial images has gained considerable importance. A handful of research works focus on feature extraction to obtain gender-specific information from facial images. However, analyzing different facial regions and fusing their outputs helps in deciding the gender of a person from a facial image. In this paper, we propose a new approach to identify gender from frontal facial images that is robust to background, illumination, intensity, and facial expression. In our framework, the frontal face image is first divided into a number of distinct regions based on facial landmark points obtained by the Chehra model proposed by Asthana et al. The model provides 49 facial landmark points covering different regions of the face, e.g., forehead, left eye, right eye, and lips. Next, a face image is segmented into facial regions using the landmark points, and features are extracted from each region. The compass LBP feature, a variant of the LBP feature, has been used in our framework to obtain discriminative gender-specific information. Following this, a support vector machine-based classifier has been used to compute probability scores for each facial region. Finally, the classification scores obtained from individual regions are combined using genetic algorithm-based learning to improve the overall classification accuracy. Experiments have been performed on popular face image datasets such as Adience, cFERET (color FERET), and LFW, and on two sketch datasets, namely CUFS and CUFSF. Through these experiments, we have observed that the proposed method outperforms existing approaches. © 2018, Springer-Verlag GmbH Germany, part of Springer Nature. (en_US)
dc.language.iso: en (en_US)
dc.subject: Combination of facial regions (en_US)
dc.subject: Decision fusion (en_US)
dc.subject: Facial gender recognition (en_US)
dc.subject: Facial landmark detection (en_US)
dc.subject: Genetic algorithm (en_US)
dc.title: Recognizing gender from human facial regions using genetic algorithm (en_US)
dc.type: Article (en_US)
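The decision-fusion step described in the abstract, where per-region SVM probability scores are combined with weights learned by a genetic algorithm, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: the region names, toy scores, and GA parameters (population size, truncation selection, one-point crossover, Gaussian mutation) are all assumptions for the example.

```python
import random

# Hypothetical facial regions; the paper derives its regions from the
# 49 Chehra landmark points (forehead, eyes, lips, etc.).
REGIONS = ["forehead", "left_eye", "right_eye", "lips"]

# Toy per-region probability scores P(male) for a few faces, with labels
# (1 = male, 0 = female). In the paper these scores would come from
# per-region compass-LBP features fed to SVM classifiers.
SCORES = [
    ([0.9, 0.7, 0.8, 0.4], 1),
    ([0.2, 0.3, 0.1, 0.6], 0),
    ([0.8, 0.6, 0.9, 0.5], 1),
    ([0.3, 0.4, 0.2, 0.7], 0),
]

def fuse(weights, scores):
    """Weighted average of the region scores."""
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, scores)) / total

def accuracy(weights):
    """Fitness: fraction of faces classified correctly at a 0.5 threshold."""
    correct = 0
    for scores, label in SCORES:
        pred = 1 if fuse(weights, scores) >= 0.5 else 0
        correct += (pred == label)
    return correct / len(SCORES)

def evolve(pop_size=20, generations=30, mutation=0.1, seed=42):
    """Simple GA: truncation selection, one-point crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.random() + 1e-6 for _ in REGIONS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=accuracy, reverse=True)
        parents = pop[: pop_size // 2]            # keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(REGIONS))  # one-point crossover
            child = a[:cut] + b[cut:]
            # Gaussian mutation, clamped so every weight stays positive
            child = [max(1e-6, w + rng.gauss(0, mutation)) for w in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=accuracy)

if __name__ == "__main__":
    best = evolve()
    print("best weights:", [round(w, 3) for w in best])
    print("training accuracy:", accuracy(best))
```

Because the fitter half of each generation is carried over unchanged, the best fitness never decreases; a real implementation would evaluate the learned weights on a held-out set rather than on the training scores as done here.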
Appears in Collections: Research Publications

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.