
dc.contributor.advisor: Rattani, Ajita
dc.contributor.author: Krishnan, Anoop
dc.contributor.author: Almadan, Ali
dc.date.accessioned: 2021-05-04T12:03:13Z
dc.date.available: 2021-05-04T12:03:13Z
dc.date.issued: 2021-04-02
dc.identifier.citation: Krishnan, A.; Almadan, A. 2021. Understanding the bias in Deep Learning based gender classification systems -- In Proceedings: 17th Annual Symposium on Graduate Research and Scholarly Projects. Wichita, KS: Wichita State University
dc.identifier.uri: https://soar.wichita.edu/handle/10057/19911
dc.description: Presented to the 17th Annual Symposium on Graduate Research and Scholarly Projects (GRASP) held online, Wichita State University, April 2, 2021.
dc.description: Research completed in the Department of Electrical Engineering and Computer Science, College of Engineering.
dc.description.abstract: Automated gender classification has important applications in many domains, such as demographic research, law enforcement, online advertising, and human-computer interaction. Prior research has questioned the fairness of this technology across gender and race. Specifically, the majority of studies have raised concerns about higher error rates of gender classification systems for dark-skinned people, such as African Americans, and for women. To date, however, most studies have been limited to African American and Caucasian subjects only. Because these classification systems are data driven, the bias can have two sources: either the systems lack a uniform data distribution across individual races (data imbalance), or it originates in the machine learning algorithm itself, which operates as a black box. We address data imbalance by using FairFace, currently the most demographically balanced dataset available, for this study. It contains face images of males and females (so our research is constrained to binary gender) belonging to seven races, evenly distributed across groups. The second problem, the algorithm, is what we investigate here: which classification algorithms can be used, and how different algorithms perform (architectural differences) when trained to classify face images by gender. We use deep learning based solutions, which represent the state of the art for classification, estimation, and related tasks. Finally, even in this era, social justice, human rights, and equality are violated. Since most current technology involves deep learning solutions, these systems should not violate equality, yet that is not the case. That is where the relevance of this study lies.
dc.description.sponsorship: Graduate School, Academic Affairs, University Libraries
dc.language.iso: en_US
dc.publisher: Wichita State University
dc.relation.ispartofseries: GRASP
dc.relation.ispartofseries: v. 17
dc.title: Understanding the bias in Deep Learning based gender classification systems
dc.type: Abstract
dc.rights.holder: Wichita State University
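
As a rough illustration of the kind of experiment the abstract describes, the sketch below fine-tunes a pretrained CNN for binary gender classification on FairFace-style labels and reports the error rate disaggregated by race. This is a minimal sketch, not the authors' implementation: the file paths, CSV column names ("file", "gender", "race"), the choice of ResNet-18, and all hyperparameters are assumptions made for illustration.

# Hypothetical sketch (assumed PyTorch/torchvision/pandas environment):
# train a binary gender classifier and compute per-race error rates.
import pandas as pd
import torch
import torch.nn as nn
from PIL import Image
from torch.utils.data import Dataset, DataLoader
from torchvision import models, transforms

class FaceGenderDataset(Dataset):
    """Assumes a FairFace-style CSV with 'file', 'gender', 'race' columns."""
    def __init__(self, csv_path, img_root, tfm):
        self.df = pd.read_csv(csv_path)
        self.img_root, self.tfm = img_root, tfm

    def __len__(self):
        return len(self.df)

    def __getitem__(self, i):
        row = self.df.iloc[i]
        img = Image.open(f"{self.img_root}/{row['file']}").convert("RGB")
        label = 1 if row["gender"] == "Female" else 0   # binary gender label
        return self.tfm(img), label, row["race"]

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
device = "cuda" if torch.cuda.is_available() else "cpu"

# ResNet-18 is only one of the architectures such a comparison might cover.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)           # two classes: male / female
model = model.to(device)

# Illustrative file names; the real FairFace label files may differ.
train_loader = DataLoader(FaceGenderDataset("fairface_label_train.csv", "fairface-imgs", tfm),
                          batch_size=64, shuffle=True)
val_loader = DataLoader(FaceGenderDataset("fairface_label_val.csv", "fairface-imgs", tfm),
                        batch_size=64)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(3):                                  # short fine-tuning run
    for x, y, _ in train_loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()

# Per-race error rate: the disaggregated metric a fairness study reports.
model.eval()
errors, counts = {}, {}
with torch.no_grad():
    for x, y, races in val_loader:
        preds = model(x.to(device)).argmax(dim=1).cpu()
        for p, t, r in zip(preds, y, races):
            counts[r] = counts.get(r, 0) + 1
            errors[r] = errors.get(r, 0) + int(p != t)

for r in sorted(counts):
    print(f"{r}: error rate = {errors[r] / counts[r]:.3f}")

Comparing these per-race error rates across several architectures (for example, swapping ResNet-18 for other backbones) is one way to probe the algorithmic side of the bias question raised in the abstract.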

