Policy Report

Evaluating Gender Bias in AI Applications Using Household Survey Data

Date Published

Oct 24, 2023

Authors

Seth Goodman, Katherine Nolan, Rachel Sayers, Ariel BenYishay, Jacob Hall, Mavis Zupork Dome, Edem Selormey

Citation

Goodman, S., Nolan, K., Sayers, R., BenYishay, A., Hall, J., Zupork Dome, M., and Selormey, E. 2023. Evaluating Gender Bias in AI Applications Using Household Survey Data. Williamsburg, VA and Accra, Ghana: AidData at William & Mary and Ghana Center for Democratic Development.

Abstract

In this study, we introduce an approach for evaluating the role of gender in wealth estimates generated using artificial intelligence (AI), geospatial data, and USAID's Demographic and Health Surveys (DHS) data. In partnership with the Ghana Center for Democratic Development, we use the 2014 round of the Ghana DHS to compare the performance of models trained on gender-specific household data. Findings indicate that even after controlling for sample size, models trained and validated on male household data outperform models trained and validated on female household data. Alongside the Technical Report detailing the methodology and findings, the associated code and data are available to facilitate future use. The Executive Summary highlights key lessons from this study, including the importance of local partners when developing AI applications. This report and associated work were produced as part of USAID's Equitable AI Challenge and aim to elevate discussions of equity surrounding the growing use of AI in development.
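The comparison described above can be sketched in code. The snippet below is a minimal, hypothetical illustration only: it uses synthetic stand-in data rather than the DHS survey, ordinary least squares rather than the study's actual models, and invented feature and sample-size choices. What it shows is the core evaluation logic the abstract describes: subsample each gender-specific group to the same size, train a wealth-prediction model per group, and compare held-out performance (R²).

```python
# Hypothetical sketch of the gender-disaggregated model comparison.
# All data, features, and sample sizes here are illustrative stand-ins,
# NOT the study's actual data or methodology.
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, noise):
    # Synthetic stand-in for geospatial features mapped to a wealth index.
    X = rng.normal(size=(n, 3))
    w = np.array([0.8, -0.5, 0.3])
    y = X @ w + rng.normal(scale=noise, size=n)
    return X, y

def r2_on_equal_subsample(X, y, n_train, n_test):
    # Subsample each group to the same fixed size so that performance
    # differences are not driven by differing sample sizes.
    idx = rng.permutation(len(y))[: n_train + n_test]
    tr, te = idx[:n_train], idx[n_train:]
    # Ordinary least squares fit on the training subsample.
    coef, *_ = np.linalg.lstsq(X[tr], y[tr], rcond=None)
    pred = X[te] @ coef
    ss_res = np.sum((y[te] - pred) ** 2)
    ss_tot = np.sum((y[te] - y[te].mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Two hypothetical gender-specific household groups.
X_male, y_male = make_group(2000, noise=0.5)
X_female, y_female = make_group(2000, noise=0.5)

r2_male = r2_on_equal_subsample(X_male, y_male, n_train=800, n_test=200)
r2_female = r2_on_equal_subsample(X_female, y_female, n_train=800, n_test=200)
print(f"male-household model R2:   {r2_male:.3f}")
print(f"female-household model R2: {r2_female:.3f}")
```

In the study itself, a gap between the two scores that persists after the equal-size subsampling is the signal of interest; the synthetic groups here are generated identically, so any difference is noise.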



Featured Authors

Seth Goodman
Research Scientist, Research & Evaluation

Katherine Nolan
Research Scientist, Research & Evaluation

Rachel Sayers
Research Scientist, Research & Evaluation

Ariel BenYishay
Chief Economist, Director of Research and Evaluation

Jacob Hall
Data Analyst, Research & Evaluation
