AidData was recently announced as one of the five winners of USAID’s Equitable AI Challenge. Dozens of competitors submitted proposals to the challenge—implemented through DAI’s Digital Frontiers—which was designed to fund approaches that will increase the accountability and transparency of AI systems used in global development contexts.
Under the year-long award, AidData will partner with the Ghana Center for Democratic Development (CDD-Ghana) to explore gender bias in AI applications that estimate poverty from household survey data. This project adds to AidData’s broader research initiative on gender equity in development, as well as its ongoing machine learning work. The $133,000 award will run through October 2023 and is led by co-principal investigators Seth Goodman, Katherine Nolan, and Rachel Sayers, research scientists at AidData.
“We’re thrilled to have been selected along with our partner CDD-Ghana as one of the winners of USAID’s Equitable AI Challenge,” said Nolan. “This funding provides a valuable opportunity to innovate new tools and methods incorporating AI to reduce gender bias in development and improve the measurement of important gender-related development outcomes.”
AidData, in partnership with CDD-Ghana, aims to evaluate the impact of gender bias on poverty estimates generated using AI and USAID’s Demographic and Health Surveys (DHS) data in order to inform AI developers and researchers, development organizations, and decision makers who produce or use poverty estimates. The project leverages AidData’s expertise in artificial intelligence, geospatial data, and household surveys, together with CDD-Ghana’s knowledge of the local context, to produce a novel public good that will elevate discussions of equity surrounding the growing use of AI in development.
“Poverty estimates created using AI models trained with household survey data are increasingly used in research, evaluation, and decision-making applications. We want to better understand how gender bias may be affecting these models, and provide a practical means for others to evaluate bias in their own AI applications,” said Sayers.
The grant will produce an assessment of the impact of gender bias on poverty estimates for Ghana as an initial case study, along with an open-source repository of code and methods for replicating the assessment in additional countries. The case study and code will be shared with key organizations in Ghana to explore current uses of AI, poverty estimates, and concerns around gender bias. Ultimately, the resources produced will encourage deeper consideration of potential bias in the AI models and data used for development applications and provide a practical means for others to evaluate bias in their own work.
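The contents of that repository have not been published, but the core idea of such a bias assessment is to disaggregate a model's prediction error by gender. As a purely illustrative sketch, with synthetic data, hypothetical column names, and a deliberately simulated bias standing in for a real poverty model, a minimal version of that check might look like this:

```python
# Illustrative sketch only: the data, field names, and simulated bias below are
# hypothetical and are not drawn from the AidData/CDD-Ghana project.
import random

random.seed(42)

def make_households(n=1000):
    """Generate synthetic household records with a simulated prediction bias."""
    rows = []
    for _ in range(n):
        gender = random.choice(["female", "male"])
        true_wealth = random.uniform(0.0, 1.0)
        # Simulate a model that systematically underestimates wealth
        # for female-headed households.
        bias = -0.05 if gender == "female" else 0.0
        predicted = true_wealth + bias + random.gauss(0.0, 0.05)
        rows.append({"gender": gender, "true": true_wealth, "pred": predicted})
    return rows

def mean_error_by_group(rows):
    """Mean signed prediction error, disaggregated by household-head gender."""
    totals, counts = {}, {}
    for r in rows:
        g = r["gender"]
        totals[g] = totals.get(g, 0.0) + (r["pred"] - r["true"])
        counts[g] = counts.get(g, 0) + 1
    return {g: totals[g] / counts[g] for g in totals}

errors = mean_error_by_group(make_households())
gap = errors["female"] - errors["male"]
print(f"mean error by group: {errors}")
print(f"female-male error gap: {gap:.3f}")  # a large gap flags group-level bias
```

A real assessment would replace the synthetic records with held-out survey data and actual model predictions, but the disaggregated-error comparison is the same basic move.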
This work comes on the heels of a larger effort to bring AidData’s geospatial impact evaluations (GIEs) to scale and better understand how gender differences and power dynamics drive measurement error in the household surveys that feed into impact evaluations. Last summer, three researchers who lead AidData’s Gender Equity in Development Initiative—Rachel Sayers, Jessica Wells, and Katherine Nolan—traveled to field sites in Ghana for an evaluation of a national dam-building program and worked to pilot a new methodology to better disaggregate geospatial datasets by gender. Funded through a one-year grant from the William & Flora Hewlett Foundation, the team also received an add-on grant from Innovations for Poverty Action (IPA), with the specific goal of identifying measurement error in self-reported agricultural characteristics through the collection of gender-disaggregated geospatial data.
Over the next year, AidData and the other awardees of the Equitable AI Challenge will work with USAID and its partners to implement their approaches and generate new technical knowledge, lessons learned, and tested solutions for addressing gender bias in AI tools. Through this implementation phase, USAID seeks to foster a diverse and more inclusive digital ecosystem in which all communities can benefit from emerging technologies like AI and—most importantly—no members of these communities are harmed by them. This effort will inform USAID and the development community, providing a greater understanding of AI fairness tools and approaches, what they capture and what they leave out, and what tactics are needed to update, adapt, and socialize these tools for broader use.