How Tough is AidData's TUFF Methodology? New Evidence from Uganda and South Africa
Initial field research results revealed general agreement between the data on development projects generated via TUFF and ground-truthing fieldwork.
It is widely believed that non-Western governments are supplying large and increasing amounts of development finance to developing countries. However, it has been difficult to assess the composition and impact of this funding, since many emerging donors do not participate in international regimes such as the OECD's Creditor Reporting System or the International Aid Transparency Initiative. An innovative, open source data collection methodology, now verified through field-based research, may turn this conundrum on its head.
In April 2013, AidData published the world’s most comprehensive database of project-level Chinese development activities in Africa. Since then, AidData has refined its methodology to remotely collect and synthesize data from media reports, government documents and databases, Chinese embassy websites, NGO reports, and scholarly articles. An updated version of the database and methodology – renamed Tracking Under-Reported Financial Flows (TUFF) – to be released tomorrow will capture over 1,900 development finance projects from 2000 to 2012. Non-investment projects in the database total over $83 billion. The database includes 268 pledged projects; however, the reported $83 billion total captures only official commitments, including projects in implementation and completed projects.
Some scholars have argued that verifying open source data through expert field-based research is the preferred approach to gathering accurate data on Chinese development finance. To test this assertion, a team of researchers from the University of Cape Town, Zhejiang University, the College of William & Mary, and Brigham Young University undertook a five-month field study in 2013 to gauge the accuracy of our TUFF methodology. With support from the United Nations University World Institute for Development Economics Research, we trained enumerators to visit development project locations that were remotely identified by the TUFF methodology.
This “ground-truthing” methodology – the direct observation of project sites and infrastructure, as well as direct interaction with project stakeholders – follows a step-by-step procedure that is systematic, transparent, and replicable. For the pilot, we trained 19 professional enumerators in Uganda and four in South Africa to visit local project sites. While on location, enumerators conducted interviews with officials accountable for the projects, which enabled them to construct a more accurate, granular narrative for each project.
The new release of the dataset allows users to upload multimedia elements to individual project pages like the photo above of a platinum mine project in South Africa.
Following the field research, we examined the extent to which these on-site and remote data collection methods complement or contradict each other. Initial results revealed general agreement between the data on development projects generated via TUFF and ground-truthing fieldwork. Ground-truthing also proved a helpful complement to the TUFF data collection approach in several respects. First, ground-truthing carries less risk of data “contamination” because information flows directly from the project site to the researcher, rather than being interpreted by a third party. Second, ground-truthing provides first-hand information that requires the researcher to make fewer judgment calls than when faced with conflicting reports collected remotely from different sources. Third, ground-truthing can uncover projects in communities with low levels of press freedom or little coverage from media outlets.
Nonetheless, ground-truthing can be quite time- and resource-intensive. For example, the Uganda pilot required approximately 1,500 total researcher hours to ground-truth 42 projects, and enumerators ultimately succeeded in interviewing a relevant official for only half of the projects they attempted. Enumerators were more likely to fail to ground-truth smaller projects that had occurred several years before the visits. Generally speaking, ground-truthing is also only well suited for gathering information on projects with a physical location, rather than aid distributed nationally or given directly to the government.
Ultimately, data collected through ground-truthing can add precision and detail to project records identified through remote data collection methods, such as TUFF. There is a clear tradeoff, however, between the informational richness acquired through field-based data collection and the comprehensiveness of data collected through open source data collection methods.
For more information about our ground-truthing methodology and the results of our study, read the paper now available here. Stay tuned for the release of the updated Tracking Chinese Development Finance in Africa dataset and methodology tomorrow on The First Tranche.
*Post updated to reflect current financial information available at china.aiddata.org.
Edwin Muchapondwa is the Director of the School of Economics at the University of Cape Town. Austin Strange is a graduate student at Zhejiang University and an AidData Research Associate. Daniel Nielson is an Associate Professor of Political Science at Brigham Young University and AidData’s Chief Social Scientist. Michael Tierney is Director of the Institute for the Theory and Practice of International Relations at the College of William & Mary. Brad Parks is AidData’s Co-Executive Director at the College of William & Mary.