Making data accessible: lessons learned from computational reproducibility of impact evaluations
Harvard Dataverse (Africa Rice Center, Bioversity International, CCAFS, CIAT, IFPRI, IRRI and WorldFish)
Title | Making data accessible: lessons learned from computational reproducibility of impact evaluations
Identifier | https://doi.org/10.7910/DVN/FPNITS
Creator | Goel, Neeta; Khatua, Sayak; Gaarder, Marie
Publisher | Harvard Dataverse
Description | The study assesses the reusability of impact evaluation data by verifying the results presented in published 3ie reports. To verify results, we conduct push button replications on the original data and code submitted by the authors, using the push button replication protocol developed at 3ie to determine how closely the replication results match the original findings. Our sample includes closed 3ie-funded impact evaluations commissioned between 2008 and 2018. Of the 74 studies in our sample, we successfully reproduced results from 38 studies (51%); 24 studies (32%) were categorized as incomplete, and 12 (16%) as having major differences. The cumulative replication rate rose to 51% in 2018, compared with below 40% in previous years. On average, replicating a single impact evaluation took about three hours. Evidence from impact evaluations is credible only when it is verifiable. Our findings suggest that greater attention is needed to ensure the reliability and reusability of evidence, and we recommend push button replications as a tested method for ascertaining the credibility of findings.
Subject | Social Sciences; open data; open research; replication; reproducibility; agriculture; economics; impact evaluations; transparency
Contributor | International Initiative for Impact Evaluation (3ie)