Oxford University Press's
Academic Insights for the Thinking World

Replication in international relations

The integrity of science is threatened in many ways – by direct censorship; by commercial, political, or military secrecy; by various forms of publication bias; by exorbitant journal subscription fees that effectively deny access to the general public; by cheating and falsification of results; and by sloppiness in the research or editorial process prior to publication. There is no single antidote to all these problems, but transparency goes a long way toward addressing many of them.

One important way to promote transparency and quality control in published research is to require that systematic data be made available for replication studies.

In political science, the replication idea received a major impetus some twenty years ago in a widely cited and debated article by Gary King. In 2002, editors of international relations journals joined forces at a panel at the annual convention of the International Studies Association and signed a joint statement that pledged the journals to a set of minimum replication standards and urged other editors to follow their lead. The statement recommended the release not just of all data used, but also of a codebook and program files that would make it possible to reproduce the published results. All files were to be submitted electronically for posting on a website maintained by the journal for that purpose, though they could of course also be posted elsewhere.

Although these and other journals did post substantial numbers of datasets, several studies have shown that journals in political science and international relations have been slow to implement satisfactory replication practices. Many leading journals lack a clear replication policy, and some that have one lag behind in implementing it. In many cases, scholars, left to their own devices, post the data on their own websites. Such links are often unstable and are not updated when the owner moves to another institution. In some cases, the data are updated and replication becomes impossible because the original dataset is not preserved. Even the four political science and international relations journals that signed the 2003 declaration have holes in their replication files.
Nevertheless, the field seems to be moving forward, even if slowly. The International Studies Association subscribes to ‘transparency of scholarship and cumulation of knowledge.’ The lead research journal of the association, International Studies Quarterly, maintains an archive of replication data on its website. Many of the datasets are hosted at the Harvard Dataverse. With nearly 64,000 datasets comprising over 360,000 files, Dataverse is a treasure trove for scholars and provides more stability than individual homepages.

The main beneficiary of making replication material generally available is the scholarly community as a whole. Posting data provides an incentive for greater accuracy in data collection and analysis, and promises to increase the quality of published results. But there are incentives for individual scholars as well. Those who download the data may be able to use them for independent studies, building on the earlier work. As for the original authors, several studies have shown that publications with replication data gather more citations on average. This also provides an incentive for journals to pursue active replication policies. Moreover, the increasing availability of replication data creates opportunities for teaching: students can replicate published studies as part of their methods training, and may even ease their way into academic publishing when they fail to replicate extant work or discover fruitful extensions of it.

A few journals have moved beyond requiring authors to post their data upon publication of the article. They ask for the data to be submitted before final acceptance. This permits the editorial office to run a ‘preplication’ test, to see whether the empirical results reported in an article can be duplicated. If they cannot, the author needs to do some more work. In the earlier ISA symposium, Bruce Bueno de Mesquita had proposed an even more radical policy: making the dataset available to the referees of the article. Both authors and referees may want to give priority to journals that have adopted a replication policy.

In principle, the calls for replication in the social sciences have also included articles with a non-quantitative orientation. Author instructions for some journals encourage authors of articles without quantitative data to use replication sites for additional documentation not included in the article itself, such as interview guides and transcripts. To date, few authors seem to have taken advantage of this option.

Advancing replication policies and practices provides an important tool for making academic work more transparent and reliable.

Feature image credit: Computer Keyboard by SteveRaubenstine. CC0 Public Domain via Pixabay.
