Statistical infarction: A postmortem of the Cornell Food and Brand Lab pizza publications

Jordan Anaya: omnesres.com, Charlottesville, United States
Tim van der Zee: Graduate School of Teaching (ICLON), Leiden University, Leiden, Netherlands
Nicholas Brown: University Medical Center Groningen, University of Groningen, Groningen, Netherlands
DOI
10.7287/peerj.preprints.3025v1
Subject Areas
Ethical Issues, Science Policy, Statistics
Keywords
Statistics, Replication, Reproducibility, Reanalysis
Copyright
© 2017 Anaya et al.
Licence
This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ Preprints) and either DOI or URL of the article must be cited.
Cite this article
Anaya J, van der Zee T, Brown N. 2017. Statistical infarction: A postmortem of the Cornell Food and Brand Lab pizza publications. PeerJ Preprints 5:e3025v1

Abstract

We previously reported over 150 inconsistencies in a series of four articles (the "pizza papers") from the Cornell Food and Brand Lab describing a study of eating habits at an all-you-can-eat pizza buffet. The lab's initial response led us to examine more of its work, and our investigation has now identified issues in at least 45 of the lab's publications. Perhaps because of the growing media attention, Cornell and the lab released a statement concerning the pizza papers that includes a response to the inconsistencies, along with data and code. Many of the inconsistencies were identified with the new technique of granularity testing, and this case has the highest density of granularity inconsistencies we are aware of. It is also the first time a data set has been made public after granularity concerns were raised, which makes it an ideal case study for demonstrating the accuracy and potential of this technique. A third-party audit of the lab's response is also important, given the continuing misconduct investigation and the further reports and data releases that will presumably follow. Our careful inspection of the data set found no evidence of fabrication, but we found the lab's report confusing, incomplete, and error-prone. In addition, the number of missing, unusual, and logically impossible responses in the data set is highly concerning. Given the unsound theory, poor methodology, questionable data, and countless errors, we find it remarkable that these articles were published at all, and we recommend that all four be retracted.
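
For context, granularity testing (for example, the GRIM test) checks whether a reported summary statistic is arithmetically possible given the sample size and the granularity of the underlying responses: a mean of n integer-valued responses must equal some whole-number sum divided by n. The following minimal Python sketch illustrates the idea for means reported to a fixed number of decimals; the function name, tolerance handling, and example values are our own illustrative assumptions, not the authors' code or data.

    import math

    def mean_is_granularity_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
        """Illustrative granularity (GRIM-style) check: can a mean reported to
        `decimals` places arise from n integer-valued responses?  Any achievable
        mean must be an integer sum divided by n."""
        tol = 0.5 * 10 ** (-decimals)               # half a unit in the last reported digit
        lo = math.floor((reported_mean - tol) * n)  # smallest candidate integer sum
        hi = math.ceil((reported_mean + tol) * n)   # largest candidate integer sum
        # Accept if any candidate sum reproduces the reported mean after rounding.
        # (A full analysis would also consider the rounding rule used by the source.)
        return any(abs(round(k / n, decimals) - reported_mean) < 1e-9
                   for k in range(lo, hi + 1))

    # Hypothetical example: with n = 25 integer responses, a reported mean of 3.48
    # is achievable (87 / 25), but 3.47 is not (the nearest achievable means are
    # 86 / 25 = 3.44 and 87 / 25 = 3.48).
    print(mean_is_granularity_consistent(3.48, 25))  # True
    print(mean_is_granularity_consistent(3.47, 25))  # False

A check of this kind can only show that a reported statistic is impossible, not that a possible one is correct; the release of the underlying data set is what allows the flagged values to be verified directly in this case.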

Author Comment

This is a preprint submission to PeerJ Preprints