Moreover, the second bias in study selection points in the opposite direction: the RP:CB only replicated studies that reported enough methodological detail, or whose authors were cooperative enough to fill in the missing information. This means that, out of all the most-cited papers, the authors arguably attempted to reproduce only the most rigorous ones. Indeed, in an unrelated project in psychology, Wicherts and colleagues found preliminary evidence that original authors were less likely to respond to follow-up emails when their articles contained more errors and weaker evidence, even though the follow-up emails made no reference to these errors (Wicherts et al., 2011). The replicability of the most influential papers in cancer biology is therefore likely even worse than the dismal 46% replicability rate makes it appear. All in all, these results point towards the need for major reforms in cancer biology.
References
Arrowsmith, J. (2012). A decade of change. Nature Reviews Drug Discovery, 11(1), 17–18. https://doi.org/10.1038/nrd3630
Begley, C. G., & Ellis, L. M. (2012). Drug development: Raise standards for preclinical cancer research. Nature, 483(7391), 531–533. https://doi.org/10.1038/483531a
Camerer, C. F., Dreber, A., Forsell, E., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Almenberg, J., Altmejd, A., Chan, T., Heikensten, E., Holzmeister, F., Imai, T., Isaksson, S., Nave, G., Pfeiffer, T., Razen, M., & Wu, H. (2016). Evaluating replicability of laboratory experiments in economics. Science, aaf0918. https://doi.org/10.1126/science.aaf0918
Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Nave, G., Nosek, B. A., Pfeiffer, T., Altmejd, A., Buttrick, N., Chan, T., Chen, Y., Forsell, E., Gampa, A., Heikensten, E., Hummer, L., Imai, T., … Wu, H. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2(9), 637–644. https://doi.org/10.1038/s41562-018-0399-z
Cova, F., Strickland, B., Abatista, A., Allard, A., Andow, J., Attie, M., Beebe, J., Berniūnas, R., Boudesseul, J., Colombo, M., Cushman, F., Diaz, R., N’Djaye Nikolai van Dongen, N., Dranseika, V., Earp, B. D., Torres, A. G., Hannikainen, I., Hernández-Conde, J. V., Hu, W., … Zhou, X. (2018). Estimating the Reproducibility of Experimental Philosophy. Review of Philosophy and Psychology. https://doi.org/10.1007/s13164-018-0400-9
Errington, T. M., Denis, A., Perfito, N., Iorns, E., & Nosek, B. A. (2021). Challenges for assessing replicability in preclinical cancer biology. ELife, 10, e67995. https://doi.org/10.7554/eLife.67995
Errington, T. M., Mathur, M., Soderberg, C. K., Denis, A., Perfito, N., Iorns, E., & Nosek, B. A. (2021). Investigating the replicability of preclinical cancer biology. ELife, 10, e71601. https://doi.org/10.7554/eLife.71601
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716
Prinz, F., Schlange, T., & Asadullah, K. (2011). Believe it or not: How much can we rely on published data on potential drug targets? Nature Reviews Drug Discovery, 10(9), 712. https://doi.org/10.1038/nrd3439-c1
Serra-Garcia, M., & Gneezy, U. (2021). Nonreplicable publications are cited more than replicable ones. Science Advances, 7(21), eabd1705. https://doi.org/10.1126/sciadv.abd1705
Wicherts, J. M., Bakker, M., & Molenaar, D. (2011). Willingness to share research data is related to the strength of the evidence and the quality of reporting of statistical results. PLOS ONE, 6(11), e26828. https://doi.org/10.1371/journal.pone.0026828
Feature image author – @Pavel Danilyuk