Fifty shades of health research
A version of this article was originally published in the University of Toronto’s Centre for Global Health Bulletin, March 2025 Newsletter. All changes are tracked in the version history on GitHub.
Summary
What counts as bad health research? There is no single answer. This article presents (not fifty but) three distinct cases of controversial biomedical, clinical, and “meta” research and introduces a broad audience to the topic of research waste. Research waste refers to the inefficient, ineffective, or unintentionally harmful use of resources (e.g., funding, researchers’ time, participants’ time and burden, or animals) at any level of the health research ecosystem.
Case 1: Landmark paper retracted over suspected image manipulation
In June 2024, a scientific paper published 18 years earlier was retracted (Lesné et al., 2006). It is now considered one of the most influential papers ever retracted. After concerns were raised online about the trustworthiness of the figures on which its conclusions were based, a public discussion and an investigation by the journal ensued, and the authors ultimately retracted the paper.
This case was highly publicized by Science and other media. The paper itself was cited over 2,300 times, including more than 10 citations after the retraction note was published. One of the reasons for the study’s popularity is its landmark role in Alzheimer’s disease research: after years of failure to develop an effective Alzheimer’s medication, this study provided experimental evidence of a promising drug target. Its retraction cast a slight shadow on subsequent research and translation efforts.
Case 2: Five hundred excess deaths among patients not offered an effective drug
“Of more than 2000 redundant clinical trials on statins in patients with coronary artery disease… an extra 3000 [major adverse cardiac events], including nearly 600 deaths, were experienced by participants not treated with statins in these trials.”
This is the conclusion reached after investigating the aftermath of a clinical cardiology practice guideline introduced in China in 2007. The guideline included a strong recommendation, based on high-quality evidence, to administer a statin to patients with two common heart conditions: stable angina pectoris or acute coronary syndrome. The investigators therefore argued that any new trials conducted after that point (plus a one-year lag to account for guideline adoption) should be considered redundant, and 3470 people harmed.
The results of this investigation were published in the reputable journal The BMJ in 2021 (Jia et al., 2021). Almost four years after publication, the article has not been widely publicized: its Altmetric profile shows that it was picked up by only two news outlets, one blog, and 53 social media posts, and it has received only 18 academic citations.
Case 3: The false promise of meta-research
A recently published review of methods for assessing research “waste” is not among the 18 papers that cite the statin trials study. Research waste covers practices that are known to be controversial but are not broadly condemned as questionable (unacceptable) or as alleged misconduct (Rosengaard et al., 2024a). Upon examination, the statin trials study turned out to have been missed because of a slight mismatch between the wording of its abstract and the review’s search strategy, despite the rigorous design and state-of-the-art methods employed by the review authors (Rosengaard et al., 2024b).
Evidence syntheses of health research are designed to be “the way that academics bring together knowledge from across multiple studies into a whole, to present the state of current understanding about a given area” (Thomas, 2024). There are multiple examples of how the field does not fully deliver on this promise. A recent study found that 78% of systematic reviews do not have a reproducible search strategy, considered a key feature of this kind of meta-research (Rethlefsen et al., 2024). Another estimate contends that 97% of systematic reviews either lack adequate methods or are clinically useless (Ioannidis, 2016). There is not even unanimity on the preferred terminology for meta-research itself, let alone on how to collaborate efficiently (Puljak, 2019).
AWARE of research waste
This article presented three distinct cases of health research that can, ostensibly, be called “bad.” The cases range from the darkest shade (Case 1: an article retracted over alleged misconduct), through a clearly declared gray zone (Case 2: research explicitly labeled redundant in a meta-research study), to an area genuinely difficult to place on a black-and-white spectrum (Case 3: challenges integral to meta-research itself).
The difference, however, seems to lie primarily in the degree of community consensus around the label (misconduct vs. waste vs. “challenges”), not necessarily in the impacts of the research itself. For example, the largely unnoticed Case 2 provides evidence of well-documented, and drastic, negative health outcomes, whereas for the widely condemned and retracted Case 1, the direct health impacts are more elusive.
A broader scholarly debate questions whether a research work should be evaluated based on its “quality” in and of itself or based on its “impacts” on the world. A notable discussion arose around a 2022 initiative called the Coalition for Advancing Research Assessment (CoARA), which calls for wider adoption of peer review in research assessment, in opposition to publication-based metrics. The initiative received strong pushback from the president of the International Society for Scientometrics and Informetrics (ISSI) (Abramo, 2024).
It is clear that no unanimous position exists in the academic community on what exactly constitutes bad health research. Greater clarity about what we mean by research waste, and how we measure it, is urgently needed to develop evidence-based strategies for easing its insidious pressure on health care decision-makers, providers, and patients globally.
Our project, Avoidable WAste in health REsearch (AWARE), aims to look more closely into this.
Acknowledgments
I would like to thank Erica Di Ruggiero and Michelle Christian for their helpful feedback on an earlier version of this article, the Scientometrics Centre at the HSE University for their informative news feed, my supervisor Andrea Tricco for the opportunities to study this topic, and the many other colleagues who shared their insights about research waste over the past 2 years.
References
Abramo, G. (2024). The forced battle between peer-review and scientometric research assessment: Why the CoARA initiative is unsound. Research Evaluation, rvae021. https://doi.org/10.1093/reseval/rvae021
Ioannidis, J. P. A. (2016). The mass production of redundant, misleading, and conflicted systematic reviews and meta‐analyses. The Milbank Quarterly, 94(3), 485–514. https://doi.org/10.1111/1468-0009.12210
Jia, Y., Wen, J., Qureshi, R., Ehrhardt, S., Celentano, D. D., Wei, X., Rosman, L., Wen, Y., & Robinson, K. A. (2021). Effect of redundant clinical trials from mainland China evaluating statins in patients with coronary artery disease: Cross sectional study. BMJ, n48. https://doi.org/10.1136/bmj.n48
Lesné, S., Koh, M. T., Kotilinek, L., Kayed, R., Glabe, C. G., Yang, A., Gallagher, M., & Ashe, K. H. (2006). RETRACTED ARTICLE: A specific amyloid-β protein assembly in the brain impairs memory. Nature, 440(7082), 352–357. https://doi.org/10.1038/nature04533
Puljak, L. (2019). Methodological research: Open questions, the need for ‘research on research’ and its implications for evidence-based health care and reducing research waste. International Journal of Evidence-Based Healthcare, 17(3), 145–146. https://doi.org/10.1097/XEB.0000000000000201
Rethlefsen, M. L., Brigham, T. J., Price, C., Moher, D., Bouter, L. M., Kirkham, J. J., Schroter, S., & Zeegers, M. P. (2024). Systematic review search strategies are poorly reported and not reproducible: A cross-sectional metaresearch study. Journal of Clinical Epidemiology, 166, 111229. https://doi.org/10.1016/j.jclinepi.2023.111229
Rosengaard, L. O., Andersen, M. Z., Rosenberg, J., & Fonnes, S. (2024a). Five aspects of research waste in biomedicine: A scoping review. Journal of Evidence-Based Medicine, jebm.12616. https://doi.org/10.1111/jebm.12616
Rosengaard, L. O., Andersen, M. Z., Rosenberg, J., & Fonnes, S. (2024b). Several methods for assessing research waste in reviews with a systematic search: A scoping review. PeerJ, 12, e18466. https://doi.org/10.7717/peerj.18466
Thomas, J. (2024). Methods development in evidence synthesis: A dialogue between science and society. In A. Oancea, G. E. Derrick, N. Nuseibeh, & X. Xu (Eds.), Handbook of Meta-Research (pp. 146–158). Edward Elgar Publishing. https://doi.org/10.4337/9781839105722.00020