Troubling retractions
Peer review failures by top science journals may undermine trust in medical science
The Lancet and The New England Journal of Medicine—among the world’s most prestigious science journals—each retracted a headlining study about COVID-19 on Thursday. The implications extend beyond those journals, though, and threaten to erode public trust in medical research at a time when it is most needed.
The Lancet’s study, the better-known of the two, claimed to have analyzed over 96,000 COVID-19 patients from 671 hospitals. On the strength of this seeming mountain of data, the authors concluded that the drugs hydroxychloroquine and chloroquine did not help patients and in fact often harmed them. The study was published May 22, and its conclusions affected other studies: When researchers learn that a treatment they are testing may be harming patients, they are ethically obligated to stop.
Thus, the World Health Organization’s “Solidarity” trial, proceeding in over 100 countries, paused its hydroxychloroquine arm. The ASCOT trial, seeking to recruit 2,500 patients in Australia and Asia, also paused.
But soon, an impromptu online peer-review effort sprang up around the Lancet study, identifying major red flags: For example, the study claimed the prevalence of smoking was almost uniform at about 10 percent worldwide. (World Health Organization data show wide differences, with 13.9 percent of Africans smoking tobacco, 28.7 percent of Europeans, and so forth.) The Lancet study also had only four authors, a number more often associated with small studies at a single institution than with massive studies analyzing tens of thousands of patients.
Surgisphere Corp., a company founded by study co-author Sapan Desai, claimed to have the data, yet said it was prohibited by “data sharing agreements” from allowing anyone else to evaluate or verify the analysis—an extremely unusual way of handling data for groundbreaking research. Even more basic questions remained unanswered: For one, how could a tiny company persuade hundreds of hospitals on six continents to share patient data?
Hundreds of researchers signed an open letter, posted online May 28, that questioned the Lancet study. On Wednesday the research journal published an “expression of concern” alluding to “important scientific questions [that] have been raised.” Then, the next day, three of the four study co-authors asked the journal to retract the study.
How did it get published at all? Lancet editor Richard Horton has published editorials criticizing President Donald Trump, who has promoted the use of hydroxychloroquine against COVID-19. But even if political leanings influenced his decision to accept the Surgisphere study, politics can’t explain why The New England Journal of Medicine also accepted a recent Surgisphere study about less controversial medications. That study, published online May 1, analyzed whether two common classes of blood-pressure medication increased health risks for patients with COVID-19 and concluded they did not. This week, as critics questioned Surgisphere’s reliability, the NEJM announced its own “expression of concern.” On Thursday it retracted the article.
Even that was not the end of Surgisphere’s impact: A third journal posted a preprint article online (since deleted, though an archived copy remains) relying on Surgisphere’s data and endorsing the use of ivermectin—a treatment for worms and other parasites—against the coronavirus.
All this matters, not simply because the public needs to be able to trust the research behind its medical care, but also because this bad research stopped better research from proceeding. The World Health Organization trial and the ASCOT trial have resumed, but will patients be willing to join? Or will they view the Surgisphere data as reason to stay away? And what of the patients in South America now turning to ivermectin on the basis of a study that has since been deleted?
Above all, how did a tiny startup’s hand-waving explanation fool the two most influential medical journals in the world—and a third journal for good measure—at the same time? Did Surgisphere’s talk of “machine learning” and “actionable data insights” dazzle the journal editors into ignoring the red flags?
Questions outnumber answers right now. Those answers will be crucial to reestablishing public trust in medical research.