On May 22, the medical journal The Lancet published a study aiming to determine how effective the anti-malarial drugs chloroquine and hydroxychloroquine are, in combination with certain anti-bacterial drugs, at treating hospitalized COVID-19 patients. The researchers concluded that “each of these drug regimens was associated with decreased in-hospital survival.” In other words, patients treated with the drugs were more likely to die than those who were not. The study was based on data supplied by Surgisphere, a medical data aggregation company that claims to have assembled a database of tens of thousands of COVID-19 patient records from hundreds of hospitals across the globe.
Almost immediately, other researchers began questioning the accuracy of the Surgisphere database and, by extension, the study’s finding that the anti-malarials are ineffective—or worse, dangerous—at treating hospitalized COVID-19 patients. For example, an open letter signed by scores of outside researchers points out that Surgisphere has not released the code or data used in The Lancet study, even though the journal is a signatory to the Wellcome open research data guidelines. The letter notes further possible problems with Surgisphere’s reported COVID-19 patient data: the data were not properly adjusted for confounders such as disease severity and the drug doses patients received, and the company appears to be reporting implausibly high numbers of COVID-19 patients in Australia and parts of Africa.
The researchers are particularly concerned because, in the wake of The Lancet study, several randomized, placebo-controlled clinical trials of the drugs that could more clearly establish the benefits or dangers of such treatments have been derailed.
As a result of this storm of criticism, the editors at The Lancet have issued “an Expression of Concern to alert readers to the fact that serious scientific questions have been brought to our attention” about the article. The editors further note that “an independent audit of the provenance and validity of the data has been commissioned by the authors not affiliated with Surgisphere and is ongoing, with results expected very shortly.” The journal has also posted a minor correction to the article, fixing the misclassification of some Australian data and adding a supplemental table.
Surgisphere has now attracted the attention of data sleuths, who are turning up oddities about the company, including that its purportedly vast database does not appear to have been used in any prior peer-reviewed studies and that the company has suspiciously few employees for one claiming relationships with hundreds of hospitals.
For its part, Surgisphere maintains that its database is scientifically sound. “Mandatory audits happen at least four times a year, and everything from data acquisition to data reporting is independently reviewed by an external third-party auditor,” claims the company in an online statement. “Surgisphere has passed all of its prior audits with no major or minor nonconformities.”
In response to its critics, Surgisphere says it is pursuing an independent academic audit of its data provenance, its database, and its statistical analysis “with all due haste.”
Given the severity of the coronavirus pandemic, the results of a truly thorough and transparent audit cannot come fast enough. The possibility that clinicians have been misled by shoddy research into avoiding the use of an effective drug to treat COVID-19 patients borders on scandalous.