Peer-Reviewed Publications Mostly Lack Raw Data
[Thursday, February 27, 2020] A detailed analysis by the Editor-in-Chief of a leading medical journal, covering all articles submitted for publication over the last two years, found that in most cases authors were unable to provide the raw data behind their articles when requested, raising doubts about the authenticity and credibility of the data presented for publication. Although 97% of the articles in question were rejected by the journal, about half were subsequently published elsewhere. The Editor-in-Chief had questioned the authors of about 20% of the articles submitted to his journal because the data looked “too beautiful to be true,” but he speculated that many of the other articles that looked authentic might have similar problems with their raw data, and hence the same credibility gap.

Publishing in a peer-reviewed journal is the gold standard of credibility for a research program. Peer-reviewed articles are frequently used to support the rationale for further research, patent filings, and even clinical development programs. Yet published work is also notorious for its lack of reproducibility: in cancer research, some reports estimate that only about 11% of published studies could be validated or reproduced. The main reason is the lack of an adequate review process. Most peer-reviewed journals are hard pressed for timely review and rely on the age-old principle of honor, meaning that most data are trusted at face value and raw data are rarely requested. Even in the few cases where raw data are available, journals do not have the resources to review them adequately.

The authenticity of the data is only one of the problems with publications; other “inappropriate practices of science, such as HARKing (Hypothesizing After the Results are Known), p-hacking, selective reporting of positive results and poor research design” also contribute to the lack of reproducibility.

The report is commendable in its honesty. It is an astonishingly candid review by the Editor-in-Chief of his own journal’s review process, and it highlights practices that are widespread across scientific publishing. There can be no doubt that similar issues exist at comparable journals. It also supports the need for full study reports, containing all the data and all the analysis with unbiased, complete disclosure, before the results are used for further evaluation. So, when you see a high-profile publication touting a new finding, always ask for the raw data behind it. You may avoid unpleasant surprises later.
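For readers unfamiliar with one of the terms above, the short Python sketch below is a hypothetical illustration (not taken from the article or the journal’s analysis) of why p-hacking undermines reproducibility: when a study with no real effect measures many outcomes and reports only the one with the smallest p-value, the chance of a spurious “significant” finding rises far above the nominal 5%. All numbers in the sketch are assumed for illustration.

```python
# Hypothetical illustration of p-hacking (assumed parameters, not from the article):
# each simulated study has NO true effect, measures many outcomes, and a
# "p-hacked" write-up reports only the outcome with the smallest p-value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_experiments = 2000   # simulated studies, each with no real effect
n_outcomes = 20        # outcomes measured per study; only the "best" is reported
n_per_group = 30       # subjects per study arm

false_positives = 0
for _ in range(n_experiments):
    p_values = []
    for _ in range(n_outcomes):
        treatment = rng.normal(0.0, 1.0, n_per_group)  # same distribution in both
        control = rng.normal(0.0, 1.0, n_per_group)    # arms: the null is true
        _, p = stats.ttest_ind(treatment, control)
        p_values.append(p)
    # A p-hacked paper highlights the single outcome with the smallest p-value.
    if min(p_values) < 0.05:
        false_positives += 1

print("Nominal false-positive rate: 5%")
print(f"Rate when the best of {n_outcomes} outcomes is reported: "
      f"{100 * false_positives / n_experiments:.0f}%")
```

Under these assumptions, roughly 1 − 0.95²⁰ ≈ 64% of studies with no real effect will still produce at least one “significant” outcome, which is why selective reporting alone can account for a large share of findings that later fail to reproduce.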