When scientists realize that they have made a fundamental error with the potential to invalidate their findings, they often face an ethical dilemma over how to address it. Richard Mann, a young researcher at Uppsala University, lived through that scenario in 2012, when a colleague contacted him just before he presented a lecture on his research to say that he had found a problem that called the results into question.
When he gave his seminar, Mann marked the slides displaying the questionable results with the words "caution, possibly invalid". But he was not yet convinced that a full retraction of the paper, published in PLoS Computational Biology, was necessary, and he spent the next few weeks debating whether he could simply correct the mistake with a new analysis rather than retract the paper.
After about a month, however, he came to see that a full retraction was the better option: it was going to take him at least six months to wade through the mess that the faulty analysis had created. Yet it had also occurred to him that there was a third option: keep quiet about the mistake and hope that no one noticed.
After numerous sleepless nights grappling with the ethics of such silence, he eventually plumped for retraction. Looking back, it is easy to say that he made the right choice, he remarks. "But I would be amazed if people in that situation genuinely do not have thoughts about [keeping quiet]. I had first, second and third thoughts." It was his longing to sleep properly again, he adds, that kept him on the ethical path.
Mann's case is a success story for ethics in science: his decision to act with integrity and to be transparent about his errors by retracting the paper had no impact on his professional career, though he may have feared that it would. Such are the rewards of integrity and transparency in science, where the honest pursuit of truth outweighs both personal reputation and professional standing.
Still, a 2017 anonymous straw poll of 220 scientists indicated that 5% would do nothing if they detected an error in their own work after it had been published in a high-impact journal, hoping that none of their peers would ever notice, while another 9% would retract a paper only if another researcher specifically identified the error.
According to Nature, only a tiny fraction of published papers are ever retracted, even though, in confidential surveys, a considerably higher percentage of scientists admit to knowing of issues that could invalidate their published results.
The reasons behind the rise in retractions are still unclear. "I don't think that there is suddenly a boom in the production of fraudulent or erroneous work," says John Ioannidis, a professor of health policy at Stanford University School of Medicine in California, who has spent much of his career tracking how medical science produces flawed results.
In surveys, around 1–2% of scientists admit to having fabricated, falsified or modified data or results at least once (D. Fanelli PLoS ONE 4, e5738; 2009). But over the past decade, retraction notices have risen from 0.001% of published papers to only about 0.02%. And, Ioannidis says, that subset of papers is "the tip of the iceberg": too small and fragmentary for any useful conclusions to be drawn about overall rates of sloppiness or misconduct.
There is, of course, a difference between errors resulting from what Ioannidis calls "sloppiness", which runs the gamut from data-measurement errors to the use of less-than-optimal analytical methods and can happen to honest researchers, and errors that are baked into research findings through knowing misconduct.
The good news is that honest scientists who disclose errors in their work face no career penalty. And why should they? They are making science work the way it should, advancing their field by communicating what works and what doesn't. As serial entrepreneur James Altucher has said, "honesty is the fastest way to prevent a mistake from turning into a failure."
The bigger problem is posed by those who put other goals ahead of honesty: those who remain silent when they know their findings will not stand up to serious scrutiny, or worse, those who launch irrational, hateful attacks against the people who detect and report their scientific misconduct, distracting attention from it rather than acknowledging the errors in their work.
The latter group are known as pseudoscientists. Fortunately, they are a small minority; unfortunately, they create outsized problems within their fields of study, where they can continue to do damage until they are exposed and isolated.