Q&A

How Can We Root Out Bad Science?

The reward system in science is designed to incentivize novel results over quality and rigor. Grantee Dr. John Ioannidis is working to change that, establishing the basics of ‘metaresearch’ and finding ways to improve how science is conducted and reviewed.


Dr. John Ioannidis has good news and bad news about science: We can’t assume reported research findings are right just because they’ve been published in a journal, but improvements are gradually taking hold.

Ioannidis is a pioneer of metaresearch, or “research on research.” He made a huge splash in the science world with a 2005 paper titled, “Why Most Published Research Findings Are False.” As a professor at the Stanford University School of Medicine and co-director of the Meta-Research Innovation Center at Stanford, or METRICS, he investigates scientific practices and methods in his primary field, medicine, as well as in other fields, since all science ultimately relies on the same tool: the scientific method.

Arnold Ventures has provided funding for Ioannidis to establish the basics of metaresearch and investigate ways to improve how science is conducted and reviewed. It’s a reflection of one of the organization's core values: advocating for accurate research to fuel better public policy.

In a recent interview, Ioannidis explained his work and the problems he sees in science today. Questions and answers are edited for length and clarity.

Arnold Ventures

Most people probably aren’t aware of metaresearch and don’t understand that there's an issue with the scientific process. Explain what you’re trying to accomplish.

John Ioannidis

We are studying how science and research practices are working or not working and how we can get more reliable evidence more efficiently. For medicine in particular, that's something that could affect both life-and-death questions and also quality of life. And if you go beyond medicine, it can affect the basic core of how we make decisions based on science.

Arnold Ventures

What problems do you see in current scientific research?

John Ioannidis

Research is a noble enterprise, but it's very difficult, and it's very easy to make mistakes. And I'm not talking about fraud here. There may be a few cases of fraud here and there, but the most pressing problem is just suboptimal research practices at every step in the process: the way we think about the research agenda itself, picking questions, designing studies, executing studies, analyzing them, and reporting—or misreporting, or not fully reporting—their results. On top of that, we have a reward and incentive system in science that should be rewarding the best research, but unfortunately that's not how it works.

Arnold Ventures

How does the reward system go astray?

John Ioannidis

There's tremendous pressure to deliver something that looks spectacular and novel and earth-shaking. This means there's a lot of pressure to deliver results that may be exaggerated or false. There's far less in terms of incentives for quality, for scientific rigor, for using the best methods, and for being transparent and sharing data. We need to find ways to get more of this higher-quality, more credible research done, as opposed to just aiming to publish more papers and get more funds to publish even more.

Arnold Ventures

What would you say are some ways to change the incentives?

John Ioannidis

Openness and sharing are one way—sharing can include raw data, other types of materials, and software, all of which improve transparency. If you don't have those, you cannot do much large-scale collaboration. And if you do very small studies, you get lots of false positives and false negatives.

Adoption of a replication culture in many fields is another. Replication has more value than discovery. It sounds like a paradox, but it is true. A discovery is an anomaly, and you need to verify that anomaly and see whether you can observe it again and again.

Another is standardizing our processes to use the best analytical tools, methods of data collection, and reporting of findings. There are about 300 different reporting standards that have been developed for different study designs.

Arnold Ventures

What successes have you seen so far?

John Ioannidis

We looked at indicators of openness, reproducibility, and transparency in biomedicine, and we saw, in the last three years, a clear improvement in some of these indicators, particularly for data sharing. Previous analyses we had done showed that hardly any papers, in a random sample, were sharing data. Currently, almost 20 percent of papers share data. We also see substantial improvements in the recording of conflicts of interest and in disclosures about funding, and a little bit of movement on replication, although probably not as much as we would like.

Arnold Ventures

How do you respond to other scientists who say that you are nitpicking others’ work or bringing in your own biases?

John Ioannidis

I have no doubt that I'm bringing biases to the table, and this is why it is important for other scientists to look critically at my work and other people's work. The pushback I have received is not that major, because my projects typically are not trying to cherry-pick one paper and conclude what a bad scientist that person is. I really don't care about that. I do care about a bird's-eye view that affects thousands and millions of papers and efforts and investigations. If a scientist is told, “Your paper's wrong,” he will have a very hard time accepting it. If you're told, “One million papers are wrong,” it's a gentler message, for sure.