It’s been 25 years since the National Academy of Sciences set its standards for appropriate scientific conduct, and the world of science has changed dramatically in that time. So now the National Academies of Sciences, Engineering, and Medicine have updated those standards.
The report published Tuesday, “Fostering Integrity in Research,” shines a spotlight on how the research enterprise as a whole creates incentives that can be detrimental to good research.
Robert Nerem, a professor emeritus of bioengineering at Georgia Tech, was not expecting that outcome when he agreed to chair the academy committee five years ago. He thought the committee would simply be updating the 1992 standards.
“We hadn’t had more than a couple of meetings when we realized this wasn’t a question of updating, this was a question of taking a brand new look and a very different look,” Nerem told Shots.
Science had changed. It was global and interconnected. Questions about the reproducibility of results had bubbled up. And it was increasingly clear that issues about proper conduct of research weren’t isolated to individual labs, but influenced by a continuously evolving academic, publishing and funding environment.
“This should not be something that gets looked at every 10 to 20 years, but is an ongoing discussion,” Nerem said. “And somebody needs to lead that ongoing discussion.”
That observation ultimately prompted the committee to recommend the creation of a Research Integrity Advisory Board. This nongovernmental board wouldn’t punish bad actors, but it would help foster good research and help institutions respond better to issues as they arise.
The focus of the 2017 report also shifts dramatically from the 1992 report, which emphasized individual cases of misconduct and questionable behavior rather than the research enterprise as a whole.
“We’ve been fond of the ‘bad apple’ narrative, and we’re talking about switching to the barrels and the barrel makers,” said committee member C.K. Gunsalus, who heads the National Center for Professional and Research Ethics at the University of Illinois.
“We’re not just talking about misconduct here, which is formally defined in the U.S. as fabrication of data, falsification or plagiarism,” said committee member Brian Martinson, from the HealthPartners Institute in Minneapolis. “We recognize there’s a fuller range of behavior that we refer to as detrimental research practices.”
These can include cutting corners, using dubious statistics, or not fully sharing what you’ve done so other scientists can reproduce your results. The previous report called some of these “questionable” practices, but the new committee decided that word was inadequate.
“Sometimes these detrimental research practices can be as damaging as actual misconduct,” Nerem said. They can undercut the validity of findings and make them not reproducible in other labs. Other scientists can spend a long time chasing dead ends.
“You’ve wasted the time of a lot of people, and time is an irreplaceable resource,” Gunsalus said. “And it’s valuable, and you use highly trained people with expensive educations using expensive equipment in labs. When you waste the time, you’ve done something really damaging.”
These practices are far more common than outright fraud, and that adds up. How big a problem is this? That’s hard to say, Nerem told Shots. That’s why the report calls for more effort to study these issues.
“It’s interesting since we’re talking about research in science and engineering, which are fields that are data driven, that we have no data on this particular issue,” Nerem said. “I don’t think this is prevalent, but I think research misconduct and what we call in the report ‘detrimental research practices’ occur more often than any of us would like, and the research community has to step up to the plate to address this.”
The report sets out a series of recommendations designed to improve the integrity of science, including steps that universities can take to improve their standards and protect whistle-blowers.
Scientists are called upon to share their data and methods as rapidly as possible. And funders should make sure data and computer code are archived, Nerem’s committee said, to make it easier for findings to be reproduced by independent scientists.
The report arrives at a time when many scientists feel that their enterprise is under siege in Washington, with threats of massive budget cuts and diminished interest in science-based facts. Still, the scientists behind this report remain committed to improving an enterprise that already provides a great deal of value to society.