Falsification, fabrication and plagiarism are the focus of federal rules on research misconduct, but a wider range of questionable practices could pose a more serious threat to the integrity of science, according to a historian who studies scientific misconduct.
Nicholas H. Steneck, professor of history at the University of Michigan, says there is little evidence that high-profile misconduct cases, such as the faking of data by leading stem cell researcher Woo Suk Hwang, send science off track for long. Such episodes typically are exposed when colleagues report their misgivings or other scientists fail to replicate the research.
But Steneck told the 31st annual AAAS Forum on Science and Technology Policy that some less dramatic, often less detectable practices can undermine the objectivity and integrity of the research record. The questionable practices include duplicate publication of data (which can skew meta-analyses of the scientific literature on a topic), conflicts of interest that can affect a researcher’s objectivity, biased research designs, inadequate literature reviews and failing to present data that contradicts one’s own previous research.
The consequences of such practices can be significant, Steneck said. An inadequate literature review can lead to mistakes that endanger the health or even the lives of patients in clinical trials. Improperly designed studies also waste public funds and can affect the cost of medicines, he said.
While the prevalence of research misconduct remains subject to debate, Steneck cited a study last year in the journal Nature, based on a survey of more than 6,000 researchers, which found that 0.3 percent admitted they had engaged in major misbehaviors, such as falsifying data. About 5 to 15 percent acknowledged engaging in questionable research practices such as using inadequate or inappropriate research designs. Steneck also cited a 2000 survey of biostatisticians that estimated the rate of fraud in medical research at just below 1 percent.
The National Science Foundation and the Office of Research Integrity at the Department of Health and Human Services jointly confirm about 20 cases a year of research misconduct, according to Steneck. The actual number could be 10 times that or more, he said, because some researchers suspect but do not report misconduct, some institutions fail to undertake rigorous investigations and some journals find but do not report misconduct.
Steneck said the scientific community’s response to misconduct allegations has been based on long-standing assumptions that misconduct is rare, unpreventable and difficult to detect and that self-regulation keeps improper behavior at bay while maintaining high standards of integrity in research. He argues such assumptions are misplaced.
"Self-regulation works," Steneck said. "It is important. We certainly would never give it up, but it does have serious flaws." Scientific misconduct often is not very clever or subtle, he said, yet co-workers are inattentive and allow it to happen. He noted that Dr. Gerald Schatten of the University of Pittsburgh, one of Hwang's collaborators, had been told by Hwang in January 2005 that some cell lines had been lost through contamination. But according to the Pittsburgh committee that investigated the case, Schatten failed to realize that there was then not enough time to grow and analyze new cell lines before 15 March, when the fraudulent paper was first submitted for publication.
Simply by reading and looking more closely, by paying better attention, many cases of misconduct could be caught, Steneck said. Other steps toward improved self-regulation, he said, should include better supervision and mentoring of young researchers; more careful reviews of grant proposals, articles submitted for publication and job promotions; more specific guidelines for scientists on authorship of papers, use of digital images and proper management of data; and better training on responsible research practices.
If institutions and professional groups do not improve their self-regulation of research, Steneck said, broader government regulation may be required. In that regard, he suggested that federal regulators broaden the description of research misconduct to include practices that significantly compromise the accuracy and objectivity of the research record, waste public funds, or endanger human lives. He also suggested that journals that publish government-supported research be obliged to report misconduct.
Steneck was joined on the 21 April Forum panel by Felice Levine, executive director of the American Educational Research Association, and John Horgan, a science writer and director of the Center for Science Writings at the Stevens Institute of Technology.
Levine discussed recent efforts by public officials to encroach on the process of scientific peer review, including attempts by members of Congress to prohibit funding of specific National Institutes of Health grants. She also reviewed reported attempts by the Bush administration to impose political tests on appointees for federal science advisory positions.
"Attacks on peer review can have major and sustained consequences for the integrity and conduct of science," Levine said in her prepared remarks. "They also reveal the important role of the organized scientific community over time in protecting science or ameliorating some of the consequences of such intrusions. Challenges can be expected to continue."
Horgan spoke in a personal vein on two ethical dilemmas he faced recently, one involving the Pentagon and the other involving the Templeton Foundation. In the first, a defense contractor asked Horgan for advice about fighting terrorism. He accepted the assignment even though he believes the Bush administration's military approach is aggravating rather than solving the problem of terrorism. But he said he remains troubled by his choice. "I suspect that many scientists, like me, are ambivalent about working with this government," Horgan said. "But they are still tempted to do so because, again like me, they could use the money, and they find the assignment flattering and challenging."
In the second dilemma, Horgan accepted a journalism fellowship from the Templeton Foundation, started by the billionaire financier Sir John Templeton to support efforts to find common ground between science and religion. (The AAAS Dialogue on Science, Ethics, and Religion is among the beneficiaries.)
Horgan described himself as an agnostic and said, "I have misgivings about the foundation's agenda, but I took the fellowship anyway. I rationalized that the foundation had not bought me, as long as I remained true to my views. Basically, I used the same justification as a congressman accepting a golf junket from the lobbyist Jack Abramoff."
The fellows spent several weeks at the University of Cambridge, where they heard prominent scientists and philosophers discuss science and religion. Horgan said some of his misgivings were confirmed, with the dialogue skewed in the direction of religion, particularly Christianity. But he said the fellowship was for the most part wonderful, "an intellectual and literal feast."
Horgan joked that his modus operandi is clear: take money from a group he has doubts about and then bite the hand that feeds him. "My ethics are shaky, I admit," he said. "But I hope that by writing and speaking about the ethics of accepting money from the Templeton Foundation and the defense industry, I'll encourage others to do so as well. The integrity of science can only benefit from an open and candid discussion of these issues."