AAAS Meeting Explores Ways to Improve Ethics Panels that Oversee Social Science Research
Cora B. Marrett
The ethics boards that oversee federally funded studies involving human subjects have been part of the research landscape since the 1970s. But some researchers in the social and behavioral sciences complain that the boards—established in response to abuses in biomedical research—too often take a heavy-handed approach toward survey research and other social science projects that pose minimal risk, if any, to those who participate.
A 22 September meeting of the AAAS Committee on Scientific Freedom and Responsibility grappled with how to improve the performance of the ethics panels, called institutional review boards, or IRBs, when it comes to the social sciences. The committee members heard from researchers who recounted frustrating delays while answering board requests for more information or changes to their research plans. They also heard from speakers who have served on IRBs or who have responsibility for their operation. From researchers and administrators alike, there was general agreement that the system can and should work better.
"Academic institutions are based on trust and mutual respect," said Anne N. Hirshfield, associate vice president for health research, compliance and technology transfer at The George Washington University. "IRBs can only work if there is mutual attention to a common goal—conducting ethical research that protects the rights and welfare of participants." She added: "I have experienced IRBs that work very well... They truly work to protect human subjects."
Joan E. Sieber, a psychologist and professor emerita at California State University-East Bay, has served on seven IRBs during her career and chaired three of them. Sieber, who also is editor-in-chief of a journal on human research ethics, said that IRBs have "had some salutary effects" on the conduct of research in the social sciences.
But she said there has been a disconnect between the admirable sentiment behind the rules that established IRBs and the way those rules are administered. IRBs often lack expertise on new methodologies, including the growing use of Internet-based and community-based research in the social sciences, Sieber said. "IRBs should be getting consultation on projects they are not qualified to review," she said. "They are not doing it." As a result, she said, "some of the most innovative and consequential research is going to be damaged."
While not disputing the need to ensure that research in the social sciences is conducted ethically, Sieber and others said that the IRB system has proved to be cumbersome and has encouraged some researchers to seek ways to avoid scrutiny of their work through what one meeting participant called "scofflaw behavior."
Gigi Gronvall, a senior associate at the Center for Biosecurity of the University of Pittsburgh Medical Center, recalled her efforts to do a survey of scientists who do dual-use research in biology, such as work on viruses that might have military as well as civilian application. The IRB at Johns Hopkins University, where the biosecurity center was located at the time, asked Gronvall to give a three-page consent form to all those she interviewed. It included a warning that the respondent, in agreeing to answer Gronvall's questions, ran the risk of being investigated by government agencies, being "exploited by hostile entities," or even being kidnapped.
The IRB's members, Gronvall said, "were totally over-identifying with my subject population." The result was a six-month delay in the survey project, during which Gronvall almost lost her funding. "The IRB should be on your side," she said. "That's not how I felt during this."
Gronvall said that blocking or delaying research on a controversial topic can mean that it will be explored only in the news media, without any IRB-style protections for those being interviewed.
A summary prepared for the AAAS committee includes other anecdotes that suggest the IRB system is not working as well as it should. It includes the story of a psychology professor who wanted to administer an alcoholism screening test to students at a local community college. The IRB said the study could put participants at risk if they came to realize that they have a drinking problem, even though the researcher would provide referral information for treatment services and discuss the potential risks and benefits on the consent form. The IRB ultimately rejected the proposed survey without making any suggestions to the researcher on how to change it. He moved on to another institution (with a more receptive IRB) to collect his data.
In another case, a cultural anthropologist proposed doing open-ended interviews with military personnel who had returned from Iraq to determine their level of knowledge about Iraqi culture. The IRB questioned whether some of the interview subjects might be suffering from post-traumatic stress disorder (PTSD) and vulnerable to re-traumatization when speaking to the researcher. Sieber, in recounting the case, said that the research literature suggests that soldiers with PTSD often benefit from talking about their experiences and are not re-traumatized. The researcher eventually received an okay from the IRB, but Sieber said the project was put on hold for months and the IRB process provoked stress of its own for the researcher.
Another case involved a psychologist who wanted to interview fourth, sixth, and seventh graders to determine how they interact with technology as they learn to use the Internet. The IRB sent the proposal back seven times asking the researcher for clarifications on the questions she proposed to ask. It also asked her to promise not to ask for any amplifications or additional comments beyond those provided by the children in response to the questions. As the lengthy review process continued for more than 15 months, the researcher was unable to speak to a single student and is unlikely to receive further funding for the project.
Zachary M. Schrag, an assistant professor of history at George Mason University, noted that the federal regulation of medical research rests on an empirical foundation, including the work of scientists like Henry Beecher who published a 1966 article that drew attention to 22 examples of unethical clinical research that had risked patients' lives. Beecher's work helped lay the foundation for new federal guidelines on human experimentation and informed consent. The push for more protections came to a head following the 1972 disclosure of the Tuskegee syphilis study, in which federal health officials deliberately withheld treatment from nearly 400 black men with syphilis.
Schrag said there have been no empirical studies comparable to Beecher's for the social sciences, although there were cases that raised concern, including behavior modification studies in prisons. Still, since the 1960s, federal ethics policies have governed social and behavioral research as well as biomedical studies. This includes the so-called Common Rule, in effect since 1991.
Participants at the AAAS meeting suggested steps that could help address some of the concerns about the rigidity of the IRB process. Hirshfield said university administrators need to recognize that senior faculty—those who are heavily involved in research and also tend to be outstanding teachers—often make the best candidates for positions on IRBs. They often are committed to the concept that research can serve the public good, she said, and understand the needs of the researchers as well as the importance of protecting the study participants.
Jeffrey Cooper, director of the higher education group at the Huron Consulting Group, said there are procedures within the existing rules that can be used to help researchers in the social and behavioral sciences gain quicker IRB approval for their projects. These include an expedited review procedure for research activity determined to pose no more than "minimal risk"—that is, harm or discomfort no greater than what would ordinarily be encountered in daily life or during routine physical or psychological tests.
There also are exemptions that allow some types of research to avoid IRB review altogether.
"There is flexibility in the regulations," said Cora B. Marrett, assistant director for education and human resources at the National Science Foundation (NSF). "It's not so much the regulations, it's the application" of those regulations that can cause inconsistencies and problems, she said.
Janet DiPietro, associate dean for research at Johns Hopkins University's Bloomberg School of Public Health, agreed that "most of the problems have to do with IRB office work flow and the social culture of the review board." She described steps she has taken at Hopkins to reduce the amount of time it takes the IRB to decide whether a research proposal is exempt from review or not. Where once the average decision time was 40 days, she said, it now can be as short as 48 hours. DiPietro said that the IRB now convenes on a weekly schedule so that requests for expedited review can quickly get on the agenda.
Felice Levine, executive director of the American Educational Research Association, urged IRBs to be "less adversarial, less punitive" and "more educative." With research projects becoming larger and involving more institutions, Levine recommended that IRBs be decentralized. For research projects posing minimal risk, she said, review committees could be established at the level of academic departments and research units. That would provide a less complex system for investigators to navigate, she said, and help ensure more IRB expertise on the research methods under consideration.
Current regulations also allow IRBs to waive the need for written consent forms, Levine said. She emphasized that consent is a process between the researcher and research participants and not just about signing a form. Levine said, for example, that IRBs can grant waivers of written consent (with appropriate documentation by researchers) when a written record may increase the risk of harm. Mark Frankel, the AAAS staff officer for the Committee on Scientific Freedom and Responsibility, said that many subjects in research studies on deviant behavior might opt out rather than sign a consent form that could fall into the hands of superiors, police, or others.
Levine and others stressed the need for all parties in the IRB system—board members, staff and principal investigators—to become more informed about the rules and procedures. Boards can do more to assist principal investigators from the outset in developing research protocols, several meeting participants said. At the same time, GWU's Hirshfield said, "it is the obligation of the P.I. to know what the IRB needs and to stage the argument. Give the citations to show that your work is not risky."
Meanwhile, Marrett said, NSF is funding studies—now ongoing—aimed at bringing more empirical data to bear on the question of IRB performance. One study, under the auspices of the National Academies, is looking at principles and practices of effective IRBs. Another is examining the oversight functions of IRBs and how the experiences and values of research subjects are incorporated into research protocols. A third study is looking at how Internet-based research is being handled by IRBs at major research universities.
The Committee on Scientific Freedom and Responsibility was established in 1976. Its duties include encouraging and assisting AAAS, its affiliates and other scientific groups in developing statements of principles governing professional conduct.
"For the Committee, balancing scientific freedom and responsibility is central to its core mission," said Frankel. "Some Committee members believe that the balance is awry because IRBs are imposing unwarranted and arbitrary demands on proposed research by social and behavioral scientists. This raises serious issues related to scientific freedom insofar as such actions lead to potentially valuable research that is inappropriately altered, unduly delayed, or not done at all."
7 October 2008