The proliferation of non-government laboratories handling dangerous microorganisms, combined with a troubling incident or two, has prompted the government to consider costly new security measures that scientists fear may be unnecessary and could do more harm than good, according to a new report.
A review of existing biosafety training programs, conducted by two units of AAAS, found that the programs "may already address concerns" that have arisen in Congress and the executive branch about the reliability of personnel at the laboratories, known as high-containment facilities. The report recommends that, before instituting new requirements, the government should "consider existing employment and biosafety training practices . . . as they may already contribute to vetting of personnel" and the prevention of "malicious actors or unstable personnel" from gaining access to hazardous pathogens.
More than two dozen experts in biosafety, biosecurity and life sciences, as well as architects and engineers, participated in the study, which is based in large part on a workshop they attended at AAAS earlier this year.
Participants warn that security requirements should not become so intrusive or rigid that they hamper vital research and discourage talented researchers from working in the field.
"We wanted to get the message across [in this report] as to how we thought the government should be thinking about that," said Mark Frankel, director of the AAAS Program on Scientific Freedom, Responsibility and Law. "We wanted to be sure the hammer wasn't coming down when it really wasn't needed."
The report raises questions about the proposed use in biological research of the kind of "personnel reliability programs" long used in nuclear and chemical weapons laboratories, which may include psychological screening, drug and alcohol abuse testing, background investigations, polygraph testing, credit checks, and top-secret clearances.
"Clearly one of the main things we wanted to do," Frankel said, "is to draw attention to the issue of 'personnel reliability,' which I do not think is well-recognized by life scientists out there in the field. They have absolutely no idea what's coming."
Leaders of the AAAS study point to similar skepticism expressed in recent recommendations from the National Science Advisory Board for Biosecurity (NSABB), which found no need for the establishment of a formal national personnel reliability program for high-containment facilities. Such a program, it said, "is likely to have unintended and detrimental consequences for the scientific enterprise that . . . could result in more harm to public health and safety and to national security than an insider threat poses."
While government weapons laboratories have been steeped in a culture of security and secrecy, universities and other non-governmental institutions that host the new generation of biological labs have enjoyed a tradition of openness, the NSABB report noted. In addition, there are crucial differences in the types of security that might work. For example, measures are in place to ensure that nuclear material is kept physically secure and inventoried.
Biological research, by contrast, can present even greater challenges in that it deals mostly with live organisms that can be grown from a small sample into large numbers. Most are pathogens that occur in nature and can be isolated from soils or infected hosts outside the lab and used to cause harm. But there is little evidence that traditional nuclear-style security measures added inside the laboratories would mitigate such dangers, the scientists argue.
Kavita Berger, project director at the AAAS Center for Science, Technology and Security Policy and co-leader of the new biosafety study, said participants agreed that policymakers' focus should be on how best to use—and strengthen—existing infrastructures to achieve both security and safety goals. For example, she said, assessments of laboratory personnel's ability to work with dangerous biological agents can be made in the course of routine close interactions between faculty and graduate students, trainers and trainees, mentors and staff.
The security problem "just needs to be thought of more strategically. . . and [without] thinking about biosafety issues with the nuclear mindset," she said.
She cited recent instances in which U.S. security requirements have "been a hindrance." The latest example involves the H1N1 (swine) flu virus, "where samples were sent to Canada instead of the CDC [Centers for Disease Control and Prevention] in Atlanta because of our security regulations. The labs are fine in Canada, but the point is that we are basically cutting ourselves out of the game for global health as well as the global research enterprise, and that's not good."
Since 2001, the number of laboratories designed to handle dangerous biological agents has soared from a handful to some 336 entities spread among the government sector, universities, independent research institutes and private industry. This happened mainly because terrorist attacks and the emergence of new infectious diseases led the government to channel money into biodefense and related public health activities. The number of researchers and support staff registered to work with potentially harmful microbes now totals more than 14,600.
Official anxiety intensified in 2007, with the revelation that the previous year, a researcher at Texas A&M University had been infected accidentally with Brucella, a pathogen previously weaponized by the Soviet Union, but the lab had not reported it. That incident, and the subsequent allegation that Bruce Ivins, a researcher at the U.S. Army Medical Research Institute of Infectious Diseases, might have carried out the 2001 anthrax attacks, triggered a flurry of activity including congressional inquiries, legislative proposals, a task force, a call by the Weapons of Mass Destruction Commission for an oversight review, and an executive order issued by President George W. Bush on biosecurity and personnel reliability measures.
The AAAS report, titled "Biological Safety Training Programs as a Component of Personnel Reliability," urges the creation of a national, anonymous database documenting exposures and their corrective actions, in order to promote information-sharing and help prevent future incidents.
It also calls for more federal funds for continual training and facility maintenance. Operating high-containment facilities can cost from $5,000 to $50,000 per day in ordinary circumstances and much more when there is "active research" going on, it said.