Dual-Use Science and Technology Requires Proactive Measures, Experts Say

Stopping intentional misuse of beneficial science and technologies will require applying the social and natural sciences, good communication, and international collaboration, panelists told an AAAS audience.
Diane DiEuliis and Shin Chang-Hoon | AAAS/Carla Schaffer

Imagine a malware program like the 2010 Stuxnet worm creating a nuclear meltdown at a power plant. Or a small group of radicals developing a highly contagious strain of H5N1 influenza that could be spread by human contact. Or an employee in a biological laboratory obtaining samples of highly toxic or contagious agents to use as a weapon.

Many beneficial developments in science and technology also create opportunities for people to intentionally use them to cause harm, a quality referred to as "dual use." Such harmful acts, along with natural disasters, pose the primary threats to research institutions that use chemical, biological, radiological or nuclear (CBRN) technologies, experts said at a meeting on using science and technology to prevent and respond to CBRN disasters, held at AAAS headquarters 22-23 January.

The meeting, sponsored by the AAAS Center for Science, Technology, and Security Policy and the Asan Institute for Policy Studies, brought together experts from South Korea and the United States to discuss those challenges.

To combat willful misuse of science and technology, organizations need to reliably identify individuals who want to do harm, from either inside or outside the organization, and to find ways to deter them, said Diane DiEuliis, deputy director of the Office of Policy and Planning in the Department of Health and Human Services.

"The social and behavioral sciences have had a lot to offer in terms of innovative ways to not only deter individuals but to influence individuals to choose appropriate behaviors," said DiEuliis, who has a Ph.D. in neuroscience. New research in neuroscience and cognition is giving researchers insights into individual decision-making and what drives people's behavior, she said, which could be used to improve the assessment of individuals before they are employed in CBRN research laboratories.

An example of this approach is the Defense Personnel and Security Research Center's interview process for potential employees, DiEuliis said. Using data on past incidents and potential threats, researchers developed a list of behaviors and personality traits (such as how a person handles stress, or a tendency to fall into debt) that may indicate someone likely to make undesirable choices. They then developed interview questions to help identify individuals who may have those traits.

Shin Chang-Hoon, director of the International Law and Conflict Resolution Center and the Asan Nuclear Policy and Technology Center at the Asan Institute for Policy Studies, said the keys to preventing CBRN disasters relate to both social and scientific responsibility. The former involves maintaining a balance between economic development and society's well-being, while the latter includes the duty to communicate with the public about applications of those technologies, according to Shin. When communication happens regularly among scientists, policymakers, industry and the public, the public's sense of safety increases, and disasters may even be prevented by timely policy interventions, he said.

Scientific research and natural disasters both cross international boundaries, Shin and DiEuliis agreed, which highlights the need for international coordination and global standards for dealing with dual-use technologies and preparing for CBRN disasters. The United States recently developed a policy for dual-use research in the biological sciences, according to DiEuliis, and the policymaking process incorporated international discussions organized by the World Health Organization and the Global Health Security Initiative. "Timely opportunities exist right now, particularly within the life sciences labs…to fully engage with the international community to address these risks and their potential mitigation jointly," she said.

International agreements made after the Chernobyl nuclear disaster do ensure that countries will give early warning of nuclear and radiological emergencies, Shin said. However, these conventions only address post-accident measures. "So I think it is high time for us…to embark on an agreement on preventive measures against CBRN disasters as a symbol of regional or international cooperation."

Natural disasters, such as the earthquake-triggered tsunami that caused the Fukushima nuclear disaster, also pose risks that should be mitigated. While scientific communities have generally done a good job in working with CBRN materials safely, not all institutions have adequate emergency preparedness plans, DiEuliis said. "The world is seeing an increase in large-scale natural disasters," she said. "Something that we learned from Hurricanes Katrina and Sandy is that disaster damage cannot always be predicted, and I think all institutions can benefit from planning for this."

Finally, creating a culture of ethics and awareness of the threat of misuse of CBRN technology is a key component of preventing intentional misuse, DiEuliis said. "Scientists don't always think about what can be done with their research," she said. This awareness also varies among laboratories, institutions and fields, with those in biological fields being less aware, in general, than nuclear or radiological researchers.

Attendee Ed You, a special agent with the Federal Bureau of Investigation's Weapons of Mass Destruction Directorate, Biological Countermeasures Unit, said one way this type of culture is being created is through the International Genetically Engineered Machine competition, of which the FBI is a co-sponsor. The competition draws hundreds of college students to work on synthetic biology projects, and biosecurity workshops are held at the final championships. As part of their projects, students must consider the ethics, safety, and security aspects of their research, an approach he hopes more researchers will incorporate into their work.

The AAAS Center for Science, Technology, and Security Policy has worked closely with the Department of Health and Human Services and the FBI since 2009 to build trust among scientists and security professionals to prevent malicious use of science and technology. Kavita Berger, the Center's associate director and lead collaborator with the FBI, said that's because "scientists and technologists should play a stronger role in safeguarding their work, to keep from contributing to CBRN disasters."