Jennifer Kuzma and David Rejeski
David Rejeski calls them DTPs, "damn tough problems": the issues that arise as new fields such as nanotechnology and synthetic biology emerge from labs around the world with consequences that are difficult to predict.
Too often, governments—and the public at large—can be caught flat-footed, unable to fully comprehend the risks and benefits of a rapidly emerging technology, Rejeski and others told a 30 April session of the AAAS Forum on Science and Technology Policy. Traditional methods of risk assessment may not be up to the task. They urged adoption of what is being called "anticipatory governance," an approach that emphasizes preparation more than prediction.
For while it may be impossible to fully predict the potential consequences of a new technology, said Rejeski, the head of the Foresight and Governance Project at the Woodrow Wilson International Center for Scholars in Washington, D.C., there are ways for regulators to better anticipate possible impacts and more fully engage the public in the evaluation process.
Daniel Sarewitz
Given the political and social commitment to rapid technological innovation as an engine for economic growth, said Daniel Sarewitz of the Consortium for Science, Policy and Outcomes at Arizona State University, policy makers must "be smart about that commitment and progress beyond what has pretty much been a reactive mode where we go full steam ahead and, when bad things happen, we try to figure out how to regulate, respond, roll back."
Faced with "rampant novelty" in disciplines such as nanotechnology and the neurosciences, government agencies need a sort of embedded early-warning system that evolves and adapts along with the technologies, Rejeski said.
More than 600 leaders from U.S. and foreign governments, businesses, research centers and universities attended the 34th annual AAAS Policy Forum, a two-day immersion in the issues and sometimes gritty political realities that dominate the nation's science and technology agenda. Meeting just blocks from the White House, the Forum is regarded as the largest and most important annual science and technology policy conference in the United States, focusing on federal budget and R&D issues; public- and private-sector research; education; innovation; and other high-profile domestic and international S&T issues. It was organized by AAAS Science & Policy Programs.
In regulating new technologies, experts said, the first step is to understand the state of the science, often a daunting task even for researchers in the field. According to one citation index, about 407,000 papers on nanotechnology were published between 1990 and 2006; the field involves research and development on structures roughly 1 to 100 nanometers in size (1 nanometer is about 1/80,000th the thickness of a human hair).
There can be early warnings of potential problems that often are ignored, Rejeski said. A scientist wrote in 1992 that carbon nanotubes—molecular-scale tubes of graphitic carbon with very high strength and unusual electrical properties—look like asbestos particles under the microscope and need to be carefully evaluated for their potential toxicity. In 1998, a news story in the journal Science asked: "Nanotubes: The Next Asbestos?" Rejeski said there have been peer-reviewed articles in the past two years suggesting that nanotubes can, indeed, penetrate lung tissue and cause damage similar to asbestos. The federal Environmental Protection Agency is now seeking further data from industry. The gap between the first warning and the regulatory response, Rejeski said, was too long.
While the federal government is spending about $1.5 billion a year on nanotechnology research, he said, only about 1.5% of that is being devoted to analysis of possible risks. He called for an "Early Warning Officer" in each science-related federal agency, with the job of tracking both risks and opportunities from new technologies.
Another step in anticipatory governance, Rejeski said, is to track the "known unknowns," the developments that seem possible even if they are not yet a reality. In the field of synthetic biology, for example, what might happen if a totally synthetic organism were created and introduced into the environment? Asking such questions can force researchers to more urgently consider the implications of their work and undertake empirical studies to determine the possible impact.
Rejeski also suggested establishment of a voluntary reporting system, like the aviation safety reporting system, where lab workers in emerging fields could anonymously report mistakes and bad practices that might require government attention. And he called for more training of scientists in cutting-edge fields on the ethical questions that may arise as their work enters the marketplace. Only about 15% of engineering programs have any kind of ethics coursework, Rejeski said. He also urged more experiments on the safety and possible risks of new nanotech-based consumer products, noting that the number of such products has been doubling every 14 months.
Sarewitz, of Arizona State University, also has been paying close attention to nanotechnology as part of a project, now in its fourth year, to develop more deliberative approaches to new technologies. There has been a growing appreciation of how innovation works, he said, and specialists have been developing new methods to assess emerging technology and make the process more transparent. His own work has involved a method called "Real-Time Technology Assessment" or RTTA. The goal, he said, is to manage emerging technologies while management is still possible.
As with Rejeski's work, the first step is to understand what is going on in the science by sifting through thousands of research articles, patents and journal citations. Sarewitz and his colleagues (about 100 in all, including staff at Arizona State's Biodesign Institute) have been focusing on the values of scientists and the public on specific questions, such as job loss, as nanotechnology emerges. "Scientists don't think there's much risk of loss of jobs from nanotech," Sarewitz said. "And they're right. For them, there isn't. The public, having just experienced 30 years of the decimation of the manufacturing sector in the U.S., has a bit more skepticism."
Sarewitz and his colleagues also have examined scenarios that could be part of a nanotech future. For example, they are looking at a hypothetical brain implant using nanotech materials that would allow users to receive information through a brain-computer hookup while they sleep. In theory, the device would dramatically reduce the amount of time needed to assimilate new data each day. The researchers are paying close attention to the nanotech literature for any evidence that talk of such devices is moving beyond the hypothetical. RTTA also seeks to determine whether scientists are beginning to think differently about their work as a result of their participation in the assessment process.
"We're not telling scientists what to do," Sarewitz said. "We are trying to allow them to understand the setting in which they are doing their work." RTTA "doesn't succumb to the illusion of control" over the emerging technology, Sarewitz said. But neither does it succumb to what he called "technological somnambulance," a sense of resignation in the face of complex, quickly moving developments in novel areas of research.
There are precedents for new institutional approaches to difficult issues in the sciences, Sarewitz said. He argues that anticipatory governance can be institutionalized on a broad scale, citing the impact of institutional review boards, or IRBs, on the conduct of clinical trials and other research involving human subjects. "There are thousands of IRBs now," Sarewitz said. In the early 1970s, there were none.
"This shows that comprehensive governance of innovation activities is a reasonable goal," Sarewitz said. "Perfection isn't the goal here. The goal is evolution."
Jennifer Kuzma, an associate professor in the Center for Science, Technology and Public Policy at the University of Minnesota's Humphrey Institute, has looked at case studies on how regulators responded in the past to new technologies in the areas of human drugs, medical devices, workplace chemicals and agricultural biotechnology. Her team used 28 criteria to judge the oversight process. While there was clarity of subject matter and concern for health outcomes in each case, Kuzma said, weaknesses were readily apparent in the amount of transparency and public input in the process.
Kuzma, a former AAAS Science & Technology Policy Fellow, and her colleagues used lessons from the historical analysis to inform their use of an anticipatory governance model called "upstream oversight assessment" to weigh the risks and benefits of nanotechnology in agriculture and foods. The assessment technique tries to identify regulatory and non-regulatory issues associated with new technological products long before they are marketed.
The researchers found that developments arising out of multiple, converging disciplines can be the most difficult to monitor and assess. Kuzma cited nanoparticles made from DNA that can be used to track pollutants in agricultural runoff; the research crosses the fields of biotechnology, nanotechnology, and geographic information systems, among others.
Still, Kuzma was enthusiastic about the use of new approaches in risk assessment, particularly the effort to bring more stakeholders into the process. "Public participation is a key recommendation," she said. It can improve risk assessment and policy making and, ultimately, the acceptance of a new technology. There are challenges in engaging the public while still maintaining a company's right to preserve its confidential business information, Kuzma said.
"The million-dollar question is how do we do this well," she said. But she quickly added: "We don't need to know how to do things perfectly before we actually implement them. Let's not think about it for another 10 years, how we're going to engage the public on a wide scale. Let's experiment."
David Kriebel
David Kriebel, director of the Lowell Center for Sustainable Production at the University of Massachusetts, Lowell, told the Forum about his efforts to grapple with new technology on a very practical level. He noted that regulatory standards often lag behind the emergence of technologies and sometimes continue to do so even as new hazards are discovered. There has been evidence accumulating since the 1970s that fluids used in the metalworking industry can cause laryngeal cancer and other cancers, Kriebel said. Yet the metalworking fluids still are regulated under a rule established in the 1960s that allows oil mists of up to 5 milligrams per cubic meter of air. Industrial hygienists called the rule a "laundry standard," Kriebel said. It permitted oil mists in the air as long as they did not condense on the ceiling and drip onto the workers' shirts.
Despite the research by Kriebel and his colleagues on the cancer risks of metalworking fluids, the standard has not been revised. "We did this work in the 1980s and the early 1990s," Kriebel said. "The bottom line is that this work has had absolutely zero impact" on the regulation of the fluids.
Trained as an epidemiologist, Kriebel began to realize that simply doing hazard research and publishing it in peer-reviewed journals does not necessarily produce change. Moreover, he said, "new hazards are introduced faster than we can possibly study them."
Nanotechnology may be producing some of them, he said, and could be analogous to the revolution in synthetic organic chemistry in the middle of the 20th century, when chemists learned to make essentially any organic chemical they wanted. The products of that revolution diffused through the world economy with tremendous benefits, Kriebel said, "but also with a long list of lingering environmental and health hazards and consequences that we are still having difficulty dealing with." There are some 82,000 chemicals in commerce, he said, with few of them adequately screened for toxicity. About 700 to 1,000 new chemicals are introduced each year.
"We haven't done a very good job of governing that innovative process," Kriebel said. Too often, there is little safety data on new products, and the screening process tends to be slow and focused on one chemical at a time, he said. A promising development is the use of high-throughput assay systems that are capable of screening many chemicals at once for potential hazards—an approach that is also being applied to products of nanotechnology. But Kriebel said that such assays tend to produce low-quality data that cannot be fed into the current regulatory system (which prefers strong evidence obtained through extensive, often lengthy testing for specific cancers or other ailments through bioassays in animals).
Newly identified mechanisms of toxicity, such as the effect of low-level exposures to hormone-mimicking chemicals or combined effects of multiple chemicals in the environment, add even more complexity to the risk assessment process.
Given the failure to respond to early warnings of possible hazard from chemicals such as fluorocarbons, tributyltin, benzene, PCBs, and DDT, Kriebel said, there is no alternative but to strive for new ways to anticipate potential hazards of emerging technologies as they diffuse into the marketplace.
The challenge, he said, is to decide how much evidence is enough to take action, such as determining there is likely a causal link between benzene exposure and leukemia. "We tend to think there is some absolute threshold of knowledge," Kriebel said. "Put scientists in a room. They deliberate and say, 'Yes, there is enough evidence.' I think that is not realistic. There is always uncertainty. The question, instead, is: When do we have enough evidence to act as if this chemical causes this hazard? It is a constant balancing act."
And since it is a balancing act with social consequences, he said, "this is necessarily a public debate." Anticipatory governance inevitably involves a precautionary principle, Kriebel argues. He cited a positive example of precaution from his home state. The Massachusetts Toxics Use Reduction Act, passed in 1989, requires all users of toxic chemicals to pay a fee to the state to help industries figure out how to reduce use of the chemicals. It also requires each user to write a plan on how to get rid of the chemicals. There are no penalties or threats of lawsuits for enforcement.
"They don't have to carry out the plan," Kriebel said. "Guess what? Once they've done the plan, they follow it." The act has led to more than 40% reduction in toxic substance use statewide. "One of the things they realize is that this is going to save them money," he said.
The European Union also has used the precautionary approach, Kriebel said, in a system called REACH (Registration, Evaluation, Authorization and Restriction of Chemicals) that establishes a list of "substances of very high concern." The burden is on users to show that the substances can be used safely and that no safer alternatives exist.
A similar approach makes sense for emerging technologies, Kriebel said. "I really do think there is a positive vision here of governing emerging technologies," he said. "I think that there is tremendous potential in things like nanotechnology to benefit all of us." But he added: "What we really need to do is shift from a reactionary to a precautionary regime" for evaluating those technologies.