GAO Is Seeking Comment on Technology Readiness Assessment: Brief Q&A
The U.S. Government Accountability Office (GAO), an independent office of Congress tasked with investigating and evaluating federal agencies, today published a draft guide on technology readiness assessment for federal programs. Technology readiness assessment (TRA) refers to the process of evaluating a given technology’s maturity during development, which can mitigate risks, gauge potential performance, and reduce the odds of potentially costly project failures. This new GAO guide is intended as a source of best practices for agencies that pursue hardware and software technologies of all kinds, from weapons to communications to space exploration. It follows previous GAO best-practice guides on cost estimating and schedule assessment.
GAO will be seeking public comment and feedback on the guide through August 10, 2017. Interested parties can follow the link at the GAO website for more.
The AAAS R&D Budget and Policy Program recently got to chat about the project with GAO’s Chief Scientist, Tim Persons, Ph.D. The below transcript has been lightly edited for length and clarity.
How would you define technology readiness assessment, and why should it matter for public agencies?
Joint Strike Fighter | Photo Credit: DOD
Persons: Technology readiness assessment is the practice of assessing a technology’s maturity so that you can identify technical risks in the life cycle of the particular system that you’re building. It could be an advanced plant or some sort of facility that the Department of Energy is building, for example; it could be an advanced weapon system, like the Department of Defense’s next-generation F-35 Lightning II, the Joint Strike Fighter. And it could be a next-generation homeland security system like an advanced spectroscopic portal monitor we would use at our borders.
Technology doesn’t just sort of pop into existence; it starts out in a life cycle. Someone will have an idea, and may write it down and publish a paper on it. We have a world-class university system here in the United States…where you come up with ideas and develop a concept of something you could test out at the laboratory level. There’s a difference between what a given technology may do in the lab and what may happen in some operational context, like launching it into space or putting it where lives are on the line in an airplane, let’s say. You really want to understand the nature of the technology, how well known it is, how well tested it is…if you have to do something like push a launch button and launch billions of dollars of value up into space. You want to have confidence that it works.
This is an area GAO has tackled many times before in reports over the past couple decades. For example, GAO offered warnings to DOD over the Joint Strike Fighter program you mentioned, about some of the risks in moving technology out the door too quickly. GAO has also looked at projects funded by the Department of Energy and the National Nuclear Security Administration, and you regularly evaluate NASA’s major projects. How do agencies apply TRA now, and what challenges do they face?
Persons: In terms of the agencies and departments that have been well-practiced in TRA, the original example is NASA. I use the language about launching a rocket intentionally because that’s a high-consequence, high-regret scenario. You want to make sure anything you launch into orbit or you put out to scan the universe will actually work. DOD, the Air Force, Navy, those services will use [TRA] as part of their weapons acquisition process. And certainly the Department of Energy, particularly the Environmental Management component of DOE dealing with nuclear waste cleanup – think of the Hanford nuclear waste site and Savannah River, dealing with both high and low-activity waste in various chemistries and volumes. There’s a lot of focus on technology readiness assessment to give decisionmakers and project managers risk management information on which way they need to go to accomplish their mission.
Where we’ve seen challenges pop up has been with relatively immature technology. So think about the Technology Readiness Level [TRL] scale that NASA came up with years ago; it’s a 1-through-9 scale. Levels 1, 2, and 3 are typically university-level research, or an advanced research and development domain. Levels 4, 5, and 6 are where you’re doing product development or system development. Levels 7, 8, and 9 are where you’re starting to operationalize it and launch it in a relevant environment, ultimately [getting] to TRL 9, where it’s fully functional, everything works, and there’s no risk any longer in the maturity. Where agencies and departments have gotten into trouble has been when they take relatively immature technology – think TRL 3 technologies – and put them in an operational or acquisition context, when GAO has said a best practice is actually TRL 6 and even 7 [for operationalization].
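The three-phase grouping of the TRL scale that Persons describes can be sketched in code. This is a minimal illustration only: the phase groupings and the TRL 6 threshold come from the interview, but the function names and structure are hypothetical, not drawn from the GAO guide itself.

```python
# Illustrative sketch of NASA's 1-through-9 TRL scale as grouped in the
# interview above. Names are hypothetical, not from the GAO draft guide.

TRL_PHASES = {
    range(1, 4): "basic research",              # TRL 1-3: university-level R&D
    range(4, 7): "product/system development",  # TRL 4-6
    range(7, 10): "operationalization",         # TRL 7-9
}

# GAO's stated best practice: roughly TRL 6, or even 7, before a
# technology enters an operational or acquisition context.
GAO_BEST_PRACTICE_MIN_TRL = 6

def phase_for(trl: int) -> str:
    """Return the development phase for a TRL from 1 to 9."""
    for levels, phase in TRL_PHASES.items():
        if trl in levels:
            return phase
    raise ValueError(f"TRL must be between 1 and 9, got {trl}")

def meets_maturity_bar(trl: int) -> bool:
    """Flag whether a technology meets the best-practice maturity threshold."""
    return trl >= GAO_BEST_PRACTICE_MIN_TRL

print(phase_for(3))           # basic research
print(meets_maturity_bar(3))  # False: the risky pattern Persons describes
print(meets_maturity_bar(7))  # True
```

A TRL 3 technology pushed into acquisition fails this check, which is exactly the "parallelizing" risk described in the next answer.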
When that doesn’t happen, what you’re effectively doing is parallelizing – you’re trying to both mature the technology and get it ready for launch or deployment at the same time. That’s where there has been a tremendous amount of risk taken…A number of agencies and departments have done it. The number one challenge has been taking relatively immature technologies and acting as if in a project management or acquisition life cycle context they’re at a much higher maturity than they actually are. And then when they don’t actually work in their operational context, that’s where programs have fallen on the rocks and things don’t work, delays are incurred, and there are large expenses.
Tim Persons, GAO
What do you hope federal programs get out of the guide?
Persons: We want to simply write down the best practices of tech readiness assessment. GAO didn’t walk into this process pretending to have some sort of authoritative knowledge or all kinds of experience in practice. But what we did want to do is use our unique status within the Congress…to just ask the question, what are the best practices for TRA? There are a number of agencies that have been well-practiced in this; I mentioned NASA and DOD, for example, and they have written documents on these things. But that doesn’t help other agencies that haven’t been working on this but equally want it or need it, like the Department of Transportation or other elements of the Department of Energy.
The second benefit is it helps GAO when it does its ongoing evaluations and its normal oversight functions…We’re telling an agency, when the Congress asks us to look at some weapon system or space system, it’s an open-book test. Everybody knows up front what we’re going to be looking for. And because it wasn’t written simply by us, but coordinated through us, it really does have the weight of the global community of practice, and that’s what I think makes these powerful and unique.
How did GAO go about assembling these best practices?
Persons: We cast a wide net on this. We certainly connected with any of the agencies, departments or entities we knew of. We also went to academia, we went to industry and got a lot of positive, enthusiastic help from major players. And we wanted that intentionally across sectors – not just one sector like aerospace, but others as well…Like with our previous guides, we’ve had international perspectives also rolled in. We put our signal out and we’re open to all the best thinking that’s possible. It’s all done of course on a voluntary basis for us…What’s unique about these guides is they’re expertise-crowdsourced, and they’re written with help from all our friends. They’re very different than a normal GAO report.
Would you say there’s a lot the agencies can learn from industry practice, which might be more conservative in some cases?
Persons: I think there’s something to that. From an industry perspective you’re really talking about product development. You want to have a lot of de-risking occur in your labs, or if you’re supporting academics to study some things for you, you’d want to have high confidence in those things before you craft and launch a product, which can dramatically affect your bottom line. In the government case, there is oftentimes a higher tolerance of risk because there are certain things that only the government can do. There’s a lot of innovation that needs to happen in a competitive market, yet there are always things like nuclear submarines or special nuclear material capabilities that only the government can and should do. And that’s where you may find in those cases a higher tolerance of risk to do things, because it’s so unique and exotic, and in almost every case there’s a very pressing national need.
* * *
Visit GAO’s website for more.