The federal government has worked to increase its ability to innovate, and to improve its speed and agility, through a variety of programs. Are these efforts getting results? That’s the question a MITRE team set out to answer. Their findings now guide government agencies.
How Do You Measure Innovation? A MITRE Study Offers Answers
How is the government doing when it comes to spurring innovation? A MITRE team set out to discover how government innovation organizations assess their own success.
What they found has benefited numerous innovation-driving organizations in both government and industry.
It began when our researchers set out to assess and measure the effectiveness of government-funded innovation activities. The goal: provide recommended best practices agencies could adopt for improved measures of effectiveness. As soon as the team began sharing its findings and recommendations, it became clear that the researchers had tapped into an unmet need.
Their report was published in the October 2020 issue of Defense Acquisition Research Journal, winning second place in the journal’s Edward Hirsch Acquisition Writing competition. Justin Brunelle and Daniel Frisk are two of the authors and the principal investigators for this project, which was funded by MITRE’s independent R&D program. (Other authors include Paula Randall and Ben Mayer.)
To date, more than 30 government organizations have used the recommendations to inform their innovation work. Brunelle says he applies the guidelines he helped develop in his direct work for MITRE government sponsors every single day.
“MITRE itself is continually exploring new ways of increasing innovation, for example through collaboration,” Brunelle says. “Our findings from this research are relevant to most organizations. Because MITRE works with government, industry, and academia on innovation programs, we can help agencies adapt best practices from industry and academia, and from across government domains.”
The findings from this research help organizations map out what they’re trying to achieve and how to get there. Because government innovation activities frequently involve multiple organizations and stakeholders, each with their own definitions of success, there needs to be a common understanding of each participant’s characteristics, roles, activities, and metrics.
The First Challenge: Defining “Innovation”
To begin, the MITRE research team surveyed 39 government innovation organizations, including in-house departments and outside firms. The survey sought detailed information on mission, process, best practices, metrics, and organizational characteristics.
In analyzing the data, the researchers keyed in on the first challenge: defining innovation.
“Innovation means so many different things to different organizations,” Frisk says. “Does it mean outreach to a nontraditional company for contracting, or does it mean solving a problem with a bucket of parts, like they did for Apollo 13?”
To help the multiple participants in an innovation program speak a common language, we created a new vocabulary, identifying seven distinct categories: networker, educator/adviser, acquisition facilitator, investor, incubator, accelerator, and developer.
“People in these innovation organizations need to understand their roles and responsibilities and focus on their particular areas,” Frisk says. “Once people know what they should be doing, they can begin to measure their results.”
The Right Metrics: Focusing on Mission Impact
“Measuring innovation is a very hard problem,” Brunelle says. “Effective metrics provide insight on the workload, reach, productivity, and impact of an organization. They need to be aligned with end goals. Otherwise, you’re wasting time and money to collect data that doesn’t contribute to your understanding of how to implement an effective program. And each participant has a different role and thus a different metric.”
The team found some groups were measuring the amount of funding received or the number of projects undertaken, rather than measuring the impact on the mission. That may be partly because the outcome—or impact—takes time to materialize. It may not be felt for a year or more after an innovation has made its way to the field.
According to the report, outcome metrics are critical to connecting the activities of innovation organizations to the missions and goals of their parent organizations. Ultimately, the value of an innovation organization is measured by its ability to generate positive outcomes for its users. However, outcome metrics constituted less than 20% of all reported measures.
“It takes a lot of effort to track innovations as they move into user organizations and into regular use,” Frisk says. “And we found that most organizations weren’t doing the kind of tracking you need to prove impact.”
The team created a series of observations, recommendations, and proposed metrics for each type of innovation organization. For instance, an accelerator would want to pursue output metrics, such as the number of transitions, time to transition, and rate of adoption. An educator/adviser should measure the number of knowledge transfers.
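To make the role-to-metric pairing concrete, it could be represented as a simple lookup table. This is only an illustrative sketch: the seven role names come from the report's vocabulary, and the only metrics quoted in this article are those for the accelerator and educator/adviser roles, so the entries for the other roles below are hypothetical placeholders, not the team's actual recommendations.

```python
# Illustrative sketch: candidate metrics for each innovation-organization
# role in the MITRE vocabulary. Role names are from the report; metrics
# marked "hypothetical" are placeholders invented for this example.

PROPOSED_METRICS = {
    "networker": ["introductions made"],                    # hypothetical
    "educator/adviser": ["number of knowledge transfers"],  # from the report
    "acquisition facilitator": ["contracts facilitated"],   # hypothetical
    "investor": ["projects funded"],                        # hypothetical
    "incubator": ["ventures graduated"],                    # hypothetical
    "accelerator": ["number of transitions",                # from the report
                    "time to transition",
                    "rate of adoption"],
    "developer": ["prototypes delivered"],                  # hypothetical
}

def metrics_for(role: str) -> list[str]:
    """Return the candidate metrics for a given innovation role."""
    return PROPOSED_METRICS.get(role.lower(), [])

print(metrics_for("Accelerator"))
```

The point of such a mapping is the one Brunelle makes above: each participant has a different role and thus different metrics, so the first step is knowing which row of the table you are in.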
To make these recommendations actionable, the team built a survey-based decision tree that gives government innovators practical guidance on whether to build a new innovation organization or to discover and engage with relevant existing ones.
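A survey-based decision tree of this kind can be sketched as a small branching structure that yes/no answers are walked through. The article does not reproduce the team's actual questions, so the ones below are hypothetical stand-ins; only the build-versus-engage outcome mirrors the report's framing.

```python
# Minimal sketch of a survey-style decision tree. The questions are
# hypothetical; the MITRE report's actual decision tree is not
# reproduced in this article.

DECISION_TREE = {
    "question": "Does an existing organization already cover your innovation role?",
    "yes": {
        "question": "Can you engage it within your timeline and authorities?",
        "yes": "Engage the existing organization",
        "no": "Build a new innovation organization",
    },
    "no": "Build a new innovation organization",
}

def walk(tree, answers):
    """Follow a sequence of 'yes'/'no' answers to a recommendation."""
    node = tree
    for answer in answers:
        if isinstance(node, str):  # already reached a leaf recommendation
            break
        node = node[answer]
    return node

print(walk(DECISION_TREE, ["yes", "yes"]))
```

A real instrument would have many more branches, but the shape is the same: each survey answer narrows the space until a recommendation falls out.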
Guidance for Building a Solid Innovation Organization
The report has now become something of a handbook for any group seeking to develop a solid foundation for its innovation activities. The guidance applies to industry as well as government.
For instance, Mobilize used the report to help inform the appropriate metrics for their VISION tool, which is used to track innovations for the U.S. Air Force’s innovation ecosystem. Mobilize provides a dynamic-data-driven software suite that combines big data, human experience, and innovation—focusing on technology that empowers people to have a greater impact.
“VISION is one of the nearly 30 organizations already availing themselves of the MITRE report,” says Sanith Wijesinghe, who leads MITRE’s Agile Connected Government research. “That number is likely to grow significantly.”
He says government agencies are keenly aware that their approach to innovation is not on par with that of private-sector entities.
“As R&D budgets expand, there’s more pressure to find the best ways of using innovation, and MITRE has a role to play in that conversation. With this work, we can provide even more tactical guidance. That’s important because new funding translates into organizational changes whose effectiveness will need to be measured.”
What’s more, the guidance also works at MITRE. As Wijesinghe notes: “We’re using it in our independent R&D program to optimize our own work.”