New report suggests strategies for agencies to better quantify value of T2

November 17, 2021

The metrics currently mandated for documenting the value of federal technology transfer may be insufficient for informing high-level decisions related to policy, strategy and funding, according to a new report that outlines the challenges of quantifying the impact of tech transfer and suggests ways for agencies to help address those challenges. Those suggestions include expanding the concept of impactful metrics beyond those mandated by law, and making sure new efforts to develop agency-level "learning agendas" include technology transfer.

To assist in measuring the effectiveness of federal technology transfer activities, the Office of Science and Technology Policy (OSTP) requested that the Institute for Defense Analyses (IDA) Science and Technology Policy Institute (STPI) develop the paper in coordination with the National Science and Technology Council's Lab-to-Market (L2M) Subcommittee's Strategy Team Five interagency working group.

"It’s important for agencies to evaluate impacts beyond the traditional technology transfer metrics typically reported to Congress," said Vanessa Peña, a research staff member at STPI and co-author of the report. "Creating a metrics portfolio and using innovative measures and metrics can assist in communicating how tech transfer efforts lead to impacts."

The report notes that federal agencies report many measures and metrics related to intramural technology transfer activities. As seen in the National Institute of Standards and Technology (NIST) reports on federal technology transfer, agencies do collect information about their inventions and partnerships. (Read more about the latest NIST report here.)

While intramural reporting is required of agencies, there is no similar requirement to report extramural inventions. Government grantees and contractors performing federally funded R&D do report inventions to agencies through invention data management systems such as iEdison; however, this information is not compiled into a public report.

In addition to the required metrics reported by federal agencies, the STPI report notes that agencies may wish to develop a more holistic metrics portfolio that considers both intramural and extramural research, including measures of activities beyond the narrow context of revenue generation and expanded measures that capture contributions to broader economic prosperity, national security, and societal impact. The report also stresses the importance of evaluating and communicating these impact measures in the context of each agency's mission and the missions of its sub-agencies; a lack of mission-specific context can lead to inappropriate generalizations across federal entities with very different objectives.

"Telling the story of R&D and tech transfer impacts, and the respective measures used, will vary depending on agency mission and the scope of tech transfer efforts they support across the research, development, demonstration, and deployment (RDD&D) continuum," Peña said.

Learning agendas, mandated by the Foundations for Evidence-Based Policymaking Act of 2018, represent a new opportunity to communicate data related to tech transfer's value—and to do so within the context of an agency's mission. These learning agendas are intended to allow federal agencies to systematically identify and prioritize questions related to their programs, policies and regulations. Learning agendas for an agency’s technology transfer activities could be developed and integrated into broader learning agenda efforts at the agency and customized for the agency's specific missions and needs, the report suggests.

"It is important to develop these frameworks to increase the evidence-base for impacts and evaluation capacity," Peña said. "There’s also the potential to link agency and lab-level learning agenda efforts with broader agency-level ones, which are now required to evaluate mission performance. Tech transfer should be part of that development and discussion, to improve linkages between agency strategic priorities and tech transfer field activities."

The report suggests that key elements of a learning agenda addressing federal technology transfer evaluation challenges include context; questions; proposed activities to address those questions; a timeline; proposed tools and methods; types of existing data and new data needed; and challenges and proposed solutions. Other recommendations in the report relate to aligning a tech transfer learning agenda with strategic planning and budgeting, engaging and communicating with stakeholders, and collecting data and setting expectations.

Read the report: