Tech Transfer Central
Tech Transfer eNews

Tech Transfer eNews provides a weekly round-up of current news and information in the world of tech transfer, delivered every Wednesday (sign up here). It is published by the Technology Transfer Tactics newsletter, which is available as a monthly subscription. For more information or to order a subscription, click here; for a sample issue, click here.

Improve TTO benchmarking by normalizing data and taking a deeper dive behind the numbers

A detailed article on using a normalization process to attain more meaningful TTO metrics and benchmarks appears in the August issue of Technology Transfer Tactics. To subscribe and access the full article, along with more than 13 years of archived best practices and success strategies for TTOs, click here.

TTOs must understand their own performance data if they are to optimize outcomes, but it is also critical to normalize that data so that comparisons against other universities are valid and highlight potential areas for improvement.

Tracking data is vital not just for benchmarking against others but also for understanding the effectiveness of the tech transfer program and staff, stresses Laura Schoppe, founder and president of Fuentek, a tech transfer consulting company in Cary, NC, and current chair for the board of the AUTM Foundation.

AUTM survey data is the most comprehensive information a TTO can use to compare its performance to that of peers, and TTOs can compare their own metrics against those of other institutions to determine whether they are overstaffed or understaffed, paying too much in legal fees, or patenting too much or too little, Schoppe explains. But it's important to normalize the data so that comparisons are made against true peers and the results are meaningful.

Comparing metrics is not as simple as putting your data alongside that of the best performers, she emphasizes. Without normalization, you may unknowingly be measuring yourself against a TTO in significantly different circumstances, Schoppe says.

For example, “you don’t want to compare yourself to schools that have medical schools if you don’t have a medical school. The metrics are quite different,” she explains. “You want to pick schools that are similar to you [in] research expenditures, because it’s not reasonable if you are a school that is bringing in $200 million a year to compare yourself to a school that is bringing in $1 billion a year. You want to get schools that are above and below you, but reasonably close.”

You also can identify two peer groups — one that most closely resembles your own circumstances, and an aspirational peer group that represents the schools that your administration or your TTO would like to emulate. The first may provide the most accurate comparison to see where you stand, but the second may help identify tactics that could improve your performance, Schoppe suggests.

The data should be normalized along a series of different parameters, she says. For instance, you may use the number of invention disclosures divided by research expenditures. That normalizes the output, producing a ratio that accounts for the different research expenditures among schools, Schoppe explains. Another slice of the data may look at disclosures by number of office FTEs, or by number of licensing associates. “AUTM data has shown an expectation of between two and four invention disclosures for every $10 million in research expenditures,” she notes. “Four is the ideal — what you want to be shooting for. If you’re at one, that’s a red flag.”
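The normalization Schoppe describes can be sketched in a few lines of Python. This is only an illustration: the school names and figures below are invented, and only the benchmark of two to four disclosures per $10 million in research expenditures comes from the AUTM data she cites.

```python
# Hypothetical sketch of Schoppe's normalization: invention disclosures
# divided by research expenditures, expressed per $10 million.

def disclosures_per_10m(disclosures, research_expenditures):
    """Invention disclosures per $10 million of research expenditures."""
    return disclosures / (research_expenditures / 10_000_000)

# Invented peer data: (school, disclosures, annual research expenditures in $).
peers = [
    ("School A", 60, 200_000_000),
    ("School B", 250, 1_000_000_000),
    ("Your TTO", 20, 220_000_000),
]

for name, disc, spend in peers:
    ratio = disclosures_per_10m(disc, spend)
    # Per the AUTM figures cited above, a ratio near 1 is a red flag.
    flag = "  <-- red flag" if ratio <= 1 else ""
    print(f"{name}: {ratio:.2f} disclosures per $10M{flag}")
```

The raw disclosure counts here are not comparable (20 vs. 250), but the normalized ratios are, which is the point of dividing by research expenditures before benchmarking.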

The same kind of normalization can be applied to data on licensing revenue or any other data point you wish to consider. If you find yourself below the norm for AUTM or your peer group on any metric, that is the time to dig deeper into your own data beyond the numbers you provide to AUTM, Schoppe says. Look closer at the numbers by each college or department, she advises. For example, look at the research expenditures in the chemistry department vs. the number of invention disclosures. How does that compare to the physics department?

“When you start analyzing by department, you’re likely to find where the problem lies if you are below average. You’ll see that in certain departments you are not hitting your numbers,” she says. “That can help you focus and determine where you need to go in and do some training. If your chemistry department is the best funded in the school but you’re way low in invention disclosures, you can pinpoint your efforts and get your tech manager who specializes in chemistry to work very closely with them.”
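The department-level drill-down works the same way: compute the same normalized ratio per department and sort ascending to see where the shortfall sits. All department names and figures in this sketch are invented for illustration.

```python
# Hypothetical department-level breakdown of the disclosure ratio.
# department: (invention disclosures, research expenditures in $)
departments = {
    "Chemistry": (3, 80_000_000),
    "Physics":   (12, 40_000_000),
    "Biology":   (9, 30_000_000),
}

# Same normalization as before: disclosures per $10M of expenditures.
ratios = {
    dept: disc / (spend / 10_000_000)
    for dept, (disc, spend) in departments.items()
}

# Lowest ratios first: these departments may need outreach or training.
for dept, ratio in sorted(ratios.items(), key=lambda kv: kv[1]):
    print(f"{dept}: {ratio:.2f} disclosures per $10M")
```

In this invented data, the well-funded chemistry department produces far fewer disclosures per research dollar than physics or biology, which is exactly the pattern Schoppe suggests targeting with focused training.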

Another issue that may become clear when you compare your data to your peer group is having a lot of patents but not a lot of licensing revenue. In this case, Schoppe suggests looking at whether you are patenting a lot for one inventor or one department but not getting much revenue from it. If so, the data might indicate you have a “squeaky wheel” who gets a lot of attention without actually producing much in the end, she says.

“That person might be very productive with research and papers. That’s wonderful and you can acknowledge them in other ways, but spending money on another patent that is not going to license is not efficient for you,” Schoppe says. “The data can help you focus internally to look at whether you are patenting the right technology. It could be that you are not following up with the right marketing and that’s why the patents go nowhere, but it gives you something to think about.”
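The "squeaky wheel" check can likewise be made concrete by tallying licensing revenue against patents filed per inventor and sorting for outliers. The inventor names and figures below are hypothetical.

```python
# Hypothetical per-inventor records:
# (inventor, patents filed, licensing revenue in $)
portfolio = [
    ("Inventor A", 12, 5_000),
    ("Inventor B", 3, 400_000),
    ("Inventor C", 5, 150_000),
]

# Sort by revenue per patent, lowest first. A large patent count paired
# with little revenue per patent may indicate a "squeaky wheel".
for name, patents, revenue in sorted(portfolio, key=lambda r: r[2] / r[1]):
    print(f"{name}: {patents} patents, ${revenue / patents:,.0f} per patent")
```

Here the most heavily patented inventor generates the least revenue per patent, the pattern that would prompt a closer look at whether the right technologies are being patented or marketed.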

Click here to continue reading this article with a subscription to Technology Transfer Tactics. Already a subscriber? Click here to log in.

Posted under: Tech Transfer e-News