Tech Transfer Central
Tech Transfer eNews

Tech Transfer eNews provides a weekly round-up of current news and information in the world of tech transfer, delivered every Wednesday (sign up here). It is published by Technology Transfer Tactics, a monthly subscription newsletter. For more information or to order a subscription, click here; for a sample issue, click here.

Consistency, standardization are keys to solidifying your TTO’s data integrity

It takes a good deal of time to plan properly when implementing a new database (or cleaning up an existing one) and instituting processes that will lead to greater data integrity. However, failure to do so will take even more time and will ultimately cost more money, warned a pair of experts from Fuentek, LLC, the presenters in a webinar sponsored by TTT entitled “Improving Data Integrity: Boost TTO Operations and Improve IP Management Database Reliability.”

“Take time at the beginning of the process,” advised Fuentek Vice President Becky Stoughton. “If you think about [possible pitfalls] ahead of time, there is a greater potential to minimize them.”

So, what exactly is data integrity? “Data integrity essentially means you have accurate and complete information within your database system so you can get to it quickly and easily and can provide it in a reasonable amount of time,” said Laura Schoppe, president of Fuentek. “How you present it — and to whom — takes some thought. The strategic [approach] is also important.”

What causes poor data integrity? One of the main causes, said Schoppe, is missing and incomplete information. Another, she said, is inconsistent data entry. Some very common reasons behind those problems, she noted, include lack of adequate training, lack of clarity in assigning database responsibility, and a lack of standardization in how data should be entered and what should be included. Other causes of poor data integrity, she added, include staff turnover or changes in jobs or responsibilities within the office. “Another is lack of time — people say there are a lot of steps and they do not have the time,” said Schoppe.

“I’m here to tell you it will cost you a lot more on the back end,” she warned. “Entering the data completely and accurately will pay off manifold on the back end.”

To put it simply, she said, it all depends on having good standard operating procedures (SOPs), training people properly, clearly defining responsibilities, and establishing standards for what data to include and how it should be formatted.
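The standards Schoppe describes — required data, consistent formats — can be enforced mechanically at entry time. The sketch below illustrates the idea; the field names, status values, and date format are hypothetical examples, not drawn from any particular IP management system.

```python
from datetime import datetime

# Hypothetical required fields and format rules for a disclosure record;
# a real TTO would derive these from its own SOPs and database schema.
REQUIRED_FIELDS = ["disclosure_id", "title", "inventor", "disclosure_date", "status"]
ALLOWED_STATUSES = {"received", "under_review", "patented", "licensed", "closed"}

def validate_record(record: dict) -> list:
    """Return a list of data-integrity problems found in one record."""
    problems = []
    # Completeness: every required field must be present and non-blank.
    for field in REQUIRED_FIELDS:
        if not str(record.get(field, "")).strip():
            problems.append(f"missing required field: {field}")
    # Standardization: status must come from the controlled vocabulary.
    status = str(record.get("status", "")).strip().lower()
    if status and status not in ALLOWED_STATUSES:
        problems.append(f"nonstandard status value: {record['status']!r}")
    # Consistency: one agreed date format for the whole office.
    date = str(record.get("disclosure_date", "")).strip()
    if date:
        try:
            datetime.strptime(date, "%Y-%m-%d")
        except ValueError:
            problems.append(f"date not in YYYY-MM-DD format: {date!r}")
    return problems
```

Run against each record before it is saved, a check like this catches the missing and inconsistent entries Schoppe cites as the main causes of poor data integrity, rather than discovering them during a "fire drill" later.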

Many of the impacts of “bad data,” Schoppe noted, are “fairly obvious.” Errors and omissions, for example, can lead to significant inefficiency, decreased productivity, and, as a result, increased costs.

“We had a government client,” she recalled. “While we had meticulous records [of their disclosures], they had poor data management and would get requests for information from external parties.”

Fuentek, she said, would get “fire drill” calls from the client, asking what technologies they had in specific sectors, and which were the top-ranked assets. “We were able to do it, but the fact that they couldn’t was very concerning,” said Schoppe. “They have fixed things since then, but it’s a good example of how [bad data] can have a negative impact on how you get to good opportunities or cause serious problems internally and externally.”

People “get unhappy with you,” she continued, if you can’t get them data on the fly. “You may also have the potential of missing out on revenue,” Schoppe observed. For example, a university client was running their database essentially off an Excel spreadsheet; it had become their primary way of managing data. “It was not complete; it was not connected to additional things like licensing activity,” she explained. “They had not asked to see licensing revenue for years and were not collecting everything. The office did not have good data and was perceived as inept and disorganized. These clients did miss out on opportunities when they were not able to get to data as quickly as possible.”

In short, she summarized, the old adage “garbage in/garbage out” really applies. “No matter what system you use, how you enter data has a huge impact on output,” Schoppe asserted.
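One way to act on “garbage in/garbage out” is to normalize free-text entries before they reach the database, so that case and spelling variants collapse to a single canonical value. The sketch below assumes an illustrative lookup table; the department names and variants are invented for the example.

```python
import re

# Illustrative mapping of common entry variants to one canonical form;
# a real office would build this table from its own naming conventions.
CANONICAL_DEPTS = {
    "ece": "Electrical & Computer Engineering",
    "electrical and computer engineering": "Electrical & Computer Engineering",
    "mech eng": "Mechanical Engineering",
}

def normalize_department(raw: str) -> str:
    """Collapse whitespace and case variants to one canonical department name."""
    key = re.sub(r"\s+", " ", raw.strip().lower())
    # Fall back to the trimmed input when no canonical form is known.
    return CANONICAL_DEPTS.get(key, raw.strip())
```

Applying normalization at entry time means reports and searches later return one department, not three spellings of it — exactly the kind of consistency Schoppe says determines output quality regardless of which system is used.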

A detailed article on improving data integrity appears in the January issue of Technology Transfer Tactics. For subscription information, click here. For information on the distance learning program “Improving Data Integrity: Boost TTO Operations and Improve IP Management Database Reliability” – now available on DVD and on-demand video, click here.

Posted under: Tech Transfer e-News