High-quality data key to business decisions – US prof

30th October 2015 By: Schalk Burger - Creamer Media Senior Deputy Editor

High-quality data is critical for producing the reliable and accurate information that improves business and industry decisions, says University of Arkansas at Little Rock master data management specialist Professor John Talburt.

More companies realise that much of their competitive advantage lies in their data and, commensurately, in the management of its quality.

The end goal of data quality management is to produce information as a product. The movement emerged in the 1980s at the Massachusetts Institute of Technology (MIT).

A key concept introduced by the MIT specialists was the application of the manufacturing paradigm of total quality management, which entails ensuring quality across the entire supply chain, from raw materials to end-product use.

Talburt says this approach emphasises the continuous assessment and improvement of quality in data sourcing, capture and storage, with particular emphasis on system architecture so that quality is built in as a criterion from the earliest stages.

A useful way for companies to approach data quality management is to view and manage information as a product. This demands good-quality data and sound data governance, as well as an understanding that data and information are assets that need proactive monitoring and management, just as other company assets do, he explains.

“Most large enterprises are implementing data governance across their structures, but it is linked to the maturity of data management practices in an organisation,” says Talburt.

While the critical data management components of data governance are still relatively immature, practitioners recognise the value of establishing data dictionaries and developing a unified business data glossary that collects and defines the business rules needed to manage information and data quality effectively.

This approach enables the centralised management of data quality, as well as the collection and storage of metadata (the characteristics of data that enable it to be organised and used effectively in various processes) in a single repository. It also aids compliance with data management regulations, international standards, best practices and in-house policies.
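To make the idea concrete, the sketch below is a minimal illustration, in Python, of a single repository that holds glossary definitions, the business rules attached to each term and basic metadata about each data element. All field names, rules and stewardship details are hypothetical examples, not anything prescribed by Talburt or by ISO 8000.

```python
# Minimal sketch (hypothetical names and rules): one repository holding a
# business glossary, its business rules, and basic metadata per data element.
import re

GLOSSARY = {
    "customer_id": {
        "definition": "Unique identifier assigned to a customer at onboarding.",
        "source_system": "CRM",               # metadata: where the element originates
        "steward": "Customer Data Team",      # metadata: who is accountable for it
        "rule": lambda v: bool(re.fullmatch(r"C\d{6}", str(v))),  # e.g. C123456
    },
    "email": {
        "definition": "Primary contact email address for the customer.",
        "source_system": "CRM",
        "steward": "Customer Data Team",
        "rule": lambda v: isinstance(v, str) and "@" in v and "." in v.split("@")[-1],
    },
}

def validate(record: dict) -> list:
    """Check a record against the centrally defined business rules."""
    problems = []
    for field, entry in GLOSSARY.items():
        value = record.get(field)
        if value is None:
            problems.append(f"{field}: missing value")
        elif not entry["rule"](value):
            problems.append(f"{field}: '{value}' violates the glossary rule")
    return problems

if __name__ == "__main__":
    print(validate({"customer_id": "C123456", "email": "jane@example.com"}))  # []
    print(validate({"customer_id": "12345", "email": "not-an-email"}))        # two problems
```

Keeping definitions, metadata and rules in one structure is the point made above: every process that consumes the data validates it against the same, centrally governed source.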

Developing architectures for information generation, propagation and use across companies’ processes ensures that information systems are more robust and create higher-quality information that does not require fixing or cleaning at a later stage.
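One way to read quality being designed into the architecture is to enforce the rules at the point of capture, so that bad records are quarantined before they propagate, rather than cleaned downstream. The fragment below is a rough sketch of such an ingestion step, again using made-up rules and field names rather than anything from the article.

```python
# Minimal sketch (hypothetical names): enforce quality as data is captured,
# so downstream systems never see records that break the rules.
import re

# Rules applied before a record is stored anywhere.
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", str(v))),
    "email": lambda v: isinstance(v, str) and "@" in v,
}

def ingest(records):
    """Split an incoming batch into accepted records and quarantined ones."""
    accepted, quarantined = [], []
    for record in records:
        problems = [f for f, rule in RULES.items() if not rule(record.get(f))]
        if problems:
            quarantined.append((record, problems))  # routed to stewards, never loaded
        else:
            accepted.append(record)                 # only clean records flow downstream
    return accepted, quarantined

if __name__ == "__main__":
    batch = [
        {"customer_id": "C000001", "email": "a@b.co"},
        {"customer_id": "bad-id", "email": "missing-at-sign"},
    ]
    good, bad = ingest(batch)
    print(f"{len(good)} accepted, {len(bad)} quarantined")
```

Routing failures to a quarantine queue for data stewards, instead of loading them and repairing them later, is the architectural choice the paragraph above describes.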

“Critical to the movement towards effective data management and governance is the role of leadership. This has resulted in the creation of the new office of chief data officer (CDO), and MIT has been a big proponent of CDOs.

“Data quality management principles and standards have been codified in the ISO 8000 series of standards and, although significant work remains, business leaders should begin emphasising good data governance to produce high-quality information for business use,” concludes Talburt.