For many organizations, data has become the most important corporate asset. Better data means better productivity, reduced risk exposure, higher customer satisfaction, more robust business intelligence, and stronger regulatory compliance. As data proliferates throughout the enterprise at exponential rates, CIOs are finding it more difficult to fully understand the breadth of their data assets and to assure the timely delivery of data to internal business users and external customers at acceptable levels of quality and security. Although data is a commodity, the stewardship of data is not. The disciplined management of data—data stewardship and data governance—and its supporting infrastructure has become a leading mechanism for sustaining competitive advantage and growth. Increasing numbers of large and medium-size organizations are discovering the value of dashboard-enabled quality management of data in areas such as:
- Data Distribution Control
- Data Redundancy Control
- Data Quality Control
- Metadata Persistence
- Data Semantics
- Data Milestoning and Archiving
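To make the first three items concrete, a stewardship dashboard is typically fed by simple profiling metrics. The sketch below is purely illustrative—the record layout, field names, and metric definitions are hypothetical, not taken from any particular product:

```python
# Illustrative sketch: two data-quality metrics that could feed a
# stewardship dashboard. Field names and sample data are hypothetical.

def completeness(records, field):
    """Fraction of records with a non-empty value for `field`."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def redundancy(records, key_fields):
    """Fraction of records whose key duplicates another record's key."""
    keys = [tuple(r.get(f) for f in key_fields) for r in records]
    if not keys:
        return 0.0
    return 1 - len(set(keys)) / len(keys)

customers = [
    {"id": 1, "email": "a@example.com", "name": "Ann"},
    {"id": 2, "email": "", "name": "Bob"},                # missing email
    {"id": 3, "email": "a@example.com", "name": "Ann"},   # duplicate of id 1
]

print(f"email completeness: {completeness(customers, 'email'):.0%}")
print(f"name/email redundancy: {redundancy(customers, ['name', 'email']):.0%}")
```

Metrics like these, trended over time per data store, are what give a dashboard its "control" dimension: a falling completeness score or rising redundancy score is an early warning to the steward.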
Using a dashboard-driven approach to data stewardship offers a return on investment that far outweighs the costs of constructing, deploying, and maintaining the dashboard. Having quality data means being able to seamlessly share and distribute "one version of the truth" throughout the enterprise. One Trusted Version of the Truth demands tireless attention to many factors:
- Intelligibility and Definition
- Documentation and Education
- Rules and Transformations
- Ontology: Supply Chains, Dependencies and Origins
- Mapping to Business Processes
Much of my winter of 2008 was spent in Japan, where I met with numerous technology and business executives to discuss their data strategies and related challenges in business intelligence (BI). A startling realization was that many organizations were still in the beginning stages of implementing formal data governance and stewardship programs. Moreover, even the more data-savvy Japanese businesses—ones that had at least conducted data audits—were still at a loss as to how to better monitor, measure, and improve data quality and consistency. Data audits will invariably show that large portions of enterprise data are:
- Inaccurate and Inconsistent
- Not Delivered in a Timely Fashion
- Not Easily Accessible
- Semantically Confusing
- Not Secure or Confidential
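Findings like the first two above lend themselves to automated checks. The sketch below is a hypothetical audit rule, not an actual audit methodology: the fields, the 30-day freshness window, and the status/close-date consistency rule are all assumptions chosen for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical audit rule: a record is stale if it has not been updated
# within the freshness window, and inconsistent if its status says "open"
# while a close date is present. Fields and thresholds are illustrative.
FRESHNESS_WINDOW = timedelta(days=30)

def audit(record, now):
    """Return a list of issues found in a single record."""
    issues = []
    if now - record["updated_at"] > FRESHNESS_WINDOW:
        issues.append("stale")
    if record["status"] == "open" and record.get("closed_at") is not None:
        issues.append("inconsistent status/closed_at")
    return issues

now = datetime(2008, 12, 1)
rec = {"updated_at": datetime(2008, 9, 1), "status": "open",
       "closed_at": datetime(2008, 10, 15)}
print(audit(rec, now))  # flags both staleness and the status contradiction
```

Run across a data store, checks of this kind turn a one-off audit finding into a repeatable measurement—which is exactly the gap the audited organizations described above had not yet closed.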
Thus, data audits are a great catalyst for change; however, companies need to confidently take the right steps to fix the problems the audit exposes. Jason de Luca, President of Smart Partners Japan (a Tokyo-based consultancy), elaborates: "Our clients know they are in trouble with their data, but they seldom have a decent roadmap that will lead to an essential solution." Organizational political quagmires usually dictate that companies look to an outsider, i.e., a management consulting firm, to ask the tough questions that will ultimately produce pivotal and lasting improvements in data quality:
- What are the current trusted sources of record for market data, reference data, and transactional data?
- Which data stores are the most critical to the business? …to operational continuity?
- What kinds of business processes are involved in the creation of mission critical data? Who owns these processes?
- What documentation exists now—such as data models, system mappings, rules, etc.?
- Is there a current enterprise data integration strategy?
The key is to relentlessly remind executives of the consequences of poor data governance and stewardship, the perils of which can be far-reaching. For example:
- Obtaining an overall picture of corporate performance will be impossible
- Customer service will be a perpetual laggard
- Support for regulatory and audit activities will be minimal
- Strategically aligning and governing enterprise business segments will be impossible
- Identifying risk in a cross-functional manner will be onerous, if not impossible