The Costs and Effects of Poor Data Quality
An extensive literature has identified the high costs of poor data quality, recognising that firms may lose upwards of 10% of revenues due to poor operational data, together with other serious consequential effects on tactical decision making and strategy generation.
A report from The Data Warehouse Institute back in 2002 estimated that data quality problems cost US businesses $600 billion a year (5% of American GDP) in postage, printing and staff overhead costs alone, whilst the majority of the senior managers in those companies affected remained unaware (Eckerson 2002). A report published jointly by Dun and Bradstreet and the Richard Ivey School of Business in 2006 forecast that critical data within at least 25% of the Fortune 1000 companies would continue to be inaccurate and that “every business function will have direct costs associated with poor data quality”, whilst a survey conducted by the Economist Intelligence Unit in the same year on behalf of SAP and Intel reported that 72% of respondents said their data was sometimes inconsistent across departments and that workers frequently made poor decisions because of inadequate data. More recently, Larry English outlined a catalogue of corporate disasters emanating from poor quality business information amounting to ‘One and a Quarter Trillion Dollars’. A 2009 survey of 193 organisations sponsored by Pitney Bowes, 39% of which had revenues in excess of US $1 billion, reported that a third of the respondents rated their data quality as poor at best, whilst only 4% reported it as excellent (Information Difference 2009). A report from Gartner (one of the world’s leading IT research organisations) stated that “Through 2011, 75% of organisations will experience significantly reduced revenue growth potential and increased costs due to the failure to introduce data quality assurance” (Fisher 2009).
Even where organisations appreciate the importance of quality data, there appears to be a lack of any real progress. A 2007 survey conducted by BusinessWeek and Hewlett Packard highlighted the fact that many organisations will readily agree that their data is an important asset, but fail to take any action, whilst a PricewaterhouseCoopers survey a year later found that 70% of executives contacted considered data to be an important asset, yet only 40% felt they used it effectively (Informatica 2008). One may speculate as to the possible reasons for this apparent absence of activity. There may be a general lack of attention to data per se at various organisational levels, or enterprises may not persevere with the initiatives they put in place.
The quality of an organisation’s data not only has significant commercial connotations, but also has serious implications for all enterprises as they respond to the huge number of regulatory requirements, in the form of record keeping, data gathering and recording, and information provision. Failure to comply fully with any request can result in serious financial damage to an organisation or even threaten its very existence, even where fraud, deception or other misdemeanours are absent. This relates not only to prominent laws such as Sarbanes-Oxley and Basel II, but to the myriad of government demands for data, whether they be from HM Revenue and Customs, the Office for National Statistics, or any other governmental agency, whether UK or internationally based. All carry potential penalties for late delivery or erroneous information. There needs to be an alignment between Data Governance (as described below) and overall Corporate Governance. For any unified governance, risk and compliance (GRC) strategy to be successful, there has to be confidence in the quality of the underlying data.