At the start of a recent meeting with a senior IT industry executive, I was asked, “What is your opinion about all of this discussion around Big Data?” I found this question interesting for two reasons: 1) it was the first question asked, and 2) the scheduled topics for the meeting had absolutely nothing to do with big data! My response was the typical one – i.e., the challenges we see related to the increased organizational complexity resulting from the higher volume, velocity and variety of big data, etc.
The thing I really do like about all of this discussion around big data is that it elevates and validates the conversations around data and data quality in general. However, not all data should be treated equally: an executive in the FT article Less is More when it comes to Big Data puts it well when he says data is “like crude oil: it isn’t much use until you start to synthesise it”. Your company may have access to lots of data, but only some of it will be relevant.
Selfishly, many of us seek to parlay the increased air time enjoyed by the big data discussion into greater attention to the importance of data in general. In my view, there are three areas where there is still work to be done.
- Data deserves a dedicated line item on the corporate balance sheet. Thinking of the value of data in this way clarifies and justifies the increased investment and focus the other points demand. In Informatics: the Practice of Information Economics, Doug Laney points out that courts and insurers are still looking to find common ground around the treatment of data loss or destruction (let alone data-related misappropriation or misconduct). This is especially true for those companies whose intangible assets are based largely on business potential realized through the reach of their customer databases.
- Software vendors need to continue to integrate their disparate products for the movement and monitoring of organizational data into a holistic Data Stewardship Platform – a dedicated application used by data stewards to enforce governance policies and rules. Rex Ahlstrom, Chief Strategy Officer at BackOffice Associates, discusses some of the necessary components here.
- Organizations need to continue to hone their skills in developing business cases that demonstrate the value of data quality: i.e., how good, clean data simply makes companies better and stronger. As analytics thought leader Tom Davenport points out, there are several approaches that can be deployed to establish and monitor these business process improvement initiatives and their associated business value over time.
As the conversations mentioned above mature, some of the enterprise apathy around data will be removed: i.e., we will no longer hear comments like “We’re a $100bn company and continue to grow and be profitable. Nobody from the business is screaming too much about data quality. It’s really not an area of focus for us.”
The perspective above is that of a company that only strives to keep its lights on: i.e., it doesn’t seek to optimize the efficiency and value of the light generated. Companies don’t need to process every piece of data until it is squeaky clean. However, companies do need to understand the intrinsic value of potential incremental data processing decisions. Returning to the earlier crude oil analogy, companies need clarity on what should stay as crude oil and what should be refined further into higher-value products like lubricants, detergents or fibres such as nylon and polyester. The “bigger” data gets, the more important these data quality processing decisions become – the last thing you want is your data telling you to abandon an oil well long before you have extracted all of its potential value.
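That refine-or-leave-as-crude decision can be framed as a simple expected-value comparison. The sketch below is purely illustrative – the dataset names and all dollar figures are hypothetical assumptions, not from the article – and it simply compares the cost of an additional cleansing pass against the estimated business value that the cleaner data would unlock.

```python
# Illustrative sketch only: all dataset names and figures below are
# hypothetical assumptions used to show the shape of the decision.

def refine_worth_it(cleansing_cost, expected_value_uplift):
    """Return True when the estimated value unlocked by a further
    cleansing pass exceeds the cost of performing it."""
    return expected_value_uplift > cleansing_cost

datasets = {
    # dataset: (cost of extra cleansing pass, estimated value uplift)
    "customer_master": (50_000, 400_000),  # high-reach data: worth refining
    "legacy_web_logs": (80_000, 20_000),   # low value: leave as "crude"
}

for name, (cost, uplift) in datasets.items():
    decision = "refine further" if refine_worth_it(cost, uplift) else "leave as crude"
    print(f"{name}: {decision}")
```

In practice the uplift estimate would come from the kind of business case development described above, but even this crude comparison makes the point: not every dataset earns the same degree of processing.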