
When is Good Enough, Good Enough? The Case for Flexible Data Quality

Published by Luc Hendrikx
July 31, 2024 @ 9:23 AM

Moderating countless CIONET round-table discussions, TRIBE Sessions, and Community Events on Data and AI in recent months, I have seen a recurring theme emerge amidst the AI frenzy: a renewed and heightened focus on data quality and data governance. It seems the transformative potential of artificial intelligence has cast a spotlight on a fundamental truth – the effectiveness of AI is inextricably linked to the quality of the data it consumes. Yet, as with many emerging technologies, the pendulum may be swinging too far, with organisations overemphasising perfection at the expense of practicality.

In the realm of regulatory reporting and financial disclosures, the mandate is clear: absolute accuracy is non-negotiable. A single data error can trigger a cascade of consequences, from regulatory penalties to damaged reputations. This zero-tolerance approach has rightfully shaped the way organisations handle their most sensitive data.

However, does the same level of rigour need to apply across all data use cases? As businesses increasingly harness data to drive decision-making in areas like pricing, marketing, and operations, the answer may not be a simple "yes".

The High Cost of Perfection

Striving for flawless data is commendable, but it's not without its costs. It often demands significant resources, both in terms of time and technology. The quest for perfection can create bottlenecks, slow down decision-making, and even stifle innovation.

In certain scenarios, a "good enough" data approach may be not only acceptable but strategically advantageous. When modelling price elasticity or generating leads, for instance, a small margin of error might have a negligible impact on the outcome. The speed and agility gained from using readily available data can often outweigh the risks associated with minor inaccuracies.

One Size Doesn't Fit All

The challenge lies in recognising that data quality is not a one-size-fits-all proposition. Many organisations still rely on a centralised data team that applies a uniform quality standard across all data sets. This approach, while well-intentioned, can be overly restrictive and fail to account for the diverse needs of different business functions.

What's needed is a more nuanced approach – a framework that allows for varying levels of data quality based on the specific use case (a short sketch follows the list below). This could involve:

  • Data Tiering: Categorising data into different tiers based on its criticality and intended use.
  • Targeted Data Cleaning: Focusing resources on improving the quality of data for the most critical use cases.
  • Empowering Business Users: Equipping teams with the tools and knowledge to assess data quality independently and make informed decisions.
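
To make the tiering idea concrete, here is a minimal Python sketch. The tier names, the quality thresholds, and the release_allowed gate are illustrative assumptions for this post, not an established standard or any particular organisation's implementation:

```python
from dataclasses import dataclass
from enum import Enum


class DataTier(Enum):
    """Hypothetical tiers, named after the use cases discussed above."""
    REGULATORY = "regulatory"    # financial and regulatory reporting
    OPERATIONAL = "operational"  # pricing, marketing, operations models
    EXPLORATORY = "exploratory"  # ad-hoc analysis and lead generation


# Minimum share of records that must pass validation before a data set
# is released to consumers in that tier (thresholds are assumptions).
QUALITY_THRESHOLDS = {
    DataTier.REGULATORY: 1.00,   # absolute accuracy is non-negotiable
    DataTier.OPERATIONAL: 0.98,  # small margins of error are tolerable
    DataTier.EXPLORATORY: 0.90,  # speed and agility outweigh minor errors
}


@dataclass
class QualityReport:
    records_total: int
    records_valid: int

    @property
    def valid_ratio(self) -> float:
        return self.records_valid / self.records_total if self.records_total else 0.0


def release_allowed(report: QualityReport, tier: DataTier) -> bool:
    """Gate a data set on its tier's threshold, not on one uniform standard."""
    return report.valid_ratio >= QUALITY_THRESHOLDS[tier]


# An extract with 2% invalid rows can ship for lead generation...
print(release_allowed(QualityReport(10_000, 9_800), DataTier.EXPLORATORY))  # True
# ...but the same extract is blocked for regulatory reporting.
print(release_allowed(QualityReport(10_000, 9_800), DataTier.REGULATORY))   # False
```

The design point is simply that the quality gate takes the tier as an input: a marketing extract and a regulatory report are held to different, explicitly documented standards rather than one uniform bar.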

One Team Can't Do It All

One striking example illustrates this: a leading Belgian financial institution maintains separate Chief Data Officers and data organisations, one for the commercial exploitation of data and one for its financial and regulatory use. Their argument is that the two categories of use cases require not just a more nuanced approach, but different people, with different personality profiles and attitudes.

Striking the Right Balance

Embracing a flexible approach to data quality doesn't mean abandoning rigour altogether. It's about finding the right balance between accuracy and agility, recognising that "perfect" might not always be the most practical or profitable goal.

By tailoring data quality standards to specific business needs, organisations can unlock the full potential of their data assets and drive better decision-making across the board.

Let's Start the Conversation

The concept of "good enough" data is sure to spark debate. One of the challenges for the modern Digital Leader is to strike the right balance and to organise Data Governance so that the best approach is applied to each data use case. As a community, we need to discuss the implications, explore the potential benefits, and develop best practices for implementing a more flexible approach to data quality and data governance. After your well-deserved break with friends and family, CIONET Belgium will be organising several events that touch upon this topic.

Let's not let the pursuit of perfection hold us back.

Let's be smart about our data.

 
