Data quality and integration are key to getting people to trust data (Photo by John Schnobrich on Unsplash)

Data integrity company Precisely has released its 2023 Data Integrity Trends and Insights Report (registration required). Data integrity is a requirement as organisations increasingly rely on data for their decision-making. Importantly, the growth of generative AI inside organisations will only deliver positive, business-changing results if data can be trusted.

The 26-page report was produced by the Center for Business Analytics at Drexel University’s LeBow College of Business (LeBow), which worked with Precisely. There were 450 respondents, from C-level execs to data architects, managers and analysts. Respondents came from a range of industries, including IT, insurance, manufacturing, financial services and government. Importantly, companies of all sizes, from very small to very large, were represented. While the sample is not huge, that multi-industry, multi-size range of responses lends the findings a reasonable degree of validity.

Key findings from the report

There are quite a number of interesting findings here. The top 10 called out in the report are:

  • 77% say data-driven decision-making is a leading goal for their data programs
  • 70% who struggle to trust their data say data quality is their biggest issue
  • 60% agree that data quality issues impact data integration projects
  • 57% say their overall data strategy is influenced by cloud adoption trends
  • 57% say data governance results in better analytics and insights
  • 53% rank data quality as the top priority for improving data integrity
  • 46% rate their ability to trust data used for decision-making as “high” or “very high”
  • 45% are challenged by lack of effective data management tools
  • 41% say poor address quality prevents the effective use of location data
  • 40% have decreased staffing and resources due to the economic downturn

Some of those responses seem surprisingly low, given the focus of the report and the roles of the respondents. For example, it’s hard to see why poor address data is still an issue. Organisations have been cleaning address data for decades, particularly for marketing campaigns. That it persists as a problem shows that either clean data is not being fed back into production systems or data entry validation is still ineffective.

With 70% of those who struggle to trust their data citing data quality as their biggest issue, it seems that organisations are still not investing enough in data quality. Why that is remains hard to understand. This is not a new issue; it has persisted for decades. Perhaps the problem lies in the applications in use and an unwillingness to address the underlying data problems.

Data program investments

Despite the surprise over some of the findings, there is significant investment going on in data programs, according to the report. Interestingly, the focus varies with the size of the company. For example, companies under 500 employees see operational efficiency as their top target, followed by data-driven decision-making. Larger organisations reverse those two.

According to the report authors, “In our experience, data-driven decision-making is a critical enabler and force multiplier, improving operational efficiency, cost reduction, revenue generation, and compliance outcomes.”

The statement therefore raises the question of why smaller firms focus on operational efficiency first. Is it about cost savings, a lack of mature processes that can exploit data-driven decision-making, a lack of skills, or all of the above?

There are a couple of interesting responses in this part of the report. For example, “providing business users with good data can create demand for more of the same.” It seems an obvious statement, but only 46% of respondents called it out.

The second is that companies are finally breaking down data silos and improving data access. Technology improvements are driving this, but it is also likely down to better data architectures. This cannot be just a technology fix, though. The report highlights better data literacy skills and staffing levels as also helping to make data more widely used.

Quality issues

It is unlikely that we will ever have a world where quality issues are not part of the data problem. Incomplete records, duplicate records and inconsistent data are just three of the issues here.
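To make those three failure modes concrete, the following is a minimal profiling sketch in Python with pandas. The dataset and column names are entirely hypothetical; it simply shows how each class of problem can be surfaced programmatically.

```python
import pandas as pd

# Hypothetical customer extract -- all column names are illustrative.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, 4],
    "email": ["a@example.com", "b@example.com", "b@example.com", None, "d@example.com"],
    "country": ["UK", "uk", "uk", "United Kingdom", "UK"],
})

# Incomplete records: rows missing a required field.
incomplete = df[df["email"].isna()]

# Duplicate records: repeated business keys (keep=False flags every copy).
duplicates = df[df.duplicated(subset=["customer_id"], keep=False)]

# Inconsistent data: the same real-world value encoded in different ways.
country_variants = df["country"].str.strip().str.upper().value_counts()

print(f"{len(incomplete)} incomplete, {len(duplicates)} duplicate rows")
print(country_variants)
```

Checks like these are cheap to run, which makes the persistence of such issues all the harder to explain.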

No surprise, however, that poor address quality is still an issue. It has been possible to add postcode and address verification to data systems since the mid-1980s in many countries. Today, most web forms make it easy to select an address from a postcode and house number, so why is the data so poor?

That isn’t addressed here. What is called out is the impact this has on using location data effectively. Marketers have long spent large sums of money on having data cleaned for campaigns. But that data is rarely fed back into improving the underlying data sets.
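As an illustration of entry-point validation, here is a minimal sketch in Python. It assumes UK postcodes and uses a deliberately simplified shape check; a production system would validate against an authoritative reference file (such as Royal Mail’s PAF) rather than a regex.

```python
import re

# Simplified UK postcode shape check. A real system would look the value
# up in a reference dataset rather than trust a pattern match.
POSTCODE_RE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$")

def validate_postcode(raw: str) -> str | None:
    """Normalise and shape-check a postcode; None means reject at entry."""
    candidate = " ".join(raw.strip().upper().split())
    return candidate if POSTCODE_RE.match(candidate) else None

print(validate_postcode(" sw1a 1aa "))  # "SW1A 1AA" -- accepted
print(validate_postcode("12345"))       # None -- rejected at the point of entry
```

Rejecting or normalising bad values at the point of entry is far cheaper than repeatedly cleaning them downstream for every campaign.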

It is not the only issue. This section of the report calls out a number of problems around data integration. Sadly, none of these are new either, and again the question has to be asked: why is this still happening? It cannot be solely down to skills.

How to get to trusted data

This section of the report is extremely interesting. Improving data quality is one thing, but it cannot simply mean building better silos of data. Instead, the report authors link the need for better data integration with improved quality. They go on to describe how both of these drive data integrity, and that builds trust in the data.

According to the report, there are three aspects of the data integrity journey:

  1. priorities for improving data integrity
  2. maturity of data integrity practices
  3. challenges to achieving data integrity

While data quality and data integration are key here, other areas also need addressing to create trusted data. Security, privacy, governance, observability and enrichment are just some of them. Nor are these the only obstacles. Users want to consume data in real-time using their platform of choice. They also want proactive insights into data trends and patterns.
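Observability, in particular, is less about fixing individual records than about noticing when a dataset starts behaving abnormally. The following is a minimal sketch in Python of one such proactive check, flagging a daily load whose row count deviates sharply from recent history. The feed and figures are hypothetical.

```python
from statistics import mean, stdev

def row_count_alert(history: list[int], today: int, threshold: float = 3.0) -> bool:
    """Flag today's load if its row count deviates sharply from recent history."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) > threshold * sigma

# Hypothetical daily row counts for a data feed.
recent = [10_120, 10_340, 9_980, 10_210, 10_050]
print(row_count_alert(recent, 10_150))  # False -- within normal variation
print(row_count_alert(recent, 4_200))   # True  -- investigate before users lose trust
```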

The impact of data governance in this area cannot be ignored. According to the report, “60% of organizations surveyed have an ongoing data governance program in place, consistent with the results of the 2021 survey conducted by Precisely and LeBow.

“The good news is that organizations are realizing added value from data governance in the areas that matter most. For the 2023 study, 57% of respondents report seeing improved quality of data analytics and insights, 55% report improved data quality, and 44% say collaboration is easier.”

Enterprise Times: What does this mean?

Organisations are more reliant on data than ever before. As a result, they continue to acquire large amounts of data in the belief that it will unlock business value. The challenge is in making sure that they can use that data, and that is where the perennial problems of data quality and data integration come in.

What this report shows is encouraging. Data quality and integration are improving, as is the use of data for decision-making. However, it also shows there is a lot more to do to improve trust in the data. As organisations increase the amount of automated decision-making, including the use of generative AI, data quality will be critical to the effectiveness of, and trust in, these systems.

Overall, there is still a lot of work to be done, but as this report shows, companies are beginning to get there.
