Trust ’23, Precisely’s annual conference, did not disappoint. There were two keynotes among eight sessions in total. Each keynote was divided into three 20-minute sessions, separated by brief, insightful customer interviews. Though some may have believed the keynote was over at that point, luckily, all sessions are available on demand, so if you missed the later keynote elements, you can go back and watch them.
On the first day, Josh Rogers, CEO of Precisely, started the event with a short keynote introducing some of the sessions. He also spoke about the importance of data to organisations today and reviewed the challenges many firms have faced, now face and will continue to face. He referenced McKinsey & Company: “The world is experiencing a level of disruption and business risk not seen in generations…”
He flagged the growing importance of data integrity to companies noting, “We are seeing increased regulatory and consumer scrutiny on how companies manage personal data. A survey by Prosper Insights and Analytics showed that 64.5% of consumers would like to see legislation that prevents companies from selling their personal data.”
He also flagged the growing need for accurate ESG data, pointing to a session on day 2 of the conference covering the latest trends in ESG reporting. Rogers also shared the company mission statement, “Our mission is to help enterprises achieve data integrity: data that is accurate, consistent, and contextualised, so you can make more confident business decisions based on trusted data.”
Listening through the first day, it became clear that the nuance Precisely emphasises is that context is critical. Data integrity should be a given for organisations, as once data is clean and trusted, organisations can use it to drive decision-making. However, that decision-making is empowered when the data is presented in context, often through data enrichment, such as the enrichment services Precisely has added in recent years, connected by the unique PreciselyID.
Data Integrity Insights and Trends Report
Rogers then introduced the second part of the keynote, a panel discussing the findings of the annual Data Integrity Insights and Trends report, commissioned by Precisely and conducted by Drexel LeBow.
He then shared the top five goals for data programmes, which provided a backdrop for the second session.
- Data-driven decision making
- Operational efficiency
- Cost reduction
- Revenue generation
- Regulatory compliance
The report, which will be published later in the year, provides a wealth of insights into the data programmes of organisations. Kevin Ruane hosted the session, with Josh Rogers joined by Murugan Anandarajan, PhD, Professor of Decision Sciences and MIS and Senior Associate Dean, and Diana Jones, Executive Director, Center of Business Analytics, both of Drexel University’s LeBow College of Business, each sharing insights.
To achieve the top goal of data-driven decision-making, Anandarajan shared that organisations used advanced analytics tools to break down data silos and improve data quality. The research identified three characteristics of firms pursuing this goal:
- A data strategy
- A data governance programme
- Data quality rated as high
There are challenges, though, with data literacy and a shortage of skills prominent, both having a huge impact on data initiatives. Jones added, “48% also say that their data integration projects are impacted by lack of expertise.”
Rogers noted that Precisely can help firms facing this challenge. Its Strategic Services team can help form a data strategy, and its tools are becoming simpler to use. A good example of this was the launch of EngageOne RapidCX, a modern communications management platform that takes automation to a new level.
Trends in Data Management
The research also identified three trends influencing data strategies during the current economic downturn. There is an increased focus on financial reporting and predictive analytics, and increased investment in a range of technologies, including:
- Cloud Adoption
- Advanced Analytics
- Workflow Automation
- Digital Transformation
- Artificial Intelligence and Machine Learning
- DataOps
Anandarajan added, “This was not too surprising because we see a movement towards decision intelligence, where we are talking about the human being supported by technology, called Human+. Where organisations can make more automated decisions or augment their decisions using humans and technology.”
Jones added the third trend noting, “Companies are investing in digital transformation and automation to help them reduce costs and maximise their existing resources.”
Rogers confirmed that Precisely sees these trends in the field with organisations using these technologies to enhance data integrity. Precisely is embedding AI within its products for customers to take advantage of.
Looking even further forward, Anandarajan shared, “I found that more and more companies are looking at ways to enrich the data with third party information because just having your internal data is insufficient in this complex world. One of the top priorities was to enlarge or to increase the data enrichment.”
The Precisely Data Integrity Suite can help overcome the challenge that companies often face in integrating various data formats. Rogers said, “We see a huge demand for the ability for organisations to easily add additional attributes through acquiring data products. If I have 100 million customers and want to add a dozen attributes to that data set, I have to procure those datasets. I have to figure out how do I connect the right level. There’s no universal key to do that. It’s a big kind of thorny problem.
“We are working closely with customers to make that easier and allow our software to not only be able to help do that integration, but our data products to have some of these universal keys that can move these things up.”
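Rogers’ point about universal keys is, at heart, a join problem: enrichment attributes can only be attached to a customer record if both sides share a reliable identifier. The sketch below illustrates the idea of a key-based enrichment merge; the field names and the `precisely_id` key are illustrative assumptions for this example, not Precisely’s actual schema or API.

```python
# Minimal sketch of key-based data enrichment. All field names here,
# including "precisely_id", are hypothetical examples of a shared key.

def enrich(customers, enrichment, key="precisely_id"):
    """Attach extra attributes to each customer record that shares a key
    with an enrichment record; customers without a match pass through."""
    by_key = {rec[key]: rec for rec in enrichment}
    enriched = []
    for customer in customers:
        extra = by_key.get(customer[key], {})
        # Merge enrichment attributes in, keeping the shared key only once.
        merged = {**customer, **{k: v for k, v in extra.items() if k != key}}
        enriched.append(merged)
    return enriched

customers = [{"precisely_id": "P001", "name": "Acme Ltd"}]
addresses = [{"precisely_id": "P001", "flood_risk": "low", "region": "EMEA"}]
print(enrich(customers, addresses))
```

Without a shared key of this kind, each of the “dozen attributes” Rogers mentions would need its own fuzzy matching step, which is exactly the thorny integration problem he describes.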
There is also a change in how organisations look at data. Anandarajan added, “I also found that organisations are moving from being passive observers to more active users now.”
Ruane then asked the panel about the surprises in this second annual report. Anandarajan commented, “I was surprised to see the amount of times location-related data and spatial analytics was mentioned. For example, 76% of the respondents reported their reliance on accurate location data for decision-making. They talked about purchasing tools that helped them to derive insights for spatial analytics and also found that location-related data was strongly associated with the quality of data.”
Rogers expanded on where he saw customers taking the lead, and it is where the notion of context is so important today. He added, “What we see, in the more sophisticated users, is they’re using spatial analytics capabilities to do additional calculations, to add additional attributes to an entity. Whether that’s a product or a consumer. What we see is that context really helps you understand this entity better and make sure that it is correct.
“You have the benefit of being able to compare, that off of an address that has a known location and build these attributes in a very confident way to give you more context and more accuracy. It’s a really powerful tool that we see customers adopting more and more across a wide variety of industries and a wide variety of use cases.”
In the third part of the first-day keynote, Anjan Kundavaram, Chief Product Officer, gave an overview of the Data Integrity Suite and how it solves the challenge of enriching data. Kundavaram shared the six tenets of its vision that organisations would be wise to note.
- Business and IT must partner in the management of their data and its integrity.
- Data should be delivered when business users need it, in real time, and where they need it, whether that’s in the cloud or any other environment.
- Data integrity processes (including data governance, data quality and master data management) must run wherever your data is: on-prem, in Snowflake, in Databricks, anywhere, with no more copies.
- Business and IT users share a single scalable metadata catalogue.
- Data integrity processes, such as identifying data issues, creating recommendations for data quality rules, and managing those rules over time, must be automated with AI and large language models.
- Choose the data integrity services you need when you need them, and have those capabilities interoperate seamlessly.
Emily Washington, SVP of Product Management for the Data Integrity Suite, then shared the latest updates to the Data Integrity Suite. She also demonstrated how simple it is to build data pipelines within it.
Enterprise Times: What does this mean?
There was a lot of content within these keynotes; this first one was a welcome mix of insights and product information. Along with the announcement of EngageOne RapidCX, it is clear in which direction Precisely is heading.
Organisations are collecting huge volumes of data. The first challenge Precisely has already helped to solve is turning that data into trusted data. The problem is that organisations’ data cannot always deliver meaningful insights. Precisely now has the products to enable firms to collate, clean and enrich their data, and then use it to add value to their business.
This is where the additional data services, which Precisely can connect to your data using the PreciselyID, add the context that makes data valuable and provide the richness of insight to drive businesses forward.
There is more to come from the LeBow Research, which is due out soon and should shed further light on the challenges and priorities for data management.