Ahead of its annual conference, Trust ’23, Precisely has announced a series of updates to the Precisely Data Integrity Suite. The significant update includes several new features and services that extend customers’ ability to gather, improve and enrich data for their enterprise. The updates support Precisely’s recent announcements on how it is supporting customers with ESG reporting processes.
The updates include a new data quality service, an integrated data catalogue, new connectors that further extend the data integration service, and enhancements to the Data Observability capabilities within the Suite. Precisely has also extended the availability of its geo-addressing and data enrichment services with a new Geocode API that uses its unique identifier, PreciselyID.
Anjan Kundavaram, CPO at Precisely, commented, “With these latest advancements, the Data Integrity Suite provides a seamless experience like no other, empowering businesses to harness trusted data for critical decision-making. Customers can now effortlessly access the highest quality data, proactively observe it to prevent issues downstream, unlock essential context, and make it available in the environment of their choosing. The data catalogue provides the unique common thread, meaning different services can be easily combined for maximum value.”
Why this matters
Recent research by the Center for Business Analytics at Drexel University’s LeBow College of Business, to be presented at Trust ’23, indicates the importance of data-driven decision-making. Precisely will publish the full 2023 Data Integrity Trends & Insights Report during June 2023. The research identified that organisations are under increasing pressure to base decisions on trusted data. 77% of business leaders responding to the survey see data-driven decision-making as the main priority for 2023.
Having the data is easy; getting it to a point where it is trusted is much harder. With data-driven decision-making a global priority for organisations, the Data Integrity Suite aims to solve this challenge.
The report notes that those data-driven decisions often drive the remaining priorities identified by business leaders:
- operational efficiency – 73%
- cost reduction – 62%
- revenue generation – 59%
- regulatory compliance – 57%
Sanjeev Mohan, Principal and Founder at SanjMo and former Gartner Research VP and industry analyst, explained, “Today’s businesses wrestle with delivering trusted data from hybrid environments to analytic systems using a mix of point solutions. They lose control of their data as it moves from data producer to data consumer, and as a result, data issues aren’t identified until they hit downstream systems. The Data Integrity Suite now brings together Precisely’s longstanding leadership in data quality with data observability and data governance so teams can not only observe, detect, and predict problems, they can also fix them – in the environments where the data lives.”
What is in the update
Kundavaram will deliver a product keynote on May 16th with Emily Washington, SVP of Product Management. They will further demonstrate and explain the new capabilities of the Data Integrity Suite, which supports the end-to-end need for data that is accurate, consistent, and filled with context. The latest enhancements to the Data Integrity Suite include:
Data Quality Service
The new Data Quality Service enables organisations to design data quality pipelines using an intuitive user interface. The service runs in the cloud and validates and enriches data, improving its context before passing it on to operational software, AI and analytics platforms.
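Precisely has not published code-level details of the service, but conceptually such a pipeline chains validation and enrichment steps and routes failing records aside. A minimal, hypothetical sketch in Python (all names and rules are illustrative, not Precisely’s API):

```python
# Hypothetical sketch of a data quality pipeline: validate records, then
# enrich the ones that pass. The rules and field names are illustrative
# only; they do not reflect Precisely's actual Data Quality Service.

def validate(record):
    """Return a list of issues found in a record."""
    issues = []
    if record.get("email", "").count("@") != 1:
        issues.append("invalid email")
    if not record.get("postcode"):
        issues.append("missing postcode")
    return issues

def enrich(record):
    """Add derived context, e.g. a normalised country field."""
    record["country"] = record.get("country", "UK").upper()
    return record

def run_pipeline(records):
    """Split records into cleaned/enriched output and rejects with reasons."""
    clean, rejected = [], []
    for record in records:
        issues = validate(record)
        if issues:
            rejected.append((record, issues))
        else:
            clean.append(enrich(record))
    return clean, rejected

clean, rejected = run_pipeline([
    {"email": "a@example.com", "postcode": "SW1A 1AA", "country": "uk"},
    {"email": "broken", "postcode": ""},
])
```

The key design point the announcement implies is that validation and enrichment happen before data reaches downstream systems, rather than being fixed after the fact.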
Enterprise Times asked Precisely to reveal more details about this new service. The response was, “The Precisely Data Integrity Suite is a set of interoperable services that enable your business to build trust in its data. One of those services is the new Data Quality service that allows users to deliver data that’s accurate, consistent, and fit for purpose across operational and analytical systems. This new service draws key strengths from all three of our market-leading data quality solutions – Precisely Trillium, Precisely Spectrum Quality, and Precisely Data360 – to address new use cases but does not replace them. The Data Quality service is designed for hybrid cloud environments and can be implemented alongside our current products to provide complementary capabilities.”
Integrated data catalogue
The integrated data catalogue automatically captures metadata from any Data Integrity Suite connection, automating what is a very time-consuming manual process in many organisations. As the catalogue is built and extended, users can better search, explore, understand, and collaborate across critical data assets, even when those assets change unexpectedly.
Enterprise Times asked Precisely how the data catalogue capabilities have changed. “The Data Integrity Suite’s Data Catalog is now unified and integrated with all the available Suite services. This means users can see the same assets between Data Integration, Data Quality, Data Governance, and Data Observability through the Data Catalog, which helps users move faster, expands data discoverability, and enhances usability.”
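To illustrate what automatic metadata capture involves, here is a minimal sketch that inspects a connection’s tables and records their columns and row counts in a catalogue structure. It uses Python’s built-in sqlite3 as a stand-in source; this is a generic illustration, not Precisely’s implementation:

```python
# Minimal sketch of automatic metadata capture for a data catalogue:
# inspect every table in a connection and record column names, declared
# types, and row counts. sqlite3 stands in for a real source system.
import sqlite3

def capture_metadata(conn):
    """Return {table: {"columns": {name: type}, "rows": count}}."""
    catalog = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        catalog[table] = {
            # PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk)
            "columns": {c[1]: c[2] for c in cols},
            "rows": conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0],
        }
    return catalog

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'a@example.com')")
catalog = capture_metadata(conn)
```

Run on every Suite connection, this kind of capture is what lets the catalogue present the same assets consistently across the integration, quality, governance, and observability services.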
Enhanced data integration service
Precisely has enhanced the data integration service with hundreds of new connectors. The connectors create pipelines both to and from the Data Integrity Suite, covering a range of cloud repositories, applications, and other data sources. Sources include:
- complex mainframe sources, such as IBM’s IMS and VSAM
- market-leading cloud platforms, such as Azure, Snowflake, Google and AWS
- streaming services, such as Kafka
- applications, such as Salesforce and Oracle NetSuite
Data targets include a similarly long list of major cloud and technology vendors.
The integration service enables real-time data replication, allowing organisations to create resilient, high-performance data pipelines that connect data from critical systems to modern platforms, driving faster innovation and competitive advantage.
Extending the use of PreciselyID
The Data Integrity Suite can now access a range of geo-addressing and data enrichment services. It achieves this with a new cloud-native Geocode API that leverages the unique and persistent PreciselyID. The new Data Quality Service also helps, with Precisely stating, “We continue to offer the Geo Addressing service’s capabilities like geocoding via API. What’s exciting about our latest innovations is our ability to deliver these same best-in-class capabilities through the Data Quality service’s design interface, which visualizes data changes as they happen.”
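Precisely has not published the endpoint or response schema here, but the general shape of such a call is a geocode lookup that returns coordinates plus a persistent identifier. The sketch below is entirely hypothetical: the URL, parameters, and field names are placeholder assumptions, not Precisely’s documented Geocode API.

```python
# Hypothetical illustration of calling a cloud geocoding API that returns
# a persistent identifier alongside coordinates. The endpoint, parameters,
# and response fields are placeholder assumptions, NOT Precisely's API.
import json
from urllib.parse import urlencode

def build_geocode_request(address, api_key):
    """Build the URL for a single-address geocode lookup."""
    base = "https://api.example.com/v1/geocode"   # placeholder endpoint
    query = urlencode({"address": address, "key": api_key})
    return f"{base}?{query}"

def parse_geocode_response(body):
    """Extract coordinates and the persistent ID from a JSON response."""
    data = json.loads(body)
    return {
        "precisely_id": data["preciselyId"],      # assumed field name
        "lat": data["location"]["lat"],
        "lon": data["location"]["lon"],
    }

url = build_geocode_request("10 Downing St, London", "demo-key")
result = parse_geocode_response(
    '{"preciselyId": "P0000123", "location": {"lat": 51.5034, "lon": -0.1276}}'
)
```

The value of a persistent identifier such as PreciselyID is that enrichment data keyed to the same ID can be joined to an address later without re-geocoding it.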
Data Observability enhancements
With this update, it is now possible to observe more data attributes than before, enabling users to detect issues before they impact downstream systems. The new attributes include schema drift, complementing the threshold and completeness checks in data drift alerts. The new Schema Drift rules allow users to monitor changes to tables, columns and column data types. Changes that can trigger alerts include:
- Table is deleted
- Column is deleted
- Table is renamed
- New column added
- Column renamed
- Column datatype changed
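The change types above can be detected by comparing two snapshots of a schema. A minimal, generic sketch (table mapped to column/datatype pairs), which is an illustration of the technique rather than Precisely’s implementation:

```python
# Minimal sketch of schema drift detection: diff two schema snapshots of
# the form {table: {column: datatype}} and report the kinds of changes
# listed above. Generic illustration, not Precisely's implementation.

def detect_schema_drift(old, new):
    """Return alert strings describing differences between snapshots."""
    alerts = []
    for table in old:
        if table not in new:
            alerts.append(f"table deleted: {table}")
            continue
        for col, dtype in old[table].items():
            if col not in new[table]:
                alerts.append(f"column deleted: {table}.{col}")
            elif new[table][col] != dtype:
                alerts.append(f"column datatype changed: {table}.{col}")
        for col in new[table]:
            if col not in old[table]:
                alerts.append(f"new column added: {table}.{col}")
    for table in new:
        if table not in old:
            alerts.append(f"new table added: {table}")
    return alerts

alerts = detect_schema_drift(
    {"orders": {"id": "int", "total": "decimal"}},
    {"orders": {"id": "bigint", "currency": "varchar"}},
)
```

Note that a pure snapshot diff cannot distinguish a rename from a delete plus an add, which is why rename detection typically needs extra metadata or heuristics on top of this comparison.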
Once an issue is addressed, the user can mark an alert as complete, removing it from the alert list. The Data Observability service also supports all major cloud warehouses. Precisely did not name them, but it will ultimately aim to include all those on its connector list.
Enterprise Times: What does this mean
This is another big update for the Data Integrity Suite from Precisely. For many organisations, the emphasis has been on implementing analytics and AI solutions that deliver business value. However, without data integrity, organisations risk creating solutions that are suboptimal and could even worsen a situation. Ultimately, business leaders must be able to trust their data, and solutions like Precisely’s Data Integrity Suite aim to deliver just that. These recent enhancements allow organisations to collate quality data as fast as it is generated across their many sources.