Don't Waste Time with Bad Data.

The key to rapid data integration is clean and correct data.

What is Data Quality?

An integrated toolset to profile and validate data. Clean data shortens the overall time it takes to reach the results you want.


Download Designer Trial to
try out Data Quality

Learn about Data Quality
in a short video

Inspect Your Data

Find out what treatment your data needs

Discover

Insight Into Your Data Quality

Data Quality starts with discovering what state your data is in: getting quick insight into a new dataset and spotting patterns, the distribution of missing fields, incorrect data types, and so on.

This insight helps you decide how to handle the dataset further. Is the data so dirty that it calls for a rigorous workflow with manual correction steps? Or can the occasional error be fixed automatically?

Share

To Build an Effective Workflow

In a multi-user environment, the people you work with and the exchange of knowledge are key to an effective and successful workflow.

With CloverETL, you can perform analysis on data sets and share the results with your co-workers on the CloverETL Server. Everything you do can be shared with others, with fine-grained management of user permissions.

Monitor

Watch & Analyze Trends

Continuous analysis of statistical information keeps you safe from unexpected problems. It also enables you to further analyze and learn from the historical evolution of your data.

The Data Profiler in the Data Quality package for the Server automatically gathers statistical data from your data processes. You can then use this data to further analyze trends or detect anomalies that would otherwise go unnoticed.
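The exact analysis is up to you; as a minimal sketch of anomaly detection on such gathered statistics (plain Python, with illustrative names and thresholds, not Profiler code), the snippet below flags a daily null rate that drifts well outside its recent history:

```python
from statistics import mean, stdev

def flag_anomalies(series, window=7, threshold=3.0):
    """Flag points deviating more than `threshold` standard
    deviations from the mean of the preceding `window` points."""
    flags = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sigma = mean(history), stdev(history)
        # Guard against a perfectly flat history (stdev == 0).
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            flags.append(i)
    return flags

# Daily null rate (%) of a field, as a profiler might record it.
null_rates = [1.2, 1.1, 1.3, 1.0, 1.2, 1.1, 1.3, 1.2, 9.8, 1.1]
print(flag_anomalies(null_rates))  # [8] -> the 9.8% spike
```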

Enable Data Quality

Put your understanding and our tools to work

Establish

Validation Rules

The Validator component is a comprehensive filtering tool that lets you visually define and apply data quality rules to your data.

With just a few clicks, you can design rules that can be easily understood without any coding, removing programming noise to let you focus on the rules themselves. Reusable, predefined custom rules make your work truly efficient.

Apply

Filter Data

Execute rules and get immediate reports; forget parsing cryptic error codes and hardcoded messages.

Validator produces meaningful results and detailed insight out of the box. You can use these results, along with the rejected data, to send records back for repair.

Fix

Data Transformations

Naturally, you want to fix as many problems as possible automatically. With CloverETL, there are plenty of data processing tools at hand.

With CTL (CloverETL Transformation Language) and components like Reformat, Dedup, Address Doctor, and more, you can design powerful post-validation processing that will save you a lot of manual effort.
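Reformat and Dedup are CloverETL components configured in the Designer or in CTL; the Python below is only a rough sketch of the kind of post-validation fixes they perform, normalizing a field and dropping duplicate keys (the field names are hypothetical):

```python
def fix_records(records):
    """Sketch of a post-validation fix pass: normalize an email
    field (Reformat-style) and drop duplicate ids (Dedup-style)."""
    seen = set()
    for rec in records:
        rec = dict(rec)
        # Reformat-style fix: trim and lower-case the email field.
        rec["email"] = rec["email"].strip().lower()
        # Dedup-style fix: keep only the first record per id.
        if rec["id"] in seen:
            continue
        seen.add(rec["id"])
        yield rec

rows = [
    {"id": 1, "email": " Alice@Example.COM "},
    {"id": 1, "email": "alice@example.com"},   # duplicate id, dropped
    {"id": 2, "email": "BOB@example.com"},
]
print(list(fix_records(rows)))
```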


Purchase
Data Quality is an extension available to any commercial edition of CloverETL

Data Quality for Designer

A desktop version that lets you perform data analysis and design and run Validator rules:

- Discover potential data issues (Data Profiler)
- Enforce data quality rules (Validator)
- Fix problems using CloverETL transformations

Data Quality for Server

Adds monitoring, trend analysis, and automatic fix workflow options on top of the desktop offering:

- Share results with your teammates (Reporting Console)
- Automate running and gathering of reject outputs
- Automate & monitor fix workflows
- Gather and monitor trends
- Run distributed Cluster transformations
- Includes a license for all desktop features

Data Profiler

Discover a fast, accurate way to assess the current condition of your data

CloverETL Data Profiler is a data profiling module that provides a fast, accurate way to assess the current condition of your data. This is a vital first step before undertaking any new data integration project and is essential for the business, as it lowers the risk in project implementation and data management.

Discovery

CloverETL Data Profiler is typically used as the first step in data management and integration. It provides a quick analytical view of data and its quality. Once developers or business analysts discover structure, patterns, anomalies, or errors with the data profiling tool, they can better assess the next steps required to improve the quality and consistency of the data. Data profiling also helps estimate the effort and time a data management implementation will take.

Inspection Depth

Various metrics are available: null count, unique value count, statistical metrics, string pattern distribution, convertibility to date or number, and frequency and value distribution charts, to name a few.
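As a rough illustration of a few of these metrics (plain Python, not Profiler code; the sample column is made up), the sketch below computes null count, unique values, string pattern distribution, and date/number convertibility for a single column:

```python
from collections import Counter
from datetime import datetime
import re

def profile_column(values):
    """Compute a handful of the metrics listed above for one column."""
    nulls = sum(1 for v in values if v in (None, ""))
    present = [v for v in values if v not in (None, "")]
    # String pattern distribution: digits -> 9, letters -> A.
    masks = Counter(re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", v))
                    for v in present)
    def convertible(parse):
        return sum(1 for v in present if _try(parse, v))
    return {
        "null_count": nulls,
        "unique_count": len(set(present)),
        "patterns": dict(masks),
        "as_number": convertible(float),
        "as_date": convertible(lambda v: datetime.strptime(v, "%Y-%m-%d")),
    }

def _try(parse, value):
    try:
        parse(value)
        return True
    except ValueError:
        return False

print(profile_column(["2024-01-31", "2024-02-01", "n/a", "", None]))
```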

History & Trends

The history of profiling results is stored in a relational database, which is useful for trending and comparison. The storage structure is open, allowing further use of the profiling results. You can access all the stored information through the Server API to analyze the data and compute trends.
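The real schema is defined by the Data Profiler and not reproduced here; purely to illustrate the open-storage idea, the sketch below queries a hypothetical results table using Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical schema standing in for the profiler's open result store.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE profiling_runs
                (run_date TEXT, field TEXT, null_rate REAL)""")
conn.executemany(
    "INSERT INTO profiling_runs VALUES (?, ?, ?)",
    [("2024-01-01", "email", 1.2), ("2024-01-02", "email", 1.1),
     ("2024-01-03", "email", 9.8)])

# Pull one field's history in run order, ready for trend analysis.
history = conn.execute(
    """SELECT run_date, null_rate FROM profiling_runs
       WHERE field = ? ORDER BY run_date""", ("email",)).fetchall()
print(history)
```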

Sharing

The web-based reporting console is simple and intuitive, providing access to statistical information in tables and charts. Because the view is 100% web-based, it is easy to deploy to users. The reporting console can be accessed either directly from the application or from a standard web browser. Exporting to XML and printing are also available.

Validator

Visual rule-based data filtering.
Replace coding with a visual business rules designer.

An essential aspect of keeping your data in good shape is being able to define data quality rules that can be well documented, built into your data processing workflow, and shared throughout it.

Overview

You can look at Validator as a barrier that makes sure the data entering your workflow meets your data quality standards. You can create reject files and report detailed errors in the data back to their source.

Drag & Drop Visual Rules Design

Validator lets you design business rules in a highly productive, visual way. With just a few clicks, you can apply rules to your data that can be easily understood without any coding, removing programming noise and letting you focus on the rules themselves. Simply drag and drop rules and apply them to a field, or group them together to create complex rule-sets.
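Rules in Validator are built visually rather than in code, but the underlying idea, rules as declarative objects bound to fields and grouped into rule-sets, can be sketched like this (all names are illustrative assumptions, not Validator internals):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str                        # human-readable rule name
    field: str                       # field the rule is applied to
    check: Callable[[object], bool]  # True when the value is valid

# Individual rules, each bound to one field.
non_empty = Rule("non-empty email", "email", lambda v: bool(v))
in_range  = Rule("age 0-130", "age", lambda v: 0 <= v <= 130)

# Grouping rules into a composite rule-set instead of writing code.
customer_rules = [non_empty, in_range]

record = {"email": "alice@example.com", "age": 34}
print(all(r.check(record[r.field]) for r in customer_rules))  # True
```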

Filtering Data

At runtime, Validator filters data based on rules such as nonempty, value range, format, or custom rules (see below). Validator runs every rule on each record and either passes the record onwards or rejects it should any of the business rules fail. Comprehensive built-in error reporting lets you handle rejected records with clear error details, simplifying the fix process. With the workflow capabilities of CloverETL, you can control the whole process, from creating reject files to reporting errors.
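That pass/reject flow can be sketched in a few lines. This is a minimal illustration, assuming hypothetical field names and rules rather than Validator's actual internals:

```python
def validate(records, rules):
    """Run every rule on each record; pass it onwards or reject it
    with readable error details (mimicking the flow described above)."""
    passed, rejected = [], []
    for rec in records:
        errors = [f"{name}: field '{field}' failed"
                  for name, field, check in rules
                  if not check(rec.get(field))]
        (rejected if errors else passed).append((rec, errors))
    return passed, rejected

# Hypothetical rules: nonempty and value range, as mentioned above.
rules = [
    ("nonempty", "email", lambda v: bool(v)),
    ("value range", "age", lambda v: isinstance(v, int) and 0 <= v <= 130),
]
ok, bad = validate([{"email": "", "age": 200}], rules)
print(bad)  # the record plus two human-readable error messages
```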

Custom Rules

As many businesses have very specific needs – and therefore specific business rules applicable to their data – the power of Validator comes with its ability to be extended with custom rules. You can have your development team set up one or more rule libraries, which are implemented as CTL2 functions, to be later used and re-used in Validator definitions. From then on, these custom rules are easy to manage from a central location and can even be shared across all transformations and workflows.
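In CloverETL, such libraries are implemented as CTL2 functions. As a language-neutral sketch of the same pattern (the registry and both rules are hypothetical), a central registry lets every workflow reference the same named predicates:

```python
import re

# A central rule library: register once, reuse everywhere by name.
RULE_LIBRARY = {}

def rule(name):
    """Decorator registering a predicate under a stable rule name."""
    def register(fn):
        RULE_LIBRARY[name] = fn
        return fn
    return register

@rule("valid-sku")
def is_valid_sku(value):
    # Business-specific format, e.g. "ABC-1234" (hypothetical).
    return bool(re.fullmatch(r"[A-Z]{3}-\d{4}", value or ""))

@rule("us-zip")
def is_us_zip(value):
    return bool(re.fullmatch(r"\d{5}(-\d{4})?", value or ""))

# Any workflow can now look rules up by name.
print(RULE_LIBRARY["valid-sku"]("ABC-1234"))  # True
```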

Availability

The Data Quality package is available as an option to both CloverETL Designer and CloverETL Server editions. Please contact us if you have any questions.

Try Data Quality

Download
Start exploring Data Quality features by downloading a fully functional 45-day trial of
CloverETL Designer with Data Quality tools enabled


Data integration software and ETL tools provided by the CloverETL platform offer solutions for data management tasks such as data integration, data migration, and data quality. CloverETL is a vital part of enterprise solutions such as data warehousing, business intelligence (BI), and master data management (MDM). CloverETL's flagship applications are CloverETL Designer and Server. CloverETL Designer is a visual data transformation designer that helps define data flows and transformations in a quick, visual, and intuitive way. CloverETL Server is an enterprise ETL runtime environment. It offers enterprise features such as automation, monitoring, user management, real-time ETL, clustering, and cloud data integration.