Data Validation Screen

Data Validation ensures that captured data has undergone quality control, so the data is both correct and useful. Sequentum Enterprise makes data validation simple through a centralized screen where every field can be configured to ensure quality. Validation is especially important once an agent is deployed and running on a schedule, because the data is no longer checked manually on every run. It ensures that the data is correct, both in format and in content.

Under the Data tab, click Validation Rules to adjust validation settings.

ValidationRules.gif: the Data Validation screen.

Data Types

Sequentum Enterprise supports the following data types for captured data.

 
  • Short Text - All content will be captured as Short Text by default. Short Text content can be up to 4000 characters long.
  • Long Text - Long Text content can be any length, but cannot always be used in comparisons, so you may not be able to include Long Text content in duplicate checks.
  • Integer - A whole number.
  • Float - A floating point number.
  • Date/Time - A date and/or time value.
  • Boolean - A value that can be true or false. Boolean values are stored as 1 or 0 integer values.
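
The built-in data types are handled by Sequentum Enterprise itself; the following Python sketch is only an illustration of the kind of checks these types imply (the coerce helper and SHORT_TEXT_LIMIT constant are hypothetical names, not part of the product).

    from datetime import datetime

    SHORT_TEXT_LIMIT = 4000  # Short Text is limited to 4000 characters

    def coerce(value: str, data_type: str):
        """Illustrative conversion of a captured string to a target data type."""
        if data_type == "Short Text":
            if len(value) > SHORT_TEXT_LIMIT:
                raise ValueError("Short Text exceeds 4000 characters")
            return value
        if data_type == "Long Text":
            return value                          # any length; may be excluded from duplicate checks
        if data_type == "Integer":
            return int(value)                     # whole number
        if data_type == "Float":
            return float(value)                   # floating point number
        if data_type == "Date/Time":
            return datetime.fromisoformat(value)  # date and/or time value
        if data_type == "Boolean":
            return 1 if value.strip().lower() in ("true", "1") else 0  # stored as 1 or 0
        raise ValueError(f"Unknown data type: {data_type}")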

 

Allow Empty

Specifies whether the captured column can be empty or not.

Format Style

Captured data can be formatted using the following styles, illustrated in the sketch after this list:

  • Regular Expressions
  • Date/Time
  • Numeric Value Range
  • JSON
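
As a rough illustration only, the checks behind each style resemble the following Python sketch; the function names and the example format string are assumptions for illustration, not Sequentum's API.

    import json
    import re
    from datetime import datetime

    def matches_regex(value: str, pattern: str) -> bool:
        return re.fullmatch(pattern, value) is not None         # Regular Expressions

    def is_date_time(value: str, fmt: str = "%Y-%m-%d") -> bool:
        try:
            datetime.strptime(value, fmt)                       # Date/Time
            return True
        except ValueError:
            return False

    def in_numeric_range(value: str, low: float, high: float) -> bool:
        try:
            return low <= float(value) <= high                  # Numeric Value Range
        except ValueError:
            return False

    def is_valid_json(value: str) -> bool:
        try:
            json.loads(value)                                   # JSON
            return True
        except ValueError:
            return False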

Format

Formatting acts as a form of validation. It ensures both that the collected data is of the correct type and that it is in the desired format for export.

For example, you could specify a Date/Time field to be formatted as YYYY-MM-DD.
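
To make the example concrete, here is a minimal Python sketch of parsing a captured date and re-emitting it as YYYY-MM-DD; the raw value and format strings are assumptions for illustration, not Sequentum code.

    from datetime import datetime

    raw = "12 Mar 2021"                          # value as captured from the page
    parsed = datetime.strptime(raw, "%d %b %Y")  # fails if the value is not a valid date
    print(parsed.strftime("%Y-%m-%d"))           # exported as 2021-03-12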

Time Zone

If a captured field is defined as Date/Time, a time zone setting determines how the time zone of the collected value is interpreted. The following time zone options are available (compared in the sketch after this list):

  • Assume Local Time
  • Assume Universal Time
  • Adjust to Universal Time
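
One plausible way to read these options, shown as a Python sketch using the standard datetime module (the exact semantics inside Sequentum Enterprise may differ):

    from datetime import datetime, timezone

    naive = datetime(2021, 3, 12, 14, 30)    # captured value with no time zone information

    # Assume Local Time: interpret the value in the machine's local time zone
    local = naive.astimezone()

    # Assume Universal Time: treat the clock time as UTC without changing it
    assumed_utc = naive.replace(tzinfo=timezone.utc)

    # Adjust to Universal Time: interpret as local time, then convert to UTC
    adjusted_utc = naive.astimezone().astimezone(timezone.utc)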

Validation

Data validation can be applied to the agent at different times. By default, the agent is configured to execute data validation at runtime, while the agent is collecting data. This can be changed to Export, so that data validation is applied after the agent has finished running, while the export script executes.

Export and Runtime Validation

There are different validation error handling options that can be applied to the agent.

  • None - Do nothing when there is a data validation error.
  • Remove Row - Removes the row from the export data.
  • Remove Row and Increase Error - Removes the row from the export data and also increases the error count.
  • Trigger Failure - Triggers a failure and the agent run fails.
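
The sketch below shows how these options could behave for a batch of rows; it is an illustration under assumed names (handle_validation_errors, is_valid), not Sequentum's implementation.

    def handle_validation_errors(rows, is_valid, mode="Remove Row"):
        """Illustrative handling of rows that fail validation."""
        kept, error_count = [], 0
        for row in rows:
            if is_valid(row):
                kept.append(row)
            elif mode == "None":
                kept.append(row)          # keep the row despite the error
            elif mode == "Remove Row":
                continue                  # drop the row from the export data
            elif mode == "Remove Row and Increase Error":
                error_count += 1          # drop the row and count the error
            elif mode == "Trigger Failure":
                raise RuntimeError("Data validation failed; the agent run fails")
        return kept, error_count

    rows = [{"price": "19.99"}, {"price": ""}]
    print(handle_validation_errors(rows, lambda row: row["price"] != ""))
    # ([{'price': '19.99'}], 0)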

System

System values are not stored in the internal database.

Key

A key column is used to uniquely identify a data entry. Multiple capture commands can be marked as key columns to combine extracted data from multiple commands into a value that uniquely identifies a data entry.
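
For illustration, a composite key built from two columns marked as Key might look like the following Python sketch; the column names and separator are assumptions, not the product's internal representation.

    # Two capture commands (retailer and sku) marked as Key columns
    row = {"retailer": "ExampleShop", "sku": "A-1001", "price": "19.99"}
    key_columns = ["retailer", "sku"]

    composite_key = "|".join(row[col] for col in key_columns)
    print(composite_key)   # ExampleShop|A-1001 uniquely identifies this data entry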

Export Validation

Export Validation is executed during data export. It lets you handle empty tables, empty rows, and duplicate rows; a sketch of the empty table and row handling follows the option lists below.

Empty table handling:

Specifies what happens when there is an empty table:

  • None
  • Remove
  • Remove Table and Trigger Error
  • Trigger Export Failure

Empty row handling:

Specifies what happens when there is an empty row:

  • None
  • Remove
  • Remove Row and Trigger Error
  • Trigger Export Failure
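
The following Python sketch illustrates how empty row and empty table handling could interact during export; the function and its option strings are assumed names for illustration, not Sequentum's API.

    def handle_empty(table, row_mode="Remove", table_mode="Remove"):
        """Illustrative empty-row and empty-table handling."""
        def is_empty(row):
            return all(value in ("", None) for value in row.values())

        if row_mode == "Remove":                                  # drop empty rows
            table = [row for row in table if not is_empty(row)]
        elif row_mode == "Trigger Export Failure" and any(is_empty(r) for r in table):
            raise RuntimeError("Export failed: empty row found")

        if not table:                                             # table is empty after row handling
            if table_mode == "Trigger Export Failure":
                raise RuntimeError("Export failed: table is empty")
            if table_mode == "Remove":
                return None                                       # drop the table from the export
        return table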

Duplicate row handling:

Specifies what happens when there is a duplicate row (a sketch of both strategies follows this list):
Note: If a Key column is defined, the key values are used to identify duplicate rows; otherwise, the SHA-512 hash of the entire row is used.

  • None
  • Remove (SHA-512)
  • Remove (Key Values)
  • Remove (Key Values Across Sessions)
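
A minimal sketch of the two duplicate detection strategies, assuming a deduplicate helper that is not part of Sequentum's API: when key columns are given, the key values form the fingerprint; otherwise the SHA-512 hash of the whole row does. The "Across Sessions" variant would additionally persist the seen fingerprints between runs.

    import hashlib

    def deduplicate(rows, key_columns=None):
        """Illustrative duplicate row removal."""
        seen, kept = set(), []
        for row in rows:
            if key_columns:                                       # Remove (Key Values)
                fingerprint = tuple(row[col] for col in key_columns)
            else:                                                 # Remove (SHA-512)
                joined = "|".join(f"{k}={v}" for k, v in sorted(row.items()))
                fingerprint = hashlib.sha512(joined.encode("utf-8")).hexdigest()
            if fingerprint not in seen:
                seen.add(fingerprint)
                kept.append(row)
        return kept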