DataGate is a simple yet powerful data processor that brings business model validation to the traditional ETL (Extract, Transform, Load) approach. With DataGate we bring ETVL (Extract, Transform, Validate, Load) into the fold.
DataGate has many uses, from simple to highly complex:
- Extract data from a legacy system and map it to a new system, irrespective of differing technologies and platforms.
- Consume data from a 3rd party system, filtering data in or out according to whether it conforms to predefined requirements.
- Provide scheduled reports on data conformance and quality, from a database, flat file, Excel file, or any other system.
- Process feed data, producing a conformance percentage and a detailed report of missing information or errors.
- Ensure data conformance against a standard such as ISO 15926.
Despite being a small tool, DataGate is extremely configurable and scalable. It can be run as a Windows application or configured to run as part of a batch process or scheduled task, offering the ability to manage many data processing tasks as a cohesive whole.
DataGate utilises the full processing power of a machine to ensure data processing tasks run at optimum speed and efficiency.
Class Library/Business Model
DataGate uses a Class Library/Business Model approach where data can be designed in a real-world format, defining data items, their attributes, relationships, and business rules. This model is used as the driving force in transforming and validating data.
DataGate will load data from any source, be it a legacy or 3rd party system, file, feed, or document management system. Designed with a highly scalable software architecture, DataGate can be tailored to load from new sources as required.
Any number of extract “jobs” can be defined to import data:
- Import from a SQL Server database.
- Import from a generic database using OleDb.
- Import from Excel (XLS, XLSX).
- Import from a delimited file (such as CSV or tab delimited).
- Import file information from a Windows directory.
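As a rough illustration of what an extract job does (this is not DataGate's own configuration or API, and the field names are hypothetical), a delimited-file import can be thought of as reading each record into a field-to-value dictionary keyed by the header row:

```python
import csv
import io

def extract_delimited(stream, delimiter=","):
    """Read a delimited stream into a list of row dictionaries,
    keyed by the field names found on the first (header) line."""
    reader = csv.DictReader(stream, delimiter=delimiter)
    return list(reader)

# Hypothetical asset feed with a header row.
sample = io.StringIO("AssetNo,Description\nA-001,Pump\nA-002,Valve\n")
rows = extract_delimited(sample)
```

Each row dictionary can then be handed to the transform stage, regardless of whether it originated from a CSV file, a spreadsheet, or a database query.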
We often struggle with the intricacies of different systems. A customer database may track data in a vastly different format to what we need, requiring complex mappings to be designed before we can make sense of that data. These complexities multiply tenfold when we need to consume data from multiple sources.
The business model allows data to be easily mapped to a usable format: field names are retrieved dynamically during the load process and combined with an intuitive drag-and-drop system to create data mappings. Transforming data has never been so simple.
DataGate utilises proven practices from the ISO 15926 standard to facilitate cross-communication of data between disparate systems.
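Conceptually (DataGate's mappings are built via drag and drop, not written in code), a field mapping is a source-to-target rename applied to each extracted row. The legacy and business-model field names below are hypothetical:

```python
# Hypothetical mapping from a legacy system's field names
# to the business model's attribute names.
FIELD_MAP = {
    "ASSET_NO": "AssetNumber",
    "DESC": "Description",
    "SUPP_CD": "SupplierCode",
}

def map_row(row, field_map):
    """Rename each source field to its business-model attribute,
    dropping source fields that have no mapping."""
    return {target: row[source]
            for source, target in field_map.items()
            if source in row}

legacy_row = {"ASSET_NO": "A-001", "DESC": "Pump", "LEGACY_FLAG": "Y"}
mapped = map_row(legacy_row, FIELD_MAP)
# LEGACY_FLAG has no mapping and is dropped; the rest are renamed.
```

The same business model can host several such mappings, one per source system, so multiple feeds converge on a single real-world representation.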
DataGate uses an extremely powerful and scalable validation engine that allows business rules to be added to the business model. For example, an asset number must conform to a textual pattern, a numeric value must fall within a pre-defined range, or a supplier code must exist within a given list of suppliers.
The validation stage processes all loaded data and measures conformance with respect to the business rules, reporting this information in a variety of ways: verbose warnings and errors per item as well as overall conformance metrics.
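The three rule types mentioned above (a textual pattern, a numeric range, and membership in a list) can be sketched as follows. This is an illustrative outline, not DataGate's rule syntax, and the field names, patterns, and supplier codes are all assumptions:

```python
import re

SUPPLIERS = {"SUP-01", "SUP-02"}  # hypothetical supplier list

# Each rule: (field, predicate, human-readable error message).
RULES = [
    ("AssetNumber", lambda v: re.fullmatch(r"A-\d{3}", v) is not None,
     "asset number must match the pattern A-nnn"),
    ("Pressure", lambda v: 0.0 <= float(v) <= 100.0,
     "pressure must fall between 0 and 100"),
    ("SupplierCode", lambda v: v in SUPPLIERS,
     "supplier code must exist in the supplier list"),
]

def validate(rows):
    """Check every rule against every row; return per-item errors
    and an overall conformance percentage."""
    errors = []
    for i, row in enumerate(rows):
        for field, check, message in RULES:
            if field in row and not check(row[field]):
                errors.append((i, field, message))
    failing = {e[0] for e in errors}
    passing = len(rows) - len(failing)
    conformance = 100.0 * passing / len(rows) if rows else 100.0
    return errors, conformance

rows = [
    {"AssetNumber": "A-001", "Pressure": "55.0", "SupplierCode": "SUP-01"},
    {"AssetNumber": "BAD", "Pressure": "150", "SupplierCode": "SUP-99"},
]
errors, conformance = validate(rows)
```

Here the second row breaks all three rules, so the run would report three item-level errors and 50% overall conformance.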
The load phase allows the processed data to be loaded to a variety of target systems, such as a database, Excel file, flat file, or a 3rd party system. Valid data and invalid data can be separated and dealt with in a variety of ways.
Any number of load “jobs” can be defined to export data:
- Export to a SQL Server database.
- Export to a generic database using OleDb.
- Export to Excel (XLS, XLSX).
- Export to a delimited file (such as CSV or tab delimited).
- Export to a proprietary format, such as a system-specific XML format.
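The separation of valid and invalid data mentioned above might look like this in outline. This is a minimal sketch, not DataGate's load-job mechanism; the field name and the set of invalid row indices are assumptions:

```python
import csv
import io

def export_split(rows, invalid_indices, valid_out, invalid_out):
    """Write conforming rows to one delimited stream and
    non-conforming rows to another, mirroring a pair of load jobs."""
    groups = (
        (valid_out, [r for i, r in enumerate(rows) if i not in invalid_indices]),
        (invalid_out, [r for i, r in enumerate(rows) if i in invalid_indices]),
    )
    for stream, subset in groups:
        if subset:
            writer = csv.DictWriter(stream, fieldnames=list(subset[0]))
            writer.writeheader()
            writer.writerows(subset)

rows = [{"AssetNumber": "A-001"}, {"AssetNumber": "BAD"}]
valid_out, invalid_out = io.StringIO(), io.StringIO()
export_split(rows, {1}, valid_out, invalid_out)
```

In practice the two targets need not be files at all: valid data could flow to a production database while rejected rows go to an exception report for manual review.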
For more information on eSensible and DataGate please contact us.