Tuesday, 6 May 2014

Objektum Modernization Announces New Legacy Data Bridge Tool

Research has shown that up to half of data migration projects fail to produce the required level of data integrity and robustness in the migrated data source. This is often caused by a lack of understanding of the original data structure as well as a failure to implement a migration strategy that mitigates the risk of human error.

Many of the tools that exist to assist with data migration fail to provide the required level of customization when dealing with legacy data sources, and are unable to accommodate bespoke databases that differ considerably from commercial off-the-shelf data solutions.

Therefore, we are proud to announce the Legacy Data Bridge tool.

Our invaluable tool addresses these issues by delivering an automated solution to data migration and provides a unique synchronization capability that ensures consistency between legacy data sources and their replacement. Adopting our solution provides you with the confidence to switch over to the new system, secure in the knowledge that your legacy system is there as a safety net in case of catastrophe.

The Legacy Data Bridge is a state-of-the-art, model-driven, automated solution for migrating and synchronizing heterogeneous data sources. At its heart is a set of highly configurable synchronization rules that are automatically generated from our modern, model-based design technology. These rules provide an error-free, unambiguous data migration solution that is easy to understand and maintain.

Interfacing to the data sources is achieved through a set of adapters that can be plugged into any data technology. These adapters are developed specifically to meet your data migration needs and can be configured to access data either directly or via the data source's Application Programming Interface (API).
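The pluggable-adapter idea above can be sketched as a small interface that every data technology implements. This is a minimal, hypothetical illustration; the class and method names are assumptions for the sake of the example, not the actual Legacy Data Bridge API.

```python
from abc import ABC, abstractmethod
from typing import Iterable

class DataSourceAdapter(ABC):
    """Hypothetical uniform interface over any data technology,
    whether accessed directly or via the source's own API."""

    @abstractmethod
    def read_records(self) -> Iterable[dict]:
        """Yield records from the underlying data source."""

    @abstractmethod
    def write_record(self, record: dict) -> None:
        """Write a single record to the underlying data source."""

class InMemoryAdapter(DataSourceAdapter):
    """Trivial adapter backed by a list, standing in for a real database."""

    def __init__(self, records=None):
        self.records = list(records or [])

    def read_records(self):
        return iter(self.records)

    def write_record(self, record):
        self.records.append(record)
```

With two adapters in hand, a migration reduces to reading from one and writing to the other, regardless of what technology sits behind each.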

Input data, accessed through the adapters, is then processed by a data source analyser to produce the list of elements to which the rules must be applied. Built on our proven parsing technology, the data source analyser is developed specifically for each customer engagement and provides an agile, easy-to-understand and highly maintainable solution. The rules engine is then executed on the analyser output: the data can be cleansed, duplicates removed, and the result generated into the target data source through the output adapters. Each time the engine applies a rule, the system records an in-depth journal entry of the actions performed, providing both a complete audit history and the ability to undo any action should an error in the rules be discovered after the synchronization process has been applied.
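The journal-and-undo behaviour described above can be sketched as follows. This is a simplified, hypothetical model of the idea: rules are (name, predicate, transform) triples, the journal snapshots each element before a rule changes it, and undo replays those snapshots newest-first. None of these names reflect the product's real internals.

```python
class Journal:
    """Records the prior state of every element a rule modifies."""

    def __init__(self):
        self.entries = []  # list of (rule_name, prior_state)

    def record(self, rule_name, prior_state):
        self.entries.append((rule_name, prior_state))

class RulesEngine:
    """Applies rules to analyser output, journaling every change."""

    def __init__(self, rules):
        self.rules = rules  # list of (name, matches, transform)
        self.journal = Journal()

    def apply(self, elements):
        out = []
        for element in elements:
            for name, matches, transform in self.rules:
                if matches(element):
                    # Snapshot before the rule runs, for audit and undo.
                    self.journal.record(name, dict(element))
                    element = transform(element)
            out.append(element)
        return out

    def undo_all(self):
        """Return the journaled prior states, newest first, and clear the journal."""
        restored = [prior for _, prior in reversed(self.journal.entries)]
        self.journal.entries.clear()
        return restored
```

A real engine would write the journal durably and push the restored states back through the output adapters; the sketch only shows the bookkeeping.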

Rules are developed in a modelling environment in order to abstract the purpose of a rule from its implementation. This approach allows experts in the problem domain to create and change rules, using familiar terminology, without needing to understand the implementation technology. 
Data synchronization rules can be a mixture of both real-time and periodic, depending on the desired synchronization strategy.
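The split between real-time and periodic rules might be modelled as a trigger attribute on each rule, with the engine dispatching one group on change events and the other on a timer. The field names and trigger values below are assumptions for illustration, not Legacy Data Bridge terminology.

```python
from dataclasses import dataclass

@dataclass
class SyncRule:
    """Hypothetical rule descriptor carrying its synchronization trigger."""
    name: str
    trigger: str  # "realtime" (fires on every change) or "periodic" (fires on a schedule)

def rules_for_change_event(rules):
    """Real-time rules run whenever the legacy source changes."""
    return [r for r in rules if r.trigger == "realtime"]

def rules_for_timer_tick(rules):
    """Periodic rules run on a schedule, e.g. a nightly batch."""
    return [r for r in rules if r.trigger == "periodic"]
```

Mixing both kinds in one rule set lets urgent data mirror immediately while bulk reconciliation waits for a quiet window.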

As rules are developed and evolved, the Legacy Data Bridge's sandpit feature allows them to be fully tested in a safe environment prior to deployment on the production system.
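The essence of the sandpit idea is that candidate rules run against a disposable copy of the data, never against production. A minimal sketch of that, with a hypothetical helper name:

```python
import copy

def sandpit_run(records, rule):
    """Apply a candidate rule to a deep copy of the records,
    leaving the originals untouched for comparison."""
    sandbox = copy.deepcopy(records)
    return [rule(r) for r in sandbox]
```

Comparing the sandbox output against expectations before promoting a rule is exactly the safety net the sandpit provides.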

Whatever your data migration needs, our Legacy Data Bridge provides a rapid, risk-free, reliable solution.
The Benefits:

  • Accelerates the data migration activities to deploy your new system sooner
  • Reduces the risk of deploying a new system
  • Provides confidence that your data is cleansed and error free
  • Allows legacy systems to be decommissioned with confidence
  • Removes data constraints on the new system architecture
  • Maintains a comprehensive log of all transactions
  • Ensures consistency between “as is” and “to be” systems through an automated approach
  • Removes the overhead associated with maintaining both the legacy and new system 

To find out more about our new tool, click the links below:

Thanks for reading! 
