With businesses decommissioning their legacy systems to save both time and money, Howard Sherrington, CEO of NSC Group, explores the available options to support continued, easy access to migrated legacy data
Applications will come and go, but it’s the data that really matters. When you embark on an application retirement or data management project, you need to ensure two things. First, that the data migration runs smoothly. Second, and most important, that the environment you develop for accessing the data following the migration is as efficient and easy to use as possible.
This leads to the key question: which is the most effective data access approach to take following a migration from legacy or obsolete systems? With so many different methods to consider – from business intelligence tools and specialised archiving solutions to hand-written SQL – businesses need to carefully weigh up the pros and cons of each approach so they choose a solution that delivers in terms of efficiency, individual productivity after migration and cost-effectiveness.
And from an end-user point of view, the access mechanisms also need to be easy to use and resilient to future changes, to ensure you have secure, widespread and future-proofed access to legacy data.
With these points in mind, let’s take a closer look at some of the options for data access to ensure you follow the right path for data management, migration and retrieval.
Specialised archive solutions
Deploying a specialised archiving solution is one method to consider for data migration. Generally, these products provide archiving for well-known IT systems and extend to other products and to total lifecycle management.
However, archive solution vendors do not tend to offer any means of retrieving the data once it has been migrated. This is because their main intention is to store data in such a way that it can be retrieved by some other SQL-based means – but the choice of method is left to the customer.
Consequently, this is likely to be a drawback for companies looking for a more comprehensive solution – one that not only migrates the data but also makes subsequent access and usage easy.
Business intelligence tools
Business intelligence (BI) tools – such as those from Business Objects or Cognos – are not designed to take a logical view of a database, as an application would. These tools exist to provide historical, current, and predictive views of business operations in order to support business decisions.
BI tools are designed to trawl through vast amounts of data – typically held in a data warehouse – and analyse it to support business decisions. For performance reasons, data intended for this kind of use is sometimes summarised as it is stored on the database, or held in analytical databases built for very fast retrieval.
As such, a BI tool may not present a complete picture of the data at a detailed level – denying your staff the granular access they need when using or analysing the migrated data.
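To make the granularity point concrete, here is a minimal sketch – in Python, using hypothetical invoice data and not tied to any particular BI product – of how summarising data for warehouse storage discards the detail that line-level questions depend on:

```python
# Illustrative sketch only: table and field names are invented.
# Once detail rows are summarised for storage, line-level queries
# can no longer be answered from the summary.

from collections import defaultdict

# Detail rows as they existed in the legacy application
detail = [
    {"invoice": "INV-001", "line": 1, "item": "widget", "amount": 120.0},
    {"invoice": "INV-001", "line": 2, "item": "gasket", "amount": 35.5},
    {"invoice": "INV-002", "line": 1, "item": "widget", "amount": 120.0},
]

# What a warehouse might keep after summarisation: one total per invoice
summary = defaultdict(float)
for row in detail:
    summary[row["invoice"]] += row["amount"]

print(dict(summary))  # {'INV-001': 155.5, 'INV-002': 120.0}

# The summary can still answer "what did INV-001 total?" ...
assert summary["INV-001"] == 155.5
# ... but "which item was on line 2 of INV-001?" only exists in the
# detail rows - the summary has no 'line' or 'item' level at all.
```

The same trade-off applies whatever the summarisation scheme: faster analysis in exchange for a coarser view of the original records.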
A key advantage of analytical databases is that they are designed for rapid retrieval of data by BI tools, which means data does not need to be reduced to summary level to achieve reasonable performance. Some versions may also specialise in compressing the data for storage, or in representing it in non-proprietary XML.
However, these databases do not maintain the logical relationships used by the original legacy application and, like specialised archive solutions, they fail to offer any means of retrieving the data.
Hand-written SQL
The manual SQL approach to data access relies on the user knowing in advance what their required views are. This is because each search has to be created individually, making the creation of every possible view an almost impossible task.
Businesses should also be aware that this approach works on the basis that these pre-defined views will remain unchanged over time – meaning end-users have to re-engage the SQL author to customise the SQL whenever changes are needed.
In this way, this particular approach fails to offer the same level of flexibility as other solutions. And even though hand-written SQL enables businesses to tailor the queries to suit their needs at the time, its reliance on user requirements that are not resilient to future change is ultimately one of its major weaknesses.
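The brittleness described above can be sketched in a few lines. The example below uses Python's built-in sqlite3 module with invented table and view names – it is an illustration of the general pattern, not any particular product – to show that a hand-written view answers exactly one pre-defined question, and any new question means going back for new SQL:

```python
# Sketch with hypothetical data: a pre-defined view serves the question
# the SQL writer anticipated, and nothing else.

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "North", 100.0), (2, "South", 250.0), (3, "North", 75.0)],
)

# The view the SQL writer anticipated: totals per region
con.execute(
    """CREATE VIEW region_totals AS
       SELECT region, SUM(amount) AS total
       FROM orders GROUP BY region"""
)

print(con.execute("SELECT * FROM region_totals ORDER BY region").fetchall())
# [('North', 175.0), ('South', 250.0)]

# A question nobody anticipated - "which orders exceed 200?" - has no
# view. An end-user cannot answer it without fresh hand-written SQL:
print(con.execute("SELECT id FROM orders WHERE amount > 200").fetchall())
# [(2,)]
```

Multiply this by every ad-hoc question a business asks over the lifetime of the archived data, and the maintenance burden of the manual approach becomes clear.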
Weighing up your options
The solutions mentioned so far have some major flaws when it comes to actually accessing the legacy data. A more effective tool is one that takes advantage of the open Web model to run on any hardware and operating system at both the server and client side, delivering customisable data views and queries within a browser-based interface.
This gives even non-technical users uniform access to migrated legacy data through a familiar point-and-click interface – minimising the need for user training and facilitating secure access to data over the Web from any location.
This way, businesses are not left with the task of finding their own method of retrieving and accessing data – a drawback of some of the solutions already discussed – making this a valuable approach for those looking to maintain continued, widespread access to data after the legacy application has been decommissioned.
What’s more, by using a solution that delivers customisable data views, businesses have greater flexibility during the migration process and can tailor the package to suit their changing needs.
Finding the right path to migration can be a long, time-consuming process. But by weighing up your options and choosing a solution that will support easy, flexible but robust data access, you can experience the true perks of application retirement – by ensuring that the legacy data keeps on working for you.
NSC Group is a specialist software developer, based in Manchester, UK. Founded in 1970 to develop and market financial systems on a wide variety of hardware platforms, it now specialises in data migration and access solutions.