Cloud Computing: It’s Not The Destination, It’s The Journey!

A guest blog by Dr. Robin Bloor, Bloor Research

For some companies, especially large ones, the big attraction of the cloud is cost. The simple fact is that data centers are expensive to build and expensive to run. Of course cloud data centers are expensive to run too, but the big cloud vendors got going at a propitious time. They got to choose where to locate their data centers - in cooler climates close to sources of cheap electricity in areas where labor was relatively inexpensive. If you compare such data centers to those that were built before the Internet began to change everything, for example by banks and other big IT users in the financial sector, it’s clear that there are huge cost savings to be had.

So why haven’t the big banks either moved a great deal of their applications to run in the public cloud, or, perhaps better, built their own cloud data centers and gradually migrated applications from expensive data centers in places like New York and Los Angeles to inexpensive data centers near hydroelectric power stations in Washington State?

The Advantage of the Green Field

There is an obvious explanation. It’s about flexibility. If you set up a public-facing cloud infrastructure service, you get to decide what applications run in it. You can structure the service and pricing so that only the convenient and profitable applications run in your cloud. In short, you get to choose the applications.

In reality, large financial sector companies don’t have many such applications that can be easily transported from the data center into the cloud. Many of their applications are running on mainframes or large Unix clusters - not the commodity servers of the cloud. And when you consider the applications that run quite happily on commodity x86 servers, migrating those applications is rarely a simple matter.

Database Intransigence

Moving an application from the data center to the cloud is not necessarily difficult. If it’s a stand-alone application that isn’t running 24x7, it’s simply a matter of copying. You set up a copy of the application environment. You create a database, you take a copy of the data and upload it, and when you’ve done all that, you’re probably good to go. Sure, you’ll need to test the configuration and maybe also application performance, but it’s likely to be fairly painless.
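To make that concrete, here is a minimal sketch of the “copy it across” approach, assuming a PostgreSQL database and the standard pg_dump and pg_restore tools; the host names, user names and database names are placeholders, not real endpoints.

```python
# Minimal sketch of migrating a stand-alone application's database by copying it.
# Assumes PostgreSQL and the standard pg_dump/pg_restore tools; all names are placeholders.
import subprocess

ON_PREM = {"host": "db.datacenter.example", "user": "app", "db": "orders"}
CLOUD = {"host": "db.cloud.example", "user": "app", "db": "orders"}
DUMP_FILE = "orders.dump"

def dump_on_prem():
    """Take a snapshot of the on-premises database in pg_restore's custom format."""
    subprocess.run(
        ["pg_dump", "-Fc",
         "-h", ON_PREM["host"], "-U", ON_PREM["user"],
         "-f", DUMP_FILE, ON_PREM["db"]],
        check=True,
    )

def restore_to_cloud():
    """Load that snapshot into the freshly created cloud database."""
    subprocess.run(
        ["pg_restore", "--clean", "--if-exists",
         "-h", CLOUD["host"], "-U", CLOUD["user"],
         "-d", CLOUD["db"], DUMP_FILE],
        check=True,
    )

if __name__ == "__main__":
    dump_on_prem()
    restore_to_cloud()
    # After this: repoint the application at the cloud database and test it.
```

The same pattern works with any database that supports a consistent backup; the only real variables are how long the dump takes and how much downtime you can tolerate while it happens.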

The situation is distinctly different when the database needs to be running all or most of the time. Such situations have become increasingly common in many areas of the financial sector. This is partly because the financial sector is global. It rarely sleeps and even when it takes a nap, it has to do so with one eye open, because the Internet is always awake.

This situation is exacerbated by the fact that nowadays there are more and more dependencies between applications. Some of these dependencies derive from regulatory requirements. When you are managing risk, you may need to be able to generate snap-shots of the state of multiple applications and data sets at precisely defined times - perhaps even globally. This naturally increases the availability requirement for applications.

Other dependencies derive from the way we do IT. In the past ten years or so we have built systems that share functionality through what is called Service-Oriented Architecture. This software architecture reduces a great deal of duplication, but it does so at the expense of creating application dependencies.

When you have direct dependencies across many applications, if you want to migrate some of them to the cloud, you probably need to move them all. You will want to avoid direct application-to-application calls going between the data center and the cloud. Consequently, groups of applications often have to migrate together. The most difficult part of this migration is moving the databases.

The Data Journey

Moving data takes time. Moving a great deal of data can take considerable time. If the applications are running 24x7, or if the level of dependency between applications is high, you have to be able to switch over from running in the data center to running in the cloud almost instantly. This takes some arranging.

The only strategy that is going to work effectively is one using database replication. In effect, you set up the data center databases to replicate their contents to database instances in the cloud. You can load the cloud databases at leisure from a full database backup and then gradually replicate all updates to the cloud databases as they occur. Once you have the two databases in step, you can move all the applications and users into the cloud. You may even be able to move the applications gradually because you’ve got the data layer in sync from the get-go.
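The sequencing matters more than the tooling, so here is a conceptual sketch of that cutover. The helper functions are hypothetical stand-ins for whatever backup and replication facilities your particular database provides; only the ordering of the steps is the point.

```python
# Conceptual sketch of a replication-based migration and cutover.
# The helpers below are placeholders for real backup/replication/monitoring tooling.
import time

def take_full_backup(db):
    print(f"taking full backup of {db}")
    return f"{db}.bak"

def restore_backup(db, backup):
    print(f"restoring {backup} into {db}")

def start_replication(source, target):
    print(f"replicating updates {source} -> {target}")

def replication_lag(source, target):
    return 0.0  # placeholder: would query the real replication lag

def repoint_applications(to):
    print(f"switching applications over to {to}")

def migrate_with_replication(source_db, cloud_db, max_lag_seconds=1.0):
    # 1. Seed the cloud database at leisure from a full backup.
    restore_backup(cloud_db, take_full_backup(source_db))
    # 2. Stream every subsequent update to the cloud while applications keep running.
    start_replication(source=source_db, target=cloud_db)
    # 3. Wait until the two copies are effectively in step.
    while replication_lag(source_db, cloud_db) > max_lag_seconds:
        time.sleep(5)
    # 4. Cut over, all at once or gradually, now that the data layer is in sync.
    repoint_applications(to=cloud_db)

if __name__ == "__main__":
    migrate_with_replication("dc-orders-db", "cloud-orders-db")
```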

It may not necessarily be plain sailing. Replication almost always places a significant extra load on a database - and thus the applications may slow down during the migration until they are fully operational in the cloud.

Truly Distributed Database

A relatively new database is NuoDB, which offers a truly distributed capability and enables a slightly different approach. A good way to think about its capability is that it replicates between multiple sites, but it does so in a peer-to-peer way. Normally, with database replication, one database is the master and the other the slave. But with this product, transactions can execute on either copy of the database and the two copies are automatically kept in step. A product that can work in this manner - there may be more than one for all I know - would be perfect for migrating databases to (or from) the cloud.

However, the truly neat thing about this capability is that it will also enable an application and its database to permanently straddle the cloud. With such a mode of operation, you could choose to use cloud resources only at peak times. It would provide an unusual level of flexibility in managing infrastructure.
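As an illustration of what that straddling might look like from the application’s side, here is a small hypothetical sketch in which both copies of the database accept work and the application simply routes connections by current load. The connection strings, the threshold and the load metric are all assumptions made for the sake of the example.

```python
# Hypothetical sketch of an application straddling the data center and the cloud,
# spilling work to cloud resources only at peak times. All values are placeholders.
import os

ON_PREM_DSN = "db://db.datacenter.example/orders"
CLOUD_DSN = "db://db.cloud.example/orders"
PEAK_LOAD_THRESHOLD = 0.75  # fraction of on-prem capacity at which work spills to the cloud

def current_load() -> float:
    """Stand-in for a real load metric (CPU, connection count, queue depth...)."""
    return float(os.environ.get("CURRENT_LOAD", "0.5"))

def choose_endpoint() -> str:
    # Both copies accept transactions and are kept in step automatically,
    # so routing becomes a pure policy decision - no data has to move at peak time.
    return CLOUD_DSN if current_load() > PEAK_LOAD_THRESHOLD else ON_PREM_DSN

if __name__ == "__main__":
    print("routing new connections to:", choose_endpoint())
```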

Such distributed capability is likely to become popular if it makes the use of the cloud more flexible, especially in areas like the financial sector and particularly banking, where data center costs are high and there is an obvious need for greater infrastructure flexibility.

NuoDB Note: Thanks, Robin. Readers: we just posted an independent research study on the cloud in Financial Services. You can find it here. Take a look and let us know what you think about the study, Robin’s point-of-view or... your point-of-view.