Staying Afloat in a Sea of Databases
Mar 11 2016
The one thing the cloud offers in spades is scale. With resources available for pennies on the gigabyte, no one should be complaining about a lack of capacity, even when confronted by the gargantuan volumes soon to arrive with Big Data and the Internet of Things (IoT).
This holds true for the database as well. Fortune 500 firms have long had access to resources that allow them to create and manage thousands, even tens of thousands, of databases, which invariably leads to conflicts and disconnects as multiple database management systems (DBMSs) and query languages come into play. Avoiding that fragmentation is one of the chief benefits of cloud-based distributed relational DBMS platforms like NuoDB: the ability to set up databases on the same SQL footing that many legacy data systems employ.
Still, the prospect of so many databases suddenly striving for attention is a major step for many organizations even if they are effectively housed within a single logical construct.
For one thing, says HP Software’s Jeff Veis, the very nature of the data itself will require a different mindset from what most DBAs are used to. If we are talking about the IoT, then it will be primarily sensor-driven data and other machine-to-machine (M2M) communications, which may or may not have to be ingested, processed and distributed in the same way as traditional enterprise data. Much of the processing, in fact, will likely take place at the edge to better capitalize on the real-time opportunities that the IoT presents. How big is this market going to be? According to Gartner, we could see today’s universe of about 5 billion connected devices jump to more than 25 billion by 2020. So in addition to having a rapidly scalable database platform in place, the enterprise will need a range of integrated tools to intelligently collect, manage and process all of this data.
But just as data has a shelf-life, so too does the database. The temptation, of course, is to save everything that comes along, but in a world of 25 billion end points constantly streaming bits to the enterprise this will not be possible. This is why the Database Lifecycle Management (DLM) movement is gathering steam. DLM is a much trickier prospect than product or application lifecycle management, says Red Gate Software’s Ben Rees, because data is the lifeblood of the enterprise. If app code is lost, it can be replaced; if data is lost, processes cease to function. Therefore, DLM requires a number of additional steps for things like version control, continuous integration and release management. If implemented correctly, however, it should give the enterprise a robust mechanism for scaling up the number of databases without choking critical infrastructure with outdated, useless information.
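To make the version-control piece of DLM concrete, here is a minimal sketch of the core idea behind schema versioning: migrations are ordered, immutable steps, the database records which version it is at, and an upgrade applies only the steps it is missing, in order. The names (`MIGRATIONS`, `apply_migrations`) and SQL statements are purely illustrative, not taken from any particular DLM product.

```python
# Illustrative schema-versioning sketch: each migration is an ordered,
# numbered step; upgrading means applying only the missing steps in order.
MIGRATIONS = {
    1: "CREATE TABLE readings (id INT, value FLOAT)",
    2: "ALTER TABLE readings ADD COLUMN recorded_at TIMESTAMP",
    3: "CREATE INDEX idx_recorded_at ON readings (recorded_at)",
}

def apply_migrations(current_version, target_version, migrations):
    """Return the SQL steps needed to move a schema between versions."""
    if target_version < current_version:
        # Downgrades are not symmetric: data-destroying steps need
        # explicit, tested rollback scripts of their own.
        raise ValueError("downgrades need explicit rollback scripts")
    return [migrations[v] for v in range(current_version + 1, target_version + 1)]

# A database already at version 1 only needs steps 2 and 3.
steps = apply_migrations(1, 3, MIGRATIONS)
```

Real DLM tooling layers continuous integration on top of this: every proposed migration is applied to a throwaway copy of the database before release, which is what makes the "data cannot simply be re-deployed like code" problem manageable.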
And it seems like change is afoot for the database itself as the IoT produces increasingly diverse and autonomous data workloads. According to J. Andrew Rogers, CTO of data modeling firm SpaceCurve, databases will be tasked with performing increasingly contextualized processes that reflect the growing geospatial nature of the distributed data environment. This means jobs will have to take into consideration not only the “who”, “what” and “why” of any given data set, but the “when” and “where” as well. This will provide a more accurate view of reality and usher in the kind of real-time processes that can effect profound changes in that reality – think dynamic traffic routing through congested city streets or rapid food distribution to areas that are, or soon will be, hit by drought or famine. To achieve this, however, emerging databases will need to de-emphasize traditional functions surrounding data relationships and queries in favor of high-velocity ingestion, spatial modeling/analytics and real-time execution.
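The “when” and “where” dimensions above can be sketched in a few lines: a sensor reading that carries its own time and location, plus a query that answers a combined space-and-time question. This is a toy illustration of the idea, not NuoDB- or SpaceCurve-specific; all names and coordinates are hypothetical.

```python
# A reading that carries the "who", "what", "when" and "where" together,
# and a filter that asks a combined space-and-time question of it.
from dataclasses import dataclass

@dataclass
class Reading:
    device_id: str    # who
    metric: str       # what
    value: float
    timestamp: float  # when (seconds since epoch)
    lat: float        # where
    lon: float

def within(readings, lat_range, lon_range, t_range):
    """Readings inside a bounding box during a time window."""
    return [r for r in readings
            if lat_range[0] <= r.lat <= lat_range[1]
            and lon_range[0] <= r.lon <= lon_range[1]
            and t_range[0] <= r.timestamp <= t_range[1]]

# Example: which sensors reported inside one city block before t=150?
city_hits = within(
    [Reading("s1", "temp", 21.0, 100.0, 40.75, -73.99),
     Reading("s2", "temp", 19.5, 200.0, 51.50, -0.12)],
    lat_range=(40.0, 41.0), lon_range=(-74.5, -73.0), t_range=(0.0, 150.0))
```

At scale, the linear scan here is exactly what gives way to the spatial indexing and high-velocity ingestion Rogers describes, but the shape of the question stays the same.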
A database platform that scales, then, is only the first step in accommodating Big Data and the IoT. Ultimately, enterprise executives will have to re-examine what they think they know about data sets, relationships and the processes that drive the business model and then leverage that new perspective to identify and capitalize on the hidden opportunities of the emerging digital economy.
Arthur Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet. Follow Art on Twitter @acole602.