By Arthur Cole

For a long while, the cloud was seen as a convenient way to improve traditional data center functionality. Even as individuals and business units began to provision their own resources, the aim was almost always to tap those resources to support familiar productivity applications, with perhaps some limited collaboration and file sharing thrown in.
Lately, however, the real cloud value that forward thinkers have long predicted seems to be coming into focus: the idea that an entirely new enterprise will take shape in the cloud sporting a more dynamic and flexible business model that will put the old way of doing business to shame.
DataStax’s Matt Pfeil calls it the “Internet Enterprise” by virtue of its “data-driven DNA.” In this model, data and data analytics become the top priority, and organizations gain new levels of customer service, market development and product innovation: think Uber, Google and Facebook rather than Ford, Exxon and Boeing. There was a time when organizations hoped to blend the old and the new when it came to data infrastructure, but today the way forward is clear: either become an Internet Enterprise, or lose out to the Internet Enterprise that is invading your territory.
Big Data is a key part of this transition, and cloud providers are hard at work looking for ways to bring it to enterprise customers. Google recently added the Dataflow and Pub/Sub services to its enterprise line-up, providing a dynamic framework on which to build scalable Big Data applications using familiar programming languages and SDKs. Dataflow is designed to provide a single, fault-tolerant environment for both batch and stream processing, while Pub/Sub can be used to integrate apps and services under a single API that enables queueing, notification, logging and other functions. In this way, the enterprise gains full Big Data collection and analytics capabilities at lower cost and with greater processing speed than is generally available with on-premises infrastructure.
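The integration pattern behind a service like Pub/Sub is straightforward: producers publish messages to a named topic, and the broker fans each message out to every subscriber, so apps and services stay decoupled behind one API. The toy broker below is a minimal in-memory sketch of that pattern for illustration only; the class and method names are hypothetical and are not the actual Google Cloud Pub/Sub client API.

```python
from collections import defaultdict, deque

class MiniPubSub:
    """Toy in-memory publish/subscribe broker (hypothetical names;
    not the real google-cloud-pubsub client library)."""

    def __init__(self):
        # Each topic maps to the list of subscriber queues listening on it.
        self._subscribers = defaultdict(list)

    def subscribe(self, topic):
        """Register a subscriber on a topic; returns its message queue."""
        queue = deque()
        self._subscribers[topic].append(queue)
        return queue

    def publish(self, topic, message):
        """Fan the message out to every queue subscribed to the topic."""
        for queue in self._subscribers[topic]:
            queue.append(message)

# Usage: a logging service and a notification service both listen on
# the same "orders" topic without knowing about each other.
broker = MiniPubSub()
log_q = broker.subscribe("orders")
notify_q = broker.subscribe("orders")
broker.publish("orders", {"order_id": 42, "status": "shipped"})
```

Because publishers only know the topic name, new consumers (a dashboard, an auditing job) can be added later without touching any producer code, which is the decoupling the single-API claim above is about.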
Indeed, with Big Data emerging as the preeminent business tool for the coming decade, even the largest of enterprises will have no choice but to utilize the cloud in order to gain meaningful insight into their data, says CIO.com’s Mike Lamble. Already, Big Data tools like Hadoop and MapReduce are widely deployed in the cloud, and cloud providers are way ahead of the enterprise in terms of visualization and dashboarding tools, not to mention parallel statistical processing and automated data integration. Even for companies that have established warehousing and other systems on-premises, public cloud resources will be crucial for initiatives like machine learning, massively parallel processing (MPP) and one-time Big Data projects.
The key challenge moving forward will be to build a unified, connected cloud environment to avoid recreating the same data silos that exist in the data center, says ITProPortal’s Jacob Martin. This will require careful coordination across multiple business units and users, which is probably best handled by IT, and broad recognition that faster is not better if a particular set of resources functions in isolation. In addition, cloud data connectors that support high-speed analytics, backup and replication should work across all standard database formats the enterprise uses, and the entire cloud environment should possess the intelligence to handle multi-database conversion and other challenges that tend to arise in daily production environments.
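One common way to keep connectors working across database formats is to put a single interface in front of them, so that analytics, backup and replication code is written once and each format only supplies an adapter. The sketch below illustrates that idea with made-up connector classes; the names and formats here are hypothetical, not any vendor's actual connector API.

```python
from abc import ABC, abstractmethod

class DataConnector(ABC):
    """Common contract every format-specific connector must satisfy."""

    @abstractmethod
    def read_rows(self):
        """Yield rows as plain dicts, whatever the source format."""

class CSVConnector(DataConnector):
    """Adapter for comma-separated text (header line, then data lines)."""
    def __init__(self, lines):
        self._lines = lines

    def read_rows(self):
        header = self._lines[0].split(",")
        for line in self._lines[1:]:
            yield dict(zip(header, line.split(",")))

class KeyValueConnector(DataConnector):
    """Adapter for a simple key-value store held as a dict."""
    def __init__(self, store):
        self._store = store

    def read_rows(self):
        for key, value in self._store.items():
            yield {"key": key, "value": value}

def replicate(source: DataConnector):
    """Replication logic written once against the shared interface;
    it never needs to know which database format it is reading."""
    return list(source.read_rows())

csv_rows = replicate(CSVConnector(["id,name", "1,alice"]))
kv_rows = replicate(KeyValueConnector({"region": "us-east"}))
```

Because `replicate` depends only on the abstract `DataConnector`, adding support for a new database format means writing one adapter class, not modifying every analytics or backup pipeline, which is how a connector layer avoids recreating per-format silos.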
It may be tempting to view these developments as the beginning of a new cloud, and in fact, references to Cloud 2.0 are on the increase. But the reality is that this is the same cloud that we’ve known and loved for the past five years or more—it’s just that applications and use cases are starting to evolve in new and interesting ways.
Rather than witnessing the birth of the new cloud, what we are seeing is the birth of a new enterprise.