Big data analytics is driving rapid growth of public cloud computing. Why? It solves real problems, delivers real value, and is pretty easy to implement on public clouds.
Don't take my word for it. Revenues for the top 50 public cloud providers shot up 47 percent in the fourth quarter of 2013, to $6.2 billion, according to Technology Business Research.
TBR's latest figures reveal the extent to which public cloud providers are using big data to drive their own operations, win new customers, and expand features and functions. Although public cloud customers want storage and compute services, many organizations implementing big data systems these days are finding that the public cloud is also the best and most cost-effective platform for big data.
Public cloud providers, such as Amazon Web Services, Google, and Microsoft, offer their own brands of big data systems in their clouds, whether NoSQL or SQL, that can be had by the drink. This contrasts with DIY big data, which means allocating huge portions of your data centers to the task and, in some cases, spending millions of dollars on database software.
Big data is driving public cloud adoption for fairly obvious reasons:
- The cost of purchasing big data resources on demand in the cloud is a fraction of the cost of buying and maintaining those resources yourself.
- Cloud-to-cloud and cloud-to-enterprise data integration has gotten much better in the last few years, so it's easy to set up massive databases in the cloud and sync them with any number of operational databases, whether cloud-based or on-premises.
- In most cases, public clouds provide better performance and scalability for big data systems because they offer autoscaling and autoprovisioning.
So, is big data plus cloud a match made in heaven? There are always issues with new technologies, but in this case the bumps in the road have been slight. I suspect that big data will continue to drive more public cloud usage in the future.