The battle has been raging for a decade between on-premises data centers and the cloud, but it's now clear that the cloud has won. Sure, there'll be on-premises systems in use for years, just as there are mainframes and COBOL apps in use today. But the on-demand cloud model is becoming the norm.
But perhaps not as you'd expect.
The cloud battle is usually portrayed as a choice between two models. On one side are highly customized, highly managed, purpose-built internal systems that take an army to manage but deliver very specific competitive value. On the other are more generic cloud services that take fewer resources to maintain and secure, but at the (perhaps small) price of standardized processes and options that force companies to find nontechnical ways to differentiate themselves.
That "differentiation through technology" notion was an article of faith in the last decade, and it's why every company customized its ERP system to the point of collapse and spent buckets of money investing in technology innovation across operational and analytical systems.
Moving to the cloud involved more than a loss of control over systems; it meant giving up technology as a secret weapon. It turns out that "secret weapon" thinking was simplistic, and the common result was a mess of systems whose complexity undercut their purported business advantage.
But the age of generic technology also hasn't happened, even if the cloud has allowed companies to take advantage of less-complex systems, creating space for meaningful innovation. A great example of that shift occurred at Tribune Media, which got the rare chance to do a greenfield technology deployment. It ended up with a mixture of cloud technologies (such as SaaS), cloudlike technologies (in the data center), and a small portion of traditional technologies.
What it shows is an example of the postcloud world now emerging. Vanilla shouldn't be the only flavor: the cloud's "sky's the limit" on-demand pricing model, along with the issues of bandwidth and latency, creates new costs that are often as unreasonable as the old approach of going super custom and thus super complex -- for infrastructure, applications, and data alike.
What instead is happening is the triumph of a few key notions whose deployment depends on the right combination of price, complexity, responsiveness, flexibility, and strategic advantage. Let me state it this way: How you deploy is a tactic, not a goal. That's the biggest lesson to learn in the post-cloud world.
What are those notions?
Design for change: Whether it's a browser or an analytics engine, know that business needs and the context around information will change, so don't tie yourself to brands, formats, tools, or providers when you don't have to. Containers like Docker are the latest instantiation of that notion, and they won't be the last.
A modern cautionary tale is the concept of big data, which originally meant preserving data context rather than transforming it ETL-style beyond recognition and limiting its future utility. Unfortunately, big data today is becoming more and more like cloud-powered data warehousing, confusing the small advantage of on-demand processing with the intrinsic advantage of preserving context for multiple possible uses on demand. That's not cloud, but 1990s IT.
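The contrast between ETL-style transformation and context preservation can be sketched in a few lines. This is a minimal illustration with hypothetical event data, not any particular product's pipeline: the ETL path aggregates once and throws the detail away, while the context-preserving path keeps raw events so new questions can be answered later.

```python
# Hypothetical raw event records -- the "context" big data was meant to preserve.
raw_events = [
    {"ts": "2017-03-01T10:00", "user": "a", "action": "view", "sku": "x1"},
    {"ts": "2017-03-01T10:05", "user": "a", "action": "buy",  "sku": "x1"},
    {"ts": "2017-03-02T09:30", "user": "b", "action": "view", "sku": "x2"},
]

# ETL-style: aggregate up front, discard the detail. A question you didn't
# anticipate (say, views per user) can no longer be answered from this table.
daily_buys = {}
for e in raw_events:
    if e["action"] == "buy":
        day = e["ts"][:10]
        daily_buys[day] = daily_buys.get(day, 0) + 1

# Context-preserving: keep the raw events and derive each view on demand.
def views_per_user(events):
    counts = {}
    for e in events:
        if e["action"] == "view":
            counts[e["user"]] = counts.get(e["user"], 0) + 1
    return counts
```

The point is not the code but the trade-off: the first approach buys a small processing saving at the cost of every future use; the second keeps all future uses open.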