Cloud Computing Is Far More Than Just Cutting Enterprise IT Costs
Cloud Computing is so much more than a computer in the Cloud
Nov. 13, 2008 09:15 AM
First, though, there is a growing recognition that today’s market leaders will inevitably need to become more interoperable if this business segment - and they - are to grow. The proprietary nature of their offerings today may allow them to innovate ahead of the standards process (which will be shaped in large part by the lessons they learn), and the relatively high cost of switching to a competitor may give each the critical mass upon which to invest and grow. But these are clearly the characteristics of a nascent market; computing’s new Wild West. As so often before, standardisation, true competition, mainstream adoption and commoditisation will all follow as we move toward phases 2 and 3 of Gartner analyst Thomas Bittman’s intriguing ‘evolution of the Cloud Computing market.’ Similarly, Erica Naone offers a useful overview of Cloud Computing’s open source component in Technology Review this month. None of the projects she covers is a significant challenge to Amazon’s EC2, Microsoft’s Azure, Salesforce’s force.com or Google’s App Engine… yet. But together they help to keep these commercial entrants honest, and remind all of us that switching costs can be brought very low indeed if the pain of the status quo becomes too great.
Writing ‘Welcome to the Data Cloud?’ for ZDNet last month, I began to explore the important role that data could and should play in the Cloud;
“Just as ‘we’ used to duplicate and under-utilise computational resources, so we do something very similar with our data. We expensively enter and re-enter the same facts, over and over again. We over-engineer data capture forms and schemas, making collection exorbitantly expensive, whilst often appearing to do all we can to limit opportunities for re-use. Under the all-too-easy banners of ‘security’ and ‘privacy’ we secure individual data stores and fail to exploit connections with other sources, whether inside or outside the enterprise.
In a small way, the efforts of the Linked Data Project’s enthusiasts have demonstrated how different things should be. The cloud of contributing data sets grows from month to month, and the number of double-headed arrows denoting a two-way linkage is on the rise. Even the one-way relationships that currently dominate the diagram are a marked improvement on ‘business as usual’ elsewhere on the data Web; even in these cases, data from a third party is being re-used (by means of a link across the web) rather than replicated or re-invented. Costs fall. Opportunities open up. Both resources, potentially, improve. The strands of the web grow stronger.”
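The contrast the quote draws between replicating a third party’s data and linking to it can be sketched in a few lines of Python. This is purely illustrative: the URI, dataset and record names are invented, and a real Linked Data client would dereference the URI over HTTP and parse RDF rather than read from a local dictionary.

```python
# Illustrative sketch: replicating a third-party fact vs. linking to it.
# All names and URIs here are invented for the example.

# Stand-in for a third party's dataset, keyed by URI. In practice this
# would be an HTTP-dereferenceable resource on the data Web.
third_party = {
    "http://example.org/company/42": {"name": "Acme Ltd", "city": "London"},
}

# Replication: we copy the fact into our own store at capture time.
replicated_record = {"supplier_name": "Acme Ltd"}

# Linking: we store only the URI and resolve it on demand.
linked_record = {"supplier": "http://example.org/company/42"}

def resolve(uri):
    """Dereference a URI against the third-party store (stub for an HTTP GET)."""
    return third_party[uri]

# The upstream source later corrects its data...
third_party["http://example.org/company/42"]["name"] = "Acme Holdings Ltd"

# ...the linked record sees the correction automatically;
# the replicated copy has silently gone stale.
print(resolve(linked_record["supplier"])["name"])  # Acme Holdings Ltd
print(replicated_record["supplier_name"])          # Acme Ltd
```

The point of the sketch is the quote’s economic one: the linked record costs nothing to keep current, while every replicated copy must be found and re-entered when the source changes.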
It is here, in the use and reuse of data, that the potential of the Cloud will be realised. Back in the previously cited conversation between Nick Carr and Tim O’Reilly, O’Reilly himself came very close to saying so;
“In short, Google is the ultimate network effects machine. ‘Harnessing collective intelligence’ isn’t a different idea from network effects, as Nick argues. It is in fact the science of network effects - understanding and applying the implications of networks.
I want to emphasize one more point: the heart of my argument about Web 2.0 is that the network effects that matter today are network effects in data. My thought process (outlined in The Open Source Paradigm Shift and then What is Web 2.0?) went something like this:
- The consequence of IBM’s design of a personal computer made out of commodity, off-the-shelf parts was to drive attractive margins out of hardware and into software, via Clayton Christensen’s ‘law of conservation of attractive profits.’ Hardware became a low margin business; software became a very high margin business.
- Open source software and the standardized protocols of the Internet are doing the same thing to software. Margins will go down in software, but per the law of conservation of attractive profits, this means that they will go up somewhere else. Where?
- The next layer of attractive profits will accrue to companies that build data-backed applications in which the data gets better the more people use the system. This is what I’ve called Web 2.0.
It’s network effects (perhaps more simply described as virtuous circles) in data that ultimately matter, not network effects per se.”
Talis CTO Ian Davis would appear to agree, commenting;
“People need to be investing in their data as the long term carrier of value, not the applications around them… the data is more likely to persist than the software so it’s important to get the data right and take care of it.”
Salesforce CEO Marc Benioff, too, used his Dreamforce User Conference this month to move a company long associated with the ‘data centre extending’ Cloud firmly in the direction of embracing data and the network. As Krishnan Subramanian noted on Cloud Ave before the keynote,
“Till now, the Force.com platform served business users to develop apps that can be used internally within an organization. They have to tap into Force.com APIs from outside platforms to offer customer facing web apps. With the new initiative, it becomes easy for customers to allow the internet users to “interact” with their data.”