Industry News Desk
Cloud Computing vs. Grid Computing - What's the Difference?
Cloud computing really is about lots of small allocation requests
Aug. 21, 2008 05:30 AM
Cloud computing really is about lots of small allocation requests. Amazon EC2 accounts are limited to 20 servers each by default, and many, many users each allocate up to 20 servers out of a pool of many thousands of servers at Amazon. Allocations happen in real time; in fact, there is no provision for queueing a request until someone else releases resources. This is a completely different resource-allocation paradigm and a completely different usage pattern, and it results in a completely different way of using compute resources.
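The allocation model described above can be sketched in a few lines. This is a hypothetical illustration, not Amazon's actual API: the pool name, the `allocate`/`release` methods, and the account names are all invented for the example. The key property is that a request is either granted immediately or refused outright, with no queue waiting for released capacity.

```python
class CloudPool:
    """Hypothetical sketch of real-time, cap-limited server allocation."""

    def __init__(self, capacity, per_account_limit=20):
        self.free = capacity                  # servers currently available
        self.per_account_limit = per_account_limit
        self.held = {}                        # account -> servers held

    def allocate(self, account, count):
        """Grant `count` servers right now, or refuse -- never queue."""
        held = self.held.get(account, 0)
        if held + count > self.per_account_limit:
            return False                      # default 20-server account cap
        if count > self.free:
            return False                      # real-time refusal, not a wait
        self.free -= count
        self.held[account] = held + count
        return True

    def release(self, account, count):
        """Return servers to the pool for other real-time requests."""
        count = min(count, self.held.get(account, 0))
        self.held[account] -= count
        self.free += count


pool = CloudPool(capacity=1000)
pool.allocate("alice", 20)    # a small request out of a large pool: granted
pool.allocate("alice", 1)     # over the per-account cap: refused immediately
```

A grid scheduler would instead enqueue the second request until capacity freed up; here the caller learns instantly and must adapt, which is exactly the behavioral difference the article is pointing at.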
I always come back to this distinction between cloud and grid computing when people talk about “in-house clouds.” It’s easy to say “ah, we’ll just run some cloud management software on a bunch of machines,” but it’s a completely different matter to uphold the premise of real-time resource availability. If you fail to provide resources when they are needed, the whole paradigm falls apart and users will start hoarding servers, allocating for peak usage instead of current usage, and so forth.