Do companies really know the total cost of local compute and storage? This article in USA Today (hardly a tech mag) suggests that cloud is only lower cost in the short term, or in capital cost: essentially renting vs. buying. Cloud computing isn't always cheaper, but there are plenty of cases where it clearly is. I get the AWS model: when your demand is spiky, you can closely match your compute spend to your actual demand. What the article doesn't cover is the TOTAL cost of local compute. It's NOT simply buying vs. renting. Buying ties you down to a lot of long-term costs (think power, cooling, floor space, and the staff to run it all).
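A quick back-of-the-envelope sketch shows why spiky demand favors renting. All the rates and the spike duration below are assumed, purely illustrative numbers, not real AWS or data-center pricing:

```python
# Hypothetical, illustrative rates -- NOT real cloud or data-center pricing.
on_demand_rate = 0.50      # $/server-hour rented (assumed)
owned_hourly_cost = 0.15   # $/server-hour owned: amortized purchase, power, admin (assumed)
peak_servers = 100         # capacity needed during the spike
baseline_servers = 10      # capacity needed the rest of the year
hours_per_year = 24 * 365

# Buying: you must provision for the peak and carry that cost all year.
buy_cost = peak_servers * owned_hourly_cost * hours_per_year

# Renting: pay for the peak only during the spike (assume 5% of the year).
spike_hours = int(hours_per_year * 0.05)
rent_cost = (baseline_servers * on_demand_rate * hours_per_year
             + (peak_servers - baseline_servers) * on_demand_rate * spike_hours)

print(f"buy: ${buy_cost:,.0f}/yr  rent: ${rent_cost:,.0f}/yr")
```

With these made-up numbers, renting wins even at a per-hour rate more than three times the owned cost, because you stop paying for 90 idle servers 95% of the year. Flatten the demand curve and the math flips, which is exactly the article's renting-vs.-buying point, minus the hidden local costs.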
Wednesday, February 13, 2013
Saturday, February 9, 2013
To truly take advantage of cloud computing's scalable architecture, developers need to rethink application design from the ground up. To leverage the lower cost of on-demand scalability in cloud IaaS (scaling only when needed), an application has to be designed to run across multiple systems and to assume that multiple instances of any given function are doing the same work at the same time. Requiring that only a single instance of a class or function be running at a time eliminates scalability and introduces a single point of failure.
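The idea above can be sketched with a toy stateless worker. This is a minimal illustration of the design principle, not any particular framework's API: any number of identical workers pull from a shared queue, so the result is the same whether one copy or many are running, and no single instance is special:

```python
import queue
import threading

def handle(task):
    """A stateless unit of work (hypothetical): all state arrives in the arguments."""
    return task * task

def worker(tasks, results):
    # Every worker is interchangeable; losing one is not a single point of failure.
    while True:
        try:
            task = tasks.get_nowait()
        except queue.Empty:
            return
        results.put(handle(task))

def run(num_workers, items):
    tasks, results = queue.Queue(), queue.Queue()
    for item in items:
        tasks.put(item)
    threads = [threading.Thread(target=worker, args=(tasks, results))
               for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(results.get() for _ in range(results.qsize()))

# Same answer with one worker or four -- scale out by adding instances.
print(run(1, [1, 2, 3]))  # [1, 4, 9]
print(run(4, [1, 2, 3]))  # [1, 4, 9]
```

The moment `handle` depended on in-process state, or the design required exactly one `worker` to exist, you'd be back to a singleton: unscalable and fragile.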
Posted by Chris Claborne at 3:50 PM