Wednesday, February 13, 2013

Total Cost of Local Compute

Do companies really know the total cost of local compute and storage?  This article in USA Today (hardly a tech mag) suggests that cloud is only lower cost in the short term — a capital-cost argument, renting vs. buying.  Cloud computing isn't always lower cost, but there are a lot of instances where it clearly is.  I get the AWS model, where you can closely match your compute spend to your demand when that demand is spiky.  What the article doesn't cover is the TOTAL cost of local compute.  It's NOT simply buying vs. renting; buying ties you down to a lot of long-term costs.


When I go back and look at the advantages of cloud computing, there are a lot of costs you're no longer responsible for long term.  It's real money and real headache, but I don't think many companies actually know their true cost per compute hour and per storage hour.

I'm not so blind as to think you can lower costs dollar for dollar by going to cloud computing.  If you already have a data center, infrastructure, and staff, there's some elasticity in that capability that you can pull out without triggering a complete step function in cost — and going to the cloud still requires people with specialized IT skills to deliver it.

Get real: what's your true cost per compute and storage hour?  Being a cloud architect means you need to know technology, fit, total cost of ownership, ROI, and ROA (return on agility).
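To put a rough number on that question, here's a minimal sketch of the per-hour math.  All the dollar figures, the function name, and the inputs are made-up illustrations — plug in your own data center's numbers:

```python
def cost_per_compute_hour(hardware_capex, years_of_service,
                          annual_opex, servers, utilization):
    """Rough total cost per *useful* compute hour across a fleet.

    hardware_capex   -- servers, storage, network gear ($)
    years_of_service -- straight-line depreciation period
    annual_opex      -- power, cooling, space, licenses, staff ($/yr)
    servers          -- number of servers in the fleet
    utilization      -- average fraction of capacity actually used
    """
    annual_capex = hardware_capex / years_of_service
    total_annual_cost = annual_capex + annual_opex
    useful_hours = servers * 365 * 24 * utilization
    return total_annual_cost / useful_hours

# Hypothetical example: $500k of hardware over 4 years, $300k/yr
# to keep the lights on, 50 servers averaging 30% utilization.
rate = cost_per_compute_hour(500_000, 4, 300_000, 50, 0.30)
print(f"${rate:.2f} per useful compute hour")  # prints "$3.23 per useful compute hour"
```

Notice how much the utilization term matters: the same fleet at 60% utilization costs half as much per useful hour, which is exactly the lever the AWS pay-for-what-you-use model pulls on.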

- Chris Claborne
