It's an article of faith that with cloud computing comes lower energy consumption. As an adjunct of virtualization, the cloud should be able to extend consolidation efforts beyond the data center walls to produce an even more efficient IT environment than would otherwise be possible.
But is this view necessarily correct? And is there any real way to prove it one way or another?
It looks like I'm not the only one thinking along these lines, although the issue apparently has not gained much traction in the United States yet. In Canada, Reuven Cohen, founder and CTO at cloud systems vendor Enomaly, wonders if there will ever be a uniform way to measure cloud efficiency. Gathering the correct data is only part of the problem. There's also the small matter of accounting for all of the factors needed to compare cloud services vs. brick-and-mortar data center infrastructure. Do you take into account the energy impact of actually constructing the data center and manufacturing the physical resources? And is it possible that with unlimited resources at their disposal, IT workers would be less likely to restrain their use of those resources, creating an even greater carbon footprint in the end?
Part of the problem is that energy management metrics would have to come from the large cloud providers, like Amazon, according to Australian IT consultant Tom Worthington, and who's to say they would be accurate? Any number of variables could come into play, such as how fully occupied each individual processor is and what sort of power generation is being used for particular loads. There's a big difference between oil and gas and, say, solar or hydro power.
At the very least, the cloud offers the potential to be much greener than static IT infrastructure, according to Channel Insider's Chris Gonsalves. In theory, data loads could be shifted across continents to make more efficient use of resources, and consolidating the needs of multiple users onto fewer devices can only produce a net benefit. But again, there doesn't seem to be any way to determine if this will, in fact, happen once cloud use kicks into high gear.
Some start-ups are hoping to take a stab at this. Elastra recently expanded its Plan Composer Architecture platform to help enterprises seek out the best Power Usage Effectiveness (PUE) in the data center and on the cloud. The system uses a variety of markup languages to manage both applications and infrastructure, providing a framework to dynamically track and direct workloads to the most efficient resource sets available at any given time.
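PUE itself is a simple ratio: total facility power divided by the power actually consumed by IT equipment, with a perfect score of 1.0 meaning every watt goes to computing. The kind of comparison a tool like Elastra's would automate can be sketched in a few lines. The site names and wattage figures below are hypothetical, for illustration only, and bear no relation to any real provider's data:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT equipment power.

    1.0 is the theoretical ideal; typical data centers run well above it
    because of cooling, lighting and power-distribution overhead.
    """
    return total_facility_kw / it_equipment_kw

# Hypothetical candidate resource sets and their measured power draw.
sites = {
    "on_premise_dc": pue(total_facility_kw=500.0, it_equipment_kw=250.0),
    "cloud_region_a": pue(total_facility_kw=360.0, it_equipment_kw=300.0),
    "cloud_region_b": pue(total_facility_kw=450.0, it_equipment_kw=300.0),
}

# Direct the workload to the site with the lowest (best) PUE.
best_site = min(sites, key=sites.get)
print(best_site, round(sites[best_site], 2))
```

Of course, as the objections above suggest, a clean PUE number assumes the provider's facility-level measurements are trustworthy in the first place, and it says nothing about where the electricity comes from.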
Slapping a green label on something as complex as cloud computing may make for an effective marketing campaign, but the reality is that simply lowering costs does not mean providing the most effective or efficient environment. There's every reason to believe that cloud computing is greener than current infrastructure, but unless and until there is a way to document it, we're all just running on faith.