Wednesday 20 January 2010

Carbon Footprinting and IT

The governments of Taiwan, Sweden and the UK have all recently indicated that they may move towards labelling goods with embedded carbon statistics.

Obviously such labelling is an idea whose time will shortly come, and as further evidence of this it was the topic of the Green Monday Sustainable ICT table I chaired last week.

How reliable is embedded carbon as a purchasing indicator though? With something like a packet of crisps, usage isn't a large factor in its carbon lifecycle. Electronic goods are a very different proposition, as energy consumed in use is a large portion of their total footprint, and because of the complexity of the components, computer equipment often more so.

The Carbon Footprint of a Server

Let's take your average server as an example. Almost every manufacturer will tell you that their server is energy efficient, but what does this mean? At the moment there is no standard comparison between manufacturers, and even if there were, designers could, as processor makers did a few years ago, optimise for performance in benchmarking tests which might not reflect real world usage.

Manufacturers are always going to try to tweak results to favour themselves, and if they didn't, their marketing departments would no doubt have a word to say. Recently at the Green Monday table, a data centre manager said that in his experience of purchasing machines, manufacturers could not be trusted to rate their own products, and the only way to really tell was to plug them in through an energy monitor and do your own comparisons. That's fine if you have the budget and time for such comparisons, but not everyone does.

Servers come in a wide range of profiles as well: a 64 processor machine is going to be difficult to compare against an array of individual machines. What is the expected lifetime of the components in a machine, and given that longevity, how upgradeable is it? These things also need to factor into the carbon footprint of a server.

Very different from a packet of crisps

 
Unlike a refrigerator, which is basically just on, doing a well defined job, and can be rated on energy consumption and cooling efficiency, computers run many different applications, at varying times, often simultaneously, and many of them bespoke. All this makes comparisons difficult. If you are running a standard small office maybe you can make educated guesses, but for data centre operators it's not so easy.

The standard measure for data centre efficiency, PUE, doesn't even try to do this. It just compares the total energy coming into the data centre with the energy actually used by the computers, the remainder going on lighting, cooling and so on. This can be a rough indication of the efficiency of the lighting and cooling systems, or alternatively of the inefficiency of the computing systems, but it's not telling us much more. What we'd really like to know is bang for buck: how much useful activity, as defined by whatever computational thing we are trying to do, do we get for a given amount of energy expended?

A computer often uses 60 to 70% of its peak energy even at low utilisation levels, just doing housekeeping like keeping disks spinning and memory juiced up, and increasing computation doesn't add proportionately that much to its energy use, so utilisation is key. This is why virtualisation is such a big win: it uses as much of the resources of individual machines as possible.

Gartner have come up with PPE as a better statistic, based on "rack density levels, server utilization levels, and energy consumption". Again it's flawed: rack density doesn't tell us too much and restricts airflow, server utilisation doesn't necessarily measure useful work, and energy consumption is only worthwhile if it is compared to useful work. Still, Gartner's measure is an improvement on PUE, and let's hope they further refine it.
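
To make the distinction concrete, here is a minimal sketch in Python of the figures discussed above: PUE as it stands, the "bang for buck" number we would actually like, and the effect of utilisation on energy per unit of work. All the wattages, energy figures and the linear idle-to-peak power model are illustrative assumptions, not measurements from any real facility or machine.

```python
# Illustrative sketch only: the numbers and the linear power model are
# assumptions, not measurements from any real data centre or server.

def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: total facility energy over IT equipment
    energy. 1.0 is the ideal; everything above it is cooling, lighting, losses."""
    return total_facility_kwh / it_equipment_kwh

def work_per_kwh(useful_work_units, it_equipment_kwh):
    """The figure we would actually like: useful work (transactions, jobs,
    renders - whatever the workload is measured in) per kWh consumed."""
    return useful_work_units / it_equipment_kwh

def watts_per_work_unit(idle_watts, peak_watts, utilisation):
    """Crude linear model of server power draw between idle and peak,
    showing why low utilisation is so wasteful."""
    watts = idle_watts + (peak_watts - idle_watts) * utilisation
    return watts / utilisation

if __name__ == "__main__":
    print(f"PUE: {pue(150_000, 100_000):.2f}")                      # 1.50
    print(f"Work per kWh: {work_per_kwh(2_500_000_000, 100_000):,.0f}")
    # A server idling at 65% of its 300W peak: energy cost per unit of work
    # at 10% utilisation versus 80% utilisation.
    for u in (0.1, 0.8):
        print(f"{u:.0%} utilised: {watts_per_work_unit(195, 300, u):.0f} W per work unit")
```

Run with the made-up figures above, the 10% utilised machine costs roughly six times as much energy per unit of work as the 80% utilised one, which is the whole argument for virtualisation in two lines of arithmetic.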


The Carbon Footprint of a Programme


So different software services vary dramatically in the profile of memory, CPU and disk they use, even when performing similar functions. We can use some rules of thumb, we can profile different classes of applications, and we can optimise the hardware of different servers for different tasks to get maximum energy efficiency, so there is much useful work we can do there.
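
By way of illustration, here is a rough sketch of the kind of per-process profiling I mean, using the Python psutil library. The service name, sampling window and units are assumptions made up for the example.

```python
# Rough sketch of per-service resource profiling with psutil
# (https://github.com/giampaolo/psutil). The process name and sampling
# window are illustrative assumptions.
import psutil

def profile(pid, samples=10, interval=1.0):
    """Sample a process's CPU, resident memory and disk I/O over a short window."""
    proc = psutil.Process(pid)
    cpu, rss, io = [], [], []
    for _ in range(samples):
        cpu.append(proc.cpu_percent(interval=interval))  # blocks for `interval` seconds
        rss.append(proc.memory_info().rss)
        counters = proc.io_counters()  # not available on every platform
        io.append(counters.read_bytes + counters.write_bytes)
    return {
        "avg_cpu_percent": sum(cpu) / len(cpu),
        "avg_rss_mb": sum(rss) / len(rss) / 2**20,
        "io_mb_over_window": (io[-1] - io[0]) / 2**20,
    }

if __name__ == "__main__":
    # Profile every process called "myservice" (a made-up name).
    for p in psutil.process_iter(["name"]):
        if p.info["name"] == "myservice":
            print(p.pid, profile(p.pid))
```

Even a crude profile like this, gathered over a representative day, tells you whether a service is memory bound, CPU bound or disk bound, and therefore which hardware it should share a box with.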

What about the programmers though? So far they've gotten off lightly, but perhaps we should be carbon footprinting their work too. Once upon a time, programmers worked with limited resources and tried to tune their applications in every way possible. That has all changed with the latest generation of programmers. During my IT degree at QUT, my lecturer John Hynd told us in one of our first programming lectures that Moore's Law meant we shouldn't worry about efficiency. (He also told us only one in three of us would get through the course successfully, so best of luck to the other two guys, whatever you are doing.) Computers were cheap, programmers were expensive and in short supply, everywhere in the world apparently needed billions of programmers, and of course then the dot com bubble burst and the entire language and economics around IT changed.

Now we are running services and are an integral part of every business, no longer only in the role of a differentiator or competitive advantage, so cost is important. Inefficient programming can mean more servers and more energy consumed. A good programmer who programmes for efficiency is going to understand the trade-offs between design and performance, but many don't, and this needs to be addressed. Some universities are changing the way they teach: the University of East London is looking to integrate concepts of Green IT into its teaching programme, though whether that filters through to the programming classes we shall have to wait and see.

There is also the concept of capacity management: rightsizing resources. Programmers and administrators tend to overspecify equipment as a matter of course. If programmers can do a better job of understanding the usage of underlying resources, they can better understand the physical resources which will be required to run their software.
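
As a trivial example of what rightsizing looks like in practice, here is a sketch of sizing a fleet from measured peak demand plus an explicit headroom figure, rather than the habitual "order double what we think we need". The demand and capacity numbers are made up.

```python
import math

def servers_needed(peak_demand_rps, per_server_capacity_rps, headroom=0.3):
    """Number of servers for measured peak demand plus explicit headroom,
    instead of overspecifying as a matter of course."""
    required = peak_demand_rps * (1 + headroom)
    return math.ceil(required / per_server_capacity_rps)

# Hypothetical numbers: 4,000 requests/s at peak, 600 requests/s per server.
print(servers_needed(4000, 600))  # 9 servers, rather than a round 16 or 20
```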

Given a programmer does all this diligently, will we see software packages with carbon footprints on them, or programmers with carbon aware stamps on their resumes? Probably not, but who knows.

Summary


Some IT equipment is going to be hard to carbon footprint: too many variables, hardware and software configurations, come into play. The embedded carbon in its manufacture is more tangible. We need global standards for this sort of measurement, and either wide industry uptake or legislation mandating it, before it can become universally useful, but it could provide a comparison for purchasing decisions.

For home PCs we could provide some useful guidance to users by profiling a PC's idle energy usage, its standard energy usage running web browsers and office apps, and its energy usage running at full tilt with immersive 3D graphics intensive applications. Users could pick the profile that suits them, and it would help them make the right decision about which PC to buy.
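
A back-of-envelope sketch of how those three profiles might translate into an annual figure a buyer could compare; every wattage and hours-per-day number here is an assumption for illustration only.

```python
# All wattages and hours are illustrative assumptions, not measured figures.
PROFILES = {
    "idle":                   {"watts": 60,  "hours_per_day": 4},
    "web/office":             {"watts": 110, "hours_per_day": 3},
    "3D graphics, full tilt": {"watts": 250, "hours_per_day": 1},
}

annual_kwh = sum(p["watts"] * p["hours_per_day"] * 365 / 1000
                 for p in PROFILES.values())
print(f"Estimated annual consumption: {annual_kwh:.0f} kWh")  # about 300 kWh
```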

All in all, perhaps taxation is a better strategy: something like removing VAT and replacing it with a carbon tax. This would remove the need for users to understand the footprint themselves and would factor it into the price of goods. Who will do the carbon accounting, and what standards they will use, are questions left to be answered.

At the moment carbon footprinting is purely voluntary: if you are a potato crisp company that measures its footprint and finds it is greater than your competitor's, do you work to reduce it or just not publish it? There are a lot of questions yet to be answered at the end of the carbon affluent age. We have to work these things out somehow though, and if IT wants to think of itself as a forward looking industry, it needs to be leading the effort.
