Miha Kralj is a senior architect with Microsoft whose job it is to think about and strategise for the future. According to him, that future is largely going to revolve around moving to cloud computing (unsurprising given the investment in Azure) and the transition to utility computing.
Moving to utility computing is like moving to buying a car as a service: instead of choosing a new car based on model, colour, style and suchlike, you'd choose the car as a service based on value for money and comfort. If you were buying a car as a service you would, of course, be taking a taxi.
Miha suggests that utility computing will be delivered by mega-data-centre providers who supply all the hardware, the power and the operational support, while your enterprise consumes the services it needs and scales up or down as required.
There is evidence of a move in this direction already: of all server sales worldwide, just three customers represent 50% of the business. Those customers are, of course, Microsoft, Google and Yahoo.
Microsoft themselves no longer buy servers in the conventional sense. Instead they purchase locked, sealed container "PODs" like the one pictured below, each a sealed environment containing 200 physical servers. The container remains locked unless availability of units falls below 95% (in other words, when more than 10 of the 200 servers have fried), at which point the container provider sends out an engineer.
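The threshold arithmetic is simple enough to sketch. The snippet below is illustrative only, using the figures from the talk (200 servers per POD, 95% availability); the function name and the exact SLA semantics are my own assumptions:

```python
POD_SERVERS = 200        # physical servers per sealed container (figure from the talk)
SLA_AVAILABILITY = 0.95  # the container stays locked at or above this fraction

def needs_engineer(failed_servers: int, total: int = POD_SERVERS,
                   threshold: float = SLA_AVAILABILITY) -> bool:
    """True once the fraction of working servers drops below the threshold."""
    return (total - failed_servers) / total < threshold

# 10 dead servers leaves the POD at exactly 95% availability - still locked
print(needs_engineer(10))  # False
# the 11th failure breaks the SLA and an engineer is dispatched
print(needs_engineer(11))  # True
```

Note that 5% of 200 is just 10 machines - a POD can therefore tolerate only a handful of failures before the provider has to intervene.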
The consumption of servers by organisations such as Microsoft has a knock-on effect on their carbon output. Microsoft have so many servers in their data centres (Chicago alone has capacity for 220 of the above containers) that worldwide they have the same carbon footprint as the entire aviation industry. That's 5% of the planet's entire power output and emissions attributed to a single company!
In fact, if a POD provider is able to bring lower-power variants to market, Microsoft are happy to replace entire containers to reduce operational costs. The significant cost of a data centre today isn't the capital outlay; it's the operational cost - the power to run the servers and the power/water to cool them.
As developers, we can't write code to be more green: a while loop isn't going to produce less carbon than a for loop, so no matter what we do as creators of software, that software will continue to rely on carbon-hungry servers. The shift towards utility computing and the consolidation of hardware can alleviate some of these issues by requiring less hardware to do more, and by building these mega-data-centres in strategic locations (Iceland was suggested), power consumption can be drastically reduced.
Before widespread adoption of utility computing can progress, however, several hurdles need to be overcome to promote trust in the industry.
Firstly, solid, robust SLAs need to be put in place to promote confidence in off-premises infrastructure, and secondly, some form of regulation needs to come into play for the IT industry at large. We only plug our laptops into the power socket, or drink from the tap, because of the trust established by regulation of those utilities - regulation that ensures we won't be fried or poisoned. IT needs the same regulation to promote trust.
Whatever happens in the future, we can't just look back at past success and hope to repeat it. Just because it worked before doesn't mean it will in a future that could be governed by a new set of rules.