I'm back from the conference "Faktor X - Tag der natürlichen Ressourcen" ("Factor X - Day of Natural Resources") in Berlin, a side event of the World Resources Forum 2009 in Davos.
It was pretty interesting, although the topic is not new. If you don't know it yet: we are running out of several resources. The topics usually going through the media are energy and oil, but several metals will become problematic much sooner. So the aim is material, product, and production efficiency, plus recycling/upcycling(/downcycling).
One of the topics mentioned at the conference was the IT industry. One of the keywords is green IT: more efficiency through IT, more efficiency through more efficient hardware, better recycling of hardware, and so on. Another point was more efficiency through more efficient software. One of the questions raised was "how many resources does a click at Google cost?". I think one can adapt this question to almost any software. There is a lot of software that is widely used; I'll just mention the Apache web server. Wouldn't it be good to know that this widely used software doesn't waste CPU cycles for nothing?
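Just to make that a bit more concrete, here is a minimal C sketch of how one could start measuring the CPU time a piece of code actually burns. The handle_request() function is purely hypothetical stand-in work, not anything from Apache; the point is that CLOCK_PROCESS_CPUTIME_ID counts only the cycles the process itself consumes.

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical stand-in for real work, e.g. handling one request. */
    static void handle_request(void)
    {
        volatile unsigned long sum = 0;
        for (unsigned long i = 0; i < 10000000UL; i++)
            sum += i;
    }

    int main(void)
    {
        struct timespec start, end;

        /* CLOCK_PROCESS_CPUTIME_ID measures CPU time consumed by this
         * process only, not wall-clock time spent waiting on I/O. */
        clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &start);
        handle_request();
        clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &end);

        double cpu_s = (end.tv_sec - start.tv_sec)
                     + (end.tv_nsec - start.tv_nsec) / 1e9;
        printf("CPU time for one request: %.6f s\n", cpu_s);
        return 0;
    }

For real software one would of course reach for proper profilers (Valgrind's callgrind, Linux perf, gprof), but the idea is the same: make wasted cycles visible and measurable.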
So what about a GSoC project that checks such widely used software for efficiency (besides checking for buffer or heap overflows or NULL pointer dereferences)? Can anybody imagine such a project? What is your opinion?