
Provocation: are webapps compatible with Kyoto?

Posted by fabriziogiudici on March 3, 2009 at 3:19 PM PST

John Reynolds has just posted another entry in the saga of "the web is killing the desktop" (no pun intended: it's somewhat obvious that as the world evolves, people discuss the direction it's taking). One of the comments, by cmdrx, gave me the cue for a provocation that has been floating around in my mind for a long time:

With most mobile internet solutions having a 5Gb monthly cap on usage, relying on the internet for everything may be an extremely costly decision.

Well, costs evolve too, and in a few years (recession allowing) we will have much higher caps - or no caps at all. From my own personal perspective, this won't necessarily make me willing to stay permanently connected (I've already blogged about the joy of being disconnected). But this is not a new point.

Now, look at this:

Is it green to trigger a potentially world-spanning transaction for every small operation you do with a web application?

Maybe I'm wrong and somebody has proved the opposite with the proper maths, but I'd say that a good, old, disconnected desktop application that works mostly on your laptop and uses the network only when it's really needed (when you have to transfer information) consumes less energy than a webapp "relying on the internet for everything". What do you think?
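To make the question concrete, here is the shape the maths would have to take. A back-of-envelope sketch: every constant below is a made-up placeholder, not a measured figure (the whole point of the post is that nobody seems to have done the real measurements); it only shows which quantities would need to be measured to settle the argument.

```python
# Back-of-envelope comparison: energy per user "operation", webapp vs. desktop.
# Every constant below is a placeholder assumption, NOT a measured figure.

JOULES_PER_MB_TRANSFERRED = 20.0     # assumed network cost (routers, links) per MB
DATACENTER_JOULES_PER_REQUEST = 5.0  # assumed server-side share per request
LOCAL_CPU_WATTS = 2.0                # assumed extra laptop CPU draw for a local op
LOCAL_OP_SECONDS = 0.05              # assumed duration of the local operation

def webapp_energy(request_mb):
    """Energy (J) for one webapp round trip, under the assumptions above."""
    return request_mb * JOULES_PER_MB_TRANSFERRED + DATACENTER_JOULES_PER_REQUEST

def desktop_energy():
    """Energy (J) for the same operation done locally, with no network involved."""
    return LOCAL_CPU_WATTS * LOCAL_OP_SECONDS

print(f"webapp : {webapp_energy(0.1):.2f} J per operation")
print(f"desktop: {desktop_energy():.2f} J per operation")
```

Whichever way the real numbers come out, the point stands: the comparison is arithmetic, not ideology, and so far nobody seems to have plugged in measured values.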

PS This doesn't relate to the main point, but I'd like to stress that I'm not paranoid about green issues. While I've been a supporter of ecologist associations since I was a child (almost thirty years ago), I'm pretty skeptical about Kyoto and the whole global warming business. But I'm puzzled that in a world where the current mantra is that you should care about how much greenhouse gas you produce with every step you take (possibly avoiding eating too many beans so you don't generate killer amounts of CO2), nobody is thinking about how ecologically inefficient a web application might be. Or is somebody?

Comments

You make an excellent point regarding the need for analysis and metrics to prove that a given distributed application is necessarily greener than an equivalent desktop application that requires only infrequent communication with a remote host. I would think this depends on a number of factors, including the number of devices between the two points, the utilization of network resources, and application/data requirements. Also, the bandwidth just isn't there (at least not yet) for large, complex applications to move vast amounts of data all around the globe.

So you expect browsers, and the demands apps put on them, to stop evolving at some point? Isn't the new scramble to make use of those GPU features and multiple cores via RIAs? It's all fine and good if you assume everybody has fast, cheap net access to local hydro-powered data centers via fibre optics... but that's not realistic, is it? In fact, don't mobiles far outnumber desktops in the East? I doubt chemical/battery power retention is very efficient, nor is net access via mobile cells/wireless.

@warthung and @johnreynolds: these are qualitative arguments, and without quantitative support they can be reversed just as easily. For instance, one of the fashionable energy trends is small, distributed generators (wind, solar): if I use my laptop at my solar-cell-powered cottage, there's no problem with energy distribution. BTW, the argument "the connectivity is there anyway" could equally be made as "the power is there anyway", because of my TV set, hairdryer and dishwasher. IT waste is clearly a problem, but even data centers reach EOL. And I don't have a mental interface for my webapp, so I still need some appliance to connect (heck, mobile phones and palm gear are the typical front end for webapps, and they are known to be a big waste problem). Without the maths it's impossible to prove it one way or the other.

But - back to "the connectivity is there anyway". Is it? If I worked mostly in asynchronous mode, a very narrow bandwidth would serve me most of the time, e.g. for reading emails. If OTOH I have to use a Google Office application, the bandwidth must be abundant. If I work with Subversion on Java.Net, I need plenty of bandwidth for frequent commits; if I used Mercurial, I could save most of it (a rough sketch of this point follows below).

I suspect the telcos are providing more bandwidth simply because it's technically possible, and then pushing consumption models where people need more and more of it, in order to sell more. Of course the same argument could hold for Sun, IBM and others as far as data centers, clouds and whatnot are concerned. In the case of the iPhone, I'm pretty sure that the whole AppStore / iPhone locking business has been deliberately designed to force a web-app style that increases bandwidth consumption and lets AT&T and co. gain more out of it. Nothing illegal or even immoral - it's the economy, baby. It's just that what is an advantage for them is not necessarily an advantage for me.
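As a rough illustration of the Subversion-versus-Mercurial point: the diff sizes and overheads below are illustrative assumptions only, and real traffic also depends on compression and protocol details.

```python
# Rough bandwidth sketch for the Subversion-vs-Mercurial point.
# Diff sizes and overheads are illustrative assumptions, not measurements.

COMMITS_PER_DAY = 20
MB_PER_COMMIT_DIFF = 0.2     # assumed average diff size per commit
OVERHEAD_MB_PER_CALL = 0.05  # assumed fixed protocol overhead per network call

# Centralized (Subversion-style): every commit crosses the network.
svn_traffic = COMMITS_PER_DAY * (MB_PER_COMMIT_DIFF + OVERHEAD_MB_PER_CALL)

# Distributed (Mercurial-style): commits stay local; one batched push per day.
hg_traffic = COMMITS_PER_DAY * MB_PER_COMMIT_DIFF + OVERHEAD_MB_PER_CALL

print(f"Subversion-style: {svn_traffic:.2f} MB/day on the wire")
print(f"Mercurial-style : {hg_traffic:.2f} MB/day on the wire")
```

Under these assumptions the distributed model only saves the per-call overhead; in practice a single batched push also compresses better, and commits made while disconnected cost nothing on the wire until you push.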

Think of all the desktops that end up in landfills because they're "obsolete"... That's much less likely to happen if the only thing your machine needs to run is a browser.

Actually... a web app may well be greener than a desktop app. Arguably, the data center is better utilized than the conventional desktop, meaning less wasted idle energy. Data centers also have a stronger motivation, as well as the resources, to commit to more expensive upfront "green" solutions. A fractional reduction in consumption makes little economic sense to an individual, but in a data center all of those fractional reductions accumulate in one place, where they can be acted upon more readily and efficiently.

Consider a classic office environment where folks work on thin clients served by a central infrastructure. In this context you can see the efficiencies of not having hundreds of idle PCs, fans, and hard drives used for little more than typing a document or reading an email. A concentrated server back end can supply those services on less hardware, which also happens to be more efficiently managed (see the sketch below).

The real cost of the web application is not the data center per se; rather, it's the infrastructure connecting the two. However, as we've seen so far, that connectivity is there anyway, and if anything we're getting more and more of it with the addition of rich media. Finally, with centralized data centers you have better control of, and access to, things like novel sources of power. It's cheaper to put the data center next to the power generator (e.g. Google building next to hydroelectric dams) and bring the data to the users than to bring the power out to the end users. Data, especially over fiber optic, travels much more efficiently than raw AC power.
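The thin-client argument lends itself to the same back-of-envelope treatment; all the wattages and the consolidation ratio below are assumed round numbers, not measurements.

```python
# Sketch of the utilization argument: an idle desktop fleet vs. a shared
# back end serving thin clients. All figures are assumed round numbers.

USERS = 200
DESKTOP_IDLE_WATTS = 80   # assumed draw of a mostly idle office PC
THIN_CLIENT_WATTS = 15    # assumed draw of a thin client
SERVER_WATTS = 400        # assumed draw of one shared server
USERS_PER_SERVER = 50     # assumed consolidation ratio

fat_fleet_kw = USERS * DESKTOP_IDLE_WATTS / 1000.0
thin_fleet_kw = (USERS * THIN_CLIENT_WATTS
                 + USERS / USERS_PER_SERVER * SERVER_WATTS) / 1000.0

print(f"fat clients : {fat_fleet_kw:.1f} kW")
print(f"thin clients: {thin_fleet_kw:.1f} kW")
```

With these particular guesses the thin-client fleet wins by a wide margin, but the outcome hinges entirely on the consolidation ratio and the idle draw of the PCs, which is exactly why real measurements are needed.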

Because if you seriously thought about it and its implications, you'd probably start screaming and never stop... ;) Nobody wants to think about turning the clock back or scaling down; nothing's wrong here, we're making progress. As governments and individuals we're happy enough in our self-denial; let our kids sort the mess out. It might get a bit more traction in 2009, with energy prices sky-rocketing...

And I even commented on that post! Har har har... sorry, I had forgotten it. But that post is 1.5 years old. So let me refine my question: why isn't everybody discussing this topic at least once per week? (I mean, the bean -> CO2 thing and the like is discussed even more frequently.)

I asked the same thing a couple of years ago: http://weblogs.java.net/blog/javakiddy/archive/2007/09/why_rich_intern.h... Until hefty green taxes start coming into force, no doubt people will still scoff at such notions behind their hands. But the halcyon days of cheap, plentiful energy are probably behind us and/or far off in the sci-fi future. And then there's the matter of a global minimum level of net accessibility: in the UK they're trying to set a 2Mb-per-household minimum at the moment, but admit it may be as much as a decade off. 2Mb? And that's a relatively rich western country.