Friday, March 14, 2008

1 in 10^12

I was doing some poking around the internet to familiarize myself with computer processing speeds (ultra-fast supercomputers were the subject of various podcasts I listen to). In the Wikipedia definition of FLOPS (floating point operations per second) there was this:

* The entire BOINC network averages over 900 TFLOPS as of February 17, 2008.[2]
* SETI@Home averages more than 265 TFLOPS.[3]
* Folding@Home has reached over 1 PFLOPS[4] as of September 15, 2007.[5] Note that as of March 22, 2007, PlayStation 3 owners can participate in the Folding@home project. Because of this, Folding@home is now sustaining considerably more than 210 TFLOPS (1267 TFLOPS as of September 23, 2007). See the current stats[6] for details.
* Einstein@Home is crunching more than 70 TFLOPS.[7]
* As of June 2007, GIMPS is sustaining 23 TFLOPS.[8]
* Intel Corporation has recently unveiled the experimental multi-core POLARIS chip, which achieves 1 TFLOPS at 3.2 GHz. The 80-core chip can increase this to 1.8 TFLOPS at 5.6 GHz, although the thermal dissipation at this frequency exceeds 260 watts.
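To put those prefixes in scale (a tera is 10^12, the number in this post's title), here's a quick back-of-envelope sketch of how one home PC stacks up against the BOINC aggregate. The 5 GFLOPS per-PC figure is just a ballpark I'm assuming for a desktop of this era, not a measured number; the 900 TFLOPS total is from the quoted list above:

```python
# Back-of-envelope: one home PC's share of the BOINC aggregate.
# The per-PC figure is an assumed ballpark, not a measurement.

GIGA = 10**9
TERA = 10**12   # the "tera" in TFLOPS, and the 10^12 in the title

boinc_total_flops = 900 * TERA   # ~900 TFLOPS (Feb 2008, quoted above)
one_pc_flops = 5 * GIGA          # assumed ~5 GFLOPS for one desktop

share = one_pc_flops / boinc_total_flops
print(f"One PC is roughly 1/{boinc_total_flops // one_pc_flops:,} of BOINC")
print(f"That is {share:.2e} of the total")
```

With those assumed numbers, a single machine is one part in a couple hundred thousand, which is the whole point: no one box matters much, but the aggregate does.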

I run BOINC all the time, and am a member of the Folding@home and Einstein@Home projects. This means my computer is part of a supercomputer, calculating protein folding and helping in the search for pulsars. Isn't that cool? Distributed computing is impressive because of how many resources it can save: instead of building one huge supercomputer to do all of these operations, the work is spread over thousands of computers around the world that would otherwise be sitting idle. The users lend their unused computing power to help us understand our world better.

I got to thinking about how this impacts the environment. Because these computers are using energy instead of being turned off or sleeping, one could argue that it's environmentally irresponsible, but I'd have to see data on the resources eaten up by large supercomputers to hash this out. Besides, there are worse things one could destroy the environment in the name of, like driving Hummers or shipping food thousands of miles, so in the end it's worth it in my eyes. Additionally, one of the programs I run (which, incidentally, has the largest work units to download and thus uses the most time on my compy) does climate modeling in the hopes of better understanding global warming. So the energy it uses goes toward the search for a solution to climate change, a quintessential carbon offset.
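For what it's worth, here's the shape of the comparison I'd want real data for. Every number below (per-PC wattage, efficiency, cooling overhead) is an assumption I made up for illustration, and the conclusion flips entirely depending on which made-up numbers you pick:

```python
# Rough energy comparison: distributed PCs vs. a dedicated
# supercomputer delivering the same aggregate FLOPS.
# Every constant here is an illustrative assumption, not data.

TERA = 10**12

target_tflops = 900                 # aggregate throughput to match

# Distributed side: assume each PC draws ~100 W it otherwise
# wouldn't, and contributes ~5 GFLOPS.
pc_watts = 100
pc_gflops = 5
n_pcs = target_tflops * 1000 // pc_gflops   # TFLOPS -> GFLOPS
distributed_watts = n_pcs * pc_watts

# Supercomputer side: assume ~200 MFLOPS per watt at the node
# level (a made-up ballpark), plus cooling and infrastructure
# overhead modeled as a 1.8x multiplier on facility power.
flops_per_watt = 200 * 10**6
node_watts = target_tflops * TERA / flops_per_watt
supercomputer_watts = node_watts * 1.8

print(f"Distributed:   {distributed_watts / 1e6:.1f} MW")
print(f"Supercomputer: {supercomputer_watts / 1e6:.1f} MW")
```

The interesting part isn't either total, since both rest on invented inputs; it's that the answer hinges on exactly the things kula raises below: how much *extra* power an already-on PC really draws, versus how much cooling and infrastructure a dedicated facility needs.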


kula said...

One thing to consider is that many of these projects are genuinely useful projects, so the energy used is being used for a useful thing.

It would be interesting to compare the full energy difference between a distributed system like BOINC and an equivalently powerful set of supercomputers. Especially comparing things like the overhead of all those individual machines being on vs. supercomputer setups generally being specialized setups requiring amazing amounts of cooling and other infrastructure.

Nick said...

If anyone would know, it would be you, Thomas. That was exactly my thought about the infrastructure and cooling. I'm not sure how much those things suck up as far as energy goes, but I imagine it's a fair bit. And comparing building an entire facility for a supercomputer vs. distributing the work across PCs (sitting in the homes of their owners, which would be there anyway...), it seems distributed computing comes out on top. But again, I'd like to see actual data on it...

fontgoddess said...

I like the description "quintessential carbon offset." I think you need to do a lolcats-style pic for this. Warming the Earth to cool it . . .

[BTW: Hello!!! I have your old blog in my RSS reader and was getting sad that you had not posted. Thankfully you kept your profile, and I can again catch up on the life of my favorite vegan.]