Wednesday, April 27, 2011

Exascale Computing

Recently, I heard David Keyes talk about exascale computing ("Exaflop/s Seriously!"). There were two interesting lessons.

1. In one of his viewgraphs (which, unfortunately, I do not have access to) he plotted the peak performance of the top supercomputer in the Top500 versus time. The resulting curve looks familiar (Moore's law): on a log-linear scale it is approximately linear. I reproduce a similar-looking curve below (although this one combines the performance of all 500 machines).


The interesting part of the viewgraph was that, on the same plot, he also threw in the performance of the 500th-best machine in the Top500. This was also a linearly increasing curve (on the log-linear plot) that lay somewhat below that of the fastest computer. The surprising bit (for me at least) was that the "phase shift" in time was about 8 years. That is, if the top machine in the Top500 were left alone, in 8 years it would be overtaken by everybody else in the elite group.

This can have policy implications for HPC centers. Rather than trying to build the fastest computer right now, it may be more important to have a long-term plan to keep upgrading the system.
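A quick back-of-the-envelope sketch of that lag, in Octave: if performance grows exponentially with doubling time T, the lag between two machines whose performances differ by a factor R is T * log2(R). The doubling time and performance ratio below are my own illustrative assumptions, not figures from the talk.

  % Illustrative only: both numbers below are assumptions, not from the talk.
  doubling_time = 13.5/12;              % assumed doubling time (years)
  perf_ratio    = 500;                  % assumed #1-to-#500 performance ratio
  lag_years     = doubling_time * log2(perf_ratio)
  % lag_years is about 10, in the same ballpark as the ~8 years above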

2. We've seen CPU clock speeds stagnate. For HPC centers, which use thousands of CPUs, energy considerations can quickly become the dominant concern. This explains the shift to multicore. I learned that energy consumption scales roughly as the third power of the clock frequency. From here, for example:
"If the clock rate of the Multi-Core CPUs will be reduced by 20% only, then the energy consumption is reduced to 50% compared to a system running at full clock speed.”
On the other hand, the energy consumption of a multicore CPU grows only linearly with the number of cores, which explains the move towards multicore, low-frequency supercomputers. I don't know if the numbers here include the cost of air-conditioning (I presume they do), but cooling can be a significant operating cost.
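The quoted figure is consistent with cubic scaling; a quick sanity check in Octave, assuming energy ~ frequency^3 as stated above:

  % Sanity check of the quote, assuming energy ~ frequency^3
  f_ratio = 0.8;          % clock rate reduced by 20%
  e_ratio = f_ratio^3     % = 0.512, i.e. roughly 50% of full-speed energy
  % By contrast, at fixed frequency, energy grows only linearly with core count.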

I remember that we had a small Beowulf cluster (puny by today's standards) when I was in grad school, which was housed in a student office for some time. The room was unbearable in summer.

Monday, April 25, 2011

Links:

1. Why I "prefer" cheap wine.

2. Why "this is fun" may be a better password than "J4FS<2" (a rough sanity check of the arithmetic appears after this list).

3. Bike paths in Tallahassee (Google Maps) - clicking on a route brings up a short description.
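Here is the sanity check for link 2 (my own arithmetic, not the article's): a brute-force attacker has to cover far more combinations for the long passphrase, even though it draws on a smaller alphabet.

  % Rough brute-force search-space comparison (my arithmetic, not the article's)
  n_phrase = 27^11        % "this is fun": 11 characters from a-z plus space
  n_strong = 95^6         % "J4FS<2": 6 characters from ~95 printable symbols
  n_phrase / n_strong     % the passphrase has ~7600 times more combinations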

Wednesday, April 20, 2011

Scott Adams on Scott Adams

I linked to this article by Scott Adams in the last post. Something very interesting seems to have developed since then (H/T nanopolitan).

Essentially, Adams used an alias (plannedchaos) to defend Scott Adams on MetaFilter. An extremely interesting (or lucky?) thing was that right after his first rant, somebody seems to have sniffed him out.

Friday, April 15, 2011

Scott Adams on Real Education

Interesting perspective from the creator of Dilbert. He begins with:
I understand why the top students in America study physics, chemistry, calculus and classic literature. The kids in this brainy group are the future professors, scientists, thinkers and engineers who will propel civilization forward. But why do we make B students sit through these same classes? That's like trying to train your cat to do your taxes—a waste of time and money. Wouldn't it make more sense to teach B students something useful, like entrepreneurship?
  And ends with:
Remember, children are our future, and the majority of them are B students. If that doesn't scare you, it probably should. 
In many ways, I think it is a critique of standardized testing (which automatically fosters standardized education).

Linear Least Squares in Octave

While the Octave functions ols and leasqr are good for heavy lifting, polyfit is often sufficient for simple linear regressions.

Given data arrays x and y, c = polyfit(x, y, 1) gives the linear fit yfit = c(1) * x + c(2).
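For instance, here is a small self-contained sketch with synthetic data:

  % Fit a straight line to noisy synthetic data
  x = linspace(0, 10, 50)';               % predictor
  y = 2.5*x + 1.0 + 0.5*randn(size(x));   % noisy line: slope 2.5, intercept 1.0
  c = polyfit(x, y, 1);                   % c(1) = slope, c(2) = intercept
  yfit = polyval(c, x);                   % equivalent to c(1)*x + c(2)
  printf("slope = %.3f, intercept = %.3f\n", c(1), c(2));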

Sunday, April 10, 2011

Links: Unhurried Nonfiction

I found Longform.org today, and can foresee spending quite a bit of time there in the future. It works somewhat like slashdot.org. Here is the description from the website:
Longform.org posts new and classic non-fiction articles, curated from across the web, that are too long and too interesting to be read on a web browser 
We recommend enjoying them using read later services like Instapaper and Read It Later and feature buttons to save articles with one click.
You can even subscribe to a feed via RSS.

Saturday, April 2, 2011

What's your Nobel number?

We like quantifying things. We design impact factors and h-indices to determine the scientific worth of a journal or a researcher. Sometimes these numbers are meaningful. Sometimes they are gamed. Sometimes they are misused.

My pet peeve against these measures (and their extended family) is that they are designed to measure popularity. Unfortunately, they are commonly used as a proxy for scientific quality.

Can high-quality stuff be popular? You bet.

But the relationship between quality and popularity is tenuous at best.

High-quality stuff can stay under the radar for prolonged periods, and flashy low-quality stuff can go platinum. We intuitively understand and appreciate this difference when we judge music, movies, politics, or literature.

Love them or hate them, we probably have to learn to live with them.


There is another class of numbers, like the Erdős number, that exist for pure entertainment and tongue-in-cheek bragging value. They measure proximity to greatness, and are related to the six degrees of separation idea.

Let us define a new number (maybe it already exists), which we shall call the "Nobel number": the shortest "collaborative distance" between a scientist and a Nobel Laureate, as measured by co-authorship in the scientific literature.

Thus, if you have co-authored a paper with a Nobel Laureate, your Nobel number is 1.
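For fun, the Nobel number is just a shortest-path (breadth-first search) computation on the co-authorship graph. A toy Octave sketch follows; the four authors and their adjacency matrix are entirely made up for illustration.

  % Nobel number = BFS distance from a Laureate in the co-authorship graph.
  % The 4-author graph below is hypothetical.
  A = [0 1 0 0;   % author 1: a Nobel Laureate
       1 0 1 0;   % author 2: co-authored with 1
       0 1 0 1;   % author 3: co-authored with 2
       0 0 1 0];  % author 4: co-authored with 3
  laureate = 1;  me = 3;
  dist = -ones(1, rows(A));  dist(laureate) = 0;
  queue = [laureate];
  while !isempty(queue)
    v = queue(1);  queue(1) = [];
    nbrs = find(A(v,:) & dist < 0);   % unvisited co-authors of v
    dist(nbrs) = dist(v) + 1;
    queue = [queue, nbrs];
  end
  dist(me)    % = 2: author 3's Nobel number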

Mine is 2.

PS: I saw Energy Secretary Steven Chu on CSPAN yesterday. He is a Nobel Laureate and co-authored a paper with my PhD advisor Ron Larson.