Saturday, February 27, 2010

Interesting Calculus Sites

1. Step-by-step Calculus from Wolfram Alpha: I wrote about Wolfram Alpha before. It apparently can show intermediate steps, which could be helpful for someone learning calculus.

Or perhaps it may cause more harm; I don't know.

2. MyCalculus: I took it for a short test drive, and it works fine. To see the intermediate steps, though, it seems you have to pay.

Tuesday, February 23, 2010

Are we producing too many science and engineering PhDs?

A recent Scientific American article ponders the question "Does the US produce too many scientists?"

I don't think there is anything really new in this article that hasn't already been said before, in some shape or form, in articles such as this, this, and this.

Although reality is complex and multi-faceted, the evidence appears to support the hypothesis that "the number of graduate students in PhD programs is disproportionately large".

Monday, February 22, 2010

Double Summation in GNU Octave or Matlab

Let's say that you have two vectors: an n*1 vector t, and an N*1 vector T.

Let's say further that you want to represent the double summation equivalent to the following two for loops, without using loops, since Octave and Matlab suck at loops.

% Nested loops: accumulate t(i) * T(j) over all pairs (i, j)
DoubleSum = 0;
for i = 1:n
  for j = 1:N
    DoubleSum = DoubleSum + t(i) * T(j);  % "+=" works in Octave but not in Matlab
  end
end

Here's one possible method, using the intrinsic function sum.

DoubleSum = sum(sum(t * T'))

If you know some Octave, you will be able to parse the command by yourself.
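
If you want to convince yourself, here is a quick sanity check of my own (the sizes and random data below are arbitrary, just for illustration) comparing the loop against the vectorized form; since this particular summand separates, the result also equals sum(t)*sum(T).

% Quick sanity check: compare the loop against the vectorized form
n = 5; N = 7;           % arbitrary sizes for the check
t = rand(n, 1);
T = rand(N, 1);

DoubleSumLoop = 0;
for i = 1:n
  for j = 1:N
    DoubleSumLoop = DoubleSumLoop + t(i) * T(j);
  end
end

% t * T' is the n-by-N matrix whose (i, j) entry is t(i) * T(j)
DoubleSumVec = sum(sum(t * T'));

% Because the summand separates, this particular sum also equals sum(t) * sum(T)
disp([DoubleSumLoop, DoubleSumVec, sum(t) * sum(T)])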

Let's throw in another wrinkle. Let's say that, in addition to t and T, we had another N*1 vector g, and we wanted to find the double summation corresponding to the quantity g(j)*exp(-t(i)/T(j)), instead of t(i)*T(j) in the double for loop above.

Simple:

DoubleSum = sum(exp(-t * (1./T)') * g)
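
Again, a small check of my own (assuming, as above, that t is n*1 and T and g are N*1, with the entries of T nonzero since we divide by them): the matrix exp(-t * (1./T)') holds exp(-t(i)/T(j)) in its (i, j) entry, so right-multiplying by g and summing reproduces the double loop.

% Verify the vectorized expression against the explicit double loop
n = 5; N = 7;
t = rand(n, 1);
T = rand(N, 1) + 0.1;   % keep T away from zero, since we divide by it
g = rand(N, 1);

DoubleSumLoop = 0;
for i = 1:n
  for j = 1:N
    DoubleSumLoop = DoubleSumLoop + g(j) * exp(-t(i) / T(j));
  end
end

DoubleSumVec = sum(exp(-t * (1./T)') * g);

disp([DoubleSumLoop, DoubleSumVec])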

Friday, February 19, 2010

Old Comic!

I am currently teaching interpolation in my graduate numerical methods class.

Hence, I couldn't resist the temptation of proposing a moral for this old xkcd comic:

"Estimates based on extrapolating first derivatives may be dangerous!"

Wednesday, February 17, 2010

Can't engineers program anymore?

I taught junior-level thermodynamics three times in a row as a fresh assistant professor at Florida State. There was a small Matlab (actually Octave, but it doesn't really matter) component to the class, so that we could solve some realistic problems without sweating the algebra or arithmetic, and let the computer do the dirty work.

But instead of being wildly popular, it was easily the most-unloved portion of an otherwise well-received course. I have talked to colleagues in other engineering departments here and elsewhere, and the lack of programming skills seems to be a pervasive phenomenon, independent of the rank of the university.

It has indeed become a national epidemic, or perhaps even a global pandemic.

To me, this seemed inexplicable at first. Kids these days are exposed to computers and other programmable gadgets at such an early age. They are comfortable with them. How, then, could their ability to squeeze performance out of these devices be so poor?

And we are talking Matlab, which, if you ask me, is not even a real programming language. I mean, you can get by in it without ever touching classes, structures, libraries, templates, or any of the other hallmarks of a real programming language. Spreadsheets like Excel are all they can handle, and even there, they aren't power users who exploit the underlying programmability.

One thing I realized gradually was that electronic equipment and computers have become extraordinarily complex. When I was first exposed to computers, it was easy and fascinating to poke "under the hood" and tweak things. This was true of other electronic equipment like radios, TVs, and VCRs.

As Feynman puts it (via this amazing blog):
Radio circuits were much easier to understand in those days because everything was out in the open. After you took the set apart (it was a big problem to find the right screws), you could see this was a resistor, that’s a condenser, here’s a this, there’s a that; they were all labeled. And if wax had been dripping from the condenser, it was too hot and you could tell that the condenser was burned out. If there was charcoal on one of the resistors you knew where the trouble was. Or, if you couldn’t tell what was the matter by looking at it, you’d test it with your voltmeter and see whether voltage was coming through. The sets were simple, the circuits were not complicated. The voltage on the grids was always about one and a half or two volts and the voltages on the plates were one hundred or two hundred, DC. So it wasn’t hard for me to fix a radio by understanding what was going on inside, noticing that something wasn’t working right, and fixing it.
And as a commenter on the same post wrote:
The problem, I think, is much wider and deeper than just for physics. Philosopher Stephen Clarke has argued that we are currently in a unique transition point in the history of technology: From a time when most people could understand (or could quickly learn) how most technologies they encountered in their everyday life worked, to a time when almost no ordinary person can understand how any everyday technology works.
I will blog more about this topic in the near future, but if you have any ideas or suggestions for making programming "cool", do let me know.

Saturday, February 13, 2010

Weekend Links

1. Math functions in real life (via Flowing Data): Interesting photographs, although some of them seem too ambitious. Here's one for example.

2. This PHD cartoon
3. This interesting article on "the Joy of Less" (via SimoleanSense)
The millionaires I know seem desperate to become multimillionaires, and spend more time with their lawyers and their bankers than with their friends (whose motivations they are no longer sure of). And I remember how, in the corporate world, I always knew there was some higher position I could attain, which meant that, like Zeno’s arrow, I was guaranteed never to arrive and always to remain dissatisfied.

Friday, February 12, 2010

Puzzler

I thought I'd quickly pose a puzzle that I read (as a comment) on one of the blogs I follow. I won't name the source just yet, to avoid giving the answer away.

Anyway, the puzzle runs as follows:
Ten mathematicians are sitting around a dinner table. They want to figure out their average salary (total salary/10) without revealing their individual salaries. They each have a piece of paper to scribble on, and there is a pot into which these pieces of paper may be put.

PS: The answer is not unique. The two people I posed this problem to gave me two different, but acceptable, answers.

Tuesday, February 9, 2010

Does exercise really make us thinner?

I recently read an interesting article called "The Scientist and the Stairmaster" by Gary Taubes, with the provocative "abstract" (more a longish subtitle): "Why most of us believe that exercise makes us thinner—and why we're wrong."

It has an interesting thesis, and deserves a look. I bring the article up for two reasons.

First, the author brings up the topic of thermodynamics three times in the article. At one point he boldly proclaims:
Humans, rats, and all living organisms are ruled by biology, not thermodynamics.
If that was insufficient to knock you senseless, he comes up with an encore.
These physiological mechanisms serve fundamentally to work against the inevitable pull of thermodynamics (which is entropy, a.k.a. death) and so make life possible.
I think the first claim is categorically bogus, and the implication in the second statement is dishonest.

Which thermodynamic law, when properly applied, is outlawed by biology?

The second law of thermodynamics clearly allows the entropy of a "system" free to exchange energy with its surroundings to decrease.

There is nothing remarkable about it.

Even dumb, non-living stuff free from any "life-force" can do that, making the phrase "to work against the inevitable pull of thermodynamics (which is entropy, a.k.a. death) and so make life possible" inappropriately dramatic.
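
To spell out the bookkeeping (a standard textbook statement, not anything drawn from Taubes's article), the second law constrains the total entropy of system plus surroundings, not the system's entropy alone:

\Delta S_{\mathrm{total}} = \Delta S_{\mathrm{system}} + \Delta S_{\mathrm{surroundings}} \ge 0

So the system's entropy can decrease whenever the surroundings pick up at least as much; water freezing in a freezer is the standard non-living example.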

Some of the comments under the article are bang on (or perhaps they share my scepticism). I reproduce a few excerpts, without edits, below:
Look, you are going to get whatever validation you need from this article, these studies, these reports, that suits you. In the end, every study can be countered by another, each report has an opposing report and with our global communication, you can always find studies and reports to support what your personal habits.

Gary Taubes is trying to sell a book and I am all for capitalism but you must see it as that.

One thing that I was hoping he would clear up was if, as he says, obesity is a function of the insulin levels in the body. I have read some articles indicating that exercise will lower insulin levels in the body. Wouldn't that help in a person's fight with obesity?

Another question that should be considered is how long the studies have gone on. A person may gain muscle to a point, and then stop further development (when the muscle is strong enough to cope with the new excercise strength requirements), while the fat keeps burning off. Their weight would then go down.
I hope that one day technical journals will allow free commenting on articles, the way news magazines do. I kid you not: I often learn more from the comments than from the main article.

Monday, February 8, 2010

More Advice

"Ten Lessons I wish I had been Taught" by Gian-Carlo Rota (via Micromath).

About the speaker, via Wikipedia:
Rota was one of the most respected and popular teachers at MIT. He taught a difficult but very popular course in probability, 18.313, which MIT has not offered again. He also taught 18.001 (Applications of Calculus), and 18.03 Differential Equations. His philosophy course in Phenomenology was offered on Friday nights to keep the enrollment manageable. Among his many eccentricities, he would not teach without a can of Coca-Cola, and handed out prizes ranging from Hershey bars to pocket knives to students who asked questions in class or did well on tests.
Some of it is actually very good, and most of it is commonsensical.

Check it out.

Thursday, February 4, 2010

Why are so many nursery rhymes depressing?

Thanks to my daughter, and the magic of repetition, I can now sing more nursery rhymes from memory than I could at any previous instant in my life. At home, we have several illustrated books, both from India and the US, and it is fascinating to note differences in otherwise familiar rhymes. Wikipedia has a fairly exhaustive list of the different versions sung in different parts of the world.

As part of relearning, I have been paying more attention to the meaning or interpretation of these rhymes. One striking attribute of many rhymes is how dark the subject matter is.

Take Ring-a-ring-o'-Roses, for example. A popular interpretation is that the rhyme refers to the plague, with the roses representing the rash that was a common symptom, and the posies a supposed medicine. The last two lines (in the Indian version) note the constant sneezing and the eventual death.

Or take Rock-a-bye-baby instead. Why do kids need to visualize the bough breaking and the cradle falling down from the treetop? What good can that possibly accomplish?

Or London Bridge is falling down. Despite the cheerful melody that usually accompanies the rhyme, the subject is an engineering accident in which many people probably died, for chrissake!

Even Hindi nursery rhymes aren't free of blame.

Remember "Machli jal ki raani hain"?

What are we teaching our children?

Monday, February 1, 2010

Natural v/s Artificial

A few years ago, one of my cousins brought up the topic of "the boundary between artificial and natural". Her argument (I paraphrase) was something along the following lines: 
When a man builds a dam, it is labeled artificial (as in non-natural).
When a beaver builds one, it is natural.
Therefore, the boundary between natural and artificial is bogus.

Here is a cartoon from Abstruse Goose, which captures the same sentiment more graphically.