
Thursday, June 21, 2012

Linear regression and logarithms don't mix?

You have a power-law model $y = a_0 x^{a_1}$, and a bunch of experimental data points $$\begin{bmatrix}x_1 & y_1 \\ x_2 & y_2 \\ \vdots & \vdots \\ x_n & y_n\end{bmatrix}.$$
You want to estimate $a_0$ and $a_1$, so it is tempting to take the logarithm of both sides, $\log y = \log a_0 + a_1 \log x$, and perform linear regression on the suitably transformed experimental data $$\begin{bmatrix}\log x_1 & \log y_1 \\ \log x_2 & \log y_2 \\ \vdots & \vdots \\ \log x_n & \log y_n\end{bmatrix}.$$
Beware! You may get something different from what you expect, and your answers might not mean much. Least squares on the log-transformed data minimizes squared error in $\log y$, which implicitly assumes multiplicative (log-normal) noise; if your measurement errors are additive, that is not the fit you asked for, and the estimates can come out noticeably different.
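To see the difference concretely, here is a minimal sketch (not from the original post; the data, noise level, and parameter values are made up for illustration) that fits the same synthetic data both ways: once by linear regression in log-log space, once by nonlinear least squares on the raw values.

```python
# Illustrative sketch: log-space regression vs. direct least squares.
# Synthetic data, noise level, and "true" parameters are assumptions.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# "True" power law with additive Gaussian noise (an assumed noise model).
a0_true, a1_true = 2.5, 1.7
x = np.linspace(2.0, 10.0, 50)
y = a0_true * x**a1_true + rng.normal(scale=2.0, size=x.size)

# Fit 1: linear regression on the log-transformed data.
# This minimizes squared error in log(y), i.e. relative (multiplicative) error.
slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
a0_log, a1_log = np.exp(intercept), slope

# Fit 2: nonlinear least squares on the original data.
# This minimizes squared error in y itself (absolute, additive error).
(a0_nls, a1_nls), _ = curve_fit(lambda x, a0, a1: a0 * x**a1, x, y, p0=(1.0, 1.0))

print(f"log-space fit: a0 = {a0_log:.3f}, a1 = {a1_log:.3f}")
print(f"direct fit:    a0 = {a0_nls:.3f}, a1 = {a1_nls:.3f}")
```

The two sets of estimates disagree because each one is optimal under a different implicit error model.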

Perhaps you should be doing maximum likelihood instead. Here is a nice tutorial (pdf) on it.
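For concreteness, here is a minimal maximum-likelihood sketch under an assumed additive Gaussian noise model, $y_i = a_0 x_i^{a_1} + \varepsilon_i$ with $\varepsilon_i \sim N(0, \sigma^2)$. The noise model, starting values, and optimizer choice are illustrative assumptions, not something prescribed by the tutorial.

```python
# Illustrative maximum-likelihood fit of a power law under assumed
# additive Gaussian noise; swap in whatever noise model fits your data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
a0_true, a1_true = 2.5, 1.7
x = np.linspace(2.0, 10.0, 50)
y = a0_true * x**a1_true + rng.normal(scale=2.0, size=x.size)

def neg_log_likelihood(params, x, y):
    a0, a1, log_sigma = params
    sigma = np.exp(log_sigma)          # optimize log(sigma) so sigma stays positive
    resid = y - a0 * x**a1
    # Negative Gaussian log-likelihood, additive constants dropped.
    return 0.5 * np.sum(resid**2) / sigma**2 + x.size * np.log(sigma)

result = minimize(neg_log_likelihood,
                  x0=(1.0, 1.0, np.log(y.std())),   # rough, assumed starting point
                  args=(x, y), method="Nelder-Mead")
a0_ml, a1_ml, _ = result.x
print(f"ML estimates: a0 = {a0_ml:.3f}, a1 = {a1_ml:.3f}")
```

Under this particular noise model the ML estimates of $a_0$ and $a_1$ coincide with direct nonlinear least squares; the point of writing out the likelihood explicitly is that you can substitute the error model that actually describes your measurements, rather than the one the log transform silently imposes.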

PS: Part of the motivation for this post was this.
