I was teaching Gaussian elimination for solving linear systems of equations in my undergrad Algorithms class last week, and at one point we had to estimate the asymptotic computational cost of the algorithm. This involved sums such as \[1+2+3+\cdots+n = \sum_{k=1}^n k = \frac{n(n+1)}{2},\] and \[1^2+2^2+3^2+\cdots+n^2 = \sum_{k=1}^n k^2 = \frac{n(n+1)(2n+1)}{6},\] and so on. Sums of higher powers (up to 10) of natural numbers are tabulated in many places, including at MathWorld.
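As a quick sanity check (my addition, not part of the original post; the function names are mine), here is a short Python sketch that verifies these closed forms against brute-force summation:

```python
# Verify the closed-form power sums quoted above by brute force.
def sum_powers(n, p):
    """Compute 1^p + 2^p + ... + n^p directly."""
    return sum(k**p for k in range(1, n + 1))

def closed_form(n, p):
    """Closed forms for p = 1 and p = 2, as quoted above."""
    if p == 1:
        return n * (n + 1) // 2
    if p == 2:
        return n * (n + 1) * (2 * n + 1) // 6
    raise ValueError("only p = 1 and p = 2 are implemented here")

for p in (1, 2):
    for n in (10, 100, 1000):
        assert sum_powers(n, p) == closed_form(n, p)
print("closed forms verified")
```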
One of the students in my class asked me how to derive these, and I showed a couple of simple tricks that let you calculate the sums of some of the lower powers; one such trick is sketched below.
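One such trick (this is my reconstruction of the standard telescoping argument, not a verbatim record of the class discussion) is to sum the identity \( (k+1)^2 - k^2 = 2k+1 \) over \( k = 1, \dots, n \). The left side telescopes: \[\sum_{k=1}^n \left[(k+1)^2 - k^2\right] = (n+1)^2 - 1 = \sum_{k=1}^n (2k+1) = 2\sum_{k=1}^n k + n,\] and solving for the sum gives \[\sum_{k=1}^n k = \frac{(n+1)^2 - 1 - n}{2} = \frac{n(n+1)}{2}.\] The same trick applied to \( (k+1)^3 - k^3 = 3k^2 + 3k + 1 \) produces the formula for \( \sum_{k=1}^n k^2 \), and so on up the powers.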
If we only care about the leading term and its prefactor, as we do when estimating asymptotic computational costs, there is a much easier method. Under big-O notation, one can simply use the approximation \[\sum_{k=1}^n k^p \approx \int_{0}^{n} x^p\, dx = \frac{n^{p+1}}{p+1}.\]
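A quick numerical illustration (again my addition): the ratio between the exact sum and the integral approximation tends to 1 as \( n \) grows, so the approximation does capture both the leading power and its prefactor.

```python
# Compare the exact power sum with the integral approximation n^(p+1)/(p+1).
# The ratio tends to 1 as n grows, confirming that the approximation
# recovers the leading term and its prefactor.
for p in (1, 2, 3):
    for n in (10, 100, 1000, 10000):
        exact = sum(k**p for k in range(1, n + 1))
        approx = n**(p + 1) / (p + 1)
        print(f"p={p}, n={n:>5}: ratio = {exact / approx:.6f}")
```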