The uncertainties package makes it extremely easy to perform numerical error propagation: if you have numbers with associated uncertainties, like \(x = 1 \pm 0.1\), it will tell you how the uncertainty propagates to functions of \(x\).
Simple Example
from uncertainties import ufloat

x = ufloat(1, 0.1)   # x = 1 +/- 0.1
y = ufloat(2, 0.1)
print("x =", x)
print("x^2 =", x**2)
print("x/y =", x/y)
Output:
x = 1.00+/-0.10
x^2 = 1.00+/-0.20
x/y = 0.50+/-0.06
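These numbers are what standard first-order (linear) error propagation gives; for a function \(f(x, y)\) of independent variables,
\[\sigma_f = \sqrt{\left(\frac{\partial f}{\partial x}\,\sigma_x\right)^2 + \left(\frac{\partial f}{\partial y}\,\sigma_y\right)^2},\]
which reproduces the \(0.20\) for \(x^2\) and the \(\approx 0.06\) for \(x/y\). The package computes the required derivatives automatically, so you never have to write this formula out by hand.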
Install
Installing the package is straightforward with "pip":
sudo pip install --upgrade uncertainties
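If you prefer to avoid a system-wide install with sudo, a per-user install should work just as well:
pip install --user --upgrade uncertainties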
Bigger Example
I had a datafile called "stiff.dat" with data in columns: the first column was bp (base-pairs), the second and third columns were the contour length \(L\) and its uncertainty \(\Delta L\), and the last two columns were a quantity \(D^2\) and its uncertainty \(\Delta D^2\).
$ cat stiff.dat
25 5.31 0.18 42.25 4.36
50 15.07 0.84 108.46 23.03
75 24.99 2.26 129.92 29.36
100 36.05 3.85 169.35 20.40
125 49.91 4.91 263.68 88.00
150 58.04 5.82 354.27 66.53
175 80.64 6.89 508.11 132.49
200 85.61 7.73 651.71 234.54
250 129.23 10.16 1018.93 322.32
300 158.55 7.45 1240.22 474.75
425 267.88 13.32 7129.34 5050.29
What I wanted to compute was \[s = L/\sqrt{D^2}.\]
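For independent \(L\) and \(D^2\), first-order propagation gives the relative uncertainty
\[\frac{\Delta s}{s} = \sqrt{\left(\frac{\Delta L}{L}\right)^2 + \left(\frac{\Delta D^2}{2\,D^2}\right)^2},\]
which is what the ufloat arithmetic in the loop below works out row by row.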
import numpy as np
from uncertainties import ufloat

data = np.loadtxt('stiff.dat')          # columns: bp, L, dL, D^2, dD^2
bp = data[:, 0]
N = len(bp)
for i in range(N):
    L, dL = data[i, 1:3]                # contour length and its uncertainty
    D2, dD2 = data[i, 3:5]              # D^2 and its uncertainty
    s = ufloat(L, dL)/(ufloat(D2, dD2)**0.5)
    print("{0:3d} {1:.4f} {2:0.4f}".format(int(bp[i]), s.n, s.s))
The ".n" and ".s" give access to the nominal value and the standard deviation.
25 0.7746 0.0603
50 1.4261 0.2046
75 2.2058 0.2223
100 2.4951 0.2947
125 2.7341 0.5820
150 3.3198 0.3114
175 3.6405 0.4322
200 4.1599 0.6252
250 4.7373 0.6742
300 5.1782 0.7471
425 5.4803 1.3160
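For reference, ".n" and ".s" are just shorthand for the longer "nominal_value" and "std_dev" attributes; a quick check with the \(x/y\) value from the simple example:
from uncertainties import ufloat

r = ufloat(1, 0.1) / ufloat(2, 0.1)   # the x/y value from the simple example
print(r.n, r.s)                       # shorthand attributes
print(r.nominal_value, r.std_dev)     # full attribute names, same values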
Beyond
The package has a lot of features beyond scalar arithmetic: it has native support for NumPy arrays through its unumpy module, it tracks correlations between variables automatically, and more. A minimal sketch of those two features follows.
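The sketch below assumes a reasonably recent version of the package (one where unumpy.uarray takes separate sequences of nominal values and standard deviations) and reuses the first three rows of "stiff.dat":
from uncertainties import ufloat, covariance_matrix
from uncertainties import unumpy

# Array support: propagate uncertainties through whole columns at once
L = unumpy.uarray([5.31, 15.07, 24.99], [0.18, 0.84, 2.26])
D2 = unumpy.uarray([42.25, 108.46, 129.92], [4.36, 23.03, 29.36])
s = L / unumpy.sqrt(D2)              # element-wise s = L/sqrt(D^2)
print(unumpy.nominal_values(s))      # plain NumPy array of nominal values
print(unumpy.std_devs(s))            # plain NumPy array of standard deviations

# Correlations between variables are tracked automatically
x = ufloat(1, 0.1)
y = x**2
print(y - x**2)                      # zero with zero uncertainty: y and x**2 are fully correlated
print(covariance_matrix([x, y]))     # 2x2 covariance matrix of x and y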
The package's own documentation has additional tutorial pages with more examples.