A random walk through a subset of things I care about. Science, math, computing, higher education, open source software, economics, food etc.
Tuesday, July 25, 2017
Russell's paradox
I came across this interesting paradox on a recent podcast. According to Wikipedia:
According to naive set theory, any definable collection is a set. Let \(R\) be the set of all sets that are not members of themselves. If \(R\) is not a member of itself, then its definition dictates that it must contain itself, and if it contains itself, then it contradicts its own definition as the set of all sets that are not members of themselves. This contradiction is Russell's paradox.
Symbolically:
\[\text{Let } R = \{ x \mid x \not \in x \} \text{, then } R \in R \iff R \not \in R\]
There is a nice commentary on the paradox in SciAm, and a superb entry in the Stanford Encyclopedia of Philosophy.
Wednesday, July 19, 2017
Questions Kids Ask
Between my curious 4- and 8-year-olds, I got asked the following questions in the past month.
I found all of them fascinating.
1. Why are our front milk teeth (incisors) the first to fall out?
2. Why is "infinity minus infinity" not equal to zero?
3. Why don't you get a rainbow when you shine a flashlight on rain in the night?
4. How are Cheerios and donuts made (into tori)?
5. His, hers, ours, yours. Then why not "mines"?
PS: I also learned from my 4-year old that daddy long legs aren't really spiders and don't spin webs, and that sea turtles feed on jellyfish.
Sunday, July 16, 2017
Matplotlib: Subplots, Inset Plots, and Twin Y-axes
This Jupyter notebook highlights ways in which Matplotlib gives you control over the layout of your charts. It is intended as a personal cheatsheet.
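The notebook itself isn't reproduced here, but a minimal sketch of the three layout devices it covers (subplots, an inset axes, and a twin y-axis) might look like the following; the data and figure sizes are just placeholders.

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 2 * np.pi, 200)

# Two subplots side by side.
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Left panel: a plot with an inset axes in the upper-right corner.
ax1.plot(x, np.sin(x), label="sin(x)")
inset = ax1.inset_axes([0.6, 0.6, 0.35, 0.35])  # [x0, y0, width, height] in axes fractions
inset.plot(x, np.sin(x) ** 2)
inset.set_title("inset", fontsize=8)
ax1.legend()

# Right panel: two curves with different scales sharing one x-axis via a twin y-axis.
ax2.plot(x, np.cos(x), color="tab:blue")
ax2.set_ylabel("cos(x)", color="tab:blue")
twin = ax2.twinx()
twin.plot(x, 100 * np.exp(-x), color="tab:red")
twin.set_ylabel("100 exp(-x)", color="tab:red")

fig.tight_layout()
plt.show()
```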
Friday, July 7, 2017
John Roberts Commencement Speech
This part of the address is really nice and timeless.
The transcript of the full speech is available here.
Wednesday, July 5, 2017
Joints from Marginals: Compilation
For convenience, here are links to the three posts in this series in one place.
1. A technique for solving the problem in a special case
2. The reason this technique works
3. The corners/edges of this technique, or how it fails for non-Gaussian marginals
Sunday, July 2, 2017
Joint from Marginals: non-Gaussian Marginals
In a previous post, I asked whether the method described here can be used with non-Gaussian distributions.
Let us explore that by considering two independent zero-mean, unit-variance distributions that are not Gaussian. Let us sample \(x_1\) from a triangular distribution, and \(x_2\) from a uniform distribution.
We take a triangular distribution with zero mean and unit variance that is symmetric about zero, spanning \(-\sqrt{6}\) to \(+\sqrt{6}\) (a symmetric triangular distribution on \([-a, a]\) has variance \(a^2/6\), so \(a = \sqrt{6}\) gives unit variance). Similarly, we take a symmetric uniform distribution spanning \(-\sqrt{3}\) to \(+\sqrt{3}\), whose variance is \((\sqrt{3})^2/3 = 1\).
Samples from these independent random variables are shown below.
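As a rough sketch (not the code behind the figures in this post), zero-mean, unit-variance samples from these two distributions can be drawn with NumPy as follows; the seed and sample size are arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 100_000

# Symmetric triangular distribution on [-sqrt(6), sqrt(6)] with mode 0: variance = 6/6 = 1.
x1 = rng.triangular(left=-np.sqrt(6), mode=0.0, right=np.sqrt(6), size=n)

# Uniform distribution on [-sqrt(3), sqrt(3)]: variance = (2*sqrt(3))**2 / 12 = 1.
x2 = rng.uniform(low=-np.sqrt(3), high=np.sqrt(3), size=n)

print(x1.mean(), x1.var())  # approximately 0 and 1
print(x2.mean(), x2.var())  # approximately 0 and 1

# Histograms of the two independent marginals.
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.hist(x1, bins=100, density=True)
ax1.set_title("triangular")
ax2.hist(x2, bins=100, density=True)
ax2.set_title("uniform")
plt.show()
```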
When we apply the previous recipe with a correlation coefficient of 0.2, we get correlated random variables with zero mean and the desired covariance matrix, but ...
... the marginals are not exactly the same!
This becomes evident when we increase the correlation coefficient to, say, 0.5.
The sharp edges of the uniform distribution get smoothed out.
Did the method fail?
Not really. If you paid attention, the method is designed to preserve the mean and the covariance matrix (which it does). It doesn't guarantee that the marginal distributions are preserved.
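For concreteness, here is a sketch of the recipe as used in this series: mix the independent samples linearly so that the target correlation is induced. The mixing weights below are the standard choice for this construction; the code in the earlier posts may differ in details.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
rho = 0.5  # target correlation coefficient

# Independent zero-mean, unit-variance marginals (triangular and uniform).
x1 = rng.triangular(-np.sqrt(6), 0.0, np.sqrt(6), size=n)
x2 = rng.uniform(-np.sqrt(3), np.sqrt(3), size=n)

# Linear mixing: y1 = x1, y2 = rho*x1 + sqrt(1 - rho^2)*x2.
# Both y1 and y2 keep zero mean and unit variance, and corr(y1, y2) = rho ...
y1 = x1
y2 = rho * x1 + np.sqrt(1.0 - rho**2) * x2

print(np.cov(y1, y2))  # close to [[1, rho], [rho, 1]]

# ... but y2 is now a weighted sum of a triangular and a uniform variable,
# so its marginal is no longer uniform: the sharp edges get smoothed out.
```

Running this confirms the point of the post: the covariance matrix comes out as requested, while the marginal of the mixed variable drifts away from its original shape.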