3/19/2026 at 10:55:51 AM
The way I understand this is that addition of random variables is a smoothing operation on their densities (more generally on their distributions, but let me speak of densities only). A little more formally, additions of random variables are convolutions of their densities. Repeated additions are repeated convolutions.
A single convolution can be understood as a matrix multiplication by a specific symmetric matrix. Repeated convolutions are therefore repeated matrix multiplications.
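A minimal numpy sketch of that correspondence (the kernel and sizes here are illustrative, not anything from the thread): convolving a density vector with a fixed kernel is the same as multiplying by a banded (Toeplitz) matrix built from that kernel.

```python
import numpy as np

# Convolution with a fixed kernel is a linear map on the density vector,
# so it can be written as multiplication by a banded (Toeplitz) matrix.
kernel = np.array([0.25, 0.5, 0.25])      # a simple 3-tap smoothing density
p = np.array([0.0, 0.0, 1.0, 0.0, 0.0])   # a point mass, as a vector

n = len(p)
M = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        k = i - j + 1                      # offset into the 3-tap kernel
        if 0 <= k < 3:
            M[i, j] = kernel[k]

# The matrix product agrees with 'same'-mode convolution.
out = M @ p
assert np.allclose(out, np.convolve(p, kernel, mode="same"))
```

Repeated addition of random variables is then repeated multiplication by M applied to the density vector.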
Anyone familiar with linear algebra will know that repeated matrix multiplication by a non-degenerate matrix reveals its eigenvectors.
The Gaussian distribution is such an eigenvector. Like any eigenvector, it is a fixed point up to scale: multiplying again by the same matrix will lead to the same vector, just scaled. The Gaussian distribution convolved is again a Gaussian distribution.
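A quick numerical check of that closure property, assuming nothing beyond numpy: discretize two zero-mean Gaussians, convolve them, and compare against the Gaussian whose variance is the sum of the two variances.

```python
import numpy as np

def gaussian(x, sigma):
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-20, 20, 4001)   # symmetric grid so 'same' mode stays centred
dx = x[1] - x[0]

g1 = gaussian(x, 1.0)
g2 = gaussian(x, 2.0)

# Discretized convolution of the two densities.
conv = np.convolve(g1, g2, mode="same") * dx

# N(0, 1) convolved with N(0, 4) should be N(0, 5).
expected = gaussian(x, np.sqrt(5.0))
assert np.allclose(conv, expected, atol=1e-4)
```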
The addition operation in averaging is a matrix multiplication in the distribution space, and the division by the 'total' in averaging takes care of the scaling.
Linear algebra is amazing.
PageRank is an eigenvector of the normalised web adjacency matrix. The Gaussian distribution is the eigenvector of the infinite averaging matrix. Essentially the same idea.
by srean
3/20/2026 at 5:52:07 PM
> Linear algebra is amazing.

The whole of control systems theory is basically various applications of linear algebra. Take the Kalman filter, which got us to the moon. Simply amazing.
by bsoles
3/19/2026 at 2:21:19 PM
Convolution alone does not smooth. E.g. consider a random variable supported on the points 0 and 1 (delta masses at two points). No matter how many convolutions you do, you still have support on the integers, which is not smooth at all. You need appropriate rescaling to get a Gaussian.

Also, convolving a distribution with itself is NOT a linear operation, hence cannot be described by a matrix multiplication with a fixed matrix.
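A small sketch of this point, with illustrative numbers: repeatedly convolving a fair coin's density with itself gives the binomial distribution, whose support is still the integers.

```python
import numpy as np

# A fair coin: mass only at the integers 0 and 1 (index k holds P(X = k)).
p = np.array([0.5, 0.5])

dist = p.copy()
for _ in range(9):
    dist = np.convolve(dist, p)   # add one more independent coin

# After 10 additions the support is still {0, 1, ..., 10}: binomial(10, 1/2),
# not smooth. Only after centering and rescaling does the *shape* of the
# histogram approach a Gaussian.
assert len(dist) == 11
assert np.isclose(dist[5], 252 / 1024)   # C(10, 5) / 2**10
```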
by kurlberg
3/19/2026 at 2:36:44 PM
You are absolutely right. Even edge detection can be written as a convolution; that's why I mention averaging.

I address scaling, very peripherally, towards the end. Of course, depending on how you scale, you end up with distinctly different limit laws.
by srean
3/19/2026 at 11:01:18 AM
> Anyone familiar with linear algebra will know that repeated matrix multiplication by non-degenerate matrices reveals their eigenvectors.

TIL that I'm not "familiar" with linear algebra ;)
But seriously, thanks for sharing that knowledge.
by CoastalCoder
3/19/2026 at 11:06:16 AM
If you are not speaking in jest (I strongly suspect you are), knowledge of linear algebra is one of the biggest bangs for the buck one can get as an investment in mathematical knowledge.

So humble and basic a field. So wide its consequences and scope.
by srean
3/19/2026 at 11:16:10 AM
Their point was that "familiarity" apparently means different things to different people :P Someone using linalg in computer graphics applications may say they're familiar with it even though they've never heard the term "eigenvector". I'm not actually sure what you mean – how does repeated multiplication reveal eigenvectors?
by Sharlin
3/19/2026 at 11:23:54 AM
Consider a diagonalizable matrix A, for example a real symmetric matrix. Start with any vector b and keep multiplying it by A:

  A A A ... A b

The vector that the result converges to is a scaled version of one of the eigenvectors of A. But which one? The one with the largest eigenvalue (in magnitude) among all eigenvectors not orthogonal to b.
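A minimal power-iteration sketch of the above (the matrix and starting vector are made up for illustration):

```python
import numpy as np

# Repeatedly multiply a vector by a symmetric matrix and normalize;
# the result converges to the eigenvector with the largest |eigenvalue|
# among those not orthogonal to the starting vector.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric; eigenvalues 3 and 1
b = np.array([1.0, 0.0])     # any start not orthogonal to [1, 1]

for _ in range(50):
    b = A @ b
    b = b / np.linalg.norm(b)   # normalization plays the role of the scaling

# Converges to the unit eigenvector for eigenvalue 3, i.e. [1, 1] / sqrt(2).
assert np.allclose(b, np.ones(2) / np.sqrt(2))
```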
by srean
3/19/2026 at 2:31:30 PM
Ah… that "diagonalizable" is doing some heavy lifting there! I was wondering how exactly you’re going to make, say, a rotation matrix converge anything to anything that’s not already an eigenvector. And rotation matrices certainly aren’t degenerate! Though apparently non-diagonalizable matrices can be called defective, which is such a dismissive term :( Poor rotation matrices, why are they dissed so?!
by Sharlin
3/19/2026 at 2:40:03 PM
Love them, those rotation matrices.

Take the logarithm of the eigenvalues and you get back the angle. This, to me, solidified the notion that angles are essentially a logarithmic notion, made more rigorous by the notion of exponential maps.
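A small numerical illustration of that remark, with an arbitrary angle: a 2D rotation by theta has complex eigenvalues exp(+i*theta) and exp(-i*theta), so the complex logarithm of an eigenvalue recovers i*theta.

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Eigenvalues are exp(+/- i*theta); their logs have imaginary part +/- theta,
# i.e. the angle lives in the exponent.
eigvals = np.linalg.eigvals(R)
angles = np.log(eigvals).imag
assert np.allclose(sorted(angles), [-theta, theta])
```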
by srean
3/19/2026 at 11:15:12 AM
My first sentence was in jest. I've used LA for various things, but haven't had many dealings with eigenvectors. So that information was genuinely new to me.

My expression of gratitude was sincere.
by CoastalCoder
3/19/2026 at 11:27:01 AM
Understood, and thanks for the opportunity to share together in the joy of something so amusing.
by srean
3/19/2026 at 1:53:40 PM
You're doing this multiple times, but "it's" can only mean "it is" or "it has".
by riffic
3/19/2026 at 1:58:12 PM
Thanks for the heads-up. I meant 'its'.

Phone autocorrect always interferes and I get tired and lazy about correcting it back. It does get it right most of the time.
by srean
3/19/2026 at 11:08:15 AM
Yeah, I don't think this was revealed in my undergrad linalg course, nor during all my years of using linalg in computer graphics =D
by Sharlin
3/19/2026 at 2:24:56 PM
I remember my professor talking about eigenvectors in Linear Algebra, and it's been 50 years, though I barely remember anything else from that class. They were taught very early on in the course and eventually we used them all the time to solve problems.
by stevenwoo
3/19/2026 at 2:29:56 PM
Yes, I was taught about eigenvectors but not that they’re a fixpoint of matmul. At least I don’t think so.by Sharlin
3/19/2026 at 3:11:15 PM
Obligatory XKCD: https://xkcd.com/2501/
by UncleSlacky