Disclaimer: I'm a bit rusty, so correct me if I'm wrong. Math is a fun hobby, and a while back I wondered how people were doing complex matrix computations before we had computers or even personal calculators in the late 60s/early 70s. (Those earlier "personal calculators" don't count; they were enormous and didn't use solid-state/integrated circuits until the 60s, so they weren't handheld/portable.)
Meet classical matrix algebra: calculating eigenvalues the way they did 100 years ago and beyond. Fucking wild. From Modern Algebra and Matrices, 1951:
[Attached: two scanned pages from Modern Algebra and Matrices (1951)]
They were just manually transforming everything and checking it all by hand. We don't appreciate how much math we outsource to computers today, or how much of what we learn now has been simplified; past curricula are fucking insane to look at.
We built the atomic bomb and the first computers with this kind of math. It's amazing!
I don't know exactly which algorithm is being used here, since there are many and I'm not entirely familiar with the notation or theorems in the book, but the one I recall is: obtain the characteristic polynomial of the matrix, then find its roots to get all the eigenvalues. The characteristic polynomial is simply the determinant of the matrix λI - A.

To compute such determinants in a reasonable amount of time, you can use Gaussian elimination to reduce A to an upper triangular matrix B. Swapping two rows flips the sign of the determinant, and multiplying a row by a scalar a multiplies the determinant by a. Therefore

det(A) = det(B) / ((-1)^n · a_1 · a_2 · ... · a_k),

where det(B) is just the product of B's diagonal entries, n is the number of row swaps you performed, and a_1, ..., a_k are the scalars you multiplied rows by along the way.
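Here's a minimal sketch in Python of that determinant bookkeeping (my own illustration, obviously not from the book). One simplification: this variant eliminates by adding multiples of one row to another, which leaves the determinant unchanged, so only the sign flips from row swaps need correcting; if you also scaled rows by a_1, ..., a_k, you'd divide by their product exactly as described above.

```python
def determinant(A):
    """Determinant via Gaussian elimination with partial pivoting."""
    B = [row[:] for row in A]      # work on a copy
    n = len(B)
    swaps = 0
    for col in range(n):
        # Pick the row with the largest entry in this column as the pivot.
        pivot = max(range(col, n), key=lambda r: abs(B[r][col]))
        if abs(B[pivot][col]) < 1e-12:
            return 0.0             # column of zeros: matrix is singular
        if pivot != col:
            B[col], B[pivot] = B[pivot], B[col]
            swaps += 1             # each swap flips the determinant's sign
        # Zero out entries below the pivot. Adding a multiple of one row
        # to another leaves the determinant unchanged.
        for r in range(col + 1, n):
            factor = B[r][col] / B[col][col]
            for c in range(col, n):
                B[r][c] -= factor * B[col][c]
    det = float((-1) ** swaps)
    for i in range(n):
        det *= B[i][i]             # det(B) = product of the diagonal
    return det

# Sanity check: the eigenvalues of [[2, 1], [1, 2]] are 1 and 3,
# so det(λI - A) should vanish at λ = 1 and λ = 3.
A = [[2.0, 1.0], [1.0, 2.0]]
for lam in (1.0, 3.0):
    shifted = [[lam * (i == j) - A[i][j] for j in range(2)] for i in range(2)]
    print(lam, determinant(shifted))  # both print values near 0
```

Of course, doing this by hand on λI - A means carrying λ through every elimination step symbolically, which gives you a sense of just how much grinding those pre-computer calculations took.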