Eigenvectors and eigenvalues (the Simple English Wikipedia page is a gentler alternative introduction) come up constantly in linear algebra, data science, and machine learning.

However, these are abstract terms, and it can be difficult to understand why they are useful and what they *really* mean.

This forum post catalogs helpful resources for demystifying these eigenthings and discusses common points of confusion around them.

Here are some resources on the topic I have found useful:

- Eigenvectors and Eigenvalues - Explained Visually, an interactive visualization for exploring eigenvectors and eigenvalues
- An explanation in the context of principal component analysis (PCA), told as a fun story of a big family dinner where PCA is explained at varying levels of complexity, from a grandmother down to a young kid
- Eigenvectors and eigenvalues | Essence of linear algebra, chapter 13 (3Blue1Brown): excellent animations and explanation (with minimal math) of what these eigenpairs are (the video suggests watching the earlier videos in the series first for full context if you are new)
- Answer to “How to intuitively understand eigenvalue and eigenvectors?”, which suggests that “Eigenpairs are a lot like the roots of a polynomial” because, like roots, eigenpairs show up in a huge range of applications
- Answer to “What is the importance of eigenvalues/eigenvectors?”, whose short answer is “Eigenvectors make understanding linear transformations easy.”
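For readers who want to poke at eigenpairs directly, here is a minimal sketch (using NumPy, an assumption on my part since the original resources don't prescribe a tool) that computes them and checks the defining property $A v = \lambda v$:

```python
import numpy as np

# A small symmetric matrix: it stretches the [1, 1] direction by 3
# and leaves the [1, -1] direction unchanged.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# *columns* are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(sorted(eigenvalues))  # the eigenvalues of this matrix are 1 and 3
```

Seeing that `A @ v` is just a scaled copy of `v` is exactly the "eigenvectors make linear transformations easy" intuition from the answer above.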

I haven’t looked through these yet, but they look promising:

- Why is the eigenvector of a covariance matrix equal to a principal component?
- A question on the history and motivation of eigenvectors
- “A Tutorial on Principal Component Analysis” by Jonathon Shlens (Google Research) (PDF)
- 4.5 - Eigenvalues and Eigenvectors (PennState STAT 505 Applied Multivariate Statistical Analysis)
- Chapter 6 Eigenvalues and Eigenvectors (Gilbert Strang) (PDF)
- Paul’s Online Notes on Eigenvalues & Eigenvectors
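The first question in that list, why the eigenvectors of a covariance matrix are the principal components, can be illustrated numerically. This is a sketch under my own assumptions (synthetic correlated data, NumPy), not taken from any of the linked resources:

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated 2-D data: the second coordinate is mostly the first plus
# noise, so the direction of greatest variance lies roughly along y = x.
x = rng.normal(size=500)
data = np.column_stack([x, x + 0.3 * rng.normal(size=500)])

# Center the data, then eigendecompose its covariance matrix.
# np.linalg.eigh is the right choice here: covariance matrices are
# symmetric, and eigh returns eigenvalues in ascending order.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# The eigenvector paired with the largest eigenvalue is the first
# principal component: the direction of maximum variance.
pc1 = eigenvectors[:, -1]
# pc1 should point roughly along [1, 1] (up to sign), matching how
# the data was generated.
```

Each eigenvalue is the variance of the data along its eigenvector, which is why sorting eigenvalues tells you how many components are worth keeping.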