See how linear algebra powers the technologies you use every day — from image compression and search engines to 3D graphics and machine learning. Build your own mini principal component analysis.
Key Concepts
Image Compression (SVD)
Singular Value Decomposition A = UΣVᵀ. The singular values σᵢ tell you how important each component is. Keep only the k largest → compressed approximation. JPEG uses a related transform-coding technique (the discrete cosine transform).
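To make the idea concrete, here is a minimal by-hand sketch for a 2×2 matrix (the matrix values are made up for illustration): the squared singular values are the eigenvalues of AᵀA, and keeping only σ₁ gives the best rank-1 approximation.

```python
# Rank-1 SVD approximation of a 2x2 matrix, by hand (no NumPy).
import math

A = [[3.0, 1.0],
     [1.0, 3.0]]  # illustrative example matrix

# A^T A -- its eigenvalues are the squared singular values
ata = [[A[0][0]**2 + A[1][0]**2,            A[0][0]*A[0][1] + A[1][0]*A[1][1]],
       [A[0][0]*A[0][1] + A[1][0]*A[1][1],  A[0][1]**2 + A[1][1]**2]]

trace = ata[0][0] + ata[1][1]
det = ata[0][0]*ata[1][1] - ata[0][1]**2
disc = math.sqrt(trace**2 - 4*det)
s1 = math.sqrt((trace + disc) / 2)  # largest singular value
s2 = math.sqrt((trace - disc) / 2)  # smallest singular value

# v1: unit eigenvector of A^T A for the largest eigenvalue
lam1 = (trace + disc) / 2
vx, vy = ata[0][1], lam1 - ata[0][0]
norm = math.hypot(vx, vy)
v1 = (vx / norm, vy / norm)

# u1 = A v1 / s1
u1 = ((A[0][0]*v1[0] + A[0][1]*v1[1]) / s1,
      (A[1][0]*v1[0] + A[1][1]*v1[1]) / s1)

# rank-1 approximation: s1 * u1 * v1^T  (this is the "compressed" matrix)
A1 = [[s1*u1[0]*v1[0], s1*u1[0]*v1[1]],
      [s1*u1[1]*v1[0], s1*u1[1]*v1[1]]]

print(f"singular values: {s1:.3f}, {s2:.3f}")
print("rank-1 approximation:")
for row in A1:
    print(f"  [{row[0]:.3f}, {row[1]:.3f}]")
```

For an image, the same idea applies to a large matrix of pixel values: storing only the top k singular triplets takes far less space than the full matrix.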
PageRank & Markov Chains
Google's PageRank is the dominant eigenvector of a web link matrix. Markov chains model random walks on graphs — the steady state is an eigenvector with eigenvalue 1.
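A tiny sketch of the idea: repeatedly multiplying a rank vector by the (damped) link matrix converges to its dominant eigenvector. The three-page link graph and damping factor below are made up for illustration.

```python
# Tiny PageRank via power iteration on a 3-page link graph.
# links[i] = list of pages that page i links to (illustrative graph)
links = {0: [1, 2], 1: [2], 2: [0]}
n = 3
damping = 0.85           # standard PageRank damping factor
rank = [1.0 / n] * n     # start from a uniform distribution

for _ in range(100):     # power iteration
    new = [(1 - damping) / n] * n
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            new[target] += share
    rank = new

print("PageRank:", [f"{r:.3f}" for r in rank])
```

The ranks sum to 1 (a probability distribution over pages), and the fixed point satisfies rank = M·rank for the damped link matrix M, i.e. it is an eigenvector with eigenvalue 1.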
3D Graphics
Every 3D rotation, scaling, and projection is a matrix. A 4×4 homogeneous matrix encodes all 3D transformations. GPU pipelines multiply millions of vertices by transformation matrices per frame.
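A minimal sketch of how a single 4×4 homogeneous matrix combines rotation and translation (the angle and translation below are arbitrary example values):

```python
# Applying a 4x4 homogeneous matrix to a 3D point (pure Python sketch).
import math

def mat_vec(m, v):
    """Multiply a 4x4 matrix by a 4-component column vector."""
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

theta = math.pi / 2  # rotate 90 degrees about the z-axis
c, s = math.cos(theta), math.sin(theta)

# rotation about z combined with a translation of (1, 0, 0)
transform = [
    [c, -s, 0, 1],
    [s,  c, 0, 0],
    [0,  0, 1, 0],
    [0,  0, 0, 1],
]

point = [1.0, 0.0, 0.0, 1.0]  # the point (1,0,0) in homogeneous coords (w=1)
x, y, z, w = mat_vec(transform, point)
print(f"transformed point: ({x:.2f}, {y:.2f}, {z:.2f})")
```

The extra fourth coordinate (w=1) is what lets translation, which is not a linear map in 3D, be expressed as a matrix multiply; a GPU applies exactly this operation to every vertex.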
Machine Learning
Neural network weights are matrices. Gradient descent updates them. PCA uses eigendecomposition to find the directions of greatest variance in data. SVD powers recommendation systems.
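Gradient descent in miniature: fitting a single weight w in the model y ≈ w·x by repeatedly stepping against the gradient of the mean squared error. The toy data and learning rate are made up for illustration; real networks do the same thing with weight matrices instead of one scalar.

```python
# One-parameter gradient descent: fit y ~ w*x to toy data.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # roughly y = 2x
w = 0.0     # initial weight
lr = 0.05   # learning rate (illustrative choice)

for _ in range(200):
    # gradient of mean squared error (w*x - y)^2 with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # step downhill

print(f"learned w = {w:.3f}")  # converges to the least-squares slope
```

The closed-form least-squares answer here is Σxy / Σx² ≈ 2.036, and the iteration converges to it.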
Live Python Practice
# Mini PCA: find direction of greatest variance
import math
# 2D dataset
data = [(1,2),(2,3),(3,3),(4,5),(5,4),(6,7),(7,6),(8,8)]
# Center the data
mx = sum(x for x,y in data)/len(data)
my = sum(y for x,y in data)/len(data)
centered = [(x-mx, y-my) for x,y in data]
print(f"Centered data (mean removed, mx={mx:.2f}, my={my:.2f}):")
for i, (x,y) in enumerate(centered):
    print(f"  point {i+1}: ({x:.2f}, {y:.2f})")
# Covariance matrix
n = len(centered)
cxx = sum(x*x for x,y in centered) / (n-1)
cxy = sum(x*y for x,y in centered) / (n-1)
cyy = sum(y*y for x,y in centered) / (n-1)
print(f"\nCovariance matrix:")
print(f" [[{cxx:.3f}, {cxy:.3f}]")
print(f" [{cxy:.3f}, {cyy:.3f}]]")
# Eigenvalues of 2x2 cov matrix
trace = cxx + cyy
det = cxx*cyy - cxy**2
disc = trace**2 - 4*det
l1 = (trace + math.sqrt(disc)) / 2
l2 = (trace - math.sqrt(disc)) / 2
print(f"\nEigenvalues: λ₁={l1:.3f} ({l1/(l1+l2)*100:.1f}% variance), λ₂={l2:.3f}")
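The eigenvalues tell us how much variance each component captures, but not the direction. As a self-contained extension of the mini-PCA above (same dataset, same covariance computation), we can recover the unit eigenvector for λ₁, which is the PC1 direction shown in the lab below:

```python
# Extension of the mini-PCA: recover the eigenvector for the largest
# eigenvalue, i.e. the direction of Principal Component 1.
import math

data = [(1,2),(2,3),(3,3),(4,5),(5,4),(6,7),(7,6),(8,8)]
mx = sum(x for x, y in data) / len(data)
my = sum(y for x, y in data) / len(data)
centered = [(x - mx, y - my) for x, y in data]

n = len(centered)
cxx = sum(x*x for x, y in centered) / (n - 1)
cxy = sum(x*y for x, y in centered) / (n - 1)
cyy = sum(y*y for x, y in centered) / (n - 1)

trace, det = cxx + cyy, cxx*cyy - cxy**2
l1 = (trace + math.sqrt(trace**2 - 4*det)) / 2  # largest eigenvalue

# (C - l1*I) v = 0  =>  v is proportional to (cxy, l1 - cxx)
vx, vy = cxy, l1 - cxx
norm = math.hypot(vx, vy)
vx, vy = vx / norm, vy / norm  # normalize to unit length
print(f"PC1 direction: ({vx:.3f}, {vy:.3f})")
```

Projecting the centered points onto this unit vector gives their PC1 coordinates, which is the 1D compression of the 2D dataset.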
Interactive Lab
Blue dots = data cloud. Red line = Principal Component 1 (direction of max variance). This is what PCA computes using eigenvalues of the covariance matrix.
Check Your Understanding
Singular Value Decomposition (SVD) is used in image compression because: