Algebra → Linear Algebra · Lesson 11

Inner Products, Orthogonality & Gram-Schmidt

Orthogonality generalizes perpendicularity to any vector space. Orthonormal bases simplify calculations enormously — and Gram-Schmidt is the algorithm that builds them.

Key Concepts

Inner Products

An inner product ⟨u,v⟩ generalizes the dot product. It must satisfy three axioms: linearity (in the first argument), symmetry, and positive-definiteness. In ℝⁿ the standard inner product is ⟨u,v⟩ = u·v = Σᵢ uᵢvᵢ, and every inner product induces a norm ‖v‖ = √⟨v,v⟩.
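A quick NumPy sketch of the standard inner product on ℝⁿ and the norm it induces (the vectors here are arbitrary examples):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

# Standard inner product on R^n: <u, v> = sum_i u_i * v_i
ip = np.dot(u, v)                # 1*3 + 2*0 + 2*4 = 11
# Induced norm: ||v|| = sqrt(<v, v>)
norm_v = np.sqrt(np.dot(v, v))   # sqrt(9 + 0 + 16) = 5

# Symmetry and positive-definiteness hold for the dot product:
assert np.dot(u, v) == np.dot(v, u)
assert np.dot(v, v) > 0
```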

Orthogonal & Orthonormal Sets

A set of vectors is orthogonal if ⟨uᵢ,uⱼ⟩ = 0 whenever i ≠ j; it is orthonormal if, in addition, each ‖uᵢ‖ = 1. An orthonormal basis {e₁,...,eₙ} makes coordinates trivial to compute: cᵢ = ⟨v,eᵢ⟩.
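The coordinate formula cᵢ = ⟨v,eᵢ⟩ can be checked directly. Here the orthonormal basis is the standard basis of ℝ² rotated by 45° (an illustrative choice):

```python
import numpy as np

# An orthonormal basis of R^2 (standard basis rotated by 45 degrees)
e1 = np.array([1.0, 1.0]) / np.sqrt(2)
e2 = np.array([1.0, -1.0]) / np.sqrt(2)

v = np.array([3.0, 1.0])

# With an orthonormal basis, coordinates are just inner products: c_i = <v, e_i>
c1, c2 = np.dot(v, e1), np.dot(v, e2)

# Reconstruction: v = c1*e1 + c2*e2 — no linear system to solve
v_rebuilt = c1 * e1 + c2 * e2
assert np.allclose(v_rebuilt, v)
```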

Gram-Schmidt Process

Given linearly independent vectors v₁,...,vₙ, Gram-Schmidt builds an orthonormal set: set u₁ = v₁/‖v₁‖; then for each k > 1, subtract from vₖ its projections onto the previous uᵢ, and normalize the result.
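The steps above can be sketched in a few lines. This uses the modified (numerically stabler) variant, which subtracts each projection from the running remainder; the helper name `gram_schmidt` is our own:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors (modified Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = v.astype(float)                  # working copy of v_k
        for u in basis:
            w = w - np.dot(w, u) * u         # subtract projection onto earlier u_i
        basis.append(w / np.linalg.norm(w))  # normalize
    return basis

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
us = gram_schmidt(vs)
Q = np.column_stack(us)
assert np.allclose(Q.T @ Q, np.eye(3))       # columns are orthonormal
```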

Orthogonal Projections

Projection onto a single vector: proj_v(u) = (⟨u,v⟩/⟨v,v⟩)v. Onto a subspace W with orthonormal basis {e₁,...,eₖ}: proj_W(v) = Σᵢ⟨v,eᵢ⟩eᵢ, the closest point to v in W. Projections underpin least-squares regression, PCA, and signal processing.
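A small check of the subspace formula, and of the fact that makes least squares work: the residual v − proj_W(v) is orthogonal to W. The plane W here is an arbitrary example:

```python
import numpy as np

# Orthonormal basis for a plane W in R^3
e1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
e2 = np.array([0.0, 0.0, 1.0])

v = np.array([1.0, 2.0, 3.0])

# proj_W(v) = sum_i <v, e_i> e_i  — the closest point to v inside W
proj = np.dot(v, e1) * e1 + np.dot(v, e2) * e2   # [1.5, 1.5, 3.0]

# The residual is orthogonal to W (the key fact behind least squares)
r = v - proj
assert abs(np.dot(r, e1)) < 1e-12 and abs(np.dot(r, e2)) < 1e-12
```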

Live Python Practice

Interactive Lab

[Interactive widget: enter components for v₁ and v₂. Gray = original vectors; blue = e₁ (normalized v₁); orange = projection of v₂ onto e₁; red = e₂ (orthogonalized v₂). The result is an orthonormal basis.]
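The lab's computation can be reproduced offline; the sample inputs here stand in for whatever you type into the widget:

```python
import numpy as np

v1 = np.array([2.0, 0.0])   # sample input for v1
v2 = np.array([1.0, 1.0])   # sample input for v2

e1 = v1 / np.linalg.norm(v1)              # blue: normalized v1
p  = np.dot(v2, e1) * e1                  # orange: projection of v2 onto e1
w2 = v2 - p                               # remove the e1-component of v2
e2 = w2 / np.linalg.norm(w2)              # red: orthogonalized, normalized v2

assert np.isclose(np.dot(e1, e2), 0.0)    # orthogonal
assert np.isclose(np.linalg.norm(e1), 1.0)
assert np.isclose(np.linalg.norm(e2), 1.0)
```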

Check Your Understanding

Two vectors are orthogonal when their dot product is:

Gram-Schmidt produces vectors that are:

proj_v(u) = (u·v / v·v)v represents:

← Lesson 10 Lesson 12 →