Linear Transformations
Linear transformations are functions between vector spaces that preserve structure. Every matrix IS a linear transformation — and every linear transformation between finite-dimensional spaces is represented by a matrix once bases are chosen.
Key Concepts
Definition
T: ℝⁿ → ℝᵐ is linear if T(u+v) = T(u)+T(v) and T(cv) = cT(v) for all vectors u,v and scalars c. In matrix form: T(x) = Ax where A is the matrix representation.
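The two defining properties can be checked numerically. This is a minimal sketch using NumPy, with an arbitrary matrix A chosen for illustration:

```python
import numpy as np

# Hypothetical matrix A defining the linear map T(x) = A @ x.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def T(x):
    return A @ x

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
c = 4.0

# Additivity: T(u + v) = T(u) + T(v)
assert np.allclose(T(u + v), T(u) + T(v))
# Homogeneity: T(c·v) = c·T(v)
assert np.allclose(T(c * v), c * T(v))
```

Any map of the form x ↦ Ax passes both checks for every choice of u, v, and c; a map like x ↦ x + 1 would fail them.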
Standard Transformations in 2D
Rotation by θ: [[cos θ, −sin θ],[sin θ, cos θ]]. Reflection over x-axis: [[1,0],[0,−1]]. Scaling: [[sx,0],[0,sy]]. Shear: [[1,k],[0,1]].
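As a quick sanity check on the rotation matrix above, rotating the unit vector (1, 0) by 90° counterclockwise should land on (0, 1). A short sketch:

```python
import numpy as np

theta = np.pi / 2  # 90 degrees counterclockwise

# Rotation matrix [[cos θ, −sin θ], [sin θ, cos θ]]
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# (1, 0) rotated 90° CCW should become (0, 1), up to floating-point error.
result = R @ np.array([1.0, 0.0])
assert np.allclose(result, [0.0, 1.0])
```

The same pattern works for the reflection, scaling, and shear matrices: build the 2×2 array and multiply it against a test vector.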
Kernel & Image
Kernel (null space): all x where T(x) = 0. Image (range/column space): all possible outputs T(x). Rank-Nullity: rank(A) + nullity(A) = n (number of columns).
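Rank-Nullity can be verified directly. Here is a sketch with a hypothetical 3×4 matrix whose third row is the sum of the first two, so its rank is 2:

```python
import numpy as np

# Hypothetical matrix: row 3 = row 1 + row 2, so rank(A) = 2.
A = np.array([[1, 2, 0, 1],
              [0, 1, 1, 0],
              [1, 3, 1, 1]])

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank  # n − rank, where n = number of columns

# Rank-Nullity: rank(A) + nullity(A) = n
assert rank + nullity == A.shape[1]
assert (rank, nullity) == (2, 2)
```

Here the image is a 2-dimensional subspace of ℝ³, and the kernel is a 2-dimensional subspace of ℝ⁴.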
Composition & Inverses
Composing T₁ then T₂: matrix product T₂T₁. Inverse transformation T⁻¹ exists iff A is square with det(A) ≠ 0. T⁻¹T = I (identity).
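The order of composition and the inverse condition can both be checked in code. A sketch with two example transformations (a scaling T₁ and a 90° rotation T₂, chosen for illustration):

```python
import numpy as np

T1 = np.array([[2.0, 0.0],   # scale x by 2
               [0.0, 1.0]])
T2 = np.array([[0.0, -1.0],  # rotate 90° CCW
               [1.0,  0.0]])

# Applying T1 first, then T2, equals multiplying by the product T2 @ T1.
x = np.array([1.0, 1.0])
assert np.allclose(T2 @ (T1 @ x), (T2 @ T1) @ x)

# The inverse exists because det(A) ≠ 0; verify T⁻¹T = I.
A = T2 @ T1
assert abs(np.linalg.det(A)) > 1e-12
A_inv = np.linalg.inv(A)
assert np.allclose(A_inv @ A, np.eye(2))
```

Note the order: T₂T₁ means "do T₁ first" — matrix products apply right-to-left.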
Check Your Understanding
A linear transformation T satisfies:
The rotation matrix by 90° counterclockwise is:
The kernel of T is: