Prerequisites: Matrix multiplication (ch154), matrix representation (ch152)
You will learn:
The formal definition of linearity and why it matters
Kernel (null space) and image (range) of a transformation
The rank-nullity theorem
Why every linear map is a matrix (and vice versa)
Environment: Python 3.x, numpy, matplotlib
# --- Linear Transformations: Kernel, Image, Rank-Nullity ---
import numpy as np
import matplotlib.pyplot as plt
plt.style.use('seaborn-v0_8-whitegrid')
def is_linear(T, n_tests=100, n=3, tol=1e-10):
    """
    Test whether a function T: R^n → R^m is linear by checking
    T(a*u + b*v) == a*T(u) + b*T(v) for random u, v, a, b.
    """
    np.random.seed(0)
    for _ in range(n_tests):
        u = np.random.randn(n)
        v = np.random.randn(n)
        a, b = np.random.randn(2)
        lhs = T(a*u + b*v)
        rhs = a*T(u) + b*T(v)
        if not np.allclose(lhs, rhs, atol=tol):
            return False
    return True
# Linear function
def T_linear(v): return np.array([2*v[0]-v[1], v[0]+v[2], 3*v[1]])
# Non-linear function (has constant term — affine, not linear)
def T_affine(v): return np.array([2*v[0]+1, v[1], v[2]])
# Non-linear (quadratic)
def T_quad(v): return np.array([v[0]**2, v[1], v[2]])
print(f"T_linear is linear: {is_linear(T_linear)}")
print(f"T_affine is linear: {is_linear(T_affine)} (has constant offset — AFFINE not linear)")
print(f"T_quad is linear: {is_linear(T_quad)} (nonlinear)")
print()
# Kernel and Image
A = np.array([[1.,2.,3.],[4.,5.,6.],[2.,4.,6.]]) # rank 2
U, S, Vt = np.linalg.svd(A)
rank = np.sum(S > 1e-9)
null_dim = A.shape[1] - rank
print(f"A shape: {A.shape}, rank: {rank}")
print(f"Rank-nullity theorem: rank + nullity = {rank} + {null_dim} = {rank+null_dim} = n = {A.shape[1]}")
# Null space basis (kernel): right singular vectors with S≈0
null_basis = Vt[rank:].T
print(f"Null space basis (columns):\n{null_basis}")
print(f"A @ null_basis ≈ 0: {np.allclose(A @ null_basis, 0, atol=1e-10)}")
# Image basis (column space): left singular vectors with S>0
img_basis = U[:, :rank]
print(f"Image (column space) basis:\n{img_basis}")

T_linear is linear: True
T_affine is linear: False (has constant offset — AFFINE not linear)
T_quad is linear: False (nonlinear)
A shape: (3, 3), rank: 2
Rank-nullity theorem: rank + nullity = 2 + 1 = 3 = n = 3
Null space basis (columns):
[[-0.40824829]
[ 0.81649658]
[-0.40824829]]
A @ null_basis ≈ 0: True
Image (column space) basis:
[[-0.30840629 0.3238604 ]
[-0.72417387 -0.68961744]
[-0.61681259 0.6477208 ]]
4. Mathematical Formulation
T: ℝⁿ → ℝᵐ is linear iff:
T(u + v) = T(u) + T(v) [additivity]
T(αv) = αT(v) [homogeneity]
Combined: T(αu + βv) = αT(u) + βT(v)
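One immediate consequence of homogeneity (take α = 0): every linear map sends the zero vector to zero. That gives the quickest test for spotting affine maps like T_affine above. A minimal check, reusing the chapter's definitions:

```python
import numpy as np

def T_linear(v):
    return np.array([2*v[0] - v[1], v[0] + v[2], 3*v[1]])

def T_affine(v):
    # Affine map from the chapter: constant offset +1 in the first coordinate
    return np.array([2*v[0] + 1, v[1], v[2]])

zero = np.zeros(3)
print(T_linear(zero))  # [0. 0. 0.] — consistent with linearity
print(T_affine(zero))  # [1. 0. 0.] — T(0) ≠ 0, so T_affine cannot be linear
```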
Kernel (null space): ker(T) = {v ∈ ℝⁿ : T(v) = 0}
Image (range): im(T) = {T(v) : v ∈ ℝⁿ} ⊆ ℝᵐ
Rank-Nullity Theorem: dim(ker(T)) + dim(im(T)) = n
nullity + rank = n
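Note that n here is the input dimension, which matters for non-square maps. A short sketch with a map from ℝ⁴ to ℝ² (an illustrative matrix, not from the chapter's code):

```python
import numpy as np

# A full-rank 2x4 matrix: a linear map R^4 -> R^2
A = np.array([[1., 0., 2., 0.],
              [0., 1., 0., 3.]])
rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank  # nullity is measured against the INPUT dimension
print(rank, nullity, rank + nullity)  # 2 2 4 — rank + nullity = n = 4
```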
Theorem: Every linear map T: ℝⁿ → ℝᵐ is uniquely represented
by the m×n matrix [T] = [T(e₁)|...|T(eₙ)].
7. Exercises
Easy 1. Is T(v) = v/||v|| linear? What about T(v) = 2v - [1,0]?
Easy 2. For A = [[1,2],[2,4]], what is the dimension of the kernel? The image?
Medium 1. Implement null_space(A) using SVD and verify it for 5 different matrices. Check that A @ null_space(A) ≈ 0.
Medium 2. Show that the composition of two linear maps is linear. Implement compose(T1, T2) that returns the composed function and verify linearity.
Hard. Prove the rank-nullity theorem numerically: for 20 random matrices of shape (m, n), verify that rank(A) + nullity(A) = n always holds, where nullity is computed from SVD.
9. Chapter Summary & Connections
T is linear iff T(αu+βv) = αT(u)+βT(v).
kernel = null space (vectors mapping to 0); image = column space (all reachable outputs).
Rank-Nullity: rank + nullity = input dimension.
Every linear map has a unique matrix representation (given basis).
Forward connections:
In ch165 (Matrix Transformations Visualization), we visualize what different kernel and image dimensions look like geometrically.
In ch169 (Eigenvectors Intuition), eigenvectors are special vectors where T(v) is parallel to v — a deep structural property of the linear map.