Chapter 59 — Quadratic Functions

Prerequisites: ch058 (Linear Functions), ch056 (Visualizing Functions)

You will learn:

  • Identify vertex, axis of symmetry, and zeros of a parabola

  • Derive the quadratic formula and implement it robustly

  • Understand discriminant as the geometry of roots

  • Connect to optimization: the vertex is the minimum or maximum

Environment: Python 3.x, numpy, matplotlib


1. Concept

A quadratic function has the form f(x) = ax² + bx + c, where a ≠ 0.

The graph is a parabola: U-shaped (a > 0, opens up) or ∩-shaped (a < 0, opens down).

Key features:

  • Vertex: the single extremum (minimum if a>0, maximum if a<0). Location: x = -b/(2a), y = f(-b/(2a))

  • Axis of symmetry: x = -b/(2a) — the parabola is symmetric about this vertical line

  • Zeros (roots): where ax²+bx+c = 0, found by the quadratic formula

  • Discriminant: Δ = b²-4ac determines the number of real roots (Δ>0: two, Δ=0: one, Δ<0: none)
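
All of these features can be read directly off the coefficients. A minimal sketch (the function name `features` is illustrative, not from a library):

```python
# Key features of f(x) = ax² + bx + c, read directly from the coefficients
def features(a, b, c):
    xv = -b / (2 * a)             # axis of symmetry / vertex x
    yv = a * xv**2 + b * xv + c   # vertex y
    disc = b**2 - 4 * a * c       # discriminant Δ
    n_roots = 2 if disc > 0 else (1 if disc == 0 else 0)
    return xv, yv, disc, n_roots

print(features(1, -2, -3))  # (1.0, -4.0, 16, 2): vertex (1, -4), two real roots
```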

Why it matters: The quadratic is the simplest non-linear function. Loss landscapes near minima look locally quadratic — that’s why quadratic approximations are the basis of second-order optimization methods.


2. Intuition & Mental Models

Physical analogy: Projectile motion under gravity. Height as a function of time is quadratic: h(t) = h₀ + v₀t - ½gt². The vertex is the peak height.
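
The peak of the trajectory is the vertex computation in disguise: with a = -g/2 and b = v₀, the vertex sits at t = v₀/g. A quick numeric check (the launch values are arbitrary):

```python
# Peak of h(t) = h0 + v0*t - 0.5*g*t²: a quadratic in t with a = -g/2, b = v0
h0, v0, g = 0.0, 20.0, 9.81
t_peak = v0 / g                                   # vertex: t = -b/(2a) = v0/g
h_peak = h0 + v0 * t_peak - 0.5 * g * t_peak**2   # equals h0 + v0²/(2g)
print(t_peak, h_peak)
```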

Computational analogy: Many cost functions in ML are locally quadratic near their minima. The vertex computation x = -b/(2a) is a Newton step in disguise: for an exact quadratic, the update x₀ - f′(x₀)/f′′(x₀) lands on the minimum from any starting point, which is why second-order methods converge so quickly near a minimum.
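
A tiny sketch of that claim: on a quadratic, a single Newton step reaches the vertex exactly, no matter where it starts.

```python
# One Newton step on a quadratic lands exactly on the vertex, from any start
a, b, c = 2.0, -4.0, 1.0          # f(x) = 2x² - 4x + 1, vertex at x = 1
steps = []
for x0 in (-10.0, 0.0, 5.0):
    grad = 2 * a * x0 + b          # f'(x0)
    curv = 2 * a                   # f''(x0), constant for a quadratic
    steps.append(x0 - grad / curv) # Newton step
print(steps)  # [1.0, 1.0, 1.0]
```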

Quadratic functions are the first family whose graph has a genuine extremum, which makes them the simplest setting in which the core idea of optimization appears.


3. Visualization

# --- Visualization: Parabolas — vertex, roots, discriminant ---
import numpy as np
import matplotlib.pyplot as plt
plt.style.use('seaborn-v0_8-whitegrid')

def quadratic(a, b, c, x):
    return a*x**2 + b*x + c

def vertex(a, b):
    # x-coordinate of the vertex: the axis of symmetry x = -b/(2a)
    xv = -b / (2*a)
    return xv

x = np.linspace(-5, 5, 500)
fig, axes = plt.subplots(1, 3, figsize=(15, 5))

# Three cases of discriminant
cases = [
    (1, -2, -3, 'Δ>0: Two roots'),
    (1, -2,  1, 'Δ=0: One root (tangent)'),
    (1, -2,  5, 'Δ<0: No real roots'),
]
for ax, (a, b, c, label) in zip(axes, cases):
    y = quadratic(a, b, c, x)
    ax.plot(x, y, color='steelblue', linewidth=2)
    ax.axhline(0, color='black', linewidth=0.8)
    # Vertex
    xv = vertex(a, b)
    yv = quadratic(a, b, c, xv)
    ax.plot(xv, yv, 'r*', markersize=14, zorder=5, label=f'Vertex ({xv:.2f}, {yv:.2f})')
    # Roots if real
    disc = b**2 - 4*a*c
    if disc > 0:
        x1 = (-b - np.sqrt(disc)) / (2*a)
        x2 = (-b + np.sqrt(disc)) / (2*a)
        ax.plot([x1, x2], [0, 0], 'go', markersize=10, zorder=5, label=f'Roots: {x1:.2f}, {x2:.2f}')
    elif disc == 0:
        x1 = -b / (2*a)
        ax.plot(x1, 0, 'go', markersize=10, zorder=5, label=f'Root: {x1:.2f}')
    ax.set_ylim(-10, 15)
    ax.set_title(label + f'\na={a}, b={b}, c={c}')
    ax.set_xlabel('x'); ax.set_ylabel('f(x)')
    ax.legend(fontsize=8)

plt.suptitle('Quadratic Functions: f(x) = ax² + bx + c', fontsize=13, fontweight='bold')
plt.tight_layout()
plt.show()

4. Mathematical Formulation

Quadratic formula: For ax² + bx + c = 0: x = (-b ± √(b²-4ac)) / (2a)
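
The learning objectives promise a derivation; completing the square gives it in three steps:

```latex
ax^2 + bx + c = 0
\;\Rightarrow\; x^2 + \tfrac{b}{a}x = -\tfrac{c}{a}
\;\Rightarrow\; \left(x + \tfrac{b}{2a}\right)^2 = \frac{b^2 - 4ac}{4a^2}
\;\Rightarrow\; x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
```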

Discriminant: Δ = b² - 4ac

  • Δ > 0: two distinct real roots

  • Δ = 0: one repeated real root

  • Δ < 0: no real roots (two complex conjugate roots)

Vertex form: f(x) = a(x - h)² + k, where h = -b/(2a) and k = c - b²/(4a). This form makes the vertex (h, k) explicit.
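
A quick numerical sanity check of the vertex form (coefficients chosen arbitrarily):

```python
import numpy as np

# Check that a(x-h)² + k reproduces ax² + bx + c,
# with h = -b/(2a) and k = c - b²/(4a)
a, b, c = 2.0, -8.0, 6.0
h = -b / (2 * a)                  # h = 2.0
k = c - b**2 / (4 * a)            # k = -2.0
x = np.linspace(-5, 5, 11)
assert np.allclose(a * (x - h)**2 + k, a * x**2 + b * x + c)
print(h, k)  # 2.0 -2.0
```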

5. Implementation

# --- Implementation: Robust quadratic solver ---
import numpy as np

def solve_quadratic(a, b, c):
    """
    Solve ax² + bx + c = 0. Returns real roots only.
    Handles: two roots, one root, no real roots, degenerate (a=0).
    
    Returns:
        list of real roots (0, 1, or 2 elements)
    """
    if a == 0:
        if b == 0:
            return []  # degenerate: no variable
        return [-c / b]  # linear case
    
    disc = b**2 - 4*a*c
    if disc < 0:
        return []  # no real roots
    elif disc == 0:
        return [-b / (2*a)]  # one repeated root
    else:
        sq = np.sqrt(disc)
        # Numerically stable form: compute the larger-magnitude root first
        # (no cancellation, since -b and -copysign(sq, b) share a sign),
        # then recover the other root via Vieta's formula x1*x2 = c/a.
        x1 = (-b - np.copysign(sq, b)) / (2*a)
        x2 = c / (a * x1) if x1 != 0 else (-b + np.copysign(sq, b)) / (2*a)
        return sorted([x1, x2])

# Test cases
for a, b, c in [(1, -5, 6), (1, -2, 1), (1, 0, 1), (2, -4, 2)]:
    roots = solve_quadratic(a, b, c)
    disc = b**2 - 4*a*c
    print(f"  {a}x²+{b}x+{c}=0: roots={[round(r,4) for r in roots]}, Δ={disc}")

6. Experiments

Experiment 1: Plot f(x) = (x-h)² + k for h from -3 to 3 and k from -2 to 2. Observe how h shifts the vertex left/right and k shifts it up/down. Try a negative a (open down).

Experiment 2: Generate 1000 random quadratics (random a, b, c). What fraction have two real roots? One? None? Compare to the theoretical probability for uniform random coefficients.
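
A starting sketch for Experiment 2, assuming coefficients drawn uniformly from [-1, 1] (the answer depends on the chosen distribution):

```python
import numpy as np

# Fraction of random quadratics with two / zero real roots
rng = np.random.default_rng(0)
a, b, c = rng.uniform(-1, 1, size=(3, 100_000))
disc = b**2 - 4 * a * c
two = np.mean(disc > 0)
none = np.mean(disc < 0)
print(f"two roots: {two:.3f}, no real roots: {none:.3f}")
# Note: disc > 0 whenever a and c have opposite signs,
# so the two-root fraction must exceed 1/2 for this distribution.
```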


7. Exercises

Easy 1. Find the vertex and roots of f(x) = 2x² - 8x + 6. (Expected: vertex (2, -2), roots x=1 and x=3)

Easy 2. Write vertex_form(a, b, c) that returns (h, k) such that f(x) = a(x-h)² + k. Verify by expanding.

Medium 1. Implement fit_quadratic(x1, y1, x2, y2, x3, y3) that finds a, b, c such that the parabola passes through three points. Use np.linalg.solve.

Medium 2. The minimum of f(x) = ax² + bx + c (a>0) is at x=-b/(2a). Show this is equivalent to setting the derivative to zero: f’(x) = 2ax + b = 0. Verify numerically using a gradient descent approach.

Hard. A projectile is launched at angle θ with speed v₀. Height is h(t) = v₀ sin(θ) t - ½g t², horizontal position is x(t) = v₀ cos(θ) t. Find the angle that maximizes range. Plot range vs angle and verify the maximum is at θ = 45°.


9. Chapter Summary & Connections

  • f(x) = ax² + bx + c: a controls curvature, vertex at x=-b/(2a)

  • Discriminant Δ = b²-4ac: sign determines number of real roots

  • Quadratic formula gives exact roots

  • Vertex = extremum — the foundation of optimization thinking

Backward connection: Linear functions (ch058) are the degenerate case a=0.

Forward connections:

  • The vertex-as-minimum concept scales to gradient descent in ch212

  • Taylor series (ch219) approximates any smooth function as a quadratic near a minimum

  • The eigenvalue problem in ch169 leads to the characteristic polynomial, which for a 2×2 matrix is exactly a quadratic