Eigenvalues & Eigenvectors: The Axis of Rotation

[!NOTE] This module explores eigenvalues and eigenvectors from first principles: the directions a matrix leaves in place, the factors by which it stretches them, and how both are computed and applied in practice.

1. Introduction: The Spinning Earth

Imagine the Earth spinning on its axis.

  • Someone standing near the equator in Brazil is moving very fast (tracing a huge circle around the Earth’s axis every day).
  • Someone standing in London is moving, but slower.
  • But someone standing exactly on the North Pole is not moving sideways at all. They are just spinning in place.

In Linear Algebra, the North Pole–South Pole axis is an Eigenvector of the Earth’s rotation matrix (with Eigenvalue 1: points on the axis are neither moved nor stretched).

Most vectors get knocked off their path when transformed by a matrix A. They change direction. But Eigenvectors are special: they stay on their own line. They might get stretched or squashed, but they point in the same (or exactly opposite) direction.
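The Earth analogy can be checked numerically. Here is a small sketch (assuming a 30° rotation about the z-axis, the pole-to-pole line): the axis direction survives untouched, while an equatorial direction gets rotated off its line.

```python
import numpy as np

theta = np.deg2rad(30)  # a 30-degree spin about the z-axis
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])

axis = np.array([0.0, 0.0, 1.0])     # the North Pole direction
equator = np.array([1.0, 0.0, 0.0])  # a direction on the equator

print(R @ axis)     # [0. 0. 1.] -- unchanged: an Eigenvector with λ = 1
print(R @ equator)  # knocked off its original direction
```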

The Equation

Av = λv
  • A: The Transformation Matrix (The “Action”).
  • v: The Eigenvector (The “Direction”).
  • λ (Lambda): The Eigenvalue (The “Stretch Factor”).
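A quick sanity check of the equation, using a diagonal “stretch” matrix whose eigenvectors are obvious (the axes themselves):

```python
import numpy as np

# A diagonal matrix: it scales x by 3 and leaves y alone.
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])

v = np.array([1.0, 0.0])  # points along the x-axis
print(A @ v)              # [3. 0.] -- same direction, stretched: Av = 3v, so λ = 3

w = np.array([1.0, 1.0])  # a diagonal direction
print(A @ w)              # [3. 1.] -- knocked off its line: not an eigenvector
```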

2. Interactive Visualizer: The Eigen-Spinner

Below is a 2D space. The matrix A transforms the Blue Vector (v) into the Green Vector (Av).

[!TIP] Try it yourself: Drag the slider to rotate the input vector v. Your goal is to find the angle where the Green Arrow (Av) and Blue Arrow (v) are perfectly aligned (parallel or anti-parallel).

[Interactive widget: A = [[2, 1], [1, 2]] (Determinant: 3.00, Trace: 4.00). At the starting angle, v = [1.0, 0.0] and Av = [2.0, 1.0], so the arrows are not aligned; rotating v to 45° aligns them and the widget reports EIGENVECTOR FOUND! with λ = 3.]
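The slider hunt can be mimicked in code: sweep the input angle and flag where Av is parallel (or anti-parallel) to v. This sketch uses the matrix A = [[2, 1], [1, 2]] from the visualizer:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

aligned = []
for deg in range(180):  # sweep the slider from 0° to 179°
    t = np.deg2rad(deg)
    v = np.array([np.cos(t), np.sin(t)])  # unit input vector
    Av = A @ v
    # The 2D cross product is zero exactly when Av is parallel to v
    if abs(v[0] * Av[1] - v[1] * Av[0]) < 1e-9:
        aligned.append((deg, v @ Av))  # since |v| = 1, v · Av is the stretch λ

print(aligned)  # the arrows align at 45° (λ ≈ 3) and at 135° (λ ≈ 1)
```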

3. Computing Eigenvalues (The Code)

How do we calculate λ in practice? We use efficient numerical algorithms like the QR algorithm (which is what NumPy uses under the hood).

Here is how you compute them in Python:

import numpy as np

# 1. Define the Matrix A
# A = [[2, 1],
#      [1, 2]]
A = np.array([[2, 1],
              [1, 2]])

# 2. Compute Eigenvalues and Eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

print("Eigenvalues:", eigenvalues)
# Output: [3. 1.]

print("Eigenvectors:\n", eigenvectors)
# Output (columns are the vectors):
# [[ 0.70710678 -0.70710678]
#  [ 0.70710678  0.70710678]]

# Verify for the first eigenvalue (lambda = 3)
v = eigenvectors[:, 0]  # First column
Av = A @ v
lambda_v = eigenvalues[0] * v

print(f"Av: {Av}")
print(f"λv: {lambda_v}")
# Both should match (up to floating-point rounding)!

[!NOTE] np.linalg.eig returns eigenvectors as columns. The first column corresponds to the first eigenvalue.


4. Application: The Billion Dollar Eigenvector (PageRank)

Google’s original algorithm, PageRank, is essentially an Eigenvector problem.

  1. Imagine the internet as a giant matrix M.
    • If page j links to page i, then entry Mij is positive (typically 1 divided by the number of links leaving page j, so each column sums to 1).
  2. We want to find the “importance” score of every page.
  3. We define importance recursively: “A page is important if important pages link to it.”
  4. This creates the equation: r = Mr.
    • r is the vector of importance scores (ranks).
    • M is the link matrix.
  5. Look at that equation again: it says Mr = 1 × r.
    • This implies r is an Eigenvector of the web graph M corresponding to the Eigenvalue 1!

By repeatedly multiplying the matrix (Power Iteration), Google finds this eigenvector and ranks the search results.
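Here is a minimal sketch of Power Iteration on a hypothetical three-page web. The link matrix below is invented for illustration, and real PageRank additionally mixes in a damping factor; the core idea is just repeated multiplication by a column-stochastic M.

```python
import numpy as np

# A hypothetical 3-page web. Column j lists page j's outgoing links,
# each weighted by 1 / (number of links leaving j), so columns sum to 1.
M = np.array([[0.0, 0.5, 1.0],   # pages 2 and 3 link to page 1
              [0.5, 0.0, 0.0],   # page 1 links to page 2
              [0.5, 0.5, 0.0]])  # pages 1 and 2 link to page 3

r = np.full(3, 1/3)   # start every page with equal importance
for _ in range(100):  # Power Iteration: repeatedly multiply by M
    r = M @ r
    r /= r.sum()      # keep the scores summing to 1

print(r)      # the importance scores (ranks)
print(M @ r)  # ≈ r: we have found the eigenvector with Eigenvalue 1
```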


5. Summary

  • Eigenvector: A direction that stays on its own line during a transformation (it may be stretched, squashed, or flipped, but never rotated off the line).
  • Eigenvalue: The scaling factor along that axis.
  • Characteristic Equation: det(A - λI) = 0.
  • Trace: Sum of eigenvalues.
  • Determinant: Product of eigenvalues.
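All of these facts can be verified numerically for the 2×2 matrix used throughout this module:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Trace = sum of eigenvalues; Determinant = product of eigenvalues
print(eigenvalues.sum(), np.trace(A))           # both 4.0
print(eigenvalues.prod(), np.linalg.det(A))     # both 3.0 (up to rounding)

# Characteristic equation: det(A - λI) = 0 at each eigenvalue
for lam in eigenvalues:
    print(np.linalg.det(A - lam * np.eye(2)))   # ≈ 0
```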