Simulating Random Walks
Random walks are one of those ideas that show up everywhere — from the diffusion of molecules to the movement of stock prices. The concept is simple: at each step, move in a random direction. Yet from this simplicity, rich structure emerges.
Let's simulate a basic 1D random walk in Python. At each time step, we flip a coin and move left or right:
```python
import numpy as np
import matplotlib.pyplot as plt

n_steps = 1000

# Each step is +1 or -1 with equal probability
steps = np.random.choice([-1, 1], size=n_steps)

# The position is the running sum of the steps
position = np.cumsum(steps)

plt.figure(figsize=(10, 4))
plt.plot(position, linewidth=0.8)
plt.xlabel("Step")
plt.ylabel("Position")
plt.title("1D Random Walk")
plt.tight_layout()
plt.show()
```
Each realization looks different, but the statistics are predictable. Since each step has mean zero and variance one, after $n$ steps the expected displacement is zero and the variance of the position is exactly $n$ — so the root-mean-square distance from the origin grows as $\sqrt{n}$.
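We can check the $\sqrt{n}$ scaling directly by averaging over many walkers. This is a minimal sketch (the variable names and the seeded generator are our own choices, not anything from the walk itself):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the check is reproducible

n_walkers = 10_000
rms_by_n = {}
for n_steps in (100, 400, 1600):
    # n_walkers independent walks of n_steps coin-flip steps each
    final = rng.choice([-1, 1], size=(n_walkers, n_steps)).sum(axis=1)
    rms_by_n[n_steps] = np.sqrt(np.mean(final.astype(float) ** 2))
    print(f"n = {n_steps:4d}: RMS = {rms_by_n[n_steps]:6.1f}, "
          f"sqrt(n) = {np.sqrt(n_steps):6.1f}")
```

Quadrupling the number of steps should roughly double the RMS displacement, and with ten thousand walkers the estimates land within a few percent of $\sqrt{n}$.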
We can also look at many walkers at once to see the distribution emerge:
```python
n_walkers = 5000
n_steps = 500

# One row per walker, one column per step
all_walks = np.random.choice([-1, 1], size=(n_walkers, n_steps))

# Summing along the step axis gives each walker's final position
final_positions = all_walks.sum(axis=1)

plt.figure(figsize=(8, 4))
plt.hist(final_positions, bins=50, density=True, alpha=0.7)
plt.xlabel("Final position")
plt.ylabel("Density")
plt.title(f"Distribution of final positions ({n_walkers} walkers, {n_steps} steps)")
plt.tight_layout()
plt.show()
```
The histogram looks Gaussian — and that's no coincidence. The Central Limit Theorem guarantees that the sum of many independent steps converges to a normal distribution. It's a beautiful example of universality: the details of each step don't matter, only the aggregate behaviour does.
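The CLT prediction is concrete: after $n$ steps the final positions should be approximately normal with mean $0$ and standard deviation $\sqrt{n}$. A quick sketch comparing the simulated ensemble against those predictions (seed and sample sizes are arbitrary choices here):

```python
import numpy as np

rng = np.random.default_rng(1)
n_walkers, n_steps = 50_000, 500

final = rng.choice([-1, 1], size=(n_walkers, n_steps)).sum(axis=1)

mean = final.mean()
std = final.std()
# For a Gaussian, about 68.3% of samples fall within one standard deviation
within_1sigma = np.mean(np.abs(final) <= np.sqrt(n_steps))

print(f"mean ~ {mean:.2f} (CLT predicts 0)")
print(f"std  ~ {std:.2f} (CLT predicts sqrt(500) ~ {np.sqrt(n_steps):.2f})")
print(f"fraction within one sigma ~ {within_1sigma:.3f} (Gaussian: ~0.683)")
```

All three numbers agree with the Gaussian prediction, even though each individual step is as non-Gaussian as a distribution can get: just two spikes at $-1$ and $+1$.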
This is part of what makes random walks so useful as a mental model. Whether you're thinking about diffusion, sampling algorithms, or even evolution, the same underlying mathematics keeps appearing.