Animated Visualization of Brownian Motion in Python


In the previous blog post we defined and animated a simple random walk, which paves the way towards other, more applied stochastic processes. One of these processes is the Brownian Motion, also known as a Wiener Process.

In this blog post, we will see how to generalize from a discrete-time to a continuous-time random process, since continuous time is closer to how changes in nature actually occur. In other words, we will use the simple random walk to construct a Brownian Motion, which has a multitude of interdisciplinary applications, some of them in finance. We will use Python numerical packages such as NumPy and SciPy with the goal of simulating the process and its properties.

To improve the visual perception, we will use the Matplotlib Animation API. This matters because an animation transfers the message more efficiently and in a more engaging manner.

Ready? So let's go and stay tuned!

Going from discrete to continuous

The simple random walk operates in discrete time steps belonging to the set of natural numbers, meaning nothing happens in between. However, changes in nature happen over infinitesimally small time periods. How, then, can we turn \( N \) discrete steps into a continuous process?

Fig. 1: From Discrete to Continuous Values

If we have \( N \) discrete time steps, we can evenly distribute them over the interval between \( 0 \) and \( 1 \), assigning one slot to each discrete time step. For instance, if \( N = 10 \), each discrete time step gets one of the slots \( \{ 0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9 \} \).
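
For instance, these slots can be generated with NumPy in one line (a small sketch; the variable names are only illustrative):

import numpy as np

N = 10  # the number of discrete time steps
slots = np.linspace(0.0, 1.0, N, endpoint=False)  # one slot per discrete step, evenly spread over [0, 1)
print(slots)  # [0.  0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9]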

If we keep increasing \( N \), the slots get closer and closer to each other, thus creating a continuum. Additionally, we normalize the values associated with those slots by \( \sqrt{N} \). Formally, if we have a simple random walk defined as:

\[ M_{n} = \sum_{i=1}^{n} \varepsilon_{i}, \quad n = 0, 1, \ldots, N, \qquad \varepsilon_{i} = \begin{cases} +1, & \text{with probability } 1/2 \\ -1, & \text{with probability } 1/2 \end{cases} \]

then a continuous and normalized process is defined as:

\[ W_{N}(t) = \frac{M_{\lfloor N t \rfloor}}{\sqrt{N}}, \qquad t \in [0, 1] \]

We demonstrated how, with a simple trick, we can switch from a discrete to a continuous notation. Now, letting \( N \) approach infinity leads us to the definition of a Brownian Motion.
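
This construction translates directly into code. Below is a minimal sketch (the variable names are my own choice) of the rescaled process \( W_{N}(t) \) built from \( \pm 1 \) increments:

import numpy as np

N = 1000  # the number of discrete steps
epsilon = np.random.choice([-1, 1], size=N)     # the +/-1 increments of the simple random walk
M = np.concatenate(([0], np.cumsum(epsilon)))   # the random walk M_0, M_1, ..., M_N
t = np.linspace(0.0, 1.0, N + 1)                # the slots in [0, 1]
W = M / np.sqrt(N)                              # the rescaled process W_N(t) on those slots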

Brownian Motion Definition

If we let \( N \) grow indefinitely, or at least become sufficiently large, we observe some interesting properties of the above-mentioned random process. From the Central Limit Theorem, we only know for sure what happens when \( t = 1 \):

\[ W_{N}(1) = \frac{M_{N}}{\sqrt{N}} \xrightarrow{d} \mathcal{N}(0, 1), \quad \text{as } N \to \infty \]

i.e. it converges in distribution to a standard normal distribution with mean \( 0 \) and variance \( 1 \). However, the Central Limit Theorem does not give us an answer for the distribution of all other points, i.e. for arbitrary \( t \). The answer is given by Donsker's Invariance Principle, which extends the Central Limit Theorem such that for any \( t \in [0, 1] \) we have:

\[ W_{N}(t) \xrightarrow{d} W_{t}, \quad \text{as } N \to \infty \]

It signifies that the process converges to a standard Brownian Motion, which has the following properties (a short numerical check follows the list):

  • \( W_{0} = 0 \);
  • \( W \) has independent increments: \( W_{t + u} - W_{t} \) is independent of \( W_{s} \) for every \( t \gt 0, u \geq 0, s \leq t \);
  • \( W \) has Gaussian increments: \( W_{t + u} - W_{t} \sim \mathcal{N}(0, u) \) and
  • \( W_{t} \) is continuous in \( t \).
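
As a quick numerical check of the Gaussian-increments property (a sketch with arbitrary parameter choices; a more thorough test is given in the appendix), we can verify that the variance of \( W_{t} \) grows linearly with \( t \), consistent with \( W_{t} - W_{0} \sim \mathcal{N}(0, t) \):

import numpy as np

np.random.seed(0)
N, num_paths = 1000, 5000
# many independent rescaled random walks evaluated on the grid t = i/N
paths = np.cumsum(np.random.choice([-1, 1], size=(num_paths, N)), axis=1) / np.sqrt(N)

for t in (0.25, 0.5, 1.0):
    idx = int(t * N) - 1  # the column holding the value after int(t * N) steps
    print('t = {:.2f}: empirical variance = {:.3f}'.format(t, paths[:, idx].var()))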

In this case, purely for demonstration purposes, we constructed the Brownian Motion from a simple random walk with the unit increments \( \{ -1, +1 \} \), each occurring with equal probability. More generally, these increments can be generated from any distribution, most frequently the normal distribution. This holds due to the universality of the Central Limit Theorem as well as Donsker's Invariance Principle.
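
To illustrate this universality numerically (a sketch under the assumption that comparing the endpoints \( W_{N}(1) \) is convincing enough; the parameters are arbitrary), we can build the walks from two very different increment distributions and test both endpoint samples for normality:

import numpy as np
from scipy import stats

np.random.seed(0)
N, num_walks = 1000, 5000

# walks built from +/-1 increments
endpoints_pm1 = np.random.choice([-1, 1], size=(num_walks, N)).sum(axis=1) / np.sqrt(N)
# walks built from uniform increments, rescaled to have unit variance
endpoints_unif = np.random.uniform(-np.sqrt(3), np.sqrt(3), size=(num_walks, N)).sum(axis=1) / np.sqrt(N)

for name, sample in [('+/-1 increments', endpoints_pm1), ('uniform increments', endpoints_unif)]:
    k2, p = stats.normaltest(sample)  # null hypothesis: the sample comes from a normal distribution
    print('{}: mean={:.3f}, var={:.3f}, normality p-value={:.3f}'.format(
        name, sample.mean(), sample.var(), p))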

Once we know the definition of a Brownian Motion, we can implement a simulation in Python and make a visualization of the possible outcomes.

Brownian Motion in Python

We can easily construct a Brownian Motion using the NumPy package. By providing the number of discrete time steps \( N \) and the total simulation time \( T \), we simply generate \( N \) increments from a normal distribution with mean zero and variance \( h \cdot dt \), where \( dt = T/N \), and distribute them across the time interval. Then, the Brownian Motion is the cumulative sum of the increments, i.e. we use the cumsum function. The Python code is given below:

import numpy as np
np.random.seed(1234)

def brownian_motion(N, T, h):
    """
    Simulates a Brownian motion
    :param int N: the number of discrete steps
    :param float T: the total simulation time (time horizon)
    :param float h: the variance of the increments per unit time
    """
    dt = 1. * T/N  # the time increment per step
    random_increments = np.random.normal(0.0, np.sqrt(h * dt), N)  # the Gaussian increments, epsilon ~ N(0, h*dt)
    brownian_motion = np.cumsum(random_increments)  # calculate the brownian motion
    brownian_motion = np.insert(brownian_motion, 0, 0.0) # insert the initial condition

    return brownian_motion, random_increments

N = 50 # the number of discrete steps
T = 1 # the total simulation time (time horizon)
h = 1 # the variance of the increments per unit time
dt = 1.0 * T/N  # the time increment per step

# generate a brownian motion
X, epsilon = brownian_motion(N, T, h)
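
Before animating anything, a minimal static plot of the simulated path (a sketch that builds on the variables defined above and assumes Matplotlib is available) can be drawn as:

import matplotlib.pyplot as plt

t = np.linspace(0.0, T, N + 1)  # the continuous time grid, one point per increment plus the origin
plt.figure(figsize=(12, 6))
plt.step(t, X, where='mid')
plt.xlabel('Time')
plt.ylabel('Value')
plt.show()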

Animating the Brownian Motion

Serving nicely formatted static plots is cool, but it does not put the accent on what is important. For this reason, we use an animated plot that highlights the principal point and is aligned with the message we want to send.

To send the correct message, in this case we illustrate the universal truth of Donsker's Invariance Principle: as the number of discrete time steps \( N \) grows, the simple random walk approaches the standard Brownian Motion. We realize this using Matplotlib's animation API, as shown in the code below:

import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

fig = plt.figure(figsize=(21, 10))
ax = plt.axes(xlim=(0, 1))
line, = ax.step([], [], where='mid', color='#0492C2')

# formatting options
ax.set_xticks(np.linspace(0,1,11))
ax.set_xlabel('Time', fontsize=30)
ax.set_ylabel('Value', fontsize=30)
ax.tick_params(labelsize=22)
ax.grid(True, which='major', linestyle='--', color='black', alpha=0.6)

# initialization function 
def init():
    line.set_data([], [])
    return line,

# animation function 
def animate(i):
    np.random.seed(42)
    y, epsilon = brownian_motion((i + 1) * 10, 1, 1)
    tr = np.linspace(0.0, 1, (i + 1) * 10 + 1)
    ax.set_title('Brownian Motion for {} steps'.format((i + 1) * 10), fontsize=40)
    ax.set_ylim((np.min(y) - 0.1, np.max(y) + 0.1))

    line.set_data(list(tr), list(y))
    return line,

# call the animator	 
anim = FuncAnimation(fig, animate, init_func=init, frames=100, interval=100, blit=True)
anim.save('brownian_motion.gif', writer='imagemagick')
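
A side note: saving with the 'imagemagick' writer requires ImageMagick to be installed on the system. If it is not available, Matplotlib's built-in Pillow writer (assuming the pillow package is installed) is an alternative:

anim.save('brownian_motion.gif', writer='pillow')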

The animation function needs three essential inputs: i) the figure instance on which the animation is drawn, ii) an initialization function and iii) a function that defines the animation logic. Additionally, we provide the number of frames to draw and the delay in milliseconds between them.

The animate function plays the central role. In every iteration, we generate a Brownian Motion from scratch with 10 more steps than in the previous iteration. The result we get is the following:

Animation: Converging to Brownian Motion

We clearly see how the simple random walk converges from a sparse, coarsely sampled path to a standard Brownian Motion as the number of steps increases.

The full source code related to everything discussed in this blog can be found on GitHub. For more information, please follow me on Twitter.


Conclusion

In this blog we demonstrated how to go from a discrete simple random walk to a continuous Brownian Motion. We used Python's numerical packages to achieve this task computationally. On top of this, we illustrated the principle of creating a continuous random process with an animated plot using Matplotlib's animation API.

The Brownian Motion is an important random process. It opens the way towards its variant, the Geometric Brownian Motion, which is a more realistic process exhibiting random exponential growth with a predetermined drift. We will cover this process in the next blog.

Appendix: Simulate the Gaussian Increments

One particular property of the Brownian Motion we observed is the Gaussian increments. There is a rigorous mathematical proof of this statement, which we can complement with a simulation. The computational tools give us the opportunity to see it with our own eyes, as shown in the snippet below:

from scipy import stats

num_simulations = 10000  # how many times to repeat
Ns, Ts, hs = 20000, 10.0, 1.0 # discrete steps, total simulation time, variance per unit time
dts = 1.0 * Ts/Ns  # the time increment per step

u = 2.  # the difference in time points
t = int(np.floor((np.random.uniform(low=u+0.01, high=1. * Ts - u)/Ts) * Ns)) # a random time index, chosen so that t + u stays within the simulation

# initialize the arrays holding the sampled values
rand_val_t = np.zeros(num_simulations)
rand_val_t_plus_u = np.zeros(num_simulations)

for i in range(num_simulations):
    # generate a brownian motion
    Xs, _ = brownian_motion(Ns, Ts, hs)

    # store the values at the two time points
    rand_val_t[i] = Xs[t]
    rand_val_t_plus_u[i] = Xs[t + int(u*Ns/Ts)]

# calculate the difference
diff = rand_val_t_plus_u - rand_val_t

# print stats
print('The mean is: {0}'.format(np.mean(diff)))
print('The variance is: {0}'.format(np.var(diff)))

# make normality test with null hypothesis: x comes from a normal distribution
k2, p = stats.normaltest(diff)
print("Null hypothesis can be rejected") if p < 1e-3 else print("Null hypothesis cannot be rejected")

The resulting plot of this simulation is depicted in the image below:

Fig 2: The Gaussian Increments of the Brownian Motion

