# A Simple Introduction to Complex Stochastic Processes


Stochastic processes have many applications, including in finance and physics, and provide a natural model for many phenomena. Unfortunately, the theory behind them is difficult, which keeps them in the hands of a few 'elite' data scientists and out of most business contexts.

One of the simplest examples is the random walk, which is easy to understand with no mathematical background. Time-continuous stochastic processes, however, are usually defined and studied using advanced, abstract mathematical tools such as measure theory, martingales, and filtrations. If you wanted to learn this topic and gain a deep understanding of how these processes work, but were deterred by the jargon and arcane theory in the first few pages of any textbook on the subject, here is your chance to really understand how they work.

Rather than making it a topic of interest to post-graduate scientists only, here I make it accessible to everyone, barely using any maths in my explanations besides the central limit theorem. In short, if you are a biologist, a journalist, a business executive, a student or an economist with no statistical knowledge beyond Stats 101, you will be able to gain a deep understanding of the mechanics of complex stochastic processes after reading this article. The focus is on applied concepts that everyone is familiar with, rather than mathematical abstraction.

My general philosophy is that powerful statistical modeling and machine learning can be done with simple techniques, understood by the layman, as illustrated in my articles on machine learning without mathematics and advanced machine learning with basic Excel.

1. Construction of Time-Continuous Stochastic Processes: Brownian Motion

Probably the most basic stochastic process is a random walk where the time is discrete. The process is defined by X(t+1) equal to X(t) + 1 with probability 0.5, and to X(t) - 1 with probability 0.5. It constitutes an infinite sequence of auto-correlated random variables indexed by time. For instance, it can represent the daily logarithm of stock prices, varying under market-neutral conditions. If we start at t = 0 with X(0) = 0, and if we define U(t) as a random variable taking the value +1 with probability 0.5, and -1 with probability 0.5, then X(n) = U(1) + ... + U(n). Here we assume that the variables U(t) are independent and identically distributed. Note that X(n) is a random variable taking integer values between -n and +n.
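The random walk above can be simulated in a few lines; here is a minimal sketch in Python with NumPy (the seed and the number of steps are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n = 1000  # number of time steps (arbitrary)
# U(t) = +1 or -1, each with probability 0.5, independent across t
u = rng.choice([-1, 1], size=n)
# X(n) = U(1) + ... + U(n), with X(0) = 0
x = np.concatenate(([0], np.cumsum(u)))

# X(n) is an integer between -n and +n
print(x[-1])
```

Plotting `x` against its index reproduces the familiar zigzag path of a random walk.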

Figure: five simulations of a Brownian motion (x-axis is the time t, y-axis is Z(t)).

What happens if we change the time scale (x-axis) from daily to hourly, or to every millisecond? We then also need to re-scale the values (y-axis) appropriately, otherwise the process exhibits massive oscillations (from -n to +n) in very short time periods. At the limit, if we consider infinitesimal time increments, the process becomes a continuous one. Much of the complex mathematics needed to define these continuous processes does little more than find the correct re-scaling of the y-axis that makes the limiting process meaningful.

It turns out that if you define these time-continuous processes as the limit of their time-discrete version, as in our approach, finding the correct re-scaling is straightforward. Let us define Y(t, n) as the same process as X(t), but with small time increments of T / n instead of T, where T is the time unit (say, a day). In other words, Y(t / n, n) = X(t): we just re-scaled the time axis; both t and n are still integers. Now Y(t, n) can take on very high values (between -n and +n) even when t = 1. Thus we also need to re-scale the y-axis.

Note that, at time t, the process Y(t, n) is a sum of n · t independent increments, so that

Var[Y(t, n)] = n · t · Var[U(1)].

The only way to make the right-hand side of the equation not depend on n is to re-scale Y as follows. Define

Z(t, n) = Y(t, n) / SQRT(n).

Then

Var[Z(t, n)] = t · Var[U(1)],

and Z(t) is defined as the limit of Z(t, n) as n tends to infinity.
Also, because of the central limit theorem, by construction Z(t) has a Gaussian distribution, regardless of the distribution of U(1). The final process Z(t) is both time-continuous and continuous on the y-axis, though nowhere differentiable. It looks like a fractal and it is known as a Brownian motion — the standard time-continuous stochastic process — from which many other processes have been derived.
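The whole construction can be sketched numerically: divide the random walk by SQRT(n) and read it on the grid t = 0, 1/n, ..., 1. A minimal sketch in Python (the seed and n are arbitrary; larger n gives a finer approximation of the Brownian motion):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

n = 10_000  # steps per unit of time; larger n = finer approximation
steps = rng.choice([-1, 1], size=n)
# Z(t) ~ (U(1) + ... + U(floor(n t))) / sqrt(n): the random walk with its
# y-axis divided by sqrt(n), on the time grid t = 0, 1/n, ..., 1
z = np.concatenate(([0.0], np.cumsum(steps))) / np.sqrt(n)

# By the central limit theorem, Z(1) is approximately Gaussian N(0, 1)
print(z[-1])
```

Plotting `z` for several seeds reproduces the figure shown earlier: each run gives a different, jagged, nowhere-differentiable-looking path.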

Note that if instead of using a binary random variable for U(1), you use a Gaussian one, then the limiting process Z(t) is identical, but we are dealing with Gaussian variables throughout the construction, making it easier to study the covariance structure and other properties. It then becomes a simple exercise for any college student to derive the covariance between Z(t) and Z(s). The covariance can also be estimated using simulations.
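The simulation-based estimate mentioned above can be sketched as follows; the standard result for the Brownian motion with Var[U(1)] = 1 is Cov[Z(s), Z(t)] = min(s, t), and the estimate below should come close to it (seeds, grid size, and the times s and t are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(seed=2)

n = 500          # steps per path on [0, 1]
n_paths = 10_000 # number of simulated paths
s, t = 0.3, 0.7  # two fixed times in [0, 1]

# Gaussian increments: same limiting process, Gaussian at every stage
steps = rng.normal(size=(n_paths, n))
z = np.cumsum(steps, axis=1) / np.sqrt(n)

zs = z[:, int(s * n) - 1]  # Z(s) across paths
zt = z[:, int(t * n) - 1]  # Z(t) across paths
cov = np.mean(zs * zt)     # estimates E[Z(s) Z(t)], since E[Z] = 0

# Theory: Cov[Z(s), Z(t)] = min(s, t)
print(round(cov, 2))
```

The estimate converges to min(s, t) = 0.3 as the number of paths grows.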

Finally, note that Z(0) = 0 and E[U(1)] = 0.

2. General Properties

The Brownian motion can also be viewed as a Gaussian stationary time series, characterized by its covariance or auto-correlation structure. It is also related to deterministic dynamic systems that exhibit a fractal behavior. Under appropriate transformations, all these processes can be made equivalent.

One question is whether the above construction (the limit of a time-discrete random walk) covers all types of Brownian motions, or only a few particular cases. One way to investigate this is to check whether this construction can generate any kind of covariance structure that characterizes these processes. The answer is positive, making advanced mathematical theory unnecessary to build and study Brownian motions, as well as the numerous complex stochastic processes derived from this base process. Indeed, if you allow the random variables U(t) used in our construction to NOT be independent, then you can build more sophisticated time-continuous stochastic processes, that are not Brownian motions.

Definitions

All the stochastic processes introduced so far, whether time-discrete or time-continuous, share the following properties. In most cases, it is easy to turn a stochastic process into one that satisfies these properties, using simple transformations, as illustrated later in this section.

Stationarity: This is sometimes called the homogeneous property, and represents the fact that there is no trend or drift, or more precisely, the fact that the properties of the process in question do not explicitly depend on the time parameter t. Such processes are usually characterized by their auto-correlation structure alone.

Ergodicity: This means that one instance of the process is enough to derive all its properties. You don’t need to make hundreds of simulations to study the process’ properties or compute estimates: One simulation over a very long time period will do.

Fractal behavior: If you zoom in or out on any single realization of these processes, you will get a new process with the exact same properties and behavior, indistinguishable from the parent process. Two different time windows provide two versions of the process that are identical with respect to their statistical properties.

Memory-less: Future observations depend on the present value only, not on past observations. This is sometimes referred to as the Markov property.

It is sometimes possible to transform a process so that it satisfies some of the above properties. For instance, if X(t) is a time series with a linear trend and discrete time increments, the differences X(t) - X(t-1) may represent a stationary process. Likewise, if X(t) depends on X(t-1) and X(t-2), then the vector (X(t), Y(t)) with Y(t) = X(t-1) represents a bivariate memory-less time series.
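The differencing trick can be checked on a toy series; a minimal sketch (the slope of 2, the noise model, and the seed are arbitrary assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# X(t) = 2 t + noise: a time series with a linear trend, hence not stationary
t = np.arange(500)
x = 2 * t + rng.normal(size=500)

# The differences X(t) - X(t-1) remove the trend: they fluctuate around
# the slope (here 2) with no dependence on t, i.e. a stationary series
d = np.diff(x)
print(round(np.mean(d), 1))
```

The same idea, applied twice, handles quadratic trends, and the vector trick in the text handles longer memories.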

Exercise

Simulate 1,000 realizations of a Brownian motion on [0, 1], using the random walk construction described in this article. Study the distribution of the quantities below, using estimates based on your simulations. In particular, what are their mean and variance, for t in [0, 1]?

min Z(t), max Z(t) over [0, 1]

proportion of time with Z(t) > 0 (note that Z(0) = 0)

number of times the sign of Z(t) oscillates

Keep in mind that the Z(t)’s are auto-correlated.
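A starting point for the exercise can be sketched as follows; the seed and grid size are arbitrary, and the sign-oscillation count is approximate (a path touching zero exactly is counted as two crossings):

```python
import numpy as np

rng = np.random.default_rng(seed=4)

n = 1_000       # time steps per path on [0, 1]
n_paths = 1_000 # realizations, as in the exercise

steps = rng.choice([-1, 1], size=(n_paths, n))
z = np.concatenate((np.zeros((n_paths, 1)),
                    np.cumsum(steps, axis=1)), axis=1) / np.sqrt(n)

z_min = z.min(axis=1)            # min of Z(t) over [0, 1], per path
z_max = z.max(axis=1)            # max of Z(t) over [0, 1], per path
prop_pos = (z > 0).mean(axis=1)  # proportion of time with Z(t) > 0
# approximate count of sign oscillations of Z(t) along each path
sign_flips = (np.diff(np.sign(z[:, 1:]), axis=1) != 0).sum(axis=1)

print(round(z_max.mean(), 2), round(z_min.mean(), 2))
```

Histograms of `z_min`, `z_max`, `prop_pos`, and `sign_flips` reveal the (often surprising) shapes of these distributions; for instance, `prop_pos` is far from concentrated around 0.5, a consequence of the auto-correlation mentioned above.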

For related articles from the same author, click here or visit www.VincentGranville.com. Follow me on Twitter at @GranvilleDSC or on LinkedIn.
