Martingale (probability theory)

Stopped Brownian motion is an example of a martingale. It can be used to model an even coin-toss betting game with the possibility of bankruptcy.

In probability theory, a martingale is a model of a fair game where knowledge of past events cannot help to predict future winnings. In particular, a martingale is a sequence of random variables (i.e., a stochastic process) for which, at any particular time, the conditional expectation of the next value in the sequence, given all previously observed values, is equal to the most recently observed value.

To contrast, in a process that is not a martingale, it may still be the case that the expected value of the process at one time is equal to the expected value of the process at the next time. However, knowledge of the prior outcomes (e.g., all prior cards drawn from a card deck) may be able to reduce the uncertainty of future outcomes. Thus, the expected value of the next outcome given knowledge of the present and all prior outcomes may be higher than the current outcome if a winning strategy is used. Martingales exclude the possibility of winning strategies based on game history, and thus they are a model of fair games.

History

Originally, martingale referred to a class of betting strategies that was popular in 18th century France.[1][2] The simplest of these strategies was designed for a game in which the gambler wins his stake if a coin comes up heads and loses it if the coin comes up tails. The strategy had the gambler double his bet after every loss so that the first win would recover all previous losses plus win a profit equal to the original stake. As the gambler's wealth and available time jointly approach infinity, his probability of eventually flipping heads approaches 1, which makes the martingale betting strategy seem like a sure thing. However, the exponential growth of the bets eventually bankrupts its users. Stopped Brownian motion, which is a martingale process, can be used to model the trajectory of such games.

The concept of martingale in probability theory was introduced by Paul Pierre Lévy, and much of the original development of the theory was done by Joseph Leo Doob among others. Part of the motivation for that work was to show the impossibility of successful betting strategies.

Definitions

A basic definition of a discrete-time martingale is a discrete-time stochastic process (i.e., a sequence of random variables) X1, X2, X3, ... that satisfies, for any time n,

\mathbf{E} ( \vert X_n \vert )< \infty
\mathbf{E} (X_{n+1}\mid X_1,\ldots,X_n)=X_n,

That is, the conditional expected value of the next observation, given all the past observations, is equal to the last observation. Due to the linearity of expectation, this second requirement is equivalent to:

\mathbf{E} (X_{n+1} - X_n \mid X_1,\ldots,X_n)=0,

which states that the average "winnings" from observation n to observation n + 1 are 0.
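The defining property can be checked empirically. The sketch below (Python, standard library only; all names are illustrative) simulates many paths of a fair ±1 random walk, a martingale, and verifies that, conditional on the value at step 5, the average value at step 6 is approximately the same value:

```python
import random
from collections import defaultdict

random.seed(42)

# Simulate many paths of a fair +/-1 random walk, a discrete-time martingale.
paths = []
for _ in range(200_000):
    x, path = 0, [0]
    for _ in range(6):
        x += random.choice((-1, 1))
        path.append(x)
    paths.append(path)

# Group the step-6 values by the value observed at step 5.
successors = defaultdict(list)
for p in paths:
    successors[p[5]].append(p[6])

# Martingale property: E[X_6 | X_5 = k] = k, so each bin average should be
# close to its key k, up to sampling noise.
for k in sorted(successors):
    vals = successors[k]
    print(k, round(sum(vals) / len(vals), 2))
```

Conditioning on the whole history X1, ..., X5 rather than just X5 would give the same answer here, because the walk's increments are independent of the past.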

Martingale sequences with respect to another sequence

More generally, a sequence Y1, Y2, Y3, ... is said to be a martingale with respect to another sequence X1, X2, X3, ... if for all n

\mathbf{E} ( \vert Y_n \vert )< \infty
\mathbf{E} (Y_{n+1}\mid X_1,\ldots,X_n)=Y_n.

Similarly, a continuous-time martingale with respect to the stochastic process Xt is a stochastic process Yt such that for all t

\mathbf{E} ( \vert Y_t \vert )<\infty
\mathbf{E} ( Y_{t} \mid \{ X_{\tau}, \tau \leq s \} ) = Y_s, \ \forall\ s \leq t.

This expresses the property that the conditional expectation of an observation at time t, given all the observations up to time s, is equal to the observation at time s (of course, provided that s ≤ t).

General definition

In full generality, a stochastic process Y : T × Ω → S is a martingale with respect to a filtration Σ and probability measure P if

  • Y is adapted to the filtration Σ, i.e., for each t in the index set T, the random variable Yt is a Σt-measurable function;
  • for all t,
\mathbf{E}_{\mathbf{P}} ( | Y_{t} | ) < + \infty;
  • for all s and t with s < t and all F ∈ Σs,
\mathbf{E}_{\mathbf{P}} \left([Y_t-Y_s]\chi_F\right)=0,
where χF denotes the indicator function of the event F. In Grimmett and Stirzaker's Probability and Random Processes, this last condition is denoted as
Y_s = \mathbf{E}_{\mathbf{P}} ( Y_t | \Sigma_s ),
which is a general form of conditional expectation.[3]

It is important to note that the property of being a martingale involves both the filtration and the probability measure (with respect to which the expectations are taken). It is possible that Y could be a martingale with respect to one measure but not another one; the Girsanov theorem offers a way to find a measure with respect to which an Itō process is a martingale.

Examples of martingales

  • An unbiased random walk (in any number of dimensions) is an example of a martingale.
  • A gambler's fortune (capital) is a martingale if all the betting games which the gambler plays are fair.
  • Pólya's urn contains a number of different-coloured marbles; at each iteration a marble is randomly selected from the urn and replaced with several more marbles of the same colour. For any given colour, the fraction of marbles in the urn with that colour is a martingale. For example, if 95% of the marbles are currently red then, although the next iteration is more likely than not to add red marbles, this bias is exactly balanced out by the fact that adding more red marbles alters the fraction much less significantly than adding the same number of non-red marbles would.
  • Suppose Xn is a gambler's fortune after n tosses of a fair coin, where the gambler wins $1 if the coin comes up heads and loses $1 if the coin comes up tails. The gambler's conditional expected fortune after the next trial, given the history, is equal to his present fortune, so this sequence is a martingale.
  • Let Yn = Xn2 − n where Xn is the gambler's fortune from the preceding example. Then the sequence { Yn : n = 1, 2, 3, ... } is a martingale. This can be used to show that the gambler's total gain or loss varies roughly between plus or minus the square root of the number of steps.
  • (de Moivre's martingale) Now suppose the coin is "unfair" or "biased", with probability p of "heads" and probability q = 1 − p of "tails". Let
X_{n+1}=X_n\pm 1
with "+" in case of "heads" and "−" in case of "tails". Let
Y_n=(q/p)^{X_n}.
Then { Yn : n = 1, 2, 3, ... } is a martingale with respect to { Xn : n = 1, 2, 3, ... }. To show this

\begin{align}
E[Y_{n+1} \mid X_1,\dots,X_n] & = p (q/p)^{X_n+1} + q (q/p)^{X_n-1} \\
& = p (q/p) (q/p)^{X_n} + q (p/q) (q/p)^{X_n} \\
& = q (q/p)^{X_n} + p (q/p)^{X_n} = (q/p)^{X_n}=Y_n.
\end{align}
  • (Likelihood-ratio testing in statistics) A population is thought to be distributed according to either a probability density f or another probability density g. A random sample is taken, the data being X1, ..., Xn. Let Yn be the "likelihood ratio"
Y_n=\prod_{i=1}^n\frac{g(X_i)}{f(X_i)}
(which, in applications, would be used as a test statistic). If the population is actually distributed according to the density f rather than according to g, then { Yn : n = 1, 2, 3, ... } is a martingale with respect to { Xn : n = 1, 2, 3, ... }.
  • Suppose each amoeba either splits into two amoebas, with probability p, or eventually dies, with probability 1 − p. Let Xn be the number of amoebas surviving in the nth generation (in particular Xn = 0 if the population has become extinct by that time). Let r be the probability of eventual extinction. (Finding r as a function of p is an instructive exercise. Hint: The probability that the descendants of an amoeba eventually die out is equal to the probability that either of its immediate offspring dies out, given that the original amoeba has split.) Then
\{\,r^{X_n}:n=1,2,3,\dots\,\}
is a martingale with respect to { Xn: n = 1, 2, 3, ... }.
  • The number of individuals of any particular species in an ecosystem of fixed size is a function of (discrete) time, and may be viewed as a sequence of random variables. This sequence is a martingale under the unified neutral theory of biodiversity.
  • If { Nt : t ≥ 0 } is a Poisson process with intensity λ, then the compensated Poisson process { Nt − λt : t ≥ 0 } is a continuous-time martingale with right-continuous/left-limit sample paths.
  • An example martingale series can easily be produced with computer software:
    • Microsoft Excel or similar spreadsheet software: enter 0.0 in cell A1 (top left), and in the cell below it (A2) enter =A1+NORMINV(RAND(),0,1). Now copy that cell by dragging down to create 300 or so copies. This will create a martingale series whose increments have mean 0 and standard deviation 1. With the cells still highlighted, go to the chart creation tool and create a chart of these values. Now every time a recalculation happens (in Excel the F9 key does this) the chart will display another martingale series.
    • R: to recreate the example above, issue plot(cumsum(rnorm(100, mean=0, sd=1)), t="l", col="darkblue", lwd=3). To display another martingale series, reissue the command.
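The same construction can be sketched in Python (an illustrative equivalent of the spreadsheet and R snippets, using only the standard library; plotting is omitted):

```python
import random

random.seed(0)

# A Gaussian random walk: cumulative sums of independent N(0, 1) increments,
# mirroring the =A1+NORMINV(RAND(),0,1) spreadsheet column.
increments = [random.gauss(0.0, 1.0) for _ in range(300)]
series = [0.0]
for step in increments:
    series.append(series[-1] + step)

print(len(series))  # 301 values, starting at 0.0
```

Rerunning with a different seed (or none) produces another realization of the same martingale.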

Submartingales, supermartingales, and relationship to harmonic functions

There are two popular generalizations of a martingale that also include cases when the current observation Xn is not necessarily equal to the future conditional expectation E[Xn+1|X1,...,Xn] but instead is an upper or lower bound on the conditional expectation. These definitions reflect a relationship between martingale theory and potential theory, which is the study of harmonic functions. Just as a continuous-time martingale satisfies E[Xt|{Xτ : τ≤s}] − Xs = 0 ∀s ≤ t, a harmonic function f satisfies the partial differential equation Δf = 0, where Δ is the Laplace operator. Given a Brownian motion process Wt and a harmonic function f, the resulting process f(Wt) is also a martingale.

  • A discrete-time submartingale is a sequence X_1,X_2,X_3,\ldots of integrable random variables satisfying
{}E[X_{n+1}|X_1,\ldots,X_n] \ge X_n.
Likewise, a continuous-time submartingale will satisfy
{}E[X_t|\{X_{\tau} : \tau \le s\}] \ge X_s \quad \forall s \le t.
In potential theory, a subharmonic function f satisfies Δf ≥ 0. Any subharmonic function that is bounded above by a harmonic function for all points on the boundary of a ball will be bounded above by the harmonic function for all points inside the ball. Similarly, if a submartingale and a martingale have equivalent expectations for a given time, the history of the submartingale will tend to be bounded above by the history of the martingale. Roughly speaking, the prefix "sub-" is consistent because the current observation Xn is less than (or equal to) the conditional expectation E[Xn+1|X1,...,Xn]. Consequently, the current observation bounds the future conditional expectation from below, and the process tends to increase in future time.
  • Analogously, a discrete-time supermartingale satisfies
{}E[X_{n+1}|X_1,\ldots,X_n] \le X_n.
Likewise, a continuous-time supermartingale will satisfy
{}E[X_t|\{X_{\tau} : \tau \le s\}] \le X_s \quad \forall s \le t.
In potential theory, a superharmonic function f satisfies Δf ≤ 0. Any superharmonic function that is bounded below by a harmonic function for all points on the boundary of a ball will be bounded below by the harmonic function for all points inside the ball. Similarly, if a supermartingale and a martingale have equivalent expectations for a given time, the history of the supermartingale will tend to be bounded below by the history of the martingale. Roughly speaking, the prefix "super-" is consistent because the current observation Xn is greater than (or equal to) the conditional expectation E[Xn+1|X1,...,Xn]. Consequently, the current observation bounds the future conditional expectation from above, and the process tends to decrease in future time.

Methods for remembering the difference between super- and submartingale

  • "Life is a supermartingale; as time advances, expectation decreases."[4]
  • The "b" in "submartingale" points up, which is consistent with the upward expectation in a submartingale. Similarly, the "p" in "supermartingale" points downward, which is consistent with the downward expectation in a supermartingale.
  • The "super" and "sub" refer to the current observation and not the future expectation. Thus, a "submartingale" is a process for which the current observation (Xn) is lower than the future conditional expectation (E[Xn|X1,...,Xn]). Likewise, a "supermartingale" is a process for which the current observation (Xn) is higher than the future conditional expectation (E[Xn|X1,...,Xn]). A submartingale rises because it is supported from below; a supermartingale falls because it is supported from above.

Examples of submartingales and supermartingales

  • Every martingale is also a submartingale and a supermartingale. Conversely, any stochastic process that is both a submartingale and a supermartingale is a martingale.
  • Consider again the gambler who wins $1 when a coin comes up heads and loses $1 when the coin comes up tails. Suppose now that the coin may be biased, so that it comes up heads with probability p.
    • If p is equal to 1/2, the gambler on average neither wins nor loses money, and the gambler's fortune over time is a martingale.
    • If p is less than 1/2, the gambler loses money on average, and the gambler's fortune over time is a supermartingale.
    • If p is greater than 1/2, the gambler wins money on average, and the gambler's fortune over time is a submartingale.
  • A convex function of a martingale is a submartingale, by Jensen's inequality. For example, the square of the gambler's fortune in the fair coin game is a submartingale (which also follows from the fact that Xn2 − n is a martingale). Similarly, a concave function of a martingale is a supermartingale.
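The biased-coin cases above can be illustrated numerically. This Python sketch (illustrative; standard library only) estimates the average fortune after 50 bets for p = 0.4, 0.5, and 0.6; the exact expected value is n(2p − 1), so the supermartingale drifts down, the martingale stays flat, and the submartingale drifts up:

```python
import random

random.seed(1)

def mean_fortune(p, n_steps=50, n_paths=20_000):
    """Average fortune after n_steps of +/-$1 bets won with probability p."""
    total = 0
    for _ in range(n_paths):
        x = 0
        for _ in range(n_steps):
            x += 1 if random.random() < p else -1
        total += x
    return total / n_paths

# The exact expectation is n_steps * (2p - 1): negative drift for p < 1/2
# (supermartingale), zero for p = 1/2 (martingale), positive for p > 1/2
# (submartingale).
for p in (0.4, 0.5, 0.6):
    print(p, round(mean_fortune(p), 1))
```

With 20,000 paths the sampling error of each average is well under 0.1, so the drift in each case is clearly visible.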

Martingales and stopping times

A stopping time with respect to a sequence of random variables X1, X2, X3, ... is a random variable τ with the property that for each t, the occurrence or non-occurrence of the event τ = t depends only on the values of X1, X2, ..., Xt. The intuition behind the definition is that at any particular time t, you can look at the sequence so far and tell if it is time to stop. An example in real life might be the time at which a gambler leaves the gambling table, which might be a function of his previous winnings (for example, he might leave only when he goes broke), but he can't choose to go or stay based on the outcome of games that haven't been played yet.

In some contexts the concept of stopping time is defined by requiring only that the occurrence or non-occurrence of the event τ = t be probabilistically independent of Xt+1, Xt+2, ... but not that it be completely determined by the history of the process up to time t. That is a weaker condition than the one appearing in the paragraph above, but is strong enough to serve in some of the proofs in which stopping times are used.

One of the basic properties of martingales is that, if (Xt)t > 0 is a (sub-/super-) martingale and τ is a stopping time, then the corresponding stopped process (X_t^\tau)_{t>0} defined by X_t^\tau:=X_{\min\{\tau,t\}} is also a (sub-/super-) martingale.

The concept of a stopped martingale leads to a series of important theorems, including, for example, the optional stopping theorem which states that, under certain conditions, the expected value of a martingale at a stopping time is equal to its initial value. We can use it, for example, to prove the impossibility of successful betting strategies for a gambler with a finite lifetime and a house limit on bets.
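The optional stopping theorem can likewise be checked by simulation. In this Python sketch (illustrative), a fair ±1 walk is stopped when it first hits +10 or −10; the theorem predicts that the expected stopped value equals the initial value 0:

```python
import random

random.seed(7)

# A fair +/-1 walk stopped when it first exits (-10, 10). This stopping
# time has finite expectation and the stopped process is bounded, so the
# optional stopping theorem applies and gives E[X_tau] = X_0 = 0.
n_paths = 50_000
total = 0
for _ in range(n_paths):
    x = 0
    while -10 < x < 10:
        x += random.choice((-1, 1))
    total += x

print(round(total / n_paths, 2))  # close to 0
```

Note the contrast with the doubling strategy from the History section: with unbounded bets and no time limit the theorem's hypotheses fail, which is exactly why that strategy seems to beat a fair game.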

Notes

  1. ^ Balsara, N. J. (1992). Money Management Strategies for Futures Traders. Wiley Finance. p. 122. ISBN 0-471-52215-5.
  2. ^ Mansuy, Roger (June 2009). "The Origins of the Word "Martingale"". Electronic Journal for History of Probability and Statistics 5 (1). http://www.jehps.net/juin2009/Mansuy.pdf. Retrieved 22 October 2011.
  3. ^ Grimmett, G.; Stirzaker, D. (2001). Probability and Random Processes (3rd ed.). Oxford University Press. ISBN 0-19-857223-9. 
  4. ^ "Record of the Celebration of the Life of Joseph Leo Doob". October 2004. http://www.math.uiuc.edu/People/doob_record.html. Retrieved 4 November 2011. 
