**Discrete probability distribution**
In probability theory, a probability distribution is called **discrete** if it is characterized by a probability mass function. Thus, the distribution of a random variable $X$ is discrete, and $X$ is then called a **discrete random variable**, if

$\sum_u \Pr(X = u) = 1$

as $u$ runs through the set of all possible values of $X$.
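As a concrete illustration (a hypothetical fair six-sided die, not taken from the source), the defining condition can be checked directly:

```python
from fractions import Fraction

# Hypothetical probability mass function of a fair six-sided die:
# Pr(X = u) = 1/6 for u = 1, ..., 6.
pmf = {u: Fraction(1, 6) for u in range(1, 7)}

# The defining condition of a discrete distribution:
# the probabilities over all possible values sum to exactly 1.
total = sum(pmf.values())
print(total)  # -> 1
```

Using exact rational arithmetic avoids the floating-point rounding that could make the sum only approximately 1.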

If a random variable is discrete, then the set of all values that it can assume with non-zero probability is finite or countably infinite, because the sum of uncountably many positive real numbers (which is the least upper bound of the set of all finite partial sums) always diverges to infinity. Typically, this set of possible values is a topologically discrete set, in the sense that all its points are

isolated points. There are, however, discrete random variables for which this countable set is dense on the real line.

The Poisson distribution, the Bernoulli distribution, the binomial distribution, the geometric distribution, and the negative binomial distribution are among the best-known discrete probability distributions.

**Alternative description**

Equivalently, a discrete random variable can be defined as a random variable whose cumulative distribution function (cdf) increases only by jump discontinuities; that is, its cdf increases only where it "jumps" to a higher value, and is constant between those jumps. The points where jumps occur are precisely the values which the random variable may take. The number of such jumps may be finite or countably infinite. The set of locations of such jumps need not be topologically discrete; for example, the cdf might jump at each rational number.

**Representation in terms of indicator functions**

For a discrete random variable $X$, let $u_0, u_1, \dots$ be the values it can take with non-zero probability. Denote

$\Omega_i = \{\omega : X(\omega) = u_i\},\quad i = 0, 1, 2, \dots$

These are disjoint sets, and by countable additivity together with the normalization condition above,

$\Pr\left(\bigcup_i \Omega_i\right) = \sum_i \Pr(\Omega_i) = \sum_i \Pr(X = u_i) = 1.$
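A minimal sketch of this partition property, using an assumed sample space of two fair coin tosses (not from the source): the events $\Omega_i$ are pairwise disjoint, and their probabilities sum to 1.

```python
from fractions import Fraction
from itertools import product

# Assumed sample space: two fair coin tosses, each outcome with probability 1/4.
omega = list(product("HT", repeat=2))
prob = {w: Fraction(1, 4) for w in omega}

# X counts the heads, so its possible values are u_0 = 0, u_1 = 1, u_2 = 2.
X = lambda w: w.count("H")
values = sorted({X(w) for w in omega})

# Omega_i = {w : X(w) = u_i}; these events are pairwise disjoint...
events = {u: {w for w in omega if X(w) == u} for u in values}
for u in values:
    for v in values:
        if u != v:
            assert not (events[u] & events[v])

# ...and their probabilities sum to 1, since the events exhaust the sample space.
total = sum(sum(prob[w] for w in events[u]) for u in values)
print(total)  # -> 1
```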

It follows that the probability that $X$ takes any value except $u_0, u_1, \dots$ is zero, and thus one can write $X$ as

$X = \sum_i u_i \, 1_{\Omega_i}$

except on a set of probability zero, where $1_A$ is the

indicator function of $A$. This may serve as an alternative definition of discrete random variables.

**See also**

* Stochastic vector
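The indicator-function representation above can be verified on a small assumed example (two fair coin tosses, with $X$ counting heads; the sample space is illustrative, not from the source):

```python
from itertools import product

# Assumed sample space: two coin tosses; X counts the heads.
omega = list(product("HT", repeat=2))
X = lambda w: w.count("H")
values = sorted({X(w) for w in omega})

# Indicator function of the event Omega_i = {w : X(w) = u_i}.
def indicator(u, w):
    return 1 if X(w) == u else 0

# X = sum_i u_i * 1_{Omega_i}: the representation reproduces X at every outcome.
for w in omega:
    assert X(w) == sum(u * indicator(u, w) for u in values)
print("representation holds for every outcome")
```

Here the check holds everywhere because the sample space is finite; in general the identity holds except on a set of probability zero.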

*Wikimedia Foundation. 2010.*