Pseudorandom generator theorem

In computational complexity theory, a distribution is considered pseudorandom if no efficient computation can distinguish it from the true uniform distribution with a non-negligible advantage. Formally, a family of distributions "Dn" is "pseudorandom" if for any polynomial-size circuit "C", and any "ε" inversely polynomial in "n",

:| Prob"x"∈"U" ["C"("x")=1] − Prob"x"∈"Dn" ["C"("x")=1] | ≤ "ε"
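As an illustration (not part of the formal definition), the distinguishing advantage can be estimated empirically. The biased distribution and the majority-vote "circuit" below are hypothetical toy choices; for a genuinely pseudorandom family no efficient test would achieve a noticeable gap:

```python
import random

def advantage(test, sample_d, n_bits, trials=20000, seed=0):
    """Estimate |Prob_{x~U}[test(x)=1] - Prob_{x~D}[test(x)=1]| by sampling."""
    rng = random.Random(seed)
    hits_u = sum(test([rng.randint(0, 1) for _ in range(n_bits)]) for _ in range(trials))
    hits_d = sum(test(sample_d(rng)) for _ in range(trials))
    return abs(hits_u - hits_d) / trials

# Toy distribution D: bits biased toward 1.  Toy "circuit" C: majority vote.
biased = lambda rng: [1 if rng.random() < 0.7 else 0 for _ in range(15)]
majority = lambda bits: int(sum(bits) > len(bits) // 2)

adv = advantage(majority, biased, n_bits=15)  # clearly non-negligible here
```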

Pseudorandom generators

A function "Gl": {0,1}"l" → {0,1}"m", where "l" < "m", is a pseudorandom generator if:
*"Gl" can be computed in time polynomial in "l"
*"Gl"("x") is pseudorandom when "x" is uniformly random

One additional pseudorandom bit implies polynomially more pseudorandom bits

It can be shown that if there is a pseudorandom generator "Gl": {0,1}"l" → {0,1}"l"+1, "i.e." a generator that adds "only one" pseudorandom bit, then for any "m" = "poly"("l"), there is a pseudorandom generator "G'l": {0,1}"l" → {0,1}"m".

The idea of the proof is as follows: first, "l" bits are picked from the uniform distribution "Ul" and used as the seed to the first instance of "Gl", which is known to be a pseudorandom generator. Next, the output of the first instance of "Gl" is divided into two parts: the first "l" bits are fed into the second instance of "Gl" as a seed, while the last bit becomes the first bit of the output. Repeating this process "m" times yields an output of "m" pseudorandom bits.
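The stretching construction above can be sketched as follows. The one-bit extender here is only a heuristic hash-based stand-in (the proof assumes a genuine one-bit pseudorandom generator "Gl"), so this is an illustration of the wiring, not a secure generator:

```python
import hashlib

L = 16  # seed length in bits (illustrative)

def g_one_bit(seed_bits):
    """Stand-in for G_l: {0,1}^l -> {0,1}^(l+1).  SHA-256 is just a
    deterministic placeholder mapping l bits to l+1 bits; a real
    instantiation needs a proven one-bit-stretch generator."""
    s = "".join(map(str, seed_bits)).encode()
    digest = hashlib.sha256(s).digest()
    out = [(digest[j // 8] >> (j % 8)) & 1 for j in range(len(seed_bits) + 1)]
    return out

def g_stretch(seed_bits, m):
    """G'_l: iterate g_one_bit, feeding the first l output bits back in
    as the next seed and emitting the last bit each round."""
    state = list(seed_bits)
    output = []
    for _ in range(m):
        expanded = g_one_bit(state)
        state, bit = expanded[:len(seed_bits)], expanded[-1]
        output.append(bit)
    return output

bits = g_stretch([0, 1] * (L // 2), m=40)  # 16 seed bits -> 40 output bits
```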

It can be shown that such a "G'l", consisting of "m" instances of "Gl", is indeed a pseudorandom generator, via a hybrid argument and proof by contradiction as follows:

Let's consider "m"+1 intermediate distributions "Hi", 0 ≤ "i" ≤ "m", where the first "i" bits are chosen from the uniform distribution and the last "m" − "i" bits are chosen from the output of "G'l". Thus, "H0" is the full output of "G'l" and "Hm" is the true uniform distribution "Um". Therefore, adjacent distributions "Hi" and "Hi+1" differ only in the source of one bit (bit number "i"+1).
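A minimal sketch of sampling from the hybrid "Hi", assuming the generator's "m"-bit output is supplied as a list; the all-zeros "generator output" below is purely illustrative, chosen so the two extreme hybrids are visibly different:

```python
import random

def sample_hybrid(i, m, prg_output, rng):
    """Sample from H_i: the first i bits are uniform, the remaining
    m - i bits are bits i..m-1 of the generator's output."""
    return [rng.randint(0, 1) for _ in range(i)] + prg_output[i:m]

rng = random.Random(0)
g_out = [0] * 8                       # hypothetical generator output
h0 = sample_hybrid(0, 8, g_out, rng)  # H_0: full generator output
h8 = sample_hybrid(8, 8, g_out, rng)  # H_m: fully uniform
```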

Now, assume that "G'l" is not a pseudorandom generator; that is, there exists some circuit "C" that can distinguish between "G'l" and "Um" with an advantage "ε" = 1/"poly"("l"). In other words, this circuit can distinguish between "H0" and "Hm". Therefore, there exists an "i" such that the circuit "C" can distinguish between "Hi" and "Hi+1" with advantage at least "ε"/"m". Note that since "m" is polynomial in "l", "ε"/"m" is also inversely polynomial in "l" and is still a non-negligible advantage.

Now, assume we are given "l"+1 bits that are either the output of "Gl" or drawn from the uniform distribution. Let's reuse the approach of building large pseudorandom generators out of instances of "Gl" and construct a string of pseudorandom bits of length "m"−"i"−1 in the same way "G'l" was constructed above, using the first "l" given bits as the seed. Then, let's create a string consisting of "i" bits drawn from uniform, concatenated with the last of the given bits, followed by the constructed "m"−"i"−1 bits. The resulting output is distributed as either "Hi" or "Hi+1", since bit "i"+1 is either drawn from uniform or from "Gl". Since by assumption we can distinguish between "Hi" and "Hi+1" with non-negligible advantage, we can distinguish between "U" and "Gl", which implies that "Gl" is not a pseudorandom generator, which is a contradiction of the hypothesis. Q.E.D.

Now, let's show that, if it exists, the circuit for distinguishing between "Gl" and "Ul+1" does not have to toss random coins. As we showed above, if there exists a circuit "C" for distinguishing between "G'l" and "Um", where "m" = "poly"("l"), then there exists a circuit "C' " for distinguishing between "Gl" and "Ul+1" that uses "i" random bits. For this circuit "C' ":

Prob"u","s" ["C' "("u1",...,"ui","Gl"("s")) = 1] − Prob"u","y" ["C' "("u1",...,"ui","y") = 1] ≥ "ε"/"m",

where "u" is a string of "i" uniformly random bits, "s" is a string of "l" uniformly random bits, and "y" is a string of "l"+1 uniformly random bits.


Averaging over "u", E"u" [ Prob"s" ["C' "("u1",...,"ui","Gl"("s")) = 1] − Prob"y" ["C' "("u1",...,"ui","y") = 1] ] ≥ "ε"/"m";

which means there exists some fixed string "u" of "i" bits that can be used as "advice" to the circuit "C' " for distinguishing between "Gl" and "Ul+1".
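The averaging step can be made concrete: the best advice string achieves at least the average advantage, so a suitable fixed "u" can in principle be found by enumerating all 2"i" candidates. The advantage function below is hypothetical, purely to exercise the search:

```python
from itertools import product

def best_advice(adv_of_u, i):
    """Averaging argument made concrete: the maximum advantage over all
    2^i advice strings u is at least the average, so if the average is
    >= eps/m, some fixed u achieves >= eps/m."""
    return max(product([0, 1], repeat=i), key=adv_of_u)

# Hypothetical advantage function for illustration: depends on parity of u.
adv = lambda u: 0.2 if sum(u) % 2 == 0 else 0.05
u_star = best_advice(adv, 4)
```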

Existence of pseudorandom generators

In complexity theory, the existence of pseudorandom generators is related to the existence of one-way functions and hard-core predicates. Formally, pseudorandom generators exist if and only if one-way functions exist; in shorthand,

PRG ↔ OWF


One-way functions

Intuitively, one-way functions are functions that are easy to compute and hard to invert. In other words, the complexity (or circuit size) of the function is much smaller than that of its inverse. Formally: a function ƒ: {0,1}"n" → {0,1}"n" is ("S","ε") "one-way" if for any circuit "C" of size ≤ "S",

Prob [ƒ("C"(ƒ("x"))) = ƒ("x")] ≤ "ε".

Moreover, ƒ is a "one-way function" if
* ƒ can be computed in polynomial time
* ƒ is ("poly"("n"), 1/"poly"("n")) one-way

Hard-core predicate

A function "B": {0,1}"n" → {0,1} is a hard-core predicate for a function ƒ if
* "B" can be computed in polynomial time
* for any polynomial-size circuit "C" and any non-negligible "ε" = 1/"poly"("n"), Prob"x~U" ["C"(ƒ("x")) = "B"("x")] ≤ 1/2 + "ε"

In other words, it is hard to predict "B"("x") given only ƒ("x").


"Here an outline of the proof is given. Please see references for detailed proofs."

PRG → OWF

Consider a pseudorandom generator "Gl": {0,1}"l" → {0,1}"2l". Let's create the following one-way function ƒ: {0,1}"2l" → {0,1}"2l" that applies "Gl" to the first half of its input and ignores the second half. Formally,

ƒ("x","y") = "Gl"("x"), where |"x"| = |"y"| = "l"

A key observation that justifies this choice is that the image of ƒ has size at most 2"l", a negligible fraction of its co-domain of size 2"2l".

To prove that ƒ is indeed a one-way function, let's construct an argument by contradiction. Assume there exists a circuit "C" that inverts ƒ with advantage "ε":

Prob [ƒ("C"(ƒ("x","y"))) = ƒ("x","y")] > "ε"

Then we can create the following algorithm that distinguishes "Gl" from uniform, contradicting the hypothesis. The algorithm takes an input "z" of 2"l" bits and computes ("x","y") = "C"("z"). If "Gl"("x") = "z" the algorithm accepts; otherwise it rejects.

Now, if "z" is drawn from the uniform distribution, the probability that the above algorithm accepts is ≤ 1/2"l", since the image is a 1/2"l" fraction of the co-domain. However, if "z" was drawn from the output of "Gl", then the probability of acceptance is > "ε" by the assumed existence of circuit "C". Therefore, the advantage of this algorithm in distinguishing between the uniform "U" and the output of "Gl" is > "ε" − 1/2"l", which is non-negligible and thus contradicts our assumption of "Gl" being a pseudorandom generator. Q.E.D.
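A sketch of this reduction with toy parameters: a brute-force lookup table stands in for the hypothesized inverting circuit "C", and a hash-based placeholder stands in for "Gl" (neither is a real instantiation of the proof's objects):

```python
import hashlib
import random

l = 8  # seed length (tiny, so brute force is feasible)

def G(x_bits):
    """Toy length-doubling map {0,1}^l -> {0,1}^2l (SHA-256 placeholder,
    not a proven pseudorandom generator)."""
    d = hashlib.sha256("".join(map(str, x_bits)).encode()).digest()
    return [(d[j // 8] >> (j % 8)) & 1 for j in range(2 * l)]

# Brute-force "inverter" for f(x, y) = G(x): maps each image point back to
# a seed x.  Stands in for the hypothesized inverting circuit C.
TABLE = {tuple(G([(v >> k) & 1 for k in range(l)])): v for v in range(2 ** l)}

def distinguisher(z_bits):
    """Accept iff the inverter finds an x with G(x) = z: always true when
    z came from G, almost never true for uniform z."""
    return tuple(z_bits) in TABLE

rng = random.Random(1)
acc_prg = sum(distinguisher(G([rng.randint(0, 1) for _ in range(l)]))
              for _ in range(200))
acc_uni = sum(distinguisher([rng.randint(0, 1) for _ in range(2 * l)])
              for _ in range(200))
```

With these parameters the distinguisher accepts every generator output while accepting a uniform string with probability at most 2"l"/2"2l" = 1/2"l".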

OWF → PRG

For this case we prove a "weaker" version of the theorem:

One-way permutation → pseudorandom generator

A one-way permutation is a one-way function that is also a permutation of its domain. A pseudorandom generator can be constructed from a one-way permutation ƒ as follows:

"Gl": {0,1}"l" → {0,1}"l"+1, defined by "Gl"("x") = ƒ("x")."B"("x"), where "B" is a hard-core predicate of ƒ and "." is the concatenation operator. Note that, by the theorem proven above, it suffices to show the existence of a generator that adds just one pseudorandom bit.
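A sketch of this construction's wiring. The permutation and predicate below are toy placeholders (a bit rotation is a permutation but certainly not one-way, and parity is not hard-core); a real instantiation requires a genuine one-way permutation with its hard-core predicate:

```python
def f(bits):
    """Toy permutation of {0,1}^l: rotate left by one bit.
    A permutation, but NOT one-way -- placeholder only."""
    return bits[1:] + bits[:1]

def B(bits):
    """Placeholder predicate (parity); NOT hard-core for f above."""
    return sum(bits) % 2

def G(bits):
    """G_l(x) = f(x) . B(x): l bits in, l + 1 bits out."""
    return f(bits) + [B(bits)]

out = G([1, 0, 1, 1])  # 4 bits in, 5 bits out
```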

Hard-core predicate &rarr; PRG

First, let's show that if "B" is a hard-core predicate for ƒ then "Gl" is indeed pseudorandom. Again, we'll use an argument by contradiction.

Assume that "Gl" is not a pseudorandom generator; that is, there exists a circuit "C" of polynomial size that distinguishes "Gl"("x") = ƒ("x")."B"("x") from "Ul+1" with advantage ≥ "ε", where "ε" is non-negligible. Note that since ƒ is a permutation, if "x" is drawn from the uniform distribution, then so is ƒ("x"). Therefore, "Ul+1" is equivalent to ƒ("x")."b", where "b" is a bit drawn independently from a uniform distribution. Formally,

Prob"x~U" ["C"("G"("x"))=1] − Prob"x~U,b~U" ["C"(ƒ("x")."b")=1] ≥ "ε"

Let's construct the following algorithm "C"':

1. Given "z" = ƒ("x"), guess bit "b"
2. Run "C" on "z"."b"
3. IF "C"("z"."b") = 1
4. output "b"
5. ELSE
6. output 1 − "b"

Given the output of ƒ, the algorithm first guesses bit "b" by tossing a random coin, "i.e." Prob["b"=0] = Prob["b"=1] = 1/2. Then, the algorithm (circuit) "C" is run on ƒ("x")."b", and if the result is 1 then "b" is output; otherwise the inverse of "b" is returned.
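The predictor "C' " can be sketched as follows. The distinguisher passed in is hypothetical; the toy one used below is "perfect" (it accepts exactly when the appended bit equals the parity of "z"), so the predictor succeeds on every coin toss:

```python
import random

def predictor(z_bits, C, rng):
    """The circuit C' from the proof: guess a bit b, run the hypothesized
    distinguisher C on z.b, output b if C accepts and 1 - b otherwise."""
    b = rng.randint(0, 1)
    return b if C(z_bits + [b]) == 1 else 1 - b

# Toy "perfect" distinguisher: accepts iff the last bit is the parity of z.
C_toy = lambda zb: int(zb[-1] == sum(zb[:-1]) % 2)
rng = random.Random(3)
guess = predictor([1, 0, 1], C_toy, rng)  # parity of [1, 0, 1] is 0
```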

Then the probability of "C' " guessing "B"("x") correctly is:

Prob"x~U" ["C' "("z") = "B"("x")] =

Prob ["b"="B"("x") ∧ "C"("z"."b")=1] + Prob ["b"≠"B"("x") ∧ "C"("z"."b")=0] =

Prob ["b"="B"("x")] ⋅ Prob ["C"("z"."b")=1 | "b"="B"("x")] + Prob ["b"≠"B"("x")] ⋅ Prob ["C"("z"."b")=0 | "b"≠"B"("x")] =

1/2 ⋅ Prob ["C"("z"."b")=1 | "b"="B"("x")] + 1/2 ⋅ (1 − Prob ["C"("z"."b")=1 | "b"≠"B"("x")]) =

1/2 + 1/2 ⋅ Prob ["C"("z"."b")=1 | "b"="B"("x")] − 1/2 ⋅ Prob ["C"("z"."b")=1 | "b"≠"B"("x")] =

1/2 + Prob"z.b~G(x)" ["C"("z"."b")=1] − 1/2 ⋅ (Prob ["C"("z"."b")=1 | "b"="B"("x")] + Prob ["C"("z"."b")=1 | "b"≠"B"("x")]) =

1/2 + Prob"z.b~G(x)" ["C"("z"."b")=1] − Prob"z.b~U" ["C"("z"."b")=1] ≥ 1/2 + "ε"

This implies that the circuit "C' " can predict "B"("x") with probability greater than 1/2 + "ε", which means that "B" cannot be a hard-core predicate for ƒ, contradicting the hypothesis. Q.E.D.

OWP → hard-core predicate

The outline of the proof is as follows:

If ƒ: {0,1}"n" → {0,1}"n" is a one-way permutation, then so is ƒ': {0,1}"2n" → {0,1}"2n", where ƒ'("x","y") = ƒ("x")."y" by definition. Then "B"("x","y") = "x"⋅"y" is a hard-core predicate for ƒ', where "⋅" is a vector dot product (mod 2). To prove that it is indeed hard-core, let's assume otherwise and show a contradiction with the hypothesis of ƒ being one-way. If "B" is not a hard-core predicate, then there exists a circuit "C" that predicts it, so

Prob"x,y" ["C"(ƒ("x"),"y") = "x"⋅"y"] ≥ 1/2 + "ε".

That fact can be used to recover "x" by cleverly constructing strings "y" that "isolate" bits of "x". In fact, for a constant fraction of "x", there exists a polynomial-time algorithm that lists "O"(1/"ε"2) candidates that include all valid "x". Thus, the algorithm can invert ƒ("x") in polynomial time for a non-negligible fraction of "x", which contradicts the hypothesis.
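The bit-isolation idea in the simplest case, a perfect predictor, can be sketched as follows: querying the unit vectors "e"j isolates each bit, since "x"⋅"e"j = "x"j. Handling a predictor that is correct only on a 1/2 + "ε" fraction of inputs requires the full Goldreich-Levin algorithm; the predictor below is a hypothetical perfect one, built from a known "x" purely for illustration:

```python
def recover_x(fx, predictor, n):
    """With a PERFECT predictor C(f(x), y) = x . y (mod 2), query the unit
    vectors e_j to read off each bit of x directly."""
    bits = []
    for j in range(n):
        e_j = [1 if k == j else 0 for k in range(n)]
        bits.append(predictor(fx, e_j))
    return bits

# Hypothetical perfect predictor built from a known x (illustration only;
# fx is unused because this toy predictor "knows" x).
x = [1, 0, 1, 1, 0]
perfect = lambda fx, y: sum(a * b for a, b in zip(x, y)) % 2
recovered = recover_x(None, perfect, len(x))
```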


* W. Diffie, M. E. Hellman. "New Directions in Cryptography." IEEE Transactions on Information Theory, IT-22, pp. 644-654, 1976.
* A. C. Yao. "Theory and Application of Trapdoor Functions." 23rd IEEE Symposium on Foundations of Computer Science, pp. 80-91, 1982.
* M. Blum and S. Micali. "How to Generate Cryptographically Strong Sequences of Pseudo-Random Bits." SIAM Journal on Computing, v. 13, pp. 850-864, 1984.
* J. Håstad, R. Impagliazzo, L. A. Levin and M. Luby. "A Pseudorandom Generator from any One-way Function." SIAM Journal on Computing, v. 28, n. 4, pp. 1364-1396, 1999.
