Average-case complexity

For a deterministic algorithm, the average-case complexity (or expected time complexity) is the expected running time of the algorithm with respect to a given distribution on its inputs.
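
As a rough formalization (the notation here is illustrative and is not taken from the article itself): if A is a deterministic algorithm with running time t_A(x) on input x, and μ_n is the given input distribution restricted to inputs of length n, then the average-case running time is the expectation

:<math>T_{\mathrm{avg}}(n) = \mathbb{E}_{x \sim \mu_n}\left[ t_A(x) \right] = \sum_{|x| = n} \mu_n(x)\, t_A(x),</math>

in contrast with the worst-case running time <math>\max_{|x| = n} t_A(x)</math>. The choice of the distribution μ_n is therefore central to any average-case statement.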

Leonid Levin presented the motivation for studying average-case complexity as follows ([http://www.cs.bu.edu/fac/lnd/research/hard.htm "Intractability Concepts for Concrete Problems", Leonid Levin]):

:"Many combinatorial problems (called search or NP problems) have easy methods of checking solutions for correctness. Examples: finding factors of a long integer, or proofs of math theorems or short fast programs generating a given string. Such problems can be stated as a task to invert a given, easy to compute, function (multiplication or extraction of a theorem from its proof). In 1971 I noticed that many such problems can be proven to be as hard as the Tiling problem (which, I knew for a while, was universal, i.e. at least as hard as any search problem)...

:"A common misinterpretation of these results was that all NP-complete problems are hard, no chance for good algorithms. On this basis some such problems generated much hope in cryptography: the adversary would be helpless. Karp and others noticed that this was naive. While worst instances of NP-complete problems defeat our algorithms, such instances may be extremely rare. In fact, fast on average algorithms were found for a great many NP-complete problems. If all NP problems are easy on average, the P=?NP question becomes quite academic. Even if exponentially hard instances exist, those we could ever find might all be easy. Some problems (like factoring) seem hard for typical instances, but nothing is proven at all to support this (crucial, e.g., for cryptography) belief. These issues turned out to be subtle and it was not clear how a theory could distinguish intrinsically hard on average problems. [Levin 86], [Venkatesan, Levin STOC-88], [Impagliazzo, Levin, FOCS-90] proposed such a theory with first average case intractability results. Random (under uniform distribution) instances of some problems are now known to be as hard as random instances of any NP-problem under any samplable distribution."

Literature

The literature on average-case complexity includes the following works:
* John Franco. On the probabilistic performance of algorithms for the satisfiability problem. Information Processing Letters, 23(2):103–106, 1986.
* Leonid Levin. Average case complete problems. SIAM J. Comput., 15:285–286, 1986.
* Philippe Flajolet and J. S. Vitter. Average-case analysis of algorithms and data structures. Technical report, Institut National de Recherche en Informatique et en Automatique (INRIA), Le Chesnay, France, August 1987.
* Yuri Gurevich and Saharon Shelah. Expected computation time for Hamiltonian path problem. SIAM Journal on Computing, volume 16, issue 3, June 1987, pages 486–502. ISSN 0097-5397.
* Shai Ben-David, Benny Chor, Oded Goldreich, and Michael Luby. On the theory of average case complexity. In Proceedings of the Twenty First Annual ACM Symposium on Theory of Computing, pages 204–216. ACM, 1989.
* Yuri Gurevich. Average case completeness. Journal of Computer and System Sciences, 42:346–398, 1991. See also [http://research.microsoft.com/~gurevich/Opera/76.pdf 1989 draft] .
* D. Mitchell, B. Selman, and H. Levesque. Hard and easy distributions of SAT problems. In Proceedings of the 10th National Conference on Artificial Intelligence, pages 459–465, 1992.
* Rainer Schuler and Tomoyuki Yamakami. Structural average case complexity. In Foundations of Software Technology and Theoretical Computer Science, pages 128–139. Springer-Verlag Lecture Notes in Computer Science #652, 1992.
* Rüdiger Reischuk and Christian Schindelhauer. Precise average case complexity. In 10th Annual Symposium on Theoretical Aspects of Computer Science, pages 650–661, 1993.
* R. Venkatesan and S. Rajagopalan. Average case intractability of matrix and Diophantine problems. In 24th Annual ACM STOC, pages 632–642, May 1992.
* [http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.36.3850 Jim Cox, Lars Ericson, Bud Mishra. "The average case complexity of multilevel syllogistic", New York University, 1995]
* Russell Impagliazzo, [http://www-cse.ucsd.edu/~russell/average.ps "A personal view of average-case complexity", UC San Diego, April 17, 1995]

See also

* NP-complete problems

