Ray Solomonoff


Ray Solomonoff (born 1926, Cleveland, Ohio, son of Russian immigrants) invented [http://scholarpedia.org/article/Algorithmic_probability Algorithmic Probability] in 1960. He first described his results at a conference at Caltech in 1960 [Paper from the conference on "Cerebral Systems and Computers", California Institute of Technology, Feb 8-11, 1960, cited in "A Formal Theory of Inductive Inference, Part I", 1964, p. 1] and in a report, [http://world.std.com/~rjs/z138.pdf "A Preliminary Report on a General Theory of Inductive Inference"], in February 1960. ["A Preliminary Report on a General Theory of Inductive Inference", Report V-131, Zator Co., Cambridge, Ma., Feb 4, 1960.] He clarified these ideas more fully in his 1964 publications, "A Formal Theory of Inductive Inference," [http://world.std.com/~rjs/1964pt1.pdf Part I] and [http://world.std.com/~rjs/1964pt2.pdf Part II]. ["A Formal Theory of Inductive Inference, Part I", "Information and Control", Vol 7, No. 1, pp 1-22, March 1964, and "A Formal Theory of Inductive Inference, Part II", "Information and Control", Vol 7, No. 2, pp 224-254, June 1964.]

Although he is best known for Algorithmic Probability and his General Theory of Inductive Inference, he made other important discoveries in related fields. He wrote three [http://world.std.com/~rjs/50.pdf papers], two with Rapoport, in 1950-52, that are regarded as the earliest statistical analysis of networks. He was one of the 10 attendees at the 1956 Dartmouth Summer Research Conference on Artificial Intelligence, a seminal event for artificial intelligence as a field. There he wrote [http://world.std.com/~rjs/indinf56.pdf "An Inductive Inference Machine"], publishing a version in 1957 ["An Inductive Inference Machine," IRE Convention Record, Section on Information Theory, Part 2, pp. 56-62]. This was the first paper to be written on machine learning. He also invented probabilistic languages in the late 1950s. His most important breakthrough, however, was the invention of Algorithmic Probability. Prior to the 1960s, the usual method of calculating probability was based on frequency: taking the ratio of favorable results to the total number of trials. In his 1960 publication, and, more completely, in his 1964 publications, Solomonoff seriously revised this definition of probability. He called this new form of probability "Algorithmic Probability."

What was later called Kolmogorov Complexity was a side product of his General Theory. He described this idea in 1960: "Consider a very long sequence of symbols... We shall consider such a sequence of symbols to be 'simple' and have a high a priori probability, if there exists a very brief description of this sequence - using, of course, some sort of stipulated description method. More exactly, if we use only the symbols 0 and 1 to express our description, we will assign the probability 2^(−N) to a sequence of symbols if its shortest possible binary description contains N digits." ["A Preliminary Report on a General Theory of Inductive Inference", 1960, p. 1]
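In later notation, the quantity Solomonoff describes here became the Kolmogorov complexity of a sequence. A sketch in modern terms (the symbols U, p, and K are the later literature's conventions, not Solomonoff's own):

```latex
K(x) = \min\{\, |p| \;:\; U(p) = x \,\}, \qquad P(x) = 2^{-K(x)},
```

where U is a fixed universal machine, p ranges over binary programs, and |p| is the length of p in bits; a sequence with a very brief description thus receives a high a priori probability, exactly as in the quoted passage.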

Five years later, in 1965, the Russian mathematician Kolmogorov independently presented a similar idea. When he became aware of Solomonoff's work, he acknowledged Solomonoff's priority, and for several years Solomonoff's work was better known in the Soviet Union than in the Western world. The general consensus in the scientific community, however, was to associate this type of complexity with Kolmogorov, who was more concerned with the randomness of a sequence. Algorithmic Probability became associated with Solomonoff, who was focused on prediction - the extrapolation of a sequence.

Later in the same 1960 publication Solomonoff describes his improvement on the single-shortest-code theory. This is Algorithmic Probability. He states: "It would seem that if there are several different methods of describing a sequence, each of these methods should be given "some" weight in determining the probability of that sequence." ["A Preliminary Report on a General Theory of Inductive Inference", 1960, p. 17] He then shows how this idea can be used to generate the universal a priori probability distribution.
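The weighting idea can be illustrated with a toy calculation. This is only a sketch, not Solomonoff's construction: the "descriptions" below are hypothetical binary strings, whereas the real universal prior sums over programs for a universal machine.

```python
# Toy illustration of algorithmic probability: each binary description
# of length L contributes weight 2^-L, and a sequence's probability is
# the sum of the weights of all of its descriptions. Shorter
# descriptions dominate the sum, so simpler sequences come out more
# probable a priori.

def algorithmic_weight(descriptions):
    """Sum 2^-len(d) over all binary descriptions d of a sequence."""
    return sum(2.0 ** -len(d) for d in descriptions)

# Hypothetical example: one sequence has two short descriptions
# (lengths 3 and 5), another has only a single longer one (length 8).
simple_seq  = algorithmic_weight(["101", "11010"])    # 2^-3 + 2^-5
complex_seq = algorithmic_weight(["10111011"])        # 2^-8
```

The point of summing rather than taking only the shortest code is that every description contributes "some" weight, as in the quoted passage.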

In the early 1960s he continued to write and enlarge his theory, publishing a number of reports leading up to his publications in 1964. Although his work at first was not widely known, over the years many people have come to recognize the importance and priority of his discoveries.

In 1968 he found a proof of the efficacy of Algorithmic Probability ["Complexity-based Induction Systems: Comparisons and Convergence Theorems," "IEEE Trans. on Information Theory" IT-24, No. 4, July 1978], but, mainly because of a lack of general interest at the time, did not publish it until 10 years later. He showed that Algorithmic Probability is "complete": if there is any describable regularity in a body of data, Algorithmic Probability will eventually discover that regularity, needing a relatively small sample of that data. Algorithmic Probability is the only probability system known to be "complete" in this way. In 1986 he described how Algorithmic Probability could be used in applications to A.I. ["The Application of Algorithmic Probability to Problems in Artificial Intelligence", in Kanal and Lemmer (Eds.), "Uncertainty in Artificial Intelligence", Elsevier Science Publishers B.V., pp 473-491, 1986.]
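In the later literature this completeness result is usually stated as a convergence bound. Roughly (in the modern notation of the universal prior M, a computable generating measure μ, and complexity K, which is not the notation of the 1978 paper): the total expected squared error of M's predictions is finite, bounded by a constant multiple of the complexity of the data source,

```latex
\sum_{t=1}^{\infty} \mathbf{E}\left[ \big( M(x_t \mid x_{<t}) - \mu(x_t \mid x_{<t}) \big)^{2} \right] \;\le\; c \, K(\mu),
```

with c a small constant (ln 2 / 2 in common statements). Because the total error is finite, the predictions converge rapidly to the true probabilities, which is why only a relatively small sample of the data is needed.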

In 1970 he formed his own one-man company, Oxbridge Research, and has continued his research there except for short periods at other institutions such as MIT, the University of Saarland in Germany, and IDSIA in Switzerland. In 2003 he was the first recipient of the Computer Learning Research Centre's Kolmogorov Award and gave the inaugural Kolmogorov Lecture at Royal Holloway, University of London. Solomonoff is currently a visiting professor at the CLRC.

In 2006 he spoke at AI@50, the "Dartmouth Artificial Intelligence Conference: The Next Fifty Years," commemorating the fiftieth anniversary of the original Dartmouth summer study group. Solomonoff was one of five original participants to attend.

Most recently, in February 2008, he gave the keynote address at the conference "Current Trends in the Theory and Application of Computer Science," held at Notre Dame University in Lebanon. He followed this with a short series of lectures, and began research on new applications of Algorithmic Probability.

A description of Solomonoff's life and work prior to 1997 is in his paper "The Discovery of Algorithmic Probability," Journal of Computer and System Sciences, Vol 55, No. 1, pp 73-88, August 1997. This paper, as well as most of the others mentioned here, is available on his website at his [http://world.std.com/~rjs/pubs.html publications page].

See also

* Kolmogorov complexity
* Inductive inference
* Ming Li and Paul Vitanyi, "An Introduction to Kolmogorov Complexity and Its Applications," Springer-Verlag, N.Y., 1997, includes historical notes on Solomonoff as well as a description and analysis of his work.

External links

* [http://world.std.com/~rjs/index.html Ray Solomonoff's Homepage]
* For a detailed description of Algorithmic Probability, see [http://scholarpedia.org/article/Algorithmic_probability "Algorithmic Probability"] by Hutter, Legg and Vitanyi in Scholarpedia.
