Donald Geman


Donald Geman, Fall 1983, Paris
Born: September 20, 1943, Chicago, Illinois, USA
Residence: United States, France
Nationality: American
Fields: Mathematics, Statistics
Institutions: University of Massachusetts, Johns Hopkins University, École Normale Supérieure de Cachan
Alma mater: Columbia University, University of Illinois at Urbana-Champaign, Northwestern University
Doctoral advisor: Michael Marcus
Notable awards: ISI highly cited researcher

Donald Geman (born September 20, 1943) is an American statistician and a leading researcher in machine learning and pattern recognition. He and his brother, Stuart Geman, are known for proposing the Gibbs sampler and for giving the first proof of convergence of the simulated annealing algorithm, results published in a paper that has become one of the most highly cited works in the engineering literature.[1][2] He is a professor at Johns Hopkins University and a visiting professor at the École Normale Supérieure de Cachan.

Biography

Geman was born in Chicago in 1943. He began studying English literature at Columbia University in 1961 and transferred in 1963 to the University of Illinois at Urbana-Champaign, where he graduated in 1965. He received his Ph.D. in mathematics from Northwestern University in 1970; his dissertation was entitled "Horizontal-window conditioning and the zeros of stationary processes." He joined the University of Massachusetts Amherst in 1970 and retired from there as a distinguished professor in 2001. He then became a professor in the Department of Applied Mathematics at Johns Hopkins University, and he has also been a visiting professor at the École Normale Supérieure de Cachan since 2001. He is a Fellow of the Institute of Mathematical Statistics.

Work

D. Geman and J. Horowitz published a series of papers during the late 1970s on local times and occupation densities of stochastic processes; a survey of this work and related problems appears in the Annals of Probability.[3] In 1984, together with his brother Stuart, he published a milestone paper that remains one of the most cited papers in the engineering literature.[4] It introduced a Bayesian paradigm based on Markov random fields for the analysis of images, an approach that has been highly influential over the past two decades and remains a rare tour de force in a rapidly evolving field. In another milestone paper,[5] in collaboration with Y. Amit, he introduced the notion of randomized decision trees, which were later named random forests and popularized by Leo Breiman. His more recent work includes coarse-to-fine hierarchical cascades for object detection in computer vision[6] and the TSP (top scoring pairs) classifier, a simple and robust decision rule for classifiers trained on high-dimensional, small-sample datasets in bioinformatics.[7][8]
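
To give a concrete sense of the Gibbs sampler applied to image analysis, the sketch below denoises a binary image under an Ising-type Markov random field prior by repeatedly resampling each pixel from its conditional distribution given its four neighbours and the observed value. It is a minimal Python illustration, not code from the 1984 paper; the parameters beta and eta, the sweep count, and the toy image are assumptions chosen for the example.

import numpy as np

def gibbs_denoise(noisy, beta=2.0, eta=1.5, sweeps=20, rng=None):
    # Gibbs sampler for a binary image with an Ising (Markov random field) prior.
    # noisy: 2-D array with entries in {-1, +1}; beta: neighbour coupling; eta: data-fidelity weight.
    # (Illustrative parameter choices, not values from Geman and Geman 1984.)
    rng = np.random.default_rng() if rng is None else rng
    x = noisy.copy()
    h, w = x.shape
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                # Sum over the four nearest neighbours (missing neighbours at the border contribute nothing).
                s = 0
                if i > 0:
                    s += x[i - 1, j]
                if i < h - 1:
                    s += x[i + 1, j]
                if j > 0:
                    s += x[i, j - 1]
                if j < w - 1:
                    s += x[i, j + 1]
                # Conditional log-odds that x[i, j] = +1 given all other pixels and the observation.
                log_odds = 2.0 * (beta * s + eta * noisy[i, j])
                p_plus = 1.0 / (1.0 + np.exp(-log_odds))
                x[i, j] = 1 if rng.random() < p_plus else -1
    return x

# Toy usage: flip 10% of the pixels of a block image, then restore it.
rng = np.random.default_rng(0)
clean = -np.ones((32, 32), dtype=int)
clean[8:24, 8:24] = 1
noisy = np.where(rng.random(clean.shape) < 0.1, -clean, clean)
restored = gibbs_denoise(noisy, rng=rng)
print("pixels still wrong:", int((restored != clean).sum()))

In this sketch beta controls the smoothness of the prior and eta the fidelity to the observed pixels; a few sweeps typically restore most of the flipped pixels in the toy image.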

References

  1. ^ S. Geman and D. Geman (1984). "Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images". IEEE Transactions on Pattern Analysis and Machine Intelligence 6 (6): 721–741. doi:10.1109/TPAMI.1984.4767596.
  2. ^ Google Scholar: "Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images": http://scholar.google.com/scholar?sourceid=navclient&ie=UTF-8&rlz=1T4ADBR_enUS295US296&q=stochastic%20relaxation%20gibbs%20distribution&um=1&sa=N&tab=ws
  3. ^ D. Geman and J. Horowitz (1980). "Occupation Densities". Annals of Probability 8 (1): 1. doi:10.1214/aop/1176994824.
  4. ^ ISI Highly Cited: Donald Geman. http://hcr3.isiknowledge.com/author.cgi?&link1=Search&link2=Search%20Results&AuthLastName=geman&AuthFirstName=&AuthMiddleName=&AuthMailnstName=&CountryID=-1&DisciplineID=0&id=519
  5. ^ Y. Amit and D. Geman (1997). "Shape Quantization and Recognition with Randomized Trees". Neural Computation.
  6. ^ D. Geman and F. Fleuret (2001). "Coarse-to-Fine Face Detection". International Journal of Computer Vision.
  7. ^ D. Geman, C. d'Avignon, D. Naiman and R. Winslow (2004). "Classifying gene expression profiles from pairwise mRNA comparisons". Statistical Applications in Genetics and Molecular Biology 3: Article 19. doi:10.2202/1544-6115.1071. PMC 1989150. PMID 16646797. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=1989150.
  8. ^ A.-C. Tan, D. Naiman, L. Xu, R. Winslow and D. Geman (2005). "Simple decision rules for classifying human cancers from gene expression profiles". Bioinformatics. doi:10.1093/bioinformatics/bti631.
