Hava Siegelmann

Hava Siegelmann is a computer scientist at the University of Massachusetts, where she is director of the Biologically Inspired Neural and Dynamical Systems Lab [ [http://binds.cs.umass.edu/index.html BINDS Lab] ] . In the early 1990s she proposed a new computational model, the Artificial Recurrent Neural Network (ARNN), and proved that it could perform hypercomputation [ [http://www.cs.math.ist.utl.pt/ftp/pub/CostaJF/01-RCS-iwann.pdf Verifying Properties of Neural Networks] ] . She is considered the originator of the term Super-Turing computation, a subfield that began with her contribution. Siegelmann was also one of the originators of the well-known Support Vector Clustering algorithm, developed together with Vladimir Vapnik and colleagues. More recently she has published original work in systems biology explicating the inter-clock dynamics that cause jet lag.


She earned her BA at the Technion, her MSc at the Hebrew University and her PhD at Rutgers University, all in computer science [ [http://binds.cs.umass.edu/havaBio.html Biography at UMass] ] .

Her initial publications on the computational power of neural networks culminated in a sole-author paper in Science [ H.T. Siegelmann, "Analog Computational Power," Science, 271(19), January 1996: 373] as well as a monograph, Neural Networks and Analog Computation: Beyond the Turing Limit.



She has written 44 refereed papers in professional journals including:
* W. Bush and H.T. Siegelmann, "Circadian Synchrony in Networks of Protein Rhythm Driven Neurons," Complexity 12, Issue 1 (Sept/Oct 2006)
* T. Leise and H.T. Siegelmann, "Dynamics of a multistage circadian system," Journal of Biological Rhythms, August, 21:4 (2006), 314-323 - this attracted media attention, e.g. in the Boston Globe, Yahoo! News, Forbes, United Press International, and on National Public Radio
* A. Roitershtein, A. Ben-Hur and H.T. Siegelmann, "On probabilistic analog automata," Theoretical Computer Science, 320(2-3) pp. 449-464, June 2004
* A. Ben-Hur and H.T. Siegelmann, "Computing with Gene Networks," Chaos 14(1) pp. 145-151, March 2004 - featured in Physics News
* A. Ben-Hur, J. Feinberg, S. Fishman and H. T. Siegelmann "Random matrix theory for the analysis of the performance of an analog computer: a scaling theory," Phys. Lett. A. 323(3-4) pp. 204-209, March 2004
* A. Ben-Hur, H.T. Siegelmann and S. Fishman. "A theory of complexity for continuous time dynamics." Journal of Complexity 18(1) : 51-86, 2002
* H.T. Siegelmann, "Neural and Super-Turing Computing," Philosophy 2002
* H.T. Siegelmann, "Analog Computational Power," Science, 271(19), January 1996: 373 - responding to comments on her earlier article
* H.T. Siegelmann, "Computation Beyond the Turing Limit," Science, 268(5210), 28 April 1995: 545-548
* H.T. Siegelmann and E.D. Sontag, "Analog Computation via Neural Networks," Theoretical Computer Science, 131, 1994: 331-360
* H.T. Siegelmann and E.D. Sontag, "Turing Computability with Neural Networks," Applied Mathematics Letters, 4(6), 1991: 77-80

In addition, she has presented numerous papers at conferences.


She is the author of a book:
* Neural Networks and Analog Computation: Beyond the Turing Limit, Birkhäuser, Boston, December 1998, ISBN 0-8176-3949-7

She has contributed 18 book chapters including:
* "Neural Computing," in New Trends in Computer Science, Gheorghe Păun (ed.), 2003
* "Neural Automata and Computational Complexity," in Handbook of Brain Theory and Neural Networks, Michael A. Arbib (ed.), 2002
* "Finite vs. Infinite Descriptive Length in Neural Networks and the Associated Computational Complexity," in Finite vs. Infinite: Contributions to an Eternal Dilemma, C. Calude and Gh. Paun (eds.), Springer Verlag, 2000
* "Computability with Neural Networks," in Lectures in Applied Mathematics, Vol. 32, J. Renegar, M. Shub, and S. Smale (eds.), American Mathematical Society, 1996: 733-747
* "Recurrent Neural Networks," in The 1000th Volume of Lecture Notes in Computer Science: Computer Science Today, Jan van Leeuwen (ed.), Springer Verlag, 1995: 29-45

Notes & References
