Publication


Featured research published by Hava T. Siegelmann.


Theoretical Computer Science | 1994

Analog computation via neural networks

Hava T. Siegelmann; Eduardo D. Sontag

We pursue a particular approach to analog computation, based on dynamical systems of the type used in neural networks research. Our systems have a fixed structure, invariant in time, corresponding to an unchanging number of “neurons”. If allowed exponential time for computation, they turn out to have unbounded power. Under polynomial-time constraints, however, there are limits on their capabilities, though they remain more powerful than Turing machines. (A similar but more restricted model was shown to be polynomial-time equivalent to classical digital computation in previous work (Siegelmann and Sontag, 1992).) Moreover, there is a precise correspondence between nets and standard nonuniform circuits with equivalent resources, and as a consequence one has lower-bound constraints on what they can compute. This relationship is perhaps surprising, since our analog devices do not change in any manner with input size. We note that these networks are not likely to solve NP-hard problems in polynomial time, as the equality P = NP in our model would imply the almost complete collapse of the standard polynomial hierarchy. In contrast to classical computational models, the models studied here exhibit at least some robustness with respect to noise and implementation errors.
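To make the model concrete, here is a minimal Python sketch (not the paper's construction) of one synchronous update of such a fixed-structure analog net, using the saturated-linear activation of the model; the weights, sizes, and input stream are illustrative placeholders.

```python
import numpy as np

def sat_linear(x):
    """Saturated-linear activation: identity on [0, 1], clipped outside."""
    return np.clip(x, 0.0, 1.0)

def step(x, u, W, M, b):
    """One synchronous update of a fixed-structure analog net: every
    "neuron" applies sat_linear to an affine combination of the previous
    states x and the current input u. Only the real-valued states evolve;
    the architecture never changes with input size."""
    return sat_linear(W @ x + M @ u + b)

# Arbitrary illustrative wiring: 3 neurons, 1 binary input line.
rng = np.random.default_rng(0)
W, M, b = rng.normal(size=(3, 3)), rng.normal(size=(3, 1)), rng.normal(size=3)
x = np.zeros(3)
for t in range(5):
    x = step(x, np.array([float(t % 2)]), W, M, b)
    print(t, x)
```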


IEEE Transactions on Systems, Man, and Cybernetics, Part B | 1997

Computational capabilities of recurrent NARX neural networks

Hava T. Siegelmann; Bill G. Horne; C. L. Giles

Recently, fully connected recurrent neural networks have been proven to be computationally rich: at least as powerful as Turing machines. This work focuses on another network which is popular in control applications and has been found to be very effective at learning a variety of problems. These networks are based upon Nonlinear AutoRegressive models with eXogenous inputs (NARX models) and are therefore called NARX networks. As opposed to other recurrent networks, NARX networks have limited feedback, which comes only from the output neuron rather than from hidden states. They are formalized by

y(t) = Ψ(u(t − n_u), …, u(t − 1), u(t), y(t − n_y), …, y(t − 1)),

where u(t) and y(t) represent the input and output of the network at time t, n_u and n_y are the input and output orders, and the function Ψ is the mapping performed by a multilayer perceptron. We constructively prove that NARX networks with a finite number of parameters are computationally as strong as fully connected recurrent networks, and thus as Turing machines. We conclude that, in theory, one can use NARX models rather than conventional recurrent networks without any computational loss, even though their feedback is limited. Furthermore, these results raise the issue of what amount of feedback or recurrence is necessary for any network to be Turing equivalent, and of what restrictions on feedback limit computational power.
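As a reading aid, the following is a hedged Python sketch of the NARX recurrence above, with a one-hidden-layer perceptron standing in for Ψ; the parameter shapes and random initialization are illustrative assumptions, not the authors' construction.

```python
import numpy as np
from collections import deque

def mlp(z, W1, b1, W2, b2):
    """One-hidden-layer perceptron standing in for the mapping Psi."""
    return W2 @ np.tanh(W1 @ z + b1) + b2

def narx_run(inputs, n_u, n_y, params):
    """Run y(t) = Psi(u(t-n_u), ..., u(t), y(t-n_y), ..., y(t-1)):
    the only feedback is a tapped delay line of past *outputs*;
    no hidden state is fed back."""
    W1, b1, W2, b2 = params
    u_taps = deque([0.0] * (n_u + 1), maxlen=n_u + 1)  # u(t-n_u) .. u(t)
    y_taps = deque([0.0] * n_y, maxlen=n_y)            # y(t-n_y) .. y(t-1)
    ys = []
    for u_t in inputs:
        u_taps.append(u_t)
        z = np.array(list(u_taps) + list(y_taps))
        y_t = mlp(z, W1, b1, W2, b2).item()
        y_taps.append(y_t)
        ys.append(y_t)
    return ys

# Random illustrative parameters: input order 2, output order 2, 8 hidden units.
rng = np.random.default_rng(0)
n_u, n_y, h = 2, 2, 8
d = (n_u + 1) + n_y
params = (rng.normal(size=(h, d)), rng.normal(size=h),
          rng.normal(size=(1, h)), rng.normal(size=1))
print(narx_run([0.0, 1.0, 1.0, 0.0, 1.0], n_u, n_y, params))
```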


Science | 1995

Computation Beyond the Turing Limit

Hava T. Siegelmann

Extensive efforts have been made to prove the Church-Turing thesis, which suggests that all realizable dynamical and physical systems cannot be more powerful than classical models of computation. A simply described but highly chaotic dynamical system called the analog shift map is presented here, which has computational power beyond the Turing limit (super-Turing); it computes exactly like neural networks and analog machines. This dynamical system is conjectured to describe natural physical phenomena.
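For intuition, here is a toy Python sketch of a generalized shift, the kind of dynamical map that the analog shift extends; the rule table is hypothetical, and a true analog shift allows infinite replacement strings, which this finite toy only truncates.

```python
def gshift_step(left, right, rules):
    """One step of a toy generalized shift on a dotted sequence
    ... left[1] left[0] . right[0] right[1] ...
    `rules` maps the symbol at the dot to (replacement, shift), with
    shift in {-1, 0, +1}: substitute at the dot, then move the dot.
    The analog shift map extends this by allowing infinite
    replacement strings."""
    repl, shift = rules[right[0]]
    right = list(repl) + right[1:]                   # substitute at the dot
    if shift == 1:
        left, right = [right[0]] + left, right[1:]   # dot moves right
    elif shift == -1:
        left, right = left[1:], [left[0]] + right    # dot moves left
    return left, right

# Hypothetical rule table over the alphabet {0, 1}.
rules = {'0': ('1', 1), '1': ('0', -1)}
left, right = list('00'), list('1011')               # left stored dot-outward
for _ in range(4):
    left, right = gshift_step(left, right, rules)
    print(''.join(reversed(left)), '.', ''.join(right))
```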


Applied Mathematics Letters | 1991

Turing computability with neural nets

Hava T. Siegelmann; Eduardo D. Sontag

This paper shows the existence of a finite neural network, made up of sigmoidal neurons, which simulates a universal Turing machine. It is composed of fewer than 10^5 synchronously evolving processors, interconnected linearly. High-order connections are not required.


Birkhäuser | 1999

Neural Networks and Analog Computation

Hava T. Siegelmann

Formally, an automaton is defined by the above data. Thus, as a mathematical object, an automaton is simply the quintuple (Q, Σ, δ, q₀, F) of states, input alphabet, transition function, initial state, and accepting states.


Conference on Learning Theory | 1992

On the computational power of neural nets

Hava T. Siegelmann; Eduardo D. Sontag

This paper deals with finite networks which consist of interconnections of synchronously evolving processors. Each processor updates its state by applying a “sigmoidal” scalar nonlinearity to a linear combination of the previous states of all units. We prove that one may simulate all Turing machines by rational nets. In particular, one can do this in linear time, and there is a net made up of about 1,000 processors which computes a universal partial-recursive function. Products (high-order nets) are not required, contrary to what had been stated in the literature. Furthermore, we assert a similar theorem about non-deterministic Turing machines. Consequences for undecidability and complexity issues about nets are also discussed.
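The classical constructions behind such simulations encode an unbounded binary stack as a rational number in a Cantor-like set, with push, top, and pop implemented by affine maps plus a single saturated-linear unit. Below is a minimal sketch of that encoding trick (not the paper's full network), in exact rational arithmetic.

```python
from fractions import Fraction

def sat(x):
    """Saturated-linear response, the net's only nonlinearity."""
    return min(max(x, Fraction(0)), Fraction(1))

def push(q, bit):
    """Push a bit: an affine map keeps q inside a Cantor-like set."""
    return q / 4 + Fraction(2 * bit + 1, 4)

def top(q):
    """Read the top bit with a single saturated-linear unit."""
    return sat(4 * q - 2)

def pop(q):
    """Drop the top bit: affine in q and top(q)."""
    return 4 * q - (2 * top(q) + 1)

q = Fraction(0)           # empty stack
for b in [1, 0, 1]:       # push 1, then 0, then 1 (now on top)
    q = push(q, b)
print(top(q))             # 1: the last bit pushed
print(pop(q))             # 7/16: the encoding of the remaining stack [0, 1]
```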


Information & Computation | 1996

The Dynamic Universality of Sigmoidal Neural Networks

Joe Kilian; Hava T. Siegelmann

We investigate the computational power of recurrent neural networks that apply the sigmoid activation function σ(x) = 2/(1 + e^(−x)) − 1. These networks are extensively used in automatic learning of non-linear dynamical behavior. We show that in the noiseless model there exists a universal architecture that can be used to compute any recursive (Turing) function. This is the first result of its kind for the sigmoid activation function; previous techniques applied only to linearized and truncated versions of this function. The significance of our result, besides the proof technique itself, lies in the popularity of the sigmoidal function both in engineering applications of artificial neural networks and in biological modelling. Our techniques can be applied to a much more general class of “sigmoidal-like” activation functions, suggesting that Turing universality is a relatively common property of recurrent neural network models.
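Incidentally, this activation is just a rescaled hyperbolic tangent, σ(x) = tanh(x/2); a two-line numerical check:

```python
import numpy as np

# sigma(x) = 2/(1 + e^(-x)) - 1 is exactly tanh(x/2); verify numerically.
x = np.linspace(-10.0, 10.0, 1001)
sigma = 2.0 / (1.0 + np.exp(-x)) - 1.0
assert np.allclose(sigma, np.tanh(x / 2.0))
```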


international conference on pattern recognition | 2000

A support vector clustering method

Asa Ben-Hur; D. Horn; Hava T. Siegelmann; Vladimir Vapnik

We present a novel kernel method for data clustering using a description of the data by support vectors. The kernel reflects a projection of the data points from data space to a high dimensional feature space. Cluster boundaries are defined as spheres in feature space, which represent complex geometric shapes in data space. We utilize this geometric representation of the data to construct a simple clustering algorithm.
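A minimal sketch of the sphere-fitting step with a Gaussian kernel is given below; it solves the standard SVDD dual with a generic SciPy optimizer (the paper does not prescribe this solver, and the kernel width q and penalty C are illustrative assumptions).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def svc_sphere(X, q=2.0, C=1.0):
    """Sphere step of support vector clustering: fit the smallest sphere
    enclosing the images of X under the Gaussian kernel map
    K(x, y) = exp(-q * ||x - y||^2), via the standard dual
    min_beta beta^T K beta  s.t.  sum(beta) = 1, 0 <= beta_i <= C
    (K(x, x) = 1 for this kernel, so the linear term is constant)."""
    K = np.exp(-q * cdist(X, X, "sqeuclidean"))
    n = len(X)
    res = minimize(
        lambda b: b @ K @ b,
        x0=np.full(n, 1.0 / n),
        jac=lambda b: 2.0 * K @ b,
        bounds=[(0.0, C)] * n,
        constraints=[{"type": "eq", "fun": lambda b: b.sum() - 1.0}],
        method="SLSQP",
    )
    beta = res.x

    def dist2(Y):
        """Squared feature-space distance from phi(y) to the sphere center."""
        Kyx = np.exp(-q * cdist(Y, X, "sqeuclidean"))
        return 1.0 - 2.0 * Kyx @ beta + beta @ K @ beta

    return beta, dist2

X = np.random.default_rng(1).normal(size=(30, 2))
beta, dist2 = svc_sphere(X)
r2 = dist2(X).max()  # squared radius (no bounded support vectors at C = 1)
```

Cluster labels then follow geometrically: two points share a cluster when the straight segment between them stays inside the sphere (checked by sampling dist2 along the segment), and clusters are the connected components of that relation.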


Chaos | 2001

Symbolic dynamics and computation in model gene networks.

Roderick Edwards; Hava T. Siegelmann; K. Aziza; Leon Glass

We analyze a class of ordinary differential equations representing a simplified model of a genetic network. In this network, the model genes control the production rates of other genes by a logical function. The dynamics in these equations are represented by a directed graph on an n-dimensional hypercube (n-cube) in which each edge is directed in a unique orientation. The vertices of the n-cube correspond to orthants of state space, and the edges correspond to boundaries between adjacent orthants. The dynamics in these equations can be represented symbolically: starting from a point on the boundary between neighboring orthants, the equation is integrated until the boundary is crossed for a second time. Each different cycle (a sequence of orthants traversed between two successive returns to the same boundary) generates a different letter of the alphabet. A word consists of a sequence of letters corresponding to a possible sequence of orthants that arise from integration of the equation starting and ending on the same boundary, and the union of the words defines the language. Letters and words correspond to analytically computable Poincaré maps of the equation. This formalism allows us to define bifurcations of chaotic dynamics of the differential equation that correspond to changes in the associated language. Qualitative knowledge about the dynamics found by integrating the equation can be used to help solve the inverse problem of determining the underlying network generating the dynamics. This work places the study of dynamics in genetic networks in a context comprising both nonlinear dynamics and the theory of computation.
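To illustrate the symbolic description, here is a toy Python sketch of a two-gene network of this piecewise-linear (Glass) type; inside each orthant the flow relaxes toward an orthant-dependent focal point, so boundary-crossing times have closed forms and the visited orthants give the symbolic itinerary. The focal-point table is hypothetical.

```python
import numpy as np

def glass_orbit(x0, focal, n_transitions=8):
    """Follow a piecewise-linear (Glass-type) gene network dx/dt = -x + F(s),
    where the focal point F depends only on the current orthant
    s = sign(x). Inside an orthant the solution is
    x(t) = F + (x0 - F) * exp(-t), so boundary-crossing times are exact
    and the sequence of visited orthants is the symbolic itinerary."""
    x = np.asarray(x0, dtype=float)
    itinerary = []
    for _ in range(n_transitions):
        s = tuple(bool(v) for v in x > 0)     # orthant = hypercube vertex
        itinerary.append(s)
        F = np.asarray(focal[s], dtype=float)
        # coordinate i can cross zero only if x_i and F_i have opposite signs
        with np.errstate(divide="ignore", invalid="ignore"):
            t_exit = np.where(x * F < 0, np.log((F - x) / F), np.inf)
        t = t_exit.min()
        if not np.isfinite(t):                # flow converges inside this orthant
            break
        x = F + (x - F) * np.exp(-(t + 1e-9)) # step just past the wall
    return itinerary

# Hypothetical 2-gene network with cyclic focal points (a toy oscillator).
focal = {
    (False, False): (+1.0, -1.0), (True, False): (+1.0, +1.0),
    (True, True):   (-1.0, +1.0), (False, True): (-1.0, -1.0),
}
print(glass_orbit([0.5, -0.5], focal))
```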


Physica D: Nonlinear Phenomena | 1998

Analog computation with dynamical systems

Hava T. Siegelmann; Shmuel Fishman

Physical systems exhibit various levels of complexity: their long-term dynamics may converge to fixed points or exhibit complex chaotic behavior. This paper presents a theory that enables one to interpret natural processes as special-purpose analog computers. Since physical systems are naturally described in continuous time, a definition of computational complexity for continuous-time systems is required. In analogy with the classical discrete theory, we develop the fundamentals of computational complexity for dynamical systems, discrete or continuous in time, on the basis of an intrinsic time scale of the system. Dissipative dynamical systems are classified into the computational complexity classes P_d, Co-RP_d, NP_d and EXP_d, corresponding to their standard counterparts, according to the complexity of their long-term behavior. The complexity of chaotic attractors relative to regular ones leads to the conjecture P_d ≠ NP_d. Continuous-time flows have proven useful in solving various practical problems. Our theory provides the tools for an algorithmic analysis of such flows; as an example, we analyze the continuous Hopfield network.
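As a small illustration of the example analyzed in the paper, the sketch below integrates a continuous Hopfield flow and reads off the fixed point it converges to; the weights and inputs are arbitrary assumptions, and convergence is measured in the system's own time scale.

```python
import numpy as np
from scipy.integrate import solve_ivp

def hopfield_flow(W, I, u0, t_max=50.0):
    """Integrate the continuous Hopfield network du/dt = -u + W tanh(u) + I
    (W symmetric, so the flow is gradient-like and converges to a fixed
    point); the attractor reached is read off as the output."""
    rhs = lambda t, u: -u + W @ np.tanh(u) + I
    return solve_ivp(rhs, (0.0, t_max), u0, rtol=1e-9, atol=1e-9)

# Toy symmetric two-unit instance (hypothetical weights and bias).
W = np.array([[0.0, 1.5], [1.5, 0.0]])
I = np.array([0.1, -0.1])
sol = hopfield_flow(W, I, np.array([0.2, -0.3]))
u_star = sol.y[:, -1]
print(u_star, np.linalg.norm(-u_star + W @ np.tanh(u_star) + I))  # ~fixed point
```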

Collaboration


Dive into Hava T. Siegelmann's collaborations.

Top Co-Authors

Asa Ben-Hur, Colorado State University
Megan M. Olsen, University of Massachusetts Amherst
Shmuel Fishman, Technion – Israel Institute of Technology
Bhaskar DasGupta, University of Illinois at Chicago
Kyle Ira Harrington, University of Massachusetts Amherst
P. Taylor, University of Massachusetts Amherst