Expressive Power of Nondeterministic Recurrent Neural Networks in Terms of their Attractor Dynamics
Jérémie Cabessa and Jacques Duparc
We provide a characterization of the expressive power of several models of nondeterministic recurrent neural networks in terms of their attractor dynamics. More precisely, we consider two forms of nondeterminism. In the first case, nondeterminism is expressed as an external binary guess stream processed by means of an additional Boolean guess cell. In the second case, nondeterminism is expressed as a set of possible evolving patterns that the synaptic connections of the network might follow over the successive time steps. In these two contexts, ten models of nondeterministic neural networks are considered, according to the nature of their synaptic weights. Overall, we prove that the static rational-weighted neural networks of type 1 are computationally equivalent to nondeterministic Muller Turing machines: they recognize the class of all effectively analytic (lightface Σ¹₁) sets. The nine other models of analog and/or evolving neural networks of types 1 and 2 are all computationally equivalent to each other, and strictly more powerful than nondeterministic Muller Turing machines: they recognize the class of all analytic (boldface Σ¹₁) sets.
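As an illustration of the first form of nondeterminism, the dynamics of such a network can be sketched along the lines of the standard sigmoidal recurrent model (a sketch under assumed notation: the saturated-linear activation σ is the usual choice in this line of work, and the guess cell g and weight names a, b, c, d are illustrative, not taken from the abstract):

```latex
x_i(t+1) \;=\; \sigma\!\left( \sum_{j=1}^{N} a_{ij}\, x_j(t) \;+\; \sum_{j=1}^{M} b_{ij}\, u_j(t) \;+\; c_i\, g(t) \;+\; d_i \right),
\qquad
\sigma(x) \;=\;
\begin{cases}
0 & \text{if } x < 0,\\
x & \text{if } 0 \le x \le 1,\\
1 & \text{if } x > 1,
\end{cases}
```

where the x_i are the cell activations, the u_j the input bits, and g(t) ∈ {0, 1} the external guess bit read at time t; the nondeterministic behavior of the network arises from ranging over all possible infinite guess streams g(0) g(1) g(2) ….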
Keywords: Recurrent neural networks, neural computation, analog computation, evolving systems, attractors, spatiotemporal patterns, Turing machines, expressive power, ω-languages, Borel sets, analytic sets.