next up previous contents index
Next: Turing computability with DTRNN Up: Featured papers Previous: #omlin96j3###   Contents   Index


Kremer (1996b):

As has been discussed earlier in this chapter, certain DTRNN architectures may not be capable of representing all possible FSM. Recurrent cascade correlation (RCC) (see section 3.4.3) is both the name of a learning algorithm that constructs a DTRNN during learning and the name of the class of architectures generated by this algorithm. A number of papers have addressed the representational capabilities of RCC networks (Giles et al., 1995; Kremer, 1996a; Kremer, 1996b) by defining a class of FSM that cannot be represented in RCC nets; the class grows in generality from one paper to the next. A general formulation is presented by Kremer (1996b) (http://www.dlsi.ua.es/~mlf/nnafmc/papers/kremer96finite.pdf), but the class of FSM that can be represented in this architecture still remains to be characterized. According to Kremer (1996b), whose proof proceeds by reductio ad absurdum, RCC nets cannot represent any FSM which, in response to input strings whose symbols repeat with a period $p$, must produce output strings with a period $\omega$ such that $p\bmod\omega\ne 0$ (if $p$ is even) or $2p\bmod\omega\ne 0$ (if $p$ is odd); intuitively, each RCC unit is only self-recurrent, so under periodic driving its state settles into a cycle whose period divides $p$ (for even $p$) or $2p$ (for odd $p$). One such automaton is the odd parity automaton (see figure 2.3): in response to the input ``101010...'' (period $2$), it must output ``110011001100...'' (period $4$), and $2\bmod 4\ne 0$. (A constant input ``11111...'', period $1$, does not by itself expose the limitation, since the resulting output period $2$ divides $2p=2$.)
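The periodicity condition can be checked mechanically. Below is a minimal Python sketch (not code from Kremer's paper; the function names are illustrative) that runs the odd parity automaton over a periodic input string and measures the period of the resulting output string:

```python
def parity_outputs(bits):
    """Odd parity automaton: after each input bit, output 1 iff the
    number of 1s read so far is odd (state = running parity)."""
    state = 0
    out = []
    for b in bits:
        state ^= b          # a 1 flips the state, a 0 leaves it unchanged
        out.append(state)
    return out

def period(seq):
    """Smallest p such that seq repeats with period p."""
    n = len(seq)
    for cand in range(1, n + 1):
        if all(seq[i] == seq[i - cand] for i in range(cand, n)):
            return cand
    return n

# Input "1010..." has period p = 2; the output "1100 1100 ..." has period 4,
# and 2 mod 4 != 0, so this FSM lies outside the representable class.
print(period(parity_outputs([1, 0] * 8)))
# Constant input "1111..." (p = 1) gives output period 2, which divides 2p,
# so that input alone is consistent with representability.
print(period(parity_outputs([1] * 8)))
```

This makes concrete why parity is a counterexample even though a single self-recurrent unit can oscillate with period 2 under a constant input: it is the period-2 input that forces an output period the architecture cannot produce.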

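The divisibility restriction stems from the dynamics of a single self-recurrent sigmoid unit, the building block RCC adds one at a time: driven by an input of period $p$, its state settles into a cycle whose period divides $p$ when $p$ is even and $2p$ when $p$ is odd (a negative self-weight makes each step map decreasing, so the $p$-fold composite map is increasing for even $p$ and decreasing for odd $p$). The following small simulation (hypothetical weights, chosen only to force oscillation; not code from the papers cited) illustrates this:

```python
import math

def simulate_unit(w_self, w_in, bias, inputs, steps=5000):
    """Iterate one self-recurrent sigmoid unit,
    x(t) = sigma(w_self*x(t-1) + w_in*u(t) + bias),
    driven by a cyclically repeated input sequence."""
    x = 0.2                      # off the unstable fixed point
    p = len(inputs)
    traj = []
    for t in range(steps):
        x = 1.0 / (1.0 + math.exp(-(w_self * x + w_in * inputs[t % p] + bias)))
        traj.append(x)
    return traj

def settled_period(traj, tail=200, tol=1e-4):
    """Smallest period of the trailing (settled) part of the trajectory."""
    seg = traj[-tail:]
    for cand in range(1, tail // 2):
        if all(abs(seg[i] - seg[i - cand]) < tol for i in range(cand, tail)):
            return cand
    return None

# Constant input, p = 1 (odd): a strong negative self-weight yields a
# period-2 oscillation, and 2 divides 2p = 2.
print(settled_period(simulate_unit(-8.0, 0.0, 4.0, [1.0])))
# p = 2 (even): the settled period divides p = 2.
print(settled_period(simulate_unit(-8.0, 2.0, 3.0, [1.0, 0.0])))
# p = 3 (odd): the settled period divides 2p = 6.
print(settled_period(simulate_unit(-8.0, 2.0, 3.0, [1.0, 0.0, 0.0])))
```

Since every later RCC unit sees only periodic forcing from the input and from earlier units, the divisibility property propagates through the cascade to the network's output.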
