Computing Reviews
Associative learning on a continuum in evolved dynamical neural networks
Izquierdo E., Harvey I., Beer R. Adaptive Behavior 16(6): 361-384, 2008. Type: Article
Date Reviewed: May 15, 2009

The primary goal of this well-written paper is to demonstrate that it is possible to model learning over a continuous range of inputs, using a continuous-time recurrent neural network (CTRNN) with fixed weights. “Learning” in this context means the ability to alter behavior in response to changes in the environment, specifically an environment with a continuum of different states. Using Harvey’s microbial genetic algorithm, the authors evolve the synaptic weights and time constants of a CTRNN that simulates the learning behavior of the nematode worm C. elegans. The worm is able to associate a particular temperature with food and the corresponding feeding behavior, and to change this pairing when the environment (the temperature/food correlation) changes.
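
For readers who want the flavor of the approach, here is a minimal Python sketch of the two ingredients: the standard CTRNN update (Euler-integrated) and one tournament of Harvey's microbial genetic algorithm. The step size, mutation settings, and parameter names are illustrative assumptions, not the authors' actual configuration.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ctrnn_step(y, W, theta, tau, I, dt=0.1):
    """One Euler step of the standard CTRNN dynamics:
    tau_i * dy_i/dt = -y_i + sum_j W[j, i] * sigmoid(y_j + theta_j) + I_i.
    The weights W stay fixed throughout; only the state y evolves."""
    dydt = (-y + sigmoid(y + theta) @ W + I) / tau
    return y + dt * dydt

def microbial_tournament(pop, fitness, rng, rec_prob=0.5, mut_sd=0.05):
    """One tournament of a microbial GA: two genotypes are picked at
    random, the loser is partially overwritten by the winner's genes
    and then mutated; the winner is left untouched."""
    a, b = rng.choice(len(pop), size=2, replace=False)
    win, lose = (a, b) if fitness(pop[a]) >= fitness(pop[b]) else (b, a)
    mask = rng.random(pop[lose].shape) < rec_prob   # recombination
    pop[lose][mask] = pop[win][mask]
    pop[lose] += rng.normal(0.0, mut_sd, pop[lose].shape)  # mutation

Evolving a network here means encoding W, theta, and tau as the gene vector, with a fitness function that scores how well the resulting network reproduces the pairing behavior.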

In an earlier paper [1], the first two authors published the first example of learning on a continuum with fixed-weight CTRNNs; this new paper is much more comprehensive, including an extensive examination of the dynamics of the learning behavior through snapshots of the node activations at different points in the pairing and testing sequences. The ability to learn without plasticity in the weights is a consequence of the interplay between the network nodes’ time constants and the environment, and it displays a number of characteristics of real learning, such as exponential memory decay curves. A lengthy discussion section at the end of the paper explores these issues in depth.
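
The exponential decay, at least, falls out of the node equation itself: with input held at zero, tau * dy/dt = -y has the solution y(t) = y(0) * exp(-t / tau), so a node with a large evolved time constant retains a fading trace of past stimulation even though no weight ever changes. A small illustration (the time constant is arbitrary):

import numpy as np

# A leaky node with no input relaxes exponentially at a rate set by
# its own time constant; the trace below is a weightless "memory."
tau, y0 = 50.0, 1.0
for t in (0, 25, 50, 100, 200):
    print(f"t={t:3d}  y={y0 * np.exp(-t / tau):.3f}")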

Comparing the continuous learning network with one evolved for a simple discrete case (just two temperature/food pairings) leads the authors to propose a state transition model they call a continuous-state machine to describe the learning behavior. This “continuous manifold of finite-state machines” is a direct analog of the finite-state machine that arises naturally in the discrete case.
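
To make the analogy concrete, a hypothetical rendering of the discrete two-pairing case as an ordinary finite-state machine might look like the following; in the continuous case, the discrete state label is in effect replaced by a real value (the temperature currently associated with food), giving a continuum of such machines. The state and event names are invented for illustration and do not come from the paper.

# The internal state records which temperature is currently paired
# with food; a new pairing event moves the machine between states.
SEEK_COLD, SEEK_HOT = "seek_cold", "seek_hot"

TRANSITIONS = {
    (SEEK_COLD, "food_at_cold"): SEEK_COLD,
    (SEEK_COLD, "food_at_hot"):  SEEK_HOT,   # re-pairing flips the state
    (SEEK_HOT,  "food_at_hot"):  SEEK_HOT,
    (SEEK_HOT,  "food_at_cold"): SEEK_COLD,
}

def observe_pairing(state, event):
    return TRANSITIONS[(state, event)]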

Full descriptions of the best-evolved networks are provided in an appendix. The paper is well illustrated and documented with an extensive list of references.

Reviewer: R. Roos
Review #: CR136837 (1001-0082)
1) Izquierdo, E.; Harvey, I. Learning on a continuum in evolved dynamical node networks. In Artificial Life X: Proceedings of the 10th International Conference on the Simulation and Synthesis of Living Systems. MIT Press, 2006, 507–512.
 
Connectionism And Neural Nets (I.2.6)
Automata (F.1.1)
Modeling Methodologies (I.6.5)
Model Development (I.6.5)
Models Of Computation (F.1.1)
Robotics (I.2.9)