Towards the open ended evolution of neural networks

Lucas, S. M. “Towards the open ended evolution of neural networks.” (1995): 388-393.

A framework is described that allows the completely open-ended evolution of neural network architectures, based on an active weight neural network model. In this approach, there is no separate learning algorithm; learning proceeds (if at all) as an intrinsic part of the network behaviour. This has interesting applications in the evolution of neural nets, since it is now possible to evolve all aspects of a network (including the learning 'algorithm') within a single unified paradigm. As an example, a grammar is given for growing a multilayer perceptron with active weights that has the error back-propagation learning algorithm embedded in its structure.
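
The abstract's central idea, learning realised as part of the network's own dynamics rather than applied by an external training procedure, can be sketched concretely. Below is a minimal, hypothetical NumPy illustration, not the paper's grammar-grown architecture: each layer's weights update themselves with a backprop-style delta rule whenever an error signal flows back through them, so there is no separate optimiser object. The class name ActiveWeightLayer, the learning rate, and the XOR task are illustrative assumptions, not taken from the paper.

    # Sketch: weights as stateful units that update during signal flow.
    # Assumed details (not from the paper): layer sizes, learning rate,
    # tanh activations, and the XOR demonstration task.
    import numpy as np

    class ActiveWeightLayer:
        """Fully connected layer whose weights update themselves
        whenever an error signal propagates back through them."""

        def __init__(self, n_in, n_out, lr=0.1, rng=None):
            rng = rng or np.random.default_rng(0)
            self.w = rng.normal(0.0, 0.5, size=(n_in, n_out))
            self.lr = lr
            self.x = None  # last input, held as local state
            self.y = None  # last output, held as local state

        def forward(self, x):
            self.x = x
            self.y = np.tanh(x @ self.w)
            return self.y

        def backward(self, delta_out):
            # Error signal arriving at this layer's outputs.
            local = delta_out * (1.0 - self.y ** 2)  # tanh derivative
            delta_in = local @ self.w.T              # signal sent upstream
            # The "active" part: the weight adjusts itself as the error
            # signal flows through -- no separate learning algorithm.
            self.w += self.lr * np.outer(self.x, local)
            return delta_in

    # Two-layer network learning XOR purely through its own dynamics.
    layers = [ActiveWeightLayer(2, 4), ActiveWeightLayer(4, 1)]
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    for epoch in range(5000):
        for x, t in zip(X, T):
            a = x
            for layer in layers:
                a = layer.forward(a)
            delta = t - a  # error signal injected at the output
            for layer in reversed(layers):
                delta = layer.backward(delta)

    for x in X:
        a = x
        for layer in layers:
            a = layer.forward(a)
        print(x, "->", round(float(a[0]), 3))

Running the sketch prints outputs near 0, 1, 1, 0 for the four XOR patterns; the point is that training emerges from forward and backward signal flow alone, which is what makes the whole network, update rule included, a single evolvable structure.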
