
Evolving Parsimonious Networks by Mixing Activation Functions

  • Neuroevolution methods evolve the weights of a neural network, and in some cases the topology, but little work has been done to analyze the effect of evolving the activation functions of individual nodes on network size, an important factor when training networks with a small number of samples. In this work we extend the neuroevolution algorithm NEAT to evolve the activation function of each neuron in addition to the topology and weights of the network. The size and performance of networks produced by NEAT with a uniform activation function in all nodes (homogeneous networks) are compared to those of networks containing a mixture of activation functions (heterogeneous networks). For a number of regression and classification benchmarks it is shown that (1) qualitatively different activation functions lead to different results in homogeneous networks, (2) the heterogeneous version of NEAT is able to select well-performing activation functions, and (3) the resulting heterogeneous networks are significantly smaller than homogeneous networks.
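The core idea of the heterogeneous variant can be sketched as an extra mutation operator that reassigns a node's activation function, drawn from a pool of candidates. The `Node` class, the `ACTIVATIONS` pool, and the mutation probability below are illustrative assumptions, not the paper's actual implementation:

```python
import math
import random

# Hypothetical pool of candidate activation functions; the paper evolves
# each node's activation alongside the network's topology and weights.
ACTIVATIONS = {
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
    "tanh": math.tanh,
    "relu": lambda x: max(0.0, x),
    "gauss": lambda x: math.exp(-x * x),
    "sin": math.sin,
}

class Node:
    """Minimal stand-in for a NEAT node gene (illustrative only)."""
    def __init__(self, node_id, activation="sigmoid"):
        self.node_id = node_id
        self.activation = activation

def mutate_activation(node, rng, p=0.1):
    """With probability p, reassign the node a random activation from the
    pool. This is the heterogeneous case; a homogeneous network would keep
    one fixed activation function for every node."""
    if rng.random() < p:
        node.activation = rng.choice(list(ACTIVATIONS))
    return node

rng = random.Random(0)
nodes = [mutate_activation(Node(i), rng, p=0.5) for i in range(5)]
```

After repeated mutation and selection, nodes in one network can end up with different activation functions, which is what distinguishes the heterogeneous networks studied in the paper.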

Metadata
Document Type:Conference Object
Language:English
Parent Title:GECCO '17: Proceedings of the Genetic and Evolutionary Computation Conference. Berlin, Germany, July 15-19, 2017
First Page:425
Last Page:432
ISBN:978-1-4503-4920-8
DOI:https://doi.org/10.1145/3071178.3071275
ArXiv Id:http://arxiv.org/abs/1703.07122
Publisher:ACM
Date of first publication:2017/07/01
Tag:activation function; bloat; heterogeneous networks; neuroevolution; regression
Departments, institutes and facilities:Fachbereich Informatik
Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE)
Dewey Decimal Classification (DDC):0 Computer science, information & general works / 00 Computer science, knowledge & systems / 004 Data processing; computer science
Entry in this database:2017/04/26