Generalization of SELU to CNN

  • Neural-network-based object detectors can automate many difficult, tedious tasks. However, they are usually slow and/or require powerful hardware. One main reason is Batch Normalization (BN) [1], an important component in building these detectors. Recent studies present a potential replacement called the Self-normalizing Neural Network (SNN) [2], whose core is a special activation function named the Scaled Exponential Linear Unit (SELU). This replacement appears to retain most of BN's benefits while requiring less computational power. Nonetheless, it is uncertain whether SELU and neural-network-based detectors are compatible with one another. An evaluation of networks incorporating SELU would help clarify that uncertainty. Such an evaluation is performed through a series of tests on different neural networks. The evaluation concludes that, while indeed faster, SELU is still not as good as BN for building complex object-detector networks.
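The SELU activation at the core of SNNs [2] can be sketched as follows. The constants come from the SNN paper by Klambauer et al.; this is an illustrative sketch, not the code evaluated in the thesis:

```python
import math

# Constants derived in the SNN paper; they make the activation
# self-normalizing (activations converge toward zero mean, unit variance).
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x: float) -> float:
    """Scaled Exponential Linear Unit: scale * x for x > 0,
    scale * alpha * (exp(x) - 1) otherwise."""
    if x > 0.0:
        return SCALE * x
    return SCALE * ALPHA * (math.exp(x) - 1.0)
```

Unlike BN, SELU needs no batch statistics at inference time, which is one source of the speed advantage the thesis measures.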

Document Type: Master's Thesis
Author: Bach Ha
Number of pages: 53
Referees: Paul G. Plöger, Gerhard K. Kraetzschmar, Florian Zimmermann
Publisher: Fraunhofer Publica
Contributing Corporation: Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme
Date of first publication: 2019/04/17
Keywords: Batch Normalization; SELU; YOLO v3; deep learning; object detection
Dewey Decimal Classification (DDC): 0 Computer science, information, general works / 00 Computer science, knowledge, systems / 004 Data processing; computer science
Theses, student research papers: Hochschule Bonn-Rhein-Sieg / Fachbereich Informatik
Entry in this database: 2019/04/25