
Generalization of SELU to CNN

  • Neural-network-based object detectors can automate many difficult, tedious tasks. However, they are usually slow and/or require powerful hardware. One major contributor is Batch Normalization (BN) [1], an important technique for building these detectors. Recent work proposes a potential replacement, the Self-normalizing Neural Network (SNN) [2], whose core is a special activation function named the Scaled Exponential Linear Unit (SELU). This replacement appears to retain most of BN's benefits while requiring less computational power. Nonetheless, it is uncertain whether SELU is compatible with neural-network-based detectors. An evaluation of networks incorporating SELU helps to resolve that uncertainty; such an evaluation is performed here through a series of tests on different neural networks. The evaluation concludes that, while indeed faster, SELU is still not as good as BN for building complex object-detector networks.
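As a minimal illustration of the activation at the core of SNNs, the following sketch implements SELU as defined by Klambauer et al. [2]; the constants are the fixed-point values derived in that paper, and the function name and NumPy-based formulation here are illustrative choices, not code from the thesis.

```python
import numpy as np

# Fixed-point constants from the SNN paper [2]; they make SELU
# drive activations toward zero mean and unit variance.
LAMBDA = 1.0507009873554805
ALPHA = 1.6732632423543772

def selu(x):
    """SELU: lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise."""
    x = np.asarray(x, dtype=float)
    # np.expm1 computes exp(x) - 1 with better precision near zero.
    return LAMBDA * np.where(x > 0, x, ALPHA * np.expm1(x))
```

Unlike BN, this needs no batch statistics at run time, which is the source of the speed advantage the abstract refers to; for negative inputs it saturates at -LAMBDA * ALPHA ≈ -1.758.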

Metadata
Document Type: Master's Thesis
Language: English
Number of pages: 53
URL: https://nbn-resolving.org/urn:nbn:de:0011-n-5407763
Referees: Paul G. Plöger, Gerhard K. Kraetzschmar, Florian Zimmermann
Publisher: Fraunhofer Publica
Date of first publication: 2019/04/17
Tags: Batch Normalization; SELU; YOLO v3; deep learning; object detection
Dewey Decimal Classification (DDC): 0 Computer science, information, general works / 00 Computer science, knowledge, systems / 004 Data processing; computer science
Theses: Department / Computer Science
Entry in this database: 2019/04/25