TY  - THES
U1  - Master Thesis
A1  - Ha, Bach
T1  - Generalization of SELU to CNN
N2  - Neural network based object detectors can automate many difficult, tedious tasks. However, they are usually slow and/or require powerful hardware. One main reason is Batch Normalization (BN) [1], an important method for building these detectors. Recent studies present a potential replacement called the Self-normalizing Neural Network (SNN) [2], whose core is a special activation function named the Scaled Exponential Linear Unit (SELU). This replacement appears to retain most of BN's benefits while requiring less computational power. Nonetheless, it is uncertain whether SELU and neural network based detectors are compatible with one another. An evaluation of networks incorporating SELU would help clarify that uncertainty. Such an evaluation is performed through a series of tests on different neural networks. The evaluation concludes that, while indeed faster, SELU is still not as good as BN for building complex object detector networks.
KW  - deep learning
KW  - YOLO v3
KW  - object detection
KW  - SELU
KW  - Batch Normalization
UR  - https://nbn-resolving.org/urn:nbn:de:0011-n-5407763
U6  - https://doi.org/10.24406/publica-fhg-282624
DO  - https://doi.org/10.24406/publica-fhg-282624
SP  - 53
S1  - 53
PB  - Fraunhofer Publica
ER  -