Activation Functions for Generalized Learning Vector Quantization - A Performance Comparison (journal article)



  • An appropriate choice of the activation function (such as ReLU, sigmoid, or Swish) plays an important role in the performance of (deep) multilayer perceptrons (MLPs) for classification and regression learning. Prototype-based classification learning methods such as (generalized) learning vector quantization (GLVQ) are powerful alternatives. These models also use activation functions, but there they are applied to the so-called classifier function instead. In this paper we investigate activation function candidates known to be successful for MLPs, apply them in GLVQ, and study their influence on performance.
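To illustrate the idea from the abstract, here is a minimal sketch (not the authors' code) of how an activation function can be applied to the GLVQ classifier function mu(x) = (d+ - d-)/(d+ + d-), where d+ and d- are the squared distances to the closest prototype of the correct and of a wrong class. The prototype values and the set of candidate activations are illustrative assumptions:

```python
import numpy as np

def classifier_function(x, prototypes, proto_labels, y):
    """GLVQ classifier function mu(x) in [-1, 1]; negative means correct."""
    d = np.sum((prototypes - x) ** 2, axis=1)  # squared Euclidean distances
    d_plus = d[proto_labels == y].min()        # closest correct-class prototype
    d_minus = d[proto_labels != y].min()       # closest wrong-class prototype
    return (d_plus - d_minus) / (d_plus + d_minus)

# Candidate activations f applied to mu in the GLVQ cost (illustrative choice):
activations = {
    "identity": lambda mu: mu,
    "sigmoid":  lambda mu: 1.0 / (1.0 + np.exp(-mu)),
    "relu":     lambda mu: np.maximum(0.0, mu),
    "swish":    lambda mu: mu / (1.0 + np.exp(-mu)),
}

def glvq_cost(X, Y, prototypes, proto_labels, f):
    """Cost E = sum_i f(mu(x_i)) to be minimized over the prototypes."""
    return sum(f(classifier_function(x, prototypes, proto_labels, y))
               for x, y in zip(X, Y))

# Example: two prototypes, one per class (hypothetical data).
prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])
proto_labels = np.array([0, 1])
mu = classifier_function(np.array([0.1, 0.1]), prototypes, proto_labels, 0)
```

Swapping the entry of `activations` used in `glvq_cost` is the kind of comparison the paper performs.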


  • 2019

Published in

  • arXiv (integrating resource)


  • arXiv:1901.05995