Activation Functions for Generalized Learning Vector Quantization - A Performance Comparison  Journal article


Abstract

  • An appropriate choice of activation function (such as ReLU, sigmoid, or swish) plays an important role in the performance of (deep) multilayer perceptrons (MLPs) for classification and regression learning. Prototype-based classification methods such as (generalized) learning vector quantization (GLVQ) are powerful alternatives. These models also employ activation functions, but there they are applied to the so-called classifier function instead. In this paper we investigate activation functions known to be successful in MLPs for application in GLVQ and study their influence on performance.

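For illustration only (this sketch is not taken from the paper), the Python snippet below shows how the standard GLVQ classifier function mu(x) = (d_plus - d_minus) / (d_plus + d_minus) could be combined with MLP-style activation candidates such as sigmoid, swish, or ReLU, as discussed in the abstract. All function names, parameters (e.g. theta, beta), and the cost aggregation are assumptions made for this sketch.

import numpy as np

def glvq_classifier_function(x, prototypes, labels, true_label):
    # Standard GLVQ classifier (relative distance difference) function:
    # mu(x) = (d_plus - d_minus) / (d_plus + d_minus), in [-1, 1],
    # negative when x is classified correctly.
    d = np.sum((prototypes - x) ** 2, axis=1)       # squared Euclidean distances
    d_plus = np.min(d[labels == true_label])        # closest prototype with the correct label
    d_minus = np.min(d[labels != true_label])       # closest prototype with a wrong label
    return (d_plus - d_minus) / (d_plus + d_minus)

# Candidate activation functions applied to mu(x) (assumed forms for illustration)
def identity(mu):
    return mu

def sigmoid(mu, theta=1.0):                         # theta: assumed steepness parameter
    return 1.0 / (1.0 + np.exp(-theta * mu))

def swish(mu, beta=1.0):                            # swish(z) = z * sigmoid(beta * z)
    return mu / (1.0 + np.exp(-beta * mu))

def relu(mu):
    return np.maximum(0.0, mu)

def glvq_cost(X, y, prototypes, labels, activation=sigmoid):
    # Hypothetical cost over a labelled data set: sum of activated classifier values.
    return sum(activation(glvq_classifier_function(x, prototypes, labels, yi))
               for x, yi in zip(X, y))

In this reading, the choice of activation determines how strongly the cost weights samples near the decision boundary (mu close to 0) relative to clearly misclassified ones (mu close to +1).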
Year of publication

  • 2019

Published in

  • arXiv (integrating resource)

Volume

  • arXiv:1901.05995