Extreme Learning Machine with Enhanced Variation of Activation Functions

Jacek Kabzinski

2016

Abstract

The main aim of this paper is to stress that sufficient variability of the activation functions (AFs) is important for the approximation accuracy and applicability of an Extreme Learning Machine (ELM). A slight modification of the standard ELM procedure is proposed, which increases the variance of each AF without losing much of the simplicity of random parameter selection. The proposed modification does not significantly increase the computational complexity of ELM training. Enhancing the variation of the AFs results in a reduced output-weight norm, better numerical conditioning of the output-weight calculation, and smaller errors for the same number of hidden neurons. The proposed approach works efficiently together with the Tikhonov regularization of ELM.
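To illustrate the setting the abstract describes, the sketch below shows a standard ELM with Tikhonov-regularized output weights plus a simple pre-activation standardization step that raises the variation of each hidden neuron's AF. The rescaling rule, the tanh activation, and parameters such as target_std and lam are illustrative assumptions for this sketch, not the exact modification proposed in the paper.

```python
import numpy as np

def elm_train(X, T, n_hidden=50, lam=1e-3, target_std=1.0, seed=None):
    """Minimal ELM sketch: random hidden-layer parameters, a hypothetical
    variance-enhancing rescaling of each neuron, and Tikhonov-regularized
    least squares for the output weights."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(n_in, n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)           # random biases

    # Hypothetical enhancement: center each neuron's pre-activation over the
    # training data and scale it to at least `target_std`, so every AF is
    # driven over a sufficiently wide range of its argument.
    Z = X @ W + b
    mu, sd = Z.mean(axis=0), Z.std(axis=0)
    scale = np.maximum(target_std / np.maximum(sd, 1e-12), 1.0)
    W = W * scale
    b = (b - mu) * scale

    H = np.tanh(X @ W + b)  # hidden-layer output matrix

    # Output weights via Tikhonov-regularized least squares:
    # beta = (H^T H + lam I)^{-1} H^T T
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

In this sketch the rescaling only touches the randomly drawn hidden-layer parameters before the single least-squares solve, so the training cost stays essentially that of a plain ELM, which is the property the abstract emphasizes.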



Paper Citation


in Harvard Style

Kabzinski J. (2016). Extreme Learning Machine with Enhanced Variation of Activation Functions. In Proceedings of the 8th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (IJCCI 2016) ISBN 978-989-758-201-1, pages 77-82. DOI: 10.5220/0006066200770082

in Bibtex Style

@conference{ncta16,
author={Jacek Kabzinski},
title={Extreme Learning Machine with Enhanced Variation of Activation Functions},
booktitle={Proceedings of the 8th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (IJCCI 2016)},
year={2016},
pages={77-82},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006066200770082},
isbn={978-989-758-201-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 8th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (IJCCI 2016)
TI - Extreme Learning Machine with Enhanced Variation of Activation Functions
SN - 978-989-758-201-1
AU - Kabzinski J.
PY - 2016
SP - 77
EP - 82
DO - 10.5220/0006066200770082