Publications

Negative Correlation Hidden Layer for the Extreme Learning Machine

Authors

PERALES GONZÁLEZ, CARLOS, FERNÁNDEZ NAVARRO, FRANCISCO DE ASÍS, PÉREZ RODRÍGUEZ, JAVIER, CARBONERO RUZ, MARIANO

External publication

No

Journal

Appl. Soft Comput.

Scope

Article

Nature

Scientific

JCR Quartile

SJR Quartile

JCR Impact

8.263

SJR Impact

1.959

Publication date

01/01/2021

ISI

000734390900010

Scopus Id

2-s2.0-85105924861

Abstract

Extreme Learning Machine (ELM) algorithms have achieved unprecedented performance in supervised machine learning tasks. However, the random preconfiguration of the hidden-layer nodes in ELM models does not always lead to a suitable transformation of the original features. Consequently, the performance of these models relies on a broad exploration of these feature mappings, generally using a large number of nodes in the hidden layer. In this paper, a novel ELM architecture is presented, called Negative Correlation Hidden Layer ELM (NCHL-ELM), based on the Negative Correlation Learning (NCL) framework. This model incorporates a parameter into each node of the original ELM hidden layer, and these parameters are optimized by reducing the error on the training set while promoting diversity among them in order to improve generalization. Mathematically, the ELM minimization problem is perturbed by a penalty term that represents a measure of diversity among the parameters. A variety of regression and classification benchmark datasets have been selected in order to compare NCHL-ELM with other state-of-the-art ELM models. Statistical tests indicate the superiority of our method in both regression and classification problems. © 2021 Elsevier B.V.
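
The idea described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's actual method: the toy data, the per-node scaling parameters `a_j`, the variance-based diversity term, and the random-search optimization are all assumptions standing in for the NCL-based penalty and the optimization procedure derived in the article. Only the baseline (random hidden weights plus a ridge-regularized least-squares solution for the output weights) follows the standard ELM recipe.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem (illustrative only; the paper uses benchmark datasets).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)

n_hidden = 30
W = rng.normal(size=(1, n_hidden))   # random input weights, fixed (standard ELM)
b = rng.normal(size=n_hidden)        # random biases, fixed (standard ELM)

def elm_fit(a):
    """Ridge-regularized ELM output weights, given per-node scalings `a`.

    The per-node parameter a_j is a hypothetical stand-in for the extra
    hidden-layer parameter that NCHL-ELM attaches to each node.
    """
    H = np.tanh((X @ W) * a + b)                                  # feature mapping
    beta = np.linalg.solve(H.T @ H + 1e-2 * np.eye(n_hidden), H.T @ y)
    return H, beta

# Illustrative "perturbed" objective: training error minus a term rewarding
# spread among the a_j (a crude proxy for the paper's diversity penalty).
lam = 1e-3
def objective(a):
    H, beta = elm_fit(a)
    err = np.mean((H @ beta - y) ** 2)
    diversity = np.mean((a - a.mean()) ** 2)
    return err - lam * diversity

# Crude random-search optimization of the node parameters; the paper derives
# a principled optimization instead.
best_a = np.ones(n_hidden)           # baseline ELM: all scalings equal to 1
best_obj = objective(best_a)
for _ in range(200):
    cand = best_a + 0.1 * rng.normal(size=n_hidden)
    val = objective(cand)
    if val < best_obj:
        best_a, best_obj = cand, val

print(best_obj <= objective(np.ones(n_hidden)))
```

The search only ever accepts candidates that lower the perturbed objective, so the final value is never worse than the uniform-parameter baseline; the actual NCHL-ELM optimization and penalty differ from this sketch.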

Keywords

Diversity; Extreme Learning Machine; Feature mapping; Hidden layer; Negative Correlation Learning

Universidad Loyola members