Publications

Regularized ensemble neural networks models in the Extreme Learning Machine framework

Authors

PERALES GONZÁLEZ, CARLOS, CARBONERO RUZ, MARIANO, BECERRA ALONSO, DAVID, PÉREZ RODRÍGUEZ, JAVIER, FERNÁNDEZ NAVARRO, FRANCISCO DE ASÍS

External publication

No

Venue

Neurocomputing

Scope

Article

Nature

Scientific

JCR quartile

SJR quartile

JCR impact

4.438

SJR impact

1.178

Publication date

07/10/2019

ISI

000480413200020

Scopus Id

2-s2.0-85069590641

Abstract

Extreme Learning Machine (ELM) has proven to be an efficient and speedy algorithm for classification. In order to generalize the results of standard ELM, several ensemble meta-algorithms have been implemented. In this manuscript, we propose a hierarchical ensemble methodology that promotes diversity among the elements of an ensemble explicitly, through the loss function in the single-hidden-layer feed-forward network version of ELM. The diversity term in the loss function is justified using the concept of regularization from the Negative Correlation Learning framework. Statistical tests show that our proposal is competitive in both performance and diversity measures against bagging and boosting ensemble methodologies. (C) 2019 Elsevier B.V. All rights reserved.
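The paper's exact loss function and hierarchical procedure are not reproduced in this record; the following is a minimal Python sketch of the general idea the abstract describes: an ELM whose output weights are obtained in closed form, with an added Negative-Correlation-style penalty that pushes each ensemble member away from the running ensemble prediction. The specific form of the penalty and all names (train_elm_member, lambda_div, etc.) are illustrative assumptions, not the authors' formulation.

```python
# Illustrative sketch only: a sequentially trained ensemble of ELMs in which each
# member is regularized to disagree with the running ensemble prediction, in the
# spirit of Negative Correlation Learning.  The loss, its closed-form solution,
# and all names here are assumptions for illustration, not the paper's method.
import numpy as np


def train_elm_member(X, Y, F_prev, n_hidden=50, C=1.0, lambda_div=0.1, rng=None):
    """Minimize ||H @ beta - Y||^2 + C*||beta||^2 - lambda_div*||H @ beta - F_prev||^2.

    The last term rewards predictions that differ from the current ensemble
    output F_prev (diversity).  For lambda_div < 1 and C > 0 the problem stays
    convex and admits the closed-form solution computed below.
    """
    rng = np.random.default_rng() if rng is None else rng
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (fixed, as in ELM)
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    # Setting the gradient to zero gives:
    #   ((1 - lambda_div) * H.T @ H + C * I) @ beta = H.T @ (Y - lambda_div * F_prev)
    A = (1.0 - lambda_div) * H.T @ H + C * np.eye(n_hidden)
    beta = np.linalg.solve(A, H.T @ (Y - lambda_div * F_prev))
    return W, b, beta


def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta


def train_ensemble(X, Y, n_members=10, **elm_kwargs):
    """Train members one at a time; each new member sees the mean prediction of
    the members trained so far and is pushed away from it."""
    members = []
    F_sum = np.zeros_like(Y, dtype=float)
    for m in range(n_members):
        F_prev = F_sum / max(m, 1)                # mean prediction of previous members
        params = train_elm_member(X, Y, F_prev, **elm_kwargs)
        members.append(params)
        F_sum += elm_predict(X, *params)
    return members


# Usage on toy one-hot targets:
# X = np.random.randn(200, 5)
# Y = np.eye(3)[np.random.randint(0, 3, size=200)]
# ensemble = train_ensemble(X, Y, n_members=5)
# Y_hat = np.mean([elm_predict(X, *p) for p in ensemble], axis=0)
```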

Keywords

Extreme Learning Machine; Ensemble; Hierarchy; Diversity; Negative Correlation

Members of Universidad Loyola