Title Global convergence of Negative Correlation Extreme Learning Machine
Authors PERALES GONZÁLEZ, CARLOS
External publication No
Journal Neural Processing Letters
Scope Article
Nature Scientific
JCR Quartile 3
SJR Quartile 2
JCR Impact 2.56500
SJR Impact 0.59700
Web https://www.scopus.com/inward/record.uri?eid=2-s2.0-85103415032&doi=10.1007%2fs11063-021-10492-z&partnerID=40&md5=550f104ba5308a5eac3c62f89cc46ff7
Publication date 01/06/2021
ISI 000634619500001
Scopus Id 2-s2.0-85103415032
DOI 10.1007/s11063-021-10492-z
Abstract Ensemble approaches introduced in the Extreme Learning Machine literature mainly come from methods that rely on data sampling procedures, under the assumption that the training data are heterogeneous enough to set up diverse base learners. To overcome this assumption, an ELM ensemble method based on the Negative Correlation Learning framework, called Negative Correlation Extreme Learning Machine (NCELM), was proposed. This model works in two stages: (i) different ELMs are generated as base learners with random weights in the hidden layer, and (ii) an NCL penalty term carrying the information of the ensemble prediction is introduced into each ELM minimization problem, updating the base learners; this second stage is iterated until the ensemble converges. Although this NCL ensemble method was validated by an experimental study with multiple benchmark datasets, no conditions guaranteeing this convergence were given. This paper mathematically presents sufficient conditions to guarantee the global convergence of NCELM. The update of the ensemble in each iteration is defined as a contraction mapping, and through the Banach fixed-point theorem, global convergence of the ensemble is proved.
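
The iteration described in the abstract can be illustrated with a short sketch. The Python code below is a minimal, illustrative rendering of the two-stage NCELM loop for regression, not the paper's exact formulation: the sigmoid activation, the ridge parameter C, the diversity weight lambda_ncl, and the closed-form penalized least-squares update are all assumptions made here. The relevant point for this paper is that the update of the output weights plays the role of a mapping F with ||F(beta) - F(beta')|| <= q ||beta - beta'|| for some q < 1 under the paper's sufficient conditions, so the Banach fixed-point theorem yields convergence to a unique fixed point.

import numpy as np

def ncelm_sketch(X, y, n_learners=5, n_hidden=50, C=1.0, lambda_ncl=0.1,
                 max_iter=100, tol=1e-6, seed=0):
    """Illustrative NCELM-style iteration (assumed regression setting)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Stage (i): random, fixed hidden layers, one per base learner (standard ELM).
    Ws = [rng.normal(size=(n_features, n_hidden)) for _ in range(n_learners)]
    bs = [rng.normal(size=n_hidden) for _ in range(n_learners)]
    Hs = [1.0 / (1.0 + np.exp(-(X @ W + b))) for W, b in zip(Ws, bs)]
    # Initial output weights: independent regularized ELM (ridge) solutions.
    betas = [np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ y)
             for H in Hs]
    for _ in range(max_iter):
        # Current ensemble prediction (simple average of the base learners).
        f_bar = np.mean([H @ beta for H, beta in zip(Hs, betas)], axis=0)
        new_betas = []
        for H in Hs:
            # Stage (ii): re-solve each ELM with an NCL-style diversity penalty
            # -lambda_ncl * ||H beta - f_bar||^2 (an assumed form); setting the
            # gradient of the penalized objective to zero gives this system.
            A = (1.0 - lambda_ncl) * (H.T @ H) + np.eye(n_hidden) / C
            rhs = H.T @ (y - lambda_ncl * f_bar)
            new_betas.append(np.linalg.solve(A, rhs))
        # This update betas -> new_betas is the mapping whose contractivity
        # (for a small enough penalty) the paper establishes; by Banach's
        # theorem the iteration then converges to a unique fixed point.
        delta = max(np.linalg.norm(nb - b) for nb, b in zip(new_betas, betas))
        betas = new_betas
        if delta < tol:  # fixed point reached (within tolerance)
            break
    return Ws, bs, betas

On a small synthetic dataset, delta in this sketch should shrink roughly geometrically when lambda_ncl is small, which is the behaviour a contraction mapping predicts; for larger penalty weights the contraction condition may fail and the loop need not converge.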
Keywords Ensemble; Negative correlation learning; Extreme learning machine; Fixed-point; Banach; Contraction mapping
Universidad Loyola members
