A modified Lanczos Algorithm for fast regularization of extreme learning machines

Renjie Hu, Edward Ratner, David Stewart, Kaj Mikael Björk, Amaury Lendasse

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a new regularization method for Extreme Learning Machines (ELMs). ELMs are Randomized Neural Networks (RNNs) known for their fast training speed and good accuracy. Nevertheless, the complexity of an ELM has to be selected, and regularization has to be performed to avoid underfitting or overfitting. Therefore, a novel regularization is proposed using a modified Lanczos Algorithm: the Iterative Lanczos Extreme Learning Machine (Lan-ELM). As summarized in the experimental section, the computational time is on average divided by 4 and the normalized MSE is on average reduced by 11%. In addition, the proposed method can be intuitively parallelized, which makes it a very valuable tool for analyzing huge data sets in real time.
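For context, a basic ELM trains by fixing random hidden-layer weights and solving a linear least-squares problem for the output weights. The sketch below illustrates that standard formulation only; it is not the paper's Lan-ELM, which replaces the direct least-squares solve with a modified Lanczos iteration for regularization. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=40):
    """Basic ELM training: random hidden layer + least-squares output weights.

    Illustrative sketch only; the paper's Lan-ELM regularizes this
    least-squares step with a modified Lanczos iteration.
    """
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random biases (never trained)
    H = np.tanh(X @ W + b)                           # hidden-layer activation matrix
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)     # output weights via least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression: approximate y = sin(x)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_fit(X, y)
mse = np.mean((elm_predict(X, W, b, beta) - y) ** 2)
```

Because only the output weights are solved for, training reduces to one linear solve, which is why ELMs are fast; selecting `n_hidden` and regularizing that solve is exactly the model-complexity issue the abstract describes.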

Original language: English (US)
Pages (from-to): 172-181
Number of pages: 10
Journal: Neurocomputing
Volume: 414
DOIs
State: Published - Nov 13, 2020

Keywords

  • Classification
  • Extreme Learning Machines
  • Lanczos Algorithm
  • Neural Networks
  • Regression
  • Regularization

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence

