- Title
Ensemble Method of Diverse Regularized Extreme Learning Machines.
- Authors
CHEN Yang; WANG Shitong
- Abstract
As a fast training algorithm for single-hidden-layer feedforward networks, the extreme learning machine (ELM) randomly initializes the input-layer weights and hidden-layer biases and obtains the output-layer weights analytically. It thereby avoids many shortcomings of gradient-based learning algorithms, such as local minima, inappropriate learning rates, and slow convergence. However, ELM still suffers from overfitting and poor stability, especially on large-scale datasets. This paper proposes an ensemble method of diverse regularized extreme learning machines (DRELM) to address these problems. First, each ELM base learner draws its own randomly distributed weights to ensure diversity among the learners; leave-one-out (LOO) cross validation with the MSEPRESS criterion is then used to find the optimal number of hidden nodes for each base learner and to compute its optimal hidden-layer output weights, producing base learners that are both accurate and mutually different. Next, a penalty term that explicitly encourages diversity is added to the objective function, and the output weight matrix of each learner is updated iteratively. Finally, the output of the whole network model is obtained by averaging the outputs of all base learners. This method effectively realizes an ensemble of regularized extreme learning machines (RELM) that combines accuracy with diversity. Experimental results on 10 UCI datasets demonstrate the effectiveness of DRELM.
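The pipeline the abstract describes (random hidden layers per learner, ridge-regularized output weights, prediction by averaging) can be sketched as follows. This is an illustrative NumPy sketch only, not the authors' implementation: the function names (`train_elm`, `drelm_predict`), the `tanh` activation, the fixed hidden-node count, and the regularization value are assumptions, and the LOO/MSEPRESS node selection and the iterative diversity-penalty update are omitted.

```python
import numpy as np

def train_elm(X, y, n_hidden, reg=1e-2, seed=None):
    """Train one regularized ELM: random input weights and biases,
    output weights solved in closed form (ridge least squares)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input-layer weights
    b = rng.normal(size=n_hidden)                # random hidden-layer biases
    H = np.tanh(X @ W + b)                       # hidden-layer output matrix
    # RELM-style analytic solution for the output-layer weights
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

def drelm_predict(models, X):
    """Ensemble output: average the outputs of all base learners."""
    return np.mean([predict_elm(m, X) for m in models], axis=0)

# Toy regression problem to exercise the sketch
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

# Each base learner gets its own random seed, which is what
# makes the hidden layers (and hence the learners) diverse.
models = [train_elm(X, y, n_hidden=40, seed=s) for s in range(5)]
pred = drelm_predict(models, X)
print(f"ensemble training MSE: {np.mean((pred - y) ** 2):.4f}")
```

Because each base learner's hidden layer is drawn independently, their errors are partly decorrelated, so the averaged prediction is typically more stable than any single ELM.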
- Subjects
MACHINE learning; DISTRIBUTION (Probability theory); WEIGHT training; MATRIX functions; PROBLEM solving; ITERATIVE learning control; FEEDFORWARD neural networks
- Publication
Journal of Frontiers of Computer Science & Technology, 2022, Vol 16, Issue 8, p1819
- ISSN
1673-9418
- Publication type
Article
- DOI
10.3778/j.issn.1673-9418.2101001