
Author: Hu, Minghui
Author: Suganthan, P. N.
Available date: 2023-02-12T11:01:11Z
Publication Date: 2022-09-01
Publication Name: Applied Soft Computing
Identifier: http://dx.doi.org/10.1016/j.asoc.2022.109257
Citation: Hu, M., & Suganthan, P. N. (2022). Experimental evaluation of stochastic configuration networks: Is SC algorithm inferior to hyper-parameter optimization method? Applied Soft Computing, 126, 109257.
ISSN: 1568-4946
URI: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85134435218&origin=inward
URI: http://hdl.handle.net/10576/39996
Abstract: To overcome the pitfalls of the Random Vector Functional Link (RVFL) network, a network called Stochastic Configuration Networks (SCN) has been proposed. By constraining and adaptively selecting the range of randomized parameters using the Stochastic Configuration (SC) algorithm, SCN is claimed to build an incremental randomized learning system driven by residual error minimization. The SC algorithm has three variants, depending on how the output weights are updated. In this work, we first relate the SCN to the appropriate literature. Subsequently, we show that the major parts of the SC algorithm can be replaced by a generic hyper-parameter optimization method to obtain overall better results.
Language: en
Publisher: Elsevier Ltd
Subject: Incremental learning
Subject: Random vector functional link
Subject: Randomized neural network
Subject: Stochastic configuration network
Title: Experimental evaluation of stochastic configuration networks: Is SC algorithm inferior to hyper-parameter optimization method?
Type: Article
Volume Number: 126
Access Type: Abstract Only
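
To make the abstract's description concrete, the following is a minimal, illustrative sketch of an SC-style incremental construction (an SC-III-like variant) based on the general SCN literature, not on the authors' code; the parameter names (lambdas, r, n_candidates) and the simplified acceptance threshold are assumptions made for illustration.

# Illustrative sketch: hidden nodes are added one at a time, each drawn from an
# adaptively widened sampling range and accepted only if it explains enough of
# the current residual; all output weights are then refit by least squares.
# Simplified reading of the SCN literature; parameter names are illustrative.
import numpy as np

def build_scn(X, y, max_nodes=50, tol=1e-3, r=0.99,
              lambdas=(0.5, 1, 5, 10, 50), n_candidates=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    H = np.empty((n, 0))           # hidden-layer outputs collected so far
    beta = np.zeros(0)             # output weights
    residual = y.astype(float).copy()
    params = []                    # (w, b) of accepted hidden nodes

    for _ in range(max_nodes):
        if np.linalg.norm(residual) < tol:
            break
        best = None
        for lam in lambdas:        # adaptively widen the sampling range
            for _ in range(n_candidates):
                w = rng.uniform(-lam, lam, size=d)
                b = rng.uniform(-lam, lam)
                g = np.tanh(X @ w + b)
                # SC-style acceptance: candidate must correlate enough
                # with the current residual (simplified inequality).
                score = (residual @ g) ** 2 / (g @ g)
                if score >= (1 - r) * (residual @ residual):
                    if best is None or score > best[0]:
                        best = (score, w, b, g)
            if best is not None:
                break
        if best is None:           # no admissible candidate found; stop growing
            break
        _, w, b, g = best
        params.append((w, b))
        H = np.column_stack([H, g])
        # SC-III-style step: recompute all output weights by least squares.
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        residual = y - H @ beta
    return params, beta

As the abstract states, the paper's argument is that the candidate-screening machinery above (the sampling-range schedule and acceptance constraint) can largely be replaced by a generic hyper-parameter optimization method with overall better results.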

