
Author: Kiranyaz, Mustafa Serkan
Author: Ince, T.
Author: Iosifidis, A.
Author: Gabbouj, M.
Available date: 2022-04-26T12:31:22Z
Publication Date: 2017
Publication Name: Neurocomputing
Resource: Scopus
Identifier: http://dx.doi.org/10.1016/j.neucom.2016.10.044
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85006339727&doi=10.1016%2fj.neucom.2016.10.044&partnerID=40&md5=eefa40a4f5ba00e520f8a3f9c2cf5a01
URI: http://hdl.handle.net/10576/30627
Abstract: There are well-known limitations and drawbacks in the performance and robustness of feed-forward, fully-connected Artificial Neural Networks (ANNs), the so-called Multi-Layer Perceptrons (MLPs). In this study we address them with Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators, yielding a generalized model of the biological neuron and, ultimately, superior diversity. We modify the conventional back-propagation (BP) algorithm to train GOPs and, furthermore, propose Progressive Operational Perceptrons (POPs) to achieve self-organized, depth-adaptive GOPs tailored to the learning problem. The most crucial property of POPs is their ability to simultaneously search for the optimal operator set and train each layer individually. The final POP is therefore formed layer by layer, and we show that this ability enables POPs with minimal network depth to tackle challenging learning problems that cannot be learned by conventional ANNs even with deeper and significantly more complex configurations. Experimental results show that POPs scale up well with problem size and have the potential to achieve superior generalization performance on real benchmark problems with a significant gain.
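For context, the abstract's core idea is that each GOP neuron replaces the MLP's fixed multiply-accumulate-activate pipeline with a nodal operator, a pooling operator, and an activation operator drawn from candidate libraries; the layer-wise POP training then searches these libraries for the best combination. Below is a minimal illustrative sketch in NumPy of one GOP neuron's forward pass. The specific operator names and library contents here are assumptions for illustration, not the paper's exact operator sets.

    import numpy as np

    # Candidate operator libraries (a small illustrative subset, assumed
    # for this sketch; the paper defines its own operator sets).
    NODAL = {
        "multiplication": lambda w, y: w * y,
        "sinusoid":       lambda w, y: np.sin(w * y),
        "exponential":    lambda w, y: np.exp(w * y) - 1.0,
        "gaussian":       lambda w, y: w * np.exp(-w * y**2),
    }
    POOL = {
        "summation": lambda z: z.sum(axis=-1),
        "maximum":   lambda z: z.max(axis=-1),
        "median":    lambda z: np.median(z, axis=-1),
    }
    ACT = {
        "tanh":   np.tanh,
        "linear": lambda x: x,
    }

    def gop_neuron(y_prev, w, b, nodal="multiplication",
                   pool="summation", act="tanh"):
        """Forward pass of one GOP neuron.

        y_prev : (N,) outputs of the previous layer
        w      : (N,) connection weights
        The classical MLP neuron is the special case
        (multiplication, summation, tanh).
        """
        z = NODAL[nodal](w, y_prev)   # element-wise nodal operator
        x = POOL[pool](z) + b         # pooling operator plus bias
        return ACT[act](x)            # activation operator

    # Example: the same inputs through an MLP-style neuron and a GOP variant.
    rng = np.random.default_rng(0)
    y = rng.standard_normal(8)
    w = rng.standard_normal(8)
    print(gop_neuron(y, w, 0.1))                                    # classical MLP neuron
    print(gop_neuron(y, w, 0.1, nodal="sinusoid", pool="maximum"))  # one GOP alternative

Because each (nodal, pool, activation) triple defines a different neuron model, a layer-wise search over these libraries, as POPs perform, can match the operator set to the learning problem rather than fixing it in advance.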
Languageen
PublisherElsevier B.V.
Subject: Benchmarking; Complex networks; Cybernetics; Mathematical operators; Neural networks; Scalability; Benchmark problems; Complex configuration; Diversity; Generalization performance; Generalized models; Multi-layer perceptrons (MLPs); Optimal operators; Backpropagation; Article; artificial neural network; generalized operational perceptron; learning disorder; linear system; mathematical analysis; mathematical computing; mathematical parameters; nerve cell; perceptron; priority journal; progressive operational perceptron; statistical model
Title: Progressive Operational Perceptrons
Type: Article
Pagination: 142-154
Volume Number: 224


Files in this item: There are no files associated with this item.
