Progressive Operational Perceptrons
Author | Kiranyaz, Mustafa Serkan |
Author | Ince, Turker |
Author | Iosifidis, Alexandros |
Author | Gabbouj, Moncef |
Available date | 2022-04-26T12:31:22Z |
Publication Date | 2017 |
Publication Name | Neurocomputing |
Resource | Scopus |
Identifier | http://dx.doi.org/10.1016/j.neucom.2016.10.044 |
Abstract | There are well-known limitations and drawbacks in the performance and robustness of feed-forward, fully-connected Artificial Neural Networks (ANNs), also known as Multi-Layer Perceptrons (MLPs). In this study we address them with Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators, yielding a more general model of the biological neuron and, ultimately, superior diversity. We modify the conventional back-propagation (BP) algorithm to train GOPs and further propose Progressive Operational Perceptrons (POPs) to achieve self-organized, depth-adaptive GOPs tailored to the learning problem. The most crucial property of POPs is their ability to simultaneously search for the optimal operator set and train each layer individually. The final POP is therefore formed layer by layer, and in this paper we show that this ability enables POPs of minimal network depth to tackle challenging learning problems that conventional ANNs cannot learn even with deeper and significantly more complex configurations. Experimental results show that POPs scale well with problem size and have the potential to achieve superior generalization performance on real benchmark problems with a significant gain. |
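The abstract's central idea, a neuron whose nodal and pool operators are selected from a library rather than fixed to multiply-and-sum, can be sketched as follows. This is an illustrative reading of the GOP forward pass, not the authors' code; the operator names, the dictionary-based API, and the specific operator choices here are assumptions.

```python
import numpy as np

# Hypothetical sketch of a single GOP neuron: output = f(P(psi(w_i, x_i))),
# where the nodal operator psi, the pool operator P, and the activation f
# are each drawn from a small operator library. The operator sets below are
# illustrative examples, not the paper's exact libraries.

NODAL = {
    "mult": lambda w, x: w * x,            # classical perceptron nodal op
    "sine": lambda w, x: np.sin(w * x),
    "expo": lambda w, x: np.exp(w * x) - 1.0,
}

POOL = {
    "sum": np.sum,                         # classical perceptron pooling
    "max": np.max,
    "median": np.median,
}

ACT = {
    "tanh": np.tanh,
    "linear": lambda v: v,
}

def gop_neuron(x, w, nodal="mult", pool="sum", act="tanh"):
    """Forward pass of one GOP neuron on input vector x with weights w."""
    z = NODAL[nodal](w, x)   # element-wise nodal operation
    v = POOL[pool](z)        # pool the results to a scalar
    return ACT[act](v)       # apply the activation
```

With `nodal="mult"`, `pool="sum"`, and `act="tanh"`, this reduces to an ordinary MLP neuron, `tanh(w . x)`, which is why the paper treats the MLP as a special case of the GOP; the layer-wise POP construction then searches over such operator choices per layer while training it.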
Language | en |
Publisher | Elsevier B.V. |
Subject | Benchmarking; Complex networks; Cybernetics; Mathematical operators; Neural networks; Scalability; Benchmark problems; Complex configuration; Diversity; Generalization performance; Generalized models; Multi-layer perceptrons (MLPs); Optimal operators; Back-propagation; Artificial neural network; Generalized operational perceptron; Learning; Linear system; Mathematical analysis; Mathematical computing; Mathematical parameters; Nerve cell; Perceptron; Progressive operational perceptron; Statistical model |
Type | Article |
Pagination | 142-154 |
Volume Number | 224 |