Simple item record

Author: Kiranyaz, Mustafa Serkan
Author: Ince, T.
Author: Iosifidis, A.
Author: Gabbouj, M.
Date available: 2022-04-26T12:31:20Z
Publication date: 2020
Publication name: Neural Computing and Applications
Source: Scopus
Identifier: http://dx.doi.org/10.1007/s00521-020-04780-3
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85081633955&doi=10.1007%2fs00521-020-04780-3&partnerID=40&md5=d326c80fe9be33c15699162d9d55926f
URI: http://hdl.handle.net/10576/30607
Abstract: Feed-forward, fully connected artificial neural networks or the so-called multi-layer perceptrons are well-known universal approximators. However, their learning performance varies significantly depending on the function or the solution space that they attempt to approximate. This is mainly because of their homogenous configuration based solely on the linear neuron model. Therefore, while they learn very well those problems with a monotonous, relatively simple and linearly separable solution space, they may entirely fail to do so when the solution space is highly nonlinear and complex. Sharing the same linear neuron model with two additional constraints (local connections and weight sharing), this is also true for the conventional convolutional neural networks (CNNs) and it is, therefore, not surprising that in many challenging problems only the deep CNNs with a massive complexity and depth can achieve the required diversity and the learning performance. In order to address this drawback and also to accomplish a more generalized model over the convolutional neurons, this study proposes a novel network model, called operational neural networks (ONNs), which can be heterogeneous and encapsulate neurons with any set of operators to boost diversity and to learn highly complex and multi-modal functions or spaces with minimal network complexity and training data. Finally, the training method to back-propagate the error through the operational layers of ONNs is formulated. Experimental results over highly challenging problems demonstrate the superior learning capabilities of ONNs even with few neurons and hidden layers.
Language: en
Publisher: Springer
Subject: Complex networks
Convolution
Convolutional neural networks
Feedforward neural networks
Learning systems
Neurons
Personnel training
Generalized models
Learning capabilities
Learning performance
Linearly separable
Multi modal function
Multi-layer perceptrons
Nonlinear neural networks
Universal approximators
Multilayer neural networks
Title: Operational neural networks
Type: Article
Pages: 6645-6668
Issue number: 11
Volume number: 32
dc.accessType: Abstract Only
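
The abstract above describes replacing the linear neuron's fixed multiply-and-sum with a configurable nodal operator and pool operator per neuron. The sketch below illustrates that general idea in NumPy; the operator sets, the `OperationalNeuron` class, and the example layer are illustrative assumptions for this record, not the paper's exact formulation, and the back-propagation rules through the operational layers are omitted.

```python
# A minimal, hypothetical sketch of an "operational neuron": the classical
# multiply-and-sum of a linear neuron is generalized to a selectable nodal
# operator psi(w, y) and pool operator P. Operator choices are illustrative.
import numpy as np

# Candidate nodal operators psi(w, y): how each weight interacts with each input.
NODAL_OPS = {
    "mul": lambda w, y: w * y,                 # classical linear interaction
    "sin": lambda w, y: np.sin(w * y),         # sinusoidal interaction
    "exp": lambda w, y: np.exp(w * y) - 1.0,   # exponential interaction
}

# Candidate pool operators P: how the per-input terms are combined.
POOL_OPS = {
    "sum": lambda t: np.sum(t, axis=-1),       # classical summation
    "max": lambda t: np.max(t, axis=-1),
    "median": lambda t: np.median(t, axis=-1),
}


class OperationalNeuron:
    """One neuron with a selectable (nodal, pool, activation) operator set."""

    def __init__(self, n_inputs, nodal="mul", pool="sum", activation=np.tanh, rng=None):
        rng = rng or np.random.default_rng(0)
        self.w = rng.normal(scale=0.5, size=n_inputs)
        self.b = 0.0
        self.psi = NODAL_OPS[nodal]
        self.pool = POOL_OPS[pool]
        self.f = activation

    def forward(self, y):
        # y: array of shape (..., n_inputs) holding the previous layer's outputs.
        terms = self.psi(self.w, y)            # element-wise nodal operator
        return self.f(self.pool(terms) + self.b)


# Example: a heterogeneous layer mixing different operator sets.
if __name__ == "__main__":
    x = np.random.default_rng(1).normal(size=(4, 8))    # 4 samples, 8 inputs
    layer = [
        OperationalNeuron(8, nodal="mul", pool="sum"),   # behaves like a perceptron
        OperationalNeuron(8, nodal="sin", pool="max"),
        OperationalNeuron(8, nodal="exp", pool="median"),
    ]
    outputs = np.stack([n.forward(x) for n in layer], axis=-1)
    print(outputs.shape)                                 # (4, 3)
```

Mixing different operator sets across neurons in the same layer is what makes the layer heterogeneous, in contrast to a conventional dense or convolutional layer where every neuron uses multiplication and summation.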


Files in this item


No files associated with this item.

