Simple item record

Author: Tran D.T.
Author: Kiranyaz, Mustafa Serkan
Author: Gabbouj M.
Author: Iosifidis A.
Date available: 2022-04-26T12:31:20Z
Publication date: 2020
Publication name: Neurocomputing
Source: Scopus
Identifier: http://dx.doi.org/10.1016/j.neucom.2019.10.079
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85075527604&doi=10.1016%2fj.neucom.2019.10.079&partnerID=40&md5=ac6224a30da2a962735a7d9633506701
URI: http://hdl.handle.net/10576/30610
Abstract: The Generalized Operational Perceptron (GOP) was proposed to generalize the linear neuron model used in the traditional Multilayer Perceptron (MLP) by mimicking the synaptic connections of biological neurons, which exhibit nonlinear neurochemical behaviours. Previously, the Progressive Operational Perceptron (POP) was proposed to train a multilayer network of GOPs formed layer-wise in a progressive manner. While achieving superior learning performance over other types of networks, POP has a high computational complexity. In this work, we propose POPfast, an improved variant of POP that significantly reduces the computational complexity of POP, thus accelerating the training time of GOP networks. In addition, we also propose major architectural modifications of POPfast that can augment the progressive learning process of POP by incorporating an information-preserving, linear projection path from the input to the output layer at each progressive step. The proposed extensions can be interpreted as a mechanism that provides the network with direct information extracted from the previously learned layers, hence the term "memory". This allows the network to learn deeper architectures and better data representations. An extensive set of experiments on human action, object, facial identity and scene recognition problems demonstrates that the proposed algorithms can train GOP networks much faster than POPs while achieving better performance than the original POPs and other related algorithms.
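To make the abstract's neuron model concrete: a GOP neuron replaces the perceptron's multiply-accumulate with a chosen nodal operator, a pooling operator, and an activation. The sketch below is a minimal illustration assuming common operator choices (multiplication/sum/tanh, which recover the classic perceptron, plus a sinusoidal nodal operator and max pooling as nonlinear alternatives); the operator names and `gop_neuron` helper are hypothetical, not the paper's implementation.

```python
import numpy as np

# Hedged sketch of a single Generalized Operational Perceptron (GOP) neuron:
# y = f(P(psi(w_i, x_i)) + b), where psi is the nodal operator, P the pooling
# operator, and f the activation. Operator sets below are illustrative.
NODAL = {
    "mult": lambda w, x: w * x,          # classic linear synapse
    "sine": lambda w, x: np.sin(w * x),  # one nonlinear alternative
}
POOL = {
    "sum": np.sum,                       # classic summation
    "max": np.max,                       # one nonlinear alternative
}
ACT = {
    "tanh": np.tanh,
}

def gop_neuron(x, w, b, nodal="mult", pool="sum", act="tanh"):
    """Evaluate one GOP neuron with the chosen operator triple."""
    z = POOL[pool](NODAL[nodal](w, x))
    return ACT[act](z + b)

x = np.array([0.5, -1.0, 2.0])
w = np.array([0.2, 0.4, -0.1])

# With mult/sum/tanh the GOP neuron reduces to an ordinary perceptron unit.
classic = gop_neuron(x, w, b=0.1, nodal="mult", pool="sum")
# Swapping operators yields a different, nonlinear neuron on the same weights.
variant = gop_neuron(x, w, b=0.1, nodal="sine", pool="max")
print(classic, variant)
```

Progressive training (POP/POPfast) then searches over such operator choices while growing the network layer by layer; the "memory" extension additionally feeds a linear projection of the input forward at each step.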
Language: en
Publisher: Elsevier B.V.
Subject: Complex networks
Computational complexity
Memory architecture
Multilayers
Network architecture
Network layers
Neurons
Data representations
Learning performance
Linear projections
Multi layer perceptron
Multi-layer network
Neural architectures
Progressive learning
Synaptic connections
Internet protocols
algorithm
article
face
human
learning
memory
perceptron
Title: Progressive Operational Perceptrons with Memory
Type: Article
Pages: 172-181
Volume: 379


Files in this record

No files are associated with this record.
