    Hybrid training of deep neural networks with multiple output layers for tabular data classification

    Date
    2025-07-05
    Author
    Al-Ali, Abdulaziz
    Suganthan, Ponnuthurai N.
    Aly, Hussein
    Hamdy, Mohamed
    Abstract
    In the rapidly evolving landscape of deep learning, the ability to balance performance and computational efficiency is indispensable. Layer-wise training, which involves independently training each hidden layer of a neural network with its private output layer, offers a promising avenue by enabling the construction of a single network that can leverage an ensemble of output layers during prediction. This approach has been successfully employed in state-of-the-art models like ensemble deep multilayer perceptron (edMLP) and ensemble deep random vector functional link (edRVFL), pushing the boundaries of their base models, MLP trained by backpropagation (BP) and RVFL trained using a closed-form solution (CFS). However, edRVFL often underperforms edMLP in accuracy, while edMLP incurs significantly higher computational cost. To address this trade-off, we introduce two novel hybrid training approaches that integrate BP and CFS. Extensive experiments on diverse classification datasets reveal that one of the proposed models, ensemble deep adaptive sampling (edAS), achieves statistically significant improvements in classification accuracy over state-of-the-art models, including edRVFL, edMLP, and self-normalizing neural network (SNN), while being less computationally expensive. Furthermore, the second proposed model, MO-MLP, demonstrates statistically significant superiority over competing models while requiring less than one-third of the computation time needed by models that incorporate BP in a layer-wise manner. The source code for all proposed models is available on GitHub: https://github.com/M-Hamdy-M/ed-hybrids.
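
    To make the layer-wise idea concrete, the sketch below is a hypothetical illustration (not the edAS or MO-MLP code released on GitHub): a small network for tabular classification in which every hidden layer receives its own private output layer fitted with a closed-form ridge solution, and prediction averages the scores of all output layers. The hidden weights here are drawn at random in the RVFL style; a hybrid scheme along the lines of the paper would instead train some layers with BP.

    import numpy as np

    rng = np.random.default_rng(0)

    def one_hot(y, n_classes):
        Y = np.zeros((y.size, n_classes))
        Y[np.arange(y.size), y] = 1.0
        return Y

    def fit_head(H, Y, lam=1e-2):
        # Closed-form ridge solution (CFS) mapping hidden features H to one-hot targets Y.
        return np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ Y)

    def train_layerwise(X, y, n_classes, widths=(64, 64, 64)):
        layers, heads, H = [], [], X
        for w in widths:
            # Randomly initialised hidden layer (RVFL-style); a hybrid scheme
            # could fit some of these layers with backpropagation (BP) instead.
            W = rng.standard_normal((H.shape[1], w)) / np.sqrt(H.shape[1])
            b = rng.standard_normal(w)
            H = np.tanh(H @ W + b)
            layers.append((W, b))
            # Private output layer for this hidden layer, solved in closed form.
            heads.append(fit_head(H, one_hot(y, n_classes)))
        return layers, heads

    def predict(X, layers, heads):
        H, scores = X, []
        for (W, b), beta in zip(layers, heads):
            H = np.tanh(H @ W + b)
            scores.append(H @ beta)                    # each output layer votes
        return np.mean(scores, axis=0).argmax(axis=1)  # ensemble by averaging

    # Toy usage on synthetic tabular data.
    X = rng.standard_normal((200, 10))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    layers, heads = train_layerwise(X, y, n_classes=2)
    print("train accuracy:", (predict(X, layers, heads) == y).mean())

    Solving each output layer in closed form keeps the per-layer cost to a single linear solve, which reflects the efficiency advantage the abstract attributes to CFS-based training relative to models that apply BP in a layer-wise manner.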
    URI
    https://www.sciencedirect.com/science/article/pii/S0031320325006958
    DOI/handle
    http://dx.doi.org/10.1016/j.patcog.2025.112035
    http://hdl.handle.net/10576/68415
    Collections
  • Computer Science & Engineering [2491 items]
