Important complexity reduction of random forest in multi-classification problem
Author | Hassine, Kawther |
Author | Erbad, Aiman |
Author | Hamila, Ridha |
Available date | 2020-05-14T09:55:45Z |
Publication Date | 2019 |
Publication Name | 2019 15th International Wireless Communications and Mobile Computing Conference, IWCMC 2019 |
Resource | Scopus |
Abstract | Algorithm complexity in machine learning problems has been a real concern, especially with large-scale systems. As data dimensionality increases, particular emphasis is placed on designing computationally efficient learning models. In this paper, we propose an approach to reduce the complexity of a multi-classification learning problem in cloud networks. Based on the Random Forest algorithm and the highly dimensional UNSW-NB15 dataset, a tuning of the algorithm is first performed to reduce the number of grown trees used during classification. Then, we apply an importance-based feature selection to optimize the number of predictors involved in the learning process. All of these optimizations, implemented with respect to the best performance recorded by our classifier, yield substantial improvements in computational complexity during both the training and prediction phases. © 2019 IEEE. |
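The two optimizations described in the abstract — shrinking the number of grown trees and pruning predictors by importance — can be sketched with scikit-learn. This is a minimal illustration on synthetic data standing in for the UNSW-NB15 dataset; the tree count and the importance threshold here are illustrative assumptions, not the values tuned in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split

# Synthetic multi-class data in place of the high-dimensional dataset
X, y = make_classification(n_samples=1000, n_features=40, n_informative=10,
                           n_classes=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: limit the number of grown trees (n_estimators is an assumed value)
rf = RandomForestClassifier(n_estimators=50, random_state=0)
rf.fit(X_train, y_train)

# Step 2: importance-based feature selection keeps only the strong predictors
# ("median" threshold is an assumption; the paper selects by best performance)
selector = SelectFromModel(rf, threshold="median", prefit=True)
X_train_sel = selector.transform(X_train)
X_test_sel = selector.transform(X_test)

# Retrain the reduced forest on the smaller predictor set
rf_small = RandomForestClassifier(n_estimators=50, random_state=0)
rf_small.fit(X_train_sel, y_train)
print(X_train.shape[1], "->", X_train_sel.shape[1], "features")
print("accuracy: %.3f" % rf_small.score(X_test_sel, y_test))
```

Both steps lower cost at training time (fewer and shallower fits) and at prediction time (fewer trees to traverse, fewer features to extract), which is the complexity gain the paper reports.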
Sponsor | This publication was made possible by NPRP grant 8-634-1-131 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the authors. |
Language | en |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Subject | Algorithm complexity; Feature selection; Predictor importance; Random Forest |
Type | Conference |
Pagination | 226-231 |
Files in this item
There are no files associated with this item.
This item appears in the following Collection(s)
- Computer Science & Engineering [2409 items]
- Electrical Engineering [2801 items]