Simple item record

Author: Khan, Muhammad Asif
Author: Hamila, Ridha
Author: Menouar, Hamid
Date available: 2024-08-21T09:49:58Z
Publication date: 2023
Publication name: Proceedings - 2023 IEEE International Conference on Big Data and Smart Computing, BigComp 2023
Source: Scopus
URI: http://dx.doi.org/10.1109/BigComp57234.2023.00014
URI: http://hdl.handle.net/10576/57847
Abstract: Deep learning models require an enormous amount of data for training. Recently, however, machine learning has been shifting from model-centric to data-centric approaches. In data-centric approaches, the focus is on refining and improving the quality of the data to improve the learning performance of models, rather than on redesigning model architectures. In this paper, we propose CLIP, i.e., Curriculum Learning with Iterative data Pruning. CLIP combines two data-centric approaches, curriculum learning and dataset pruning, to improve model accuracy and convergence speed. The proposed scheme applies loss-aware dataset pruning to iteratively remove the least significant samples and progressively reduce the size of the effective dataset during curriculum learning. Extensive experiments on crowd density estimation models validate the idea of combining the two approaches, reducing convergence time and improving generalization. To our knowledge, the idea of data pruning as an embedded process in curriculum learning is novel.
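The scheme described in the abstract, pruning embedded inside a curriculum training loop, can be sketched roughly as follows. This is a minimal toy illustration, not the paper's crowd-counting setup: the linear model, the |target|-based curriculum ordering, and all hyperparameters are assumptions made only for the sketch.

```python
import numpy as np

def curriculum_with_pruning(X, y, stages=3, prune_frac=0.2, lr=0.01, epochs=50):
    """Toy sketch of curriculum learning with iterative loss-aware pruning.

    At each curriculum stage, train on the current subset, then drop the
    prune_frac of samples with the LOWEST loss (treated here as the least
    significant ones), progressively shrinking the effective dataset.
    """
    model = np.zeros(X.shape[1])
    # Toy curriculum: order samples "easy first" by target magnitude (assumption).
    order = np.argsort(np.abs(y))
    X, y = X[order], y[order]
    keep = np.arange(len(y))            # indices of the effective dataset
    for _stage in range(stages):
        Xs, ys = X[keep], y[keep]
        for _ in range(epochs):         # plain gradient descent on MSE
            grad = 2.0 * Xs.T @ (Xs @ model - ys) / len(ys)
            model -= lr * grad
        # Loss-aware pruning: keep the highest-loss samples, drop the rest.
        losses = (Xs @ model - ys) ** 2
        n_keep = max(1, int(len(keep) * (1.0 - prune_frac)))
        keep = keep[np.argsort(losses)[::-1][:n_keep]]
    return model, keep
```

Each stage both refines the model and shrinks the training set, so later (harder) curriculum stages run over fewer samples, which is where the convergence-time saving in the abstract comes from.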
Sponsor: This publication was made possible by the PDRA award PDRA7-0606-21012 from the Qatar National Research Fund (a member of The Qatar Foundation). The statements made herein are solely the responsibility of the authors.
Language: en
Publisher: IEEE
Subjects: Convergence; crowd counting; curriculum learning; data-centric; pruning
Title: CLIP: Train Faster with Less Data
Type: Conference Paper
Pages: 34-39
Access type: Full Text

