Robust multi-label surgical tool classification in noisy endoscopic videos
Author | Qayyum, Adnan |
Author | Ali, Hassan |
Author | Caputo, Massimo |
Author | Vohra, Hunaid |
Author | Akinosho, Taofeek |
Author | Abioye, Sofiat |
Author | Berrou, Ilhem |
Author | Capik, Paweł |
Author | Qadir, Junaid |
Author | Bilal, Muhammad |
Available date | 2025-07-08T03:58:11Z |
Publication Date | 2025 |
Publication Name | Scientific Reports |
Resource | Scopus |
Identifier | http://dx.doi.org/10.1038/s41598-024-82351-5 |
ISSN | 2045-2322 |
Abstract | Over the past few years, surgical data science has attracted substantial interest from the machine learning (ML) community. Various studies have demonstrated the efficacy of emerging ML techniques in analysing surgical data, particularly recordings of procedures, for digitising clinical and non-clinical functions such as preoperative planning, context-aware decision-making, and operating skill assessment. However, this field is still in its infancy and lacks representative, well-annotated datasets for training robust models on intermediate ML tasks. Moreover, existing datasets suffer from inaccurate labels, hindering the development of reliable models. In this paper, we propose a systematic methodology for developing robust models for surgical tool classification using noisy endoscopic videos. Our methodology introduces two key innovations: (1) an intelligent active learning strategy for minimal dataset identification and label correction by human experts through collective intelligence; and (2) an ensembling strategy for a student-teacher model-based self-training framework to achieve robust classification of 14 surgical tools in a semi-supervised fashion. Furthermore, we employ strategies such as weighted data loaders and label smoothing to enable the models to learn difficult samples and to address class imbalance. The proposed methodology achieves an average F1-score of 85.88% for ensemble model-based self-training with class weights, and 80.88% without class weights, on noisy tool labels. Our method also significantly outperforms existing approaches, demonstrating its effectiveness. |
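Note (illustrative only): the record contains no code, so the following PyTorch sketch is an assumption of how the ingredients named in the abstract could fit together in one self-training round: a teacher ensemble pseudo-labels unlabelled frames, a weighted sampler counters class imbalance, and label smoothing softens the noisy multi-hot tool labels. The function names (self_training_round, make_weighted_loader, smooth_labels), thresholds, model choices, and hyperparameters are hypothetical placeholders, not the authors' released implementation.

# Minimal sketch of one self-training round for multi-label classification of
# 14 surgical tools, under the assumptions stated above.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

NUM_TOOLS = 14

def smooth_labels(y, eps=0.1):
    # Soften hard 0/1 tool labels towards eps/2 and 1 - eps/2.
    return y * (1.0 - eps) + 0.5 * eps

@torch.no_grad()
def pseudo_label(teachers, frames, threshold=0.5):
    # Average the sigmoid scores of the teacher ensemble and binarise them.
    probs = torch.stack([torch.sigmoid(t(frames)) for t in teachers]).mean(dim=0)
    return (probs > threshold).float()

def make_weighted_loader(features, labels, batch_size=32):
    # Over-sample clips containing rare tools to counter class imbalance.
    class_freq = labels.mean(dim=0).clamp(min=1e-6)    # per-tool frequency
    sample_w = (labels / class_freq).sum(dim=1) + 1.0  # rarer tools -> larger weight
    sampler = WeightedRandomSampler(sample_w, num_samples=len(sample_w), replacement=True)
    return DataLoader(TensorDataset(features, labels), batch_size=batch_size, sampler=sampler)

def self_training_round(student, teachers, labelled, unlabelled, epochs=1, lr=1e-4):
    x_l, y_l = labelled
    y_u = pseudo_label(teachers, unlabelled)           # teachers label the unlabelled frames
    x_all = torch.cat([x_l, unlabelled])
    y_all = torch.cat([y_l, y_u])
    loader = make_weighted_loader(x_all, y_all)
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    criterion = nn.BCEWithLogitsLoss()                 # multi-label objective
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss = criterion(student(xb), smooth_labels(yb))
            loss.backward()
            opt.step()
    return student

if __name__ == "__main__":
    # Synthetic stand-ins for frame features and multi-hot tool labels.
    feat_dim = 512
    student = nn.Linear(feat_dim, NUM_TOOLS)
    teachers = [nn.Linear(feat_dim, NUM_TOOLS) for _ in range(3)]
    labelled = (torch.randn(64, feat_dim), (torch.rand(64, NUM_TOOLS) > 0.8).float())
    unlabelled = torch.randn(128, feat_dim)
    self_training_round(student, teachers, labelled, unlabelled)

In practice the student of one round would become (or be added to) the teacher ensemble for the next round, which is the usual student-teacher self-training loop the abstract refers to.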
Sponsor | The authors gratefully acknowledge the University of the West of England (UWE), Bristol, for their financial support through the Vice Chancellor’s Challenge Fund (Project: IVA HEART; Grant No: CF2231). This funding facilitated the recruitment of a Research Associate for this study. Additionally, the authors acknowledge the British Heart Foundation Chair for supporting Prof. Massimo Caputo’s research (UOB Project No: CH/17/1/32804). |
Language | en |
Publisher | Nature Research |
Subject | Algorithms; Article; Classification; Endoscopy; Female; Humans; Intelligence; Machine Learning; Male; Procedures; Surgical Equipment; Surgical Instruments; Teaching Assistant; Video Recording |
Type | Article |
Issue Number | 1 |
Volume Number | 15 |
Collection | Computer Science & Engineering |