TEMSET-24K: Densely Annotated Dataset for Indexing Multipart Endoscopic Videos using Surgical Timeline Segmentation

Date
2025-08-14

Authors
Bilal, Muhammad
Alam, Mahmood
Bapu, Deepashree
Korsgen, Stephan
Lal, Neeraj
Bach, Simon
Hajiyavand, Amir M.
Ali, Muhammed
Soomro, Kamran
Qasim, Iqbal
Capik, Paweł
Khan, Aslam
Khan, Zaheer
Vohra, Hunaid
Caputo, Massimo
Beggs, Andrew D.
Qayyum, Adnan
Qadir, Junaid
Ashraf, Shazad Q.
Abstract
Indexing endoscopic surgical videos is vital in surgical data science, forming the basis for systematic retrospective analysis and clinical performance evaluation. Despite its significance, current video analytics rely on manual indexing, a time-consuming process. Advances in computer vision, particularly deep learning, offer automation potential, yet progress is limited by the lack of publicly available, densely annotated surgical datasets. To address this, we present TEMSET-24K, an open-source dataset comprising 24,306 trans-anal endoscopic microsurgery (TEMS) video microclips. Each clip is meticulously annotated by clinical experts using a novel hierarchical labeling taxonomy encompassing “phase, task, and action” triplets, capturing intricate surgical workflows. To validate this dataset, we benchmarked deep learning models, including transformer-based architectures. Our in silico evaluation demonstrates high accuracy (up to 0.99) and F1 scores (up to 0.99) for key phases like “Setup” and “Suturing.” The STALNet model, tested with ConvNeXt, ViT, and SWIN V2 encoders, consistently segmented well-represented phases. TEMSET-24K provides a critical benchmark, propelling state-of-the-art solutions in surgical data science.
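To make the hierarchical "phase, task, action" labeling concrete, the sketch below shows one plausible way to represent a densely annotated microclip and to compute per-phase F1 scores like those reported above. The record fields, label values, and file layout here are illustrative assumptions, not the published TEMSET-24K schema.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class TimelineAnnotation:
    """One annotated microclip as a 'phase, task, action' triplet.

    Field names and example labels are assumptions for illustration;
    consult the dataset documentation for the actual format.
    """
    clip_id: str
    start_frame: int
    end_frame: int
    phase: str   # e.g. "Setup", "Suturing"
    task: str    # finer-grained step within the phase
    action: str  # atomic surgical action within the task


def per_phase_f1(predictions: list[str], targets: list[str]) -> dict[str, float]:
    """One-vs-rest F1 score for each phase label.

    A plain-Python stand-in for the per-phase metrics summarized in the abstract.
    """
    scores: dict[str, float] = {}
    for phase in set(targets):
        tp = sum(p == phase and t == phase for p, t in zip(predictions, targets))
        fp = sum(p == phase and t != phase for p, t in zip(predictions, targets))
        fn = sum(p != phase and t == phase for p, t in zip(predictions, targets))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        scores[phase] = (
            2 * precision * recall / (precision + recall)
            if precision + recall else 0.0
        )
    return scores


if __name__ == "__main__":
    # Hypothetical annotation and a toy prediction/ground-truth pair.
    ann = TimelineAnnotation("clip_00001", 0, 150, "Setup", "scope_insertion", "advance_scope")
    preds = ["Setup", "Setup", "Suturing", "Suturing"]
    gts = ["Setup", "Suturing", "Suturing", "Suturing"]
    print(ann)
    print(per_phase_f1(preds, gts))
```

In practice, frame-level predictions from a timeline segmentation model (such as the benchmarked STALNet variants) would be compared against these annotations in the same one-vs-rest fashion, per phase label.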
Collections
- Computer Science & Engineering [2518 items]

