
Author: Qiao, Zhongzheng
Author: Hu, Minghui
Author: Jiang, Xudong
Author: Suganthan, Ponnuthurai Nagaratnam
Author: Savitha, Ramasamy
Available date: 2025-01-20T05:12:03Z
Publication Date: 2023
Publication Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Resource: Scopus
Identifier: http://dx.doi.org/10.1109/ICASSP49357.2023.10094960
ISSN: 1520-6149
URI: http://hdl.handle.net/10576/62269
Abstract: Class-incremental learning (CIL) on multivariate time series (MTS) is an important yet understudied problem. Motivated by practical privacy-sensitive settings, we propose a novel distillation-based strategy that uses a single-headed classifier and stores no historical samples. We exploit Soft Dynamic Time Warping (Soft-DTW) for knowledge distillation, aligning the feature maps along the temporal dimension before computing the discrepancy. Compared with the Euclidean distance, Soft-DTW is better at overcoming catastrophic forgetting and balancing the stability-plasticity trade-off. We construct two novel MTS-CIL benchmarks for comprehensive experiments. Combined with a prototype augmentation strategy, our framework significantly outperforms other prominent exemplar-free algorithms.
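The abstract describes computing the distillation discrepancy with Soft-DTW, which aligns two temporal feature maps before measuring their difference. Below is a minimal illustrative sketch of such a Soft-DTW distillation term in NumPy; the function names (soft_dtw, distill_loss), the feature-map shapes, and the squared-Euclidean step cost are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def soft_dtw(x, y, gamma=1.0):
    """Soft-DTW alignment cost between two temporal feature maps.

    x: (n, d) feature map from the current model for one sample (assumed shape)
    y: (m, d) feature map from the frozen previous-task model (assumed shape)
    gamma: soft-min smoothing parameter (gamma -> 0 recovers classic DTW)
    """
    n, m = x.shape[0], y.shape[0]
    # Pairwise squared-Euclidean cost between time steps (assumed step cost)
    cost = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)

    # Dynamic-programming table with the soft-min recursion of Soft-DTW
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = np.array([R[i - 1, j], R[i, j - 1], R[i - 1, j - 1]])
            # Numerically stable soft-min_gamma(c) = -gamma * log(sum(exp(-c / gamma)))
            rmin = c.min()
            softmin = rmin - gamma * np.log(np.exp(-(c - rmin) / gamma).sum())
            R[i, j] = cost[i - 1, j - 1] + softmin
    return R[n, m]

def distill_loss(feats_new, feats_old, gamma=1.0):
    """Average Soft-DTW discrepancy over a batch of temporal feature maps."""
    return np.mean([soft_dtw(f_n, f_o, gamma)
                    for f_n, f_o in zip(feats_new, feats_old)])
```

Unlike a time-step-wise Euclidean penalty, the soft alignment lets the old and new feature maps differ in temporal pacing while still penalizing shape changes, which is the property the abstract credits for the improved stability-plasticity balance.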
Sponsor: This research is part of the programme DesCartes and is supported by the National Research Foundation, Prime Minister's Office, Singapore under its Campus for Research Excellence and Technological Enterprise (CREATE) programme.
Language: en
Publisher: Institute of Electrical and Electronics Engineers Inc.
Subject: Continual Learning; Dynamic Time Warping; Knowledge Distillation; Multivariate time series classification
Title: Class-Incremental Learning on Multivariate Time Series Via Shape-Aligned Temporal Distillation
Type: Conference
Pagination: 1-5
Volume Number: 2023-June
Access Type: Full Text

