TY - GEN
T1 - Leveraging Dependencies among Learned Temporal Subsequences
AU - Roychoudhury, Shoumik
AU - Zhou, Fang
AU - Obradovic, Zoran
N1 - Publisher Copyright:
Copyright © 2022 by SIAM.
PY - 2022
Y1 - 2022
N2 - Research on classifying time-series based on subsequences, known as shapelets, has attracted considerable interest in the community. Most existing shapelet-based time-series classification approaches neglect the temporal dependencies among extracted shapelets. Recently, shapelet-orders that encode the temporal dependencies among pairwise shapelets were shown to be informative features. However, because the state-of-the-art model relies on a random selection of candidate shapelets, it does not guarantee optimal shapelet selection, which in turn may lead to inferior-quality shapelet-orders. Learning shapelets, instead of searching for them, guarantees near-optimal shapelets, thus decreasing generalization error. However, the costly initialization approach for learning generalized shapelets significantly limits its scalability on large time-series datasets. We address the problem of leveraging temporal dependencies among generalized shapelets from randomly initialized subsequences by jointly learning from the shapelet-transform space and the shapelet-order space. The underlying hypothesis is that leveraging the temporal-dependency information of generalized shapelets improves classification performance. Furthermore, introducing a randomized subsequence initialization for learning generalized shapelets yields a more scalable shapelet-learning approach. The proposed model was significantly more accurate and faster than the baseline alternatives when evaluated on both synthetic and real-world time-series datasets.
AB - Research on classifying time-series based on subsequences, known as shapelets, has attracted considerable interest in the community. Most existing shapelet-based time-series classification approaches neglect the temporal dependencies among extracted shapelets. Recently, shapelet-orders that encode the temporal dependencies among pairwise shapelets were shown to be informative features. However, because the state-of-the-art model relies on a random selection of candidate shapelets, it does not guarantee optimal shapelet selection, which in turn may lead to inferior-quality shapelet-orders. Learning shapelets, instead of searching for them, guarantees near-optimal shapelets, thus decreasing generalization error. However, the costly initialization approach for learning generalized shapelets significantly limits its scalability on large time-series datasets. We address the problem of leveraging temporal dependencies among generalized shapelets from randomly initialized subsequences by jointly learning from the shapelet-transform space and the shapelet-order space. The underlying hypothesis is that leveraging the temporal-dependency information of generalized shapelets improves classification performance. Furthermore, introducing a randomized subsequence initialization for learning generalized shapelets yields a more scalable shapelet-learning approach. The proposed model was significantly more accurate and faster than the baseline alternatives when evaluated on both synthetic and real-world time-series datasets.
UR - https://www.scopus.com/pages/publications/85131317055
M3 - Conference contribution
AN - SCOPUS:85131317055
T3 - Proceedings of the 2022 SIAM International Conference on Data Mining, SDM 2022
SP - 504
EP - 512
BT - Proceedings of the 2022 SIAM International Conference on Data Mining, SDM 2022
PB - Society for Industrial and Applied Mathematics Publications
T2 - 2022 SIAM International Conference on Data Mining, SDM 2022
Y2 - 28 April 2022 through 30 April 2022
ER -