TY - GEN
T1 - “Forget” the Forget Gate
T2 - 37th Computer Graphics International Conference, CGI 2020
AU - Fanta, Habtamu
AU - Shao, Zhiwen
AU - Ma, Lizhuang
N1 - Publisher Copyright:
© 2020, Springer Nature Switzerland AG.
PY - 2020
Y1 - 2020
AB - Abnormal event detection is a challenging task that requires effectively handling intricate appearance and motion features. In this paper, we present an approach to detecting anomalies in videos by training a novel LSTM-based self-contained network on dense optical flow of normal events. Because of its sigmoid implementation, the forget gate of a standard LSTM is susceptible to overlooking and dismissing relevant content in long-sequence tasks: it suppresses the contribution of the previous hidden state to the cell state, prioritizing the current input. In addition, the hyperbolic tangent activation of standard LSTMs sacrifices performance as networks grow deeper. To tackle these two limitations, we introduce a light, bi-gated LSTM cell that discards the forget gate and replaces the hyperbolic tangent with sigmoid activation. The proposed architecture fully retains the content of the previous hidden state, enabling the trained model to be robust and to make context-independent decisions during evaluation. Removing the forget gate also yields a simpler, less demanding LSTM cell with improved performance and computational efficiency. Empirical evaluations show that the proposed bi-gated LSTM-based network outperforms various LSTM-based models on abnormality detection and generalization tasks on the CUHK Avenue and UCSD datasets.
KW - Abnormal event detection
KW - Abnormality generalization
KW - Long Short-Term Memory
KW - Self-contained LSTM
UR - https://www.scopus.com/pages/publications/85096548937
U2 - 10.1007/978-3-030-61864-3_15
DO - 10.1007/978-3-030-61864-3_15
M3 - Conference contribution
AN - SCOPUS:85096548937
SN - 9783030618636
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 169
EP - 181
BT - Advances in Computer Graphics - 37th Computer Graphics International Conference, CGI 2020, Proceedings
A2 - Magnenat-Thalmann, Nadia
A2 - Stephanidis, Constantine
A2 - Papagiannakis, George
A2 - Wu, Enhua
A2 - Thalmann, Daniel
A2 - Sheng, Bin
A2 - Kim, Jinman
A2 - Gavrilova, Marina
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 20 October 2020 through 23 October 2020
ER -