TY - JOUR
T1 - Few-shot activity learning by dual Markov logic networks
AU - Zhang, Zhimin
AU - Zhu, Tao
AU - Gao, Dazhi
AU - Xu, Jiabo
AU - Liu, Hong
AU - Ning, Huansheng
N1 - Publisher Copyright:
© 2022 Elsevier B.V.
PY - 2022/3/15
Y1 - 2022/3/15
N2 - In human activity recognition (HAR), a large amount of data may be unlabeled, so effective and credible data calibration is needed under few-shot learning (FSL). This paper proposes a new method for calibrating unlabeled data that uses two models to further improve the credibility of data labels. A Markov logic network (MLN) is used as the basic model: on the one hand, constructing knowledge reduces the dependence on the amount of data; on the other hand, the relationships between actions can be expressed effectively. Specifically, we pre-train two MLN models on two unmatched training datasets. These models then simultaneously infer the possible labels of unlabeled data. When the labels are inconsistent, the most likely label is selected and used to retrain the models. To ensure the credibility of the selected label, the selection process adopts a dual-model cross-validation method, which uses the least-squares method to determine the models’ weights according to the test results. Finally, it gives the probability of each label under the joint prediction of the two models. Experimental results show that the dual-model design is better than the single-model method in terms of time efficiency, and better than the single model, TextCNN, TextLSTM, Transformer, and other models in terms of label credibility.
AB - In human activity recognition (HAR), a large amount of data may be unlabeled, so effective and credible data calibration is needed under few-shot learning (FSL). This paper proposes a new method for calibrating unlabeled data that uses two models to further improve the credibility of data labels. A Markov logic network (MLN) is used as the basic model: on the one hand, constructing knowledge reduces the dependence on the amount of data; on the other hand, the relationships between actions can be expressed effectively. Specifically, we pre-train two MLN models on two unmatched training datasets. These models then simultaneously infer the possible labels of unlabeled data. When the labels are inconsistent, the most likely label is selected and used to retrain the models. To ensure the credibility of the selected label, the selection process adopts a dual-model cross-validation method, which uses the least-squares method to determine the models’ weights according to the test results. Finally, it gives the probability of each label under the joint prediction of the two models. Experimental results show that the dual-model design is better than the single-model method in terms of time efficiency, and better than the single model, TextCNN, TextLSTM, Transformer, and other models in terms of label credibility.
KW - Dual-model cross-validation
KW - Few-shot learning
KW - Least square method
KW - Markov logic network
KW - Unlabeled data calibration
UR - https://www.scopus.com/pages/publications/85123346074
U2 - 10.1016/j.knosys.2022.108158
DO - 10.1016/j.knosys.2022.108158
M3 - Article
AN - SCOPUS:85123346074
SN - 0950-7051
VL - 240
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 108158
ER -