Deep Sequential Multi-task Modeling for Next Check-in Time and Location Prediction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

11 Scopus citations

Abstract

In this paper, we address the problem of next check-in time and location prediction and propose a deep sequential multi-task model, named Personalized Recurrent Point Process with Attention (PRPPA), which seamlessly integrates user static representation learning, dynamic recent check-in behavior modeling, and a temporal point process into a unified architecture. An attention mechanism is further incorporated into the intensity function of the point process to explicitly capture the effect of past check-in events. Through experiments, we verify that the proposed model is effective for both location and time prediction.
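The abstract describes a recurrent point-process intensity function augmented with attention over past check-in events. A minimal sketch of that idea is shown below, assuming an RMTPP-style exponential parameterization and a simple recency-based attention score; the names (`intensity`, `v`, `w`, `b`) and the exact functional form are illustrative assumptions, not the paper's PRPPA formulation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def intensity(t, event_times, hidden, v, w, b):
    """Conditional intensity lambda(t) of a toy recurrent point process
    with attention over past events (hypothetical parameterization).

    event_times : times of past events, ascending
    hidden      : RNN hidden states after each past event, shape (n, d)
    v, w, b     : illustrative model parameters
    """
    t_last = event_times[-1]
    # Attention scores: more recent events receive larger weight
    # (an illustrative choice; PRPPA learns its own scoring function).
    scores = -(t - np.array(event_times))
    alpha = softmax(scores)
    context = alpha @ hidden  # attention-weighted summary of history
    # exp(.) keeps the intensity strictly positive.
    return float(np.exp(v @ context + w * (t - t_last) + b))
```

With such an intensity, the expected next-event time can be obtained by numerically integrating the density lambda(t) * exp(-integral of lambda), while the location head would be a separate softmax over venues sharing the same hidden state, which is the multi-task aspect.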

Original language: English
Title of host publication: Database Systems for Advanced Applications - DASFAA 2019 International Workshops
Subtitle of host publication: BDMS, BDQM, and GDMA, Proceedings
Editors: Yongxin Tong, Juggapong Natwichai, Guoliang Li, Jun Yang, Joao Gama
Publisher: Springer Verlag
Pages: 353-357
Number of pages: 5
ISBN (Print): 9783030185893
DOIs
State: Published - 2019
Event: 24th International Conference on Database Systems for Advanced Applications, DASFAA 2019 - Chiang Mai, Thailand
Duration: 22 Apr 2019 - 25 Apr 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11448 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 24th International Conference on Database Systems for Advanced Applications, DASFAA 2019
Country/Territory: Thailand
City: Chiang Mai
Period: 22/04/19 - 25/04/19

Keywords

  • Check-in prediction
  • Deep recurrent modeling
  • Multi-task learning
  • Temporal point process

