Learning from event sequences
Authors
Shou, Xiao
Issue Date
2023-08
Type
Electronic thesis
Thesis
Language
en_US
Keywords
Mathematics
Abstract
Event sequences are fundamental in various domains, encompassing customer transactions, electronic health records, and other scenarios involving actions over time. Understanding the dynamics of event transitions enables prediction of future events, examination of event influence, and quantification of their effects. This thesis focuses on learning and inference problems in event sequences using state-of-the-art machine learning and deep learning approaches. Temporal point processes are widely employed for modeling discrete events in continuous time. In this thesis, we propose an innovative self-supervised learning paradigm coupled with a contrastive module that leverages transformer encoders to predict the subsequent event given its history. What sets our approach apart is its ability to capture continuous-time dynamics, distinguishing it from traditional self-supervision techniques used in time series and NLP. In addition, we tackle the intricate task of multi-label prediction in event streams, where multiple event types can co-occur within the continuous-time framework. To address this challenge, we introduce the Transformer-based Conditional Mixture of Bernoulli Network, which captures complex temporal and probabilistic interdependencies among concurrent event labels. Through comprehensive experimentation, our proposed methodologies exhibit compelling empirical performance compared to existing methods, establishing their efficacy across various synthetic and real-world benchmark datasets.
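The abstract itself contains no code; the PyTorch sketch below only illustrates the general idea of a conditional mixture-of-Bernoulli output layer for co-occurring labels. The class name MixtureOfBernoulliHead and its interface are our own illustrative assumptions, not the thesis's implementation.

```python
import torch
import torch.nn as nn

class MixtureOfBernoulliHead(nn.Module):
    """Illustrative conditional mixture-of-Bernoulli output layer.

    Given a hidden state h summarizing the event history (e.g., from a
    transformer encoder), it parameterizes
        p(y | h) = sum_k pi_k(h) * prod_j Bern(y_j; mu_{kj}(h)),
    which can capture dependencies among co-occurring event labels that a
    single factorized Bernoulli layer cannot.
    """
    def __init__(self, hidden_dim: int, num_labels: int, num_components: int):
        super().__init__()
        self.num_labels = num_labels
        self.num_components = num_components
        self.mixture_logits = nn.Linear(hidden_dim, num_components)
        self.bernoulli_logits = nn.Linear(hidden_dim, num_components * num_labels)

    def log_prob(self, h: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # h: (batch, hidden_dim); y: (batch, num_labels) float tensor of 0/1 labels
        log_pi = torch.log_softmax(self.mixture_logits(h), dim=-1)  # (B, K)
        logits = self.bernoulli_logits(h).view(
            -1, self.num_components, self.num_labels
        )  # (B, K, L)
        # Per-component Bernoulli log-likelihood of the observed label vector:
        # negate the elementwise BCE loss and sum over labels.
        ll = -nn.functional.binary_cross_entropy_with_logits(
            logits, y.unsqueeze(1).expand_as(logits), reduction="none"
        ).sum(-1)  # (B, K)
        # Marginalize over mixture components in log space.
        return torch.logsumexp(log_pi + ll, dim=-1)  # (B,)
```

Training would then minimize the negative of this log-probability over observed label vectors, with the mixture weights and component means both conditioned on the encoded history.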
To identify influential events related to specific events of interest, we introduce the influence-aware attention mechanism for multivariate temporal point process models. This model leverages transformer attention mechanisms and variational inference techniques to capture temporal dynamics and aggregate instance-instance interactions, thereby revealing type-wise influence between different event types. We also explore scenarios without timestamps and propose a probabilistic attention-to-influence neural model to discover influential event types in timestamp-free datasets. Our models demonstrate superior performance on influencing-set identification and prediction tasks for a particular event of interest.
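As a minimal sketch of how instance-level attention can be aggregated into type-wise influence, consider the NumPy function below. The simple averaging rule and the name typewise_influence are assumptions for illustration; the thesis's model additionally relies on variational inference rather than raw attention weights.

```python
import numpy as np

def typewise_influence(attn: np.ndarray, types: np.ndarray, num_types: int) -> np.ndarray:
    """Aggregate instance-level attention into a type-by-type influence matrix.

    attn[i, j] is the attention weight event i places on earlier event j;
    influence[a, b] averages those weights over all pairs where event i has
    type a and event j has type b.
    """
    influence = np.zeros((num_types, num_types))
    counts = np.zeros((num_types, num_types))
    for i in range(len(types)):
        for j in range(i):  # events attend only to their history
            influence[types[i], types[j]] += attn[i, j]
            counts[types[i], types[j]] += 1
    # Avoid division by zero for type pairs that never co-occur.
    return influence / np.maximum(counts, 1)
```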
Moreover, we delve into causal inference in temporal point processes by extending Rubin's framework of average treatment effects and propensity scores to multivariate point processes. This extension enables causal inference between event variables in recurrent event streams. For datasets without timestamps, we enhance autoregressive transformer models by incorporating pairwise causal knowledge derived from causal knowledge graphs; this knowledge guides the training of the transformer models, ultimately bolstering the reliability and trustworthiness of the model. This line of research has proven highly effective in our investigations.
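For orientation, the Rubin-style quantities being extended can be sketched as follows; the windowed treatment and outcome construction here is an illustrative assumption, not the thesis's exact definition.

```latex
% Hedged sketch of Rubin-style causal quantities in a point-process setting.
\[
  \mathrm{ATE} \;=\; \mathbb{E}\big[\,Y(1) - Y(0)\,\big],
  \qquad
  e(\mathcal{H}_t) \;=\; \Pr\big(T = 1 \mid \mathcal{H}_t\big),
\]
% where, for instance, the treatment T may indicate that an event of a given
% type occurred in a window preceding time t, Y(1) and Y(0) are potential
% outcomes such as counts of a target event type in a follow-up window, and
% the event history H_t plays the role of the covariates in the propensity
% score e(.).
```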
Extensive experiments validate the effectiveness of our models and algorithms, outperforming existing methods on synthetic and real-world benchmarks. Finally, we identify future directions for research, including the exploration of multi-modal self-supervision and sequential decision-making within the context of temporal point processes.
Description
August 2023
School of Science
Publisher
Rensselaer Polytechnic Institute, Troy, NY