Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL)

Event Representation with Sequential, Semi-Supervised Discrete Variables


Within the context of event modeling and understanding, we propose a new method for neural sequence modeling that takes partially observed sequences of discrete, external knowledge into account. We construct a sequential neural variational autoencoder, which uses a carefully defined encoder and the Gumbel-Softmax reparametrization to enable successful backpropagation during training. We show that our approach outperforms multiple baselines and the state of the art in narrative script induction on multiple event modeling tasks, and we demonstrate that it converges more quickly.
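The Gumbel-Softmax reparametrization mentioned in the abstract lets gradients flow through samples of discrete latent variables by replacing hard categorical sampling with a temperature-controlled, approximately one-hot softmax over Gumbel-perturbed logits. The sketch below is illustrative only (a minimal NumPy rendering of the general trick, not the authors' code):

```python
import numpy as np

def gumbel_softmax_sample(logits, temperature=1.0, rng=None):
    """Draw a differentiable, approximately one-hot sample from a
    categorical distribution with unnormalized log-probabilities
    `logits`, via the Gumbel-Softmax (Concrete) reparametrization."""
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise by inverse transform sampling.
    u = rng.uniform(low=1e-10, high=1.0, size=np.shape(logits))
    gumbel = -np.log(-np.log(u))
    # Perturb the logits, then apply a temperature-scaled softmax.
    y = (np.asarray(logits) + gumbel) / temperature
    y = y - np.max(y)  # subtract max for numerical stability
    exp_y = np.exp(y)
    return exp_y / np.sum(exp_y)

# Lower temperatures push the sample closer to a one-hot vector;
# higher temperatures make it more uniform (and gradients smoother).
sample = gumbel_softmax_sample(np.log([0.1, 0.7, 0.2]), temperature=0.5)
```

In training, the temperature is typically annealed toward zero so that the relaxed samples approach true discrete draws while gradients remain usable early on.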



