Rel-grams: A Probabilistic Model of Relations in Text
authors Niranjan Balasubramanian, Stephen Soderland and Mausam
venue Joint Workshop on Automatic Knowledge Base Construction and Web-scale Knowledge Extraction
year 2012
abstract We introduce the Rel-grams language model, which is analogous to an n-grams model but is computed over relations rather than over words. The model encodes the conditional probability of observing a relational tuple R, given that R' was observed in a window of prior relational tuples. We build a database of rel-grams co-occurrence statistics from ReVerb extractions over 1.8M newswire documents and show that a graphical model based on these statistics is useful for automatically discovering event templates. We make this database freely available and hope it will prove a useful resource for a wide variety of NLP tasks.
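The core statistic the abstract describes, the conditional probability of a tuple R given a prior tuple R' within a window, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tuple representation, window size, and maximum-likelihood estimate are assumptions, and the real system builds these counts from ReVerb extractions.

```python
from collections import Counter

def relgram_counts(docs, window=2):
    """Build rel-gram co-occurrence statistics.

    docs: iterable of documents, each an ordered list of relation tuples
          (any hashable representation, e.g. ("firm", "acquired", "rival")).
    Returns (pair_counts, context_totals): pair_counts[(r_prime, r)] counts
    how often r followed r_prime within `window` tuples; context_totals[r_prime]
    is the total number of such pairs anchored at r_prime.
    """
    pair_counts, context_totals = Counter(), Counter()
    for tuples in docs:
        for i, r_prime in enumerate(tuples):
            for r in tuples[i + 1 : i + 1 + window]:
                pair_counts[(r_prime, r)] += 1
                context_totals[r_prime] += 1
    return pair_counts, context_totals

def relgram_prob(r, r_prime, pair_counts, context_totals):
    """Maximum-likelihood estimate of P(R | R'); 0.0 for unseen contexts."""
    total = context_totals[r_prime]
    return pair_counts[(r_prime, r)] / total if total else 0.0
```

For example, in a document whose extracted tuples are A, B, C in order, with a window of 2, A co-occurs with both B and C, so P(B | A) = 0.5 under this estimate.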
