BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model

Alex Wang, Kyunghyun Cho


Abstract
We show that BERT (Devlin et al., 2018) is a Markov random field language model. This formulation gives rise to a natural procedure for sampling sentences from BERT. We generate from BERT and find that it can produce high-quality, fluent generations. Compared to the generations of a traditional left-to-right language model, the sentences BERT generates are more diverse but of slightly worse quality.
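
The natural sampler for a Markov random field is Gibbs sampling: start from an all-[MASK] sequence and repeatedly resample one position from BERT's masked-token conditional. Below is a minimal sketch of such a sampler, not the authors' released implementation; it assumes the HuggingFace transformers package and the bert-base-uncased checkpoint, and the function name gibbs_sample and its parameters (seq_len, n_iters, temperature) are hypothetical choices for illustration.

```python
# Gibbs-sampling sketch for generating from BERT as an MRF language model.
# Assumes: pip install torch transformers
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def gibbs_sample(seq_len=10, n_iters=200, temperature=1.0):
    # Sequence layout: [CLS] w_1 ... w_n [SEP], all w_i initialized to [MASK].
    ids = torch.full((1, seq_len + 2), tokenizer.mask_token_id, dtype=torch.long)
    ids[0, 0] = tokenizer.cls_token_id
    ids[0, -1] = tokenizer.sep_token_id
    with torch.no_grad():
        for _ in range(n_iters):
            # Pick a random non-special position, mask it, and resample it
            # from BERT's conditional distribution over the vocabulary.
            pos = torch.randint(1, seq_len + 1, (1,)).item()
            ids[0, pos] = tokenizer.mask_token_id
            logits = model(input_ids=ids).logits[0, pos] / temperature
            probs = torch.softmax(logits, dim=-1)
            ids[0, pos] = torch.multinomial(probs, 1).item()
    return tokenizer.decode(ids[0, 1:-1])

print(gibbs_sample())
```

With more iterations the chain mixes further from its all-[MASK] initialization; lowering the temperature trades the diversity noted in the abstract for fluency.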
Anthology ID: W19-2304
Volume: Proceedings of the Workshop on Methods for Optimizing and Evaluating Neural Language Generation
Month: June
Year: 2019
Address: Minneapolis, Minnesota
Venues: NAACL | WS
Publisher: Association for Computational Linguistics
Pages: 30–36
URL: https://www.aclweb.org/anthology/W19-2304
DOI: 10.18653/v1/W19-2304
PDF: http://aclanthology.lst.uni-saarland.de/W19-2304.pdf