The Annotated Transformer

Alexander Rush


Abstract
A major goal of open-source NLP is to quickly and accurately reproduce the results of new work, in a manner that the community can easily use and modify. While most papers publish enough detail for replication, it can still be difficult to achieve good results in practice. This paper presents a worked exercise of paper reproduction, with the goal of implementing the results of the recent Transformer model. The replication exercise aims for a simple code structure that follows the original work closely, while achieving an efficient, usable system.
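The work being reproduced is the Transformer of Vaswani et al., whose central operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. As context for the abstract, here is a minimal pure-Python sketch of that formula; it is an illustration only, not the paper's own (PyTorch-based) implementation:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    # Q, K, V are lists of equal-length vectors (lists of floats).
    d_k = len(K[0])
    out = []
    for q in Q:
        # Dot each query against every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Each output row is a convex combination of the value vectors, with weights determined by query-key similarity; the √d_k scaling keeps the dot products from saturating the softmax as dimensionality grows.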
Anthology ID: W18-2509
Volume: Proceedings of Workshop for NLP Open Source Software (NLP-OSS)
Month: July
Year: 2018
Address: Melbourne, Australia
Venues: ACL | NLPOSS | WS
Publisher: Association for Computational Linguistics
Pages: 52–60
URL: https://www.aclweb.org/anthology/W18-2509
DOI: 10.18653/v1/W18-2509
PDF: http://aclanthology.lst.uni-saarland.de/W18-2509.pdf