Please use this identifier to cite or link to this item: http://dx.doi.org/10.18419/opus-10604
Authors: Nguyen, Truong Thinh
Title: Machine translation with transformers
Issue Date: 2019
Type: Master's thesis
Pages: 60
URI: http://elib.uni-stuttgart.de/handle/11682/10621
http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-106212
http://dx.doi.org/10.18419/opus-10604
Abstract: The Transformer translation model (Vaswani et al., 2017), which relies on self-attention mechanisms, has achieved state-of-the-art performance in recent neural machine translation (NMT) tasks. Although the Recurrent Neural Network (RNN) is one of the most powerful and widely used architectures for transforming one sequence into another, the Transformer model does not employ any RNN. This work investigates the performance of the Transformer model compared to different kinds of RNN models on NMT problems of varying difficulty.
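The abstract refers to the scaled dot-product self-attention mechanism at the core of the Transformer. As a rough illustration only (not code from the thesis), a minimal single-head sketch in NumPy might look like the following; all names, shapes, and values are assumptions chosen for the example.

    # Minimal sketch of scaled dot-product self-attention (Vaswani et al., 2017).
    # Shapes and projection matrices are illustrative, not taken from the thesis.
    import numpy as np

    def self_attention(x, w_q, w_k, w_v):
        """x: (seq_len, d_model); w_q, w_k, w_v: (d_model, d_k) projections."""
        q = x @ w_q                                   # queries
        k = x @ w_k                                   # keys
        v = x @ w_v                                   # values
        d_k = q.shape[-1]
        scores = q @ k.T / np.sqrt(d_k)               # pairwise attention scores
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
        return weights @ v                            # weighted sum of values per position

    # Example: 5 tokens, model dimension 8, head dimension 4.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(5, 8))
    w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
    out = self_attention(x, w_q, w_k, w_v)            # shape (5, 4)

Unlike an RNN, every output position here attends directly to every input position in parallel, which is the property the abstract contrasts with recurrent sequence-to-sequence models.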
Appears in Collections: 05 Fakultät Informatik, Elektrotechnik und Informationstechnik

Files in This Item:
File: thesis.pdf (1,41 MB, Adobe PDF)


Items in OPUS are protected by copyright, with all rights reserved, unless otherwise indicated.