Machine translation with transformers

dc.contributor.author: Nguyen, Truong Thinh
dc.date.accessioned: 2019-10-21T13:05:58Z
dc.date.available: 2019-10-21T13:05:58Z
dc.date.issued: 2019
dc.description.abstract: The Transformer translation model (Vaswani et al., 2017), which relies on self-attention mechanisms, has achieved state-of-the-art performance in recent neural machine translation (NMT) tasks. Although the recurrent neural network (RNN) is one of the most powerful and widely used architectures for transforming one sequence into another, the Transformer model does not employ any RNNs. This work investigates how the Transformer model performs compared to several kinds of RNN models on NMT problems of varying difficulty. (See the self-attention sketch after this record.)
dc.identifier.other: 168133240X
dc.identifier.uri: http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-106212
dc.identifier.uri: http://elib.uni-stuttgart.de/handle/11682/10621
dc.identifier.uri: http://dx.doi.org/10.18419/opus-10604
dc.language.iso: en
dc.rights: info:eu-repo/semantics/openAccess
dc.subject.ddc: 004
dc.title: Machine translation with transformers
dc.type: masterThesis
ubs.fakultaet: Informatik, Elektrotechnik und Informationstechnik
ubs.institut: Institut für Maschinelle Sprachverarbeitung
ubs.publikation.seiten: 60
ubs.publikation.typ: Abschlussarbeit (Master) [master's thesis]
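
The abstract refers to the self-attention mechanism of Vaswani et al. (2017), whose core computation is Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The following minimal NumPy sketch illustrates a single attention head; it is not taken from the thesis, and the function names, dimensions, and random initialization are illustrative assumptions:

# Minimal sketch of scaled dot-product self-attention (Vaswani et al., 2017).
# Illustrative only: sizes and weights are assumptions, not the thesis's code.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention over a sequence x of shape (seq_len, d_model)."""
    q = x @ w_q  # queries, (seq_len, d_k)
    k = x @ w_k  # keys,    (seq_len, d_k)
    v = x @ w_v  # values,  (seq_len, d_v)
    d_k = q.shape[-1]
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    scores = q @ k.T / np.sqrt(d_k)     # (seq_len, seq_len) similarity matrix
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ v                  # weighted sum of values, (seq_len, d_v)

# Toy usage with assumed sizes: 5 tokens, d_model = 8, d_k = d_v = 4.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # (5, 4)

The 1/sqrt(d_k) scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into regions with very small gradients, as argued in the original paper.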

Files

Original bundle

Name: thesis.pdf
Size: 1.38 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 3.39 KB
Format: Item-specific license agreed to upon submission