Please use this identifier to cite or link to this item: http://dx.doi.org/10.18419/opus-11307
Author(s): Tobien, Patrick
Title: Multi-frame approaches for learning optical flow predictions
Issue Date: 2020
Document Type: Master's Thesis
Pages: 87
URI: http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-113244
http://elib.uni-stuttgart.de/handle/11682/11324
http://dx.doi.org/10.18419/opus-11307
Abstract: In recent years, optical flow estimation has shifted from classical variational approaches to learning-based methods, which show promising results with respect to both accuracy and speed. However, only a few attempts have been made at using more than two frames of the input sequence for optical flow prediction. ProFlow by Maurer and Bruhn [MB18] is a self-supervised learning approach that uses three consecutive frames to estimate forward motion from the corresponding backward motion. By capitalizing on the temporal information contained in the sequence, ProFlow improves performance in occluded regions, where a good estimate is particularly difficult to obtain. In this work, ProFlow is extended to use backward motions over multiple frames to predict the corresponding forward motion, thus leveraging additional information from the past. Four multi-frame modifications are introduced, each enabling a forward flow estimation from multiple backward flows. The best performing approach employs several convolutional neural networks (CNNs), each trained on a separate backward flow. By explicitly combining the respective predictions, the model achieves an improvement of 10% to 15% on the Sintel datasets, demonstrating that additional temporal information can be leveraged for optical flow prediction.
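
The best performing approach described above (one CNN per backward flow, predictions combined explicitly) could look roughly like the following minimal sketch. All class names, layer sizes, and the fusion-by-averaging step are illustrative assumptions, not taken from the thesis; consult the PDF for the actual architecture and training procedure.

    # Minimal sketch (not the thesis code): one small CNN per backward flow,
    # with the per-network forward-flow predictions averaged at the end.
    import torch
    import torch.nn as nn

    class FlowPredictor(nn.Module):
        """Tiny fully convolutional net mapping one backward flow
        (2 channels: u, v) to a forward-flow prediction (2 channels)."""
        def __init__(self, hidden: int = 32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(2, hidden, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(hidden, hidden, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(hidden, 2, 3, padding=1),
            )

        def forward(self, backward_flow: torch.Tensor) -> torch.Tensor:
            return self.net(backward_flow)

    class MultiFrameProFlow(nn.Module):
        """One predictor per temporal distance into the past; the separate
        predictions are combined here by simple averaging (an assumption)."""
        def __init__(self, num_backward_flows: int = 3):
            super().__init__()
            self.predictors = nn.ModuleList(
                FlowPredictor() for _ in range(num_backward_flows)
            )

        def forward(self, backward_flows: list[torch.Tensor]) -> torch.Tensor:
            preds = [p(f) for p, f in zip(self.predictors, backward_flows)]
            return torch.stack(preds).mean(dim=0)

    # Usage: three backward flows (e.g. t->t-1, t->t-2, t->t-3),
    # each a 2-channel flow field of the same spatial resolution.
    model = MultiFrameProFlow(num_backward_flows=3)
    flows = [torch.randn(1, 2, 64, 64) for _ in range(3)]
    forward_flow = model(flows)   # shape: (1, 2, 64, 64)

Training each predictor on its own temporal distance keeps the networks specialized, while the explicit combination step is what allows the model to exploit information from several past frames at once.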
Appears in Collections: 05 Fakultät Informatik, Elektrotechnik und Informationstechnik

Files in This Item:
File                            Description    Size        Format
MA_multiframe_prediction.pdf                   31.72 MB    Adobe PDF


All resources in this repository are protected by copyright.