Optimization of back-propagation learning algorithm on MLP networks

dc.contributor.author: Hoyo, Daniel del (de)
dc.date.accessioned: 2012-12-10 (de)
dc.date.accessioned: 2016-03-31T08:00:05Z
dc.date.available: 2012-12-10 (de)
dc.date.available: 2016-03-31T08:00:05Z
dc.date.issued: 2012 (de)
dc.description.abstract: In order to generate more efficient neural networks, the configuration of the ANN itself has to be optimized, especially with regard to its parameters and architecture. To do so, this problem will be approached from the point of view of the learning and training process by performing different tests. These evaluations will lead us to determine the most suitable parameters for these processes. At the same time, the importance of the input patterns and of the data used will be studied, observing how they influence the learning process, not only from a runtime point of view but also by measuring the error of the trained network. On the other hand, the implementation itself will be optimized by executing the learning algorithm in parallel on different nodes, measuring the time needed to complete the training, and comparing it with the time needed for a sequential execution (see the sketch after the metadata below). (en)
dc.identifier.other: 377284343 (de)
dc.identifier.uri: http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-80107 (de)
dc.identifier.uri: http://elib.uni-stuttgart.de/handle/11682/2999
dc.identifier.uri: http://dx.doi.org/10.18419/opus-2982
dc.language.iso: en (de)
dc.rights: info:eu-repo/semantics/openAccess (de)
dc.subject.ddc: 004 (de)
dc.title: Optimization of back-propagation learning algorithm on MLP networks (en)
dc.type: masterThesis (de)
ubs.fakultaet: Fakultät Informatik, Elektrotechnik und Informationstechnik (de)
ubs.institut: Institut für Parallele und Verteilte Systeme (de)
ubs.opusid: 8010 (de)
ubs.publikation.typ: Abschlussarbeit (Diplom) (de)
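
The record itself contains no source code. As a rough illustration of the kind of training loop the abstract refers to, the following is a minimal sketch of back-propagation for a one-hidden-layer MLP in Python/NumPy; the layer sizes, learning rate, epoch count and toy data are illustrative assumptions and are not taken from the thesis.

    # Minimal sketch of back-propagation for a one-hidden-layer MLP.
    # All hyperparameters and the toy data set are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy input pattern: XOR, a classic non-linearly-separable problem.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    n_in, n_hidden, n_out = 2, 4, 1   # architecture (assumed)
    lr, epochs = 0.5, 10000           # learning rate / epochs (assumed)

    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
    b2 = np.zeros(n_out)

    for epoch in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)        # hidden-layer activations
        out = sigmoid(h @ W2 + b2)      # network output

        # Backward pass: deltas for the mean-squared error.
        err = out - y
        d_out = err * out * (1.0 - out)            # output-layer delta
        d_hid = (d_out @ W2.T) * h * (1.0 - h)     # hidden-layer delta

        # Gradient-descent weight update.
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_hid
        b1 -= lr * d_hid.sum(axis=0)

    print("trained outputs:",
          sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel())

For the parallel-versus-sequential comparison mentioned in the abstract, such a loop would be timed once on a single node and once distributed over several nodes (e.g. with time.perf_counter around the training loop); the record gives no details of the distribution scheme actually used.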

Files

Original bundle

Name: DIP_3353.pdf
Size: 1.4 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 935 B
Format: Plain Text