Please use this identifier to cite or link to this item:
http://dx.doi.org/10.18419/opus-9387
Full metadata record
DC Field | Value | Language
---|---|---
dc.contributor.advisor | Steinwart, Ingo (Prof. Dr.) | - |
dc.contributor.author | Farooq, Muhammad | - |
dc.date.accessioned | 2017-12-05T08:24:17Z | - |
dc.date.available | 2017-12-05T08:24:17Z | - |
dc.date.issued | 2017 | de |
dc.identifier.other | 1009133969 | - |
dc.identifier.uri | http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-94043 | de |
dc.identifier.uri | http://elib.uni-stuttgart.de/handle/11682/9404 | - |
dc.identifier.uri | http://dx.doi.org/10.18419/opus-9387 | - |
dc.description.abstract | Conditional expectiles are becoming an increasingly important tool in finance as well as in other areas of application, such as demography, when the goal is to explore the conditional distribution beyond the conditional mean. In this thesis, we consider a support vector machine (SVM) type approach with the asymmetric least squares loss for estimating conditional expectiles. Firstly, we establish learning rates for this approach that are minimax optimal modulo a logarithmic factor if Gaussian RBF kernels are used and the desired expectile is smooth in a Besov sense. It turns out that our learning rates, as a special case, improve the best known rates for kernel-based least squares regression in the aforementioned scenario. As key ingredients of our statistical analysis, we establish a general calibration inequality for the asymmetric least squares loss, a corresponding variance bound, as well as an improved entropy number bound for Gaussian RBF kernels. Furthermore, we establish optimal learning rates in the case of a generic kernel under the assumption that the target function lies in a real interpolation space. Secondly, we complement the theoretical results of our SVM approach with empirical findings. For this purpose, we use a sequential minimal optimization method and design an SVM-like solver for expectile regression with Gaussian RBF kernels. We conduct various experiments to investigate the behavior of the designed solver with respect to different combinations of initialization strategies, working set selection strategies, stopping criteria, and numbers of nearest neighbors, and then look for the best combination of them. We further compare the results of our solver to the recent R-package ER-Boost and find that our solver exhibits better test performance. In terms of training time, our solver is more sensitive to the training set size and less sensitive to the dimension of the data set, whereas ER-Boost behaves the other way around. In addition, our solver is faster than a similarly implemented solver for quantile regression. Finally, we prove the convergence of the designed solver. | en |
dc.language.iso | en | de |
dc.rights | info:eu-repo/semantics/openAccess | de |
dc.subject.ddc | 510 | de |
dc.title | Kernel-based expectile regression | en |
dc.type | doctoralThesis | de |
ubs.dateAccepted | 2017-10-13 | - |
ubs.fakultaet | Mathematik und Physik | de |
ubs.institut | Institut für Stochastik und Anwendungen | de |
ubs.publikation.seiten | viii, 138 | de |
ubs.publikation.typ | Dissertation | de |
ubs.thesis.grantor | Mathematik und Physik | de |
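The abstract's central object, the τ-expectile as the minimizer of the asymmetric least squares loss, can be illustrated with a minimal sketch. The function below is an illustrative assumption, not the SVM solver developed in the thesis: it estimates an unconditional sample expectile by iteratively reweighted least squares, where observations above the current estimate receive weight τ and those below receive weight 1 − τ.

```python
def expectile(y, tau=0.5, tol=1e-10, max_iter=100):
    """Illustrative sketch: tau-expectile of a sample, i.e. the minimizer m of
    sum_i |tau - 1{y_i < m}| * (y_i - m)^2, found by iterative reweighting."""
    m = sum(y) / len(y)  # start from the mean (the tau = 0.5 expectile)
    for _ in range(max_iter):
        # asymmetric weights: tau above the current estimate, 1 - tau below
        w = [tau if yi > m else 1.0 - tau for yi in y]
        # weighted-mean update is the fixed-point equation of the loss
        m_new = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m
```

For τ = 0.5 the weights are symmetric and the expectile coincides with the sample mean; for τ → 1 it moves toward the maximum, which is what makes expectiles useful for exploring the conditional distribution beyond the mean.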
Appears in Collections: | 08 Fakultät Mathematik und Physik
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---
farooq2017KernelBasedExpectileRegression.pdf | | 2.12 MB | Adobe PDF | View/Open |
All items in this repository are protected by copyright.