Browsing by Author "Obst, Julian"
Now showing 1 - 2 of 2
Item Open Access
Experimentelle Bestimmung vergleichsoptimaler Sortieralgorithmen (2019) Obst, Julian
The most common sorting methods sort sets by comparing two elements at a time, building up an ordering step by step. This raises the question of the minimal number of comparisons required to sort an arbitrary permutation of a set. The central goal of this thesis is to explain the theoretical foundations for determining comparison-optimal sorting algorithms and to determine the number of comparisons such algorithms require. The thesis also discusses the practical approaches used to construct an algorithm intended to solve this problem, and examines aspects of its implementation. Above all, it describes optimizations intended to reduce the running time. For n = 12, 13 the known results could be confirmed.

Item Open Access
Parameter initialization for warm-starting QAOA (2022) Obst, Julian
The Quantum Approximate Optimization Algorithm (QAOA) is a promising candidate for achieving quantum advantage on near-term quantum devices. Warm-Started QAOA (WS-QAOA) is an improved version of QAOA. It utilizes an approximate solution to the problem in question, which is first computed efficiently on classical hardware, and aims for improved performance at a lower depth. Since QAOA is a variational algorithm, finding suitable parameters is key to its successful application. Often, good parameters are obtained by picking random initial parameters and optimizing them. Recent works have demonstrated a successful transfer of parameters between different problem instances solved with QAOA. This thesis investigates how parameter transfer can be applied to WS-QAOA for MaxCut, with a focus on unweighted 3-, 4- and 5-regular graphs. I show that parameters can be successfully transferred between warm-started d-regular instances, where both instances have degree d. The approximate MaxCut solution used to warm-start the algorithm (the initial cut) greatly influences the success, and a successful transfer is more likely if both instances are warm-started with a good initial cut. Further, I show empirically that for WS-QAOA the regular instances also demonstrate a concentration of good parameters. For example, in the experiments performed here, optimized parameters of a regular instance could be transferred to another with no significant reduction in approximation ratio on average. The 3- and 5-regular instances even showed an increase of the average approximation ratio by 0.003 and 0.008 respectively, while 4-regular instances saw a 0.041 decrease. However, this had more to do with the quality of the initial cut than with the transferred parameters. These insights lead to an approach called landscape approximation that helps to determine good initial parameters which can be used directly or optimized further.
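Illustration for the first record above (not part of the thesis record): a minimal Python sketch of the information-theoretic lower bound against which comparison-optimal sorting is measured. A comparison-based sort must distinguish all n! input orderings, so it needs at least ceil(log2(n!)) comparisons in the worst case; the exact optimal counts the thesis deals with are found by exhaustive search and can exceed this bound.

import math

def comparison_lower_bound(n: int) -> int:
    # Any comparison-based sort must distinguish all n! permutations,
    # so its decision tree has at least n! leaves and therefore depth
    # at least ceil(log2(n!)) -- a worst-case lower bound on comparisons.
    return math.ceil(math.log2(math.factorial(n)))

for n in (12, 13):
    print(f"n = {n}: at least {comparison_lower_bound(n)} comparisons")
    # Prints 29 for n = 12 and 33 for n = 13; the true comparison-optimal
    # counts may lie above these bounds.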
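Illustration for the second record above (a hypothetical sketch, not the method used in the thesis): WS-QAOA relies on an approximate MaxCut solution, the initial cut, computed classically before the quantum algorithm is warm-started. The toy local-search heuristic and example graph below are illustrative assumptions, chosen only to show what such an initial cut looks like and how its quality (the cut value) is measured.

import random

def initial_cut_local_search(n, edges, seed=0):
    # Toy classical MaxCut heuristic: start from a random bipartition and
    # flip single vertices while that strictly increases the cut. The
    # resulting 0/1 assignment is the kind of "initial cut" used to
    # warm-start QAOA.
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in range(n)]
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    improved = True
    while improved:
        improved = False
        for v in range(n):
            # Flipping v turns same-side neighbours into cut edges and
            # cut neighbours into uncut ones; flip only if the cut grows.
            same = sum(1 for u in adj[v] if side[u] == side[v])
            if same > len(adj[v]) - same:
                side[v] ^= 1
                improved = True
    cut_value = sum(1 for u, v in edges if side[u] != side[v])
    return side, cut_value

# Hypothetical example: the complete graph K4, an unweighted 3-regular graph.
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(initial_cut_local_search(4, edges))  # cut value 4, the maximum cut of K4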