Universität Stuttgart

Permanent URI for this community: https://elib.uni-stuttgart.de/handle/11682/1

Search Results

Now showing 1 - 2 of 2
  • Open Access
    Physics-informed regression of implicitly-constrained robot dynamics
    (2022) Geist, Andreas René; Allgöwer, Frank (Prof. Dr.-Ing.)
    The ability to predict a robot’s motion through a dynamics model is critical for the development of fast, safe, and efficient control algorithms. Yet, obtaining an accurate robot dynamics model is challenging, as robot dynamics are typically nonlinear and subject to environment-dependent physical phenomena such as friction and material elasticity. These effects often cause analytical dynamics models to have large prediction errors. An alternative to analytical modeling is the identification of a robot’s dynamics through data-driven techniques such as Gaussian processes or neural networks. However, purely data-driven algorithms require considerable amounts of data, which on a robotic system must be collected in real time. Moreover, both the information stored in the data and the coverage of the system’s state space are limited by the controller used to collect the data. To tackle the shortcomings of analytical and data-driven modeling, this dissertation investigates and develops models in which analytical dynamics is combined with data-driven regression techniques. By combining prior structural knowledge from analytical dynamics with data-driven regression, physics-informed models achieve improved data efficiency and prediction accuracy compared to using either modeling technique in isolation.
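    The combination of analytical dynamics with data-driven regression described in the abstract can be illustrated by a generic residual-learning sketch (not the dissertation's specific method; the pendulum model, friction term, and all function names below are illustrative assumptions):

```python
import numpy as np

# Hypothetical analytical dynamics: inverse dynamics of a pendulum
# without friction (an assumed toy model, not from the dissertation).
def analytical_torque(q, qd, qdd, m=1.0, l=1.0, g=9.81):
    return m * l**2 * qdd + m * g * l * np.sin(q)

# "True" dynamics additionally contain viscous friction, which the
# analytical model misses -- the environment-dependent effect.
def true_torque(q, qd, qdd, friction=0.3):
    return analytical_torque(q, qd, qdd) + friction * qd

# Training data: sampled states (q, qd, qdd) and measured torques.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 3))
y = true_torque(X[:, 0], X[:, 1], X[:, 2])

# Residuals between measurement and the analytical prediction.
r = y - analytical_torque(X[:, 0], X[:, 1], X[:, 2])

# Plain Gaussian-process regression (RBF kernel) on the residuals.
def rbf(A, B, ls=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

K = rbf(X, X) + 1e-4 * np.eye(len(X))  # jitter for numerical stability
alpha = np.linalg.solve(K, r)

def predict(Xs):
    # Physics-informed prediction = analytical model + learned residual.
    return analytical_torque(Xs[:, 0], Xs[:, 1], Xs[:, 2]) + rbf(Xs, X) @ alpha
```

    Because the GP only has to learn the residual rather than the full dynamics, far less data is needed than for a purely data-driven model, while the friction error of the purely analytical model is corrected.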
  • Open Access
    Simulating stochastic processes with variational quantum circuits
    (2022) Fink, Daniel
    Simulating future outcomes based on past observations is a key task in predictive modeling and has found application in many areas, ranging from neuroscience to the modeling of financial markets. The classical, provably optimal models for stationary stochastic processes are so-called ϵ-machines, which have the structure of a unifilar hidden Markov model and offer a minimal set of internal states. However, these models are not optimal in the quantum setting, i.e., when the models have access to quantum devices. The methods proposed so far for quantum predictive models rely either on knowledge of an ϵ-machine or on learning a classical representation thereof, which is memory-inefficient since it requires exponentially many resources in the Markov order. Meanwhile, variational quantum algorithms (VQAs) are a promising approach for using near-term quantum devices to tackle problems arising from many different areas in science and technology. Within this work, we propose a VQA for learning quantum predictive models directly from data on a quantum computer. The learning algorithm is inspired by recent developments in the area of implicit generative modeling, where a kernel-based two-sample test, called maximum mean discrepancy (MMD), is used as a cost function. A major challenge of learning predictive models is to ensure that arbitrarily many time steps can be simulated accurately. For this purpose, we propose a quantum post-processing step that yields a regularization term for the cost function and penalizes models with a large set of internal states. As a proof of concept, we apply the algorithm to a stationary stochastic process and show that the regularization leads to a small set of internal states and consistently good simulation performance over multiple future time steps, measured in the Kullback-Leibler divergence and the total variation distance.