05 Fakultät Informatik, Elektrotechnik und Informationstechnik
Permanent URI for this collection: https://elib.uni-stuttgart.de/handle/11682/6
7 results
Item Open Access
Efficient modeling and computation methods for robust AMS system design (2018)
Gil, Leandro; Radetzki, Martin (Prof. Dr.-Ing.)
This dissertation addresses the challenge of developing model-based design tools that better support the design of the mixed analog and digital parts of embedded systems. It focuses on the conception of efficient modeling and simulation methods that adequately support emerging system-level design methodologies. Starting with an in-depth analysis of the design activities, many weak points of today's system-level design tools were identified. After considering the modeling and simulation of power electronic circuits for designing low-energy embedded systems, a novel signal model that efficiently captures the dynamic behavior of analog and digital circuits is proposed and used to develop computation methods that enable fast and accurate system-level simulation of AMS systems. To support a stepwise refinement of the system design based on the essential system properties, behavior computation methods for linear and nonlinear analog circuits based on the novel signal model are presented and compared, with respect to performance, accuracy and stability, with existing numerical and analytical methods for circuit simulation. The novel signal model, in combination with the method proposed to efficiently handle the interaction of analog and digital circuits as well as the new method for digital circuit simulation, constitutes the key contribution of this dissertation, because together they allow the concurrent state- and event-based simulation of analog and digital circuits. Using a synchronous data flow model of computation to schedule the execution of the analog and digital model parts, very fast AMS system simulations are carried out.
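The scheduling idea described above, executing analog and digital model parts under a static synchronous data flow (SDF) style schedule, can be sketched as follows. The RC low-pass and threshold comparator are stand-ins chosen for illustration only, not the dissertation's actual signal model:

```python
import math

def analog_step(v, u, dt, tau=1e-3):
    """One exact integration step of an RC low-pass: dv/dt = (u - v)/tau."""
    return u + (v - u) * math.exp(-dt / tau)

def digital_step(v, threshold=0.5):
    """Comparator: emits 1 once the analog voltage reaches the threshold."""
    return 1 if v >= threshold else 0

def run_schedule(n_steps, dt=1e-4):
    """Static schedule: analog part first, then digital part, each period."""
    v, u, events = 0.0, 1.0, []
    for _ in range(n_steps):
        v = analog_step(v, u, dt)        # analog model part
        events.append(digital_step(v))   # digital model part, fixed order
    return v, events
```

Because the firing order is fixed ahead of time, no dynamic event queue is needed between the two model parts, which is what makes such schedules fast.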
Because the best-suited behavior abstraction for analog and digital circuits can be selected without changing component interfaces, the implementation, validation and verification of AMS systems benefit from the novel mixed-signal representation: changes to the modeling abstraction level do not affect the experiment setup. The second part of this work deals with the robust design of AMS systems and its verification. After defining a mixed-sensitivity-based robustness evaluation index for AMS control systems, a general robust design method leading to optimal controller tuning is presented. To avoid over-conservative AMS system designs, the proposed robust design optimization method considers parametric uncertainty and nonlinear model characteristics. The system properties in the frequency domain needed to evaluate system robustness during parameter optimization are obtained from the proposed signal model. Further advantages of the presented signal model for computing control system performance evaluation indexes in the time domain are also investigated in combination with range arithmetic. A novel approach for capturing parameter correlations in range-arithmetic-based circuit behavior computation is proposed as a step towards a holistic modeling method for the robust design of AMS systems.
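Why parameter correlations matter in range arithmetic can be illustrated with a minimal affine-arithmetic sketch (class name, noise symbols and component values are illustrative assumptions, not the dissertation's model): each quantity carries linear noise terms, and shared noise symbols let correlated uncertainties cancel where plain interval arithmetic would overestimate.

```python
class Affine:
    """A value as center + sum of noise terms eps_i in [-1, 1]."""
    def __init__(self, center, terms=None):
        self.center = center
        self.terms = dict(terms or {})   # noise symbol -> partial deviation

    def __add__(self, other):
        terms = dict(self.terms)
        for s, d in other.terms.items():
            terms[s] = terms.get(s, 0.0) + d
        return Affine(self.center + other.center, terms)

    def __sub__(self, other):
        terms = dict(self.terms)
        for s, d in other.terms.items():
            terms[s] = terms.get(s, 0.0) - d
        return Affine(self.center - other.center, terms)

    def interval(self):
        r = sum(abs(d) for d in self.terms.values())
        return (self.center - r, self.center + r)

# A resistor of 1 kOhm +/- 10 % that appears twice in an expression:
R = Affine(1000.0, {"eps_R": 100.0})
diff = R - R   # fully correlated, so the uncertainty cancels exactly
```

Plain interval arithmetic would give `R - R` the range [-200, 200]; the shared `eps_R` symbol reduces it to the exact result 0.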
The various modeling and computation methods proposed to improve the support of design methodologies and tools for AMS systems are validated and evaluated throughout this dissertation, covering many aspects of the modeling, simulation, design and verification of a low-power embedded system that implements Adaptive Voltage and Frequency Scaling (AVFS) for energy saving.

Item Open Access
Maschinelles Lernen für intelligente Automatisierungssysteme mit dezentraler Datenhaltung am Anwendungsfall Predictive Maintenance (2019)
Maschler, Benjamin; Jazdi, Nasser; Weyrich, Michael
Machine learning algorithms depend on a broad data basis to achieve high-quality results. Studies show, however, that many companies are unwilling to share their data with other companies, for example in a shared data cloud. The goal should therefore be to enable efficient machine learning with decentralized data storage, which allows confidential data to remain within the company of origin. This article presents a novel concept for this purpose and analyzes its potential for intelligent automation systems using predictive maintenance as an example use case. The feasibility of the concept based on various existing approaches is discussed, before finally examining the potential added value for plant operators and manufacturers, with particular attention to the perspective of small and medium-sized enterprises.

Item Open Access
Mining software repositories for coupled changes (2017)
Ramadani, Jasmin; Wagner, Stefan (Prof. Dr.)
Software repositories contain information about the development history of a software system that developers can use during maintenance work. This includes data in the version control system, the issue tracking system and the documentation archives.
One of the most widely used techniques for analyzing software repositories is data mining, where frequent itemset analysis is often applied to identify groups of files that were frequently changed together in the past. We define these file groups as coupled files. Most studies on coupled file changes in software repositories do not include Git version control systems. Nor do they consider developer feedback on the usefulness of coupled file changes and their influence on maintenance tasks. The main goal of this work is to support developers in their maintenance tasks by suggesting possible file changes based on earlier changes in the software's Git version history. We studied the extraction of coupled file changes through data mining on Git and analyzed developer feedback regarding the interestingness and usefulness of coupled file change suggestions as well as their influence on maintenance work. In an industrial case study, we extracted coupled file changes from three Git repositories. Based on the interestingness examined in this case study and on insights from a series of empirical studies on maintenance work, we formulated a theory about the use of coupled file change suggestions during maintenance. This theory was tested in the following studies: (1) We conducted a controlled experiment investigating heuristics for grouping related change sets in Git, from which we extracted relevant coupled file changes. (2) In a quasi-experiment, we examined the usefulness of coupled file change suggestions and their effect on the correctness of the solution as well as the time required to complete the maintenance tasks.
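The core of the frequent itemset step can be sketched as follows; the commit data and the helper name are hypothetical, and a real pipeline would parse commits from `git log --name-only` and generalize to larger itemsets (e.g. with an Apriori-style algorithm):

```python
from itertools import combinations
from collections import Counter

def coupled_pairs(commits, min_support=2):
    """Treat each commit as an itemset of files; return file pairs whose
    co-change count meets the support threshold."""
    counts = Counter()
    for files in commits:
        for pair in combinations(sorted(set(files)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

# Hypothetical commit history, each entry being the files of one commit:
commits = [
    {"Parser.java", "Lexer.java", "README.md"},
    {"Parser.java", "Lexer.java"},
    {"Parser.java", "Ast.java"},
]
print(coupled_pairs(commits))   # {('Lexer.java', 'Parser.java'): 2}
```

A suggestion tool would then recommend `Lexer.java` whenever a developer touches `Parser.java`, since the two exceeded the support threshold.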
(3) In an exploratory study, we investigated how coupled file change suggestions influence the strategies developers use to seek help during maintenance work. In a further exploratory study, we extended the concept of coupled file changes to the package level and derived different levels of developer expertise from the system packages the developers had worked on. We developed an Eclipse-based tool that extracts and visualizes coupled file change suggestions and makes them available to developers during maintenance tasks. We defined heuristics to group related changes in Git. Using frequent itemset analysis, we succeeded in extracting relatively frequent coupled file changes from Git. The developers involved in the case study on the interestingness of coupled files showed interest in this kind of support during maintenance work. The experiment on the usefulness of coupled file change suggestions showed that developers who used the suggestions completed their tasks more successfully than those who did not. The results of the exploratory study on seeking help during maintenance tasks show that coupled file change suggestions also reduce the need for external information sources relevant to the maintenance tasks, making the maintenance process more compact. Moreover, the concept of coupled file changes was successfully applied to build expertise profiles with different specializations based on the changes in the coupled system packages. Developer feedback on the coupled file change suggestion approach was positive. Our theory on the use of coupled file change suggestions during maintenance tasks was tested successfully. With the proposed heuristics, we determined that the grouping of change sets in Git influences their relevance.
Developer feedback showed that the format and context of coupled file change suggestions affect their usefulness. The results also show that coupled file change suggestions positively influence the completion of maintenance tasks and the strategies for seeking help. Further analysis of couplings between parts of the source code on large data sets will make it possible to better understand the impact of coupled file changes on software maintenance and quality.

Item Open Access
Data transfer in partitioned multi-physics simulations: interpolation & communication (2019)
Lindner, Florian; Mehl, Miriam (Prof. Dr. rer. nat. habil.)
Partitioned multi-physics simulations allow existing solvers to be reused and combined into multi-physics scenarios. This provides not only greater flexibility and improved time-to-solution, but also helps to manage the increasing complexity of modern scientific software. This thesis sees itself as a continuation of the works of B. Gatzhammer and B. Uekermann, who developed a comprehensive tool to couple independent simulation codes. I focus on the two important aspects of interpolation between non-matching grids and communication between several parallel codes, and conclude with aspects of the software development of the coupling library preCICE. The interpolation part puts special emphasis on radial basis function (RBF) interpolation. It starts with a thorough review of existing interpolation methods, with special consideration of the black-box approach to multi-physics simulations, and explores promising enhancements to RBF interpolation. Numerical experiments provide rigorous testing of the accuracy, stability and scaling behavior of different variants of RBF implementations.
Following the insights gained from the numerical experiments, a highly optimized parallel implementation for preCICE is developed, containing various measures to improve the accuracy and stability of the interpolation. The communication part first defines the communication requirements of partitioned simulations. A new technique for peer-to-peer communication networks between distinct MPI domains is developed and evaluated against existing approaches. Furthermore, a fast method to establish connections via the file system is presented. Both measures optimize the initialization phase and achieve a considerable speedup. Finally, a strategy to fully decouple algorithmically independent participants at the communication protocol level is implemented and tested. The last part outlines the software-related challenges in developing a parallel scientific application involving multiple independent solvers. I show how the preCICE project handles testing, profiling and integration of a large parallel scientific software with multiple participants. A profiling library for distributed applications has been developed and is used extensively in preCICE and potentially other projects.

Item Open Access
A network abstraction for control systems (2014)
Carabelli, Ben W.; Dürr, Frank; Koldehofe, Boris; Rothermel, Kurt
Networked control systems (NCS), such as the smart power grid, implement feedback control loops by connecting distributed sensors and actuators to a remote controller over a communication network. In order to avoid the costly and time-consuming installation of dedicated networks, NCS can benefit from utilizing readily available IP networks such as the Internet. However, as control systems are typically sensitive to delay and loss, integrating such systems over best-effort networks becomes a challenge, which we address in this paper with two main contributions.
First, we propose an end-to-end transport abstraction for NCS based on a novel probabilistic quality of service (QoS) specification which (1) is compatible with existing control models and (2) provides the network with application-specific knowledge about the relation between system performance and network-relevant metrics. Second, we realize this abstraction at the network layer with an optimal routing algorithm which fulfils the required QoS while minimizing the usage of network resources. We show that our approach lends itself to implementation with state-of-the-art software-defined networking (SDN) technologies, and demonstrate its effectiveness in our evaluation.

Item Open Access
3D printing-as-a-service for collaborative engineering (2017)
Baumann, Felix W.; Roller, Dieter (Univ.-Prof. Hon.-Prof. Dr.)
3D printing and Additive Manufacturing (AM) are utilised as umbrella terms for a variety of technologies that manufacture or create a physical object based on a digital model. Commonly, these technologies create the objects by adding, fusing or melting a raw material in a layer-wise fashion. Apart from the 3D printer itself, no specialised tools are required to create almost any shape or form imaginable and designable. The possibilities of these technologies are plentiful and include the ability to manufacture almost any object rapidly, locally and cost-efficiently without wasted resources and material. Objects can be created in specific forms that fit perfectly and perform their function without consideration of the assembly process. To further advance the availability and applicability of 3D printing, this thesis identifies the problems that currently exist and attempts to solve them. During the 3D printing process, data (i.e., files) must be converted from their original representation, e.g., a CAD file, to the machine instructions for a specific 3D printer. During this process, some information is lost and other information is added.
Traceability is lacking in 3D printing. The actual 3D printing can require a long time to complete, during which errors can occur. In 3D printing, these errors are often neither recoverable nor reversible, which results in wasted material and time. Given the lack of closed-loop control systems for 3D printers, careful planning and preparation are required to avoid these costly misprints. 3D printers are usually located remotely from users, due to health and safety considerations, special placement requirements or simply for comfort. Remotely placed equipment is impractical to monitor in person; however, such monitoring is essential, especially considering the proneness of 3D printing to errors and its implications as described previously. Utilisation of 3D printers is an issue, especially with expensive 3D printers. As there are a number of differing 3D printing technologies available, having the required 3D printer at hand might be problematic. 3D printers are equipped with a variety of interfaces, depending on the make and model. These differing interfaces, in both hard- and software, hinder the integration of different 3D printers into consistent systems. There exists no proper and complete ontology, resource description schema or mechanism that covers all the different 3D printing technologies. Such a resource description mechanism is essential for automated scheduling in services or systems. In 3D printing services, the selection and matching of appropriate and suitable 3D printers is essential, as not all 3D printing technologies can process all materials or create certain object features, such as thin walls or hollow forms. The need for companies to sell digital models for AM will increase in scenarios where replacement or customised parts are 3D printed by consumers at home or in local manufacturing centres.
Furthermore, requirements to safeguard these digital models will increase in order to avoid a repetition of the problems faced by the music industry, e.g., with Napster. Replication and 'theft' of these models are uncontrollable in the current situation. In a service-oriented deployment, or in scenarios where utilisation is high, estimates of the 3D printing time must be available. Common 3D printing time estimations are inaccurate, which hinders scheduling. There is no consistent, comprehensive understanding of the complexity of an object, especially in the domain of AM. Such an understanding is required both to support the design of objects for AM and to match appropriate manufacturing resources to certain objects. Quality in AM, and in Fused Deposition Modeling (FDM) in particular, has been researched incompletely. Quality in general increases with the maturity of the technology; however, research on the quality achievable with consumer-grade 3D printers is lacking. Furthermore, cost-sensitive measurement methods for quality assessment leave room for improvement. This thesis presents the structured design and implementation of a 3D printing service together with contributions that solve particular problems in the AM domain. The 3D printing service is the overarching component of this thesis and provides the platform for the other contributions, with the intention of establishing an online, cloud-based 3D printing service for end-user and professional settings with a focus on collaboration and cooperation.

Item Open Access
Modeling and simulation of cabin air filtration with focus on electrostatic effects (2019)
Schober, Carolin; Mehl, Miriam (Prof. Dr. rer. nat. habil.)
Cabin air filters remove harmful pollutants from the air flow supplied to the car passenger compartment. Electrostatic charges on cabin air filter media significantly improve the degree of particle separation without compromising air permeability, thus achieving superior filtration performance.
In order to optimize these performance metrics, a basic understanding of electrostatic filtration effects is required. However, these effects are largely unexplored due to limited experimental measurement options. Numerical simulations allow deeper insight into fundamental physical processes than the measurement of macroscopic quantities. However, for electrostatically charged systems, the uni-directionally coupled status-quo simulation approach yields results that deviate from experimental observations. Numerous unknown parameters, such as the charge distribution on filter fibers and dust particles, as well as the missing implementation of all simultaneously effective electrostatic separation mechanisms, cause these deviations. This dissertation provides an enhanced, fully coupled modeling approach to simulate specific electrostatic filtration effects. The new simulation model includes the interaction of highly bipolarly charged dust particles with each other, with filter fibers, and with the background air flow. Extensive studies demonstrate that this high level of detail is necessary to resolve electrostatic agglomeration effects in the inflow area. In addition, combined numerical and experimental test scenarios provide qualitative results that allow the effects of induced dipoles and mirror charges to be observed. For further research studies, a combination of the fully coupled modeling approach with the status-quo simulation method in a two-step procedure is highly recommended.
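The uni-directional (one-way) coupling of the status-quo approach can be caricatured with a toy drift model: Stokes-like drag relaxes a particle toward a fixed background flow, while a Coulomb term attracts it to a charged fiber. All parameters and names here are invented for illustration; the dissertation's fully coupled model additionally includes particle-particle interaction, feedback on the flow, induced dipoles and mirror charges.

```python
import math

def trace(y0, charged, dt=1e-4, steps=20000, tau=5e-3, k=0.05):
    """Explicit-Euler trajectory of one particle past a fiber at the origin.

    Drag relaxes the velocity toward the background flow u = (1, 0) with
    relaxation time tau; if charged, a Coulomb term ~ k/r^2 pulls the
    particle toward the fiber.
    """
    x, y = -1.0, y0          # start upstream of the fiber
    vx, vy = 1.0, 0.0        # initially moving with the flow
    for _ in range(steps):
        ax = (1.0 - vx) / tau
        ay = (0.0 - vy) / tau
        if charged:
            r2 = x * x + y * y
            r = math.sqrt(r2)
            ax += -k * x / (r2 * r)   # attraction toward (0, 0)
            ay += -k * y / (r2 * r)
        vx += ax * dt; vy += ay * dt
        x += vx * dt; y += vy * dt
    return x, y

_, y_neutral = trace(0.2, charged=False)   # pure tracer: no deflection
_, y_charged = trace(0.2, charged=True)    # deflected toward the fiber
```

Comparing the two trajectories shows the electrostatic deflection that increases the separation efficiency: the charged particle ends closer to the fiber line than the neutral tracer.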