05 Fakultät Informatik, Elektrotechnik und Informationstechnik
Permanent URI for this collection: https://elib.uni-stuttgart.de/handle/11682/6
Item Open Access
Modelling the quality economics of defect-detection techniques (2006)
Wagner, Stefan
There are various ways to evaluate defect-detection techniques. For a comprehensive evaluation, however, the only possibility is to reduce all influencing factors to costs. Some models and metrics for the cost of quality already exist that can be used in this context. These models allow the costs to be structured but do not show all influencing factors and their relationships. This paper proposes an analytical model of the economics of defect-detection techniques that can be used to analyse and optimise the usage of such techniques. In particular, we analyse the sensitivity of the model and how it can be applied in practice.

Item Open Access
Synchronisierung von digitalen Modellen mit realen Fertigungszellen auf Basis einer Ankerpunktmethode am Beispiel der Automobilindustrie (2017)
Ashtari Talkhestani, Behrang; Schlögl, Wolfgang; Weyrich, Michael
The increasing variety of products and the shortening of product life cycles demand fast and cost-effective reconfiguration of existing production systems [1]. To meet these challenges, an up-to-date digital model of the existing manufacturing cell, referred to in the following as a Digital Twin, is a suitable solution. The Digital Twin reduces costs by shortening changeover times through virtual planning and simulation based on the current state of the real production plant, and through early detection of design or process-flow errors in the production plant. A prerequisite for the usability of the Digital Twin of a production system, however, is that an up-to-date (virtual) plant model of the mechatronic components of the real plant exists throughout the various phases of its life cycle.
This paper discusses the cross-domain, mechatronic data structure of virtual manufacturing cells in the automotive industry. A systematic anchor-point method is presented with which the deviations between the virtual models and reality can be detected and quantified. Building on this, a so-called rule-based consistency check is presented for the continuous, cross-domain synchronisation of the current mechatronic resource components of production systems with their virtual plant model.

Item Open Access
Software quality models: purposes, usage scenarios and requirements (2009)
Deißenböck, Florian; Juergens, Elmar; Lochmann, Klaus; Wagner, Stefan
Software quality models are a well-accepted means to support quality management of software systems. Over the last 30 years, a multitude of quality models have been proposed and applied with varying degrees of success. Despite these successes and standardisation efforts, quality models are still criticised, as their application in practice exhibits various problems. To some extent, this criticism is caused by an unclear definition of what quality models are and which purposes they serve. Beyond this, there is a lack of explicitly stated requirements for quality models with respect to their intended mode of application. To remedy this, the paper describes purposes and usage scenarios of quality models and, based on the literature and the authors' experience, collects critiques of existing models. From these, general requirements for quality models are derived.
The requirements can be used to support the evaluation of existing quality models for a given context or to guide further quality model development.

Item Open Access
The Quamoco product quality modelling and assessment approach (2012)
Wagner, Stefan; Lochmann, Klaus; Heinemann, Lars; Kläs, Michael; Trendowicz, Adam; Plösch, Reinhold; Seidl, Andreas; Goeb, Andreas; Streit, Jonathan
Published software quality models provide either abstract quality attributes or concrete quality assessments; no models seamlessly integrate both aspects. In the Quamoco project, we built a comprehensive approach with the aim of closing this gap. For this, we developed, in several iterations, a meta quality model specifying general concepts, a quality base model covering the most important quality factors, and a quality assessment approach. The meta model introduces the new concept of a product factor, which bridges the gap between concrete measurements and abstract quality aspects. Product factors have measures and instruments to operationalise quality through measurements from manual inspection and tool analysis. The base model uses the ISO 25010 quality attributes, which we refine with 200 factors and 600 measures for Java and C# systems. In several empirical validations, we found that the assessment results match the expectations of experts for the corresponding systems. The empirical analyses also showed that several of the correlations are statistically significant and that the maintainability part of the base model has the highest correlation, consistent with the fact that this part is the most comprehensive.
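Conceptually, an assessment of this kind aggregates normalised measure values through product factors up to quality attributes. The following is a minimal sketch of such weighted aggregation; the factor names, weights, and measure values are invented for illustration and are not taken from the actual Quamoco base model.

```python
# Illustrative sketch (not the actual Quamoco implementation): aggregating
# normalised measure values through product factors to a quality attribute
# via weighted sums. All names and numbers below are invented examples.

def aggregate(weighted_children):
    """Weighted average of child evaluations, each in [0, 1]."""
    total_weight = sum(w for w, _ in weighted_children)
    return sum(w * v for w, v in weighted_children) / total_weight

# Normalised measure values (1.0 = best), e.g. from tool analysis.
measures = {"clone_coverage": 0.8, "comment_ratio": 0.6, "avg_nesting": 0.9}

# Product factors bridge concrete measures and abstract quality aspects.
product_factors = {
    "duplication": aggregate([(1.0, measures["clone_coverage"])]),
    "documentation": aggregate([(1.0, measures["comment_ratio"])]),
    "structuredness": aggregate([(1.0, measures["avg_nesting"])]),
}

# A quality attribute (in the style of ISO 25010) evaluated from factors.
maintainability = aggregate([
    (0.5, product_factors["duplication"]),
    (0.2, product_factors["documentation"]),
    (0.3, product_factors["structuredness"]),
])
print(round(maintainability, 3))  # → 0.79
```

In the real approach the aggregation functions are calibrated per factor; a plain weighted average is used here only to show the layered structure.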
Although we still see room for extending and improving the base model, it shows a high correspondence with expert opinions and can hence form the basis for repeatable and understandable quality assessments in practice.

Item Open Access
Exploring classification algorithms and data feature selection for domain specific industrial text data (2016)
Villanueva Zacarías, Alejandro Gabriel
Unstructured text data represents a valuable source of information that nonetheless remains under-utilised due to the lack of efficient methods to manipulate it and extract insights from it. One example of such deficiencies is the lack of suitable classification solutions that address the particular nature of domain-specific industrial text data. In this thesis, we explore the factors that impact the performance of classification algorithms, as well as the properties of domain-specific industrial text data, to propose a framework that guides the design of text classification solutions achieving an optimal trade-off between accuracy and processing time. Our research model investigates the effect that the availability of data features has on the observed performance of a classification algorithm. To explain this relationship, we build a series of prototypical Naïve Bayes algorithm configurations out of existing components and test them on two real datasets from a quality process of an automotive company. A key finding is that properly designed feature selection techniques can play a major role in achieving optimal performance, both in terms of accuracy and processing time, by providing the right amount of meaningful features.
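As a toy illustration of the kind of classifier configuration explored here, a multinomial Naïve Bayes with Laplace smoothing can be built from standard components. The tokens, labels, and training data below are invented, and the thesis's feature selection step is omitted for brevity.

```python
# Minimal multinomial Naive Bayes sketch for short industrial texts.
# Illustrative only: the thesis used its own prototype configurations
# and real automotive quality-process data.
import math
from collections import Counter, defaultdict

def train(docs):
    """docs: list of (tokens, label). Returns counts for Laplace-smoothed NB."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        class_counts[label] += 1
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return class_counts, word_counts, vocab

def classify(model, tokens):
    """Pick the label maximising log P(label) + sum of log P(token | label)."""
    class_counts, word_counts, vocab = model
    n_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label, count in class_counts.items():
        score = math.log(count / n_docs)
        denom = sum(word_counts[label].values()) + len(vocab)
        for tok in tokens:
            score += math.log((word_counts[label][tok] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

docs = [
    (["paint", "scratch", "door"], "surface"),
    (["paint", "blister", "hood"], "surface"),
    (["engine", "noise", "idle"], "drivetrain"),
    (["gearbox", "noise", "shift"], "drivetrain"),
]
model = train(docs)
print(classify(model, ["paint", "door"]))  # → surface
```

A feature selection stage would sit between tokenisation and training, pruning the vocabulary to the most informative tokens, which is where the thesis locates the accuracy/processing-time trade-off.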
We test our results for statistical significance, suggest an optimal solution for our application scenario, and conclude by describing the nature of the variable relationships contained in our research model.

Item Open Access
An industrial case study on the evaluation of a safety engineering approach for software-intensive systems in the automotive domain (2016)
Abdulkhaleq, Asim; Vöst, Sebastian; Wagner, Stefan; Thomas, John
Safety remains one of the most vital aspects of today's automotive systems. These systems, however, become ever more complex and dependent on software, which is responsible for most of their critical functions. The software components therefore need to be analysed and verified appropriately in the context of software safety. The complexity of software systems makes defining software safety requirements with traditional safety analysis techniques difficult. A technique called STPA (Systems-Theoretic Process Analysis), based on system and control theory, has been developed by Leveson to cope with complex systems. Based on STPA, we have developed a comprehensive software safety engineering approach in which the software and safety engineers integrate the analysis of software risks with their verification to recognize software-related hazards and reduce the risks to a low level. In this paper, we explore and evaluate the application of our approach to a real industrial system in the automotive domain. The case study analysed the software controller of the Active Cruise Control (ACC) system of the BMW Group.

Item Open Access
Application performance management: measuring and optimizing the digital customer experience (Troisdorf: SIGS DATACOM GmbH, 2018)
Hoorn, André van; Siegl, Stefan
Nowadays, the success of most companies is determined by the quality of their IT services and application systems.
To make sure that application systems provide the expected quality of service, it is crucial to have up-to-date information about the system and the user experience in order to detect problems and solve them effectively. Application performance management (APM) is a core IT operations discipline that aims to achieve an adequate level of performance during operations. APM comprises methods, techniques, and tools for (i) continuously monitoring the state of an application system and its usage, and (ii) detecting, diagnosing, and resolving performance-related problems using the monitored data. This book provides an introduction to APM by covering a common conceptual foundation. On top of this foundation, we introduce today's tooling landscape and highlight current challenges and directions of the discipline.

Item Open Access
Assessment of overload capabilities of power transformers by thermal modelling (2011)
Schmidt, Nicolas; Tenbohlen, Stefan; Skrzypek, Raimund; Dolata, Bartek
This contribution presents an approach to determine the overload capabilities of oil-cooled power transformers depending on the ambient temperature. For this purpose, the investigated method introduces a simplified, empirically based thermal model that predicts changes in oil temperature with high accuracy. This model considers the entire transformer as a single, homogeneously tempered body with a certain thermal capacity. All electrical losses are treated as an input of evenly distributed heat and assumed to be the sum of the load and no-load losses given by the transformer design. In contrast to earlier approaches, the heat exchange with the ambience is modelled as a complex function depending primarily on the temperature difference between the transformer and its surroundings. Furthermore, the loading rate, material properties, temperature levels, and emerging temperature gradients are taken into account as factors influencing the heat exchange.
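A single-body thermal model of this kind can be sketched as a forward-Euler integration of a heat balance. The simple linear heat-exchange term and all parameter values below are invented placeholders; the paper's actual heat-exchange function is more complex and its empirical factors are fitted per transformer.

```python
# Illustrative single-body thermal model: C * dtheta/dt = P_loss - k * (theta - theta_amb).
# All parameter values are invented; the paper fits empirical factors per transformer.

def simulate_top_oil(theta0, theta_amb, losses_w, c_th, k_w_per_k, dt_s, steps):
    """Forward-Euler integration of the oil temperature (degC) over time."""
    theta = theta0
    history = [theta]
    for _ in range(steps):
        d_theta = (losses_w - k_w_per_k * (theta - theta_amb)) / c_th
        theta += d_theta * dt_s
        history.append(theta)
    return history

# Example: constant losses at 20 degC ambient. The oil temperature approaches
# the steady state theta_amb + P/k = 20 + 50000 / 1000 = 70 degC.
temps = simulate_top_oil(theta0=20.0, theta_amb=20.0, losses_w=50_000.0,
                         c_th=5.0e6, k_w_per_k=1_000.0, dt_s=60.0, steps=5_000)
print(round(temps[-1], 1))  # → 70.0
```

Inverting such a model for a given hot-spot temperature limit is what yields the maximum admissible loading rate as a function of ambient temperature described below.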
To capture the behaviour of a specific transformer, the model employs several empirical factors. To determine these empirical factors, an evaluation period of two to four representative weeks of transformer operation is found to be sufficient. To validate the created model and test its operational reliability, measurement data from several ONAN and ONAF transformers are used. These data sets comprise the top-oil and ambient temperatures as well as the loading rate and the status of the cooling system. Furthermore, the corresponding nameplate data are integrated. Following the calculation of the top-oil temperature, the maximum constant loading rate that keeps the hot-spot temperature below the critical level is determined based on IEC 60076-7 [1]. Finally, a characteristic linear function is derived for each investigated transformer, giving the maximum loading rate depending solely on the ambient temperature. For the investigated ONAN and ONAF transformers in the power range of 31.5 to 63 MVA, significant overload potential could be disclosed.

Item Open Access
Water saturation limits and moisture equilibrium curves of alternative insulation systems (2011)
Tenbohlen, Stefan; Jovalekic, Mark; Bates, Lisa; Szewczyk, Radoslaw
This paper presents a method developed for establishing moisture equilibrium curves for any combination of liquid and solid insulation. Moisture saturation curves for natural and synthetic esters are presented for the temperature range up to 140°C, together with the curve for mineral oil as a reference. Sorption isotherms have been established for cellulose-based and aramid-fiber-based materials. Finally, moisture equilibrium diagrams have been created for given combinations of solids and liquids. Moisture equilibrium curves have been created for combinations of mineral oil and ester fluids with aramid-fiber-based papers and boards, as they are commonly used in alternative insulation systems.
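Applying such an equilibrium curve amounts to a lookup: from the water content measured in the liquid, at a given temperature, interpolate the corresponding water content of the solid insulation. The sketch below uses piecewise-linear interpolation; the curve points are invented placeholders, not measured values from the paper.

```python
# Sketch: interpolate solid-insulation water content (% by weight) from the
# water content measured in the liquid at one fixed temperature. The curve
# points below are invented placeholders, not the paper's measured curves.
from bisect import bisect_left

# (water in oil [ppm] -> water in solid [%]) at one fixed temperature.
curve_ppm = [5.0, 10.0, 20.0, 40.0]
curve_pct = [1.0, 1.8, 3.0, 4.8]

def water_in_solid(ppm):
    """Piecewise-linear interpolation along the equilibrium curve."""
    if ppm <= curve_ppm[0]:
        return curve_pct[0]
    if ppm >= curve_ppm[-1]:
        return curve_pct[-1]
    i = bisect_left(curve_ppm, ppm)
    x0, x1 = curve_ppm[i - 1], curve_ppm[i]
    y0, y1 = curve_pct[i - 1], curve_pct[i]
    return y0 + (y1 - y0) * (ppm - x0) / (x1 - x0)

print(water_in_solid(15.0))  # → 2.4
```

A full diagram would hold one such curve per temperature, with interpolation between temperatures as a second step.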
The new curves give information on the moisture distribution within alternative insulation systems and may be critical for setting material choices, design rules, and maintenance guidelines for equipment using these combinations. Only then can the materials be used optimally and their specific characteristics bring the full range of benefits to the equipment. Condition monitoring and diagnostics for the purpose of asset management will also be more reliable when these new characteristics are used. It has been observed that insulation components made of aramid may have a lower water content than conventional cellulose-based materials at the same water content measured in the dielectric liquid. As a result, the performance of aramid insulation components may be less sensitive to moisture in oil (aging processes, dielectric strength, partial discharge performance) compared to conventional systems based on cellulose.

Item Open Access
Naming the pain in requirements engineering: design of a global family of surveys and first results from Germany (2013)
Méndez Fernández, Daniel; Wagner, Stefan
Context: For many years, we have observed industry struggling to define high-quality requirements engineering (RE) and researchers trying to understand industrial expectations and problems. Although the discipline is being investigated in a plethora of empirical studies, those studies either concentrate on validating specific methods or on single companies or countries. Therefore, they allow only for limited empirical generalisations. Objective: To lay an empirical and generalisable foundation about the state of the practice in RE, we aim at a series of open and reproducible surveys that allow us to steer future research in a problem-driven manner. Method: We designed a globally distributed family of surveys in collaboration with researchers from different countries.
The instrument is based on an initial theory inferred from available studies. As a long-term goal, the survey will be replicated regularly to establish a clear understanding of the status quo and practical needs in RE. In this paper, we present the design of the family of surveys and first results from its start in Germany. Results: Our first results contain responses from 30 German companies. The results are not yet generalisable but already indicate several trends and problems. For instance, a commonly stated problem respondents see in their company standards is that artefacts are underrepresented, and important problems they experience in their projects are incomplete and inconsistent requirements. Conclusion: The results suggest that the survey design and instrument are well suited to be replicated and, thereby, to create a generalisable empirical basis on RE in practice.
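A basic analysis step for such survey replications is tallying how often each problem is reported across respondents. The sketch below assumes an invented list of reported problems per respondent; it is not the actual survey data.

```python
# Illustrative tally of survey responses. The respondent data below is
# invented; the actual instrument and responses are described in the paper.
from collections import Counter

responses = [
    ["incomplete requirements", "inconsistent requirements"],
    ["incomplete requirements", "moving targets"],
    ["inconsistent requirements", "incomplete requirements"],
]

problem_counts = Counter(p for answer in responses for p in answer)
for problem, count in problem_counts.most_common():
    print(f"{problem}: {count}")
```

Because every replication uses the same instrument, tallies like this can be compared across countries, which is the point of the survey family.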