Recent Submissions

Item (Open Access)
Simulative Bestimmung der Temperatur im Dichtkontakt von Radial-Wellendichtungen
(Stuttgart : Institut für Maschinenelemente, 2025) Feldmeth, Simon; Bauer, Frank (apl. Prof. Dr.-Ing.)
Radial shaft seals can leak when the elastomer sealing lip of the radial shaft seal ring (RWDR) is thermally damaged during operation. This happens when the temperature in the sealing contact exceeds the tolerable level, because too much frictional heat is generated in the sealing contact and/or this frictional heat cannot be dissipated to the surroundings well enough. Radial shaft seals made of particularly temperature-resistant elastomers are very expensive and might be banned in the future. For an economical yet functionally reliable selection of the seal, the temperatures that will later prevail in the contact area during operation must therefore already be known in the design phase. Measuring the temperature in the sealing contact requires a prototype as well as test equipment and is therefore very laborious. The aim of this work is thus to develop a method for predicting the temperature in the sealing contact of radial shaft seals. A simulation method was developed with which the temperature field in the entire sealing system and its installation environment can be simulated; from this, the temperature in the sealing contact can be determined. The CHT method (Conjugate Heat Transfer) is used, in which "classical" flow simulation (CFD) is extended by the calculation of heat conduction in the adjacent solids. The generation of frictional heat is integrated into the simulation model by means of user-defined functions, so that the friction power and heat can be modelled as a function of the operating conditions and the radial force of the seal. This function is based on empirical measurement data. In the simulation model, geometry features, operating conditions and material properties can be varied easily and systematically through parameterization. The simulation model is validated by comparison with temperatures in the immediate vicinity of the sealing contact measured by infrared thermography. To estimate the temperature in the sealing contact of radial shaft seals even more easily and quickly than by simulation, the approximation method "ExACT" was additionally developed. It is based on a thermal equivalent model and takes into account the eight most important factors influencing the temperature in the sealing contact, giving it a considerably broader range of application and higher accuracy than previous approximation methods. For simple and intuitive use of the ExACT method, the calculation tool InsECT was developed, which is freely accessible as a web app. Both the simulation method and the calculation tool InsECT with the ExACT approximation method it contains have already been used successfully several times in other research projects as well as in industry.
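The abstract describes the ExACT approximation only as a thermal equivalent model; purely as an illustration of that idea (the friction model, thermal resistances and operating data below are invented assumptions, not the empirical ExACT model from the thesis), a lumped estimate of the contact temperature could look like this:

```python
# Illustrative sketch only: a lumped thermal network for a rotary shaft seal
# contact. The friction model, thermal resistances and operating data are
# invented assumptions, not the empirical ExACT model from the thesis.

import math

def friction_power(radial_force_N, shaft_speed_rpm, shaft_d_m, mu=0.3):
    """Frictional power in the sealing contact, assuming a Coulomb-type model."""
    sliding_speed = math.pi * shaft_d_m * shaft_speed_rpm / 60.0   # m/s
    return mu * radial_force_N * sliding_speed                      # W

def contact_temperature(ambient_C, oil_C, P_friction_W,
                        R_to_oil_KpW=0.8, R_to_air_KpW=5.0):
    """Steady-state contact temperature from two parallel heat paths
    (into the oil sump and to ambient air), modelled as thermal resistances."""
    G_oil = 1.0 / R_to_oil_KpW
    G_air = 1.0 / R_to_air_KpW
    # Energy balance: P = G_oil*(T - T_oil) + G_air*(T - T_ambient)
    return (P_friction_W + G_oil * oil_C + G_air * ambient_C) / (G_oil + G_air)

P = friction_power(radial_force_N=30.0, shaft_speed_rpm=3000, shaft_d_m=0.08)
print(f"friction power ≈ {P:.0f} W, "
      f"contact temperature ≈ {contact_temperature(25.0, 80.0, P):.0f} °C")
```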
Item (Open Access)
Re-identification attacks to validate the privacy provided by anonymization
(2025) Below, Richard
When sensitive microdata about people is published, choosing a secure anonymization method is vital. Comparing the effectiveness of different anonymization methods is challenging due to their structural differences. Many re-identification attacks exist that attempt to reverse these methods and identify individuals. However, prior work typically evaluates privacy risks in isolation, focusing on a single anonymization technique at a time. This work proposes to compare different anonymization methods by simulating re-identification attacks on them. In a first step, an ontology that models the landscape of attacks and anonymization methods is created. Additionally, the attacks described in the literature are retrieved and analyzed with regard to which anonymization methods are susceptible to them. As specified in the ontology, the anonymization methods, attacks and their relationships are organized into a structured knowledge graph. Finally, a framework is created that makes these contributions seamlessly accessible. The framework provides access to the knowledge graph via an interactive visualization. In addition, attacks that can be simulated on custom data anonymized with different methods are implemented. After simulating the attacks, their success can serve as a state-of-the-art approximation of the actual re-identification risk and helps bridge the comparability gap between structurally different anonymization methods.
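Purely to illustrate what simulating a re-identification attack on anonymized data can look like (the framework's actual attack catalogue and interfaces are not described in the abstract; the data, quasi-identifiers and generalization below are invented), a naive linkage attack against a generalized release could be scored as follows:

```python
# Illustrative sketch only: a naive linkage attack on a generalized (k-anonymous)
# release, scored by how many background records fall into an equivalence class
# of size 1. Data, quasi-identifiers and generalization are invented.

from collections import Counter

def generalize(record):
    """Assumed generalization: 10-year age bands, ZIP code truncated to 3 digits."""
    age, zip_code, diagnosis = record
    return (age // 10 * 10, zip_code[:3]), diagnosis

def linkage_attack(published, background):
    """Fraction of background records that match exactly one equivalence class."""
    class_sizes = Counter(qid for qid, _ in published)
    re_identified = sum(
        1 for age, zip_code in background
        if class_sizes[(age // 10 * 10, zip_code[:3])] == 1
    )
    return re_identified / len(background)   # empirical re-identification rate

raw = [(34, "70173", "flu"), (37, "70174", "asthma"), (52, "71634", "flu")]
published = [generalize(r) for r in raw]
background = [(34, "70173"), (52, "71634")]          # attacker's side knowledge
print(f"re-identification rate ≈ {linkage_attack(published, background):.2f}")
```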
Item (Open Access)
High-rate intercity quantum key distribution with a semiconductor single-photon source
(2024) Yang, Jingzhong; Jiang, Zenghui; Benthin, Frederik; Hanel, Joscha; Fandrich, Tom; Joos, Raphael; Bauer, Stephanie; Kolatschek, Sascha; Hreibi, Ali; Rugeramigabo, Eddy Patrick; Jetter, Michael; Portalupi, Simone Luca; Zopf, Michael; Michler, Peter; Kück, Stefan; Ding, Fei
Quantum key distribution (QKD) enables the transmission of information that is secure against general attacks by eavesdroppers. The use of on-demand quantum light sources in QKD protocols is expected to help improve security and maximum tolerable loss. Semiconductor quantum dots (QDs) are a promising building block for quantum communication applications because of the deterministic emission of single photons with high brightness and low multiphoton contribution. Here we report on the first intercity QKD experiment using a bright deterministic single-photon source. A BB84 protocol based on polarisation encoding is realised using the high-rate single photons in the telecommunication C-band emitted from a semiconductor QD embedded in a circular Bragg grating structure. Utilising the 79 km long link with 25.49 dB loss (equivalent to 130 km of directly connected optical fibre) between the German cities of Hannover and Braunschweig, a record-high secret key yield of 4.8 × 10⁻⁵ bits per pulse with an average quantum bit error ratio of ~0.65% is demonstrated. An asymptotic maximum tolerable loss of 28.11 dB is found, corresponding to a length of 144 km of standard telecommunication fibre. Deterministic semiconductor sources therefore challenge state-of-the-art QKD protocols and have the potential to excel in measurement-device-independent protocols and quantum repeater applications.
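As a quick plausibility check on the quoted loss figures (assuming a typical attenuation of about 0.2 dB/km for standard telecom fibre, which is an assumption rather than a value stated in the abstract), the losses convert to equivalent fibre lengths as follows:

```python
# Sanity-check sketch: convert channel loss in dB to an equivalent length of
# standard telecom fibre. The 0.2 dB/km attenuation is a typical textbook value
# and an assumption here, not a figure taken from the paper.

ATTENUATION_DB_PER_KM = 0.2

def equivalent_fibre_km(loss_db, attenuation=ATTENUATION_DB_PER_KM):
    return loss_db / attenuation

for loss_db in (25.49, 28.11):
    print(f"{loss_db:5.2f} dB ≈ {equivalent_fibre_km(loss_db):.0f} km of fibre")
# Roughly consistent with the ~130 km and ~144 km equivalents quoted above.
```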
Item (Open Access)
LUBAC enables tumor-promoting LTβ receptor signaling by activating canonical NF-κB
(2024) Chen, Yu-Guang; Rieser, Eva; Bhamra, Amandeep; Surinova, Silvia; Kreuzaler, Peter; Ho, Meng-Hsing; Tsai, Wen-Chiuan; Peltzer, Nieves; de Miguel, Diego; Walczak, Henning
Lymphotoxin β receptor (LTβR), a member of the TNF receptor superfamily (TNFR-SF), is essential for development and maturation of lymphoid organs. In addition, LTβR activation promotes carcinogenesis by inducing a proinflammatory secretome. Yet, we currently lack a detailed understanding of LTβR signaling. In this study we discovered the linear ubiquitin chain assembly complex (LUBAC) as a previously unrecognized and functionally crucial component of the native LTβR signaling complex (LTβR-SC). Mechanistically, LUBAC-generated linear ubiquitin chains enable recruitment of NEMO, OPTN and A20 to the LTβR-SC, where they act coordinately to regulate the balance between canonical and non-canonical NF-κB pathways. Thus, different from death receptor signaling, where LUBAC prevents inflammation through inhibition of cell death, in LTβR signaling LUBAC is required for inflammatory signaling by enabling canonical and interfering with non-canonical NF-κB activation. This results in a LUBAC-dependent LTβR-driven inflammatory, protumorigenic secretome. Intriguingly, in liver cancer patients with high LTβR expression, high expression of LUBAC correlates with poor prognosis, providing clinical relevance for LUBAC-mediated inflammatory LTβR signaling.
Item (Open Access)
A sphingolipid rheostat controls apoptosis versus apical cell extrusion as alternative tumour-suppressive mechanisms
(2024) Armistead, Joy; Höpfl, Sebastian; Goldhausen, Pierre; Müller-Hartmann, Andrea; Fahle, Evelin; Hatzold, Julia; Franzen, Rainer; Brodesser, Susanne; Radde, Nicole E.; Hammerschmidt, Matthias
Evasion of cell death is a hallmark of cancer, and consequently the induction of cell death is a common strategy in cancer treatment. However, the molecular mechanisms regulating different types of cell death are poorly understood. We have formerly shown that in the epidermis of hypomorphic zebrafish hai1a mutant embryos, pre-neoplastic transformations of keratinocytes caused by unrestrained activity of the type II transmembrane serine protease Matriptase-1 heal spontaneously. This healing is driven by Matriptase-dependent increased sphingosine kinase (SphK) activity and sphingosine-1-phosphate (S1P)-mediated keratinocyte loss via apical cell extrusion. In contrast, amorphic hai1a^fr26 mutants with even higher Matriptase-1 and SphK activity die within a few days. Here we show that this lethality is not due to epidermal carcinogenesis, but to aberrant tp53-independent apoptosis of keratinocytes caused by increased levels of pro-apoptotic C16 ceramides, sphingolipid counterparts to S1P within the sphingolipid rheostat, which severely compromises the epidermal barrier. Mathematical modelling of sphingolipid rheostat homeostasis, combined with in vivo manipulations of components of the rheostat or the ceramide de novo synthesis pathway, indicate that this unexpected overproduction of ceramides is caused by a negative feedback loop sensing ceramide levels and controlling ceramide replenishment via de novo synthesis. Therefore, despite their initial decrease due to increased conversion to S1P, ceramides eventually reach cell death-inducing levels, making transformed pre-neoplastic keratinocytes die even before they are extruded, thereby abrogating the normally barrier-preserving mode of apical live cell extrusion. Our results offer an in vivo perspective of the dynamics of sphingolipid homeostasis and its relevance for epithelial cell survival versus cell death, linking apical cell extrusion and apoptosis. Implications for human carcinomas and their treatments are discussed.
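The abstract mentions mathematical modelling of the sphingolipid rheostat with a negative feedback loop on ceramide de novo synthesis; purely as an illustration of that modelling idea (the paper's actual model structure, species and parameters are not reproduced here and are assumptions), a minimal two-species ODE with feedback-controlled ceramide production could be written as:

```python
# Illustrative sketch only: a minimal "sphingolipid rheostat" with ceramide (C)
# converted to S1P (S) by sphingosine kinase activity, and de novo ceramide
# synthesis up-regulated when C drops below a set point (negative feedback).
# Structure and parameters are invented, not taken from the paper.

def simulate(k_conv=0.5, k_deg=0.1, set_point=1.0, gain=2.0,
             dt=0.01, t_end=50.0):
    C, S = 1.0, 0.1            # initial ceramide and S1P levels (arbitrary units)
    for _ in range(int(t_end / dt)):
        synthesis = gain * max(set_point - C, 0.0)   # feedback-driven de novo input
        dC = synthesis - k_conv * C                  # ceramide: produced vs converted
        dS = k_conv * C - k_deg * S                  # S1P: produced vs degraded
        C += dC * dt
        S += dS * dt
    return C, S

C, S = simulate()
print(f"steady state: ceramide ≈ {C:.2f}, S1P ≈ {S:.2f}")
```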
Item (Open Access)
Künstliche Intelligenz in betrieblichen Prozessen : Ein Vorgehensmodell zur partizipativen Gestaltung von KI-Anwendungen
(2024) Ruess, Patrick; Staffa, Anna; Kreutz, Anna; Busch, Christine; Saba Gayoso, Christian Oswaldo; Pollmann, Kathrin
Artificial intelligence (AI) is already regarded as a factor of operational value creation, from which companies expect new impulses for existing processes and business models. While the current discourse focuses primarily on technical possibilities and use cases, successful integration into company operations also involves essential social and organizational aspects. This article therefore identifies intra- and inter-company requirements that enable an employee-centred and participatory design of AI applications in the workplace. The empirical basis is an interview study that examined the use of AI in different industries and business units. Building on this, a process model is introduced that, in line with the identified criteria, allows participatory involvement in the design of workplace AI applications. The model addresses qualification and acceptance building within the workforce as well as organizational implementation and consolidation. The approach links infrastructural, interactive and conceptual building blocks and aims to foster employee participation in all phases of AI development and to take it into account during implementation in the company. The results of this research offer practical starting points for companies dealing with questions of AI implementation. At the same time, they complement the current scientific discourse with a perspective that calls for consistent workplace-level design and participation. The requirements identified for this purpose complement the empirical basis in research.
Item (Open Access)
Chemical heat derived from rocket-borne WADIS-2 experiment
(2024) Grygalashvyly, Mykhaylo; Strelnikov, Boris; Strelnikova, Irina; Rapp, Markus; Lübken, Franz-Josef; Schütt, Corinna; Stephan, Claudia; Eberhart, Martin; Löhle, Stefan; Fasoulas, Stefanos
Chemical heating rates were derived from three of the most significant reactions based on the analysis of common volume rocket-borne measurements of temperature, atomic oxygen densities, and neutral air densities. This is one of the first instances of the retrieval of nighttime chemical heat through the utilization of non-emissive observations of atomic oxygen concentrations, obtained through in situ measurements performed at the Andøya Space Center (69°N, 16°E) at 01:44:00 UTC on 5 March 2015. Furthermore, we determine the heating efficiency for the reaction of atomic hydrogen with ozone, one of the most significant reactions, and illustrate the methodology for such calculations based on known atomic oxygen densities and temperature. Subsequently, using ozone values obtained from satellite observations, we retrieved odd-hydrogen species and the total chemical heat. Finally, we compared the retrieved chemical heat with the heat from turbulent energy dissipation. Our findings reveal that the vertically averaged chemical heat is greater than the heat from turbulent energy dissipation throughout the entire mesopause region during nocturnal conditions. The heating rates of turbulent energy dissipation may exceed the chemical heating rates only in narrow peaks, several hundred meters wide.
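As an illustration of how a chemical heating rate is obtained from measured quantities (rate coefficient, concentrations, reaction enthalpy and a heating efficiency; the concentrations, temperature and efficiency below are placeholder assumptions, not the rocket data), the H + O3 → OH + O2 contribution could be estimated like this:

```python
# Illustrative sketch only: chemical heating from H + O3 -> OH + O2 for assumed
# mesopause-like values. The rate coefficient uses the common Arrhenius form;
# densities, temperature and heating efficiency are placeholder assumptions.

import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
EV  = 1.602176634e-19     # J per eV

def heating_rate_K_per_day(T, n_H, n_O3, n_air, efficiency=0.6):
    """Heating rate in K/day from H + O3; all number densities in cm^-3."""
    k = 1.4e-10 * math.exp(-470.0 / T)            # rate coefficient, cm^3 s^-1
    q = efficiency * k * n_H * n_O3 * 3.34 * EV   # heat release, J cm^-3 s^-1
    heat_capacity = n_air * 3.5 * K_B             # J cm^-3 K^-1 (diatomic air)
    return q / heat_capacity * 86400.0

print(f"{heating_rate_K_per_day(T=190.0, n_H=1e8, n_O3=3e8, n_air=1.7e14):.2f} K/day")
```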
Item (Open Access)
Modeling and mitigation of vortex formation in ejector deep hole drilling with smoothed particle hydrodynamics
(2024) Baumann, Andreas; Gerken, Julian Frederic; Sollich, Daniel; Rupasinghe, Nuwan; Biermann, Dirk; Eberhard, Peter
Ejector deep hole drilling achieves high-quality boreholes in production processes. High feed rates are applied to ensure a high productivity level, requiring reliable chip removal from the cutting zone for a stable process. Therefore, a constant metalworking fluid flow under high volume flow rates or high pressure is required. Experimental results show a vortex formation at the outer cutting edge. This vortex can lead to delayed chip removal from the cutting zone, and ultimately, it can lead to chip clogging and result in drill breakage due to increased torque. This paper investigates modified drill head designs using the smoothed particle hydrodynamics method. The investigated modifications include various designs of the chip mouth covering. Besides graphical analysis based on flow visualizations, flow meters are placed at the tool’s head to evaluate the impact of the modifications on the flow rate and possible increased resistance and relocation of the fluid flow from the outer cutting edge to other parts of the tool. The simulation results for the reference design show the experimentally observed vortex formation, validating the simulation model. By adding the tool’s rotation in the SPH simulation, which is not included in the experiments for observation reasons, the vortex formation is positively influenced. In addition, some designs show promising results to further mitigate the vortex formation while maintaining a sufficient fluid flow around the cutting edges.
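As an illustration of the kind of flow-rate evaluation described (the actual SPH solver and the placement of the measurement planes in the paper are not specified here; the particle data and numbers below are assumptions), a virtual flow meter can be realized by summing the volume of particles that cross a plane per time step:

```python
# Illustrative sketch only: a "virtual flow meter" for particle data, summing the
# particle volume crossing a fixed plane z = z_plane per time step.
# Particle positions and volumes are invented; this is not the solver from the paper.

def flow_rate(pos_prev, pos_curr, particle_volume, z_plane, dt):
    """Signed volumetric flow rate (m^3/s) through the plane z = z_plane."""
    crossed = 0.0
    for (x0, y0, z0), (x1, y1, z1) in zip(pos_prev, pos_curr):
        if z0 < z_plane <= z1:      # crossed in +z direction
            crossed += particle_volume
        elif z1 <= z_plane < z0:    # crossed in -z direction
            crossed -= particle_volume
    return crossed / dt

prev = [(0.0, 0.0, 0.9e-3), (0.0, 1.0e-3, 1.2e-3)]
curr = [(0.0, 0.0, 1.1e-3), (0.0, 1.0e-3, 1.3e-3)]
print(f"{flow_rate(prev, curr, particle_volume=1e-12, z_plane=1.0e-3, dt=1e-5):.2e} m^3/s")
```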
Item (Open Access)
Closed-loop laser volume ablation with adaptive scan paths
(2024) Buser, Matthias; Menold, Tobias; Michalowski, Andreas
This research focuses on closed-loop control in laser volume ablation, also known as laser milling. Such process control enables precise ablation results on workpieces with much wider tolerances regarding the initial surface geometry, internal structure, or response to the incident laser beam, compared to conventional open-loop processing. However, state-of-the-art closed-loop ablation systems incorporate process control at the cost of increased processing time. The two main causes are the alternation between processing and measuring, and the use of static scan paths that do not adapt continuously to the evolving geometry of the workpiece during processing. This study addresses this issue by proposing a parallelized workflow of processing, measuring the surface topography, and adaptive path planning, eliminating interruptions and achieving faster processing through continuously optimized scan paths. The realized machining system achieved a mean reduction in processing time of 29%, 36%, and 52% on three different test geometries compared to the state of the art.
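As a rough illustration of adaptive path planning on an evolving depth map (the paper's actual planning algorithm, scanner interface and data formats are not given here; the grid-based selection rule below is an assumption), each pass could restrict the scan to cells where material still has to be removed:

```python
# Illustrative sketch only: pick scan spans for the next pass from a measured
# depth-error map (target depth minus current depth), so that only regions with
# remaining material are processed. The selection rule and grid are assumptions.

def plan_scan_rows(depth_error, threshold=0.005):
    """Return, per grid row, the column spans where the remaining depth
    exceeds the threshold (values in mm)."""
    paths = []
    for y, row in enumerate(depth_error):
        spans, start = [], None
        for x, err in enumerate(row + [0.0]):        # sentinel closes an open span
            if err > threshold and start is None:
                start = x
            elif err <= threshold and start is not None:
                spans.append((start, x - 1))
                start = None
        if spans:
            paths.append((y, spans))
    return paths

error_map = [
    [0.00, 0.01, 0.02, 0.00],
    [0.00, 0.00, 0.03, 0.03],
    [0.00, 0.00, 0.00, 0.00],
]
for y, spans in plan_scan_rows(error_map):
    print(f"row {y}: scan columns {spans}")
```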
Item (Open Access)
Building a fully-automatized active learning framework for the semantic segmentation of geospatial 3D point clouds
(2024) Kölle, Michael; Walter, Volker; Sörgel, Uwe
In recent years, significant progress has been made in developing supervised Machine Learning (ML) systems such as Convolutional Neural Networks. However, it is crucial to recognize that the performance of these systems heavily relies on the quality of labeled training data. To address this, we propose a shift in focus towards developing sustainable methods of acquiring such data instead of solely building new classifiers in the ever-evolving ML field. Specifically, in the geospatial domain, the process of generating training data for ML systems has been largely neglected in research. Traditionally, experts have been burdened with the laborious task of labeling, which is not only time-consuming but also inefficient. In our system for the semantic interpretation of Airborne Laser Scanning point clouds, we break with this convention and completely remove labeling obligations from domain experts who have completed special training in geosciences, and instead adopt a hybrid intelligence approach. This involves active and iterative collaboration between the ML model and humans through Active Learning, which identifies the most critical samples that justify manual inspection. Only these samples (typically ≪ 1% of the Passive Learning training points) are subject to human annotation. To carry out this annotation, we choose to outsource the task to a large group of non-specialists, referred to as the crowd, which comes with the inherent challenge of guiding these inexperienced annotators (i.e., "short-term employees") to still produce labels of sufficient quality. However, we acknowledge that attracting enough volunteers for crowdsourcing campaigns can be challenging due to the tedious nature of labeling tasks. To address this, we propose employing paid crowdsourcing and providing monetary incentives to crowdworkers. This approach ensures access to a vast pool of prospective workers through the respective platforms, ensuring timely completion of jobs. Effectively, crowdworkers become human processing units in our hybrid intelligence system, mirroring the functionality of electronic processing units.
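As a generic illustration of the Active Learning loop described (the actual classifier, point-cloud features and selection criterion from the paper are not reproduced here; the least-confidence rule and toy data are assumptions), one iteration of selecting the most critical samples for crowd annotation might look like this:

```python
# Illustrative sketch only: one iteration of pool-based Active Learning with
# least-confidence sampling. Classifier, features and sample names are toy
# stand-ins for the point-cloud pipeline described in the abstract.

import random

def predict_proba(sample):
    """Placeholder classifier: pretend class probabilities for two classes."""
    random.seed(hash(sample) % (2**32))
    p = random.random()
    return [p, 1.0 - p]

def select_for_annotation(unlabeled_pool, budget):
    """Pick the `budget` samples the classifier is least confident about."""
    def confidence(sample):
        return max(predict_proba(sample))
    return sorted(unlabeled_pool, key=confidence)[:budget]

pool = [f"point_{i}" for i in range(1000)]
queries = select_for_annotation(pool, budget=5)
print("send to crowd for labeling:", queries)
```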