1. Introduction 1.1 History 1.2 Computer architecture 1.3 Memory addresses 1.4 Binary numbers 1.5 Data types in hardware 1.6 Data types in software 1.7 Software layers 1.8 Equations and statements
2. Introduction to C 2.1 Compile - Link - Run 2.2 Straight-line programs 2.3 Basic data types 2.4 Expressions, assignments 2.5 Operators 2.6 Standard functions 2.7 Blocks, branches 2.8 Loops (while, for) 2.9 Arrays and structures 2.10 Enumeration types, type definitions 2.11 Pointers 2.12 Functions 2.13 Data files 2.14 String functions (legacy) 2.15 Storage classes, initialization, type conversions
3. C++ with MFC 3.1 Classes 3.2 Structure of a class 3.3 Example class CNurEineZahl 3.4 C++ basics
4. Introduction to Java 4.1 Java fundamentals 4.2 Example: ticket machine 4.3 Example: sine curve 4.4 Numerical solution of ODEs
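Topic 4.4 above covers the numerical solution of ordinary differential equations. As a minimal, hedged sketch of the idea (in Python rather than the course's Java, with a made-up test equation):

```python
import math

# Explicit Euler method for y' = f(t, y): a minimal sketch of the idea
# behind "numerical solution of ODEs". The test equation y' = -y with
# y(0) = 1 has the exact solution exp(-t), so the error is checkable.

def euler(f, y0, t0, t1, n):
    """Integrate y' = f(t, y) from t0 to t1 with n explicit Euler steps."""
    t, y = t0, y0
    h = (t1 - t0) / n
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

y_end = euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
error = abs(y_end - math.exp(-1.0))  # O(h) discretization error
```

Halving the step size (doubling n) roughly halves the error, which is the expected first-order behavior of the method.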
Mathematics 2 for MEB/MM
(2018)
Vector algebra: representation of vectors, components, unit vectors, addition, subtraction, projection, dot product, cross product, lines, planes
Linear algebra: representation, arithmetic with matrices and vectors, rotation matrices, systems of linear equations (GAUSS or GAUSS-JORDAN elimination, inverse matrix, over- and underdetermined systems), determinants (rank of a matrix, rule of SARRUS, CRAMER's rule), eigenvalues and eigenvectors of a square matrix
Fourier series: (Fourier series, real, complex, arbitrary period, pointwise-defined functions), waves (temporal and spatial propagation of waves, amplitude, frequency, phase)
Differential calculus for functions of several variables: graphical representation, scalar and vector fields, partial derivatives, the differential, gradient, chain rules, error propagation
Integral calculus for functions of several variables: constant limits, product decomposition, coordinate systems, polar coordinates, cylindrical coordinates, spherical coordinates, moments of inertia, variable limits
Ordinary differential equations: classification, initial values and boundary conditions, separation of variables, variation of constants, ODEs with constant coefficients, exponential ansatz, homogeneous and inhomogeneous ODEs, characteristic equation, particular solutions, systems of linear ODEs
Laplace transform: properties, solving an ODE, correspondence tables, partial fraction decomposition, inverse transform, transfer function
Practice examples, sample exam
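Two of the topics above, systems of linear equations and eigenvalues, can be illustrated in a few lines (a sketch with invented numbers; NumPy's solver performs the elimination internally via an LU decomposition):

```python
import numpy as np

# Solve the linear system A @ x = b (Gauss elimination is what
# np.linalg.solve does internally) and compute the eigenvalues of A,
# i.e., the roots of det(A - lambda*I) = 0.

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)           # x = [0.8, 1.4]
eigenvalues = np.linalg.eigvals(A)  # (5 ± sqrt(5)) / 2
```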
Introduction (example: distillation unit)
1. P&I diagrams (letter codes, symbols, examples)
2. Introduction to open-loop control (sequential control, graphs, commands)
3. Introduction to closed-loop control 3.1 Control loop (block diagram, controlled system, controller, feedback) 3.2 Two-point controller, hysteresis 3.3 Three-point controller, tolerance band 3.4 Control configurations (single control loop, feedforward, cascade, multivariable control) 3.5 Control tasks
4. Components of a control loop 4.1 Measurement acquisition (sensors, transmitters, bus, isolation amplifiers) 4.2 Measurement recording 4.3 Signal processing (controllers, microcontrollers, PLC (programmable logic controller), PC) 4.4 Actuators (valve, damper, motor, pump, compressor, positioner, S and K algorithms) 4.5 Controlled systems (4 basic types: PT1, PTn, I, ITn)
5. Control with a PLC 5.1 Introduction (architecture, program execution) 5.2 FBD programming (logic, flip-flops, timers, clocked sequences, message and output sections, OB1) 5.3 Pulses, counters, operating-mode section
6. Transfer elements (Laplace transform, transfer function, Nyquist plot, Bode diagram) 6.1 P element (gain, linearization) 6.2 PT1 element (ODE, response, identification), DT1 element (ODE, response, identification), I element (ODE, response, identification) 6.3 Composite transfer elements: PTn element (ODE, response, identification), ITn element (ODE, response, identification)
7. Continuous controllers 7.1 P controller (control deviation, proportional band) 7.2 PID controller, PIDT1 controller (ODE, parameters) 7.3 Controller selection and tuning (Ziegler-Nichols, Chien et al.)
Appendix: exercise sheets 1 + 2, dictionary
If one compares an automation system with the human body, the computers (PC, PLC, microcontroller) are the brain, the signal lines the nerves, the sensors the eyes and ears, the actuators the muscles, and the power lines the blood vessels. All parts are important and must fit together.
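The two-point controller with hysteresis from section 3.2 can be sketched as a toy thermostat loop; the plant model and every number below are invented for illustration, not taken from the course:

```python
# Two-point (on/off) controller with hysteresis: the heater switches on
# below the lower threshold and off above the upper one, so the
# temperature settles into a limit cycle inside the dead band.

def simulate(t0=15.0, setpoint=20.0, hysteresis=1.0, steps=500):
    temp, heater = t0, False
    history = []
    for _ in range(steps):
        if temp < setpoint - hysteresis / 2:    # below band: switch on
            heater = True
        elif temp > setpoint + hysteresis / 2:  # above band: switch off
            heater = False
        # crude first-order plant: heating power minus ambient losses
        temp += 0.8 * heater - 0.05 * (temp - 10.0)
        history.append(temp)
    return history

temps = simulate()
```

After the initial heating phase, the temperature oscillates around the setpoint with an amplitude set by the hysteresis width plus the per-step plant dynamics.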
Biomedical Statistics
(2024)
Classical confirmatory statistics, also called frequentist statistics, assumes that one can in theory draw infinitely many samples and that the test statistic computed from those samples then follows a particular distribution under the null hypothesis H0. Test statistics are usually constructed so that, when all samples are drawn from the same population (i.e., the null hypothesis H0 holds), the statistic is distributed around zero, for example as a bell curve, meaning small values predominate. Large values of the test statistic occur with low probability and signal a possible exception. Rather than assuming that one of these rare large values has occurred, one prefers to conclude that the populations differ (alternative hypothesis HA).
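The sampling logic described here can be made concrete with a small simulation; this is an illustrative sketch with invented numbers, not the course material itself. The H0 distribution of a simple test statistic (difference of sample means) is approximated by repeatedly drawing both samples from the same population, and a p-value is read off from that distribution:

```python
import random
import statistics

# Under H0, both samples come from the same population, so the
# difference of sample means clusters around zero; a large observed
# difference is rare under H0 and is taken as evidence for HA.

random.seed(1)

def statistic(xs, ys):
    return statistics.mean(xs) - statistics.mean(ys)

def h0_distribution(n=30, draws=2000):
    # repeatedly draw BOTH samples from the same population (H0 true)
    return [statistic([random.gauss(0, 1) for _ in range(n)],
                      [random.gauss(0, 1) for _ in range(n)])
            for _ in range(draws)]

null_stats = h0_distribution()
observed = 0.9  # a hypothetical observed difference of means
# p-value: how often does H0 alone produce a value at least this extreme?
p = sum(abs(s) >= observed for s in null_stats) / len(null_stats)
# small p -> reject H0 in favor of HA
```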
Design and implementation of a camera-based application for a Nine Men's Morris game on the Panda robot
(2023)
Operating principles and comparison of methods for generating point clouds in a shipping station
(2023)
Injury to the anterior or posterior cruciate ligament is not uncommon; even a single rapid movement can cause it. Reconstruction of the anterior and posterior cruciate ligaments (ACL/PCL) in the knee joint is a complex orthopedic challenge. In this context, the development of a graft preparation system is gaining importance. Such a system helps surgeons prepare grafts accurately and efficiently to optimize reconstruction processes. This can increase the accuracy of graft placement, reduce surgical times, and decrease recovery time and postoperative risk for patients.
This paper focuses on the development of a graft preparation system for ACL/PCL reconstruction in collaboration with Getsch+Hiller Medizintechnik. The challenges and requirements in developing such a system were analyzed to ensure functionality and effectiveness. Existing approaches and technologies were examined to identify potential improvements and innovations. Development included extensive research on ACL/PCL reconstruction using grafts, including various surgical techniques. A detailed analysis of the graft preparation process was performed to identify critical issues. The design of the graft preparation system is based on research data and is performed using CAD. The needs of the target group, customer requirements and findings from research were considered.
This bachelor's thesis comprehensively examines the threat that freely available drones pose to the information security of organizations. Using a combination of literature analysis, expert interviews, and a case study, it examines in detail the variety of threats and the feasibility of drone-supported attacks on information security. Drones turn out to be a real and multifaceted danger. Estimating the probability of such attacks is complex owing to the dynamic development of drone technology and the variety of possible applications. The thesis stresses the need for organizations to develop effective defense strategies and points to the importance of further research in this area in order to keep pace with technological developments and a constantly changing threat landscape.
Introduction: The present study investigated the role of training intensity in the dose–response relationship between endurance training and cardiorespiratory fitness (CRF). The hypothesis was that beginners would benefit from an increase in training intensity after an initial training phase, even if the energy expenditure was not altered. For this purpose, 26 weeks of continuous moderate training (control group, CON) was compared to training with gradually increasing intensity (intervention group, INC) but constant energy expenditure.
Methods: Thirty-one healthy, untrained subjects (13 men, 18 women; 46 ± 8 years; body mass index 25.4 ± 3.3 kg m−2; maximum oxygen uptake, VO2max, 34 ± 4 ml min−1 kg−1) trained for 10 weeks with moderate intensity [3 days/week for 50 min/session at 55% heart rate reserve (HRreserve)] before allocation to one of two groups. A minimization technique was used to ensure homogeneous groups. While group CON continued with moderate intensity for 16 weeks, the INC group trained at 70% HRreserve for 8 weeks and thereafter participated in a 4 × 4 training program (high-intensity interval training, HIIT) for 8 weeks. Constant energy expenditure was ensured by indirect calorimetry and corresponding adjustment of the training volume. Treadmill tests were performed at baseline and after 10, 18, and 26 weeks.
Results: The INC group showed improved VO2max (3.4 ± 2.7 ml kg−1 min−1) to a significantly greater degree than the CON group (0.4 ± 2.9 ml kg−1 min−1) (P = 0.020). In addition, the INC group exhibited improved Vmax (1.7 ± 0.7 km h−1) to a significantly greater degree than the CON group (1.0 ± 0.5 km h−1) (P = 0.001). The reduction of resting HR was significantly larger in the INC group (7 ± 4 bpm) than in the CON group (2 ± 6 bpm) (P = 0.001). The mean heart rate in the submaximal exercise test was reduced significantly in the CON group (5 ± 6 bpm; P = 0.007) and in the INC group (8 ± 7 bpm; P = 0.001), without a significant interaction between group and time point.
Data processed in context is more meaningful, easier to understand, and carries more information, because it derives semantic meaning from its surroundings. This also holds in the field of acoustic signal processing. In this work, a deep-learning approach using ensemble neural networks to integrate context into a learning system is presented. Different use cases are considered, and the method is demonstrated on acoustic machine-sound data for valves, pumps, and slide rails. Mel-spectrograms are used to train convolutional neural networks, so that acoustic data can be analyzed with image-processing techniques.
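The mel-spectrogram front end mentioned here can be sketched in NumPy; the filter-bank construction follows the standard mel formula, while the signal, sizes, and parameters are illustrative and not the study's actual pipeline:

```python
import numpy as np

# Build a triangular mel filter bank and apply it to a power
# spectrogram, yielding the 2-D "image" that a CNN would consume.

def mel_filterbank(n_mels, n_fft, sr):
    hz_to_mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
    mel_to_hz = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    # filter corner frequencies, equally spaced on the mel scale
    mel_pts = np.linspace(0.0, hz_to_mel(sr / 2), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(n_mels):
        lo, mid, hi = bins[i], bins[i + 1], bins[i + 2]
        for k in range(lo, mid):                 # rising edge
            fb[i, k] = (k - lo) / max(mid - lo, 1)
        for k in range(mid, hi):                 # falling edge
            fb[i, k] = (hi - k) / max(hi - mid, 1)
    return fb

sr, n_fft = 16000, 512
t = np.arange(sr) / sr
signal = np.sin(2 * np.pi * 440.0 * t)               # 1 s test tone
frames = signal[: len(signal) // n_fft * n_fft].reshape(-1, n_fft)
power = np.abs(np.fft.rfft(frames, axis=1)) ** 2     # power spectrogram
mel_spec = power @ mel_filterbank(40, n_fft, sr).T   # (frames, mel bands)
```

In practice a windowed, overlapping STFT and a log compression would precede the CNN; both are omitted here for brevity.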
Separation of ventilation and cardiac activity on recorded voltages before EIT image reconstruction
(2023)
For many practitioners, considering sustainability during a software development project is a challenge. The Sustainability Awareness Framework (SusAF) is a tool for thinking through the short-, medium-, and long-term impacts of socio-technical systems on their surrounding environment. While SusAF has been used by several companies, it is not yet widely adopted in industry. In this vision paper, we discuss options for extending the reach of SusAF and what it would take to evolve SusAF into a (de facto) standard.
Digital transformation is now reaching into topics like End-of-life Care, Funeral Culture, and Coping with Grief. These developments are inevitably accompanied by the growing challenge of designing IT systems that are appropriate and helpful for the stakeholders involved. Our aim in this paper is to further introduce the rather new combined research field of Socioinformatics and Thanatology (the scientific study of death and dying) and to present first results on which requirements to consider in the design of digital tools within 'Thanatopractice'. By using Participatory Design and the Sustainability Awareness Framework (SusAF) in the context of three workshops on socio-technical systems (Online Pastoral Care, Virtual Graveyards, and AI Memory Avatars), we want to sensitize software practitioners to the multidimensional impacts of their products and services in a field which the workshop participants often described as "highly sensitive".
In this paper, we derive set constraints for a reduced-order model and embed them in a model predictive control (MPC) scheme to ensure safe operation of the large-scale ensemble system. For the control feedback, only aggregated information about the whole system is required. For constraint satisfaction, we consider an adaptive tube formulation to characterize the deviation between the reduced-order model and the ensemble system. Employing a robust control invariant set, we ensure recursive feasibility and initial feasibility under an easily verifiable condition.
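The receding-horizon idea behind MPC can be illustrated with a deliberately tiny example; this is not the paper's ensemble formulation — the plant is a made-up scalar system, the set constraints are reduced to a simple input bound, and a coarse grid search stands in for a proper QP solver:

```python
import numpy as np

# Toy receding-horizon (MPC-style) loop for a scalar linear system
# x[k+1] = a*x[k] + b*u[k] with an input constraint |u| <= u_max.
# All numbers are invented for illustration.

def mpc_step(x, a, b, horizon=3, r=0.1, u_max=1.0):
    """Return the best constant input (grid-searched within the
    constraint) over the prediction horizon."""
    best_u, best_cost = 0.0, float("inf")
    for u in np.linspace(-u_max, u_max, 201):
        xk, cost = x, 0.0
        for _ in range(horizon):
            xk = a * xk + b * u
            cost += xk * xk + r * u * u
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

a, b = 1.2, 1.0   # open-loop unstable plant
x = 2.0
for _ in range(30):
    x = a * x + b * mpc_step(x, a, b)   # apply first input, re-optimize
```

At every step only the first input of the optimized sequence is applied and the problem is re-solved, which is the defining feature of a receding horizon.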
3D Computer Vision for the Industrial Metaverse - On the potentials of Neural Radiance Fields
(2023)
The industrial metaverse refers to the use of virtual reality (VR) and augmented reality (AR) technologies in the context of industry and manufacturing. It is envisioned as a shared, immersive digital space where people can interact with and manipulate virtual representations of physical objects and processes. The industrial metaverse has the potential to transform the way products are designed, manufactured, and maintained, enabling new levels of collaboration, automation, and innovation.
It further includes virtual representations of humans, also known as avatars. These avatars can be used to enable remote collaboration and communication between people in the virtual space. In this way, the industrial metaverse can facilitate virtual meetings, trainings, and other interactive experiences that involve human participants.
Neural Radiance Fields (NeRFs) are a powerful tool for synthesizing photorealistic images of 3D objects, including virtual representations of humans known as avatars. In this talk, we will discuss the potential applications of NeRFs in generating high-fidelity objects and avatars for use in the industrial metaverse.
Influence of Reconstruction Algorithms on Harmonic Analysis in Electrical Impedance Tomography
(2023)
The absolute value of recruitment-to-inflation ratio does not correlate with the recruited volume
(2023)
Since 2021, the state project Hochschulweiterbildung@BW has been running in Baden-Württemberg; its focus is the structural development of continuing scientific and artistic education.
The following article looks at the project's third pillar: the initiation and establishment of a structure of regional and subject-specific networking offices at the participating universities. This new instrument is first presented in its basic structure, an interim assessment is drawn at the project's halfway point, and the article then explores how the work of the regional and subject-specific networking offices helps bring the needs of business and society into better alignment with the universities' continuing-education offerings.
Laparoscopic Video Analysis Using Temporal, Attention, and Multi-Feature Fusion Based-Approaches
(2023)
Existing literature (Erling & Hingeldorf, 2006; Earls, 2014) indicates that there is a lack of formal policies at the macro- or meso-level governing the use of English in German higher education. This has led to a situation in which higher education institutions (HEIs) are required to formulate and implement their own policies and guidelines regarding English-medium instruction (EMI). As a growing number of HEIs adopt EMI (Wächter & Maiworm, 2014; Macaro et al., 2018) without access to policy guidelines, there is an urgent need to scrutinize the policy formulation and implementation processes at the institutional level. Such investigation is crucial to understand the complexities that come with tailoring EMI to unique institutional contexts, objectives, and stakeholder needs. We believe that this will enable more effective and equitable implementations, while also providing insights that could inform future policy recommendations. In this article, we analyze the motivations for drafting a language policy at a medium-sized German university of applied sciences (UAS) and investigate the attitudes and opinions towards EMI of three stakeholder groups: faculty members, administrative staff, and the student body. We were especially interested in exploring the rationales for implementing Bilingual Degree Programs (BDPs), as a variant of EMI, and how each stakeholder group influenced the formulation and implementation of the policy. To get an initial overview, we read institutional policy documents outlining the proposed language policy. We then complemented this documentary analysis by conducting a survey investigating the attitudes and opinions of the stakeholder groups using a questionnaire format (n=207). Finally, to gain deeper insights and triangulate data from the questionnaire, we conducted semi-structured interviews (n=18).
Analysis of the data indicates that the primary motivation for implementing BDPs is to attract greater numbers of international as well as domestic applicants to make up for an ongoing decline in student numbers. We also discovered that stakeholder groups hold different beliefs about BDPs, impacting their level of support for their implementation. We argue this is due to some groups within the institution being more influential in policy formulation, leading to feelings of disempowerment among individuals who are tasked with implementing BDPs but were not consulted in the policy formulation process. Finally, the institutional policy also appears to be driven by experience gained in implementation, resulting in policy enhancement over time. We assume this approach is a direct outcome of the lack of policy guidelines; we consider the issues that arise from such an approach and share implications of the current practice.
Quality assurance (QA) plays a crucial role in manufacturing to ensure that products meet their specifications. However, manual QA processes are costly and time-consuming, making artificial intelligence (AI) an attractive solution for automation and expert support. In particular, convolutional neural networks (CNNs) have attracted a lot of interest in visual inspection. Alongside AI methods, explainable artificial intelligence (XAI) systems, which achieve transparency and interpretability by providing insights into the AI's decision-making process, are promising for quality inspections in manufacturing processes. In this study, we conducted a systematic literature review (SLR) to explore AI and XAI approaches for visual QA (VQA) in manufacturing. Our objective was to assess the current state of the art and identify research gaps in this context. Our findings revealed that AI-based systems predominantly focus on visual quality control (VQC) for defect detection. Research addressing broader VQA practices, like process optimization, predictive maintenance, or root cause analysis, is rarer, and rarest of all are papers that utilize XAI methods. In conclusion, this survey emphasizes the importance and potential of AI and XAI in VQA across various industries. By integrating XAI, organizations can enhance model transparency, interpretability, and trust in AI systems. Overall, leveraging AI and XAI improves VQA practices and decision-making in industry.