ARTHUR – Distributed Measuring System for Synchronous Data Acquisition from Different Data Sources
(2023)
In industrial manufacturing lines, different machines are well orchestrated and applied for their well-defined purposes. Since each of these machines must be monitored and maintained, there are scenarios in which a Data Acquisition system brings enormous benefits. Because the cost of professional systems is often not appropriate or feasible for research projects or prototyping, a proof of concept is frequently achieved with end-user hardware. In this work, a distributed measurement system that supports data collection for AI-based projects in research and teaching is described. ARTHUR (meAsuRing sysTem witH distribUted sensoRs) is arbitrarily expandable and has so far been used for data acquisition on machine tools. Typical measured values include Acoustic Emission values, force plate X-Y-Z force values, simple PLC switching signals, OPC-UA machine parameters, etc., which were recorded by a wide variety of sensors. The overall ARTHUR system is based on Raspberry Pis and consists of a master node, multiple independent measurement worker nodes, a streaming system realized with Redis, as well as a gateway that stores the data in the cloud. The major objectives of the ARTHUR system are scalability and support for low-cost measuring components while relying solely on open-source software. The work at hand discusses the advantages and disadvantages of the hardware and software of this TCP/IP-based system.
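To make the node roles concrete, the following is a minimal sketch of how a measurement worker node could push sensor samples into the Redis-based streaming system; the stream key, host name "master.local", and the read_sensor() helper are illustrative assumptions, not the authors' actual implementation.

# Minimal sketch of a worker node publishing samples to a Redis stream.
# Stream key, host and the read_sensor() stub are hypothetical.
import time
import redis

STREAM = "arthur:acoustic_emission"   # hypothetical stream key per sensor channel

def read_sensor() -> float:
    """Placeholder for the actual sensor driver (e.g. an ADC read)."""
    return 0.0

def run_worker(host: str = "master.local") -> None:
    r = redis.Redis(host=host, port=6379)
    while True:
        value = read_sensor()
        # XADD appends one sample with a server-generated, timestamp-based ID
        r.xadd(STREAM, {"ts": time.time_ns(), "value": value})
        time.sleep(0.001)  # sampling interval depends on the sensor

if __name__ == "__main__":
    run_worker()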
On the way to the smart factory, manufacturing companies investigate the potential of Machine Learning approaches such as visual quality inspection, process optimisation, predictive maintenance and more. In order to assess the influence of Machine Learning based systems on business-relevant key figures, many companies follow a "test before invest" approach. This paper describes a novel and inexpensive distributed Data Acquisition System, ARTHUR (dAta collectoR sysTem witH distribUted sensoRs), which enables the collection of data for AI-based projects in research, education and industry. ARTHUR is arbitrarily expandable and has so far been used for data acquisition on machine tools. Typical measured values include Acoustic Emission values, force plate X-Y-Z force values, simple PLC signals, OPC-UA machine parameters, etc., which were recorded by a wide variety of sensors. The ARTHUR system consists of a master node, multiple measurement worker nodes, a local streaming system and a gateway that stores the data in the cloud. The authors describe the hardware and software of this system and discuss its advantages and disadvantages.
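Complementing the worker sketch above, the gateway side could look roughly like this: it reads new entries from the same stream and hands them to a cloud-upload stub. The stream key, batch size and upload target are assumptions for illustration only.

# Minimal sketch of the gateway consuming the stream and forwarding batches.
# upload_batch() stands in for whatever cloud storage client is actually used.
import redis

STREAM = "arthur:acoustic_emission"

def upload_batch(entries) -> None:
    """Placeholder for the cloud storage client (e.g. object store or time-series DB)."""
    print(f"uploading {len(entries)} samples")

def run_gateway(host: str = "master.local") -> None:
    r = redis.Redis(host=host, port=6379)
    last_id = "$"  # only consume entries arriving after startup
    while True:
        # Block up to 5 s waiting for new samples, then forward them in batches
        result = r.xread({STREAM: last_id}, count=500, block=5000)
        for _stream, entries in result:
            upload_batch(entries)
            last_id = entries[-1][0]  # remember the last delivered entry ID

if __name__ == "__main__":
    run_gateway()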
In modern industrial production lines, the integration and interconnection of various manufacturing components, such as robots, laser cutting machines, milling machines, CNC machines, etc., allows for a higher degree of autonomous production on the shop floor. Manufacturers of these increasingly complex machines are beginning to equip their business models with bidirectional data flows to other factories. This creates a digital, cross-company shop-floor infrastructure in which the transfer of information is controlled by digital contracts. To establish a trusted ecosystem, blockchain technology and a variety of technology stacks must be combined while ensuring security. Such blockchain-based frameworks enable bidirectional trust across all contract partners. Essential data flows are defined by a specific technical representation of the contract agreements and executed through smart contracts. This work describes a platform for rapid cross-company business model instantiation, based on blockchain, for establishing trust between the enterprises. It focuses on selected security aspects of the deployment and configuration processes applied in the industrial ecosystem. A threat analysis of the platform reveals the critical security risks. Based on an industrial dynamic machine leasing use case, a risk assessment and security analysis of the key platform components is carried out.
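As a rough illustration of the "technical representation of contract agreements" mentioned above, the sketch below serializes a hypothetical machine-leasing agreement deterministically and hashes it, so that only the hash would need to be anchored in a smart contract while the full terms stay off-chain. The field names and the hashing scheme are assumptions, not the platform's actual contract format.

# Illustrative sketch only: deterministic serialization and hashing of a
# hypothetical leasing agreement whose hash a smart contract could anchor.
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class LeasingAgreement:
    lessor: str            # machine manufacturer
    lessee: str            # factory operating the machine
    machine_id: str
    price_per_hour: float
    data_flows: list       # e.g. ["opcua:spindle_load", "telemetry:energy"]

def agreement_hash(a: LeasingAgreement) -> str:
    payload = json.dumps(asdict(a), sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

contract = LeasingAgreement("MachineCorp", "FactoryGmbH", "mill-42", 18.5,
                            ["opcua:spindle_load"])
print(agreement_hash(contract))  # value a smart contract could store and verify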
Nowadays, machine learning projects have become more and more relevant to various real-world use cases. The success of complex neural network models depends on many factors, which raises the need for structured, machine-learning-centric project development and management. Due to the multitude of tools available for the different operational phases, responsibilities and requirements become increasingly unclear. In this work, Machine Learning Operations (MLOps) technologies and tools for every part of the overall project pipeline, as well as the roles involved, are examined and clearly defined. With a focus on the interconnectivity of specific tools and their comparison against well-selected MLOps requirements, model performance, input data, and system quality metrics are briefly discussed. By identifying aspects of machine learning that can be reused from project to project, open-source tools that help in specific parts of the pipeline, and possible combinations, an overview of MLOps support is given. Deep learning has revolutionized the field of image processing, and building an automated machine learning workflow for object detection is of great interest to many organizations. For this, a simple MLOps workflow for object detection with images is presented.
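A tool-agnostic sketch of the kind of staged workflow discussed above might look as follows; the stage names, the quality gate and the bucket path are illustrative placeholders and do not reflect the paper's concrete toolchain.

# Sketch of a minimal staged MLOps workflow for object detection.
# All stages are stubs; real pipelines would delegate to dedicated tools.
from typing import Any, List

def ingest_images(source: str) -> List[Any]:
    """Pull labelled images from a data store (stub)."""
    return []

def train_detector(dataset: List[Any]) -> Any:
    """Train an object-detection model and return the artifact (stub)."""
    return object()

def evaluate(model: Any, dataset: List[Any]) -> float:
    """Compute a validation metric such as mAP (stub)."""
    return 0.0

def deploy(model: Any) -> None:
    """Push the model to a serving endpoint (stub)."""
    pass

def run_pipeline(source: str, map_threshold: float = 0.5) -> None:
    data = ingest_images(source)
    model = train_detector(data)
    score = evaluate(model, data)
    if score >= map_threshold:   # quality gate before deployment
        deploy(model)

run_pipeline("s3://datasets/objects")  # hypothetical bucket path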
The variety of data structures encountered today creates a need for individually tailored analysis platforms. The resources required depend on the respective use case. This work discusses brokers for virtualizing the processing applications, which are operated through an abstract dashboard. A Domain Specific Language enables the generation of a skeleton of the corresponding components, which are then enriched with individual logic. The architecture described largely concerns the handling of the flexible input data of virtualized processing platforms.
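The DSL-to-scaffold step can be pictured roughly as below: a declarative component description (standing in for the DSL) is turned into a class skeleton that a developer then fills with individual logic. The spec format and the generated code are hypothetical and serve only to illustrate the idea.

# Rough sketch: generate a component skeleton from a declarative description.
COMPONENT_SPEC = {
    "name": "SensorStreamProcessor",
    "inputs": ["raw_stream"],
    "outputs": ["aggregates"],
}

TEMPLATE = '''class {name}:
    """Auto-generated skeleton; fill in the processing logic."""

    def process(self, {args}):
        # TODO: individual logic producing: {outputs}
        raise NotImplementedError
'''

def generate_skeleton(spec: dict) -> str:
    return TEMPLATE.format(
        name=spec["name"],
        args=", ".join(spec["inputs"]),
        outputs=", ".join(spec["outputs"]),
    )

print(generate_skeleton(COMPONENT_SPEC))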