Formal Description of Use Cases for Industry 4.0 Maintenance Processes Using Blockchain Technology
(2019)
The rise of digital twins in the manufacturing industry brings new possibilities: process automation, condition monitoring, real-time simulation, and quality and maintenance prediction are just a few of the advantages that can be realized. This paper takes a novel approach by extracting the fundamental knowledge of a data set from a production process and mapping it to an expert fuzzy rule set. New augmented data is then generated by exploring the feature space of this fuzzy rule set. At the same time, a large number of artificial neural network (ANN) models with different hyperparameter configurations are created.
The best models are chosen, in line with the idea of survival of the fittest, and improved with the additional training data sets generated by the fuzzy rule simulation. It is shown that ANN models can be improved by adding fundamental knowledge represented by the discovered fuzzy rules. Such models can represent digitized machines as digital twins. The architecture and effectiveness of the digital twin are evaluated within an Industry 4.0 use case.
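The workflow described in the abstract can be outlined as a toy sketch: a fuzzy rule stands in for the extracted expert knowledge, augmented data is sampled from its feature space, and the best of many candidate models survive. All names, the triangular membership function, and the fitness measure are illustrative assumptions, not the paper's actual implementation.

```python
import random

def fuzzy_rule(x):
    """Toy stand-in for one rule of the expert fuzzy rule set:
    triangular membership 'y is high when x is around 0.5'."""
    return max(0.0, 1.0 - abs(x - 0.5) / 0.5)

def generate_augmented_data(n):
    """Explore the feature space of the fuzzy rule set to create
    additional (augmented) training samples."""
    return [(x, fuzzy_rule(x)) for x in (random.random() for _ in range(n))]

def make_model(hyperparams):
    """Stand-in for constructing an ANN with a given hyperparameter set."""
    return {"params": hyperparams, "score": None}

def evaluate(model, data):
    """Toy fitness: in a real setting this would be validation accuracy
    after training the ANN on the augmented data."""
    model["score"] = -abs(model["params"]["lr"] - 0.5)
    return model["score"]

random.seed(0)
data = generate_augmented_data(100)

# Create many candidate models with different hyperparameter configurations.
candidates = [
    make_model({"lr": random.random(), "layers": random.choice([1, 2, 3])})
    for _ in range(20)
]

# Survival of the fittest: keep only the best-scoring models,
# which would then be refined with the fuzzy-rule-generated data.
for m in candidates:
    evaluate(m, data)
survivors = sorted(candidates, key=lambda m: m["score"], reverse=True)[:3]
print(len(survivors))  # 3
```

The key design point is the feedback loop: the fuzzy rules both generate training data and implicitly define the fitness landscape that the hyperparameter search explores.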
Ensuring data quality is central to the digital transformation in industry. Business processes such as predictive maintenance or condition monitoring can be implemented or improved based on the available data. To guarantee high data quality, a single data validation system is usually used to validate the production data for further use. However, with a single system, an attacker needs only one successful attack to corrupt the whole system. We present a new approach in which a data validation system using multiple different validators minimizes the attacker's probability of success. The validators are arranged in clusters based on their properties. For each validation process, a challenge specifies which validators perform the current validation; results from all other validators are dropped. This ensures that anomalies can be detected during the validation process even if more than half of the validators are corrupted.
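The challenge-based selection can be sketched as follows. The cluster names, the hash-based selection, and the unanimous-agreement rule are assumptions made for illustration; the abstract does not specify these details. The point the sketch captures is that the challenge deterministically but unpredictably picks which validators count, so an attacker cannot know in advance which validators must be corrupted.

```python
import hashlib
import random

# Hypothetical validator pool, grouped into clusters by property.
validators = {
    "range_check": ["v1", "v2", "v3"],
    "schema_check": ["v4", "v5", "v6"],
    "statistical": ["v7", "v8", "v9"],
}

def challenge_selection(challenge, k=1):
    """Derive from the challenge which validators take part in the
    current validation; results from all others are dropped."""
    selected = []
    for cluster, members in sorted(validators.items()):
        # Seed a per-cluster RNG from the challenge so the choice is
        # deterministic for verifiers but unpredictable beforehand.
        seed = int(hashlib.sha256((challenge + cluster).encode()).hexdigest(), 16)
        selected.extend(random.Random(seed).sample(members, k))
    return selected

def validate(challenge, results):
    """Accept only if every challenge-selected validator reported the
    data as valid (a simple stand-in for the paper's decision rule)."""
    selected = challenge_selection(challenge)
    relevant = {v: results[v] for v in selected if v in results}
    return len(relevant) == len(selected) and all(relevant.values())

# Example: every validator reports 'valid' except one corrupted validator.
results = {v: True for members in validators.values() for v in members}
sel = challenge_selection("challenge-42")
print(sel, validate("challenge-42", results))
```

Because one validator is drawn per cluster, corrupting even a majority of the pool does not guarantee the attacker controls the validators the next challenge selects.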