Keywords
- Distributed data validation network (2)
- Industry 4.0 (2)
- Anomaly detection (1)
- Authorization (1)
- Big data (1)
- Cloud-edge computing (1)
- Cluster-based data validation (1)
- Context-Awareness (1)
- Data validation (1)
- Distributed DNN (1)
- Edge security (1)
- Factory of the future (1)
- Industrial internet of things (1)
- Industrie 4.0 (1)
- Internet of things (1)
- Security certificate (1)
- Security (1)
Machine learning applications such as machine condition monitoring and predictive maintenance are becoming state of the art in Industry 4.0. Decision trees are one of many machine learning algorithms used in the decision-making process. A new approach for creating distributed decision trees, called node-based parallelization, is presented. It allows data to be classified through a network of industrial devices, each of which is responsible for a single classification rule. Nodes that react incorrectly, for example due to an attack, are also taken into account using a variety of methods to keep the decision-making process correct and robust.
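The abstract gives no implementation details, but the core idea of node-based parallelization can be sketched as a chain of nodes where each one holds a single rule and forwards the sample onward. All names, features, and thresholds below are illustrative assumptions, not taken from the paper:

```python
class RuleNode:
    """One industrial device holding a single classification rule (illustrative)."""

    def __init__(self, feature, threshold, left, right):
        self.feature = feature      # index of the feature this node tests
        self.threshold = threshold  # the node's single rule: feature < threshold?
        self.left = left            # next node (or class label) if the rule holds
        self.right = right          # next node (or class label) otherwise

    def classify(self, sample):
        branch = self.left if sample[self.feature] < self.threshold else self.right
        # a leaf is a plain label; an inner node forwards the sample onward
        return branch.classify(sample) if isinstance(branch, RuleNode) else branch


# Tiny two-rule tree: temperature check, then vibration check (made-up values)
tree = RuleNode(0, 50.0, "ok", RuleNode(1, 0.8, "warn", "fail"))
print(tree.classify([42.0, 0.1]))  # -> ok
print(tree.classify([60.0, 0.9]))  # -> fail
```

In the distributed setting described by the abstract, each `RuleNode` would live on a separate device and `classify` would be a network call rather than a local method invocation.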
In Industry 4.0, machine learning approaches are state of the art for predictive maintenance, machine condition monitoring, and other applications. Distributed decision trees are one of the learning algorithms used for such tasks. A new node-based parallelization approach for their construction is presented, which allows data to be classified through a network of nodes. Attacks on the nodes are discussed based on different attack scenarios, and attack classifications are presented. A thorough analysis of protection measures is given to ensure that classification is not maliciously modified by an attacker. Different countermeasures are proposed and analyzed. A quorum-based system provides a good balance between computational overhead and the robustness of the algorithm.
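The quorum-based countermeasure mentioned here can be understood as redundant nodes voting on each decision, so that a minority of corrupted nodes cannot flip the outcome. A minimal majority-vote sketch, under the assumption that ties are treated as inconclusive (the paper's exact tie-breaking rule is not stated in the abstract):

```python
from collections import Counter

def quorum_decision(votes):
    """Majority vote over redundant node outputs; ties count as inconclusive."""
    counts = Counter(votes).most_common(2)
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return None  # no majority -> flag for inspection instead of guessing
    return counts[0][0]

# One corrupted replica out of five cannot change the decision
print(quorum_decision(["ok", "ok", "fail", "ok", "ok"]))  # -> ok
```

The trade-off the abstract points to is visible here: a larger quorum tolerates more corrupted replicas but multiplies the computation per decision.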
Distributed machine learning algorithms that employ Deep Neural Networks (DNNs) are widely used in Industry 4.0 applications, such as smart manufacturing. The layers of a DNN can be mapped onto different nodes located in the cloud, edge and shop floor for preserving privacy. The quality of the data that is fed into and processed through the DNN is of utmost importance for critical tasks, such as inspection and quality control. Distributed Data Validation Networks (DDVNs) are used to validate the quality of the data. However, they are prone to single points of failure when an attack occurs. This paper proposes QUDOS, an approach that enhances the security of a distributed DNN that is supported by DDVNs using quorums. The proposed approach allows individual nodes that are corrupted due to an attack to be detected or excluded when the DNN produces an output. Metrics such as corruption factor and success probability of an attack are considered for evaluating the security aspects of DNNs. A simulation study demonstrates that if the number of corrupted nodes is less than a given threshold for decision-making in a quorum, the QUDOS approach always prevents attacks. Furthermore, the study shows that increasing the size of the quorum has a better impact on security than increasing the number of layers. One merit of QUDOS is that it enhances the security of DNNs without requiring any modifications to the algorithm and can therefore be applied to other classes of problems.
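The simulation result stated in this abstract (attacks always fail while the number of corrupted nodes stays below the quorum's decision threshold) can be reproduced with a small Monte Carlo sketch. The model below is an assumption for illustration, not the paper's actual evaluation code:

```python
import random

def attack_success_prob(n_nodes, n_corrupt, quorum_size, threshold, trials=10000):
    """Monte Carlo estimate: fraction of randomly drawn quorums in which the
    number of corrupted members reaches the decision threshold."""
    nodes = [True] * n_corrupt + [False] * (n_nodes - n_corrupt)  # True = corrupted
    hits = 0
    for _ in range(trials):
        quorum = random.sample(nodes, quorum_size)
        if sum(quorum) >= threshold:  # corrupted members can sway the decision
            hits += 1
    return hits / trials

# With only 2 corrupted nodes and a threshold of 3, no quorum can be swayed
print(attack_success_prob(n_nodes=12, n_corrupt=2, quorum_size=5, threshold=3))  # -> 0.0
```

Because a quorum of 5 drawn from these 12 nodes can contain at most 2 corrupted members, the threshold of 3 is never reached, matching the abstract's claim that QUDOS always prevents attacks below the threshold.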
Ensuring data quality is central to the digital transformation in industry. Business processes such as predictive maintenance or condition monitoring can be implemented or improved based on the available data. To guarantee high data quality, a single data validation system is usually used to validate production data for further use. However, with a single system an attacker needs only one successful attack to corrupt the whole system. We present a new approach in which a data validation system consisting of multiple different validators minimizes the attacker's probability of success. The validators are arranged in clusters based on their properties. For each validation process, a challenge is issued that specifies which validators should perform the current validation; results from all other validators are dropped. This ensures that anomalies can be detected during the validation process even if more than half of the validators are corrupted.
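The challenge mechanism described here can be sketched as a deterministic, challenge-seeded selection of validators from each cluster, with results from non-selected validators filtered out. The cluster names, validator IDs, and the use of a hash as the selection seed are all assumptions for illustration:

```python
import hashlib
import random

# Hypothetical validator clusters, grouped by validation property
clusters = {
    "range_checks":   ["v1", "v2", "v3"],
    "schema_checks":  ["v4", "v5", "v6"],
    "anomaly_checks": ["v7", "v8", "v9"],
}

def challenged_validators(challenge, per_cluster=1):
    """Deterministically pick validators per cluster, seeded by the challenge."""
    selected = set()
    for name, members in sorted(clusters.items()):
        seed = int(hashlib.sha256((challenge + name).encode()).hexdigest(), 16)
        selected.update(random.Random(seed).sample(members, per_cluster))
    return selected

def filter_results(challenge, results):
    """Drop validation results from validators the challenge did not name."""
    allowed = challenged_validators(challenge)
    return {v: r for v, r in results.items() if v in allowed}
```

Since an attacker cannot predict which validators a future challenge will select, corrupting even a majority of validators does not guarantee that the corrupted ones are the ones whose results count, which is the intuition behind the abstract's robustness claim.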