Privacy Shield: A System for Edge Computing Using Asynchronous Federated Learning


Overview

Bibliographic Details
Main Authors: Khalid, Adnan, Aziz, Zeeshan, Fathi, Mohamad Syazli
Format: Article
Language: English
Published: Hindawi Limited 2022
Subjects:
Online Access: http://eprints.utm.my/103970/1/MohamadSyazliFathi2022_PrivacyShieldaSystemforEdge.pdf
http://eprints.utm.my/103970/
http://dx.doi.org/10.1155/2022/7465640
Description
Summary: Due to the increase in IoT devices, the volume of data produced every day is also growing rapidly. This growth means that more processing and more computation are required without delay, which introduced a new horizon of computing infrastructure: edge computing. Edge computing gained prominence as a solution to the delayed transmission, processing, and response of cloud architectures, and it was further augmented by the field of artificial intelligence. It has become a research topic with the preservation of data privacy as the focal point. This paper presents Privacy Shield, a system for edge computing using asynchronous federated learning, in which multiple edge nodes perform federated learning while keeping their private data hidden from one another. In contrast to pre-existing distributed learning, the proposed system reduces the calls between the edge nodes and the main server during training while ensuring that there is no negative impact on the accuracy of the model. We used different values of Ω that affect the accuracy and the compression ratio. In the interval Ω = [0.2, 0.4], the C-ratio increases, and the value of Ω and the compression ratio are directly proportional regardless of fluctuations. We analyzed the accuracy of the model, which increases along with the compression ratio. Compressing the gradient communications reduces the likelihood of the data being attacked. As the nodes train asynchronously on limited and different kinds of samples, a binary weights adjustment method was used to handle the resulting "unbalanced" learning. The MNIST and CIFAR-10 datasets were used for testing the two tasks, respectively.
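The summary describes compressing gradient communications under a parameter Ω, where a larger Ω yields a higher compression ratio (fewer values transmitted per gradient). The abstract does not give the exact algorithm, so the following is only a minimal sketch of one plausible scheme: threshold-based gradient sparsification, where a node sends only the components whose magnitude is at least Ω times the largest component. The function names (`compress_gradient`, `decompress`, `compression_ratio`) are illustrative, not taken from the paper.

```python
import random


def compress_gradient(grad, omega):
    """Sparsify a gradient vector: keep only components whose magnitude
    is at least omega times the largest magnitude, and send them as
    (index, value) pairs. Illustrative only -- the paper's exact use
    of omega is not specified in the abstract."""
    max_mag = max(abs(g) for g in grad)
    threshold = omega * max_mag
    return [(i, g) for i, g in enumerate(grad) if abs(g) >= threshold]


def decompress(sparse, length):
    """Rebuild a dense gradient on the server, filling dropped
    components with zero."""
    dense = [0.0] * length
    for i, v in sparse:
        dense[i] = v
    return dense


def compression_ratio(sparse, length):
    """C-ratio as original size over transmitted size: a larger omega
    drops more components, so the ratio rises with omega."""
    return length / max(1, len(sparse))


if __name__ == "__main__":
    random.seed(0)
    grad = [random.gauss(0.0, 1.0) for _ in range(100)]
    for omega in (0.2, 0.3, 0.4):
        sparse = compress_gradient(grad, omega)
        print(omega, compression_ratio(sparse, len(grad)))
```

Under this sketch, only the compressed (index, value) pairs cross the node-to-server link, which both cuts communication and exposes less of the raw gradient, consistent with the privacy argument in the summary.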