Distributed Outsourced Privacy-Preserving Gradient Descent Methods among Multiple Parties


Abstract


The Internet of Things (IoT) is one of the latest evolutions of the internet. Cloud computing is an important technique that meets the computational demands of widely distributed IoT devices/sensors by employing various machine learning models. Gradient descent methods are widely used to find the optimal coefficients of a machine learning model in the cloud. Commonly, the data are distributed among multiple data owners, whereas the target function is held by the model owner. The model owner can train its model over the data owners' data and provide predictions. However, the confidentiality of the dataset or of the target function may not be preserved during these computations, so security threats and privacy risks arise. To address these data and model privacy concerns, we present two new outsourced privacy-preserving gradient descent (OPPGD) schemes, over horizontally and vertically partitioned data among multiple parties, respectively. Compared to previously proposed solutions, our methods are more comprehensive and apply in a more general setting. Both data privacy and model privacy are preserved throughout the learning and prediction procedures. In addition, the performance evaluation demonstrates that our schemes help the model owner optimize its target function and provide exact predictions with high efficiency and accuracy.
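To make the setting concrete, the sketch below shows plain, non-private distributed gradient descent over horizontally partitioned data, i.e., the computation that the OPPGD schemes are designed to protect. It is a minimal illustration under assumed details: the linear model with squared loss and the function names are hypothetical choices for the example, and the cryptographic protections described in the paper are deliberately omitted.

import numpy as np

# Horizontal partitioning: each data owner holds a subset of the ROWS
# (complete feature vectors for some records); the model owner holds
# the coefficients w of a linear target function f(x) = w . x.

def local_gradient(X_i, y_i, w):
    # Gradient of the mean squared loss on one data owner's partition.
    residual = X_i @ w - y_i
    return X_i.T @ residual / len(y_i)

def distributed_gradient_descent(partitions, w, lr=0.1, steps=100):
    # Plain (non-private) distributed GD: the model owner combines the
    # per-party gradients, weighted by partition size, and updates w.
    # In the OPPGD setting these exchanges would be protected so that
    # neither the parties' data nor the model owner's w is revealed.
    n_total = sum(len(y_i) for _, y_i in partitions)
    for _ in range(steps):
        grad = sum(len(y_i) / n_total * local_gradient(X_i, y_i, w)
                   for X_i, y_i in partitions)
        w = w - lr * grad
    return w

# Toy usage: three data owners, each holding rows from the same feature space.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
partitions = []
for _ in range(3):
    X_i = rng.normal(size=(40, 2))
    y_i = X_i @ w_true + 0.01 * rng.normal(size=40)
    partitions.append((X_i, y_i))

w = distributed_gradient_descent(partitions, w=np.zeros(2))
print(w)  # converges toward w_true

Under vertical partitioning, by contrast, each party holds a subset of the columns (features) for every record, so the parties must cooperate even to evaluate the target function on a single record; the second scheme addresses this case.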

Volume 2021
Pages 8876893:1-8876893:16
DOI 10.1155/2021/8876893
Language English
Journal Secur. Commun. Networks
