In recent years, how to effectively protect users' private data on the blockchain has become a key problem in blockchain technology. To address it, a secure multi-party computation protocol based on Pedersen commitments and the Schnorr protocol (protocol of blockchain based on Pedersen commitment linked Schnorr protocol for multi-party computation, BPLSM) is designed. By constructing the protocol architecture and carrying out formal proofs, it is shown that the protocol can be integrated into a blockchain network and can merge different private messages and sign them efficiently while preserving anonymity. The properties and security of the protocol are then analyzed, showing that a general-purpose privacy-computation scheme applying BPLSM on the blockchain incurs low computational overhead and provides good information concealment. Finally, simulation experiments show that, in small-scale multi-party computation with a fixed number of participants, the signature-verification time cost of BPLSM is about 83.5% lower than that of the mainstream BLS signature.
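The abstract names Pedersen commitments and Schnorr-style proofs as the building blocks of BPLSM. As a rough illustration of those two primitives only, not of the BPLSM construction itself, the following Python sketch uses toy group parameters; the values p, q, g, h and all function and variable names are illustrative assumptions rather than anything specified by the paper.

```python
# Toy sketch of the two primitives named in the abstract: a Pedersen
# commitment and a Schnorr proof of knowledge (Fiat-Shamir variant).
# Parameters are tiny illustrative values, not those used by BPLSM.
import hashlib
import secrets

p, q = 23, 11          # p = 2q + 1; the order-q subgroup carries the values
g, h = 4, 9            # two generators of that subgroup (toy choice)

def pedersen_commit(m, r):
    """C = g^m * h^r mod p hides the message m; r is the blinding factor."""
    return (pow(g, m, p) * pow(h, r, p)) % p

def schnorr_prove(x):
    """Prove knowledge of x with y = g^x without revealing x."""
    y = pow(g, x, p)
    k = secrets.randbelow(q)
    t = pow(g, k, p)                                                  # commitment
    c = int(hashlib.sha256(f"{t}{y}".encode()).hexdigest(), 16) % q  # challenge
    s = (k + c * x) % q                                               # response
    return y, t, s

def schnorr_verify(y, t, s):
    """Check g^s == t * y^c, which holds iff the prover knew x."""
    c = int(hashlib.sha256(f"{t}{y}".encode()).hexdigest(), 16) % q
    return pow(g, s, p) == (t * pow(y, c, p)) % p

C = pedersen_commit(m=7, r=secrets.randbelow(q))   # commitment to the value 7
y, t, s = schnorr_prove(x=5)
print(C, schnorr_verify(y, t, s))                  # proof verifies as True
```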
Federated learning is a distributed machine learning technique that trains a global model by exchanging model parameters or intermediate results among multiple data sources. Although federated learning achieves physical isolation of data, the local data of federated learning clients are still at risk of leakage under attack by malicious parties. For this reason, combining data protection techniques (e.g., differential privacy) with federated learning is an effective way to further improve the data security of federated learning models. In this survey, we review recent advances in the research of differentially-private federated learning models. First, we introduce the workflow of federated learning and the theoretical basis of differential privacy. Then, we review three differentially-private federated learning paradigms: central differential privacy, local differential privacy, and distributed differential privacy. After this, we review the algorithmic optimization and communication cost optimization of federated learning models with differential privacy. Finally, we review the applications of federated learning models with differential privacy in various domains. By systematically summarizing the existing research, we propose future research opportunities.
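As an illustration of the central differential privacy paradigm mentioned above, the sketch below shows one round of federated averaging in which each client update is norm-clipped and the server adds Gaussian noise to the aggregate. The function names and hyperparameters (clip_norm, noise_multiplier) are illustrative assumptions, not values taken from the survey.

```python
# Minimal sketch of central differential privacy in federated averaging:
# a trusted server clips each client's update, averages them, and adds
# Gaussian noise calibrated to the per-client sensitivity.
import numpy as np

def clip_update(update, clip_norm=1.0):
    """Bound each client's contribution so the sensitivity is known."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / norm)

def dp_fedavg_round(client_updates, clip_norm=1.0, noise_multiplier=1.1):
    """Average clipped updates, then add noise scaled to clip_norm / n."""
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    mean = np.mean(clipped, axis=0)
    sigma = noise_multiplier * clip_norm / len(client_updates)
    return mean + np.random.normal(0.0, sigma, size=mean.shape)

# Example: three simulated client updates for a 4-parameter model
updates = [np.random.randn(4) for _ in range(3)]
print(dp_fedavg_round(updates))
```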