Approach | Technique | Main Idea
--- | --- | ---
Zhou and Tang [62] | SVRG-based Algorithm | Develop a novel differentially private distributed algorithm based on the stochastic variance reduced gradient (SVRG) technique.
Van Dijk et al. [64] | Asynchronous federated learning algorithm | Develop a novel algorithm that eliminates waiting times and reduces overall network communication. |
Girgis et al. [65] | CLDP-SGD | Develop a distributed communication-efficient and locally differentially private stochastic gradient descent algorithm along with a detailed analysis of its communication, privacy, and convergence tradeoffs. |
Zhang et al. [66] | Mechanism Design | Develop a federated learning scheme based on differential privacy and mechanism design under which high-quality clients are selected to improve the accuracy of the model. |
Denisov et al. [67] | Optimal Private Linear Operators | Develop an improved differentially private SGD method based on optimal private linear operators, yielding significant gains on user-level differentially private federated learning.
Lian et al. [68] | COFEL | Develop a novel federated learning system that reduces communication time through layer-based parameter selection and enhances privacy protection through local differential privacy.
Amiri et al. [69] | Universal Vector Quantization | Develop a novel algorithm for achieving differential privacy and reduced communication overhead through compression of client-server communication by quantization. |
Liu et al. [70] | FL-PFA | Develop a novel framework, named FL-PFA, that minimizes communication cost.
Zhang et al. [32] | Clipping | Develop a novel federated learning framework with a clipping technique.
Truex et al. [71] | Secure Multi-party Computation | Develop a novel approach that combines differential privacy with secure multi-party computation (SMC), enabling users to reduce the amount of noise that must be injected.
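Several of the techniques above (e.g., the clipping framework of Zhang et al. [32]) rest on the same basic mechanism: each client update is clipped to a bounded L2 norm, and noise calibrated to that bound is added during aggregation. The following is a minimal illustrative sketch of that generic mechanism, not the implementation of any cited paper; all function and parameter names are ours.

```python
import math
import random

def clip_update(update, clip_norm=1.0):
    # Scale the update so its L2 norm is at most clip_norm.
    norm = math.sqrt(sum(x * x for x in update))
    scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
    return [x * scale for x in update]

def private_average(updates, clip_norm=1.0, noise_mult=1.0, rng=None):
    # Clip each client update, sum the clipped updates, add Gaussian
    # noise with standard deviation noise_mult * clip_norm (the
    # sensitivity of the clipped sum), then average.
    rng = rng or random.Random(0)
    clipped = [clip_update(u, clip_norm) for u in updates]
    total = [sum(col) for col in zip(*clipped)]
    noisy = [t + rng.gauss(0.0, noise_mult * clip_norm) for t in total]
    return [v / len(updates) for v in noisy]
```

Because clipping bounds each client's contribution, the single noise draw added to the sum suffices to mask any individual update; the privacy level is then governed by the ratio `noise_mult` of noise scale to clipping norm.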