A Privacy-Preserving Federated Learning Framework with Multiparty Threshold Homomorphic Encryption


Federated learning enables collaborative model training across multiple decentralized devices, reducing data transfer overhead and improving privacy by keeping raw data local. However, it remains susceptible to inference attacks and potential data leakage. To strengthen privacy guarantees, especially in sensitive domains, advanced privacy-preserving techniques such as homomorphic encryption are recommended. This work proposes a privacy-preserving federated learning framework that integrates threshold homomorphic encryption, a cryptographic technique well suited to multiuser settings such as federated learning, into the training pipeline to enable secure aggregation and protect intermediate computations. We use the Cheon-Kim-Kim-Song (CKKS) scheme, as implemented in the OpenFHE library, and extend the standard Federated Averaging (FedAvg) algorithm by homomorphically encrypting model updates and performing aggregation directly on the encrypted data. To assess the trade-off between efficiency and security, we evaluate the proposed method against a baseline. The design accounts for practical constraints, including computational efficiency, making it suitable for deployment in privacy-sensitive domains such as healthcare and finance. To ensure compatibility with continuous integration and deployment (CI/CD) pipelines, all components of the solution are containerized using Docker.
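The core idea of encrypted aggregation, combining ciphertexts of client updates so the server never handles plaintext parameters, can be sketched with a toy additively homomorphic scheme. The sketch below is an illustrative assumption, not the authors' implementation: it substitutes a deliberately insecure, small-parameter textbook Paillier cryptosystem for the threshold CKKS/OpenFHE setup described above (CKKS additionally packs real-valued vectors and supports distributed decryption among parties), and all variable names and parameters are hypothetical.

```python
import random
from math import gcd

# Toy additively homomorphic encryption (textbook Paillier with tiny,
# deliberately insecure parameters). Stand-in for the paper's threshold
# CKKS scheme, which also handles real-valued vectors and lets a
# quorum of parties jointly decrypt.

def keygen():
    p, q = 10007, 10009                    # small fixed primes, demo only
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    mu = pow(lam, -1, n)                   # valid because g = n + 1
    return (n,), (lam, mu, n)

def encrypt(pk, m):
    (n,) = pk
    n2 = n * n
    r = random.randrange(1, n)
    while gcd(r, n) != 1:                  # r must be invertible mod n^2
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    lam, mu, n = sk
    L = (pow(c, lam, n * n) - 1) // n      # L(x) = (x - 1) / n
    return (L * mu) % n

pk, sk = keygen()

# Each client encrypts its (integer-scaled) model update; the server
# multiplies ciphertexts, which adds the underlying plaintexts.
client_updates = [120, 90, 150]
n2 = pk[0] ** 2
aggregate_ct = 1
for u in client_updates:
    aggregate_ct = (aggregate_ct * encrypt(pk, u)) % n2

total = decrypt(sk, aggregate_ct)          # 120 + 90 + 150 = 360
fedavg_mean = total / len(client_updates)  # FedAvg-style mean of updates
print(total, fedavg_mean)
```

In the actual framework the division by the number of clients would also be performed homomorphically (or after threshold decryption), and CKKS would carry whole parameter vectors per ciphertext rather than one integer.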