Federated Learning Implementation for Privacy-Preserving AI Model Training Across Distributed Cloud Platforms

Authors

  • Zhang Kin Lin, Federated Learning Engineer, China. Author

Keywords

Federated Learning, Privacy-Preserving AI, Distributed Cloud, Differential Privacy, Secure Aggregation, Edge Computing, Model Training

Abstract

In response to increasing privacy concerns and the demand for scalable, decentralized machine learning (ML) models, federated learning (FL) has emerged as a promising paradigm. This paper explores the implementation of federated learning across distributed cloud platforms to enable privacy-preserving AI model training without centralized data storage. We present an architecture integrating secure model aggregation, differential privacy, and platform interoperability. Additionally, we review the literature to situate current developments, analyze system performance, and highlight security and scalability trade-offs. Our experimental simulations demonstrate that federated learning offers significant privacy guarantees while maintaining model accuracy competitive with centralized training.
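The abstract's core mechanism can be illustrated with a minimal sketch of FedAvg-style aggregation [1] combined with update clipping and Gaussian noise, a simple differential-privacy-style mechanism in the spirit of [4]. This is not the paper's implementation; the function names, the linear model, and all hyperparameters below are illustrative assumptions.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: plain gradient descent on a linear
    model (an illustrative stand-in for a deep network). Raw data (X, y)
    never leaves the client; only the trained weights are returned."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def fed_avg(global_w, client_weights, client_sizes, clip=10.0,
            noise_std=0.0, rng=None):
    """Server-side FedAvg: average the clients' weight deltas, weighted
    by local dataset size, after clipping each delta's L2 norm and
    (optionally) adding Gaussian noise for differential privacy."""
    rng = rng or np.random.default_rng(0)
    total = sum(client_sizes)
    agg = np.zeros_like(global_w)
    for w, n in zip(client_weights, client_sizes):
        delta = w - global_w
        norm = np.linalg.norm(delta)
        if norm > clip:                      # bound each client's influence
            delta = delta * (clip / norm)
        agg += (n / total) * delta
    agg += rng.normal(0.0, noise_std, size=agg.shape)  # DP noise (0 = off)
    return global_w + agg
```

With `noise_std=0` this reduces to plain FedAvg; raising it trades model accuracy for stronger privacy, which is the trade-off the paper's simulations examine.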

References

[1] McMahan, H. Brendan, et al. “Communication-Efficient Learning of Deep Networks from Decentralized Data.” Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR, 2017, pp. 1273–1282.

[2] Bonawitz, Keith, et al. “Towards Federated Learning at Scale: System Design.” Proceedings of the 2nd Conference on Machine Learning and Systems (MLSys), 2019, pp. 374–388.

[3] Kairouz, Peter, et al. “Advances and Open Problems in Federated Learning.” Foundations and Trends in Machine Learning, vol. 14, no. 1–2, 2021, pp. 1–210.

[4] Geyer, Robin C., Tassilo Klein, and Moin Nabi. “Differentially Private Federated Learning: A Client-Level Perspective.” Proceedings of the 31st Conference on Neural Information Processing Systems Workshops, 2018.

[5] Kamadi, S. “Identity-Driven Zero Trust Automation in GitOps: Policy-as-Code Enforcement for Secure Code Deployments.” International Journal of Scientific Research in Computer Science, Engineering and Information Technology, vol. 9, no. 3, 2023, pp. 893–902. https://doi.org/10.32628/CSEIT235148

[6] Hardy, Samuel, et al. “Private Federated Learning on Vertically Partitioned Data via Entity Resolution and Additively Homomorphic Encryption.” Proceedings of the 2017 ACM Workshop on Artificial Intelligence and Security, ACM, 2017, pp. 35–46.

[7] Li, Tian, et al. “Federated Learning: Challenges, Methods, and Future Directions.” IEEE Signal Processing Magazine, vol. 37, no. 3, 2020, pp. 50–60.

[8] Yang, Qiang, Yang Liu, Tianjian Chen, and Yongxin Tong. “Federated Machine Learning: Concept and Applications.” ACM Transactions on Intelligent Systems and Technology, vol. 10, no. 2, 2019, pp. 1–19.

[9] Sheller, Micah J., et al. “Federated Learning in Medicine: Facilitating Multi-Institutional Collaborations without Sharing Patient Data.” Scientific Reports, vol. 10, 2020, pp. 1–12.

[10] Truex, Stacey, et al. “A Hybrid Approach to Privacy-Preserving Federated Learning.” Proceedings of the 12th ACM Workshop on Artificial Intelligence and Security, ACM, 2019, pp. 1–11.

[11] Zhao, Yue, et al. “Federated Learning with Non-IID Data.” arXiv preprint, arXiv:1806.00582, 2018.

[12] Lalitha, Anusha, et al. “Fully Decentralized Federated Learning.” Proceedings of the IEEE Conference on Decision and Control, IEEE, 2019, pp. 1–6.

[13] Wang, Shiqiang, et al. “Adaptive Federated Learning in Resource Constrained Edge Computing Systems.” IEEE Journal on Selected Areas in Communications, vol. 37, no. 6, 2019, pp. 1205–1221.

Published

2024-04-08

How to Cite

Federated Learning Implementation for Privacy-Preserving AI Model Training Across Distributed Cloud Platforms. (2024). INTERNATIONAL JOURNAL OF ENGINEERING TRENDS AND TECHNOLOGY RESEARCH (IJETTR), 5(1), 1-8. https://ijettr.com/index.php/IJETTR/article/view/IJETTR_05_01_001