Abstract
Federated learning is a privacy-preserving approach to distributed machine learning. Despite its advantages, it suffers from catastrophic forgetting because the data on participating devices changes over time. Existing methods that address this issue in federated learning often do so at the cost of additional storage, which is impractical for many devices. This paper presents a federated incremental learning framework designed for resource-constrained devices. By combining a feature generator, knowledge distillation, and dynamic adaptive weight allocation, the framework mitigates catastrophic forgetting and accelerates model convergence, even on devices with limited storage. Experiments on the CIFAR-10 and CIFAR-100 datasets demonstrate significant improvements over existing baselines.
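The abstract describes the approach only at a high level. For concreteness, the following is a minimal, hypothetical PyTorch sketch (not the authors' implementation) of how a client-side update might combine supervision on current-task data, knowledge distillation against the previous global model, and replayed features from a lightweight generator instead of stored exemplars. All module names (`feat_generator`, `classifier_head`), loss weights, and the distillation temperature are illustrative assumptions; the server-side dynamic adaptive weight allocation is not shown.

```python
# Hypothetical sketch of a client-side update with distillation and feature replay.
# Names, hyperparameters, and interfaces are assumptions, not the paper's code.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Soft-label KL distillation; T is an assumed temperature."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)


def client_update(model, old_model, classifier_head, feat_generator,
                  loader, old_classes, device, lr=0.01,
                  lambda_kd=1.0, lambda_replay=1.0, replay_batch=32):
    """One local epoch on a resource-constrained client.

    `old_model` is the frozen previous global model (returns logits over old
    classes); `old_classes` is a 1-D LongTensor of previously seen class ids;
    `feat_generator(labels)` is assumed to map old-class labels to synthetic
    backbone features, so no raw exemplars need to be stored locally.
    """
    model.train()
    opt = torch.optim.SGD(
        list(model.parameters()) + list(classifier_head.parameters()), lr=lr)

    for x, y in loader:
        x, y = x.to(device), y.to(device)

        feats = model(x)                      # backbone features
        logits = classifier_head(feats)
        loss = F.cross_entropy(logits, y)     # current-task supervision

        # Distill responses of the frozen previous model on old-class outputs.
        with torch.no_grad():
            old_logits = old_model(x)
        loss = loss + lambda_kd * distillation_loss(
            logits[:, :old_logits.size(1)], old_logits)

        # Replay synthetic features of old classes instead of stored images.
        replay_y = old_classes[
            torch.randint(len(old_classes), (replay_batch,))].to(device)
        replay_feats = feat_generator(replay_y)
        replay_logits = classifier_head(replay_feats)
        loss = loss + lambda_replay * F.cross_entropy(replay_logits, replay_y)

        opt.zero_grad()
        loss.backward()
        opt.step()

    return model.state_dict(), classifier_head.state_dict()
```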
Acknowledgments
This work is supported by the General Program of the National Natural Science Foundation of China under Grant 62372050 and by the Beijing Natural Science Foundation of China under Grant 4232029.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Gao, Z., Zhang, J., Yu, X. (2024). Addressing Catastrophic Forgetting in Federated Learning on Resource-Constrained Devices: A Feature Replay Approach. In: Huang, D.-S., Zhang, X., Zhang, C. (eds) Advanced Intelligent Computing Technology and Applications. ICIC 2024. Lecture Notes in Computer Science, vol 14879. Springer, Singapore. https://doi.org/10.1007/978-981-97-5675-9_29
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-5674-2
Online ISBN: 978-981-97-5675-9