
Dual Supervised Contrastive Learning Based on Perturbation Uncertainty for Online Class Incremental Learning

  • Conference paper
  • In: Pattern Recognition (ICPR 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 15309)


Abstract

To keep learning from a data stream whose distribution changes over time, continual learning has attracted much interest recently. Among its various settings, online class-incremental learning (OCIL) is more realistic and challenging, since each sample can be used only once. By employing a buffer to store a few old samples, replay-based methods have achieved great success and currently dominate this area. Due to the single-pass nature of OCIL, retrieving high-value samples from memory is very important. In most current works, the logits from the last fully connected (FC) layer are used to estimate the value of samples. However, the imbalance between the numbers of samples for old and new classes severely biases the FC layer, which results in inaccurate estimates. Moreover, this bias also causes abrupt feature change. To address these problems, we propose a dual supervised contrastive learning method based on perturbation uncertainty. Specifically, we retrieve samples that have not been learned adequately, as measured by perturbation uncertainty; retraining on such samples helps the model learn robust features. We then combine two types of supervised contrastive loss to replace the cross-entropy loss, which further enhances feature robustness and alleviates abrupt feature change. Extensive experiments on three popular datasets demonstrate that our method surpasses several recently published works.

S. Su and Z. Chen contributed equally to this work.
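The two mechanisms named in the abstract, perturbation-uncertainty-based memory retrieval and a supervised contrastive objective, can be pictured with a short sketch. The PyTorch code below is a minimal illustration and not the authors' implementation: the Gaussian perturbation, the variance-based uncertainty score, the function names, and all hyperparameters are assumptions made for exposition, and `supcon_loss` follows the standard supervised contrastive formulation rather than the paper's dual variant.

```python
import torch
import torch.nn.functional as F

def retrieve_by_perturbation_uncertainty(model, buf_x, buf_y, k, n_pert=4, sigma=0.05):
    """Pick the k buffered samples whose predictions vary most under perturbation."""
    model.eval()
    with torch.no_grad():
        probs = []
        for _ in range(n_pert):
            noisy = buf_x + sigma * torch.randn_like(buf_x)   # perturbed copies
            probs.append(F.softmax(model(noisy), dim=1))
        probs = torch.stack(probs)                   # (n_pert, B, num_classes)
        uncertainty = probs.var(dim=0).sum(dim=1)    # per-sample prediction variance
    idx = uncertainty.topk(min(k, len(buf_y))).indices  # least adequately learned
    return buf_x[idx], buf_y[idx]

def supcon_loss(features, labels, temperature=0.1):
    """Standard supervised contrastive loss over L2-normalized embeddings."""
    f = F.normalize(features, dim=1)
    sim = f @ f.t() / temperature                          # pairwise similarities
    eye = torch.eye(len(labels), dtype=torch.bool, device=f.device)
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye  # same-class pairs
    sim = sim.masked_fill(eye, float('-inf'))              # exclude self-similarity
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
    pos_counts = pos.sum(dim=1)
    valid = pos_counts > 0                                 # anchors with a positive
    mean_pos = log_prob.masked_fill(~pos, 0.0).sum(dim=1)[valid] / pos_counts[valid]
    return -mean_pos.mean()
```

In a replay step, one would draw a candidate batch from the buffer, keep the k most uncertain samples under this score, and train on them jointly with the incoming stream batch using the contrastive objective in place of cross-entropy.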



Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (No. 62376218, No. 62101453), the Guangdong Basic and Applied Basic Research Foundation (No. 2023A1515011298), and the Natural Science Basic Research Program of Shaanxi (No. 2022JC-DW-08).

Author information


Corresponding author

Correspondence to Guoqiang Liang.


Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Su, S., Chen, Z., Liang, G., Zhang, S., Zhang, Y. (2025). Dual Supervised Contrastive Learning Based on Perturbation Uncertainty for Online Class Incremental Learning. In: Antonacopoulos, A., Chaudhuri, S., Chellappa, R., Liu, CL., Bhattacharya, S., Pal, U. (eds) Pattern Recognition. ICPR 2024. Lecture Notes in Computer Science, vol 15309. Springer, Cham. https://doi.org/10.1007/978-3-031-78189-6_3


  • DOI: https://doi.org/10.1007/978-3-031-78189-6_3


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-78188-9

  • Online ISBN: 978-3-031-78189-6

  • eBook Packages: Computer Science; Computer Science (R0)
