Beyond Catastrophic Forgetting in Continual Learning: An Attempt with SVM

Item type

Conference Contribution

Publisher

The International Conference on Machine Learning (ICML)

Abstract

A major challenge in continual learning is avoiding catastrophic forgetting of previously learned tasks. The possibility of improving existing knowledge whilst integrating new information has been explored far less. In this paper we describe a method that aims to improve the performance of previously learned tasks by refining their knowledge as new tasks are observed. Our method demonstrates this ability in the context of Support Vector Machines for binary classification, encouraging retention of existing knowledge whilst it is refined. Experiments with synthetic and real-world datasets show that performance on these tasks can be continually improved by transferring selected knowledge, leading to an improvement in the performance of the learning system as a whole.
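
As a rough illustration of the idea outlined in the abstract, the sketch below shows one way a continual binary SVM could retain and refine knowledge across tasks: the support vectors of all tasks seen so far are kept as a compact memory and refit together with each newly observed task. This is a minimal sketch under stated assumptions (the ContinualSVM class, its support-vector memory strategy, and the synthetic tasks are hypothetical), not the method proposed in the paper.

# Hypothetical sketch (not the paper's algorithm): a binary SVM that keeps
# the support vectors of previously learned tasks as a compact memory and
# refits them together with each new task, so earlier knowledge can be
# retained and refined rather than forgotten.
import numpy as np
from sklearn.svm import SVC

class ContinualSVM:
    def __init__(self, **svm_kwargs):
        self.svm_kwargs = svm_kwargs
        self.memory_X = None  # retained support vectors from past tasks
        self.memory_y = None
        self.model = None

    def observe_task(self, X, y):
        # Train on the new task's data together with the retained memory.
        if self.memory_X is not None:
            X = np.vstack([self.memory_X, X])
            y = np.concatenate([self.memory_y, y])
        self.model = SVC(**self.svm_kwargs).fit(X, y)
        # Keep only the support vectors as the refined knowledge of all
        # tasks observed so far.
        self.memory_X = X[self.model.support_]
        self.memory_y = y[self.model.support_]

    def predict(self, X):
        return self.model.predict(X)

# Two synthetic binary tasks observed sequentially.
rng = np.random.default_rng(0)
learner = ContinualSVM(kernel="rbf", C=1.0)
for shift in (0.0, 2.0):
    X = rng.normal(loc=shift, scale=1.0, size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    learner.observe_task(X, y)
print(learner.predict(rng.normal(size=(5, 2))))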

Source

Proceedings of the 37th International Conference on Machine Learning, Vienna, Austria, PMLR 108, 2020.

Rights statement

Copyright 2020 by the author(s).