Protect important weights
Constrain changes in weights carrying high Fisher information. Simple, general, and easy to implement.
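This recipe matches Elastic Weight Consolidation (EWC): estimate a diagonal Fisher on the old task, then penalize movement away from the old weights in proportion to it. A minimal numpy sketch, with illustrative function names not taken from any particular library:

```python
import numpy as np

def diag_fisher(grads):
    # Diagonal Fisher approximation: mean of squared per-sample
    # gradients of the log-likelihood on the previous task's data.
    return np.mean(np.square(grads), axis=0)

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    # Quadratic penalty added to the new task's loss:
    # (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2
    # theta_star are the weights at the end of the previous task.
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)
```

During training on the new task, the total loss is the new-task loss plus this penalty; weights with high Fisher values are held near their old values, while low-Fisher weights remain free to move.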
A model learns from a stream of tasks or data, without retraining from scratch each time the world changes.
Most current approaches still require substantial human preparation before new data can be introduced.
The goal is not only to improve a score.
It is to understand, almost mechanistically, what happens inside a learning model when it remembers, adapts, or forgets.
Continual learning lives in the tension between these two forces: stability, which preserves what was already learned, and plasticity, which lets the model adapt to what is new.
A strong contemporary reference point and state of the art at the time of writing.
All benchmarks were evaluated in a class-incremental setting.
For CIFAR-10, CIFAR-100, and CORe50, learning was performed on embeddings from an ImageNet-pretrained ResNet50 rather than on raw images; this is a commonly used protocol.
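In this protocol the backbone stays frozen and only a lightweight classifier learns incrementally over the fixed embeddings. As a minimal stand-in (with synthetic embeddings; the class name and API are illustrative), a nearest-class-mean classifier that absorbs new classes task by task:

```python
import numpy as np

class NCMClassifier:
    """Nearest-class-mean classifier over fixed embeddings.
    Each call to partial_fit can introduce new classes, mimicking
    the class-incremental setting on frozen backbone features."""
    def __init__(self):
        self.means = {}    # class id -> running mean embedding
        self.counts = {}   # class id -> number of samples seen

    def partial_fit(self, X, y):
        for xi, yi in zip(X, y):
            if yi not in self.means:
                self.means[yi] = np.zeros_like(xi, dtype=float)
                self.counts[yi] = 0
            self.counts[yi] += 1
            # Incremental (running) mean update.
            self.means[yi] += (xi - self.means[yi]) / self.counts[yi]

    def predict(self, X):
        labels = list(self.means)
        protos = np.stack([self.means[c] for c in labels])
        # Assign each embedding to the nearest class prototype.
        d = np.linalg.norm(X[:, None, :] - protos[None], axis=-1)
        return np.array([labels[i] for i in d.argmin(axis=1)])
```

Because the prototypes of old classes are never overwritten by new tasks, this baseline does not forget; methods trained end-to-end on the same embeddings are usually compared against it.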
A SOM (self-organizing map) is a grid of prototypes that organizes itself topologically.
Similar inputs activate nearby regions, so the map becomes a visible representation of structure.
What if every neuron had its own λ and σ?