- It uses the same rehearsal setting as iCaRL. Because incremental learning reuses an existing model to learn from new data, it tends to produce models that are biased toward either the old or the new classes.
- This creates an information asymmetry, which degrades performance through catastrophic forgetting. To solve this problem, the paper proposes a new class-incremental learning paradigm called Deep Model Consolidation (DMC); the method is described in more detail in the original paper.
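At a high level, DMC first trains a separate model on the new classes and then consolidates the old-class and new-class models into a single network by regressing its logits onto theirs over unlabeled auxiliary data. Below is a minimal PyTorch sketch of that double-distillation objective, assuming the paper's mean-centered L2 formulation; the function name and shapes are illustrative, and DMC.ipynb is the authoritative implementation.

```python
import torch
import torch.nn.functional as F

def double_distillation_loss(student_logits, old_logits, new_logits):
    """Illustrative double-distillation objective for model consolidation.

    student_logits: (B, n_old + n_new) logits of the consolidated model
    old_logits:     (B, n_old) logits of the frozen old-class model
    new_logits:     (B, n_new) logits of the frozen new-class model
    """
    # Mean-center each teacher's logits so their output scales are comparable.
    old_target = old_logits - old_logits.mean(dim=1, keepdim=True)
    new_target = new_logits - new_logits.mean(dim=1, keepdim=True)
    # Concatenate the targets in class order and regress with an L2 loss.
    target = torch.cat([old_target, new_target], dim=1).detach()
    return F.mse_loss(student_logits, target)
```

During consolidation the two teacher models stay frozen (their logits computed under torch.no_grad()) while only the student is updated on batches of the auxiliary data.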
To run the code, follow these steps:
- Download CIFAR-100 and the ImageNet32 test set into the proper folders of the provided folder structure (see the download sketch after this list).
- Run data_prep.ipynb
- Choose the configuration you wish to run from configuration_setup.txt and copy the provided parameters into the appropriate code cell in DMC.ipynb.
- Run DMC.ipynb
- Upon completion, run Graphing_solutions.ipynb.
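For the first step, CIFAR-100 can be fetched programmatically, while ImageNet32 must be downloaded manually. A minimal sketch, assuming torchvision is installed and using illustrative paths; adjust the roots to match the provided folder structure:

```python
import torchvision

# CIFAR-100 downloads automatically; the root paths below are illustrative.
torchvision.datasets.CIFAR100(root="data/cifar100", train=True, download=True)
torchvision.datasets.CIFAR100(root="data/cifar100", train=False, download=True)

# ImageNet32 (downsampled ImageNet) is not distributed through torchvision:
# download it from https://image-net.org/download-images and place the
# extracted files in the corresponding folder before running data_prep.ipynb.
```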
Note 1. If you wish to test a different depth of residual network, code is provided to do so on the entire CIFAR-100 dataset: just change the resnet_type parameter and run Resnet32.ipynb.
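CIFAR-style residual networks have depths of the form 6n + 2, so 20, 32, 44, 56, and 110 are the usual choices. A quick illustrative check, assuming resnet_type holds the network depth as in the notebook:

```python
resnet_type = 56  # e.g., switch from ResNet-32 to ResNet-56 (illustrative value)

# CIFAR ResNets stack (depth - 2) / 6 basic blocks in each of three stages.
assert (resnet_type - 2) % 6 == 0, "CIFAR ResNet depth must satisfy 6n + 2"
blocks_per_stage = (resnet_type - 2) // 6
```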
Note 2. Every Jupyter notebook provided is meant to be run on the Google Colab platform.
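If you want datasets and checkpoints to survive across Colab sessions, one common (optional) pattern is to mount Google Drive at the top of each notebook; this is an assumption about your workflow, not a requirement of the notebooks:

```python
# Optional: persist data and outputs across Colab sessions via Google Drive.
from google.colab import drive
drive.mount('/content/drive')
```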