News
- Our paper 'VERSE: Virtual-Gradient Aware Streaming Lifelong Learning with Anytime Inference' has been accepted at ICRA 2024.
- I will be attending Research Week with Google 2024.
- Our paper 'Streaming LifeLong Learning With Any-Time Inference' has been accepted at ICRA 2023.
Research
My current research aims to enable lifelong learning in deep neural networks that are trained incrementally on sequentially arriving data, possibly from an unbounded stream, under the following constraints: (i) the network has access to the current data only, and access to previously observed data is forbidden, and (ii) the model should adapt to changes in the data distribution without suffering from catastrophic forgetting. The publications below reflect my current research interests; please see my Google Scholar page for the full list.
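To make the setting concrete, here is a minimal, illustrative sketch of such a streaming training loop in PyTorch. The model, stream, and hyperparameters are placeholders of my own choosing, not any particular method from the papers below.

import torch
import torch.nn as nn

# Stand-in classifier and optimizer; any incrementally trained network fits here.
model = nn.Linear(16, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

def data_stream(steps=1000):
    # Placeholder for a possibly unbounded, non-i.i.d. stream of (input, label) pairs.
    for _ in range(steps):
        yield torch.randn(1, 16), torch.randint(0, 4, (1,))

for x_t, y_t in data_stream():
    optimizer.zero_grad()
    loss = loss_fn(model(x_t), y_t)  # only the current sample is visible
    loss.backward()
    optimizer.step()
    # The model may be queried for predictions at any step ("anytime inference");
    # past raw data is never revisited, so forgetting must be handled by the method.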
VERSE: Virtual-Gradient Aware Streaming Lifelong Learning with Anytime Inference
Soumya Banerjee,
Vinay Kumar Verma,
Avideep Mukherjee,
Deepak Gupta,
Vinay Namboodiri,
Piyush Rai
IEEE International Conference on Robotics and Automation (ICRA) 2024,
Yokohama, Japan
abstract | bibtex | paper (coming soon) | arXiv
Lifelong learning, also referred to as continual learning, is the problem of training an AI agent continuously while preventing it from forgetting previously acquired knowledge. Most existing methods focus on lifelong learning within a static environment and lack the ability to mitigate forgetting in a quickly changing dynamic environment. Streaming lifelong learning is a challenging setting of lifelong learning whose goal is continuous learning in a dynamic non-stationary environment without forgetting. We introduce a novel approach to lifelong learning that is streaming, requires a single pass over the data, can learn in a class-incremental manner, and can be evaluated on-the-fly (anytime inference). To accomplish these goals, we propose virtual gradients for continual representation learning to prevent catastrophic forgetting, and we leverage an exponential-moving-average-based semantic memory to further enhance performance. Extensive experiments on diverse datasets demonstrate our method's efficacy and superior performance over existing methods.
@article{banerjee2023verse,
  title={VERSE: Virtual-Gradient Aware Streaming Lifelong Learning with Anytime Inference},
  author={Banerjee, Soumya and Verma, Vinay K and Mukherjee, Avideep and Gupta, Deepak and Namboodiri, Vinay P and Rai, Piyush},
  journal={arXiv preprint arXiv:2309.08227},
  year={2023}
}
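The abstract mentions an exponential-moving-average (EMA) based semantic memory. As a rough sketch of that general idea only (a generic EMA-teacher update with illustrative names, not necessarily VERSE's exact formulation):

import copy
import torch
import torch.nn as nn

working_model = nn.Linear(16, 4)                 # updated on the stream
semantic_memory = copy.deepcopy(working_model)   # slow EMA copy, never trained directly

@torch.no_grad()
def ema_update(teacher, student, decay=0.999):
    # theta_teacher <- decay * theta_teacher + (1 - decay) * theta_student
    for p_t, p_s in zip(teacher.parameters(), student.parameters()):
        p_t.mul_(decay).add_(p_s, alpha=1.0 - decay)

# Called after each single-pass update of working_model:
ema_update(semantic_memory, working_model)

Because the EMA copy changes slowly, it retains a more stable representation that the fast-updating working model can be regularized against.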
Streaming LifeLong Learning With Any-Time Inference
Soumya Banerjee,
Vinay Kumar Verma,
Vinay Namboodiri
IEEE International Conference on Robotics and Automation (ICRA) 2023,
ExCeL London, UK
abstract | bibtex | paper | arXiv
Despite rapid advancements in lifelong learning (LLL) research, a large body of work focuses on improving performance in existing static continual learning (CL) setups. These methods lack the ability to succeed in a rapidly changing dynamic environment, where an AI agent needs to quickly learn new instances in a `single pass' from non-i.i.d. (and possibly temporally contiguous/coherent) data streams without suffering from catastrophic forgetting. For practical applicability, we propose a novel lifelong learning approach that is streaming (a single input sample arrives at each time step), single-pass, class-incremental, and subject to evaluation at any moment. To address this challenging setup and various evaluation protocols, we propose a Bayesian framework that enables fast parameter updates given a single training example and supports any-time inference. We additionally propose an implicit regularizer in the form of snapshot self-distillation, which further reduces forgetting. We also present an effective method that efficiently selects a subset of samples for online memory rehearsal, together with a new replay buffer management scheme that significantly boosts overall performance. Our empirical evaluations and ablations demonstrate that the proposed method outperforms prior works by large margins.
@article{banerjee2023streaming,
  title={Streaming LifeLong Learning With Any-Time Inference},
  author={Banerjee, Soumya and Verma, Vinay Kumar and Namboodiri, Vinay P},
  journal={arXiv preprint arXiv:2301.11892},
  year={2023}
}
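The abstract mentions selecting samples for online memory rehearsal and managing a replay buffer. As an illustrative stand-in only, the sketch below shows reservoir sampling, the standard baseline for maintaining a fixed-size buffer over a stream; the paper's own selection and management scheme differs from this.

import random

class ReservoirBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []    # stored (input, label) pairs
        self.seen = 0     # total stream examples observed so far

    def add(self, example):
        # Keeps each of the n examples seen so far with probability capacity/n.
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k):
        # Draw a rehearsal mini-batch to replay alongside the current sample.
        return random.sample(self.data, min(k, len(self.data)))

At each time step the incoming example is offered to the buffer, and a small batch drawn from it is replayed to mitigate forgetting.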
I have reviewed for the following venues:
- 2024: CVPR, WACV, ICLR, ICRA
- 2023: CVPR, ICCV, NeurIPS, BMVC
- 2022: NeurIPS, WACV, ICVGIP, Pattern Recognition