Combining Partition-Tree Weighting and MAML for Continual and Online Learning


Anna Koop, Michael Bradley Johanson, and Michael H. Bowling. Combining Partition-Tree Weighting and MAML for Continual and Online Learning. CoLLAs, 2025.

Download


Abstract

Learning from experience requires adapting and responding to errors over time. However, gradient-based deep learning can fail dramatically in the continual, online setting. In this work, we address this shortcoming by combining two meta-learning methods: the purely online Partition Tree Weighting (PTW) mixture-of-experts algorithm, and a novel variant of the Model-Agnostic Meta-Learning (MAML) initialization-learning procedure. We demonstrate our approach, RMPTW, in a piecewise stationary classification task in which the task distribution is unknown and the context changes are unobserved and random. We refer to this continual, online, task-agnostic setting as experiential learning. In this setting, RMPTW matches and even outperforms an augmented learner that is allowed to train offline on the environment's task distribution and is given explicit notification when the environment context changes. RMPTW thus provides a base learner with the benefits of offline training, access to the true task distribution, and direct observation of context switches, while requiring only an O(log T) increase in computation and memory.
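The O(log T) overhead comes from the structure of Partition Tree Weighting: a sequence of length T = 2^D is indexed by the leaves of a complete binary tree of depth D, and at each time step only the nodes on the path from the current leaf to the root need their mixture weights updated. A minimal sketch of that bookkeeping (not the paper's implementation; node labels here are hypothetical):

```python
import math

def ptw_path_nodes(t: int, depth: int):
    """Temporal-partition-tree nodes touched at time step t (1-indexed).

    Only the nodes on the path from leaf t to the root are updated per
    step, so per-step work is O(depth) = O(log T).  Each node is labelled
    (level, segment index at that level) -- a hypothetical labelling for
    illustration, not the paper's notation.
    """
    path = []
    for level in range(depth, -1, -1):        # leaf level down to root
        segment = (t - 1) >> (depth - level)  # segment covering step t
        path.append((level, segment))
    return path

T = 1024                    # horizon
depth = int(math.log2(T))   # = 10
nodes = ptw_path_nodes(t=357, depth=depth)
print(len(nodes))           # depth + 1 = 11 nodes, i.e. O(log T) per step
```

Each update therefore touches 11 of the tree's 2047 nodes rather than all of them, which is what keeps the mixture over temporal partitions tractable online.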


BibTeX

@InProceedings{ 2025-collas-ptwmaml,
    title = {Combining Partition-Tree Weighting and MAML for Continual and Online Learning},
    author = {Anna Koop and Michael Bradley Johanson and Michael H. Bowling},
    booktitle = {Proceedings of the 4th Conference on Lifelong Learning Agents (CoLLAs 2025)},
    year = {2025}
}