
Repository holding the code for the 2025 ICLR work: "Continual Slow-and-Fast Adaptation of Latent Neural Dynamics (CoSFan): Meta-Learning What-How & When to Adapt."


qu-gg/CoSFan


Continual Slow-and-Fast Adaptation of
Latent Neural Dynamics (CoSFan):
Meta-Learning What-How & When to Adapt
[OpenReview]

Ryan Missel, Linwei Wang

framework schematic

Figure 1: Overview of CoSFan, showing the what-how meta-model continually aggregating a heterogeneous data stream with a reservoir that identifies tasks via Gaussian mixture models.

Description

An increasing interest in learning to forecast time-series of high-dimensional observations is the ability to adapt to systems with diverse underlying dynamics. Access to observations that define a stationary distribution of these systems is often unattainable, as the underlying dynamics may change over time. Naively training or retraining models at each shift can lead to catastrophic forgetting of previously-seen systems. We present a new continual meta-learning (CML) framework to realize continual slow-and-fast adaptation of latent dynamics (CoSFan). We leverage a feed-forward meta-model to infer what the current system is and how to adapt a latent dynamics function to it, enabling fast adaptation to specific dynamics. We then develop novel strategies to automatically detect when a shift in data distribution occurs, with which to identify the underlying dynamics and their relation to previously-seen dynamics. In combination with fixed-memory experience replay mechanisms, this enables continual slow updates of the what-how meta-model. Empirical studies demonstrate that both the meta- and continual-learning components are critical for learning to forecast across non-stationary distributions of diverse dynamical systems, and that the feed-forward meta-model combined with task-aware/-relational continual learning strategies significantly outperforms existing CML alternatives.
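The when-to-adapt idea above, identifying task shifts by modeling reservoir embeddings with a Gaussian mixture, can be illustrated with a small toy sketch. This is not the repository's actual implementation; the 2-D embeddings, per-task Gaussian parameters, and log-likelihood threshold below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_logpdf(x, mean, cov):
    """Row-wise log density of a multivariate normal."""
    d = x.shape[1]
    diff = x - mean
    maha = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(cov), diff)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (maha + logdet + d * np.log(2 * np.pi))

# Toy reservoir: 2-D latent embeddings from two previously-seen tasks.
tasks = [
    rng.normal([0.0, 0.0], 0.3, size=(200, 2)),
    rng.normal([4.0, 4.0], 0.3, size=(200, 2)),
]
# One mixture component per identified task (equal weights).
params = [(t.mean(axis=0), np.cov(t.T)) for t in tasks]

def mean_loglik(batch):
    """Mean log-likelihood of a batch under the equal-weight mixture."""
    comp = np.stack([gaussian_logpdf(batch, m, c) for m, c in params])
    return np.logaddexp.reduce(comp - np.log(len(params)), axis=0).mean()

# Heuristic threshold: flag a shift when a batch scores far below
# what the reservoir's own tasks score.
threshold = min(mean_loglik(t) for t in tasks) - 10.0

same_task = rng.normal([0.0, 0.0], 0.3, size=(50, 2))
new_task = rng.normal([-6.0, 7.0], 0.3, size=(50, 2))
print(mean_loglik(same_task) > threshold)  # True: known dynamics
print(mean_loglik(new_task) > threshold)   # False: flag a task shift
```

A batch drawn from a previously-seen task scores near the reservoir's own likelihood, while a batch from unseen dynamics scores orders of magnitude lower, triggering the "new task" path and a continual slow update of the meta-model.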

Citation

Please cite the following if you use the data or the model in your work:

@inproceedings{
  missel2025cosfan,
  title={Continual Slow-and-Fast Adaptation of Latent Neural Dynamics (Co{SF}an): Meta-Learning What-How \& When to Adapt},
  author={Ryan Missel and Linwei Wang},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2025},
  url={https://openreview.net/forum?id=Dl3MsjaIdp}
}

Requirements

Refer to requirements.txt.
