Published at ICLR 2026 🎉 [Paper]
State Space Models (SSMs) like Mamba have emerged as powerful alternatives to Transformers, but existing PEFT methods simply transfer Transformer-tailored approaches without addressing Mamba's unique temporal processing dynamics.
Memba introduces a membrane-driven PEFT approach specifically designed for Mamba, with three key components:
- Leaky Integrate Membrane (LIM) — A bio-inspired gating mechanism that accumulates membrane potential over time on Mamba's gate branch, enhancing selective information retention
- Strategic LoRA Placement — Low-rank adapters placed on the input and output projections
- Cross-Layer Membrane Transfer — Averaged membrane states propagated across layers for temporal coherence
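The LIM update can be pictured as a simple leaky integrator on the gate branch. Below is a minimal NumPy sketch under stated assumptions: the exponential decay constant `beta`, the sigmoid readout, and the `transfer_membrane` averaging are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lim_gate(z_seq, beta=0.9, m0=None):
    """Leaky Integrate Membrane (LIM) over the gate branch (sketch).

    z_seq : (T, D) gate-branch pre-activations for one sequence
    beta  : membrane decay constant (illustrative value)
    m0    : optional initial membrane state, e.g. transferred in
            from earlier layers

    Returns the membrane-modulated gate values and the final
    membrane state, which can seed the next layer.
    """
    T, D = z_seq.shape
    m = np.zeros(D) if m0 is None else m0
    out = np.empty_like(z_seq, dtype=float)
    for t in range(T):
        # Leak part of the old potential, integrate the new input.
        m = beta * m + (1.0 - beta) * z_seq[t]
        # Modulate the gate with the accumulated potential.
        out[t] = z_seq[t] * sigmoid(m)
    return out, m

def transfer_membrane(membranes):
    """Cross-layer transfer (sketch): average final membrane states
    from earlier layers to initialize the next layer."""
    return np.mean(np.stack(membranes), axis=0)
```

The intuition: rather than gating each timestep independently, the gate sees a decaying running summary of past inputs, so information that persists over time is selectively retained.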
Commonsense reasoning results (language tasks):

| Model | Method | #Params (%) | Avg. Acc. (%) |
|---|---|---|---|
| Mamba-370M | LoRAp (X) | 2.67 | 47.8 |
| Mamba-370M | Memba (in_proj) | 2.07 | 48.2 |
| Mamba-790M | LoRAp (X) | 1.75 | 50.8 |
| Mamba-790M | Memba (in_proj) | 1.47 | 51.8 |
| Mamba-1.4B | LoRAp (X) | 1.36 | 53.7 |
| Mamba-1.4B | Memba (in_proj) | 1.13 | 54.5 |
VTAB-1k results (vision tasks):

| Method | #Params (K) | Natural | Specialized | Structured | Avg. |
|---|---|---|---|---|---|
| LoRAp (X) | 1,778 | 76.64 | 83.89 | 60.84 | 71.52 |
| LoRA (out_proj) | 2,663 | 76.42 | 83.96 | 60.08 | 71.12 |
| Hybrid (w/ proj) | 117,236 | 77.00 | 84.41 | 61.55 | 72.05 |
| Memba (in+out proj) | 4,718 | 77.07 | 85.66 | 61.70 | 72.40 |
For setup, dependencies, and detailed usage instructions, please refer to the README in each task directory:
- Language Tasks: language/commonsense_reasoning/README.md
- Vision Tasks: vision/README.md
```
Memba/
├── language/
│   └── commonsense_reasoning/
│       ├── finetune.py                  # Language fine-tuning entry point
│       ├── lm_harness_eval.py           # Evaluation with lm-evaluation-harness
│       ├── requirements.txt
│       └── mamba_peft/src/peft/
│           └── tuners/mamba_peft.py     # Memba PEFT implementation (LIM + LoRA)
├── vision/
│   ├── tools/                           # Training and testing scripts
│   ├── configs/mmpretrain/vim/vtab1k/   # Experiment configs
│   └── personal_lib/
│       ├── mmpretrain/models/peft/      # Vision Memba implementation
│       └── external_packages/mamba-1p1p1/  # Modified Mamba with LIM integration
```
```bibtex
@article{lee2025memba,
  title={Memba: Membrane-driven Parameter-Efficient Fine-Tuning for Mamba},
  author={Lee, Donghyun and Li, Yuhang and Yin, Ruokai and Xiao, Shiting and Panda, Priyadarshini},
  journal={arXiv preprint arXiv:2506.18184},
  year={2025}
}
```

Our code builds upon Mamba, MambaPEFT, Vim, MMPreTrain, LLM-Adapters, and PEFT. In particular, Memba is built on top of the MambaPEFT codebase. We thank the authors!
This project is released under the Apache 2.0 License.
