
# MINT-Quantization

MINT: Multiplier-less INTeger Quantization for Energy Efficient Spiking Neural Networks, ASP-DAC 2024 (Best Paper nomination).

For the most up-to-date code, see: https://github.com/RuokaiYin/MINT_Quantization

## Notice

The code currently has errors under some PyTorch versions; a fix is planned. For now, please run the code with PyTorch 1.13.0, which has been tested and is known to work. Thanks.
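Until the version issue is resolved, a small guard like the one below (a hypothetical helper, not part of this repo) can fail fast when an untested PyTorch version is installed. It only assumes the tested version string `1.13.0` stated above:

```python
def is_tested_torch_version(installed: str, required: str = "1.13.0") -> bool:
    """Return True if the installed PyTorch version matches the tested one.

    PyTorch version strings may carry a local build suffix (e.g. "1.13.0+cu117"),
    so we compare only the part before "+".
    """
    return installed.split("+")[0] == required


if __name__ == "__main__":
    # In a real script you would pass torch.__version__ here.
    for version in ["1.13.0", "1.13.0+cu117", "2.1.0"]:
        if is_tested_torch_version(version):
            print(f"{version}: OK (tested version)")
        else:
            print(f"{version}: warning, only PyTorch 1.13.0 has been tested")
```

In practice, call `is_tested_torch_version(torch.__version__)` at the top of the training script and warn (or exit) on a mismatch.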

## Citing

If you find MINT useful for your research, please cite us with the following BibTeX:

```bibtex
@inproceedings{yin2024mint,
  title={MINT: Multiplier-less INTeger Quantization for Energy Efficient Spiking Neural Networks},
  author={Yin, Ruokai and Li, Yuhang and Moitra, Abhishek and Panda, Priyadarshini},
  booktitle={2024 29th Asia and South Pacific Design Automation Conference (ASP-DAC)},
  pages={830--835},
  year={2024},
  organization={IEEE}
}
```