This is the source code for our EMNLP 2022 paper, "Tiny-Attention Adapter: Contexts Are More Important Than the Number of Parameters".
To run the code, first install the latest version of PyTorch with the CUDA toolkit. See the PyTorch website for installation instructions.
Then run
pip install .
to install the package and some dependencies.
After the installation, use
bash run_examples.sh
to start training.
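The full setup can be sketched as the following shell session. The environment name `tiny-attn` is only an illustrative assumption; the `pip` and `bash` commands are the ones given above.

```shell
# Optional: create an isolated environment (name "tiny-attn" is an assumption).
conda create -n tiny-attn python=3.9 -y
conda activate tiny-attn

# Install PyTorch with CUDA support; see pytorch.org for the exact
# command matching your CUDA version.

# Install this package and its dependencies from the repository root.
pip install .

# Launch the example training runs.
bash run_examples.sh
```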