# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/
# AutoAttend: Automated Attention Representation Search

_Code implementation of the paper [AutoAttend: Automated Attention Representation Search](http://proceedings.mlr.press/v139/guan21a.html)._

Authors: [Chaoyu Guan](https://github.com/Frozenmad), Xin Wang, Wenwu Zhu

## Brief Introduction

<center>

![](./assets/framework.png)
</center>

We design an automated framework that searches for the best self-attention models for given tasks. It leverages functional layers to describe models with self-attention mechanisms, and proposes context-aware parameter sharing to build and train the supernet, so that it can account for the specialty and functionality of parameters across different functional layers and inputs. More detailed algorithms can be found in our [paper](http://proceedings.mlr.press/v139/guan21a.html).

## Usage

### 1. Prepare the datasets

First, please place the [SST dataset](https://nlp.stanford.edu/sentiment/trainDevTestTrees_PTB.zip) and the [pretrained GloVe embeddings](https://nlp.stanford.edu/data/glove.840B.300d.zip) under the `./data` folder, organized as follows:

```
+ data
  - glove.840B.300d.txt
  + sst
    + trees
      - train.txt
      - dev.txt
      - test.txt
```
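
One possible way to fetch and unpack the data from the repository root is sketched below; it assumes `wget` and `unzip` are available and that the archives extract as described, so adjust the paths if your copies differ:

```
mkdir -p data && cd data
# GloVe embeddings (unpacks to glove.840B.300d.txt)
wget https://nlp.stanford.edu/data/glove.840B.300d.zip
unzip glove.840B.300d.zip
# SST dataset (unpacks to sst/trees/{train,dev,test}.txt)
wget https://nlp.stanford.edu/sentiment/trainDevTestTrees_PTB.zip
unzip trainDevTestTrees_PTB.zip -d sst
```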

### 2. Preprocess the datasets

Run the following command to build preprocessed, ready-to-use dataset files and speed up the later steps:

```
python -m task.dataset.sst5
```

### 3. Train the supernet

Train the whole supernet by running:

```
python -m task.text_classification.train_supernet --epoch x
```

You can set the epoch number as you wish; in our paper, it is set to 10.
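
For example, to reproduce the setting used in the paper:

```
python -m task.text_classification.train_supernet --epoch 10
```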

### 4. Search for the best architectures

Search for the best architectures with the evolutionary algorithm by running:

```
python -m task.text_classification.search_supernet --model_path ./searched/model_epoch_x.full --devices 0 1 2 3 0 1 2 3
```

Here `x` is the supernet training epoch number (10 in our case). You can choose which devices to use, and a device number may be repeated; the search runs multiple processes in parallel according to the device list you pass.
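
For example, on a machine with two GPUs, a list like the following should launch parallel search workers on devices 0 and 1 (two per device, under the multiprocessing behavior described above):

```
python -m task.text_classification.search_supernet --model_path ./searched/model_epoch_10.full --devices 0 0 1 1
```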

### 5. Retrain the architectures

Re-evaluate a searched model using the following command:

```
python -m task.text_classification.retrain --arch "xxxx"
```

Here `xxxx` should be replaced with the searched architecture. The best architecture we found is:

```
python -m task.text_classification.retrain --arch "[[4, 0, 1, -1, 0], [0, 0, 3, -1, 0], [1, 0, 3, 1, 1], [1, 1, 3, 1, 1], [0, 1, 4, -1, 0], [2, 5, 1, 3, 1], [3, 1, 2, 1, 1], [3, 1, 3, 2, 1], [1, 1, 1, -1, 0], [1, 1, 2, 1, 1], [1, 2, 3, 4, 1], [3, 6, 4, 0, 1], [3, 2, 0, -1, 0], [1, 0, 0, -1, 0], [1, 4, 4, 0, 1], [3, 15, 2, 1, 1], [3, 10, 1, -1, 0], [1, 14, 2, 3, 1], [3, 18, 1, 2, 1], [4, 9, 0, -1, 0], [1, 16, 1, -1, 0], [0, 12, 0, -1, 0], [4, 3, 3, 2, 1], [2, 0, 4, -1, 0]]"
```

It should yield a mean test accuracy of around `0.5371`.

## Codes for Graph

The code for searching graph models will be released in our [AutoGL library](https://github.com/THUMNLab/AutoGL) soon!

## Cite Us

If you find our work helpful, please cite our paper as follows:

```
@InProceedings{guan21autoattend,
  title = {AutoAttend: Automated Attention Representation Search},
  author = {Guan, Chaoyu and Wang, Xin and Zhu, Wenwu},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages = {3864--3874},
  year = {2021},
  editor = {Meila, Marina and Zhang, Tong},
  volume = {139},
  series = {Proceedings of Machine Learning Research},
  month = {18--24 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v139/guan21a/guan21a.pdf},
  url = {http://proceedings.mlr.press/v139/guan21a.html}
}
```