Release detrex v0.1.1 (#104)
* refine README

* release v0.1.1

* refine README

* refine Deformable-DETR README

* refine README

Co-authored-by: ntianhe ren <[email protected]>
rentainhe and ntianhe ren authored Oct 18, 2022
1 parent f428680 commit 3ba53ba
Showing 5 changed files with 43 additions and 13 deletions.
22 changes: 11 additions & 11 deletions README.md
@@ -18,7 +18,7 @@
[🛠️Installation](https://detrex.readthedocs.io/en/latest/tutorials/Installation.html) |
[👀Model Zoo](https://detrex.readthedocs.io/en/latest/tutorials/Model_Zoo.html) |
[🚀Awesome DETR](https://github.com/IDEA-Research/awesome-detection-transformer) |
[🆕News](#change-log) |
[🆕News](#whats-new) |
[🤔Reporting Issues](https://github.com/IDEA-Research/detrex/issues/new/choose)


@@ -53,6 +53,16 @@ The repo name detrex has several interpretations:

- <font color=#008000> <b> de-t.rex </b> </font>: de means 'the' in Dutch. T.rex, also called Tyrannosaurus Rex, means 'king of the tyrant lizards' and connects to our research work 'DINO', which is short for Dinosaur.

## What's New
v0.1.1 was released on 18/10/2022:
- Add model analysis tools and a benchmark in [tools](./tools/); a rough illustrative sketch follows this list.
- Support visualization of COCO eval results and annotations in [tools](./tools/).
- Support [Group-DETR](./projects/group_detr/).
- Release more DINO training results including `DINO-R50-24epochs`, `DINO-R101`, `DINO-Swin-Tiny`, `DINO-Swin-Small`, `DINO-Swin-Base`, `DINO-Swin-Large` in [DINO](./projects/dino/).
- Release better `Deformable-DETR` baselines with **48.2 AP** on the COCO dataset in [Deformable-DETR](./projects/deformable_detr/).
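
The analysis tools themselves are not reproduced in this diff. As a rough illustration of the kind of flop and parameter counting such tooling is typically built on, here is a minimal sketch using `fvcore` (the `resnet50` placeholder and dummy input are assumptions for illustration, not detrex API):

```python
import torch
from fvcore.nn import FlopCountAnalysis, parameter_count
from torchvision.models import resnet50  # placeholder model, not a detrex detector

# Count flops and parameters for a single dummy input. The real analysis
# tools operate on full detection models built from their lazy configs.
model = resnet50().eval()
dummy_input = torch.randn(1, 3, 224, 224)

flops = FlopCountAnalysis(model, dummy_input)
print(f"flops:  {flops.total() / 1e9:.2f} G")
print(f"params: {parameter_count(model)[''] / 1e6:.2f} M")
```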


Please see [changelog.md](./changlog.md) for details and release history.

## Installation

@@ -87,16 +97,6 @@ Please see [projects](./projects/) for the details about projects that are built
</details>


## Change Log

The **beta v0.1.0** version was released on 21/09/2022. Highlights of the released version:
- Support various backbones, including: [FocalNet](https://arxiv.org/abs/2203.11926), [Swin-T](https://arxiv.org/pdf/2103.14030.pdf), [ResNet](https://arxiv.org/abs/1512.03385) and other [detectron2 builtin backbones](https://github.com/facebookresearch/detectron2/tree/main/detectron2/modeling/backbone).
- Add [timm](https://github.com/rwightman/pytorch-image-models) backbone wrapper and [torchvision](https://github.com/pytorch/vision) backbone wrapper.
- Support various Transformer-based detection algorithms, including: [DETR](https://arxiv.org/abs/2005.12872), [Deformable-DETR](https://arxiv.org/abs/2010.04159), [Conditional-DETR](https://arxiv.org/abs/2108.06152), [DAB-DETR](https://arxiv.org/abs/2201.12329), [DN-DETR](https://arxiv.org/abs/2203.01305), and [DINO](https://arxiv.org/abs/2203.03605).
- Support a flexible config system based on [Lazy Configs](https://detectron2.readthedocs.io/en/latest/tutorials/lazyconfigs.html); a minimal sketch follows this list.
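
As a minimal sketch of what the lazy config pattern looks like (assuming only detectron2's public `LazyCall`/`instantiate` API; the `nn.Conv2d` placeholder is illustrative, not an actual detrex config):

```python
from detectron2.config import LazyCall as L, instantiate
from torch import nn

# A config node is a lazily-bound constructor call; nothing is built yet.
conv_cfg = L(nn.Conv2d)(in_channels=3, out_channels=64, kernel_size=3)

# Fields stay editable until the object is actually instantiated.
conv_cfg.out_channels = 128
conv = instantiate(conv_cfg)
print(conv)  # Conv2d(3, 128, kernel_size=(3, 3), stride=(1, 1))
```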

Please see [changelog.md](./changlog.md) for details and release history.

## License

This project is released under the [Apache 2.0 license](LICENSE).
15 changes: 15 additions & 0 deletions changlog.md
@@ -1,5 +1,20 @@
## Change Log

### v0.1.1 (18/10/2022)
#### New Features
- Add model analysis tools for detrex [#79](https://github.com/IDEA-Research/detrex/pull/79)
- Add benchmark [#81](https://github.com/IDEA-Research/detrex/pull/81)
- Add visualization for COCO eval results and annotations [#82](https://github.com/IDEA-Research/detrex/pull/82)
- Support the `Group-DETR` algorithm [#84](https://github.com/IDEA-Research/detrex/pull/84)
- Release `DINO-Swin` training results [#86](https://github.com/IDEA-Research/detrex/pull/86)
- Release better `Deformable-DETR` baselines [#102](https://github.com/IDEA-Research/detrex/pull/102) [#103](https://github.com/IDEA-Research/detrex/pull/103)

#### Bug Fixes
- Fix bugs in ConvNeXt backbone [#91](https://github.com/IDEA-Research/detrex/pull/91)

#### Documentation
- Add pretrained model weights download links [#86](https://github.com/IDEA-Research/detrex/pull/86)

### v0.1.0 (30/09/2022)
The **beta v0.1.0** version of detrex was released on 30/09/2022.

15 changes: 15 additions & 0 deletions docs/source/changelog.md
@@ -1,5 +1,20 @@
## Change Log

### v0.1.1 (18/10/2022)
#### New Features
- Add model analysis tools for detrex [#79](https://github.com/IDEA-Research/detrex/pull/79)
- Add benchmark [#81](https://github.com/IDEA-Research/detrex/pull/81)
- Add visualization for COCO eval results and annotations [#82](https://github.com/IDEA-Research/detrex/pull/82)
- Support the `Group-DETR` algorithm [#84](https://github.com/IDEA-Research/detrex/pull/84)
- Release `DINO-Swin` training results [#86](https://github.com/IDEA-Research/detrex/pull/86)
- Release better `Deformable-DETR` baselines [#102](https://github.com/IDEA-Research/detrex/pull/102) [#103](https://github.com/IDEA-Research/detrex/pull/103)

#### Bug Fixes
- Fix bugs in ConvNeXt backbone [#91](https://github.com/IDEA-Research/detrex/pull/91)

#### Documentation
- Add pretrained model weights download links [#86](https://github.com/IDEA-Research/detrex/pull/86)

### v0.1.0 (21/09/2022)
The **beta v0.1.0** version of detrex was released on 21/09/2022.

2 changes: 1 addition & 1 deletion projects/deformable_detr/README.md
@@ -41,7 +41,7 @@ Here we provide the pretrained `Deformable-DETR` weights based on detrex.

All the models are trained using `8 GPUs` with a total batch size of `16`. We've observed that the results of the `deformable-two-stage` model trained on `8 GPUs` may be slightly lower than those trained on `16 GPUs` with a total batch size of `32`.

**Notable facts and caveats**: The training settings are different from the original repo, we use `lr=1e-5` for backbone and `1e-4` for the other modules. The original implementation sets `lr` to `2e-5` for `backbone`, `sampling_offsets` and `reference_points`, and `2e-4` for other modules. And we used `top-300` confidence boxes for testing, which may get a slightly better results on COCO evaluation. And we only freeze the stem layer in ResNet backbone by setting `freeze_at=1` in config.
**Notable facts and caveats**: The training settings differ from the original repo and mostly follow [DINO](https://github.com/IDEA-Research/detrex/tree/main/projects/dino): we set `lr=1e-5` for the backbone and `1e-4` for the other modules, whereas the original implementation sets `lr` to `2e-5` for `backbone`, `sampling_offsets` and `reference_points`, and `2e-4` for the other modules. We also use the `top-300` confidence boxes for testing, which may yield slightly better results on COCO evaluation, and we only freeze the stem layer of the ResNet backbone by setting `freeze_at=1` in the config.
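
As a hedged, minimal PyTorch sketch of the per-module learning-rate split described above (not the actual detrex optimizer config; the name-based `backbone` filter and the `AdamW` hyperparameters are assumptions):

```python
import torch


def build_optimizer(model, base_lr=1e-4, backbone_lr=1e-5, weight_decay=1e-4):
    """Give backbone parameters a smaller learning rate than the other modules."""
    backbone_params, other_params = [], []
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        (backbone_params if "backbone" in name else other_params).append(param)
    return torch.optim.AdamW(
        [
            {"params": backbone_params, "lr": backbone_lr},
            {"params": other_params, "lr": base_lr},
        ],
        lr=base_lr,
        weight_decay=weight_decay,
    )
```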

## Converted Weights
<table><tbody>
2 changes: 1 addition & 1 deletion setup.py
@@ -31,7 +31,7 @@
from torch.utils.cpp_extension import CUDA_HOME, CppExtension, CUDAExtension

# detrex version info
version = "0.1.0"
version = "0.1.1"
package_name = "detrex"
cwd = os.path.dirname(os.path.abspath(__file__))

