[Refactor]: Refactor DETR and Deformable DETR (open-mmlab#8763)
* [Fix] Fix UT to be compatible with pytorch 1.6 (open-mmlab#8707)
* Update
* Force-reinstall pycocotools
* Fix build_cuda
* Install git in docker
* Update
* Comment out other jobs to speed up the process
* Update
* Uncomment
* Update
* Add comments for --force-reinstall
* [Refactor] Refactor anchor head and base head with boxlist (open-mmlab#8625)
* Refactor anchor head
* Update
* Add a series of box tools
* Fix box type to support n x box_dim boxes
* Revert box type changes
* Add docstring
* Refactor retina_head
* Update
* Fix comments
* Modify docstring of coder and iou_calculator
* Replace with_boxlist with use_box_type
* fix: fix config of detr-r18
* fix: modify import of MSDeformAttn in PixelDecoder of Mask2Former
* feat: add TransformerDetector as the base detector of DETR-like detectors
* refactor: refactor modules and configs of DETR
* refactor: refactor DETR-related modules in transformer.py
* fix: add type comments in detr.py
* Correct train loop in detr_r50 config
* fix: modify the parent class of DETRHead to BaseModule
* refactor: refactor modules and configs of Deformable DETR
* fix: modify the usage of num_query
* fix: modify the usage of num_query in configs
* refactor: replace input_proj of DETR with ChannelMapper neck
* refactor: delete multi_apply in DETRHead.forward()
* Update detr_r18_8xb2-500e_coco.py: use ChannelMapper for r18
* Rename detection_transfomer.py to base_detr.py
* refactor: modify the construct-binary-masks section of forward_pretransformer
* refactor: utilize abstractmethod
* Update ABCMeta to make sure class TransformerDetector can be reloaded
* Add some annotations
* refactor: delete _init_transformer in detectors
* refactor: modify args of Deformable DETR
* refactor: modify super().__init__() calls
* Update detr_head.py: remove the multi-feature-level handling in predict_by_feat
* Update detr.py: update init_weights
* Add some annotations for head
* Make sure the head args are the same as the detector's
* Fix some bugs
* fix: fix bugs of num_pred in DeformableDETRHead
* Add kwargs to transformer
* Support MLP and sine-embedded position
* Delete positional encoding
* Delete useless postnorm
* Revert "add kwargs to transformer" (reverts commit a265c1a)
* Update detr_head.py: update type and shape of args
* Update detr_head.py: fix args docstring in predict_by_feat
* Update base_detr.py: update docstring of forward_pretransformer
* Update deformable_detr.py: fix docstring
* Support Conditional DETR by overriding forward_transformer
* fix: update config files of two-stage and box-refine
* Replace all bs with batch_size in DETR-related files
* Update deformable.py and transformer.py
* Update docstrings in base_detr and detr
* Refine docs
* Revert "doc refine" (reverts commit b69da4f)
* Refine docs
* Update docs of base_detr, detr, and layers/transformer
* Fix docs in base_detr
* Add link to the original repo
* Refine docs
* doc: add doc of the first edition of Deformable DETR
* Rename batch_size to bs
* Refine docs
* feat: add config comments for specific modules
* refactor: refactor base DETR class TransformerDetector
* fix: fix wrong return type hint of forward_encoder in TransformerDetector
* refactor: refactor DETR
* refactor: refactor Deformable DETR
* refactor: refactor forward_encoder and pre_decoder
* fix: fix bugs of the new edition
* refactor: small modifications
* fix: move get_reference_points to deformable_encoder
* refactor: merge init_ and inter_reference into references in Deformable DETR
* Modify docstring of get_valid_ratio in Deformable DETR
* Add some docstrings
* doc: add docstring of deformable_detr.py
* doc: add docstring of deformable_detr_head.py
* doc: modify docstring of Deformable DETR
* doc: add docstring of base_detr.py
* doc: refine docstring of base_detr.py
* Minor changes to MLP
* Refine configs
* Refine docstring of detr.py
* Tiny modifications
* doc: refine docstring of detr.py
* Tiny modifications to resolve the review conversations
* Draft DETRHead.predict()
* refactor: modify arg names and forward strategies of bbox_head
* Support MLP
* Fix docstring of pre_decoder
* Fix docstring
* Modifications to resolve the review conversations
* refactor: remove key_padding_mask args
* fix: fix a bug of Deformable DETR and resolve some review conversations
* refactor: rename the base class to DetectionTransformer, plus other modifications
* fix: fix config of DETR
* Fix an init bug
* fix: fix init_weights of DETR and Deformable DETR
* Resolve merge conflicts
* Fix an auto-merge bug
* Fix a pre-commit bug
* refactor: move the positions of encoder and decoder
* Delete Transformer in CI test

Co-authored-by: jbwang1997 <[email protected]>
Co-authored-by: KeiChiTse <[email protected]>
Co-authored-by: LYMDLUT <[email protected]>
Co-authored-by: lym <[email protected]>
Co-authored-by: Kei-Chi Tse <[email protected]>
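The log repeatedly refers to splitting the detector's transformer forward pass into overridable stages (forward_pretransformer/pre_transformer, forward_encoder, pre_decoder, forward_decoder) on an abstract base class, so subclasses like DETR, Deformable DETR, and Conditional DETR override stages instead of rewriting the whole pipeline. A minimal, hypothetical sketch of that template-method structure — the toy class names, signatures, and list-based "features" are illustrative assumptions, not the actual MMDetection code:

```python
from abc import ABCMeta, abstractmethod


class DetectionTransformer(metaclass=ABCMeta):
    """Sketch of a DETR-style base detector (names assumed from the log).

    forward_transformer() is a fixed skeleton; each stage is a hook a
    subclass overrides, so variants only customize the stages they change.
    """

    def forward_transformer(self, img_feats):
        # Stage 1: build encoder/decoder inputs (masks, flattening, ...).
        encoder_inputs, decoder_inputs = self.pre_transformer(img_feats)
        # Stage 2: run the encoder over the flattened features.
        encoder_outputs = self.forward_encoder(**encoder_inputs)
        # Stage 3: prepare queries/references from the encoder memory.
        decoder_inputs.update(self.pre_decoder(**encoder_outputs))
        # Stage 4: run the decoder to produce the head inputs.
        return self.forward_decoder(**decoder_inputs)

    @abstractmethod
    def pre_transformer(self, img_feats): ...

    @abstractmethod
    def forward_encoder(self, **kwargs): ...

    @abstractmethod
    def pre_decoder(self, **kwargs): ...

    @abstractmethod
    def forward_decoder(self, **kwargs): ...


class ToyDETR(DetectionTransformer):
    """Trivial concrete subclass, just to show the data flow."""

    def pre_transformer(self, img_feats):
        return {"feat": img_feats}, {}

    def forward_encoder(self, feat):
        return {"memory": [x * 2 for x in feat]}  # stand-in "encoding"

    def pre_decoder(self, memory):
        return {"memory": memory, "query": [0] * len(memory)}

    def forward_decoder(self, memory, query):
        return [m + q for m, q in zip(memory, query)]
```

This is also why the log's "support Conditional DETR with reload forward_transformer" entry is cheap: a variant that needs a different pipeline overrides one method rather than the whole detector.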
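One entry mentions supporting a sine-embedded position. For context, DETR-style models encode each spatial coordinate as interleaved sine/cosine values at geometrically spaced frequencies; the sketch below is a hypothetical minimal version (the function name, signature, and defaults are assumptions, not the repository's API):

```python
import math


def sine_pos_embed(pos, num_feats=128, temperature=10000.0):
    """Interleaved sin/cos embedding of one scalar coordinate.

    Even indices hold sin(pos / f), odd indices cos(pos / f), where the
    frequency divisor f grows geometrically with the feature index.
    """
    embed = []
    for i in range(num_feats):
        # Pairs (2k, 2k+1) share the same divisor temperature^(2k/num_feats).
        divisor = temperature ** (2 * (i // 2) / num_feats)
        fn = math.sin if i % 2 == 0 else math.cos
        embed.append(fn(pos / divisor))
    return embed
```

Because the embedding is a fixed function of position, it needs no learned parameters, which is one reason the log could "delete positional encoding" modules and regenerate embeddings where needed.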