This repository contains the implementation of the paper "DCEA: DETR With Concentrated Deformable Attention for End-to-End Ship Detection in SAR Images".
Ensure the following dependencies are installed:
python==3.8.0
torch==2.0.1
torchvision==0.15.2
onnx==1.14.0
onnxruntime==1.15.1
pycocotools
PyYAML
scipy
You can install these dependencies using:
pip install -r requirements.txt
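If you want to confirm that the pinned versions are the ones actually active in your environment, a quick optional sanity check from Python is shown below; it is not part of the repository itself.

```python
# Optional sanity check: confirm the pinned dependency versions are installed.
import torch
import torchvision
import onnx
import onnxruntime

print("torch:", torch.__version__)              # expected 2.0.1
print("torchvision:", torchvision.__version__)  # expected 0.15.2
print("onnx:", onnx.__version__)                # expected 1.14.0
print("onnxruntime:", onnxruntime.__version__)  # expected 1.15.1
print("CUDA available:", torch.cuda.is_available())
```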
Prepare your dataset in the standard COCO format as outlined below.
- Place the dataset in the following path:
configs/dataset/coco/
- Structure the dataset files as follows:
coco/
annotations/ # COCO annotation JSON files
train2017/ # training images
val2017/ # validation images
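Because the annotations follow the COCO format, pycocotools (already in the dependency list) can be used to sanity-check the layout before training. This is a minimal sketch; the annotation file name `instances_train2017.json` follows the usual COCO naming convention and is an assumption, so adjust it if your JSON files are named differently.

```python
# Sanity-check a COCO-format dataset with pycocotools.
# The annotation file name is the standard COCO convention (an assumption);
# change it if your JSON files are named differently.
from pycocotools.coco import COCO

ann_file = "configs/dataset/coco/annotations/instances_train2017.json"
coco = COCO(ann_file)

print("images:     ", len(coco.getImgIds()))
print("annotations:", len(coco.getAnnIds()))
print("categories: ", [c["name"] for c in coco.loadCats(coco.getCatIds())])
```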
To train the model, use:
python train.py -c path/to/config -r path/to/checkpoint
Replace `path/to/config` with the path to your configuration file, and `path/to/checkpoint` with the path to an existing checkpoint if resuming training (optional).
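If you are unsure whether a checkpoint is suitable for resuming, it can be inspected with plain PyTorch before passing it to `-r`. The top-level key name `"model"` below is an assumption (typical of DETR/RT-DETR-style checkpoints), not something guaranteed by this repository.

```python
# Inspect a checkpoint before resuming training with -r.
# The "model" key is an assumption; adjust to whatever train.py actually saves.
import torch

ckpt_path = "path/to/checkpoint"  # placeholder, same as in the command above
ckpt = torch.load(ckpt_path, map_location="cpu")

if isinstance(ckpt, dict):
    print("top-level keys:", list(ckpt.keys()))
    state = ckpt.get("model", ckpt)  # fall back to treating it as a raw state_dict
else:
    state = ckpt
print("number of tensors in state_dict:", len(state))
```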
To evaluate the model, run:
python train.py -c path/to/config -r path/to/checkpoint --test-only
Adding `--test-only` runs evaluation only, without further training.
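The evaluation reports the standard COCO detection metrics. For reference, the hedged sketch below shows how those metrics are computed with pycocotools from a ground-truth annotation file and a detection results JSON; both file names here are placeholders, and `train.py --test-only` produces this style of metrics internally.

```python
# Compute standard COCO detection metrics (AP, AP50, AP75, ...) from a results
# file. File paths are placeholders; the --test-only run reports metrics in the
# same COCO style.
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("configs/dataset/coco/annotations/instances_val2017.json")
coco_dt = coco_gt.loadRes("detections_val2017.json")  # [{image_id, category_id, bbox, score}, ...]

evaluator = COCOeval(coco_gt, coco_dt, iouType="bbox")
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()  # prints the familiar AP/AR table
```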
For inference, use:
python inference.py
Before running inference, configure `inference.py` with the correct paths and parameters as needed.
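Since onnx and onnxruntime are listed as dependencies, the model can presumably be exported to ONNX. The sketch below is a hedged example of running an exported model with ONNX Runtime; the model path, the 640x640 input size, and the single image-tensor input are assumptions to be adapted to what `inference.py` and the exported model actually expect.

```python
# Minimal ONNX Runtime inference sketch. The model path, 640x640 input size,
# and single image-tensor input are assumptions; adapt them to the exported
# model and the preprocessing used in inference.py.
import numpy as np
import onnxruntime as ort
from PIL import Image

session = ort.InferenceSession("dcea.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

image = Image.open("sample_sar_image.jpg").convert("RGB").resize((640, 640))
tensor = np.asarray(image, dtype=np.float32) / 255.0  # HWC in [0, 1]
tensor = tensor.transpose(2, 0, 1)[None, ...]         # NCHW, batch of 1

outputs = session.run(None, {input_name: tensor})
print([o.shape for o in outputs])  # e.g. boxes, scores, labels
```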
This project is released under the MIT License.
This implementation is based on the DETR and RT-DETR frameworks. We thank the original authors for their contributions.