Haodong Li<sup>1,2,3,§</sup>,
Wangguangdong Zheng<sup>1</sup>,
Jing He<sup>3</sup>,
Yuhao Liu<sup>1</sup>,
Xin Lin<sup>2</sup>,
Xin Yang<sup>3,4</sup>,
Ying-Cong Chen<sup>3,4,✉</sup>,
Chunchao Guo<sup>1,✉</sup>
<sup>1</sup>Tencent Hunyuan
<sup>2</sup>UC San Diego
<sup>3</sup>HKUST(GZ)
<sup>4</sup>HKUST
§Work primarily done during an internship at Tencent Hunyuan.
✉Corresponding author.
DA2 predicts dense, scale-invariant distance from a single 360° panorama in an end-to-end manner, with remarkable geometric fidelity and strong zero-shot generalization.
- 2025-10-10 The curated panoramic data is released on Hugging Face!
- 2025-10-10 The evaluation code and the testing data are released!
- 2025-10-04 The 🤗 Hugging Face Gradio demos (online and local) are released!
- 2025-10-04 The inference code and the model are released!
- 2025-10-01 Paper released on arXiv!
This installation was tested on: Ubuntu 20.04 LTS, Python 3.12, CUDA 12.2, NVIDIA GeForce RTX 3090.
- Clone the repository:
git clone https://github.com/EnVision-Research/DA-2.git
cd DA-2
- Install dependencies using conda:
conda create -n da-2 python=3.12 -y
conda activate da-2
pip install -e src
For macOS users: please remove `xformers==0.0.28.post2` (line 16) from `src/pyproject.toml` before running `pip install -e src`, as xFormers does not support macOS.
- Online demo: Hugging Face Space
- Local demo:
python app.py
We've pre-uploaded the cases that appear on the project page, so you can proceed directly to step 3.
- Place your images in a directory, e.g., `assets/demos`.
- (Optional) Place masks (e.g., sky masks for outdoor images) in another directory, e.g., `assets/masks`. The filenames under the two directories should match.
- Run the inference command:
sh infer.sh
- The visualized distance and normal maps will be saved to `output/infer/vis_all.png`; the projected 3D point clouds will be saved to `output/infer/3dpc`.
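The geometry behind those point clouds can be sketched as follows: each pixel of an equirectangular panorama maps to a longitude/latitude pair, which defines a unit ray direction on the sphere; scaling that ray by the predicted distance yields a 3D point. This is a minimal NumPy illustration under assumed conventions (longitude over width, latitude over height, y-up), not the repository's actual implementation:

```python
import numpy as np

def equirect_to_pointcloud(distance: np.ndarray) -> np.ndarray:
    """Back-project an (H, W) equirectangular distance map to (H*W, 3) points.

    Assumed convention (not necessarily the repo's): longitude spans
    [-pi, pi) across width, latitude spans [pi/2, -pi/2] down the height,
    y is up, and distance is the Euclidean length along each ray.
    """
    h, w = distance.shape
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi   # [-pi, pi)
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi   # [pi/2, -pi/2]
    lon, lat = np.meshgrid(lon, lat)                     # each (H, W)
    # Unit ray directions on the sphere.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    rays = np.stack([x, y, z], axis=-1)                  # (H, W, 3)
    return (rays * distance[..., None]).reshape(-1, 3)
```

With a constant distance map, every back-projected point lies on a sphere of that radius around the camera, which is a quick sanity check for the conventions above.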
- Download the evaluation datasets from Hugging Face:
cd [YOUR_DATA_DIR]
huggingface-cli login
hf download --repo-type dataset haodongli/DA-2-Evaluation --local-dir [YOUR_DATA_DIR]
- Unzip the downloaded datasets:
tar -zxvf [DATA_NAME].tar.gz
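If you downloaded several datasets, a small shell loop (our sketch, equivalent to running the command above once per archive) extracts them all:

```shell
# Extract every downloaded dataset archive in the current directory.
for f in *.tar.gz; do
  [ -e "$f" ] || continue   # skip when no archives match
  tar -zxvf "$f"
done
```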
- Set `datasets_dir` (line 20) in `configs/eval.json` to `[YOUR_DATA_DIR]`.
- Run the evaluation command:
sh eval.sh
- The results will be saved at `output/eval`.
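Because DA2's predictions are scale-invariant, evaluation protocols for such models typically align the prediction to ground truth with a global scale before computing error. The snippet below is our illustrative sketch of a median-scaled absolute relative error (AbsRel); it is not necessarily the exact protocol used by `eval.sh`:

```python
import numpy as np

def abs_rel(pred: np.ndarray, gt: np.ndarray, mask: np.ndarray) -> float:
    """Median-align a scale-invariant prediction to GT, then compute AbsRel."""
    p, g = pred[mask], gt[mask]
    scale = np.median(g) / np.median(p)   # global scale alignment
    p = p * scale
    return float(np.mean(np.abs(p - g) / g))
```

A prediction that is correct up to a global scale (e.g., exactly twice the ground truth everywhere) scores an AbsRel of zero under this alignment.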
If you find our work useful in your research, please consider citing our paper🌹:
@article{li2025depth,
title={DA$^{2}$: Depth Anything in Any Direction},
author={Li, Haodong and Zheng, Wangguangdong and He, Jing and Liu, Yuhao and Lin, Xin and Yang, Xin and Chen, Ying-Cong and Guo, Chunchao},
journal={arXiv preprint arXiv:2509.26618},
year={2025}
}

This implementation would be impossible without the awesome contributions of MoGe, UniK3D, Lotus, Marigold, DINOv2, Accelerate, Gradio, HuggingFace Hub, and PyTorch to the open-source community.
