This repo provides instructions for preparing your docker submission to participate in the "Closed-loop" track of the INTERPRET challenge for vehicle trajectory prediction in traffic. The challenge is organized by the team behind the INTERACTION dataset. The INTERPRET challenge is the first "Closed-loop" challenge that evaluates the performance of predictors by considering their impact in a "Prediction->Planning" pipeline, as commonly adopted in autonomous driving.
For more details about the INTERACTION dataset and the INTERPRET challenge (a NeurIPS 2020 competition), please refer to their websites.
We provide a simulator built on realistic scenarios selected from the INTERACTION dataset. In the simulator, one robot car is selected; its actions are driven by a default motion planner that consumes the outputs of the submitted predictor. The other agents in the simulator are responsive agents driven by internal algorithms.
Hence, all participants need to package their predictors as docker images and submit them to the server so that the closed-loop evaluation can be performed.
The predictors communicate with the simulator via gRPC. We provide the following two APIs. Please read them while preparing your docker image.
- Obtaining real-time information from the simulator: the predictors (clients) use `on_env` to store the historical trajectory (10 frames) sent by the simulator (server);
- Sending real-time predictions to the simulator: the simulator (server) uses `fetch_my_state` to get the predicted results (30 frames in the future) from the predictors (clients).
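To make the frame counts above concrete, here is a minimal, self-contained sketch of the predictor's job: consume 10 history frames and produce 30 future frames. The `(x, y)` frame layout and the constant-velocity extrapolation are illustrative assumptions, not the actual gRPC message format.

```python
# Illustrative only: frame layout and helper name are assumptions,
# not the actual simulator message types.

def predict_constant_velocity(history):
    """Extrapolate 30 future (x, y) frames from 10 history frames."""
    assert len(history) == 10
    (x0, y0), (x1, y1) = history[-2], history[-1]
    dx, dy = x1 - x0, y1 - y0  # per-frame displacement
    return [(x1 + dx * k, y1 + dy * k) for k in range(1, 31)]

history = [(float(t), 0.0) for t in range(10)]  # straight-line motion
future = predict_constant_velocity(history)
print(len(future), future[0], future[-1])  # 30 frames, from (10.0, 0.0) to (39.0, 0.0)
```

Whatever model you use internally, the contract with the simulator stays the same: 10 frames in, 30 frames out, at every simulation step.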
In `./predictor/*`:
- `predictor.py` is an abstract class; you need to implement all of its abstract methods in your own predictor.
- `echo_predictor.py` is a simple echo predictor: it returns the last frame of the historical trajectory.
- `lstm_predictor.py` is a PyTorch-based example predictor trained on the INTERACTION dataset; it achieves 0.3 MoN performance on the MA scenario. `lstm.pt` stores its model parameters. You can refer to these files when preparing your docker image.
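The echo-predictor idea can be sketched as follows. The real abstract methods live in `predictor.py`; the `Predictor` base class and the `predict` method name here are illustrative assumptions, so adapt them to the actual interface in the repo.

```python
# Sketch of the echo predictor described above. The base-class shape is an
# assumption; implement the real abstract methods from predictor.py instead.
from abc import ABC, abstractmethod

class Predictor(ABC):
    @abstractmethod
    def predict(self, history):
        """Map 10 history frames to 30 predicted future frames."""

class EchoPredictor(Predictor):
    def predict(self, history):
        # Echo: repeat the last observed frame for all 30 future steps.
        return [history[-1]] * 30

pred = EchoPredictor()
out = pred.predict([(0.0, 0.0)] * 9 + [(4.0, 2.0)])
print(len(out), out[0])  # 30 (4.0, 2.0)
```

An LSTM-based predictor such as `lstm_predictor.py` would plug into the same interface, replacing the echo logic with a learned model.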
## Requirements
- python 3.7
- docker
- grpc && protobuf
  - please follow the instructions in `simulator/readme.md`
  - `pip install grpcio`
  - `pip install protobuf`
- pytorch or tensorflow to support your predictor
## Prepare
Use protoc to generate the Python protocol bindings for communication. Each time `simulator.proto` is updated, you need to regenerate them:

```shell
protoc -I ../proto/ --grpc_out=. --plugin=protoc-gen-grpc=`which grpc_python_plugin` ../proto/simulator.proto
protoc -I ../proto/ --python_out=. ../proto/simulator.proto
```

If this fails, you can try:

```shell
python -m grpc_tools.protoc --proto_path=../proto/ --python_out=. --grpc_python_out=. ../proto/simulator.proto
```

You may get a warning when using `grpc_tools.protoc`, but it should still execute successfully and generate the `simulator_pb2.py` and `simulator_pb2_grpc.py` files (provided that you have `grpc_tools` installed).
## Run locally

```shell
# The simulator should be launched before the predictor, where
# -s specifies the simulator service address
# -p specifies the simulator service port
python3 main.py -s 127.0.0.1 -p 50051
```

## Docker
- You need to put the dependencies (pytorch, numpy, ...) in `requirement.txt`, then:

```shell
cd deploy
./make_image.sh my_predictor
```

## Metrics
Please read `Metrics_for_the_Closed_Loop_Track_of_the_INTERPRET_Challenge.pdf` in the directory.
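As a rough illustration, a requirements file for the provided LSTM example might contain entries like the following. The exact package set and versions are assumptions; list whatever your own predictor actually imports, and pin versions to match your image.

```
grpcio
protobuf
numpy
torch
```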
## Contact
If you have any questions, please submit an issue in this repo.