Is the MPC part not open source? How to run SUPER on a real drone? #5
Thank you for your open-source work. Is the MPC part not open source? If we want to reproduce it, how can we replace it?
Thank you for your question, and I apologize for not having had the time to write a guide on real-world deployment yet. However, I am happy to actively support you during your deployment, and I hope this issue will also help others who wish to replicate it.

Currently, SUPER publishes the committed trajectory in two ways. The first and easier method uses the position, velocity, acceleration, jerk, yaw, and yaw_dot commands published here:

// https://github.com/hku-mars/SUPER/blob/master/super_planner/include/ros_interface/ros1/fsm_ros1.hpp#L281
cmd_pub = nh_.advertise<quadrotor_msgs::PositionCommand>(cfg_.cmd_topic, 10);

These commands can be fed to the open-source PX4 high-level controller px4ctrl. Please note that the quadrotor_msgs definitions may not be identical and could fail the MD5 check, causing ROS communication issues. We haven't tested SUPER with px4ctrl, but copying PositionCommand.msg from mars_quadrotor_msgs into px4ctrl and recompiling the workspace should resolve this.

The second and more advanced method uses the published polynomial trajectory, which is suitable for an MPC controller, since it lets the controller evaluate multiple reference states along the trajectory for predictive control:

// https://github.com/hku-mars/SUPER/blob/master/super_planner/include/ros_interface/ros1/fsm_ros1.hpp#L282
mpc_cmd_pub_ = nh_.advertise<quadrotor_msgs::PolynomialTrajectory>(cfg_.mpc_cmd_topic, 10);

However, please note that the MPC module of SUPER is not yet open source; we plan to release it later this year. If you are interested in this method, you may need to implement the controller yourself in the meantime.

I will keep this issue open until I have time to write a comprehensive guide on real-world deployment. If you succeed and are willing to share your insights, I encourage you to document your experience here; I will reference this issue in the README to help anyone looking to deploy SUPER on a real drone. |
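For reference, below is a minimal, untested sketch of a node that consumes this PositionCommand output and republishes it as a MAVROS setpoint; it is only an illustration of the message flow, not part of SUPER or px4ctrl. The field names (position, velocity, acceleration, yaw, yaw_dot) assume the common quadrotor_msgs/PositionCommand definition and should be checked against mars_quadrotor_msgs, and the input topic name is a placeholder that must be remapped to the topic configured by cfg_.cmd_topic.

```cpp
// Hypothetical bridge (not part of SUPER): forward SUPER's PositionCommand to
// a MAVROS setpoint_raw/local target. Field names assume the common
// quadrotor_msgs/PositionCommand layout; verify against mars_quadrotor_msgs.
#include <ros/ros.h>
#include <quadrotor_msgs/PositionCommand.h>
#include <mavros_msgs/PositionTarget.h>

ros::Publisher setpoint_pub;

void cmdCallback(const quadrotor_msgs::PositionCommand::ConstPtr& cmd) {
  mavros_msgs::PositionTarget sp;
  sp.header.stamp = ros::Time::now();
  // Frame handling differs between MAVROS versions; double-check ENU/NED conventions.
  sp.coordinate_frame = mavros_msgs::PositionTarget::FRAME_LOCAL_NED;
  // 0 = use all fields; some PX4 versions require masking yaw or yaw_rate.
  sp.type_mask = 0;
  sp.position = cmd->position;
  sp.velocity = cmd->velocity;
  sp.acceleration_or_force = cmd->acceleration;
  sp.yaw = static_cast<float>(cmd->yaw);
  sp.yaw_rate = static_cast<float>(cmd->yaw_dot);
  setpoint_pub.publish(sp);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "super_cmd_bridge");
  ros::NodeHandle nh;
  setpoint_pub = nh.advertise<mavros_msgs::PositionTarget>(
      "/mavros/setpoint_raw/local", 10);
  // "/planning/pos_cmd" is a placeholder: remap it to the topic set by cfg_.cmd_topic.
  ros::Subscriber cmd_sub = nh.subscribe("/planning/pos_cmd", 10, cmdCallback);
  ros::spin();
  return 0;
}
```

How well the PX4 onboard controller tracks combined position/velocity/acceleration setpoints depends on the firmware configuration, so treat this bridge only as a starting point for bench testing; for flight, px4ctrl or an MPC that consumes the trajectory directly is the intended path. |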
How does your NUC get its 20 V power? Did you use a power distribution board or a DC-DC module in your hardware setup? |
Can the MPC code be open sourced? I'd like to learn about this part |
Yes, we use a customized in-house board designed in our lab, but we don't have an open-source plan yet. I am currently testing another commercial board and will provide updates as progress is made. |
We do have plans to open source the MPC component, but it will likely be later this year. Stay tuned! |
Thank you for your reply; I look forward to the open-source release of the code. |
In the real world, what is the point cloud frequency of the Mid360 set to? Is IMU propagation used to increase the frequency of the odometry? |
I also want to ask about this. As far as I know, the highest frequency of the Mid360 is 100 Hz, and the odometry topic is published at 100 Hz. How can the pose information used for mapping reach 200 Hz? Do you use the odometry topic of the NxtPX4 directly? |
In the real world, the Mid360's point cloud frequency is set to 30 Hz, with IMU propagation applied to reach 200 Hz odometry for the controller. For SUPER, a 10 Hz odometry frequency suffices for most applications. |
To reach 200 Hz for mapping, the higher-frequency IMU data is used to propagate the LIO-estimated state. 100 Hz is typically sufficient, but IMU propagation supports higher rates if needed. |
The frequency of the /Odometry topic is limited by the frequency of the point cloud. What pose information does your controller use? By the way, I want to know how to achieve hardware synchronization between the IMU on the NxtPX4 and the Mid360 LiDAR. |
For sensor calibration, consult our work at https://github.com/hku-mars/LiDAR_IMU_Init. In FAST-LIO, odometry frequency aligns with the LiDAR frequency. To enable higher-frequency odometry, modify the system to propagate the EKF-estimated state in the IMU callback function and publish the propagated state (not yet open-sourced but straightforward to implement). Our controller leverages this propagated state at 100 Hz, synchronized with the MPC operating at 100 Hz. |
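For anyone who wants to experiment before that part is released, below is a rough, simplified sketch of the idea as an external ROS node: it keeps the latest LIO odometry as an anchor and forward-integrates it with incoming IMU samples, publishing higher-rate odometry. This is not the SUPER/FAST-LIO implementation; doing it inside the LIO's IMU callback is preferable because the estimated IMU biases, gravity, and extrinsics are available there (they are assumed zero or fixed here), and all topic and frame names below are placeholders.

```cpp
// Simplified illustration (not the SUPER/FAST-LIO code): propagate the latest
// LIO pose/velocity with IMU measurements and publish higher-rate odometry.
// IMU biases are assumed zero and gravity fixed; inside the LIO these would
// come from the EKF state.
#include <ros/ros.h>
#include <sensor_msgs/Imu.h>
#include <nav_msgs/Odometry.h>
#include <Eigen/Dense>

struct State {
  Eigen::Vector3d p = Eigen::Vector3d::Zero();
  Eigen::Vector3d v = Eigen::Vector3d::Zero();
  Eigen::Quaterniond q = Eigen::Quaterniond::Identity();
  double t = -1.0;  // timestamp of the state; < 0 means not yet initialized
};

State state_;
const Eigen::Vector3d kGravity(0.0, 0.0, -9.81);
ros::Publisher odom_pub_;

// Each LIO update resets the propagation anchor.
void lioCallback(const nav_msgs::Odometry::ConstPtr& m) {
  state_.p = Eigen::Vector3d(m->pose.pose.position.x, m->pose.pose.position.y,
                             m->pose.pose.position.z);
  state_.v = Eigen::Vector3d(m->twist.twist.linear.x, m->twist.twist.linear.y,
                             m->twist.twist.linear.z);
  state_.q = Eigen::Quaterniond(m->pose.pose.orientation.w, m->pose.pose.orientation.x,
                                m->pose.pose.orientation.y, m->pose.pose.orientation.z);
  state_.t = m->header.stamp.toSec();
}

// Each IMU sample integrates the state forward and publishes it.
// Note: some Livox drivers report acceleration in units of g; scale if needed.
void imuCallback(const sensor_msgs::Imu::ConstPtr& imu) {
  if (state_.t < 0.0) return;
  const double t = imu->header.stamp.toSec();
  const double dt = t - state_.t;
  state_.t = t;
  if (dt <= 0.0 || dt > 0.1) return;  // skip out-of-order or stale samples

  const Eigen::Vector3d w(imu->angular_velocity.x, imu->angular_velocity.y,
                          imu->angular_velocity.z);
  const Eigen::Vector3d a(imu->linear_acceleration.x, imu->linear_acceleration.y,
                          imu->linear_acceleration.z);

  // Attitude: integrate angular rate (gyro bias assumed zero).
  const double ang = (w * dt).norm();
  if (ang > 1e-9) {
    state_.q = (state_.q * Eigen::Quaterniond(Eigen::AngleAxisd(ang, w.normalized()))).normalized();
  }

  // Translation: rotate specific force into the world frame and add gravity.
  const Eigen::Vector3d a_world = state_.q * a + kGravity;
  state_.p += state_.v * dt + 0.5 * a_world * dt * dt;
  state_.v += a_world * dt;

  nav_msgs::Odometry out;
  out.header.stamp = imu->header.stamp;
  out.header.frame_id = "camera_init";  // placeholder; match the LIO's frame
  out.pose.pose.position.x = state_.p.x();
  out.pose.pose.position.y = state_.p.y();
  out.pose.pose.position.z = state_.p.z();
  out.pose.pose.orientation.w = state_.q.w();
  out.pose.pose.orientation.x = state_.q.x();
  out.pose.pose.orientation.y = state_.q.y();
  out.pose.pose.orientation.z = state_.q.z();
  out.twist.twist.linear.x = state_.v.x();
  out.twist.twist.linear.y = state_.v.y();
  out.twist.twist.linear.z = state_.v.z();
  odom_pub_.publish(out);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "imu_propagation_sketch");
  ros::NodeHandle nh;
  odom_pub_ = nh.advertise<nav_msgs::Odometry>("/odom_propagated", 100);
  ros::Subscriber lio_sub = nh.subscribe("/Odometry", 10, lioCallback);
  ros::Subscriber imu_sub = nh.subscribe("/livox/imu", 200, imuCallback);
  ros::spin();
  return 0;
}
```

Inside FAST-LIO, the same propagation would be performed in the filter's IMU callback using its estimated biases and gravity, which is the approach described above. |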
Thanks. So you use software synchronization rather than hardware synchronization? Just like the documentation says: "if you want to run FAST-LIO on your own data but the LiDAR and IMU are not synchronized or calibrated before, you can directly run LI-Init (since it will switch into FAST-LIO after Initialization is finished)", which means there is no hardware synchronization.