The first goal of this project is automated landing of a UAV on a landing pad. To achieve this, a camera is used to detect the landing pattern; in other words, video is the input to the system and motor thrusts are its output.
The picture above is the software chain diagram for the system.
The pose_estimator node processes the video input and sends the estimated local position (x, y, z) and yaw to the roscopter node. roscopter subscribes to this data and encodes it into a MAVLink message for transmission over the radio. MAVLink message #32, LOCAL_POSITION_NED, is used to send the camera-estimated local position.
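As a minimal sketch of what roscopter's encoding step does, the payload of MAVLink message #32 can be packed by hand with the standard library. Note this is only the 28-byte payload, not a full framed MAVLink packet (no header or checksum), and the standard LOCAL_POSITION_NED message carries velocities rather than a yaw field, so yaw would have to travel separately; the field names below follow the MAVLink common message set.

```python
import struct
import time

# MAVLink message #32 (LOCAL_POSITION_NED) payload, little-endian:
# time_boot_ms (uint32), x, y, z, vx, vy, vz (float32 each), NED frame, metres.
LOCAL_POSITION_NED_ID = 32

def pack_local_position_ned(time_boot_ms, x, y, z, vx=0.0, vy=0.0, vz=0.0):
    """Pack the 28-byte LOCAL_POSITION_NED payload."""
    return struct.pack("<Iffffff", time_boot_ms, x, y, z, vx, vy, vz)

# Example: camera-estimated pose relative to the landing pad
# (1.2 m north, 0.4 m west, 3.0 m above, since z is down in NED).
payload = pack_local_position_ned(int(time.monotonic() * 1000) % 2**32,
                                  1.2, -0.4, -3.0)
```

In practice a library such as pymavlink would build and frame this message; the manual packing above just makes the wire layout explicit.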
In the PX4 autopilot, when the mavlink app receives this MAVLink message over the radio, it publishes the data inside PX4, and the position-control software chain subscribes to it, together with other sensor data, to control the position and attitude of the UAV.
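The idea behind that position-control chain can be sketched as a simple cascade: a proportional stage turns position error into a velocity setpoint, and a second stage turns horizontal velocity error into roll/pitch setpoints. This is a deliberately simplified, hypothetical illustration; the gains, limits, and function names here are not PX4's actual implementation.

```python
def position_to_velocity_sp(pos_sp, pos, kp=1.0, v_max=2.0):
    """P controller: NED position error -> velocity setpoint, clamped per axis."""
    return [max(-v_max, min(v_max, kp * (sp - p))) for sp, p in zip(pos_sp, pos)]

def velocity_to_attitude_sp(vel_sp, vel, kp=0.3, tilt_max=0.5):
    """P stage: horizontal velocity error -> pitch/roll setpoints in radians.

    Small-angle approximation: in NED, a forward (north) velocity demand
    maps to nose-down pitch, a rightward (east) demand maps to right roll.
    """
    pitch = max(-tilt_max, min(tilt_max, -kp * (vel_sp[0] - vel[0])))
    roll = max(-tilt_max, min(tilt_max, kp * (vel_sp[1] - vel[1])))
    return pitch, roll

# Example: UAV hovering 1 m south and 0.5 m east of the pad (target at origin).
vel_sp = position_to_velocity_sp([0.0, 0.0, 0.0], [-1.0, 0.5, 0.0])
pitch_sp, roll_sp = velocity_to_attitude_sp(vel_sp, [0.0, 0.0, 0.0])
```

The attitude setpoints would then feed an inner attitude/rate loop that produces the actual motor thrusts, closing the video-in, thrust-out loop described above.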