TY - GEN
T1 - Visually Augmented Guidance System Realization for Landing Rocket Model
AU - Nugroho, Larasmoyo
AU - Andiarti, Rika
AU - Akmeliawati, Rini
AU - Juliarsyah, Rizanto M.
AU - Khoifan, Abdul
AU - Prayogo, Yulian Surya
AU - Wijaya, Sastra Kusuma
N1 - Funding Information:
This work is primarily supported by the Ministry of Research, Technology and Higher Education through the Research Grant of the National Innovation System Consortium (INSINAS), contract Nos. 14/INS-1/PPK/E4/2021 and 70/INS-1/PPK/E4/2020. The PhD scholarship is granted by LIPI through the DBR Program 2019.
Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - UAV auto landing based on GPS data is considered imprecise, because GPS inaccuracy is inherently attributed to the low update rate of the satellite signal. In addition, GPS data does not account for the condition of the landing area and may allow landing in a hazardous place. The visually augmented precision landing (VAPL) guidance system proposed in this paper guides the landing rocket model (LRM) to conduct vertical-takeoff vertical-landing (VTVL) maneuvers by fusing data from vision sensors, GPS, and even ultrasonic sensors as navigational aids. This system uses the MAVlink protocol to communicate data between the sensor board and the flight controller. The difficulty of tracking objects using a visual sensor is caused by many factors, i.e., light intensity, color saturation, parallax, and aspect angle segmentation. This paper addresses those obstacles by refining the calibration procedure and replacing the area segmentation method with a proportional feedback method. Unwanted inverted flight responses are handled by an improved program algorithm. The results show that the proportional feedback approach can reduce the error of the area segmentation method significantly, from 100% error to an accuracy of up to 80%. In this study, the landing rocket model could land on the target, but data errors still exist due to mistakes produced by the vision sensor when focusing on the target while discriminating it from the larger block area. The miss distance between the LRM and the target is approximately 1 meter. Therefore, the proposed VAPL can be considered more precise than a solely GPS-guided landing system, which has an average miss distance of 3 meters.
AB - UAV auto landing based on GPS data is considered imprecise, because GPS inaccuracy is inherently attributed to the low update rate of the satellite signal. In addition, GPS data does not account for the condition of the landing area and may allow landing in a hazardous place. The visually augmented precision landing (VAPL) guidance system proposed in this paper guides the landing rocket model (LRM) to conduct vertical-takeoff vertical-landing (VTVL) maneuvers by fusing data from vision sensors, GPS, and even ultrasonic sensors as navigational aids. This system uses the MAVlink protocol to communicate data between the sensor board and the flight controller. The difficulty of tracking objects using a visual sensor is caused by many factors, i.e., light intensity, color saturation, parallax, and aspect angle segmentation. This paper addresses those obstacles by refining the calibration procedure and replacing the area segmentation method with a proportional feedback method. Unwanted inverted flight responses are handled by an improved program algorithm. The results show that the proportional feedback approach can reduce the error of the area segmentation method significantly, from 100% error to an accuracy of up to 80%. In this study, the landing rocket model could land on the target, but data errors still exist due to mistakes produced by the vision sensor when focusing on the target while discriminating it from the larger block area. The miss distance between the LRM and the target is approximately 1 meter. Therefore, the proposed VAPL can be considered more precise than a solely GPS-guided landing system, which has an average miss distance of 3 meters.
KW - flight controller
KW - landing rocket model
KW - MAVlink protocol
KW - precision landing guidance system
KW - visual imaging sensor
UR - http://www.scopus.com/inward/record.url?scp=85124809579&partnerID=8YFLogxK
U2 - 10.1109/ICARES53960.2021.9665182
DO - 10.1109/ICARES53960.2021.9665182
M3 - Conference contribution
AN - SCOPUS:85124809579
T3 - Proceedings of the 2021 IEEE International Conference on Aerospace Electronics and Remote Sensing Technology, ICARES 2021
BT - Proceedings of the 2021 IEEE International Conference on Aerospace Electronics and Remote Sensing Technology, ICARES 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 5th IEEE International Conference on Aerospace Electronics and Remote Sensing Technology, ICARES 2021
Y2 - 3 November 2021 through 4 November 2021
ER -