%0 Journal Article
%T Algorithm for Aerial Image Detection of Unmanned Aerial Vehicles Based on Improved YOLOv10n
%A 徐浩哲
%A 王元兴
%J Computer Science and Application
%P 230-238
%@ 2161-881X
%D 2025
%I Hans Publishing
%R 10.12677/csa.2025.151023
%X Target detection in unmanned aerial vehicle (UAV) aerial imagery holds significant application value in both civilian and military fields. To address the low detection accuracy and imprecise localization caused by small targets, large scale variations, and background interference in UAV imagery, an improved YOLOv10n algorithm for aerial image target detection is proposed. First, the C2f module is enhanced by fusing recursive gated convolution (gnConv) with C2f in a secondary innovation, yielding the C2f-GConv module, which adapts to object deformation and scale variation in aerial images. Meanwhile, the backbone network is replaced with EfficientFormerV2, which matches MobileNetV2 in model size and speed while achieving roughly 4% higher top-1 accuracy, significantly improving the model's efficiency and performance. Comparative and ablation experiments on the VisDrone2019 dataset show that mAP50 improves by 3.2% over the baseline model and the detection speed reaches 90 FPS, satisfying real-time detection requirements. In comparative experiments against mainstream algorithms, the proposed algorithm outperforms current mainstream methods.
%K UAV Imagery
%K YOLOv10n
%K Recursive Gated Convolution
%K EfficientFormerV2
%U http://www.hanspub.org/journal/PaperInformation.aspx?PaperID=106443