%0 Journal Article
%T Weed Recognition Model Based on Embedded Deep Neural Network
%A 曾翊
%A 张积烨
%A 余箫
%A 余吉雅
%A 张艳超
%J Modeling and Simulation
%P 4949-4961
%@ 2324-870X
%D 2023
%I Hans Publishing
%R 10.12677/MOS.2023.126449
%X Weeds have strong survival ability and low environmental requirements, and they are widely and randomly distributed; they inevitably compete with crops for water, nutrients, and other resources, reducing crop quality and yield to some degree. To address this problem, this paper designs a weed recognition system based on an embedded deep neural network. The system uses embedded devices to view the distribution of weeds in farmland and to detect their specific locations and species, so that weeds can be removed in a targeted manner, improving crop quality and yield, saving labor costs, and protecting the ecological environment. The system trains models with the YOLO family of algorithms and, through model conversion, obtains a suitable model to deploy on embedded devices for object detection. Specifically, a sufficient number of weed image samples are first selected and manually labeled to obtain the corresponding label files and thus the dataset. The dataset is then used for model training; the algorithms used in this paper are YOLOv4, YOLOv4-tiny, and YOLOx, yielding PTH models. The PTH model is loaded on the PC side and converted into a TRT model, which is deployed on the embedded Jetson Nano. The experimental results show that, when model accuracy is evaluated with the F1 score and mAP (mean Average Precision), YOLOx achieves the highest accuracy, above 80.48%. When detection correctness is evaluated with counting accuracy, all three algorithms exceed 94% on this metric. When speed is evaluated by per-image processing time, YOLOv4-tiny, as a lightweight network, needs only 0.0068 s on the PC and only 0.0453 s on the embedded platform, which amounts to real-time detection.
%K Convolutional Neural Network
%K Object Detection Algorithm
%K YOLO
%K Embedded Device
%K Weed Identification
%U http://www.hanspub.org/journal/PaperInformation.aspx?PaperID=73723
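For reference, below is a minimal sketch of the model-conversion step the abstract describes: a trained PyTorch detector (PTH model) is exported on the PC side and turned into a TensorRT engine for the embedded Jetson Nano. The tool chain shown here (torch.onnx export followed by the TensorRT Python builder API), the 416x416 input size, and the file names weed_yolo.onnx / weed_yolo.trt are assumptions for illustration only; the paper does not state how its conversion was performed.

import torch
import tensorrt as trt

def export_onnx(model: torch.nn.Module, onnx_path: str = "weed_yolo.onnx") -> str:
    """Trace a trained detector (loaded from its .pth checkpoint) into an ONNX graph."""
    model.eval()
    dummy = torch.zeros(1, 3, 416, 416)  # one RGB image; the input size is an assumption
    torch.onnx.export(
        model, dummy, onnx_path,
        opset_version=11,
        input_names=["images"], output_names=["preds"],
    )
    return onnx_path

def build_trt_engine(onnx_path: str, engine_path: str = "weed_yolo.trt") -> None:
    """Parse the ONNX graph and serialize a TensorRT engine for deployment on the Nano."""
    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))
    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.FP16)  # FP16 suits the Nano's GPU; also an assumption
    serialized = builder.build_serialized_network(network, config)
    if serialized is None:
        raise RuntimeError("TensorRT engine build failed")
    with open(engine_path, "wb") as f:
        f.write(serialized)

On the device, the serialized engine would then be deserialized with trt.Runtime and executed once per frame, which is consistent with the per-image timing figures reported in the abstract.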