[Nezha Developer Kit Trial] Intelligent Eggplant Picking Robot Based on Nezha Developer Kit
ABSTRACT
This study designs and implements an intelligent eggplant-picking robot aimed at improving the efficiency and quality of eggplant harvesting. The system uses the Nezha Developer Kit as the core control board and accelerates the YOLOv5 object detection model with OpenVINO to achieve efficient recognition and localization of eggplants. The system includes the design and kinematic analysis of the robotic arm and end-effector, and the implementation of a clamping and cutting integrated end-effector to ensure precise and low-damage eggplant picking. Additionally, the hardware and software architecture of the robot control system is designed to enable fast inference and real-time control. The system's recognition accuracy and picking efficiency were validated, demonstrating its potential application value in smart agriculture.
Key words: Smart Agriculture, Eggplant-Picking Robot, YOLOv5, Nezha Developer Kit, OpenVINO, Kinematic Analysis, End-Effector
Chapter 1 Research Content
In the production of eggplants, harvesting is the most labor-intensive and time-consuming stage, accounting for approximately 50-70% of the total workload. To ensure product quality, timely harvesting is essential, making it one of the most arduous tasks. Therefore, the research and development of an intelligent eggplant harvesting robot can significantly enhance harvesting efficiency and quality, alleviate the labor burden on farmers, and demonstrate the advantages of ‘Ubiquitous AI’.
This research aims to develop an intelligent eggplant harvesting robot to enhance the efficiency and quality of eggplant harvesting and reduce the labor burden on farmers. The research content includes the following aspects, as shown in Figure 1.1:
(1) End Effector Design: Design an end effector capable of efficiently and accurately gripping and cutting eggplants, minimizing damage to the fruit during the harvesting process.
(2) Robotic Arm Design: Design a four-degree-of-freedom robotic arm that can flexibly move and operate in a greenhouse environment to complete the eggplant harvesting tasks.
(3) Vision Algorithm: Utilize the YOLOv5 model combined with OpenVINO accelerated inference technology to achieve efficient recognition and localization of eggplants, ensuring harvesting accuracy.
(4) Control System Design: Develop a control system based on the STM32 microcontroller to coordinate the operation of the robotic arm and end effector, ensuring system stability and reliability (an illustrative host-to-MCU communication sketch is given after Figure 1.1).
(5) Testing and Analysis: Conduct multiple experiments in a simulated greenhouse environment to evaluate the system's harvesting success rate, efficiency, stability, and adaptability, followed by data analysis and results discussion.
Fig1.1 Research content
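The article does not detail the interface between the vision board and the STM32 motion controller, so the following Python sketch only illustrates one plausible arrangement, in which the Nezha Developer Kit sends joint and gripper commands to the STM32 over a UART link. The port name, baud rate, and frame format are assumptions for illustration.

```python
# Illustrative sketch: the Nezha board sends picking commands to the STM32
# motion controller over a UART link. The port name, baud rate, and frame
# format below are assumptions; the article does not specify the protocol.
import struct
import serial  # pyserial

def send_joint_command(port: serial.Serial, joint_angles_deg, gripper_closed: bool):
    """Pack four joint angles (degrees) and a gripper flag into a simple
    length-prefixed binary frame and write it to the MCU."""
    payload = struct.pack("<4fB", *joint_angles_deg, int(gripper_closed))
    frame = b"\xAA\x55" + bytes([len(payload)]) + payload  # header + length + payload
    port.write(frame)

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as mcu:  # assumed port/baud
        # Move the 4-DOF arm to an example pose and close the gripper.
        send_joint_command(mcu, [30.0, 45.0, -20.0, 10.0], gripper_closed=True)
```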
Chapter 2 System Design and Implementation
The overall structure of the robot is shown in Figure 2.1, including the mobile chassis, harvesting basket, robotic arm, end effector, camera, and other devices. The following sections will introduce several key components in detail.
Fig2.1 Overall structure of the robot
(1) End effector design
The structure of the designed end effector is shown in Figure 2.2. It mainly consists of a clamping device, a cutting device, a blade, an 8 mm camera, a laser ranging module, and connecting and fixing pieces.
1. Clamping device 2. Cutting device 3. Blade 4. 8 mm camera 5. Laser ranging module 6. Connectors and fixtures
Fig2.2 End effector structure
The picking process of the end effector designed in this paper is shown in Figure 2.3 (a). First, the positions of the eggplant and stem are identified. Then, the robotic arm grips the eggplant and places the stem in the cutting area. Next, the eggplant is moved to separate the stem from the main stalk, as shown in Figure 2.3(b), and finally, the stem is cut.
(a) Eggplant picking process
(b) The state of the eggplant
Fig2.3 Eggplant picking plan
(2) Robot arm design
The design of the robotic arm is shown in Figure 2.4. The four-degree-of-freedom arm provides base rotation, arm rotation, elbow rotation, and an end-effector clamping action, so the position and orientation of the end effector can be flexibly adjusted to adapt to the diverse growth positions and distribution of eggplants.
Fig2.4 Robotic arm structure
The team used Matlab Robotics Toolbox to construct a kinematic model of the robotic arm based on D-H parameters and conducted kinematic simulations to verify the correctness of the established kinematic equations.
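The D-H parameters of the arm are not listed in the article, but the same forward-kinematics check can be reproduced outside MATLAB with a few lines of Python; the D-H table below uses placeholder values purely for illustration.

```python
# Forward kinematics of a 4-DOF arm from standard D-H parameters.
# The D-H values below are placeholders; the actual link lengths and
# offsets of the picking arm are not given in the article.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard D-H homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_table):
    """Multiply the per-joint transforms to obtain the end-effector pose."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Placeholder D-H table: (d, a, alpha) per joint, lengths in metres.
DH_TABLE = [(0.10, 0.00, np.pi / 2),
            (0.00, 0.25, 0.0),
            (0.00, 0.20, 0.0),
            (0.05, 0.00, 0.0)]

pose = forward_kinematics(np.deg2rad([30, 45, -30, 0]), DH_TABLE)
print("End-effector position (m):", np.round(pose[:3, 3], 3))
```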
(3) Visual algorithm design
This article chooses YOLOv5 (You Only Look Once version 5) as the detection and recognition model for eggplants. To train the model, the team captured 3000 eggplant images at a local greenhouse base as the object detection dataset. The dataset was divided into training and test sets in a 7:3 ratio; the quantity distribution is shown in Table 2.1.
Tab2.1 Object detection dataset
Eggplant images    | Training dataset | Test dataset | Total
Single eggplant    | 374              | 144          | 518
Multiple eggplants | 1726             | 756          | 2482
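For reference, a 7:3 random split such as the one summarized in Table 2.1 can be generated with a short script; the directory names below are assumptions rather than the team's actual dataset layout.

```python
# Randomly split the collected eggplant images into training and test sets
# in a 7:3 ratio. The folder names are illustrative assumptions.
import random
import shutil
from pathlib import Path

random.seed(42)
images = sorted(Path("eggplant_images").glob("*.jpg"))
random.shuffle(images)

split = int(len(images) * 0.7)
for subset, files in (("train", images[:split]), ("test", images[split:])):
    out_dir = Path("dataset") / subset
    out_dir.mkdir(parents=True, exist_ok=True)
    for img in files:
        shutil.copy(img, out_dir / img.name)

print(f"train: {split}, test: {len(images) - split}")
```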
The team used the LabelImg annotation tool to label the eggplant fruit and stem in the training dataset. The trained YOLO model was converted into OpenVINO IR format (XML and BIN files), and OpenVINO was used for inference acceleration. The test results are shown in Figure 2.5.
Fig2.5 Identification results of eggplant fruit and stem
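As a minimal sketch of the accelerated inference step, the converted IR model can be loaded and run with the OpenVINO Runtime as shown below; the file names, the 640×640 input size, and the CPU device choice are assumptions, and YOLOv5 post-processing (confidence filtering and NMS) is omitted.

```python
# Run the exported YOLOv5 IR model (XML + BIN) with OpenVINO Runtime.
# File names and input size are assumptions for illustration.
import cv2
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("best.xml")          # matching BIN weights are loaded implicitly
compiled = core.compile_model(model, "CPU")  # device choice is an assumption
output_layer = compiled.output(0)

frame = cv2.imread("eggplant.jpg")
blob = cv2.resize(frame, (640, 640)).astype(np.float32) / 255.0
blob = blob.transpose(2, 0, 1)[np.newaxis, ...]  # HWC -> NCHW

predictions = compiled([blob])[output_layer]
# predictions holds raw YOLOv5 detections; confidence filtering and NMS are
# still needed before drawing the fruit/stem boxes shown in Figure 2.5.
print("raw output shape:", predictions.shape)
```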
Chapter 3 Testing and Analysis
In order to verify the performance of the robot system in practical applications, the team built a picking platform that simulates a greenhouse environment. The robot moves from a fixed starting point to multiple preset eggplant positions for picking. Each pick consists of moving to the eggplant position, performing the picking action, and placing the eggplant into the basket. The picking scene is shown in Figure 3.1.
(a) Working status
(b) Pick up the eggplant
Fig3.1 Laboratory scenario testing
In the tests, we used the Jetson Nano and the Nezha Developer Kit available in our laboratory as the main controllers of the eggplant robot for a comparative experiment. The two boards are similar in price but perform differently in this application, as shown in Table 3.1.
Tab3.1 Recognition performance of different main control devices
Main Control Device | Acceleration Technology | FPS with Eggplant | FPS without Eggplant
Jetson Nano         | None                    | 4                 | 4
Jetson Nano         | TensorRT                | 11~12             | 18
Nezha Developer Kit | None                    | 24                | 24
Nezha Developer Kit | OpenVINO                | 29~30             | 29~30
The comparative experiment shows that the Nezha Developer Kit outperforms the Jetson Nano in frame rate both with and without acceleration technology; with OpenVINO acceleration the frame rate reaches 29~30 FPS. Therefore, at a similar price, Intel's Nezha Developer Kit delivers better inference performance than the Jetson Nano.
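The frame rates in Table 3.1 are end-to-end detection rates on live video; a measurement of this kind can be reproduced with a simple timing loop like the one below, where the camera index, frame count, and the detection callable are placeholders.

```python
# Measure end-to-end detection FPS over a fixed number of camera frames.
# The camera index, frame count, and the detection callable are assumptions.
import time
import cv2

def measure_fps(run_detection, num_frames=200, camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    frames = 0
    start = time.perf_counter()
    while frames < num_frames:
        ok, frame = cap.read()
        if not ok:
            break
        run_detection(frame)  # run the detection pipeline on this frame
        frames += 1
    cap.release()
    elapsed = time.perf_counter() - start
    return frames / elapsed if elapsed > 0 else 0.0

# Usage: pass a callable that wraps the OpenVINO inference sketch above, e.g.
# print("FPS:", measure_fps(run_detection))
```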
We also ran a large number of picking experiments: the eggplant-picking robot built on the Intel Nezha Developer Kit takes an average of 9.4 s per pick, with a maximum of 19.6 s.
In the future, we will continue to optimize the design of robotic arms and end effectors, enhance the system's picking ability in complex environments, and explore the possibility of multi-crop picking to further promote the intelligence and automation of agricultural production.