College - Author 1
College of Engineering
Department - Author 1
Electrical Engineering Department
Degree Name - Author 1
BS in Electrical Engineering
College - Author 2
College of Engineering
Department - Author 2
Electrical Engineering Department
Degree Name - Author 2
BS in Electrical Engineering
College - Author 3
College of Engineering
Department - Author 3
Electrical Engineering Department
Degree Name - Author 3
BS in Electrical Engineering
Date
6-2025
Primary Advisor
Siavash Farzan, College of Engineering, Electrical Engineering Department
Abstract/Summary
Increasing labor costs and agricultural demands have created a need for automated apple harvesting. However, automated apple harvesting via robotic manipulation must overcome several challenges to be effective and efficient. First, the system must perform at least as well as manual human labor. Second, it must safely and accurately navigate an apple orchard autonomously. Third, the robotic manipulator must securely grasp and pick apples without causing damage. To address these challenges, this project uses a Husky UGV equipped with a 2D LiDAR sensor, an RGB-D camera, an IMU, an OpenManipulator-X robot arm, and a soft robotic gripper with tactile sensing. Data from the LiDAR sensor and wheel odometry are processed with SLAM algorithms, enabling the robot to map its environment in real time. Localization algorithms such as Adaptive Monte Carlo Localization (AMCL) then fuse the LiDAR and odometry data to precisely position the robot within the map. Using this localization, the robot navigates to each apple tree with the Nav2 stack. Meanwhile, RGB-D camera data, combined with the YOLOv3 computer vision algorithm, allows the system to detect and locate apples. Finally, the soft robotic gripper measures each apple grasp using force-sensitive resistors read through analog-to-digital conversion. A convolutional neural network, trained on a custom-collected dataset, classifies each grasp so that apples are harvested only under suitable conditions. The complete system is powered by an onboard battery pack with an approximate runtime of 4 hours. This project spans many subfields of robotics, demonstrating the successful application of autonomous navigation and robotic manipulation alongside signal processing, sensor integration, and computer vision to enable efficient apple harvesting.
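As an illustrative sketch of the tactile-sensing step described in the abstract, the snippet below shows how raw analog-to-digital converter (ADC) counts from a force-sensitive resistor (FSR) in a voltage divider might be converted to an estimated sensor resistance and thresholded as a crude grasp check. All constants (reference voltage, bit depth, divider resistor, threshold) are assumptions for illustration, not values from the project, and this simple threshold stands in for the trained convolutional neural network the actual system uses.

```python
# Hypothetical sketch: read an FSR through a voltage divider and ADC,
# then apply a simple resistance threshold as a stand-in grasp check.
# All constants below are illustrative assumptions, not project values.

V_REF = 3.3         # ADC reference voltage (assumed)
ADC_MAX = 1023      # 10-bit ADC full-scale count (assumed)
R_FIXED = 10_000.0  # fixed divider resistor, ohms (assumed)

def adc_to_voltage(count: int) -> float:
    """Map a raw ADC count to the measured divider voltage."""
    return V_REF * count / ADC_MAX

def fsr_resistance(v_out: float) -> float:
    """Solve the divider (FSR on the high side) for the FSR resistance.
    FSR resistance drops as applied force increases."""
    if v_out <= 0.0:
        return float("inf")  # no pressure: FSR is effectively open
    return R_FIXED * (V_REF - v_out) / v_out

def grasp_is_secure(counts: list[int], r_thresh: float = 3_000.0) -> bool:
    """Declare a grasp 'secure' only if every FSR channel shows enough
    force, i.e. its resistance has fallen below the threshold."""
    return all(fsr_resistance(adc_to_voltage(c)) < r_thresh for c in counts)
```

In the project itself, these per-sensor readings would instead form the input features to the grasp-classification network rather than a fixed threshold.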
URL: https://digitalcommons.calpoly.edu/eesp/694