Robotics + YOLO + ROS2 + Agriculture
An autonomous agricultural robot that detects fireblight disease in apple trees using computer vision, classifies infection severity, and marks affected branches so farmers can prune them before the disease spreads. Built at CMU's Kantor Lab for the Farm Robotics Challenge 2026. View source →
01 The Robot
02 My Contribution
Assisted in training and refining the YOLO object detection model for fireblight identification. The model processes camera frames in real time, detecting infected branches and classifying severity levels (early, moderate, severe). Bounding boxes with confidence scores are published on ROS2 topics for downstream nodes to consume.
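A minimal sketch of the filtering step between raw YOLO outputs and what gets published (the `Detection` structure, class ordering, and the 0.5 threshold are assumptions for illustration; the real node wraps these in ROS2 messages):

```python
from dataclasses import dataclass

# Severity labels assumed to match the model's class ids 0..2.
SEVERITIES = ("early", "moderate", "severe")
CONF_THRESHOLD = 0.5  # assumed minimum confidence to publish

@dataclass
class Detection:
    """One bounding box destined for the detections topic."""
    box: tuple          # (x1, y1, x2, y2) in pixel coordinates
    severity: str       # one of SEVERITIES
    confidence: float

def filter_detections(raw):
    """Keep raw YOLO outputs (box, class_id, conf) above the
    confidence threshold and map class ids to severity labels."""
    kept = []
    for box, class_id, conf in raw:
        if conf >= CONF_THRESHOLD and 0 <= class_id < len(SEVERITIES):
            kept.append(Detection(box, SEVERITIES[class_id], conf))
    return kept
```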
Built the actuation pipeline for the robotic arm that delivers the spray marker. When the behaviour tree determines that a branch needs marking, it sends a serial signal to the spray node, which computes the arm trajectory to the target location and triggers the spray mechanism. The arm then returns to its home position to complete each marking cycle.
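The marking cycle can be sketched as a small state machine (state names and event methods are illustrative; the real node reads the serial signal and drives the UR arm through its own driver):

```python
# Assumed states of one marking cycle: idle at home, moving to the
# branch, then spraying before returning home.
HOME, MOVING, SPRAYING = "HOME", "MOVING", "SPRAYING"

class SprayCycle:
    def __init__(self):
        self.state = HOME
        self.target = None

    def on_serial_signal(self, target_xyz):
        """Behaviour tree decided to mark: start moving to the branch."""
        if self.state == HOME:
            self.target = target_xyz
            self.state = MOVING

    def on_arm_at_target(self):
        """Trajectory finished: trigger the spray mechanism."""
        if self.state == MOVING:
            self.state = SPRAYING

    def on_spray_done(self):
        """Marker applied: return the arm to its home position."""
        if self.state == SPRAYING:
            self.target = None
            self.state = HOME
```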
03 Technology Stack
Real-time fireblight detection on camera frames. Classifies infection severity and outputs bounding boxes with confidence scores as ROS2 messages.
The backbone of the entire system. Nodes for vision, navigation, manipulation, spray control, and the behaviour tree all communicate via ROS2 topics and services.
Orchestrates the robot's decision-making. Sequences detection, severity assessment, arm positioning, and spray actuation. Handles fallbacks and error recovery.
The mobile base. An agricultural robot platform with ruggedized wheels, onboard compute, and mounting points for sensors and the robotic arm.
UR-series robotic arm with a custom spray end-effector. Receives target coordinates from the vision pipeline and executes marking trajectories via serial commands.
Autonomous row traversal through orchard rows. Uses GPS waypoints and local obstacle avoidance to navigate between trees without human intervention.
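The behaviour-tree orchestration described above can be sketched with a minimal sequence node (the node classes and leaf functions are simplified stand-ins, not the project's actual implementation):

```python
SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class Sequence:
    """Ticks children in order; fails fast if any child fails."""
    def __init__(self, *children):
        self.children = children
    def tick(self, ctx):
        for child in self.children:
            if child.tick(ctx) == FAILURE:
                return FAILURE
        return SUCCESS

class Leaf:
    """Wraps a function of the shared context into a tree node."""
    def __init__(self, fn):
        self.fn = fn
    def tick(self, ctx):
        return SUCCESS if self.fn(ctx) else FAILURE

# Placeholder leaves for detection, severity assessment, arm
# positioning, and spray actuation; thresholds are assumed.
def detect(ctx):        return bool(ctx.get("detections"))
def assess(ctx):        return any(d["conf"] >= 0.5 for d in ctx["detections"])
def position_arm(ctx):  ctx["armed"] = True; return True
def spray(ctx):         return ctx.get("armed", False)

mark_branch = Sequence(Leaf(detect), Leaf(assess), Leaf(position_arm), Leaf(spray))
```

A failing leaf short-circuits the sequence, which is where fallback and error-recovery branches would attach in a fuller tree.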
04 Detection Pipeline
The farm-ng platform autonomously traverses orchard rows using GPS waypoints and local path planning.
Onboard cameras capture frames of each tree. The YOLO model runs inference in real time, detecting fireblight symptoms.
Detected infections are classified by severity. The behaviour tree evaluates whether the branch needs marking based on confidence and severity thresholds.
The robotic arm receives coordinates via serial signal, positions the spray nozzle at the infected branch, and applies a visible marker for the farmer.
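Taken together, the four steps above amount to a perception–decision–action loop per tree; a sketch with stub callables standing in for the real YOLO, behaviour-tree, and arm nodes:

```python
def run_tree(camera_frames, detect, should_mark, mark):
    """Illustrative loop over one tree's camera frames:
    detect symptoms, apply the confidence/severity thresholds,
    then mark each qualifying branch.  Returns the mark count."""
    marked = 0
    for frame in camera_frames:
        for detection in detect(frame):
            if should_mark(detection):
                mark(detection)
                marked += 1
    return marked
```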
05 Skills