Robotics + YOLO + ROS2 + Agriculture

Fireblight
Detection Robot

An autonomous agricultural robot that detects fireblight disease in apple trees using computer vision, classifies infection severity, and marks affected branches so farmers can prune them before the disease spreads. Built at CMU's Kantor Lab for the Farm Robotics Challenge 2026. View source →


Farm-ng platform with robotic arm and vision stack

▶ Demo
Full System Demo: Robot with arm, vision, and spray system operating on a test tree in the lab
▶ Video
Robot Platform: Farm-ng base with mounted mast, sensors, and spray arm
UR robotic arm
UR Robotic Arm: The spray arm that activates when the behaviour tree receives a serial signal

YOLO detection model and the spray arm integration

Computer Vision

Assisted in training and refining the YOLO object detection model for fireblight identification. The model processes camera frames in real time, detecting infected branches and classifying severity levels (early, moderate, severe). Bounding boxes with confidence scores are published as ROS2 topics for downstream nodes to consume.
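As a sketch of what one published detection might carry, the record below is illustrative: the `FireblightDetection` type, its field names, and the class-to-severity mapping are assumptions for this example, not the project's actual ROS2 message definition.

```python
from dataclasses import dataclass

# Assumed mapping from YOLO class index to the three severity levels above.
SEVERITY_CLASSES = ("early", "moderate", "severe")

@dataclass
class FireblightDetection:
    """One detected infection, roughly as it might appear on a ROS2 topic."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    confidence: float
    severity: str

def to_detection(box, confidence, class_id):
    """Convert one raw YOLO output row into a detection record.

    `box` is (x_min, y_min, x_max, y_max) in pixel coordinates;
    `class_id` indexes into SEVERITY_CLASSES.
    """
    return FireblightDetection(*box, confidence=confidence,
                               severity=SEVERITY_CLASSES[class_id])

det = to_detection((120.0, 80.0, 260.0, 310.0), 0.87, 2)
```

A downstream node would then read the `severity` and `confidence` fields off the topic when deciding whether to act.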

Spray Arm Actuation

Built the actuation pipeline for the robotic arm that delivers the spray marker. When the behaviour tree determines a branch needs marking, it sends a serial signal to the spray node. The ROS2 node receives this signal, computes the arm trajectory to the target location, and triggers the spray mechanism. The arm returns to its home position after each marking cycle.
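The marking cycle described above can be sketched as follows. The `MARK x y z` serial line format and the arm interface (`move_to`, `spray`, `home`) are hypothetical stand-ins for the real protocol and driver:

```python
def run_marking_cycle(signal_line, arm):
    """One marking cycle, driven by a line read off the serial port.

    `signal_line` follows an assumed "MARK x y z" protocol;
    `arm` is any object exposing move_to(), spray(), and home().
    """
    parts = signal_line.strip().split()
    if not parts or parts[0] != "MARK":
        return False              # ignore anything that is not a mark command
    target = tuple(float(v) for v in parts[1:4])
    arm.move_to(target)           # trajectory to the infected branch
    arm.spray()                   # trigger the spray mechanism
    arm.home()                    # return to home position after each cycle
    return True

class LoggingArm:
    """Stand-in for the real arm driver, recording each call for inspection."""
    def __init__(self):
        self.calls = []
    def move_to(self, target):
        self.calls.append(("move_to", target))
    def spray(self):
        self.calls.append(("spray",))
    def home(self):
        self.calls.append(("home",))

arm = LoggingArm()
run_marking_cycle("MARK 0.42 0.10 1.35", arm)
```

Keeping the cycle as one function makes the move-spray-home ordering explicit and easy to test against a mock arm before running on hardware.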

Robot arm reaching toward tree

What powers the system

👁

YOLOv8 Detection

Real-time fireblight detection on camera frames. Classifies infection severity and outputs bounding boxes with confidence scores as ROS2 messages.

YOLOv8 · PyTorch · Real-time
🤖

ROS2

The backbone of the entire system. Nodes for vision, navigation, manipulation, spray control, and the behaviour tree all communicate via ROS2 topics and services.

ROS2 Humble · Python · C++
🌳

Behaviour Tree

Orchestrates the robot's decision-making. Sequences detection, severity assessment, arm positioning, and spray actuation. Handles fallbacks and error recovery.

BehaviorTree.CPP · XML
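A minimal BehaviorTree.CPP-style XML sketch of that sequence; the node names and the confidence parameter are invented for illustration and are not the project's actual tree:

```xml
<root BTCPP_format="4">
  <BehaviorTree ID="MarkInfectedBranch">
    <Sequence>
      <DetectFireblight/>
      <AssessSeverity min_confidence="0.6"/>
      <Fallback>
        <Sequence>
          <PositionArm/>
          <TriggerSpray/>
        </Sequence>
        <ReturnHome/>
      </Fallback>
    </Sequence>
  </BehaviorTree>
</root>
```

The Fallback node gives the error recovery mentioned above: if positioning or spraying fails, the arm still returns home instead of aborting mid-cycle.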

Farm-ng Platform

The mobile base. An agricultural robot platform with ruggedized wheels, onboard compute, and mounting points for sensors and the robotic arm.

farm-ng · GPS · Navigation
💪

Robotic Arm + Spray

UR-series robotic arm with a custom spray end-effector. Receives target coordinates from the vision pipeline and executes marking trajectories via serial commands.

UR Arm · Serial Comms · MoveIt
🗺

Navigation Stack

Autonomous row traversal through orchard rows. Uses GPS waypoints and local obstacle avoidance to navigate between trees without human intervention.
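A rough sketch of GPS-waypoint row traversal: advance to the first waypoint the robot has not yet reached. The haversine helper and the one-metre arrival radius are assumptions for this sketch, not the Nav2 implementation:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def next_waypoint(position, waypoints, reached_radius_m=1.0):
    """Return the first waypoint not yet within the arrival radius."""
    for wp in waypoints:
        if haversine_m(*position, *wp) > reached_radius_m:
            return wp
    return None  # end of the row
```

Local obstacle avoidance would then steer around obstructions while tracking whichever waypoint this function returns.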

Nav2 · Waypoints · SLAM

From tree scan to spray marker

01

Navigate

The farm-ng platform autonomously traverses orchard rows using GPS waypoints and local path planning.

02

Scan

Onboard cameras capture frames of each tree. The YOLO model runs inference in real time, detecting fireblight symptoms.

03

Classify

Detected infections are classified by severity. The behaviour tree evaluates whether the branch needs marking based on confidence and severity thresholds.
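The gating step might look like the following sketch, where the per-severity confidence thresholds are invented for illustration (a more severe infection is worth marking at lower confidence):

```python
# Hypothetical thresholds; the real values would be tuned on orchard data.
MIN_CONFIDENCE = {"early": 0.80, "moderate": 0.65, "severe": 0.50}

def needs_marking(severity, confidence):
    """Decide whether a detection clears the marking threshold."""
    return confidence >= MIN_CONFIDENCE.get(severity, 1.0)

needs_marking("severe", 0.55)   # True
needs_marking("early", 0.55)    # False
```

Unknown severity labels fall back to a threshold of 1.0, so only explicitly classified detections can trigger the arm.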

04

Mark

The robotic arm receives coordinates via serial signal, positions the spray nozzle at the infected branch, and applies a visible marker for the farmer.

What I worked with

ROS2 · YOLOv8 · Python · Computer Vision · Behaviour Trees · Serial Communication · Robotic Arm Control · PyTorch · Linux · Git · Embedded Systems · Team Collaboration
View on GitHub