Overview
Precision Drone Spray is a two-semester capstone project that built an autonomous weed detection and targeted spraying system on top of an agriculture drone platform. The goal was to give farmers a tool that can map a field, detect weeds using computer vision, and spray only the areas that need treatment instead of blanketing entire fields with herbicide.
The system ties together a drone that flies programmed routes, an image pipeline that identifies weeds, a mapping interface that visualizes detections as GPS points, and a precision spray mode that uses those locations for follow-up passes. The project was done in collaboration with the Texas A&M Department of Soil and Crop Sciences, so the technical work was always tied back to real agricultural use cases.
My Contributions
I served as the Computer Vision lead and contributed across system design, integration, and field testing. Some of the key things I owned:
- Designed and implemented the weed detection pipeline that processed drone imagery and produced geotagged weed locations that could be consumed by the mapping and spray subsystems.
- Worked with the UI teammate so that detections flowed cleanly from image capture, through inference and filtering, into a front end that farmers could use to review and approve spray plans.
- Helped define data formats and interfaces between subsystems so that the computer vision output could be reused for both visualization and spray control.
- Participated in field tests, validating that detections lined up with actual weed locations and that GPS accuracy was sufficient for practical spraying.
- Contributed to documentation and final presentations to both the capstone faculty and the agricultural partners, explaining how the detection system worked and how it could be scaled.
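To make the hand-off between subsystems concrete, the geotagged detection output described above can be sketched as a small record type serialized to GeoJSON, which works for both map visualization and spray planning. This is an illustrative sketch, not the project's actual schema; the field names and the `to_geojson` helper are assumptions.

```python
import json
from dataclasses import dataclass

@dataclass
class WeedDetection:
    """One geotagged detection handed to the mapping and spray subsystems.
    Field names are illustrative, not the project's actual schema."""
    latitude: float
    longitude: float
    confidence: float   # detector score in [0, 1]
    label: str          # weed class name
    source_image: str   # image the detection came from

def to_geojson(detections):
    """Package detections as a GeoJSON FeatureCollection so one payload
    can drive both the map front end and spray-mission planning."""
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                "geometry": {
                    "type": "Point",
                    # GeoJSON coordinate order is [longitude, latitude]
                    "coordinates": [d.longitude, d.latitude],
                },
                "properties": {
                    "confidence": d.confidence,
                    "label": d.label,
                    "source_image": d.source_image,
                },
            }
            for d in detections
        ],
    }
```

Using a standard format like GeoJSON is one way to satisfy the "reused for both visualization and spray control" requirement, since most mapping front ends can render it directly.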
Tech Stack
Hardware & Platforms
- Agriculture drone platform for mapping and spraying missions
- Onboard GPS and IMU data for geotagging detections
- Field laptop and ground station setup for mission management
- Test plots provided by Soil and Crop Sciences for real-world trials
Software & Tools
- Python for computer vision and data processing
- Machine learning framework for training and running the weed detector
- Web front end for map visualization and system control
- Version control and issue tracking to coordinate a multi-member engineering team
System Design
From a system point of view, the project had three major flows. First, the drone flies an autonomous mapping mission and captures images along with GPS data. Second, the computer vision pipeline ingests those images, identifies weeds, and associates each detection with a location in the field. Finally, the front end displays these points on an interactive map so that a user can review detections and trigger a spray mission that revisits only those locations.
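The three flows above can be chained end to end in one survey cycle. The sketch below is purely structural: the `flight_controller`, `detector`, and `ui` objects and their method names are hypothetical stand-ins for the project's actual subsystem interfaces.

```python
def run_survey_cycle(flight_controller, detector, ui):
    """Sketch of the three system flows chained end to end.
    All objects and method names here are hypothetical stand-ins."""
    # 1. Autonomous mapping mission: capture images plus GPS metadata.
    frames = flight_controller.fly_mapping_mission()

    # 2. Vision pipeline: detect weeds and geotag each detection.
    detections = []
    for frame in frames:
        for box in detector.detect(frame.image):
            detections.append(detector.geotag(box, frame.gps))

    # 3. Review and spray: the operator approves targets on the map,
    #    then the drone revisits only those points.
    approved = ui.review(detections)
    if approved:
        flight_controller.fly_spray_mission(approved)
    return approved
```

Keeping the operator-approval step between detection and spraying matters here: it means a bad detection run costs review time, not wasted herbicide.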
On the computer vision side, my focus was on clean inputs and clean outputs. I worked on preparing image data, tuning detection thresholds, filtering noisy detections, and packaging the results in a way that downstream subsystems could understand. This meant thinking about bounding boxes and labels, but also about how to translate all of that into coordinates and routes that match what the drone and the operator actually see in the field.
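The pixel-to-coordinate translation mentioned above can be sketched with basic geometry for a nadir-pointing camera. This is a simplified model, not the project's implementation: it assumes flat ground, ignores lens distortion and gimbal angles, and uses a rough meters-per-degree conversion.

```python
import math

def pixel_to_gps(px, py, img_w, img_h, drone_lat, drone_lon,
                 altitude_m, hfov_deg, heading_deg):
    """Map an image pixel to a GPS coordinate for a nadir-pointing camera.
    Simplified sketch: flat ground, no lens distortion, no gimbal tilt."""
    # Ground footprint width from altitude and horizontal field of view.
    ground_w = 2 * altitude_m * math.tan(math.radians(hfov_deg) / 2)
    meters_per_px = ground_w / img_w

    # Pixel offset from the image center, in meters on the ground.
    # Image y grows downward, so flip it to get a "forward" offset.
    dx = (px - img_w / 2) * meters_per_px   # right of center
    dy = (img_h / 2 - py) * meters_per_px   # ahead of center

    # Rotate the camera-frame offset into north/east by the drone heading.
    hdg = math.radians(heading_deg)
    north = dy * math.cos(hdg) - dx * math.sin(hdg)
    east = dy * math.sin(hdg) + dx * math.cos(hdg)

    # Convert the metric offset to degrees of latitude/longitude
    # (~111,320 m per degree of latitude; longitude scaled by cos(lat)).
    lat = drone_lat + north / 111_320.0
    lon = drone_lon + east / (111_320.0 * math.cos(math.radians(drone_lat)))
    return lat, lon
```

Even in this simplified form, the chain makes the accuracy stakes visible: errors in altitude, heading, or field of view all scale directly into position error on the ground, which is why field validation of the geotagged points mattered.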
Enclosure and system demo used during final presentations.
Challenges
- Handling real-world imagery where lighting, crop height, and background clutter change from frame to frame, which required careful thresholding and data selection to avoid false positives.
- Coordinating between departments and disciplines so that the model detected the weeds that actually mattered to agronomists, not only what looked distinct in the images.
- Integrating computer vision outputs with mapping and flight control in a way that preserved accuracy from pixel space to GPS coordinates and then to planned spray paths.
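The thresholding and noise-filtering work from the first challenge can be sketched as a two-stage filter: drop low-confidence detections, then merge near-duplicates so the same weed seen in overlapping frames becomes a single spray target. The specific thresholds and the degrees-to-meters approximation are illustrative; the project tuned these against field data.

```python
def filter_detections(dets, min_conf=0.6, merge_radius_m=0.5):
    """Filter noisy detections before they reach the map and spray planner.
    Each detection is a (lat, lon, confidence) tuple; thresholds are
    illustrative defaults, not the project's tuned values."""
    # Stage 1: confidence threshold, highest-scoring detections first.
    kept = sorted((d for d in dets if d[2] >= min_conf),
                  key=lambda d: d[2], reverse=True)

    # Stage 2: greedy de-duplication. A detection within merge_radius_m
    # of an already-kept one is treated as the same weed.
    merged = []
    for lat, lon, conf in kept:
        # ~111,320 m per degree; a coarse approximation that is fine
        # at sub-meter scales for this kind of duplicate check.
        if any(abs(lat - mlat) * 111_320 < merge_radius_m and
               abs(lon - mlon) * 111_320 < merge_radius_m
               for mlat, mlon, _ in merged):
            continue
        merged.append((lat, lon, conf))
    return merged
```

Processing in descending confidence order means that when overlapping frames disagree, the most confident sighting of a weed is the one that survives.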
Results
- Delivered a working prototype that could take mapped imagery, detect weeds, and produce a visualization of weed locations over a field that informed follow-up spraying.
- Demonstrated the system to faculty and industry partners, walking through the full pipeline from drone mapping to detection and visualization.
- Built experience in multi-subsystem integration, where computer vision, geospatial data, and user interface design all had to align to solve a real problem for end users.