Today we have a guest post! Daniel McKinnon from Agribotix shares some of the details behind an emerging field fusing robotics and agriculture. Check it out!
Coming off the wildly successful 3DRobotics DroneCon and SparkFun Autonomous Vehicles Competition, a blog post on potential uses of UAVs or drones seems quite timely. While dozens of industries are rapidly adopting the commercial use of UAVs, many analysts have identified agriculture as the largest potential market. However, many unanswered questions remain as to specifically how a UAV would help a farmer’s bottom line. Agribotix has spent the last year beginning to answer these questions and, given our dedication to open source and publishing our progress, we thought it would be helpful to share our findings with the SparkFun community.
The first thing we learned, and learned quickly, was that farmers want to become neither drone pilots nor imaging experts. They simply want actionable intelligence that enables them to make better-informed in-field decisions. While 3DRobotics and SparkFun aficionados (including the entire Agribotix team) really enjoy digging into the guts of the electronics and the software that drives them, all but the most progressive, interested farmers want a simple solution that just provides them with some kind of valuable in-field intelligence. At the start of the season, we assumed this intelligence would be extremely high-resolution, very accurately georeferenced color and NDVI images of their fields, but this hypothesis has evolved considerably as we have gotten our product into the hands of more and more growers.
Our first big lesson (and the one that may be of the most interest to SparkFun readers) related to the airframe used to collect the images. Agribotix was born after its founders were hired by the Denver Zoo to develop a large quadcopter, which happened to be chock-full of SparkFun components, to assist in wildlife (specifically Cinereous Vulture) observation and capture. These origins, combined with the ease of launching and using a multirotor, substantially biased the Agribotix team toward that platform for agriculture. However, we rapidly learned that, due to a glide ratio of zero, rotary craft simply don't have the endurance to survey a quarter section (a quarter-square-mile field), which represents a hard minimum size due to its ubiquity in modern agriculture.
We learned that while a multirotor could be a very useful platform for the smaller fields typically used for higher-value crops like grapes or citrus, the efficiency of a fixed wing became a necessity for monitoring the row crops like corn and wheat that blanket the country from Colorado to the East Coast. With this in mind, Agribotix developed a flying wing UAV on top of 3D Robotics' open source Mission Planner/PixHawk platform. While Agribotix extensively tested four airframes after abandoning the quadcopter, the flying wing platform proved very rapid to set up and tear down in the field, capable of handling the rough on-ground conditions associated with agricultural use, adept at reliably taking off and landing autonomously using the PixHawk flight controller, and, most importantly, effective at handling the high winds found in the Great Plains. The Agribotix team didn't experiment as extensively with flight controllers, but the open source PixHawk/Mission Planner combination performs beautifully under most conditions thanks to a great team of developers and, according to a fellow ex-army UAV entrepreneur, actually performs much better than the military-grade hardware he previously deployed. We have been really impressed with the open-source project and would not be surprised if all the companies in the drone community eventually coalesce around the PixHawk.
Once the airframe was engineered and tested (all of our components have been posted on DIYdrones and our website if any SparkFun readers want to build their own drone to Agribotix specs), we had to develop a reliable way of collecting data that was both valuable to the grower and easy to integrate into the existing agricultural data system. As previously mentioned, we initially thought extremely high-resolution color and vegetation index maps of fields would provide the most value (a vegetation index combines the color and near infrared (NIR) reflectivity of plants to determine plant health and density). After all, we were collecting images with a 3 cm resolution on the ground, so why wouldn't a farmer want to look at each individual leaf from the air?
However, we learned that farmers prize neither resolution nor highly accurate georeferencing. While we initially hypothesized that farmers would use our maps to apply chemicals or fertilizer at near-leaf resolution, it turns out binning fields into 60 ft x 60 ft blocks is about as granular as it gets. This turns out to be great news because data transfer speeds in rural areas really limit resolution as well—downloading a 1.5 GB file took one of our rural customers all night.
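That binning step is straightforward in practice. Here is a minimal sketch (the 3 cm ground sample distance is taken from above; the function name and the edge-trimming behavior are our own assumptions for illustration) of averaging a high-resolution vegetation-index raster down to 60 ft x 60 ft blocks:

```python
import numpy as np

# At an assumed 3 cm/pixel ground sample distance, a 60 ft block
# (~18.3 m) spans roughly 610 pixels on a side.
def bin_raster(index_map, block_px=610):
    h, w = index_map.shape
    # Trim the edges so the raster divides evenly into whole blocks
    h2, w2 = h - h % block_px, w - w % block_px
    trimmed = index_map[:h2, :w2]
    blocks = trimmed.reshape(h2 // block_px, block_px,
                             w2 // block_px, block_px)
    return blocks.mean(axis=(1, 3))  # one averaged value per block

# A synthetic 1830 x 1830 px raster bins down to a 3 x 3 grid of zones
raster = np.random.rand(1830, 1830).astype(np.float32)
zones = bin_raster(raster)
print(zones.shape)  # (3, 3)
```

Binning also slashes file sizes: a 3 x 3 grid of floats is trivial to send over a rural connection where the full-resolution raster is not.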
So what are these UAV-collected images used for anyway? The first simple use case is crop scouting. Due to their sheer size, it is very difficult to monitor fields for problems from the ground. A blocked pivot sprinkler head could go undetected for weeks, causing thousands of dollars in losses in the process. The center of the field could be overcome with weeds and a farmer wouldn't know until harvest came. Drone-collected color images are excellent for providing a gross picture of general field health and problems.
However, crop scouting and general monitoring are fairly obvious, low-value uses of UAVs. We had to employ an entirely different type of imaging to gain additional insight into field conditions. While plants appear green to the eye, they actually reflect substantially more light in the NIR, due to a spongy layer found on the backside of the leaf. This is essentially a biological heat sink: the NIR photons are not energetic enough to drive photosynthesis, but would cause the plant to warm. In the 1970s, NASA researchers, notably Compton Tucker, began to exploit this property to develop vegetation indices combining the color and NIR reflectance of plants that accurately determine vegetation density from a simple photograph.
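The best-known of these indices is Tucker's Normalized Difference Vegetation Index (NDVI). The idea fits in a few lines: dense, healthy vegetation reflects strongly in the NIR and absorbs red light, so the normalized difference approaches +1, while bare soil sits near 0. A minimal sketch (the sample reflectance values are illustrative, not measured data):

```python
import numpy as np

# NDVI = (NIR - red) / (NIR + red)
def ndvi(nir, red, eps=1e-6):
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / (nir + red + eps)  # eps guards divide-by-zero

# Example reflectance values: a healthy leaf vs. bare soil
nir = np.array([0.50, 0.30])
red = np.array([0.08, 0.25])
print(ndvi(nir, red))  # healthy pixel ~0.72, soil pixel ~0.09
```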
After a lot of flying, imaging, and ground truthing, we at Agribotix were able to combine a commercially available Canon S100 camera, an Event38 filter, and a simple Difference Vegetation Index (DVI) to get very valuable information from drone-collected images. This can be seen very clearly in corn.
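The DVI is even simpler than NDVI: just the difference between the NIR and red bands. In an NIR-converted consumer camera, one sensor channel records NIR after the filter swap; exactly which channel depends on the filter, so the channel mapping below is an assumption for illustration, not a description of our production pipeline:

```python
import numpy as np

# DVI = NIR - red, computed per pixel from a 3-channel frame.
def dvi(image):
    nir = image[..., 0].astype(np.float32)  # assumed NIR channel
    red = image[..., 2].astype(np.float32)  # assumed red channel
    return nir - red

# Toy frame: strong NIR response relative to red suggests dense canopy
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[..., 0] = 200
frame[..., 2] = 60
print(dvi(frame))  # uniform DVI map of 140.0
```

Because DVI skips the normalization, it keeps the dynamic range of the raw sensor, which we found worked well for relative comparisons within a single field.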
These DVI maps of a corn field are then used to direct the top dress, or in-season fertilizing, of the corn fields by dividing the field into strong, OK, and weak zones. Without these images, the farmer would uniformly apply 50 pounds of fertilizer to the entire field. These maps allow the farmer to apply 60 pounds to the struggling areas, 50 pounds to the medium areas, and 40 pounds to the healthy areas, both decreasing costs and boosting yields.
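The zoning logic above can be sketched in a few lines. This is a hedged illustration: the tercile thresholds are an assumption (in practice the cutoffs come from ground truthing), but the 60/50/40 lb rates are the ones described above:

```python
import numpy as np

# Turn a binned DVI map into a variable-rate top-dress prescription:
# weak blocks get more fertilizer, strong blocks get less.
def prescription(dvi_blocks):
    lo, hi = np.percentile(dvi_blocks, [33, 67])  # assumed terciles
    rates = np.full(dvi_blocks.shape, 50.0)  # OK zone: baseline 50 lb
    rates[dvi_blocks < lo] = 60.0            # weak zone: feed more
    rates[dvi_blocks >= hi] = 40.0           # strong zone: feed less
    return rates

blocks = np.array([[0.1, 0.5],
                   [0.8, 0.4]])
print(prescription(blocks))
```

Averaged over a whole field, the extra 10 lb on weak zones is roughly offset by the 10 lb saved on strong zones, so the farmer spends about the same on fertilizer while putting it where it does the most good.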
Drones in agriculture have tremendous promise to boost efficiency, drive farmer profits, and increase environmental sustainability, and we at Agribotix are working hard to develop applications in all these areas. Thanks for reading, and feel free to reach out with any drones-in-agriculture questions in the comments below!