Climate change is forcing food producers to change their processes to adapt to increasingly unpredictable and dangerous weather. Smallholder farmers are particularly vulnerable because they often lack the resources needed to change crops and cultivation techniques. At NC State, researchers are integrating AI into geospatial technology to help solve these complex challenges to global food security and poverty.
Josh Gray is an associate professor at NC State in the Department of Forestry and Environmental Resources and the Center for Geospatial Analytics. His research focuses on remote sensing, which uses satellites to study the physical properties of the Earth from a distance.
“If you want to measure the same thing from wall to wall, and you want to measure it every day, remote sensing has you covered,” says Gray. “No other technology allows this. It’s a very important scaling technique.”
“We can make measurements from orbit that we can’t do on the ground,” adds Gray. “It doesn’t matter how many graduate students you can send to the field. You can’t measure things like how much water is half a kilometer underground.”
At NC State’s annual university research symposium, Gray said his lab is using geospatial techniques and machine learning to study smallholder farms on the Indo-Gangetic Plain, how those farms might fare on a warming planet, and how to identify solutions to the problems the farmers face.
Smallholder farmers are people who own very small farms. Together, they produce more than half of the calories consumed by humans worldwide. Along the Himalayas, these smallholder farmers are disproportionately impoverished and affected by climate change.
These smallholder farmers usually grow rice and wheat, and wheat is particularly sensitive to elevated heat. To determine whether planting wheat earlier in the season could protect crops from rising temperatures, Gray’s lab developed a process to determine planting dates across these small farms from a series of satellite images.
Each of these farms is smaller than 2 hectares (about 5 acres), and there are tens of millions of them in the region. Manually tracing the boundaries of each farm would be impractical, so the lab trained a convolutional neural network to automatically identify farms in satellite imagery.
First, the lab used data augmentation to create additional image samples. These samples were based on existing images, but flipped horizontally or vertically, rotated at different angles, blurred, and adjusted to different levels of brightness.
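The augmentation step described above can be sketched in a few lines. This is a minimal illustration, not the lab's actual pipeline; the array shapes and brightness factors are assumptions for the example.

```python
import numpy as np

def augment(image):
    """Create augmented copies of one image: flips, rotations, brightness shifts."""
    samples = [image]
    samples.append(np.fliplr(image))             # flipped horizontally
    samples.append(np.flipud(image))             # flipped vertically
    for k in (1, 2, 3):                          # rotated 90, 180, 270 degrees
        samples.append(np.rot90(image, k))
    for factor in (0.8, 1.2):                    # darker and brighter variants
        samples.append(np.clip(image * factor, 0.0, 1.0))
    return samples

# A stand-in for one satellite image tile (64x64 pixels, 3 spectral bands)
tile = np.random.rand(64, 64, 3)
augmented = augment(tile)
print(len(augmented))  # 8 samples generated from a single source image
```

Blurring, which the lab also used, would typically be done with a Gaussian filter (e.g. `scipy.ndimage.gaussian_filter`); it is omitted here to keep the sketch dependency-free.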
These image samples were used to train U-Net, a convolutional neural network architecture. A convolutional neural network, or CNN, is a machine learning model designed to analyze visual data. The trained U-Net was able to accurately identify field boundaries.
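The defining feature of U-Net is its encoder-decoder shape with skip connections: the image is downsampled to capture context, then upsampled back to full resolution, with encoder features concatenated in so fine boundary detail survives. The toy PyTorch sketch below shows that structure with a single down/up step; the channel counts and tile size are illustrative assumptions, and a real model would stack several such levels.

```python
import torch
import torch.nn as nn

class MiniUNet(nn.Module):
    """A toy U-Net: one downsampling step, one upsampling step, one skip connection."""
    def __init__(self, in_ch=3, out_ch=1):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec = nn.Sequential(nn.Conv2d(32, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, out_ch, 1)  # per-pixel field/not-field logits

    def forward(self, x):
        e = self.enc(x)                          # encoder features at full resolution
        m = self.mid(self.pool(e))               # bottleneck at half resolution
        u = self.up(m)                           # upsample back to full resolution
        d = self.dec(torch.cat([u, e], dim=1))   # skip connection: reuse encoder detail
        return self.head(d)

net = MiniUNet()
tile = torch.rand(1, 3, 64, 64)                  # one 3-band satellite image tile
mask = net(tile)
print(mask.shape)                                # a per-pixel mask the same size as the input
```

Because the output is a full-resolution mask rather than a single label, the network can delineate each farm's boundary pixel by pixel, which is exactly what manual tracing would otherwise require.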
According to Gray, the project is “a blend of traditional statistics, domain knowledge, and AI and machine learning.” Combining these disciplines creates a more competent and dynamic research process. “Knowing from which time period to select images allowed us to have a much more efficient machine learning pipeline,” says Gray.
Gray’s lab plans to scale up the process to analyze millions of farms in the region, enabling the team to identify viable solutions for smallholder farmers and enhance food security.