The agricultural sector is facing major challenges: German farmers are already feeling the far-reaching effects of climate change and will have to adapt to them even more in the future. Rising temperatures and shifting precipitation patterns affect every agricultural variable, from crop growth and crop rotations through to tillage. Centralized AI in the cloud as well as decentralized AI on farms can help make this adaptation to changing conditions more efficient, accelerate it across all areas of agriculture and thus make the overall ecosystem more agile and future-proof.
This is where the NaLamKI project comes into play (see the facts-and-figures box for more on this). Activities will focus on building a cloud-based software-as-a-service (SaaS) platform with open interfaces for providers from agriculture and industry as well as for providers of special-purpose crop-farming applications. Aggregating sensor and machine data from satellites, drones, soil sensors, robotics, manual data collection and existing inventory records creates a data pool that advanced AI methods can draw on to optimize agricultural processes more sustainably. AI applications deployed on the platform support farmers in analyzing crop and soil conditions across large areas of land and assist in reorganizing nutrient and crop protection processes such as irrigation, fertilization and pest control, with the aim of securing crop yields in terms of both quality and quantity, reducing emissions and preserving biodiversity. The targeted use of crop protection products, for example, increases crop yields, lowers costs, conserves resources and actively protects the environment.
Farmers will interact with AI
“In addition to climate change, a shortage of skilled labor is also impacting the quality and flow of agricultural processes. As a result, plant conditions can often only be checked on a very selective basis. At present, it is not possible to detect and precisely determine soil water conditions or pest infestation, for example, across large agricultural areas,” says Dr. Sebastian Bosse, Head of the Interactive & Cognitive Systems Group at Fraunhofer HHI. To address this, the institute is developing AI methods for analyzing remote sensing data, for modeling agricultural processes and for 5G networking on farmland as part of the project. “Among other things, we are looking into image analysis of drone, satellite and robotic camera data and making the results meaningful to farmers,” says the engineer. By merging all the data, farmers will gain a level of insight into the characteristics of their cultivated areas that was previously all but unavailable.
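To give a sense of what such image analysis can involve, the following minimal sketch computes the widely used normalized difference vegetation index (NDVI) from the red and near-infrared bands of a multispectral image. The array values and the stress threshold are purely illustrative and not taken from the project.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Illustrative 2x2 reflectance patches from a multispectral image (values 0..1).
nir_band = np.array([[0.60, 0.55], [0.20, 0.58]])
red_band = np.array([[0.10, 0.12], [0.18, 0.11]])

index = ndvi(nir_band, red_band)
stressed = index < 0.3   # hypothetical threshold flagging sparse or stressed vegetation
print(index.round(2))
print(stressed)
```

In practice, maps of such indices computed per pixel are one of the simpler building blocks that large-area crop and soil analysis can start from.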
Farmers will be able to interact with the AI and ask it questions. For example, based on current soil moisture readings and detected crop diseases, the AI will be able to provide instructions for action and show the effects of different scenarios. In practical terms, a dashboard showing the farmland and the current soil conditions will be displayed on a tablet. By clicking on specific areas, the farmer will be provided with information about problems such as low water levels, as well as recommendations on how best to deal with them.
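As an illustration of the logic behind such a recommendation, the sketch below maps a field zone's current readings to a short piece of advice. The data structure, thresholds and zone name are hypothetical and only meant to convey the idea.

```python
from dataclasses import dataclass

@dataclass
class ZoneStatus:
    zone_id: str
    soil_moisture: float      # volumetric water content in percent
    disease_detected: bool

def recommend(zone: ZoneStatus, wilting_point: float = 12.0, field_capacity: float = 30.0) -> str:
    """Turn a zone's current readings into a human-readable recommendation."""
    advice = []
    if zone.soil_moisture < wilting_point:
        advice.append("irrigate soon: soil moisture below wilting point")
    elif zone.soil_moisture > field_capacity:
        advice.append("postpone irrigation: soil at or above field capacity")
    if zone.disease_detected:
        advice.append("inspect crop: possible disease detected from imagery")
    return "; ".join(advice) or "no action needed"

print(recommend(ZoneStatus("north-3", soil_moisture=9.5, disease_detected=True)))
```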
SaaS platform based on Gaia-X
The (training) data and AI services will be provided in a decentralized manner using Gaia-X – a European cloud infrastructure with data sovereignty. What’s more, a decentralized, distributed-learning AI system will be established, with data stored locally on the farms. Farmers will be able to share their locally trained AI models and transfer them to the NaLamKI platform in order to continuously improve the algorithms. The platform will be open to third-party providers: start-ups, for example, could offer their innovative AI solutions on it.
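One common way to realize such distributed learning is federated averaging: each farm trains on its own data, and only the resulting model parameters are shared and aggregated centrally. The following minimal sketch, using a toy linear model and synthetic data, illustrates the principle; it is not the project's actual algorithm, and all names and values are invented.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One farm's training step: a few epochs of gradient descent on local data only."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of (half) the mean squared error
        w -= lr * grad
    return w

def federated_average(updates, sizes):
    """Aggregate the farms' models on the platform, weighted by local dataset size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(updates, sizes))

rng = np.random.default_rng(0)
true_coeffs = np.array([0.5, -1.0, 2.0])       # underlying relationship all farms share
farms = []
for _ in range(3):                             # three farms, each with its own local data
    X = rng.normal(size=(40, 3))
    y = X @ true_coeffs + rng.normal(scale=0.1, size=40)
    farms.append((X, y))

global_model = np.zeros(3)
for _ in range(10):                            # each round: train locally, share only the models
    updates = [local_update(global_model, X, y) for X, y in farms]
    global_model = federated_average(updates, [len(y) for _, y in farms])

print(global_model.round(2))                   # approaches the shared coefficients
```

The key property for the farmers is visible in the loop: raw measurements never leave the farm; only model parameters are exchanged with the platform.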
Inspection of orchards
Initial data collection for the AI model development process has already been completed. For instance, images taken by a robot of rows of apple trees on a fruit farm in the Palatinate region of Germany are now available. For this purpose, data from various sensors, such as position sensors, LIDAR, RGB and multi-spectral cameras, were collected, analyzed and merged (sensor fusion) while the (semi-)autonomous robot passed through the plantation. The aim is to create a meaningful representation of the trees in an orchard so that the fruit count and degree of ripeness, the stem diameter, and the condition of the individual crops and of the surrounding soil can be determined. This also includes detecting impediments along the route traveled, in particular living creatures in tall vegetation. “We evaluate the data as we move through the plantation. The information we collect is merged with the map of the fruit tree population and depicted in a map of the property. Documentation is created for the farmer based on these data,” says Dr. Bosse, explaining the specific application.
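The sketch below illustrates the basic fusion step in a deliberately simplified form: georeferenced observations from the camera and LIDAR processing are assigned to the nearest known tree position and accumulated into per-tree records that could later feed a map of the property. All names, coordinates and values are invented for illustration and do not come from the project's data.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TreeRecord:
    tree_id: str
    position: Tuple[float, float]          # field coordinates in meters
    fruit_count: int = 0
    stem_diameter_cm: Optional[float] = None

def nearest_tree(trees, xy):
    """Assign an observation to the closest known tree position (fusion by geometry)."""
    return min(trees, key=lambda t: (t.position[0] - xy[0]) ** 2 + (t.position[1] - xy[1]) ** 2)

# Invented orchard map and a few observations from the camera and LIDAR pipelines.
orchard = [TreeRecord("row1-tree1", (0.0, 0.0)), TreeRecord("row1-tree2", (0.0, 3.0))]
camera_detections = [((0.2, 0.1), 14), ((0.1, 3.2), 9)]   # (robot position, apples counted)
lidar_stems = [((0.0, 2.9), 7.5)]                         # (position, stem diameter in cm)

for xy, count in camera_detections:
    nearest_tree(orchard, xy).fruit_count += count
for xy, diameter in lidar_stems:
    nearest_tree(orchard, xy).stem_diameter_cm = diameter

for tree in orchard:
    print(tree)
```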