News

Apr 25 '24

AIFS Front and Center at AI Themed Research Expo

#Education

The AIFS team at the Research Expo table, from left to right: Fangzhou Li, Ammar Ziadeh, Lukas Maximilian Masopust, Ashley Balley, Kristin Singhasemanon, Georgiana Prevost, and Steve Brown, with the AIFS banner and two computer screens on display.

With the UC Davis 2024 Research Expo focused on AI, the AI Institute for Next Generation Food Systems was front and center — literally! Our space was at the front of the main ballroom, next to the stage.

Visitors got to view two full-sized farm robots, each equipped with its own research technology and farm capabilities. Our table featured a video introducing each of our research clusters, and one of our researchers demonstrated the recently released FoodAtlas food composition database.

Phenotyping Platform Based on Amiga Robot from the Plant AI and Biophysics Lab

The first robot, built using the Farm-ng Amiga base, caught people's attention with its unusual height, which is needed to properly position its stereo cameras.

The project, called GEMINI, is working to develop an image acquisition platform for phenotyping that will eventually be deployed to crop breeding centers in Africa. GEMINI aims to create a state-of-the-art breeding toolkit to improve yield and quality in previously neglected staple crops that are important for food, nutritional and economic security in low- and middle-income countries, including common bean, cowpea and sorghum.

Ph.D. candidate Heesup Yun explains that they chose Farm-ng's Amiga because it is easy to modify to fit their research needs. They worked with the company to develop the design and had the necessary parts shipped to UC Davis. The research team assembled the robot under the supervision of AIFS faculty member Dr. Mason Earles.

The team mounted stereo cameras that view the left, right and top of the crop to capture images from multiple angles. They are also using network-based RTK GPS corrections for accurate positioning.

The team's research will take place at the UC Davis Vegetable Crop Center, where the robot will collect images of the warm-season grain legume field once or twice a week from June to September 2024.

By collecting images of the beans from multiple angles, the team will be able to extract plant traits such as leaf area, flower counts and pod counts. They will eventually be able to predict the 3D shape of the plant, which will contribute to their research connecting sensing data, crop models and crop genetics.

FRAIL-bot from the Bioautomation Lab

The FRAIL-bot was built by Dennis Sadowski to help harvest fragile crops such as strawberries. It navigates the rows in a field to bring a new tray to pickers and collect full trays for removal from the field. It has an accompanying Carrito that can determine when a tray is getting full and summon the FRAIL-bot at just the right time.

The FRAIL-bot serves agricultural field workers who are often operating continuously in hot environments, at risk of heat-related illness and chronic ergonomic injury. In addition to its practical use in the field, the robot is being adapted to monitor the environment and workers' physiology in real time and provide resources to mitigate injury and illness risk.

Lead GSR Daniel Martin explains how this will work:

Monitoring: The FRAIL-bot will collect environmental data such as temperature and humidity from a set of sensors, and it will receive physiological data, such as heart rate and skin temperature, from a set of wearable sensors on each worker.

Mitigation: The robot will incorporate this information in its scheduling algorithm and adjust its deployment to serve the workers based on their likelihood of incurring injury and illness. Furthermore, the robot may provide resources and visual or auditory suggestions to the workers to reduce their physical stress and strain.
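The monitoring-and-mitigation loop described above could be sketched as a risk-weighted scheduling heuristic. This is purely illustrative, not the team's actual algorithm: the class, field names, thresholds, and the risk formula are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class WorkerStatus:
    """Hypothetical snapshot of one worker's sensor readings."""
    worker_id: str
    heart_rate_bpm: float      # from a wearable sensor
    skin_temp_c: float         # from a wearable sensor
    tray_fill_fraction: float  # from the Carrito's fill estimate (0.0 to 1.0)

def heat_risk_score(status: WorkerStatus, ambient_temp_c: float, humidity_pct: float) -> float:
    """Combine environmental and physiological readings into a rough risk score.

    The weights and thresholds here are made up for illustration.
    """
    env = max(0.0, ambient_temp_c - 25.0) * (1.0 + humidity_pct / 100.0)
    phys = (max(0.0, status.heart_rate_bpm - 100.0) / 10.0
            + max(0.0, status.skin_temp_c - 36.5) * 2.0)
    return env + phys

def next_worker_to_serve(workers, ambient_temp_c, humidity_pct):
    """Prioritize the worker with the highest combined risk and tray fill."""
    return max(
        workers,
        key=lambda w: heat_risk_score(w, ambient_temp_c, humidity_pct)
        + 5.0 * w.tray_fill_fraction,
    )
```

A scheduler built on something like this would re-rank workers as new sensor readings arrive, sending the robot first to whoever faces the highest combined heat risk and tray load.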

FoodAtlas

In addition to the two farm robots, Ph.D. candidate Fangzhou Li presented a live demonstration of the newly released FoodAtlas food composition database. FoodAtlas is an AIFS-funded project dedicated to building the largest evidence-based food knowledge graph in the world.

FoodAtlas is based on nearly 150,000 data pairs of foods and constituent molecules. The massive database was built through machine reading of over 60,000 research articles, a feat that would have taken a human team years to accomplish.
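At its core, a knowledge graph like this links foods to their constituent molecules, with each link backed by the articles that evidence it. The minimal sketch below shows that structure; the function, triple format, and sample entries are illustrative and not FoodAtlas's actual schema.

```python
from collections import defaultdict

def build_graph(triples):
    """Map each food to its evidenced chemicals, keeping the supporting articles.

    Each triple is (food, chemical, article_id); illustrative format only.
    """
    graph = defaultdict(lambda: defaultdict(set))
    for food, chemical, article_id in triples:
        graph[food][chemical].add(article_id)
    return graph

# Toy evidence extracted by "machine reading" (sample entries are illustrative).
triples = [
    ("apple", "quercetin", "PMID:1"),
    ("apple", "quercetin", "PMID:2"),  # a second article strengthens the edge
    ("apple", "fructose", "PMID:3"),
]
graph = build_graph(triples)
```

Tracking the supporting articles per edge is what makes the graph evidence-based: each food-to-molecule claim can be traced back to the publications it came from.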
