Fast facts
Human perception of food in different lighting conditions improves computer prediction
Machine vision-based prediction errors decreased by 20 percent
Lighting color temperature, brightness influence human perception of food quality
FAYETTEVILLE — Have you ever stood in front of apples on display at the grocery store trying to pick out the best ones and wondered, “Is there an app for this?”
Current machine-learning based computer models used for predicting food quality are not as consistent as a human’s ability to adapt to environmental conditions. Still, information compiled in an Arkansas Agricultural Experiment Station study may be used someday to develop that app, as well as give grocery stores insights on presenting foods in a more appealing manner and help optimize software designs for machine vision systems used in processing facilities.
The study led by Dongyi Wang, assistant professor of smart agriculture and food manufacturing in the biological and agricultural engineering department and the food science department, was recently published in the Journal of Food Engineering.
Even though human perception of food quality can be manipulated with illumination, the study showed that computers trained with data from human perceptions of food quality made more consistent food quality predictions under different lighting conditions.
“When studying the reliability of machine-learning models, the first thing you need to do is evaluate the human’s reliability,” Wang said. “But there are differences in human perception. What we are trying to do is train our machine-learning models to be more reliable and consistent.”
The study, supported by the National Science Foundation, showed that computer prediction errors can be decreased by about 20 percent using data from human perceptions of photos under different lighting conditions. This approach outperformed an established model trained on pictures alone, without variability in human perception taken into consideration.
Even though machine vision techniques have been widely studied and applied in the food engineering field, the study noted that most current algorithms are trained based on “human-labeled ground truths or simple color information.” No studies have considered the effects of illumination variations on human perception, and how the biases can affect the training of machine vision models for food quality evaluations, the authors stated.
The researchers used lettuce to evaluate human perceptions under different lighting conditions, which were in turn used to train the computer model. Sensory evaluations were done at the experiment station’s Sensory Science Center. Han-Seok Seo, professor in the food science department and director of the Sensory Science Center, was a co-author of the study.
Out of 109 participants in a broad age range, 89 completed all nine sensory sessions of the human perceptual reliability phase of the study. None of the participants were color blind or had vision problems. On five consecutive days, the panelists evaluated 75 images of Romaine lettuce each day, grading the freshness of the lettuce on a scale of zero to 100.
The images of lettuce the sensory panel graded were of samples photographed over the course of eight days to provide different levels of browning. They were taken under different lighting brightness and color temperatures, ranging from a blueish “cool” tone to an orangey “warm” tone, to obtain a dataset of 675 images.
Several well-established machine learning models were applied to evaluate the same images as the sensory panel, the study noted. Different neural network models used the sample images as inputs and were trained to predict the corresponding average human grading to better mimic human perception.
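The training setup described here can be sketched, very loosely, as a regression from image data to the panel's average freshness grade. Everything below is illustrative: the study used established neural network models on the actual lettuce photographs, while this stand-in fits a simple least-squares model on synthetic per-image features (the variable names, feature choices, and numbers are assumptions, not the study's).

```python
# Hypothetical sketch of the training objective: map each image to the
# panel's average freshness score (0-100). Synthetic data stands in for
# the study's 675 lettuce photographs and human grades.
import numpy as np

rng = np.random.default_rng(0)

n_images, n_features = 675, 3  # e.g., mean R, G, B per image (illustrative)
X = rng.random((n_images, n_features))

# Fake "average human grades" generated from a known linear rule plus noise,
# standing in for the sensory panel's averaged 0-100 scores.
true_w = np.array([60.0, -30.0, 10.0])
y = X @ true_w + 40.0 + rng.normal(0.0, 2.0, n_images)

# Least-squares linear regressor as a minimal stand-in for the neural nets:
# append a bias column and solve for weights that best predict the grades.
A = np.hstack([X, np.ones((n_images, 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ w
mae = np.mean(np.abs(pred - y))  # mean absolute error against panel grades
print(f"MAE on synthetic grades: {mae:.2f}")
```

In the study itself, the inputs were photographs taken under varied brightness and color temperature, so a trained model learns to make the same prediction regardless of lighting, which is where the roughly 20 percent error reduction comes from.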
As seen in other experiments at the Sensory Science Center, human perception of food quality can be manipulated with illumination. For example, warmer environmental colors can disguise lettuce browning, Wang explained.
Wang said the method to train machine vision-based computers using human perceptions under different lighting conditions could be applied to many things, from foods to jewelry.
Other co-authors of the study from the University of Arkansas included Shengfan Zhang, associate professor of industrial engineering in the College of Engineering; Swarna Sethu, former postdoctoral researcher in the biological and agricultural engineering department, now assistant professor of computer information sciences at Missouri Southern State University; and Victoria J. Hogan, program assistant in the food science department.
The study was supported by the National Science Foundation, grant numbers OIA-1946391 and 2300281. The authors also recognized graduate and senior undergraduate students Olivia Torres, Robert Blindauer and Yihong Feng for helping collect, analyze and grade samples.