AI wants to count your calories

For now, the researchers’ algorithm is tuned to assess the contents of a standard U.S. tablespoon. Tonje Thilesen for WSJ

Summary

Researchers are developing tech to analyze consumption by the bite, with the goal of tracking nutrition while you eat.

A plate of food has been set before you—perhaps a humble stew, a leafy salad or a burger and fries.

Your assignment: Record the calories and nutrient content of what you consume.

It’s one thing to consult charts for various foods or accept the suggestions of an app. But how can you dissect the specific contents of your particular soup, salad or burger—and only account for what ends up in your belly?

To solve this problem, researchers in Canada are leveraging artificial intelligence to analyze food bite-by-bite. All diners have to do is record themselves eating. The research aims to improve on old-fashioned food journals and smartphone apps that estimate calories and nutrients based on photographs of plated meals or scans of bar codes on packaged foods.

“We’re starting to see tech that can decompose foods and have a better understanding of the individual parts of a meal,” says Alexander Wong, a director of the Vision and Image Processing Lab at the University of Waterloo in Ontario, Canada.

Wong and his colleagues use video footage recorded with a cellphone to measure the amount of food on a spoon as it travels from dish to mouth. The algorithm doesn’t yet identify the type of food, but measuring portion size is a first step to deducing the calories and nutritional content of food as it’s eaten, without additional input from the diner.

“We put a camera on the table, and you forget about it,” says Yuhao Chen, a research assistant professor in the lab. The team is also analyzing still images of food and footage captured by wearable glasses.

Chen says he expects the lab to solve the problem of identifying precisely what food is on the spoon in the next few months.

“It’s one of the biggest challenges,” he says. “In Asian cuisines, you expect more soups. In Japanese cuisine, there are more smaller dishes. In India, everything looks like curry mixed together.”

Their work could find a broader audience. Since 2020, consumers have spent nearly $1 billion on subscriptions to diet and nutrition apps, according to Sensor Tower, a market-intelligence company.

For now, the researchers’ algorithm is tuned to assess the contents of a standard U.S. tablespoon. The shallow depth provides a favorable angle to accurately estimate dietary intake, they say. At this proof-of-concept stage, the method has a mean absolute percentage error of around 22%, meaning that if a spoonful of food weighs 100 grams, the algorithm’s prediction could fall between 78 grams and 122 grams.
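For readers who want the arithmetic behind that figure, here is a minimal sketch in Python of how mean absolute percentage error is computed. The spoonful weights below are hypothetical examples, not data from the Waterloo study.

```python
# Minimal sketch of mean absolute percentage error (MAPE), the metric
# cited above. The weights below are hypothetical, not study data.

def mape(actual, predicted):
    """Average of |actual - predicted| / actual, expressed as a percentage."""
    errors = [abs(a - p) / a for a, p in zip(actual, predicted)]
    return 100 * sum(errors) / len(errors)

# Hypothetical spoonful weights, in grams.
actual_grams = [100.0, 80.0, 120.0]
predicted_grams = [78.0, 95.0, 110.0]

print(f"MAPE: {mape(actual_grams, predicted_grams):.1f}%")
# At the reported 22% MAPE, a true 100-gram spoonful would typically
# be predicted somewhere between 78 and 122 grams.
```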

The discrepancies arise because the measurements are taken at the pixel level, which changes as food moves through the frame. Adapting the algorithm to other utensils will require additional work.

“A fork is easy,” Chen says. “Chopsticks are a little harder.”

Once the algorithm distinguishes the food from the utensil, it analyzes the bite’s size and shape, a step toward the eventual goal of calculating its calories and nutrients. A spoonful of potatoes, for example, would have a different size and shape than a spoonful of stew.
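The article doesn’t spell out the implementation, but the general recipe it describes can be sketched: segment the food pixels from the utensil pixels, then use the known dimensions of a standard tablespoon as a real-world scale reference. Everything below, including the assumed spoon footprint and the function name, is illustrative rather than the Waterloo team’s actual algorithm.

```python
import numpy as np

# Illustrative sketch only: segment food from utensil, then use the
# spoon's known size as a ruler to convert pixels into real-world area.
# The assumed tablespoon footprint and all names here are hypothetical.

TABLESPOON_AREA_CM2 = 4.5 * 3.0  # assumed bowl footprint of a U.S. tablespoon

def estimate_food_area_cm2(food_mask: np.ndarray, spoon_pixel_count: int) -> float:
    """Convert a binary food mask into area, using the spoon as scale.

    food_mask: 2-D array where 1 marks pixels classified as food.
    spoon_pixel_count: pixels covered by the segmented spoon in the same frame.
    """
    cm2_per_pixel = TABLESPOON_AREA_CM2 / spoon_pixel_count
    return float(food_mask.sum()) * cm2_per_pixel

# Toy frame: the food covers 40 pixels; the spoon covers 400.
mask = np.zeros((20, 20), dtype=int)
mask[5:10, 5:13] = 1  # a 5 x 8 blob of "food" pixels
print(f"Estimated food area: {estimate_food_area_cm2(mask, 400):.2f} cm^2")
```

A real system would track this estimate across video frames and combine it with depth or shape cues to recover volume rather than area, which is one reason the pixel-level measurement shifts as the spoon moves through the frame.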

Compared with photographing a full plate of food, where some items might be obscured or later discarded, using a utensil is a more accurate way to quantify and identify what people actually eat, says Chong Wah Ngo, a professor of computer science at Singapore Management University. He has researched using deep learning and large language models for food recognition but wasn’t involved in the study.

“The spoon has uncovered the food content inside a dish, allowing recognition of ingredients not visible on a plate,” he says. “The paper presents an innovative system that has potential to tackle several practical challenges in food recognition.”

The lab is focusing on North American foods in collaboration with Purdue University, where researchers have assembled a 3-D data set of 637 foods with detailed nutrition information and weight in a laborious process that involves purchasing and scanning foods.

The researchers are starting simple.

“We looked at commonly consumed foods in the U.S.,” says Fengqing Maggie Zhu, an associate professor of electrical and computer engineering at Purdue and a principal investigator at the university’s Video and Image Processing Laboratory.

Chen and Wong’s research is funded by the National Research Council Canada, with a focus on aging. “A lot of the aging population suffer from malnutrition,” Wong says. “It’s a vicious cycle.”

As the body breaks down, he says, the elderly become more likely to get injured, more likely to be hospitalized and less likely to eat.

Apple earlier this year purchased DarwinAI, a startup that Wong helped build, and employs Wong as a director of machine learning research. Apple declined to comment on Wong’s university research.

The University of Waterloo researchers anticipate testing a prototype of their diet-monitoring tool within the next year.

Write to Jo Craven McGinty at jo.mcginty@wsj.com
