Scientists to study real-world eating behaviors using wearable sensors and artificial intelligence

High-tech system developed by URI, UT-Austin researchers measures real-world eating behavior, in and out of the lab

KINGSTON, R.I. — March 4, 2024 — A pedometer measures your steps, but what if you had a similar automated device to measure your eating behavior? Evidence from nutritional studies has long shown that the speed, timing and duration of an individual’s eating behavior are strongly related to obesity and other health issues. While eating behaviors can be accurately measured in a controlled laboratory setting, a blind spot exists when researchers attempt to study how participants actually eat “in the wild.”

A new National Institutes of Health-funded project by three scientists at the University of Rhode Island and The University of Texas at Austin aims to shed light on real-world eating behaviors using AI-enabled wearable technology. The four-year, $2.4 million grant from the National Institute of Diabetes and Digestive and Kidney Diseases supports the work of URI Nutrition Professor Kathleen Melanson, URI Psychology Professor Theodore Walls, and UT-Austin Electrical and Computer Engineering Professor Edison Thomaz.

They plan to develop a system that captures detailed information on eating motions, potentially down to every bite and chew. The researchers will combine more than 60 years of collective expertise in nutrition, behavioral statistics, and engineering in a novel interdisciplinary project intended to give researchers more complete data on study participants’ nutritional habits and behaviors.

“Eating behavior data collected in labs are most accurate, but people don’t live in labs, so we don’t know what they’re doing in real-world, day-to-day living,” Melanson said. “We want to compare the results from our new system to what we already have in the lab to ensure it is collecting data appropriately. The goal is to use this system in research so that we can test our interventions on modifications of ingestive behaviors.”

The study employs two devices: a typical smartwatch and a discreet, custom-made sensor that sits on a participant’s jawline. The smartwatch will capture the movement of the arms and wrists during typical eating gestures, measuring their speed and frequency. Those data will be coupled with readings from a small, button-sized sensor that records the jaw movements associated with chewing, capturing the speed and intensity of the motion.


Researchers at URI and UT-Austin are using wearable sensors, including a button-sized sensor that sits on participants’ faces near the jawline, to measure volunteers’ personal eating behaviors, including chewing rate and intensity.
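
To illustrate how data from the two wearables might be represented and time-aligned in software, the short sketch below slices synchronized wrist and jaw recordings into fixed-length windows. It is a minimal illustration only; the sampling rates, window length, and field names are assumptions for this example, not details from the study.

```python
from dataclasses import dataclass
import numpy as np

# Assumed sampling rates for illustration only; the study's actual
# hardware and rates are not specified in this article.
WRIST_HZ = 50   # wrist accelerometer samples per second
JAW_HZ = 100    # jaw-motion sensor samples per second

@dataclass
class SensorWindow:
    """One fixed-length window of data from both wearables, time-aligned."""
    start_time: float          # seconds since start of recording
    wrist_accel: np.ndarray    # shape (WRIST_HZ * window_sec, 3): x, y, z
    jaw_motion: np.ndarray     # shape (JAW_HZ * window_sec,): motion amplitude

def make_windows(wrist: np.ndarray, jaw: np.ndarray, window_sec: float = 5.0):
    """Slice two synchronized streams into aligned windows for later analysis."""
    n_windows = int(min(len(wrist) / WRIST_HZ, len(jaw) / JAW_HZ) // window_sec)
    windows = []
    for i in range(n_windows):
        t0 = i * window_sec
        w = wrist[int(t0 * WRIST_HZ): int((t0 + window_sec) * WRIST_HZ)]
        j = jaw[int(t0 * JAW_HZ): int((t0 + window_sec) * JAW_HZ)]
        windows.append(SensorWindow(start_time=t0, wrist_accel=w, jaw_motion=j))
    return windows
```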

“The study moves through several successive experiments from an in-lab setting ‘into the wild,’ gradually moving from the internal validity of inferences in a lab-controlled setting to those grounded in the external validity of the real world,” said Walls, whose research produces statistical tools to make sense of real-time intensive longitudinal health behavior data.

The researchers will study participants over four progressive phases. After being fitted with the unobtrusive system, participants will first eat prescribed meals measured with standardized lab procedures under close supervision by the researchers. The next phase involves cafeteria-style eating, still closely supervised by the researchers. The testing then moves into a restaurant setting, where participants have more control over their meals. Each phase reflects incrementally more real-world eating conditions.

“We’re trying to answer, can you tell when someone is eating something? That might sound like a very simple question, but it turns out it’s very hard to do if you’re not in a very controlled setting,” said Thomaz. “When we talk, the jawbone definitely moves, but it doesn’t move in the rhythmic way it moves when you are chewing food. Those are the kinds of hints we’re going to try to leverage with sensors and AI algorithms. We will connect the two devices and see if we can come up with a more powerful system for detecting these kinds of eating behaviors.”
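
As a rough illustration of the “rhythm” idea Thomaz describes, the sketch below checks whether most of a jaw-motion signal’s power falls in a typical chewing-frequency band. The band limits, threshold, and sampling rate are illustrative assumptions and not the team’s actual algorithm, which would likely rely on trained AI models over both sensor streams.

```python
import numpy as np

def looks_like_chewing(jaw_signal: np.ndarray, fs: float = 100.0,
                       band=(0.8, 2.5), min_fraction: float = 0.4) -> bool:
    """Crude rhythmicity check: is most of the signal's power concentrated
    in a typical chewing-frequency band? Talking tends to be less periodic,
    so its power is spread more broadly across frequencies.

    The band limits and power threshold are illustrative guesses,
    not values from the URI/UT-Austin study.
    """
    signal = jaw_signal - np.mean(jaw_signal)         # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal)) ** 2       # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum[1:].sum()                        # ignore the 0 Hz bin
    if total == 0:
        return False
    return spectrum[in_band].sum() / total >= min_fraction
```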

Finally, study participants return to their usual lives while wearing the sensors, which will monitor their eating habits. Researchers will measure data such as the speed of eating, chewing, and oral processing; how long food stays in the mouth before swallowing; and how quickly the food disappears from their plates.
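
Once individual bites, chews, and swallows can be detected, summary measures like those above become simple arithmetic over timestamped events. The sketch below shows one hypothetical way to compute them; the event format and formulas are assumptions for illustration, not the study’s actual measures.

```python
from statistics import mean

def meal_metrics(bite_times, chews_per_bite, swallow_times):
    """Summarize one meal from detected events (times in seconds since meal start).

    Illustrative assumptions: at least two bites, one swallow per bite,
    and events already sorted in time. A real system would need to handle
    missing, merged, and out-of-order detections.
    """
    duration_min = (bite_times[-1] - bite_times[0]) / 60.0
    return {
        "bites": len(bite_times),
        "eating_rate_bites_per_min": len(bite_times) / duration_min,
        "mean_chews_per_bite": mean(chews_per_bite),
        # oral processing time: how long food stays in the mouth before swallowing
        "mean_oral_processing_sec": mean(s - b for b, s in zip(bite_times, swallow_times)),
        "mean_inter_bite_interval_sec": mean(b2 - b1 for b1, b2 in zip(bite_times, bite_times[1:])),
    }
```

For example, a meal with bites at 0, 20, and 45 seconds would span 0.75 minutes, giving an eating rate of 4 bites per minute.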

“With these kinds of chewing and oral processing behaviors (rapid eating, taking large bites, not pausing between bites, and not chewing sufficiently), people might be over-ingesting calories before the satiety signals have time to develop,” Melanson said. “So, by assessing these behaviors, we can help develop a system that can be used in interventions to help people adapt their ingestive behaviors to maximize satiety and help with energy intake.”

Walls added that the study allows researchers to add other sensors, possibly one that monitors facial skin stretching. “We want to make sure we can make that progression out of the lab in a way that really works in an overall behavioral monitoring approach. Our ‘customers’ are people who want to start clinical trials using this system. This stage is just about the measurement, but later we will move on and test real programs to help people manage their eating behavior.”

Participants in the study will be those who stand to benefit the most from it: people at risk for obesity-related harms. The researchers will recruit participants from Latinx communities in both Rhode Island and Texas, enabling them to explore unique cultural food practices. Members of these communities experience, on average, a higher prevalence of obesity and related health issues.

Anyone interested in being a participant in the study can contact the researchers at: dibs@uri.edu.