Measuring what we eat
Can you remember what you ate last week? In a Wageningen food lab, sensors register the size of your serving, what it contains and the number of times you chew. They can even see right through the bread to determine what is on top. Here, nutrition scientists study how smart technology can contribute to a healthy diet. They discovered that processed food requires less chewing, which causes us to eat more. Measuring what we eat: a good idea?
‘We use the latest technology, such as scanners, sensors, special cameras and Artificial Intelligence (AI), in our lab to conduct research on nutrition’, says Guido Camps, AI and Nutrition researcher at Wageningen University & Research and OnePlanet Research Center. To date, most nutrition research has relied on questionnaires. But people rarely remember what they ate three days ago or last week.
‘In the best-case scenario, questionnaires have a 60% accuracy. Moreover, completing and processing the questionnaires is a labour-intensive chore. For example, you must not only fill in “two slices of bread with chocolate sprinkles”, but then also detail whether the bread was white or whole grain, and whether it was spread with margarine or real butter.’
‘Dieticians spend most of their time figuring out what a person eats, which leaves less time for actual consultation. Moreover, there is a negative correlation between Body Mass Index (BMI) and knowing what you eat: the higher a person’s BMI, the less accurately they tend to report their diet. This may be out of embarrassment or denial. Still, it is precisely for this group of clients that dieticians require accurate data.’
In short, better data is needed on what people eat, when they eat it and how much. ‘And this can be achieved with super cool tricks’, Camps says. These tricks can be found in the Hungry Robots Lab for nutrition research, launched a year ago at Wageningen University & Research. Here, cameras register how many times we chew and how long we take to eat a single bite of food. Machine vision identifies facial characteristics such as the nose, lips and the corners of the mouth in order to register how a person bites, chews and swallows.
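The counting step behind such a camera can be illustrated with a toy sketch. Everything in it is an assumption: the lab has not published its pipeline, so the mouth-opening signal, the threshold and the rising-edge rule below are purely illustrative stand-ins for a real landmark-tracking system.

```python
def count_chews(mouth_opening, threshold=0.5):
    """Count chewing cycles in a per-frame mouth-opening signal.

    A chew is counted each time the signal rises above `threshold`
    after having been below it (a simple rising-edge detector).
    Both the signal and the threshold are illustrative; a real
    pipeline would derive the signal from tracked lip landmarks.
    """
    chews = 0
    above = False
    for value in mouth_opening:
        if value > threshold and not above:
            chews += 1
            above = True
        elif value <= threshold:
            above = False
    return chews

# Synthetic signal: three open-close cycles of the mouth.
signal = [0.1, 0.8, 0.2, 0.9, 0.1, 0.7, 0.1]
print(count_chews(signal))  # → 3
```

From chew counts per bite and time per bite, eating rate follows directly, which is the quantity the lab relates to intake.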
Why is this relevant? Recently, Camps and his research team used this technology to determine that the dangers of ultra-processed foods do not lie in the processing itself. ‘Some scientists claim that processed food is unhealthy, while processing in itself benefits food safety and shelf life. Our new study shows that people eat ultra-processed food much faster, which means they eat more.’
Because ultra-processed food requires less chewing, it produces less satiety. That chewing causes one to feel “full” was already a known fact, but the direct link to processing had not yet been established.
‘Ultra-processed products are unjustly seen as the opposite of unprocessed foods. Granulated sugar and honey, for example, are both sugars. Their origin is irrelevant. Our research shows that the speed with which we eat is the relevant factor. This means the food industry can continue to produce tasty foods but must make them more difficult to eat so that they require more chewing, which results in a lower calorie intake.’
The most spectacular features in the Hungry Robots Lab are the hyperspectral cameras, says Camps. Humans only see light within the visible spectrum; hyperspectral cameras also register light beyond that range. ‘We wondered whether this would enable us to better recognise foods, for example, cola with or without sugar. A hyperspectral camera sees normal coke as a syrup laden with sugar molecules, while it sees sugar-free coke as something akin to water.’
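The idea of telling the two apart can be sketched as a nearest-reference comparison of spectra. Note the reference values below are invented for illustration, not real reflectance measurements, and a real hyperspectral system works with hundreds of wavelength bands rather than four.

```python
import math

# Illustrative reference spectra (reflectance at a few wavelength bands).
# The numbers are made up for this sketch, not real measurements.
REFERENCES = {
    "regular cola": [0.9, 0.7, 0.4, 0.2],    # sugar-syrup-like signature
    "sugar-free cola": [0.3, 0.3, 0.3, 0.3], # water-like signature
}

def classify_spectrum(spectrum):
    """Return the reference label whose spectrum is closest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(REFERENCES, key=lambda label: dist(REFERENCES[label], spectrum))

sample = [0.85, 0.65, 0.45, 0.25]
print(classify_spectrum(sample))  # → regular cola
```

The same matching principle extends to recognising what spread sits under the top slice of a sandwich, once reference signatures exist for each food.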
‘A hyperspectral camera can also see through certain materials. For example, to identify what spread has been put on a sandwich’, Camps explains. ‘This is currently quite complex and rather expensive, but we aim to develop this technology further so that it may be integrated into a smartphone or a smart tray. This means you would only have to send pictures of your meal if you are participating in nutrition research.’
In the laboratory, a smart tray registers the weight of each bite as it is consumed off the tray. ‘This application may prove useful in health care. For example, in the care of older people who fail to eat enough. In the later stages of life, people often have a diminished appetite.’ If the smart tray registers how much food is consumed, the software can raise the alarm if critical values are not reached.
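The tray's alarm logic can be sketched in a few lines. The weight readings, the bite-inference rule and the critical value are all assumptions made for illustration; the lab's actual software is not described in the article.

```python
def intake_from_tray(weights):
    """Total grams consumed, inferred from successive tray weight readings.

    Each drop between readings is treated as food eaten; increases
    (e.g. food being added to the tray) are ignored.
    """
    return sum(max(prev - cur, 0) for prev, cur in zip(weights, weights[1:]))

def check_intake(weights, minimum_grams):
    """Return an alert message if consumption stays below a critical value."""
    eaten = intake_from_tray(weights)
    if eaten < minimum_grams:
        return f"alert: only {eaten} g eaten (minimum {minimum_grams} g)"
    return f"ok: {eaten} g eaten"

readings = [450, 430, 430, 395, 390]  # grams, one hypothetical reading per minute
print(check_intake(readings, minimum_grams=100))
```

Here 60 g was eaten against a 100 g minimum, so the sketch reports an alert, which is the kind of signal carers of older people would act on.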
There is also a smartwatch sensor capable of registering this by recording your hand gestures. The researchers observed that the sensor can determine with 95% accuracy whether you are eating. ‘Not everyone is able to eat off a smart tray. The smartwatch is an excellent and viable alternative to monitor whether people eat enough and send out a warning if this is not the case.’
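A crude version of such gesture-based detection can be sketched from wrist-acceleration data. The thresholds and the single variance feature below are invented for illustration; the lab's 95%-accurate detector would rely on learned features, not one hand-tuned rule.

```python
import statistics

def is_eating(accel_window, low=0.2, high=1.5):
    """Crude eating detector for a window of wrist-acceleration magnitudes.

    Repeated hand-to-mouth gestures produce moderate, rhythmic movement:
    more variation than resting, less than vigorous activity. The band
    [low, high] is an invented stand-in for a trained classifier.
    """
    spread = statistics.stdev(accel_window)
    return low <= spread <= high

resting = [1.0, 1.01, 0.99, 1.0, 1.02, 0.98]  # barely moving
eating = [1.0, 1.6, 0.8, 1.5, 0.9, 1.7]       # rhythmic hand lifts
print(is_eating(resting), is_eating(eating))  # → False True
```

Flagging long stretches without eating-like windows would then drive the same kind of warning as the smart tray.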
The smartwatch may also be deployed in nutrition research of diet programmes. Camps: ‘Your smartwatch tells you how many calories you burn per day, but it is nowhere near able to tell you how many calories you have consumed. This is something we want to develop.’
Dealing with data
But what about privacy, when so much sensitive data is being recorded? ‘The person in question owns the data, unlike with commercial apps, where the developers or companies own it. Our research aims to provide individuals with insight into their own eating behaviour. One could perhaps authorise health care professionals to access the data.’
The Hungry Robots Lab collaborates with many Wageningen departments and chair groups, both in research and education at Wageningen University. ‘Dealing with data is becoming increasingly important in every study programme’, Camps states. The lab is, for example, involved in the minor ‘Quantified self’ and in the new master’s ‘Data science for health’ set to launch on 1 September 2022.