The “Smart Dining Table”: Automatic Behavioral Tracking of a Meal with a Multi-Touch-Computer

Title: The “Smart Dining Table”: Automatic Behavioral Tracking of a Meal with a Multi-Touch-Computer
Author: Manton, Sean; Magerowski, Greta; Patriarca, Laura; Alonso-Alonso, Miguel

Note: Order does not necessarily reflect citation order of authors.

Citation: Manton, Sean, Greta Magerowski, Laura Patriarca, and Miguel Alonso-Alonso. 2016. “The ‘Smart Dining Table’: Automatic Behavioral Tracking of a Meal with a Multi-Touch-Computer.” Frontiers in Psychology 7 (1): 142. doi:10.3389/fpsyg.2016.00142. http://dx.doi.org/10.3389/fpsyg.2016.00142.
Abstract: Studying how humans eat in the context of a meal is important for understanding basic mechanisms of food intake regulation and can help develop new interventions for the promotion of healthy eating and the prevention of obesity and eating disorders. While a number of methodologies are available for behavioral evaluation of a meal, there is a need for new tools that can simplify data collection through automatic and online analysis. Also, there are currently no methods that leverage technology to add a dimension of interactivity to the meal table. In this study, we examined the feasibility of a new technology for automatic detection and classification of bites during a laboratory meal. We used a SUR40 multi-touch tabletop computer, powered by an infrared camera behind the screen. Tags were attached to three plates, allowing their positions to be tracked, and the saturation (a measure of infrared intensity) in the surrounding region was measured. A Kinect camera was used to record the meals for manual verification and to detect the gestures with which bites were taken. Each bite detection triggered classification of the source plate by the SUR40, based on saturation flux in the preceding time window. Five healthy subjects (aged 20–40 years, one female) were tested, providing a total sample of 320 bites. Sensitivity, defined as the number of correctly detected bites out of the number of actual bites, was 67.5%. Classification accuracy, defined as the number of correctly classified bites out of those detected, was 82.4%. Because of this poor sensitivity, a second experiment was designed using a single plate and a Myo armband containing a nine-axis inertial measurement unit (IMU) as an alternative method for bite detection. The same subjects were tested (sample: 195 bites). Using a simple threshold on the pitch reading of the magnetometer, the Myo data achieved 86.1% sensitivity vs. 60.5% with the Kinect. Further, the precision, or positive predictive value, was 72.1% for the Myo vs. 42.8% for the Kinect. We conclude that the SUR40 + Myo combination is feasible for automatic detection and classification of bites, with adequate accuracy for a range of applications.
Published Version: doi:10.3389/fpsyg.2016.00142
Other Sources: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4749696/pdf/
Terms of Use: This article is made available under the terms and conditions applicable to Other Posted Material, as set forth at http://nrs.harvard.edu/urn-3:HUL.InstRepos:dash.current.terms-of-use#LAA
Citable link to this page: http://nrs.harvard.edu/urn-3:HUL.InstRepos:25658358
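The abstract's second experiment detects bites with "a simple threshold on the pitch reading," and reports sensitivity and precision (positive predictive value) as its metrics. The sketch below illustrates how such a scheme could work; it is not the study's code, and the threshold value, refractory window, and function names are hypothetical.

```python
def detect_bites(pitch_series, threshold=45.0, refractory=10):
    """Flag a bite whenever pitch rises across `threshold`,
    ignoring re-crossings within `refractory` samples.
    Illustrative values only; not taken from the study."""
    bites = []
    last = -refractory  # allow a detection at the very first crossing
    for i in range(1, len(pitch_series)):
        # Rising-edge crossing of the threshold, outside the refractory window
        if (pitch_series[i - 1] < threshold <= pitch_series[i]
                and i - last >= refractory):
            bites.append(i)
            last = i
    return bites


def sensitivity(true_positives, actual_bites):
    # Sensitivity = correctly detected bites / actual bites taken
    return true_positives / actual_bites


def precision(true_positives, detections):
    # Precision (positive predictive value) = correct detections / all detections
    return true_positives / detections


# Example: two pitch spikes separated by more than the refractory window
pitch = [10, 50, 48, 10, 10, 10, 10, 10, 10, 10, 10, 10, 50, 10]
print(detect_bites(pitch))          # two bites detected
print(sensitivity(8, 10))           # 0.8
print(precision(3, 4))              # 0.75
```

A real deployment would apply such a rule to the streamed orientation data from the armband and validate the detections against the Kinect video, as the study does for manual verification.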