A Deep Network for Automatic Video-Based Food Bite Detection

  • Dimitrios Konstantinidis
  • Kosmas Dimitropoulos
  • Ioannis Ioakimidis
  • Billy Langlet
  • Petros Daras
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11754)

Abstract

Past research has provided compelling evidence of correlations between individual eating styles and the development of (un)healthy eating patterns, obesity and other medical conditions. In this setting, an automatic, non-invasive food bite detection system can be a valuable tool for nutritionists, dietary experts and medical doctors seeking to explore real-life eating behaviors and dietary habits. Unfortunately, the automatic detection of food bites is challenging due to occlusions between hands and mouth, the use of different kitchen utensils and personalized eating habits. On the other hand, although accurate, manual bite detection is time-consuming for the annotator, making it infeasible for large-scale experimental deployments or real-life settings. To this end, we propose a novel deep learning methodology that relies solely on human body and face motion data extracted from videos depicting people eating meals. The purpose is to develop a system that can accurately, robustly and automatically identify food bite instances, with the long-term goal of complementing or even replacing the manual bite-annotation protocols currently in use. The experimental results on a large dataset reveal the superb classification performance of the proposed methodology on the task of bite detection and pave the way for additional research on automatic bite detection systems.
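
The exact network architecture is not given in this abstract; the sketch below is only a minimal illustration of the general idea it describes, assuming per-frame 2D body and face keypoints (e.g. obtained with a pose estimator such as OpenPose) fed to a recurrent classifier that labels fixed-length video windows as bite or no-bite. The keypoint count, window length and layer sizes are illustrative assumptions, not the authors' settings.

    # Minimal sketch (not the authors' exact architecture): a recurrent classifier
    # over per-frame body/face keypoint trajectories, labelling fixed-length
    # windows as bite / no-bite. Keypoints are assumed to come from a 2D pose
    # estimator; all sizes below are illustrative guesses.
    import torch
    import torch.nn as nn

    class BiteDetector(nn.Module):
        def __init__(self, n_keypoints=95, hidden=128, n_classes=2):
            # each keypoint contributes (x, y) coordinates per frame
            super().__init__()
            self.lstm = nn.LSTM(input_size=2 * n_keypoints,
                                hidden_size=hidden,
                                num_layers=2,
                                batch_first=True,
                                bidirectional=True)
            self.head = nn.Linear(2 * hidden, n_classes)

        def forward(self, x):
            # x: (batch, frames, 2 * n_keypoints) keypoint trajectories
            feats, _ = self.lstm(x)
            # classify the whole window from the last time step
            return self.head(feats[:, -1])

    # Usage: a batch of 4 windows, 60 frames each, 95 (x, y) keypoints per frame
    model = BiteDetector()
    windows = torch.randn(4, 60, 2 * 95)
    logits = model(windows)  # (4, 2) bite / no-bite scores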

Keywords

Deep learning · Bite detection · Video analysis · Motion features

Notes

Acknowledgement

This work was supported by the European project PROTEIN (Grant No. 817732) under the H2020 Research and Innovation Programme.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Dimitrios Konstantinidis (1)
  • Kosmas Dimitropoulos (1)
  • Ioannis Ioakimidis (2)
  • Billy Langlet (2)
  • Petros Daras (1)
  1. CERTH-ITI, Thessaloniki, Greece
  2. Karolinska Institutet, Huddinge, Sweden