Video-Based Remote Physiological Measurement via Cross-Verified Feature Disentangling

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12347)

Abstract

Remote physiological measurements, e.g., remote photoplethysmography (rPPG) based heart rate (HR), heart rate variability (HRV), and respiration frequency (RF) measurement, play increasingly important roles in application scenarios where contact measurement is inconvenient or impossible. Since the amplitude of the physiological signals is very small, they are easily affected by head movements, lighting conditions, and sensor diversity. To address these challenges, we propose a cross-verified feature disentangling strategy to disentangle the physiological features from non-physiological representations, and then use the distilled physiological features for robust multi-task physiological measurement. We first transform the input face videos into a multi-scale spatial-temporal map (MSTmap), which suppresses irrelevant background and noise features while retaining most of the temporal characteristics of the periodic physiological signals. We then feed pairwise MSTmaps into an autoencoder architecture with two encoders (one for physiological signals and the other for non-physiological information) and use a cross-verified scheme to obtain physiological features disentangled from the non-physiological features. The disentangled features are finally used for the joint prediction of multiple physiological signals such as average HR values and rPPG signals. Comprehensive experiments on multiple large-scale public datasets covering several physiological measurement tasks, as well as cross-database testing, demonstrate the robustness of our approach.
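To make the pipeline described above concrete, the following is a minimal PyTorch sketch of the cross-verified disentangling idea: two encoders split an MSTmap into physiological and non-physiological features, a shared decoder reconstructs the map, the non-physiological features are swapped between a pair of inputs to form pseudo maps, and re-encoding those pseudo maps verifies that each feature type follows its source. All module names (e.g., CVD, hr_head), shapes, layer sizes, and loss terms here are illustrative assumptions rather than the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(c_in, c_out):
    # Small convolutional block shared by the encoders and the decoder.
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1),
                         nn.BatchNorm2d(c_out), nn.ReLU(inplace=True))

class CVD(nn.Module):
    """Two encoders (physiological / non-physiological) and one decoder."""
    def __init__(self, in_ch=3, feat_ch=32):
        super().__init__()
        self.enc_phys = nn.Sequential(conv_block(in_ch, feat_ch), conv_block(feat_ch, feat_ch))
        self.enc_noise = nn.Sequential(conv_block(in_ch, feat_ch), conv_block(feat_ch, feat_ch))
        self.dec = nn.Sequential(conv_block(2 * feat_ch, feat_ch),
                                 nn.Conv2d(feat_ch, in_ch, 3, padding=1))
        # Head regressing an average HR value from pooled physiological features.
        self.hr_head = nn.Linear(feat_ch, 1)

    def decode(self, f_phys, f_noise):
        return self.dec(torch.cat([f_phys, f_noise], dim=1))

    def forward(self, x1, x2, hr1, hr2):
        # Encode both MSTmaps into physiological and non-physiological features.
        p1, n1 = self.enc_phys(x1), self.enc_noise(x1)
        p2, n2 = self.enc_phys(x2), self.enc_noise(x2)

        # Self-reconstruction: each map is rebuilt from its own feature pair.
        rec_loss = F.l1_loss(self.decode(p1, n1), x1) + F.l1_loss(self.decode(p2, n2), x2)

        # Pseudo maps: swap the non-physiological features between the pair.
        x12, x21 = self.decode(p1, n2), self.decode(p2, n1)

        # Cross-verification: re-encode the pseudo maps; the physiological feature
        # should follow its physiological source, the non-physiological feature
        # should follow its non-physiological source.
        cross_loss = (F.l1_loss(self.enc_phys(x12), p1.detach())
                      + F.l1_loss(self.enc_noise(x12), n2.detach())
                      + F.l1_loss(self.enc_phys(x21), p2.detach())
                      + F.l1_loss(self.enc_noise(x21), n1.detach()))

        # Task supervision: average HR predicted from pooled physiological features.
        hr_loss = (F.l1_loss(self.hr_head(p1.mean(dim=(2, 3))).squeeze(1), hr1)
                   + F.l1_loss(self.hr_head(p2.mean(dim=(2, 3))).squeeze(1), hr2))

        return rec_loss + cross_loss + hr_loss

if __name__ == "__main__":
    model = CVD()
    # Two toy MSTmaps: batch of 4, 3 colour channels, 24 ROI rows, 300 frames.
    x1, x2 = torch.randn(4, 3, 24, 300), torch.randn(4, 3, 24, 300)
    hr1, hr2 = torch.full((4,), 72.0), torch.full((4,), 85.0)
    loss = model(x1, x2, hr1, hr2)
    loss.backward()
    print(float(loss))

In the full method the physiological branch would also regress the rPPG waveform and further vitals (HRV, RF); a single HR head is used here only to keep the sketch short.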

Notes

Acknowledgment

This work is partially supported by the National Key R&D Program of China (grant 2018AAA0102501), the Natural Science Foundation of China (grant 61672496), the Academy of Finland projects MiGA (grant 316765), 6+E (grant 323287), and ICT 2023 (grant 328115), and Infotech Oulu.

Supplementary material

Supplementary material 1: 504434_1_En_18_MOESM1_ESM.pdf (PDF, 341 KB)


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Key Laboratory of Intelligent Information Processing of Chinese Academy of Sciences (CAS), Institute of Computing Technology, CAS, Beijing, China
  2. University of Chinese Academy of Sciences, Beijing, China
  3. Center for Machine Vision and Signal Analysis, University of Oulu, Oulu, Finland
  4. Peng Cheng Laboratory, Shenzhen, China
