Coded Two-Bucket Cameras for Computer Vision

  • Mian Wei
  • Navid Sarhangnejad
  • Zhengfan Xia
  • Nikita Gusev
  • Nikola Katic
  • Roman Genov
  • Kiriakos N. Kutulakos
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11207)

Abstract

We introduce coded two-bucket (C2B) imaging, a new operating principle for computational sensors with applications in active 3D shape estimation and coded-exposure imaging. A C2B sensor modulates the light arriving at each pixel by controlling which of the pixel’s two “buckets” should integrate it. C2B sensors output two images per video frame—one per bucket—and allow rapid, fully-programmable, per-pixel control of the active bucket. Using these properties as a starting point, we (1) develop an image formation model for these sensors, (2) couple them with programmable light sources to acquire illumination mosaics, i.e., images of a scene under many different illumination conditions whose pixels have been multiplexed and acquired in one shot, and (3) show how to process illumination mosaics to acquire live disparity or normal maps of dynamic scenes at the sensor’s native resolution. We present the first experimental demonstration of these capabilities, using a fully-functional C2B camera prototype. Key to this unique prototype is a novel programmable CMOS sensor that we designed from the ground up, fabricated and turned into a working system.
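To make the bucket-selection principle described above concrete, the following minimal Python/NumPy sketch simulates two-bucket image formation for one video frame. It assumes the frame is divided into F subframes, each with its own illumination condition, and that a per-pixel binary code decides which bucket integrates the light during each subframe; the names used here (simulate_c2b_frame, subframe_images, codes) are illustrative assumptions, not the paper's released code or exact model.

    import numpy as np

    def simulate_c2b_frame(subframe_images, codes):
        """Simulate the two bucket images of one C2B video frame.

        subframe_images : (F, H, W) array, per-pixel radiance under each of
                          the F illumination conditions (one per subframe).
        codes           : (F, H, W) binary array; codes[f, y, x] = 1 routes
                          the light at pixel (y, x) during subframe f to
                          bucket 1, otherwise to bucket 0.
        Returns the bucket-0 and bucket-1 images, each (H, W).
        """
        codes = codes.astype(subframe_images.dtype)
        bucket1 = (codes * subframe_images).sum(axis=0)          # active bucket
        bucket0 = ((1.0 - codes) * subframe_images).sum(axis=0)  # complementary bucket
        return bucket0, bucket1

    # Example: four illumination conditions multiplexed over 2x2 pixel tiles,
    # i.e. a one-shot "illumination mosaic" in the spirit of a Bayer mosaic.
    F, H, W = 4, 128, 128
    rng = np.random.default_rng(0)
    subframe_images = rng.random((F, H, W))

    codes = np.zeros((F, H, W))
    for f in range(F):                      # pixel f of each 2x2 tile sends light
        codes[f, f // 2::2, f % 2::2] = 1   # to bucket 1 only during subframe f
    b0, b1 = simulate_c2b_frame(subframe_images, codes)

In this toy mosaic each pixel's bucket-1 value samples exactly one illumination condition, while bucket 0 accumulates the remaining ones, which is what makes one-shot demosaicking into per-condition images possible.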

Acknowledgements

We gratefully acknowledge the support of the Natural Sciences and Engineering Research Council of Canada under the RGPIN, RTI and SGP programs, and of DARPA under the REVEAL program. We also wish to thank Hui Feng Ke and Gilead Wolf Posluns for FPGA programming related to the C2B sensor, Sarah Anne Kushner for help with live imaging experiments, and Michael Brown, Harel Haim and the anonymous reviewers for their many helpful comments and suggestions on earlier versions of this manuscript.

Supplementary material

Supplementary material 1 (mp4 88844 KB)

Supplementary material 2 (pdf 1645 KB)


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Mian Wei (1)
  • Navid Sarhangnejad (2)
  • Zhengfan Xia (2)
  • Nikita Gusev (2)
  • Nikola Katic (2)
  • Roman Genov (2)
  • Kiriakos N. Kutulakos (1)
  1. Department of Computer Science, University of Toronto, Toronto, Canada
  2. Department of Electrical Engineering, University of Toronto, Toronto, Canada