
Digital Forensics for Frame Rate Up-Conversion in Wireless Sensor Network

  • Wendan Ma
  • Ran Li
Chapter
Part of the Transactions on Computational Science and Computational Intelligence book series (TRACOSCI)

Abstract

With the rapid development of wireless sensor networks, the transmission and processing of multimedia data have gradually become the main task of wireless sensors. To reduce the transmission bandwidth, many wireless sensors use frame rate up-conversion (FRUC) to recover dropped frames at the receiver. FRUC is essentially a temporal-domain tampering operation performed on the video at the receiver, so FRUC forgery can be revealed by analyzing statistical features of the video. In this chapter, a forensics algorithm based on edge features is proposed to discover forged traces of FRUC by detecting edge variations across video frames. First, the Sobel operator is used to detect the edges of each video frame. Then, the edge map is quantified to obtain the edge complexity of each frame. Finally, the periodicity of the edge complexity along the time axis is detected, and FRUC forgery is automatically identified by a hard-threshold decision. Experimental results show that the proposed algorithm achieves good forensic performance against different FRUC forgery methods; in particular, even after de-noising and compression attacks, it still maintains high detection accuracy.
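A minimal sketch of this three-step pipeline (Sobel edge detection, per-frame edge complexity, periodicity test with a hard threshold) is given below, assuming grayscale frames supplied as NumPy arrays. The binarisation rule, the FFT-based periodicity measure, and the peak_ratio threshold are illustrative assumptions, not the chapter's exact implementation.

    import numpy as np
    from scipy import ndimage


    def edge_complexity(frame):
        """Edge complexity of one grayscale frame: Sobel gradient magnitude,
        binarised and averaged (a simple quantification of edge content)."""
        gx = ndimage.sobel(frame.astype(float), axis=1)   # horizontal gradient
        gy = ndimage.sobel(frame.astype(float), axis=0)   # vertical gradient
        mag = np.hypot(gx, gy)
        edges = mag > mag.mean()          # illustrative binarisation threshold
        return edges.mean()               # fraction of edge pixels in the frame


    def detect_fruc(frames, peak_ratio=4.0):
        """Flag FRUC forgery when the edge-complexity sequence has a strong
        periodic component (interpolated frames disturb edges at a regular
        interval). `peak_ratio` is a hypothetical hard-decision threshold."""
        c = np.array([edge_complexity(f) for f in frames], dtype=float)
        c -= c.mean()                                     # remove the DC level
        spectrum = np.abs(np.fft.rfft(c))[1:]             # one-sided spectrum, DC dropped
        return spectrum.max() / (spectrum.mean() + 1e-12) > peak_ratio

For example, calling detect_fruc(frames) on the decoded frame sequence would return True when a dominant spectral peak indicates periodically interpolated frames.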

Keywords

Wireless sensor network · Frame rate up-conversion · Digital forensics · Edge feature · Periodicity detection


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. School of Computer and Information Technology, Xinyang Normal University, Xinyang, China
