Abstract

Performing detection on surveillance videos contributes significantly to the goals of safety and security. However, running detection on unprotected surveillance video may reveal the privacy of innocent people in the video. Therefore, striking a proper balance between maintaining personal privacy and preserving the feasibility of detection is an important issue. One promising solution is to encrypt the surveillance videos and perform detection directly on the encrypted videos. Most existing encrypted signal processing methods focus on still images or small data volumes; because videos are typically much larger, processing encrypted videos remains a significant challenge. In this article, we propose an efficient motion detection and tracking scheme for encrypted H.264/AVC video bitstreams that does not require prior decryption of the encrypted video. The main idea is to first estimate motion information from the bitstream structure and codeword lengths, and then to apply a region update (RU) algorithm that handles the loss and error drift of motion information caused by video encryption. The RU algorithm is designed based on the prior knowledge that object motion in a video is continuous in space and time. Compared to the existing scheme, which relies on pixel-level video encryption, the proposed scheme requires only a small amount of storage for the encrypted video and has a low computational cost for both encryption and detection. Experimental results show that our scheme performs better in terms of detection accuracy and execution speed. Moreover, the proposed scheme can work with more than one format-compliant video encryption method, provided that the positions of the macroblocks can be extracted from the encrypted video bitstream. Because the video stream encryption and detection algorithms are coupled, our scheme can be connected directly to the video stream output (e.g., from surveillance cameras) without requiring any camera modifications.
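To make the high-level idea concrete, the following Python sketch illustrates, under our own simplifying assumptions rather than the paper's exact algorithm, how coarse motion activity might be estimated per macroblock from codeword lengths parsed out of a format-compliant encrypted bitstream, and then smoothed by a simple region-update step that exploits spatial and temporal continuity. The function names (motion_candidates, region_update), the fixed length threshold, and the neighbor-voting rule are illustrative assumptions.

```python
# Hypothetical sketch: estimate per-macroblock motion from codeword lengths
# of an encrypted H.264/AVC bitstream, then smooth the result with a simple
# spatio-temporal "region update" step. Names and thresholds are assumptions,
# not the paper's exact RU algorithm.
import numpy as np

def motion_candidates(codeword_lengths, threshold=16):
    """Mark macroblocks whose coded length suggests motion.

    codeword_lengths: 2-D array (MB rows x MB cols) of bits spent per
    macroblock, obtainable from the bitstream structure without decryption.
    """
    return (codeword_lengths > threshold).astype(np.uint8)

def region_update(current, previous, neighbor_votes=3):
    """Recover macroblocks lost to encryption-induced errors, assuming
    object motion is continuous in space (neighbors) and time (previous frame).
    """
    padded = np.pad(current, 1)
    # Count active 8-neighbors for every macroblock.
    neighbors = sum(
        padded[1 + dy : padded.shape[0] - 1 + dy,
               1 + dx : padded.shape[1] - 1 + dx]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)
    )
    # Re-activate a block if it was active in the previous frame and enough
    # of its neighbors are active in the current frame.
    recovered = (previous == 1) & (neighbors >= neighbor_votes)
    return np.maximum(current, recovered.astype(np.uint8))

# Toy usage: one 6x6 macroblock grid across two frames.
prev_mask = np.zeros((6, 6), dtype=np.uint8)
prev_mask[2:4, 2:4] = 1
lengths = np.random.default_rng(0).integers(0, 32, size=(6, 6))
mask = region_update(motion_candidates(lengths), prev_mask)
print(mask)
```

In the actual scheme, the per-macroblock codeword lengths would come from parsing the format-compliant encrypted bitstream, where macroblock positions remain extractable; this is what allows detection to proceed without any decryption.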