Publication
Featured research published by Jeffrey Adam Bloom.
Proceedings of SPIE | 2009
Dekun Zou; Jeffrey Adam Bloom
This work addresses the watermarking of an entropy-coded H.264/AVC video stream. The term Substitution Watermarking is used to indicate that the watermark is applied to the stream by substituting an original block of bits in the entropy-encoded stream with an alternative block of bits. This substitution is performed for many different blocks of bits to embed the watermark. The technique can be particularly powerful for applications in which the embedder must be very simple (substitution is a very lightweight operation) and a computationally complex pre-embedding analysis is practical. The pre-embedding analysis can generate a substitution table, and the embedder can simply select entries from the table based on the payload. This paper presents the framework along with an example for H.264/AVC streams that use CAVLC for entropy coding. A separate paper addresses the CABAC entropy coding case.
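To illustrate the split between the heavy pre-embedding analysis and the lightweight embedder described in the abstract, the following Python sketch performs table-driven substitution on a raw byte buffer. The table format, the function embed_payload, and the offsets are hypothetical illustrations, not the paper's implementation; a real CAVLC analysis would have to identify bit blocks whose substitution leaves the stream decodable.

```python
# Minimal, hypothetical sketch of table-driven substitution embedding.
# A real pre-embedding analysis would parse CAVLC syntax elements and find
# bit blocks whose replacement keeps the stream decodable; here the table is
# simply given as a dict: offset -> (original_bytes, bytes_for_0, bytes_for_1).

def embed_payload(stream: bytearray, table, payload_bits):
    """Lightweight embedder: for each payload bit, overwrite the block at the
    table entry's offset with the alternative selected by that bit."""
    for (offset, (orig, alt0, alt1)), bit in zip(sorted(table.items()), payload_bits):
        replacement = alt1 if bit else alt0
        assert len(replacement) == len(orig)   # substitution must preserve length
        stream[offset:offset + len(orig)] = replacement
    return stream

# Hypothetical usage: embed a 2-bit payload at two precomputed table entries.
table = {
    0x10: (b"\xa4", b"\xa4", b"\xa6"),   # entry: original, bit-0 variant, bit-1 variant
    0x48: (b"\x3c", b"\x3c", b"\x3e"),
}
marked = embed_payload(bytearray(128), table, [1, 0])
```

The point of the sketch is only that embedding reduces to a few memory writes once the table exists; all knowledge of the stream format lives in the analysis that builds the table.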
Proceedings of SPIE | 2011
Chunhua Chen; Wen Chen; Jeffrey Adam Bloom
The perceptual quality of digital imagery is of great interest in many applications. Blur artifacts can be among the most annoying in processed images and video sequences. In many applications of perceptual quality assessment, a reference is not available, so no-reference blurriness measures are of interest. In this paper, we present a universal, reference-free blurriness measurement approach. While some other methods are designed for a particular source of blurriness, such as block-based compression, the proposed method is universal in that it should work for any source of blur. The proposed approach models the gradient image of the given image as a Markov chain and uses the transition probabilities to compute a blurriness measure. This is the first time that transition probabilities have been applied to perceptual quality assessment. Specifically, we first compute the transition probabilities for selected pairs of gradient values and then combine these probabilities, using a pooling strategy, to formulate the blurriness measure. Experimental studies compare the proposed method to state-of-the-art reference-free blurriness measurement algorithms and show that it outperforms the commonly used measures.
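As a rough sketch of the idea rather than the authors' algorithm, the Python/NumPy code below quantizes horizontal gradient magnitudes into a few states, estimates the transition probabilities between adjacent gradients, and pools a single transition probability into a score. The quantization step, the number of states, and the choice to pool only the low-gradient self-transition are assumptions made for illustration.

```python
import numpy as np

def blurriness_score(img: np.ndarray, step: int = 16, n_states: int = 9) -> float:
    """Toy blurriness score: model quantized horizontal gradients as a Markov
    chain and return the probability of staying in the lowest-gradient state
    (a stand-in for the paper's pooling over selected gradient-value pairs)."""
    grad = np.abs(np.diff(img.astype(np.float64), axis=1))        # horizontal gradient image
    states = np.minimum(grad // step, n_states - 1).astype(int)   # quantize to 0..n_states-1
    counts = np.zeros((n_states, n_states))
    np.add.at(counts, (states[:, :-1].ravel(), states[:, 1:].ravel()), 1)
    probs = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)
    return float(probs[0, 0])   # blurrier images spend more time in the flat state

# Toy check: heavy horizontal smoothing of a random image should raise the score.
rng = np.random.default_rng(0)
sharp = rng.integers(0, 256, (64, 64)).astype(float)
blurred = np.mean([np.roll(sharp, k, axis=1) for k in range(-4, 5)], axis=0)
print(blurriness_score(sharp), blurriness_score(blurred))
```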
Proceedings of SPIE | 2009
W. Sabrina Lin; Shan He; Jeffrey Adam Bloom
Digital forensic marking is a technology to discourage unauthorized redistribution of multimedia signals by embedding a unique mark into each user's copy of the content. A powerful class of attacks on forensic marking is the collusion attack by a group of users. Recently, a new collusion attack, called the minority attack, has been proposed against forensic marking schemes with correlation-based detectors. Although this attack is not very effective against Gaussian-based forensic marking, it is quite powerful at removing the traces of users when the forensic marking is binary. In this paper, we first study the performance of an ECC-based binary forensic code under the minority attack, modeling the additional processing applied to the colluded copy, such as compression, as a binary symmetric channel. We confirm that the system can be defeated by a minority attack from only 3 colluders. To resist the minority attack, we propose a row-permuted binary orthogonal code to serve as the inner code of the ECC-based forensic code, coupled with an adaptive detector. Experimental results show that the proposed scheme has significantly improved resistance to the minority attack.
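The following toy simulation, an assumption-laden sketch that uses random antipodal codes in place of the paper's ECC-based construction and omits the binary-symmetric-channel distortion, illustrates why the minority attack is so damaging to a plain correlation detector: with three colluders emitting the minority bit wherever their copies disagree, each colluder matches the colluded copy in only about half the positions, so colluder scores are statistically indistinguishable from innocent ones. The paper's defense, a row-permuted binary orthogonal inner code with an adaptive detector, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, code_len, n_colluders = 100, 4096, 3

# Random antipodal (+1/-1) fingerprints stand in for the ECC-based binary code.
codes = rng.choice([-1, 1], size=(n_users, code_len))

# Minority attack: where the colluders' bits disagree, emit the minority value;
# where they all agree, that value is forced (the usual marking assumption).
sums = codes[:n_colluders].sum(axis=0)                    # values in {-3, -1, +1, +3}
colluded = np.where(np.abs(sums) == n_colluders, np.sign(sums), -np.sign(sums))

# Simple correlation detector: normalized correlation of each user's code with
# the colluded copy. Colluders do not stand out from the innocent users.
scores = codes @ colluded / code_len
print("colluder scores:", np.round(scores[:n_colluders], 3))
print("innocent scores: min", round(scores[n_colluders:].min(), 3),
      "max", round(scores[n_colluders:].max(), 3))
```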
Archive | 2009
Shan He; Jeffrey Adam Bloom; Dekun Zou
Archive | 2009
Dekun Zou; Jeffrey Adam Bloom; Shan He
Archive | 2009
Dekun Zou; Jeffrey Adam Bloom; Shan He
Archive | 2009
Shan He; Jeffrey Adam Bloom; Dekun Zou
Archive | 2009
Dekun Zou; Jeffrey Adam Bloom; Shan He
Archive | 2009
Dekun Zou; Jeffrey Adam Bloom; Shan He
Archive | 2011
Shan He; Dekun Zou; Jeffrey Adam Bloom