Gökhan Gül
Technische Universität Darmstadt
Publications
Featured research published by Gökhan Gül.
information hiding | 2011
Gökhan Gül; Fatih Kurugollu
This paper presents a new methodology for the steganalysis of digital images. In principle, the proposed method is applicable to any kind of steganography in any domain. Special interest is put on the steganalysis of Highly Undetectable Steganography (HUGO). The proposed method first extracts features by applying a function to the image, constructing k-variate probability density function (PDF) estimates, and downsampling them with a suitable downsampling algorithm. The extracted feature vectors are then further optimized in order to increase the detection performance and reduce the computational time. Finally, steganalysis is performed using a supervised classification algorithm such as an SVM. The proposed method achieves a detection accuracy of 85% on the BOSSRank image set.
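As a concrete illustration of a low-order (k = 2) PDF estimate, the sketch below builds a normalized co-occurrence histogram of clipped horizontal pixel residuals. The function name, the residual choice, and the clipping threshold `T` are illustrative assumptions, not the paper's exact feature set:

```python
def cooccurrence_pdf(image, T=3):
    # 2-variate PDF estimate (normalized co-occurrence histogram) of
    # horizontally adjacent residuals r[i][j] = image[i][j+1] - image[i][j];
    # residuals are clipped to [-T, T] to keep the histogram dense.
    size = 2 * T + 1
    hist = [[0] * size for _ in range(size)]
    count = 0
    for row in image:
        resid = [min(T, max(-T, row[j + 1] - row[j])) for j in range(len(row) - 1)]
        for a, b in zip(resid, resid[1:]):
            hist[a + T][b + T] += 1
            count += 1
    # normalize the counts so the histogram sums to one (a PDF estimate)
    return [[h / count for h in line] for line in hist]
```

In a full pipeline, such PDF estimates would be flattened into feature vectors and fed to the classifier.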
international conference on image processing | 2010
Gökhan Gül; Ismail Avcibas; Fatih Kurugollu
In this paper, we present a novel method based on singular value decomposition (SVD) for the forensic analysis of digital images. We show that image tampering distorts the linear dependencies of image rows/columns and that features derived from these dependencies are accurate enough to detect image manipulations and digital forgeries. Extensive experiments show that the proposed approach outperforms its counterparts in the literature.
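The row/column-dependency idea can be illustrated with a small sketch: the fraction of a block's Frobenius energy not explained by its leading singular value is near zero when rows/columns are (nearly) linearly dependent, and grows as the dependency is distorted. The power-iteration estimate and the feature name are illustrative, not the paper's exact features:

```python
def leading_singular_value(A, iters=200):
    # estimate the largest singular value of A (list of rows) by power
    # iteration on A^T A; pure-Python stand-in for a full SVD
    n = len(A[0])
    v = [1.0 / n ** 0.5] * n
    for _ in range(iters):
        Av = [sum(a[j] * v[j] for j in range(n)) for a in A]
        w = [sum(A[i][k] * Av[i] for i in range(len(A))) for k in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    Av = [sum(a[j] * v[j] for j in range(n)) for a in A]
    return sum(x * x for x in Av) ** 0.5

def tail_energy_feature(A):
    # fraction of Frobenius energy not captured by the leading singular
    # value: near 0 for (nearly) rank-1 blocks, larger otherwise
    fro2 = sum(x * x for row in A for x in row)
    s1 = leading_singular_value(A)
    return max(0.0, 1.0 - s1 * s1 / fro2)
```

A rank-1 block (perfectly dependent rows) yields a feature value of zero; tampering that breaks the dependency pushes it upward.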
international conference on acoustics, speech, and signal processing | 2013
Gökhan Gül; Abdelhak M. Zoubir
We propose a minimax robust hypothesis testing strategy between two composite hypotheses determined by the neighborhoods of two nominal distributions with respect to the squared Hellinger distance. The robust tests obtained are nonlinearly transformed versions of the nominal likelihood ratios, whereas the least favorable densities are derived in three different regions. In two of them, they are scaled versions of the corresponding nominal densities, and in the third region they form a composite version of the two nominal densities. The outcomes and implications of the proposed robust test are discussed through comparisons with the recent literature.
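The uncertainty sets in this formulation are neighborhoods (balls) around the nominal densities under the squared Hellinger distance. For discrete densities, a minimal sketch, using the convention H²(p, q) = ½ Σᵢ (√pᵢ − √qᵢ)²:

```python
def squared_hellinger(p, q):
    # squared Hellinger distance between two discrete densities,
    # with the convention H^2(p, q) = 0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2,
    # so that 0 <= H^2 <= 1
    return 0.5 * sum((pi ** 0.5 - qi ** 0.5) ** 2 for pi, qi in zip(p, q))

def in_hellinger_ball(q, nominal, eps):
    # membership test for the composite hypothesis: all densities within
    # squared Hellinger distance eps of the nominal density
    return squared_hellinger(q, nominal) <= eps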
IEEE Transactions on Information Theory | 2017
Gökhan Gül; Abdelhak M. Zoubir
Minimax robust hypothesis testing is studied for the cases where the collected data samples are corrupted by outliers and are mismodeled due to modeling errors. For the former case, Huber's clipped likelihood ratio test is introduced and analyzed. For the latter case, first, a robust hypothesis testing scheme based on the Kullback–Leibler divergence is designed. This approach generalizes a previous work by Levy. Second, Dabak and Johnson's asymptotically robust test is introduced, and other possible designs based on f-divergences are investigated. All proposed and analyzed robust tests are extended to fixed sample size and sequential probability ratio tests. Simulations are provided to exemplify and evaluate the theoretical derivations.
international workshop on signal processing advances in wireless communications | 2013
Gökhan Gül; Abdelhak M. Zoubir
IEEE Transactions on Information Forensics and Security | 2013
Gökhan Gül; Fatih Kurugollu
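Huber's clipped likelihood ratio test, mentioned in the IEEE Transactions on Information Theory (2017) abstract, replaces the nominal likelihood ratio with a censored version so that single outliers cannot dominate the test statistic. A minimal sketch; the clipping levels `c_lower` and `c_upper` are illustrative placeholders (in Huber's theory they follow from the contamination model):

```python
import math

def clipped_lr(p0, p1, c_lower, c_upper):
    # Huber-style clipped likelihood ratio: the nominal ratio p1/p0 is
    # censored from below and above
    return min(c_upper, max(c_lower, p1 / p0))

def robust_log_lr(samples, f0, f1, c_lower, c_upper):
    # accumulate the clipped log-likelihood ratio over i.i.d. samples,
    # as in a fixed sample size or sequential probability ratio test
    return sum(math.log(clipped_lr(f0(x), f1(x), c_lower, c_upper))
               for x in samples)
```

An extreme observation thus contributes at most log(c_upper) (or at least log(c_lower)) to the accumulated statistic.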
IEEE Transactions on Signal Processing | 2016
Gökhan Gül; Abdelhak M. Zoubir
We consider the design of robust hypothesis testing when the sensors censor their observations in order to comply with communication rate constraints. We assume that the sensors are identical and the communication constraint is divided evenly among the sensors. As a result, the scenario considered boils down to a single sensor communicating with a fusion center. The design phase is divided into two parts. In the first part, the censoring regions are assumed to be determined a priori and least favorable densities (LFDs) are sought under the communication rate constraints, whereas in the second part, the LFDs are determined a priori and the communication rate constraints are satisfied by the use of a robustness parameter ε.
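Censoring can be sketched as a no-send region on the sensor's local log-likelihood ratio: uninformative observations are suppressed, and the width of the region is what meets the rate constraint. The thresholds `t_lower` and `t_upper` below are illustrative placeholders, not the paper's derived censoring regions:

```python
def censor(llr, t_lower, t_upper):
    # a censoring sensor stays silent when the local log-likelihood ratio
    # is uninformative (inside the no-send region) and transmits otherwise
    if t_lower < llr < t_upper:
        return None          # no transmission to the fusion center
    return llr

def transmission_rate(llrs, t_lower, t_upper):
    # empirical fraction of observations that trigger a transmission;
    # widening the no-send region lowers this rate
    sent = sum(1 for x in llrs if censor(x, t_lower, t_upper) is not None)
    return sent / len(llrs)
```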
IEEE Transactions on Signal Processing | 2017
Gökhan Gül; Abdelhak M. Zoubir
Blind steganalysis of JPEG images is addressed by modeling the correlations among the DCT coefficients using K-variate (K ≥ 2) probability density function (p.d.f.) estimates constructed by means of Markov random field (MRF) cliques. The rationale for using high-variate p.d.f.s together with MRF cliques for image steganalysis is explained via a classical detection problem. Although our approach has many improvements over the current state-of-the-art, it suffers from the high dimensionality and the sparseness of the high-variate p.d.f.s. The dimensionality problem as well as the sparseness problem are solved heuristically by means of dimensionality reduction and feature selection algorithms. The detection accuracy of the proposed method(s) is evaluated over Memon's (30,000 images) and Goljan's (1,912 images) image sets. It is shown that practically applicable steganalysis systems are possible with a suitable dimensionality reduction technique and that these systems can provide, in general, improved detection accuracy over the current state-of-the-art. Experimental results also justify this assertion.
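As a stand-in for the feature selection step (the paper's specific heuristics are not reproduced here), a simple two-class Fisher-score filter illustrates how high-dimensional, sparse feature sets can be ranked and pruned before classification:

```python
def fisher_score(feature_cover, feature_stego):
    # two-class Fisher criterion for a single feature:
    # (mean separation)^2 / (sum of class variances); higher = more useful
    def mean(v):
        return sum(v) / len(v)
    def var(v):
        m = mean(v)
        return sum((x - m) ** 2 for x in v) / len(v)
    num = (mean(feature_cover) - mean(feature_stego)) ** 2
    den = var(feature_cover) + var(feature_stego) + 1e-12
    return num / den

def select_top_k(cover_mat, stego_mat, k):
    # rank feature columns by Fisher score and keep the indices of the
    # k most discriminative ones
    scores = [
        (fisher_score([r[j] for r in cover_mat], [r[j] for r in stego_mat]), j)
        for j in range(len(cover_mat[0]))
    ]
    return [j for _, j in sorted(scores, reverse=True)[:k]]
```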
ieee signal processing workshop on statistical signal processing | 2014
Gökhan Gül; Abdelhak M. Zoubir
A robust minimax test for two composite hypotheses, which are determined by the neighborhoods of two nominal distributions with respect to a set of distances called α-divergence distances, is proposed. Sion's minimax theorem is adopted to characterize the saddle value condition. Least favorable distributions, the robust decision rule and the robust likelihood ratio test are derived. If the nominal probability distributions satisfy a symmetry condition, the design procedure is shown to be simplified considerably. The parameters controlling the degree of robustness are bounded from above, and the bounds are shown to result from the solution of a set of equations. The simulations performed evaluate and exemplify the theoretical derivations.
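For discrete densities, the α-divergence family can be sketched under one common parameterization (conventions differ across the literature; α ∉ {0, 1}, with the KL divergence recovered as α → 1):

```python
def alpha_divergence(p, q, alpha):
    # one common parameterization of the alpha-divergence between two
    # discrete densities p and q (alpha not in {0, 1}):
    #   D_alpha(p || q) = (sum_i p_i^alpha * q_i^(1-alpha) - 1) / (alpha*(alpha-1))
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return (s - 1.0) / (alpha * (alpha - 1.0))
```

Setting α = 2 gives the chi-square-type divergence Σᵢ pᵢ²/qᵢ − 1 up to the 1/2 factor, and D vanishes exactly when p = q.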
Archive | 2017
Gökhan Gül
Minimax decentralized detection is studied under two scenarios, with and without a fusion center, when the source of uncertainty is the Bayesian prior. When there is no fusion center, the constraints in the network design are determined. For both a single decision maker and multiple decision makers, the maximum loss in detection performance due to minimax decision making is obtained. In the presence of a fusion center, the maximum loss in detection performance between networks with and without a fusion center is derived, assuming that both networks are minimax robust. The results are finally generalized.
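When the prior is the source of uncertainty, the Bayes risk is linear in the prior, so the worst case sits at an endpoint, and a deterministic minimax likelihood-ratio test minimizes the larger of the two error probabilities. A toy sketch over a finite grid of thresholds (the grid and distributions are illustrative, not the paper's setup):

```python
def error_probs(p0, p1, support, tau):
    # deterministic likelihood-ratio test: decide H1 when p1(x)/p0(x) > tau;
    # returns (false-alarm probability, miss probability)
    pfa = sum(p0[x] for x in support if p1[x] > tau * p0[x])
    pmiss = sum(p1[x] for x in support if p1[x] <= tau * p0[x])
    return pfa, pmiss

def minimax_threshold(p0, p1, support, grid):
    # pick the threshold with the smallest worst-case error max(P_FA, P_MISS);
    # the minimax rule tends to equalize the two error probabilities
    return min(grid, key=lambda t: max(error_probs(p0, p1, support, t)))
```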