
Publications


Featured research published by Jihad Fahs.


IEEE Transactions on Information Theory | 2012

Using Hermite Bases in Studying Capacity-Achieving Distributions Over AWGN Channels

Jihad Fahs; Ibrahim C. Abou-Faycal

This paper studies classes of generic deterministic, discrete-time, memoryless, and “nonlinear” additive white Gaussian noise (AWGN) channels. Subject to multiple types of constraints, such as even-moment and compact-support constraints or a mixture of the two, the optimal input is proved to be discrete with a finite number of mass points in the vast majority of cases. Only under the even-moment constraint, and for special cases that emulate the average-power-constrained linear channel, is capacity found to be achieved by an absolutely continuous input. The results are extended to channels where the distortion is generally piecewise nonlinear, and the discrete nature of the optimal input is preserved there as well. These results are reached through the development of a methodology and tools based on standard decompositions in a Hilbert space with the Hermite polynomials as a basis, and it is showcased how these bases are natural candidates for general information-theoretic studies of the capacity of channels affected by AWGN. Along the way, novel results on the output rate of decay of Gaussian channels are derived: the output probability distribution of any channel subjected to additive Gaussian noise necessarily decays “slower” than the Gaussian itself. Finally, numerical computations are provided for some sample cases, optimal inputs are determined, and capacity curves are drawn. These results call into question the accuracy of adopting the widely used expression (1/2) log(1 + SNR) for computing the capacities of deterministic Gaussian channels.
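
As an aside, the Hermite-basis machinery referred to above can be illustrated with a short, self-contained sketch (not the paper's derivation): expanding a smooth, rapidly decaying function in the physicists' Hermite polynomials, which are orthogonal under the Gaussian weight e^(-x^2). The function g below is an arbitrary stand-in for an output-density-like profile, and the truncation order and quadrature size are illustrative choices.

```python
# Minimal sketch: projection onto the Hermite basis of L^2(R, e^{-x^2} dx).
import numpy as np
from numpy.polynomial import hermite as H
from math import sqrt, pi, factorial

def hermite_coeffs(g, n_terms=40, n_quad=160):
    """Coefficients of g in the physicists' Hermite basis, via Gauss-Hermite quadrature."""
    x, w = H.hermgauss(n_quad)                 # nodes/weights for integrals against e^{-x^2}
    coeffs = []
    for n in range(n_terms):
        basis = np.zeros(n + 1)
        basis[n] = 1.0
        Hn = H.hermval(x, basis)               # H_n evaluated at the quadrature nodes
        norm = sqrt(pi) * (2.0 ** n) * factorial(n)   # ||H_n||^2 under the Gaussian weight
        coeffs.append(np.sum(w * g(x) * Hn) / norm)
    return np.array(coeffs)

# Arbitrary smooth, rapidly decaying test function standing in for an output profile.
g = lambda x: np.exp(-0.5 * x ** 2) * (1.0 + 0.3 * np.cos(2.0 * x))
c = hermite_coeffs(g)
xt = np.linspace(-3.0, 3.0, 13)
print(np.max(np.abs(H.hermval(xt, c) - g(xt))))  # truncated expansion tracks g on [-3, 3]
```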


International Symposium on Information Theory | 2012

On the capacity of additive white alpha-stable noise channels

Jihad Fahs; Ibrahim C. Abou-Faycal

Many communication channels are reasonably modeled as being impaired by additive noise. Recent studies suggest that many of these channels are affected by additive noise that is best described by alpha-stable statistics. In this work, we study such channel models and characterize the capacity-achieving input distribution for these channels under fractional-order moment constraints. We prove that the optimal input is necessarily discrete with a compact support for all such channels. Interestingly, if the second moment is viewed as a measure of power, then even when the channel input is allowed to have an infinite second moment, the optimal one is found to have finite power.
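
A minimal numerical illustration of the moment behaviour contrasted above (not part of the paper's argument; alpha and the moment order are arbitrary choices): for symmetric alpha-stable noise with alpha < 2, the empirical second moment keeps drifting as the sample grows, while a fractional moment of order r < alpha settles.

```python
# Illustrative only: alpha-stable noise has infinite variance for alpha < 2,
# but finite fractional moments E|N|^r for r < alpha.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)
alpha = 1.5
for n in (10**4, 10**5, 10**6):
    noise = levy_stable.rvs(alpha, 0.0, size=n, random_state=rng)  # beta = 0: symmetric
    print(n,
          np.mean(noise ** 2),            # empirical 2nd moment: tends to keep growing
          np.mean(np.abs(noise)))         # fractional moment (r = 1 < alpha): stabilizes
```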


International Symposium on Information Theory | 2014

A Cauchy input achieves the capacity of a Cauchy channel under a logarithmic constraint

Jihad Fahs; Ibrahim C. Abou-Faycal

In this work, we consider a discrete-time memoryless communication channel where the input is subjected to independent additive Cauchy noise. We find the input constraint under which a Cauchy input is capacity achieving. The constraint is logarithmic and depends on a scalar parameter k, which we interpret as a power measure. We draw a parallel between this setup and that of the Gaussian channel under the second-moment constraint. In fact, a Cauchy input yields a Cauchy output over this channel and achieves a capacity value of “log(1 + SNR)”.
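
As a quick sanity check of the closure property behind this result (standard Cauchy facts, not the paper's proof), the rate achieved by a Cauchy input can be computed directly; the optimality of that input under the logarithmic constraint is what the paper establishes. Reading the scale ratio as an SNR is an interpretation made here, since the paper's power measure is the parameter k of its constraint.

```latex
% X ~ Cauchy(0, gamma_X) independent of N ~ Cauchy(0, gamma_N); scales add under summation,
% and a Cauchy variable with scale gamma has differential entropy log(4 pi gamma):
\[
  Y = X + N \sim \mathrm{Cauchy}(0,\,\gamma_X + \gamma_N), \qquad
  h\bigl(\mathrm{Cauchy}(0,\gamma)\bigr) = \log(4\pi\gamma),
\]
\[
  I(X;Y) = h(Y) - h(N)
         = \log\bigl(4\pi(\gamma_X + \gamma_N)\bigr) - \log(4\pi\gamma_N)
         = \log\Bigl(1 + \frac{\gamma_X}{\gamma_N}\Bigr),
\]
% which has the quoted log(1 + SNR) form once the scale ratio is read as an SNR.
```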


International Conference on Telecommunications | 2010

On the capacity of some deterministic non-linear channels subject to additive white Gaussian noise

Ibrahim C. Abou-Faycal; Jihad Fahs

We consider a variety of memoryless discrete-time noisy communication channels where the noise is modeled as an additive white Gaussian noise process, and where the input of the channel is distorted according to a deterministic function f(X) of the form i) f(X) = αX^n, ii) f(X) = α|X|^n, iii) f(X) = α|X|^(1/q), iv) f(X) = α sgn(X)|X|^(1/q), for all α ∈ ℝ*, n ∈ ℕ*\{1}. Subject to an average power constraint, we study the capacity-achieving input distributions of these classes of channels and prove them to be discrete, except for the linear case f(X) = αX, where the optimal input is Gaussian distributed. Furthermore, we prove that these optimal input distributions have a finite number of mass points for classes iii) and iv). The results are reached through the development of a methodology and tools that are based on standard decompositions in a Hilbert space with the Hermite polynomials as a basis, and we conjecture that these bases are natural candidates for general information-theoretic studies of the capacity of channels affected by additive white Gaussian noise.
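
For a numerical feel of such channels, a rough Blahut-Arimoto sketch on a discretized cubic channel Y = X^3 + N is given below. This is not the paper's proof technique; for simplicity the input is restricted to a bounded grid (a compact-support-style constraint) rather than the paper's average power constraint, and the grids, unit noise variance, and iteration count are arbitrary choices.

```python
# Rough sketch: Blahut-Arimoto over a discretized nonlinear AWGN channel Y = X**3 + N.
import numpy as np

def blahut_arimoto(W, iters=500):
    """W[i, j] = P(y_j | x_i). Returns a capacity estimate (nats) and the input pmf."""
    p = np.full(W.shape[0], 1.0 / W.shape[0])
    for _ in range(iters):
        q = p @ W                                  # output distribution induced by p
        d = np.sum(W * np.log(W / q), axis=1)      # D( W(.|x_i) || q ) for each input
        p = p * np.exp(d)
        p /= p.sum()
    q = p @ W
    d = np.sum(W * np.log(W / q), axis=1)
    return float(np.sum(p * d)), p

x = np.linspace(-2.0, 2.0, 81)                     # bounded (compact-support-style) input grid
y = np.linspace(-10.0, 10.0, 601)                  # output grid, wide enough for x**3 plus noise
W = np.exp(-0.5 * (y[None, :] - x[:, None] ** 3) ** 2)   # unit-variance Gaussian noise
W /= W.sum(axis=1, keepdims=True)                  # normalize the discretized transition rows

C, p_opt = blahut_arimoto(W)
print("capacity estimate:", C / np.log(2), "bits per channel use")
print("inputs carrying > 1% probability:", x[p_opt > 0.01])  # mass tends to cluster on a few points
```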


IEEE Transactions on Communications | 2016

On the Finiteness of the Capacity of Continuous Channels

Jihad Fahs; Ibrahim C. Abou-Faycal

Evaluating the channel capacity is one of the key problems in information theory. In this work, we derive rather mild sufficient conditions under which the capacity of continuous channels is finite and achievable. These conditions are derived for generic, memoryless, and possibly nonlinear additive noise channels. The results are based on a novel sufficient condition that guarantees the convergence of differential entropies under pointwise convergence of probability density functions. Perhaps surprisingly, the finiteness of channel capacity holds for the majority of setups, including those where inputs and outputs possibly have infinite second moments.


International Conference on Telecommunications | 2012

The capacity of average power constrained additive non-Gaussian noise channels

Jihad Fahs; Nizar Ajeeb; Ibrahim C. Abou-Faycal

It is well known that a Gaussian input achieves the capacity of the linear additive white Gaussian noise channel under the average power constraint. While one might expect the optimal input of a continuous channel to be continuous as well, the Gaussian noise channel is the only scenario where this holds. In fact, we study in this paper the capacity-achieving inputs for the linear additive noise channel where the noise is not necessarily Gaussian. We impose an average power constraint on the input and prove that, except for the Gaussian channel, the optimal input is of a discrete nature.


International Symposium on Information Theory | 2011

On the detrimental effect of assuming a linear model for non-linear AWGN channels

Jihad Fahs; Ibrahim C. Abou-Faycal

In communication theory, one of the best understood and most commonly adopted channel models is the average-power-constrained linear AWGN channel, whose capacity is given by the expression (1/2) log(1 + SNR). But what if the channel is not linear? How bad is it to adopt a linear model for a non-linear channel?
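
For reference, the linear-model baseline quoted above is the standard result for the average power constrained linear AWGN channel, stated here only as the benchmark against which the comparison is made:

```latex
% With E[X^2] <= P and noise variance sigma^2,
\[
  C_{\text{linear}} \;=\; \tfrac{1}{2}\log\!\Bigl(1 + \frac{P}{\sigma^2}\Bigr)
                    \;=\; \tfrac{1}{2}\log(1 + \mathrm{SNR}).
\]
```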


arXiv: Information Theory | 2016

Input Constraints and Noise Density Functions: A Simple Relation for Bounded-Support and Discrete Capacity-Achieving Inputs

Jihad Fahs; Ibrahim C. Abou-Faycal

We study the classical problem of characterizing the channel capacity and its achieving distribution in a generic fashion. We derive a simple relation between three parameters: the input–output function, the input cost function, and the noise probability density function; this relation dictates the type of the optimal input. In layman's terms, we prove that the support of the optimal input is bounded whenever the cost grows faster than a “cutoff” growth rate, equal to the logarithm of the inverse of the noise probability density function evaluated at the input–output function. Furthermore, we prove a converse statement: whenever the cost grows slower than the “cutoff” rate, the optimal input necessarily has an unbounded support. In addition, we show how the discreteness of the optimal input is guaranteed whenever the triplet satisfies certain analyticity properties. We argue that a suitable cost function to impose on the channel input is one that grows similarly to the “cutoff” rate. Our results are valid for any cost function that is super-logarithmic. They summarize a large number of previous channel-capacity results and give new ones for a wide range of communication channel models, such as Gaussian mixture, generalized Gaussian, and heavy-tailed noise models, which we state along with numerical computations.
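
A worked instance of the stated criterion for the familiar linear Gaussian case, obtained by simply plugging into the statement above (an illustration, not a new result):

```latex
% Linear channel f(x) = x with Gaussian noise N ~ N(0, sigma^2): the "cutoff" growth rate is
\[
  -\log p_N\bigl(f(x)\bigr) \;=\; \frac{x^2}{2\sigma^2} + \tfrac{1}{2}\log\bigl(2\pi\sigma^2\bigr),
\]
% i.e. quadratic in x. A cost growing faster than x^2 (e.g. |x|^4) gives a bounded-support
% optimal input, a cost growing slower (e.g. |x|) gives an unbounded-support one, and the
% classical average-power cost x^2 grows exactly like the cutoff rate.
```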


Entropy | 2015

A New Tight Upper Bound on the Entropy of Sums

Jihad Fahs; Ibrahim C. Abou-Faycal

We consider the independent sum of a given random variable with a Gaussian variable and an infinitely divisible one. We find a novel tight upper bound on the entropy of the sum, which holds even when the variable has a possibly infinite second moment. The proven bound has several implications for both information-theoretic problems and the transmission rates of infinitely divisible noise channels.
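
The bound itself is not reproduced here; the sketch below only shows a generic way to obtain a numerical reference point that such entropy bounds could be compared against: a spacing-based Monte Carlo estimate of h(X + Z) for a heavy-tailed X with infinite second moment plus an independent Gaussian Z. The choice of a Student-t variable and all parameters are illustrative assumptions.

```python
# Generic sketch: Vasicek m-spacing estimate of the differential entropy of a sum X + Z.
import numpy as np
from scipy.stats import t as student_t, norm

def spacing_entropy(samples, m=None):
    """Vasicek m-spacing estimator of differential entropy, in nats."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    m = m or int(np.sqrt(n))
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return float(np.mean(np.log((n / (2.0 * m)) * (upper - lower))))

rng = np.random.default_rng(1)
n = 200_000
x = student_t.rvs(df=1.5, size=n, random_state=rng)   # heavy tails: E[X^2] is infinite for df < 2
z = norm.rvs(size=n, random_state=rng)                # unit-variance Gaussian component

print("estimated h(X+Z) [nats]:", spacing_entropy(x + z))
print("h(Z) = 0.5*log(2*pi*e)  :", 0.5 * np.log(2 * np.pi * np.e))  # reference: h(X+Z) >= h(Z)
```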


International Conference on Communications | 2015

Interference modeling and capacity-coverage analysis in downlink cellular networks with user scheduling

Jihad Fahs; Naeem Akl; Zaher Dawy

Interference shapes the interplay between capacity and coverage in cellular networks. However, interference is nondeterministic and depends on various system and channel parameters, including user scheduling, frequency reuse, and fading variations. We present an analytical approach for modeling the distribution of intercell interference in the downlink of cellular networks as a function of generic fading channel models and various scheduling schemes. We demonstrate the usefulness of the derived expressions in calculating location-based and average-based data rates, in addition to capturing the practical interdependence between cell capacity and coverage in downlink cellular networks.
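
The sketch below is a crude Monte Carlo baseline for the kind of quantities derived analytically in the paper; the fixed interferer distances, path-loss exponent, full-load assumption, and Rayleigh fading are assumptions of the sketch, and no particular scheduling scheme is modeled.

```python
# Crude Monte Carlo baseline: downlink SINR and rate of a reference user with
# Rayleigh-faded intercell interference from fully loaded neighbouring base stations.
import numpy as np

rng = np.random.default_rng(2)
n_drops   = 200_000
path_exp  = 3.5                                         # path-loss exponent (assumed)
d_serving = 0.4                                         # distance to the serving BS (cell radius = 1)
d_interf  = np.array([1.6, 1.7, 1.8, 1.9, 2.0, 1.7])    # assumed distances to six interfering BSs
noise_pow = 1e-3                                        # thermal noise relative to unit TX power

h_s = rng.exponential(size=n_drops)                     # Rayleigh fading power, serving link
h_i = rng.exponential(size=(n_drops, d_interf.size))    # Rayleigh fading powers, interfering links

signal       = h_s * d_serving ** (-path_exp)
interference = (h_i * d_interf ** (-path_exp)).sum(axis=1)
sinr = signal / (interference + noise_pow)

rate = np.log2(1.0 + sinr)                              # Shannon-rate proxy, bits/s/Hz
print("mean rate:", rate.mean())
print("5th-percentile rate (coverage-oriented):", np.quantile(rate, 0.05))
```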

Collaboration


Dive into Jihad Fahs's collaborations.

Top Co-Authors

Naeem Akl, American University of Beirut
Nizar Ajeeb, American University of Beirut
Zaher Dawy, American University of Beirut