Network


Latest external collaborations at the country level.

Hotspot


The research topics in which Félix Hernández-Campos is active.

Publications


Featured research published by Félix Hernández-Campos.


International Wireless Internet Conference | 2006

Spatio-temporal modeling of traffic workload in a campus WLAN

Félix Hernández-Campos; Merkourios Karaliopoulos; Maria Papadopouli; Haipeng Shen

Campus wireless LANs (WLANs) are complex systems with hundreds of access points (APs) and thousands of users. Their performance analysis calls for realistic models of their elements, which can be input to simulation and testbed experiments but also taken into account in theoretical work. However, only a few modeling results in this area are derived from real measurement data, and rarely do they provide a complete and consistent view of entire WLANs. In this work, we address this gap, relying on extensive traces collected from the large wireless infrastructure of the University of North Carolina. We present a first system-wide, multi-level modeling approach for characterizing the traffic demand in a campus WLAN. Our approach focuses on two structures of wireless user activity, namely the wireless session and the network flow. We propose statistical distributions for their attributes, aiming at a parsimonious characterization that can be the most flexible foundation for simulation studies. We simulate our models and show that the synthesized traffic is in good agreement with the original trace data. Finally, we investigate to what extent these models can be valid at finer spatial aggregation levels of traffic load, e.g., for modeling traffic demand in hotspot APs.
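The "parsimonious characterization" idea above — a few standard distributions for session and flow attributes that a simulator can draw from — can be sketched in a few lines. The distribution families and every parameter value below are illustrative assumptions, not the values fitted from the UNC traces:

```python
import random

def synth_sessions(n, seed=0):
    """Generate synthetic wireless sessions: each session gets a
    lognormal duration, a heavy-tailed number of flows, and lognormal
    flow sizes. All parameters here are illustrative placeholders,
    not the distributions fitted in the paper."""
    rng = random.Random(seed)
    sessions = []
    for _ in range(n):
        duration = rng.lognormvariate(5.0, 1.5)           # seconds
        n_flows = max(1, int(rng.paretovariate(1.2)))     # heavy-tailed flow count
        flows = [rng.lognormvariate(8.0, 2.0) for _ in range(n_flows)]  # bytes
        sessions.append({"duration": duration, "flows": flows})
    return sessions

demo = synth_sessions(1000)
total_bytes = sum(sum(s["flows"]) for s in demo)
```

A workload generator built this way needs only the handful of fitted parameters to reproduce the statistical shape of the traffic, which is what makes the characterization flexible for simulation studies.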


Journal of Applied Statistics | 2011

Long-range dependence analysis of Internet traffic

Cheolwoo Park; Félix Hernández-Campos; Long Le; J. S. Marron; Juhyun Park; Vladas Pipiras; F.D. Smith; Richard L. Smith; Michele Trovero; Zhengyuan Zhu

Long-range-dependent time series are endemic in the statistical analysis of Internet traffic. The Hurst parameter provides a good summary of important self-similar scaling properties. We compare a number of different Hurst parameter estimation methods and some important variations. This is done in the context of a wide range of simulated, laboratory-generated, and real data sets. Important differences between the methods are highlighted. Deep insights are revealed on how well the laboratory data mimic the real data. Non-stationarities, which are local in time, are seen to be central issues and lead to both conceptual and practical recommendations.
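One of the classical estimators in comparisons like this is the rescaled-range (R/S) statistic: the slope of log(R/S) against log(window size) estimates the Hurst parameter H. A minimal stdlib-only sketch (not any specific estimator variant from the paper):

```python
import math
import random
import statistics

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst parameter via the rescaled-range statistic:
    average R/S over non-overlapping windows at doubling sizes, then
    fit log(R/S) ~ H * log(size) by least squares."""
    n = len(series)
    log_sizes, log_rs = [], []
    size = min_chunk
    while size <= n // 2:
        rs_list = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            mean = statistics.fmean(chunk)
            cum, mx, mn = 0.0, 0.0, 0.0
            for x in chunk:                     # range of cumulative deviations
                cum += x - mean
                mx, mn = max(mx, cum), min(mn, cum)
            s = statistics.pstdev(chunk)
            if s > 0:
                rs_list.append((mx - mn) / s)
        if rs_list:
            log_sizes.append(math.log(size))
            log_rs.append(math.log(statistics.fmean(rs_list)))
        size *= 2
    # least-squares slope of log(R/S) on log(window size)
    xbar, ybar = statistics.fmean(log_sizes), statistics.fmean(log_rs)
    num = sum((x - xbar) * (y - ybar) for x, y in zip(log_sizes, log_rs))
    den = sum((x - xbar) ** 2 for x in log_sizes)
    return num / den

rng = random.Random(42)
white = [rng.gauss(0, 1) for _ in range(4096)]
h = hurst_rs(white)   # short-range-dependent input: H should be near 0.5
```

On genuinely long-range-dependent traffic, H comes out well above 0.5; the paper's point is that different estimators, and local non-stationarities, can move this number substantially.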


Stochastic Models | 2005

Extremal dependence: Internet traffic applications

Félix Hernández-Campos; Cheolwoo Park; J. S. Marron; Sidney I. Resnick

For bivariate heavy-tailed data, the extremes may carry distinctive dependence information not seen from moderate values. For example, a large value in one component may help cause a large value in the other. This is the idea behind the notion of extremal dependence. We discuss ways to detect and measure extremal dependence. We apply the techniques discussed to Internet data and conclude that, for files transferred, file size and throughput (the inferred rate at which the file is transferred) exhibit extremal independence.
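A standard way to detect extremal dependence is the tail-dependence estimate chi(q) = P(Y above its q-quantile | X above its q-quantile) after rank-transforming both margins; chi(q) staying near 1 - q as q grows suggests extremal independence. This is a generic sketch of that diagnostic, not the paper's specific procedure:

```python
import random

def chi_hat(x, y, q=0.95):
    """Empirical tail-dependence estimate chi(q): rank-transform both
    samples to remove marginal effects, then compute the fraction of
    X-exceedances of the q-quantile that are also Y-exceedances."""
    n = len(x)
    rx = sorted(range(n), key=lambda i: x[i])
    ry = sorted(range(n), key=lambda i: y[i])
    ux, uy = [0.0] * n, [0.0] * n
    for r, i in enumerate(rx):
        ux[i] = (r + 1) / (n + 1)
    for r, i in enumerate(ry):
        uy[i] = (r + 1) / (n + 1)
    joint = sum(1 for i in range(n) if ux[i] > q and uy[i] > q)
    marg = sum(1 for i in range(n) if ux[i] > q)
    return joint / marg if marg else 0.0

rng = random.Random(7)
# independent heavy-tailed pairs: chi(0.95) should sit near 1 - q = 0.05
xs = [rng.paretovariate(1.5) for _ in range(20000)]
ys = [rng.paretovariate(1.5) for _ in range(20000)]
chi_ind = chi_hat(xs, ys)
# perfectly dependent pairs: chi is 1 by construction
chi_dep = chi_hat(xs, xs)
```

The size-versus-throughput finding in the abstract corresponds to the first case: even with heavy tails in both margins, the largest sizes do not preferentially pair with the largest rates.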


Personal, Indoor and Mobile Radio Communications | 2005

A comparative measurement study of the workload of wireless access points in campus networks

Félix Hernández-Campos; Maria Papadopouli

Our goal is to perform a system-wide characterization of the workload of wireless access points (APs) in a production 802.11 infrastructure. The key issues of this study are the characterization of the traffic at each access point (AP), its modeling, and a comparison among APs of different wireless campus-wide infrastructures. Unlike most other studies, we compare two networks using similar data acquisition techniques and analysis methods. This makes the results more generally applicable. We analyzed the aggregate traffic load of APs and found that log-normality is prevalent. The distributions of the wireless received and sent traffic load for these infrastructures are similar. Furthermore, we discovered a dichotomy of APs: there are APs at which the majority of clients are uploaders and APs at which the majority of clients are downloaders. Also, the number of non-unicast wireless packets and the percentage of roaming events are large. Finally, there is a correlation between the number of associations and traffic load on the log-log scale.
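The log-normality finding has a convenient practical consequence: fitting a lognormal to per-AP load reduces to fitting a normal to the log of the loads. A minimal sketch on synthetic data (the parameter values are illustrative, not the fitted campus values):

```python
import math
import random
import statistics

def lognormal_mle(loads):
    """Maximum-likelihood lognormal fit: take logs of the positive
    per-AP traffic loads and return the mean and standard deviation
    of the resulting (approximately normal) sample."""
    logs = [math.log(v) for v in loads if v > 0]
    return statistics.fmean(logs), statistics.pstdev(logs)

rng = random.Random(1)
# illustrative synthetic per-AP loads in bytes, not real trace data
loads = [rng.lognormvariate(20.0, 1.3) for _ in range(500)]
mu, sigma = lognormal_mle(loads)  # should recover roughly (20.0, 1.3)
```

Two fitted numbers per infrastructure then suffice to compare the load distributions of whole campus networks, which is what makes the lognormal observation useful for modeling.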


Broadband Communications, Networks and Systems | 2007

Modeling and generating TCP application workloads

Félix Hernández-Campos; F. Donelson Smith

In order to perform valid experiments, traffic generators used in network simulators and testbeds require contemporary models of traffic as it exists on real network links. Ideally one would like a model of the workload created by the full range of applications running on the Internet today. Unfortunately, at best, all that is available to the research community are a small number of models for single applications or application classes such as the web or peer-to-peer. We present a method for creating a model of the full TCP application workload that generates the traffic flowing on a network link. From this model, synthetic workload traffic can be generated in a simulation that is statistically similar to the traffic observed on the real link. The model is generated automatically using only a simple packet-header trace and requires no knowledge of the actual identity or mix of TCP applications on the network. We present the modeling method and a traffic generator that will enable researchers to conduct network experiments with realistic, easy-to-update TCP application workloads. An extensive validation study is performed using Abilene and university traces. The method is validated by comparing traces of synthetically generated traffic to the original traces for a set of important measures of realism. We also show how workload models can be re-sampled to generate statistically valid randomized and rescaled variations.


Modeling, Analysis, and Simulation of Computer and Telecommunication Systems | 2005

Understanding patterns of TCP connection usage with statistical clustering

Félix Hernández-Campos; Andrew B. Nobel; F.D. Smith

We describe a new methodology for understanding how applications use TCP to exchange data. The method is useful for characterizing TCP workloads and synthetic traffic generation. Given a packet header trace, the method automatically constructs a source-level model of the applications using TCP in a network without any a priori knowledge of which applications are actually present in a network. From this source-level model, statistical feature vectors can be defined for each TCP connection in the trace. Hierarchical cluster analysis can then be performed to identify connections that are statistically homogeneous and that are likely exerting similar demands on a network. We apply the methods to packet header traces taken from the UNC and Abilene networks and show how classes of similar connections can be automatically detected and modeled.
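The clustering step described above — group per-connection feature vectors so that statistically similar connections fall together — can be illustrated with a naive agglomerative pass. The feature choice and data below are hypothetical, and a real trace would need a scalable implementation rather than this O(n^3) sketch:

```python
import math

def single_linkage(points, k):
    """Agglomerative clustering with single linkage: start with every
    feature vector in its own cluster and repeatedly merge the pair of
    clusters with the smallest minimum inter-point distance until only
    k clusters remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = (float("inf"), 0, 1)
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(math.dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

# hypothetical per-connection features:
# (log bytes sent, log bytes received, number of application exchanges)
feats = [(3.0, 3.1, 2), (3.2, 2.9, 2), (9.5, 1.0, 50), (9.7, 1.2, 48)]
groups = single_linkage(feats, 2)   # the two small and two large
                                    # connections end up separated
```

The point of the method is that the resulting clusters correspond to classes of applications exerting similar demands on the network, without ever inspecting port numbers or payloads.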


Workshop on Local and Metropolitan Area Networks | 2005

Assessing the real impact of 802.11 WLANs: a large-scale comparison of wired and wireless traffic

Félix Hernández-Campos; Maria Papadopouli

We compared the traffic from hosts connected to the network via a wired or wireless interface, emphasizing the impact of 802.11 on packet delay and loss. Our study uses only passive monitoring techniques, namely, inference from TCP header traces. This enabled us to study a population of several thousand hosts in a real production environment, in which more than 31 million TCP connections were made. Our first contribution is methodological. Passive methods always have some degree of uncertainty, and we overcome this limitation by mostly relying on relative differences between wired and wireless traffic. Our analysis revealed that wireless clients experienced substantially higher packet delay variability than wired clients, but their loss rates are surprisingly similar. We found that both the number of unnecessary TCP retransmissions and, even more substantially, the number of interrupted connections are higher for the wireless LAN than for the wired LAN. To the best of our knowledge, this is the first research effort to directly contrast wired and wireless traffic of a large production network.


Probability in the Engineering and Informational Sciences | 2004

Stochastic differential equation for TCP window size: analysis and experimental validation

Amarjit Budhiraja; Félix Hernández-Campos; Vidyadhar G. Kulkarni; F.D. Smith

In this paper we develop a stochastic differential equation to describe the dynamic evolution of the congestion window size of a single TCP session over a network. The model takes into account recovery from packet losses with both fast recovery and time-outs, boundary behavior at zero and maximum window size, and slow-start after time-outs. We solve the differential equation to derive the distribution of the window size in steady state. We compare the model predictions with the output from the NS simulator.
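The window dynamics the model captures — steady additive increase, multiplicative halving at loss events, reflection at the maximum window — can be sketched as a discrete-time simulation. This is a toy stand-in with illustrative parameters and Poisson losses only; the paper's actual SDE also covers timeouts, slow-start, and the boundary at zero:

```python
import random

def simulate_window(t_end, dt=0.01, loss_rate=0.5, w_max=64.0, seed=3):
    """Discrete-time sketch of TCP congestion-window evolution:
    additive increase of one unit per second between losses,
    halving at Poisson loss events (rate loss_rate per second),
    reflected at w_max and floored at one segment."""
    rng = random.Random(seed)
    w, t, path = 1.0, 0.0, []
    while t < t_end:
        if rng.random() < loss_rate * dt:   # loss event in [t, t + dt)
            w = max(1.0, w / 2.0)           # fast-recovery halving
        else:
            w = min(w_max, w + dt)          # additive increase
        path.append(w)
        t += dt
    return path

path = simulate_window(200.0)
avg_w = sum(path) / len(path)   # long-run average window size
```

Long-run statistics of such a path (mean, stationary distribution of w) are exactly the quantities the paper derives analytically and checks against NS simulation output.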


Archive | 2002

Mice and Elephants Visualization of Internet Traffic

J. S. Marron; Félix Hernández-Campos; F.D. Smith

Internet traffic is composed of flows, sets of packets being transferred from one computer to another. Some visualizations for understanding the set of flows at a busy internet link are developed. These show graphically that the set of flows is dominated by a relatively few “elephants”, and a very large number of “mice”. It also becomes clear that “representative sampling” from heavy tail distributions is a challenging task.


The Annals of Applied Statistics | 2010

Analysis of dependence among size, rate and duration in internet flows

Cheolwoo Park; Félix Hernández-Campos; J. S. Marron; F. Donelson Smith

In this paper we examine rigorously the evidence for dependence among data size, transfer rate and duration in Internet flows. We emphasize two statistical approaches for studying dependence, including Pearson's correlation coefficient and the extremal dependence analysis method. We apply these methods to large data sets of packet traces from three networks. Our major results show that Pearson's correlation coefficients between size and duration are much smaller than one might expect. We also find that correlation coefficients between size and rate are generally small and can be strongly affected by applying thresholds to size or duration. Based on Transmission Control Protocol connection startup mechanisms, we argue that thresholds on size should be more useful than thresholds on duration in the analysis of correlations. Using extremal dependence analysis, we draw a similar conclusion, finding remarkable independence for extremal values of size and rate.
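The thresholding experiment in the abstract is easy to reproduce in miniature: compute Pearson's correlation on all flows, then again on only the flows above a size threshold. The synthetic flow model below (sizes independent of rates, duration = size / rate) and all its parameters are illustrative, not the trace data:

```python
import math
import random

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

rng = random.Random(11)
# synthetic flows: heavy-tailed sizes, rates drawn independently,
# duration determined by size / rate
sizes = [rng.paretovariate(1.5) * 1e4 for _ in range(5000)]
rates = [rng.lognormvariate(10.0, 1.0) for _ in range(5000)]
durations = [s / r for s, r in zip(sizes, rates)]

r_all = pearson(sizes, rates)
# threshold on size, as the paper argues is preferable to duration
big = [i for i in range(5000) if sizes[i] > 1e5]
r_big = pearson([sizes[i] for i in big], [rates[i] for i in big])
```

Comparing r_all with r_big on real traces is what reveals how strongly the measured size-rate correlation depends on the chosen threshold.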

Collaboration


An overview of Félix Hernández-Campos's collaborations.

Top Co-Authors

J. S. Marron
University of North Carolina at Chapel Hill

F. Donelson Smith
University of North Carolina at Chapel Hill

Andrew B. Nobel
University of North Carolina at Chapel Hill

Haipeng Shen
University of Hong Kong

Amarjit Budhiraja
University of North Carolina at Chapel Hill