Jean Bolot
Sprint Corporation
Publication
Featured research published by Jean Bolot.
2008 3rd IEEE Symposium on New Frontiers in Dynamic Spectrum Access Networks | 2008
Daniel Willkomm; Sridhar Machiraju; Jean Bolot; Adam Wolisz
Most existing studies of spectrum usage have been performed by actively sensing the energy levels in specific RF bands including cellular bands. In this paper, we provide a unique, complementary analysis of cellular primary usage by analyzing a dataset collected inside a cellular network operator. One of the key aspects of our dataset is its scale - it consists of data collected over three weeks at hundreds of base stations. We dissect this data along different dimensions to characterize and model primary usage as well as understand its temporal and spatial variations. Our analysis reveals several results that are relevant if dynamic spectrum access (DSA) approaches are to be deployed for cellular frequency bands. For instance, we find that call durations show significant deviations from the often-used exponential distribution, which makes call-based modeling more complicated. We also show that a random walk process, which does not use call durations, can often be used for modeling the aggregate cell capacity. Furthermore, we highlight some applications of our results to improve secondary usage of licensed spectrum.
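To make the random-walk view of aggregate cell occupancy concrete, here is a minimal Python sketch (not the paper's calibrated model): it steps the number of occupied channels in a single cell up or down once per minute and reports how often spare capacity would be available to a secondary user. The capacity and step probabilities are hypothetical.

```python
import numpy as np

# Toy random-walk model of the number of occupied channels in one cell,
# sampled once per minute over three weeks. All parameters are made up.
rng = np.random.default_rng(0)

n_slots = 3 * 7 * 24 * 60       # three weeks of one-minute slots
capacity = 40                   # hypothetical number of channels in the cell
p_up, p_down = 0.3, 0.3         # per-slot probabilities of a +1 / -1 step

occupied = np.empty(n_slots, dtype=int)
occupied[0] = capacity // 2
for t in range(1, n_slots):
    step = rng.choice([1, -1, 0], p=[p_up, p_down, 1 - p_up - p_down])
    occupied[t] = min(max(occupied[t - 1] + step, 0), capacity)

# Statistics a secondary (DSA) user might care about: how often spare
# capacity is available, and how much of it there is on average.
free = capacity - occupied
print("fraction of slots with >= 5 free channels:", np.mean(free >= 5))
print("mean number of free channels:", free.mean())
```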
IEEE Communications Magazine | 2009
Daniel Willkomm; Sridhar Machiraju; Jean Bolot; Adam Wolisz
Dynamic spectrum access approaches, which propose to opportunistically use underutilized portions of licensed wireless spectrum such as cellular bands, are increasingly being seen as a way to alleviate spectrum scarcity. However, before DSA approaches can be enabled, it is important that we understand the dynamics of spectrum usage in licensed bands. Our focus in this article is the cellular band. Using a unique dataset collected inside a cellular network operator, we analyze the usage in cellular bands and discuss the implications of our results on enabling DSA in these bands. One of the key aspects of our dataset is its scale-it consists of data collected over three weeks at hundreds of base stations. We dissect this data along different dimensions to characterize if and when spectrum is available, develop models of primary usage, and understand the implications of these results on DSA techniques such as sensing.
knowledge discovery and data mining | 2008
Mukund Seshadri; Sridhar Machiraju; Ashwin Sridharan; Jean Bolot; Christos Faloutsos; Jure Leskovec
We analyze a massive social network, gathered from the records of a large mobile phone operator, with more than a million users and tens of millions of calls. We examine the distributions of the number of phone calls per customer; the total talk minutes per customer; and the distinct number of calling partners per customer. We find that these distributions are skewed, and that they significantly deviate from what would be expected by power-law and lognormal distributions. To analyze our observed distributions (of number of calls, distinct call partners, and total talk time), we propose PowerTrack, a method which fits a lesser-known but more suitable distribution, namely the Double Pareto LogNormal (DPLN) distribution, to our data and tracks its parameters over time. Using PowerTrack, we find that our graph changes over time in a way consistent with a generative process that naturally results in the DPLN distributions we observe. Furthermore, we show that this generative process lends itself to a natural and appealing social wealth interpretation in the context of social networks such as ours. We discuss the application of those results to our model and to forecasting.
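For readers unfamiliar with the DPLN distribution, the short Python sketch below draws samples from it using its standard generative representation (the exponential of a normal plus an asymmetric Laplace term) and checks the upper power-law tail. This is not the PowerTrack fitting code from the paper; the tail exponents and body parameters are made up for illustration.

```python
import numpy as np

# Draw samples from a Double Pareto LogNormal (DPLN) distribution via its
# generative form: X = exp(N(nu, tau^2) + E1/alpha - E2/beta), E1, E2 ~ Exp(1).
# Parameter values are hypothetical, not fitted to any dataset.
rng = np.random.default_rng(1)

alpha, beta = 2.0, 1.0      # upper / lower power-law tail exponents
nu, tau = 3.0, 0.5          # lognormal "body" parameters
n = 100_000

y = (nu + tau * rng.standard_normal(n)
     + rng.exponential(1.0 / alpha, n)    # produces the upper power-law tail
     - rng.exponential(1.0 / beta, n))    # produces the lower power-law tail
x = np.exp(y)                             # e.g. calls or minutes per customer

# Sanity check: the log-log survival curve of the top 1% of samples should be
# roughly linear with slope close to -alpha.
xs = np.sort(x)
surv = 1.0 - np.arange(n) / n
tail = xs > np.quantile(xs, 0.99)
slope = np.polyfit(np.log(xs[tail]), np.log(surv[tail]), 1)[0]
print("estimated upper-tail slope:", round(slope, 2), "(expected about", -alpha, ")")
```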
international conference on computer communications | 2009
Marc Lelarge; Jean Bolot
Entities in the Internet, ranging from individuals and enterprises to service providers, face a broad range of epidemic risks such as worms, viruses, and botnet-driven attacks. Those risks are interdependent risks, which means that the decision by an entity to invest in security and self-protect affects the risk faced by others (for example, the risk faced by an individual decreases when its provider increases its investment in security). As a result, entities tend to invest too little in self-protection, relative to the socially efficient level, because they ignore the benefits their investments confer on others. In this paper, we consider the problem of designing incentives for entities in the Internet so that they invest at a socially efficient level. In particular, we find that insurance is a powerful incentive mechanism which pushes agents to invest in self-protection. Thus, insurance increases the level of self-protection, and therefore the level of security, in the Internet. As a result, we believe that insurance should be considered as an important component of risk management in the Internet.
measurement and modeling of computer systems | 2008
Marc Lelarge; Jean Bolot
Getting new security features and protocols to be widely adopted and deployed in the Internet has been a continuing challenge. There are several reasons for this, in particular economic reasons arising from the presence of network externalities. Indeed, like the Internet itself, the technologies to secure it exhibit network effects: their value to individual users changes as other users decide to adopt them or not. In particular, the benefits felt by early adopters of security solutions might fall significantly below the cost of adoption, making it difficult for those solutions to gain traction and get deployed at a large scale. Our goal in this paper is to model and quantify the impact of such externalities on the adoptability and deployment of security features and protocols in the Internet. We study a network of interconnected agents, which are subject to epidemic risks such as those caused by propagating viruses and worms, and which can decide whether or not to invest some amount to deploy security solutions. Agents experience negative externalities from other agents, as the risks faced by an agent depend not only on the choices of that agent (whether or not to invest in self-protection), but also on those of the other agents. Expectations about choices made by other agents then influence investments in self-protection, resulting in a possibly suboptimal outcome overall. We present and solve an analytical model where the agents are connected according to a variety of network topologies. Borrowing ideas and techniques used in statistical physics, we derive analytic solutions for sparse random graphs, for which we obtain asymptotic results. We show that we can explicitly identify the impact of network externalities on the adoptability and deployment of security features. In other words, we identify both the economic and network properties that determine the adoption of security technologies. Therefore, we expect our results to provide useful guidance for the design of new economic mechanisms and for the development of network protocols likely to be deployed at a large scale.
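The externality at the heart of the model can be illustrated with a toy best-response simulation on a sparse random graph, sketched below. It is not the paper's analytical model: agents repeatedly choose to invest only if their crudely estimated epidemic loss, given their neighbors' current choices, exceeds the cost of protection, and all parameter values are invented.

```python
import numpy as np

# Toy best-response dynamics for security adoption on an Erdos-Renyi graph.
# A node invests if its estimated expected loss (driven by the number of
# unprotected neighbors) exceeds the cost of protection. Made-up parameters.
rng = np.random.default_rng(2)

n, avg_deg = 2000, 4
adj = rng.random((n, n)) < avg_deg / n
adj = np.triu(adj, 1)
adj = adj | adj.T                      # undirected graph, no self-loops

cost = 0.3    # cost of investing in self-protection
loss = 1.0    # loss if infected
q = 0.2       # chance an unprotected neighbor passes an infection on

invest = np.zeros(n, dtype=bool)
for _ in range(50):                    # best-response sweeps (stop at a fixed point)
    unprotected_nb = (adj & ~invest[None, :]).sum(axis=1)
    p_infect = 1.0 - (1.0 - q) ** unprotected_nb      # crude exposure estimate
    new_invest = loss * p_infect > cost
    if np.array_equal(new_invest, invest):
        break
    invest = new_invest

print("equilibrium adoption fraction:", invest.mean())
```

In this sketch the free-riding effect appears by construction: once enough of a node's neighbors invest, its own estimated exposure falls below the cost and it prefers not to protect itself.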
internet measurement conference | 2006
Bruno F. Ribeiro; Donald F. Towsley; Tao Ye; Jean Bolot
Packet sampling is widely used in network monitoring. Sampled packet streams are often used to determine flow-level statistics of network traffic. To date there is conflicting evidence on the quality of the resulting estimates. In this paper we take a systematic approach, using the Fisher information metric and the Cramér-Rao bound, to understand the contributions that different types of information within sampled packets have on the quality of flow-level estimates. We provide concrete evidence that, without protocol information and with packet sampling rate p = 0.005, any accurate unbiased estimator needs approximately 10^16 sampled flows. The required number of sampled flows drops to roughly 10^4 with the use of TCP sequence numbers. Furthermore, additional SYN flag information significantly reduces the estimation error of short flows. We present a Maximum Likelihood Estimator (MLE) that relies on all of this information and show that it is efficient, even when applied to a small sample set. We validate our results using Tier-1 Internet backbone traces and evaluate the benefits of sampling from multiple monitors. Our results show that combining estimates from several monitors is 50% less accurate than an estimate based on all samples.
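As a rough illustration of why such sampling rates are punishing, the sketch below applies Bernoulli packet sampling at p = 0.005 to synthetic heavy-tailed flow sizes (not the paper's traces) and shows that most flows are never seen and that the naive scaled estimate is conditionally biased.

```python
import numpy as np

# Bernoulli packet sampling of synthetic flows: a flow of s packets yields
# Binomial(s, p) sampled packets. Flow sizes and parameters are made up.
rng = np.random.default_rng(3)

p = 0.005
sizes = rng.zipf(2.1, size=200_000)     # synthetic heavy-tailed flow sizes (packets)
sampled = rng.binomial(sizes, p)        # packets actually seen by the monitor

seen = sampled > 0
print("fraction of flows never sampled:", round(1 - seen.mean(), 4))

# Naive per-flow estimator: scale the sampled count by 1/p. Restricting to the
# flows that were seen at all biases this upward.
naive = sampled[seen] / p
print("true mean size of seen flows :", round(sizes[seen].mean(), 1))
print("naive estimate of that mean  :", round(naive.mean(), 1))
```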
arXiv: Computer Science and Game Theory | 2008
Marc Lelarge; Jean Bolot
Getting agents in the Internet, and in networks in general, to invest in and deploy security features and protocols is a challenge, in particular because of economic reasons arising from the presence of network externalities. Our goal in this paper is to model and investigate the impact of such externalities on security investments in a network. Specifically, we study a network of interconnected agents subject to epidemic risks such as viruses and worms where agents can decide whether or not to invest some amount to deploy security solutions. We consider both cases when the security solutions are strong (they perfectly protect the agents deploying them) and when they are weak. We make three contributions in the paper. First, we introduce a general model which combines an epidemic propagation model with an economic model for agents which captures network effects and externalities. Second, borrowing ideas and techniques used in statistical physics, we introduce a Local Mean Field (LMF) model, which extends the standard mean-field approximation to take into account the correlation structure on local neighborhoods. Third, we solve the LMF model in a network with externalities, and we derive analytic solutions for sparse random graphs of agents, for which we obtain asymptotic results. We find known phenomena such as free riders and tipping points. We also observe counter-intuitive phenomena, such as the fact that increasing the quality of the security technology can result in decreased adoption of that technology in the network. In general, we find that both situations with strong and weak protection exhibit externalities and that the equilibrium is not socially optimal - therefore there is a market failure. Insurance is one mechanism to address this market failure. In related work, we have shown that insurance is a very effective mechanism [3,4], and argue that using insurance would increase the security in a network such as the Internet.
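As a rough, simulation-only illustration of the kind of question the LMF model answers in closed form, the sketch below runs a percolation-style epidemic on a sparse random graph in which some fraction of agents deploy strong protection (they never get infected or relay). The transmission model and every parameter are assumptions made for this sketch, not the paper's.

```python
import numpy as np
from collections import deque

# Percolation-style epidemic on an Erdos-Renyi graph with a protected fraction
# of nodes. Each edge transmits with probability q; protected nodes never relay.
rng = np.random.default_rng(4)

def final_infection_rate(n=2000, avg_deg=4, q=0.5, n_seeds=10, protected_frac=0.3):
    """Fraction of unprotected nodes reached by the epidemic in one simulation."""
    adj = np.triu(rng.random((n, n)) < avg_deg / n, 1)
    adj = adj & (rng.random((n, n)) < q)           # keep each edge with prob q
    adj = adj | adj.T
    protected = rng.random(n) < protected_frac
    seeds = rng.choice(np.flatnonzero(~protected), size=n_seeds, replace=False)

    infected = np.zeros(n, dtype=bool)
    infected[seeds] = True
    queue = deque(seeds)
    while queue:                                   # spread through unprotected nodes
        u = queue.popleft()
        for v in np.flatnonzero(adj[u] & ~infected & ~protected):
            infected[v] = True
            queue.append(v)
    return infected[~protected].mean()

for frac in (0.0, 0.3, 0.6):
    rate = final_infection_rate(protected_frac=frac)
    print(f"protected fraction {frac:.1f} -> infection rate among the rest {rate:.3f}")
```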
internet measurement conference | 2007
François Baccelli; Sridhar Machiraju; Darryl Veitch; Jean Bolot
Packet delay and loss are two fundamental measures of performance. Using active probing to measure delay and loss typically involves sending Poisson probes, on the basis of the PASTA property (Poisson Arrivals See Time Averages), which ensures that Poisson probing yields unbiased estimates. Recent work, however, has questioned the utility of PASTA for probing and shown that, for delay measurements, i) a wide variety of processes other than Poisson can be used to probe with zero bias and ii) Poisson probing does not necessarily minimize the variance of delay estimates. In this paper, we determine optimal probing processes that minimize the mean-square error of measurement estimates for both delay and loss. Our contributions are twofold. First, we show that a family of probing processes, specifically Gamma renewal probing processes, has optimal properties in terms of bias and variance. The optimality result is general, and only assumes that the target process we seek to optimally measure via probing, such as a loss or delay process, has a convex auto-covariance function. Second, we use empirical datasets to demonstrate the applicability of our results in practice, specifically to show that the convexity condition holds true and that Gamma probing is indeed superior to Poisson probing. Together, these results lead to explicit guidelines on designing the best probe streams for both delay and loss estimation.
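The intuition can be reproduced with a toy experiment, sketched below: probe a synthetic stationary process whose autocovariance is convex (an AR(1) series) with exponential versus Gamma-distributed inter-probe gaps of the same mean, and compare the variance of the resulting mean-delay estimates. The target process and all parameters are made up; shape = 1 corresponds to Poisson probing.

```python
import numpy as np

# Compare Poisson probing (Gamma shape 1) with more regular Gamma probing
# (shape 4) on a synthetic AR(1) "delay" series, whose autocovariance decays
# geometrically and is therefore convex. All parameters are hypothetical.
rng = np.random.default_rng(5)

T = 100_000
rho = 0.995
delay = np.empty(T)
delay[0] = 0.0
noise = rng.standard_normal(T)
for t in range(1, T):
    delay[t] = rho * delay[t - 1] + noise[t]

def estimator_variance(shape, mean_gap=25.0, n_probes=400, n_runs=500):
    """Empirical variance of the probe-based estimate of the mean delay."""
    estimates = []
    for _ in range(n_runs):
        gaps = rng.gamma(shape, mean_gap / shape, n_probes)     # mean gap is fixed
        times = (np.cumsum(gaps) + rng.uniform(0, T)) % T       # random start, wrap
        estimates.append(delay[times.astype(int)].mean())
    return np.var(estimates)

print("Poisson probing (shape=1):", round(estimator_variance(1.0), 3))
print("Gamma probing   (shape=4):", round(estimator_variance(4.0), 3))
```

In this toy setup the more regular Gamma probes should yield a noticeably smaller estimator variance, consistent with the convexity condition stated above.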
Archive | 2009
Jean Bolot; Marc Lelarge
Managing security risks in the Internet has, so far, mostly involved methods to reduce the risks and the severity of the damages. Those methods (such as firewalls, intrusion detection and prevention, etc.) reduce but do not eliminate risk, and the question remains of how to handle the residual risk. In this chapter, we consider the problem of whether buying insurance to protect the Internet and its users from security risks makes sense, and if so, identifying specific benefits of insurance and designing appropriate insurance policies.
international conference on computer communications | 2010
Hui Zang; François Baccelli; Jean Bolot
In this paper, we present a general technique based on Bayesian inference to locate mobiles in cellular networks. We study the problem of localizing users in a cellular network for calls with information regarding only one base station, so that triangulation or trilateration cannot be performed. In our call data records, this happens more than 50% of the time. We show how to localize mobiles based on our knowledge of the network layout and how to incorporate additional information such as round-trip-time and signal-to-noise-and-interference ratio (SINR) measurements. We study important parameters used in this Bayesian method through mining call data records and matching GPS records and obtain their distribution or typical values. We validate our localization technique in a commercial network with a few thousand emergency calls. The results show that the Bayesian method can reduce the localization error by 20% compared to a blind approach and that the accuracy of localization can be further improved by refining the a priori user distribution in the Bayesian technique.
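A minimal sketch of the single-base-station Bayesian step: place a prior over candidate locations (standing in for the a priori user distribution), multiply it by the likelihood of a noisy range measurement (standing in for round-trip time), and take the posterior mean. The grid, prior, and noise model below are hypothetical, not the paper's calibrated values.

```python
import numpy as np

# Bayesian localization on a grid with one serving base station and a single
# noisy range observation. Prior, noise level, and positions are made up.
rng = np.random.default_rng(6)

xs = np.linspace(-5.0, 5.0, 201)                     # km
grid = np.stack(np.meshgrid(xs, xs), axis=-1)        # (201, 201, 2) candidate points
bs = np.array([0.0, 0.0])                            # serving base-station location

# Hypothetical a priori user density (e.g. derived from population or road data).
prior = np.exp(-((grid[..., 0] - 2.0) ** 2 + grid[..., 1] ** 2) / 4.0)
prior /= prior.sum()

# Observation: a range derived from round-trip time, with Gaussian error.
true_pos = np.array([1.5, 1.0])
sigma = 0.5                                          # km, hypothetical ranging error
observed_range = np.linalg.norm(true_pos - bs) + rng.normal(0.0, sigma)

dist = np.linalg.norm(grid - bs, axis=-1)
likelihood = np.exp(-((dist - observed_range) ** 2) / (2.0 * sigma ** 2))

posterior = prior * likelihood
posterior /= posterior.sum()
estimate = (grid * posterior[..., None]).sum(axis=(0, 1))   # posterior mean
print("true position:", true_pos, " posterior-mean estimate:", np.round(estimate, 2))
```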