Network


Latest external collaborations at the country level.

Hotspot


Research topics where Carey L. Williamson is active.

Publication


Featured research published by Carey L. Williamson.


Measurement and Modeling of Computer Systems | 1996

Web server workload characterization: the search for invariants

Martin F. Arlitt; Carey L. Williamson

The phenomenal growth in popularity of the World Wide Web (WWW, or the Web) has made WWW traffic the largest contributor to packet and byte traffic on the NSFNET backbone. This growth has triggered recent research aimed at reducing the volume of network traffic produced by Web clients and servers, by using caching, and reducing the latency for WWW users, by using improved protocols for Web interaction. Fundamental to the goal of improving WWW performance is an understanding of WWW workloads. This paper presents a workload characterization study for Internet Web servers. Six different data sets are used in this study: three from academic (i.e., university) environments, two from scientific research organizations, and one from a commercial Internet provider. These data sets represent three different orders of magnitude in server activity, and two different orders of magnitude in time duration, ranging from one week of activity to one year of activity. Throughout the study, emphasis is placed on finding workload invariants: observations that apply across all the data sets studied. Ten invariants are identified. These invariants are deemed important since they (potentially) represent universal truths for all Internet Web servers. The paper concludes with a discussion of caching and performance issues, using the invariants to suggest performance enhancements that seem most promising for Internet Web servers.
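
To make the kind of analysis described above concrete, here is a minimal sketch (not the authors' code) of computing two such workload statistics from a Web server access log; the log file name, Common Log Format layout, and regular expression are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code): computing two workload statistics
# from a Web server access log in Common Log Format. The file name and
# field layout are assumptions for illustration.
from collections import Counter
import re

LOG_LINE = re.compile(r'"(?:GET|HEAD|POST) (\S+)[^"]*" (\d{3}) (\d+|-)')

sizes = []                      # transfer size of each successful request
refs = Counter()                # reference count per distinct document

with open("access.log") as f:   # hypothetical log file
    for line in f:
        m = LOG_LINE.search(line)
        if not m:
            continue
        url, status, nbytes = m.groups()
        if status == "200" and nbytes != "-":
            sizes.append(int(nbytes))
            refs[url] += 1

sizes.sort()
print("distinct documents:", len(refs))
print("mean transfer size:", sum(sizes) / len(sizes))
print("median transfer size:", sizes[len(sizes) // 2])
# Fraction of documents requested exactly once ("one-timers")
one_timers = sum(1 for c in refs.values() if c == 1)
print("one-timer fraction:", one_timers / len(refs))
```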


IEEE/ACM Transactions on Networking | 1997

Internet Web servers: workload characterization and performance implications

Martin F. Arlitt; Carey L. Williamson

This paper presents a workload characterization study for Internet Web servers. Six different data sets are used in the study: three from academic environments, two from scientific research organizations, and one from a commercial Internet provider. These data sets represent three different orders of magnitude in server activity, and two different orders of magnitude in time duration, ranging from one week of activity to one year. The workload characterization focuses on the document type distribution, the document size distribution, the document referencing behavior, and the geographic distribution of server requests. Throughout the study, emphasis is placed on finding workload characteristics that are common to all the data sets studied. Ten such characteristics are identified. The paper concludes with a discussion of caching and performance issues, using the observed workload characteristics to suggest performance enhancements that seem promising for Internet Web servers.


ACM/IEEE International Conference on Mobile Computing and Networking | 1997

Mobile multicast (MoM) protocol: multicast support for mobile hosts

Tim G. Harrison; Carey L. Williamson; Wayne L. Mackrell; Richard B. Bunt

This paper describes a new protocol to support IP multicast for mobile hosts in an IP internetwork. It uses the basic unicast routing capability of IETF Mobile IP as a foundation, and leverages existing IP multicast to provide multicast services for mobile hosts as well. We believe that the resulting scheme is simple, scalable, transparent, and to the extent possible, independent of the underlying multicast routing facility. Discrete-event simulation was used as the vehicle for a “proof of concept” debugging of the protocol, as well as to determine its performance characteristics. A key feature of the new protocol is the use of designated multicast service providers (DMSPs) to address the scalability issues of mobile multicast. Our simulation results suggest distinct performance advantages of our protocol using DMSPs over two other approaches proposed for the mobile multicast problem, namely remote subscription and bi-directional tunnelling, particularly as the number of mobile group members increases.
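
As a rough illustration of the DMSP idea described above, the following sketch (not the paper's implementation, and plain Python rather than a simulator) shows a foreign agent tracking which home agents serve its visiting group members and designating one per group; all class, variable, and agent names are invented.

```python
# Minimal sketch (not the paper's implementation) of the DMSP idea from MoM:
# a foreign agent hosts visiting mobiles whose home agents would each tunnel
# a copy of every multicast packet. Designating one home agent per group as
# the DMSP avoids duplicate tunnelled copies. All names are illustrative.
from collections import defaultdict

class ForeignAgent:
    def __init__(self):
        # group -> set of home agents serving visiting members of that group
        self.home_agents = defaultdict(set)
        self.dmsp = {}          # group -> currently designated home agent

    def member_arrives(self, group, home_agent):
        self.home_agents[group].add(home_agent)
        # Simple policy: keep the current DMSP if one exists, else pick one.
        self.dmsp.setdefault(group, home_agent)

    def member_leaves(self, group, home_agent):
        agents = self.home_agents[group]
        agents.discard(home_agent)
        if self.dmsp.get(group) == home_agent:
            # DMSP hand-off: promote any remaining home agent, if one exists.
            self.dmsp[group] = next(iter(agents)) if agents else None

fa = ForeignAgent()
fa.member_arrives("224.0.1.1", "HA-1")
fa.member_arrives("224.0.1.1", "HA-2")   # HA-2 need not tunnel: HA-1 is DMSP
fa.member_leaves("224.0.1.1", "HA-1")    # DMSP hand-off to HA-2
print(fa.dmsp["224.0.1.1"])              # -> HA-2
```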


IEEE Internet Computing | 2001

Internet traffic measurement

Carey L. Williamson

The Internet's evolution over the past 30 years (1971-2001) has been accompanied by the development of various network applications. These applications range from early text-based utilities such as file transfer and remote login to the more recent advent of the Web, electronic commerce, and multimedia streaming. For most users, the Internet is simply a connection to these applications. They are shielded from the details of how the Internet works through the information-hiding principles of the Internet protocol stack, which dictates how user-level data is transformed into network packets for transport across the network and put back together for delivery at the receiving application. For many networking researchers, however, the protocols themselves are of interest. Using specialized network measurement hardware or software, these researchers collect information about network packet transmissions. With detailed packet-level measurements and some knowledge of the IP stack, they can use reverse engineering to gather significant information about both the application structure and user behavior, which can be applied to a variety of tasks such as network troubleshooting, protocol debugging, workload characterization, and performance evaluation and improvement. Traffic measurement technologies have scaled up to provide insight into fundamental behavioral properties of the Internet, its protocols, and its users. The author introduces the tools and methods for measuring Internet traffic and offers highlights from research results.
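
For a concrete flavor of the packet-level measurement described here, the following is a minimal sketch, assuming a pcap trace captured with a tool such as tcpdump and parsed with the third-party dpkt library; it simply tallies bytes by transport protocol and destination port, and is not taken from the article.

```python
# Minimal sketch of packet-level traffic measurement (not from the article):
# read a tcpdump/pcap trace and tally bytes by transport protocol and by
# TCP/UDP destination port. The trace file name is an assumption; dpkt is
# one of several libraries that can parse pcap files.
from collections import Counter
import dpkt

proto_bytes = Counter()
port_bytes = Counter()

with open("trace.pcap", "rb") as f:          # hypothetical capture file
    for ts, buf in dpkt.pcap.Reader(f):
        eth = dpkt.ethernet.Ethernet(buf)
        if not isinstance(eth.data, dpkt.ip.IP):
            continue
        ip = eth.data
        seg = ip.data
        if isinstance(seg, dpkt.tcp.TCP):
            proto_bytes["TCP"] += ip.len
            port_bytes[seg.dport] += ip.len
        elif isinstance(seg, dpkt.udp.UDP):
            proto_bytes["UDP"] += ip.len
            port_bytes[seg.dport] += ip.len
        else:
            proto_bytes["other"] += ip.len

print(proto_bytes.most_common())
print(port_bytes.most_common(10))            # top 10 destination ports
```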


IEEE Transactions on Visualization and Computer Graphics | 2010

A Visual Backchannel for Large-Scale Events

Marian Dörk; Daniel M. Gruen; Carey L. Williamson; M. Sheelagh T. Carpendale

We introduce the concept of a Visual Backchannel as a novel way of following and exploring online conversations about large-scale events. Microblogging communities, such as Twitter, are increasingly used as digital backchannels for timely exchange of brief comments and impressions during political speeches, sport competitions, natural disasters, and other large events. Currently, shared updates are typically displayed in the form of a simple list, making it difficult to get an overview of a fast-paced discussion as it happens in the moment and how it evolves over time. In contrast, our Visual Backchannel design provides an evolving, interactive, and multi-faceted visual overview of large-scale ongoing conversations on Twitter. To visualize a continuously updating information stream, we include visual saliency for what is happening now and what has just happened, set in the context of the evolving conversation. As part of a fully web-based coordinated-view system, we introduce Topic Streams, a temporally adjustable stacked graph visualizing topics over time; a People Spiral, representing participants and their activity; and an Image Cloud, encoding the popularity of event photos by size. Together with a post listing, these mutually linked views support cross-filtering along topics, participants, and time ranges. We discuss our design considerations, in particular with respect to evolving visualizations of dynamically changing data. Initial feedback indicates significant interest and suggests several unanticipated uses.
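
As a rough sketch of the Topic Streams idea (a stacked graph of topic frequency over time), the snippet below draws such a graph with matplotlib; it is not the authors' web-based system, and the sample data are invented.

```python
# Minimal sketch of a "Topic Streams"-style stacked graph (not the authors'
# web-based system): topic frequencies per time bin drawn as a stacked area
# chart. The sample data are invented for illustration.
import matplotlib.pyplot as plt

time_bins = list(range(10))                     # e.g. 10 one-minute bins
topics = {
    "keynote":  [2, 5, 9, 12, 8, 6, 4, 3, 2, 1],
    "wifi":     [1, 1, 2, 2, 3, 7, 10, 6, 3, 2],
    "lunch":    [0, 0, 1, 1, 2, 2, 3, 5, 9, 12],
}

plt.stackplot(time_bins, *topics.values(), labels=topics.keys())
plt.xlabel("time (minutes into the event)")
plt.ylabel("posts per minute")
plt.legend(loc="upper left")
plt.title("Topic frequencies over time (illustrative data)")
plt.show()
```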


Modeling, Analysis, and Simulation of Computer and Telecommunication Systems | 2006

A Longitudinal Study of P2P Traffic Classification

Alok Madhukar; Carey L. Williamson

This paper focuses on network traffic measurement of Peer-to-Peer (P2P) applications on the Internet. P2P applications supposedly constitute a substantial proportion of today's Internet traffic. However, current P2P applications use several obfuscation techniques, including dynamic port numbers, port hopping, HTTP masquerading, chunked file transfers, and encrypted payloads. As P2P applications continue to evolve, robust and effective methods are needed for P2P traffic identification. The paper compares three methods to classify P2P applications: port-based classification, application-layer signatures, and transport-layer analysis. The study uses empirical network traces collected from the University of Calgary Internet connection over the past two years. The results show that port-based analysis is ineffective, being unable to identify 30%-70% of today's Internet traffic. Application signatures are accurate, but may not be possible for legal or technical reasons. The transport-layer method seems promising, providing a robust means to assess aggregate P2P traffic. The latter method suggests that 30%-70% of the campus Internet traffic for the past year was P2P.
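
The following minimal sketch contrasts two of the three classification approaches compared in the paper, well-known ports versus application-layer signatures; the port list and payload signatures are simplified examples, not the paper's rule set.

```python
# Minimal sketch (not the paper's method) contrasting two of the three
# classification approaches compared in the paper: well-known ports versus
# application-layer signatures. Port lists and signatures here are
# simplified examples only.
P2P_PORTS = {6881, 6882, 6883, 6884, 6885, 6346, 6347}   # BitTorrent, Gnutella

SIGNATURES = {
    b"\x13BitTorrent protocol": "BitTorrent",   # BitTorrent handshake prefix
    b"GNUTELLA CONNECT":        "Gnutella",
}

def classify_by_port(src_port, dst_port):
    return "P2P" if src_port in P2P_PORTS or dst_port in P2P_PORTS else "unknown"

def classify_by_signature(payload):
    for sig, app in SIGNATURES.items():
        if payload.startswith(sig):
            return app
    return "unknown"

# A flow on a random high port is missed by port-based classification
# but caught by its payload signature.
print(classify_by_port(51413, 40211))                                 # unknown
print(classify_by_signature(b"\x13BitTorrent protocol" + b"\0" * 8))  # BitTorrent
```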


IEEE Network | 2000

Traffic analysis of a Web proxy caching hierarchy

Anirban Mahanti; Carey L. Williamson; Derek L. Eager

Understanding Web traffic characteristics is key to improving the performance and scalability of the Web. In this article Web proxy workloads from different levels of a caching hierarchy are used to understand how the workload characteristics change across different levels of a caching hierarchy. The main observations of this study are that HTML and image documents account for 95 percent of the documents seen in the workload; the distribution of transfer sizes of documents is heavy-tailed, with the tails becoming heavier as one moves up the caching hierarchy; the popularity profile of documents does not precisely follow the Zipf distribution; one-timers account for approximately 70 percent of the documents referenced; concentration of references is less at proxy caches than at servers, and concentration of references diminishes as one moves up the caching hierarchy; and the modification rate is higher at higher-level proxies.
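
As a small illustration of the Zipf comparison mentioned above, the sketch below fits the rank-frequency relation of per-document reference counts on a log-log scale; the counts are synthetic stand-ins, not data from the study.

```python
# Minimal sketch (not from the article): testing how closely document
# popularity follows a Zipf distribution. Given per-document reference
# counts (here synthetic), fit log(frequency) against log(rank);
# a slope near -1 would indicate a classic Zipf profile.
import numpy as np

# Hypothetical reference counts for 1000 documents, most popular first.
counts = np.sort(np.random.zipf(a=1.8, size=1000))[::-1]

ranks = np.arange(1, len(counts) + 1)
slope, intercept = np.polyfit(np.log(ranks), np.log(counts), 1)
print("fitted rank-frequency slope:", slope)
```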


IEEE Transactions on Visualization and Computer Graphics | 2008

VisGets: Coordinated Visualizations for Web-based Information Exploration and Discovery

Marian Dörk; M. Sheelagh T. Carpendale; Christopher Collins; Carey L. Williamson

In common Web-based search interfaces, it can be difficult to formulate queries that simultaneously combine temporal, spatial, and topical data filters. We investigate how coordinated visualizations can enhance search and exploration of information on the World Wide Web by easing the formulation of these types of queries. Drawing from visual information seeking and exploratory search, we introduce VisGets - interactive query visualizations of Web-based information that operate with online information within a Web browser. VisGets provide the information seeker with visual overviews of Web resources and offer a way to visually filter the data. Our goal is to facilitate the construction of dynamic search queries that combine filters from more than one data dimension. We present a prototype information exploration system featuring three linked VisGets (temporal, spatial, and topical), and use it to visually explore news items from online RSS feeds.
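
Below is a minimal sketch of the kind of conjunctive filtering a set of VisGets constructs, combining temporal, spatial, and topical filters over a list of items; the data, field names, and function are invented for illustration and are not the VisGets implementation.

```python
# Minimal sketch (not the VisGets implementation) of conjunctive filtering
# across temporal, spatial, and topical dimensions, the kind of dynamic
# query the coordinated views construct. Data and field names are invented.
from datetime import date

items = [
    {"date": date(2008, 3, 1), "place": "Calgary", "tags": {"flood", "weather"}},
    {"date": date(2008, 3, 5), "place": "Toronto", "tags": {"election"}},
    {"date": date(2008, 3, 9), "place": "Calgary", "tags": {"election", "debate"}},
]

def visget_filter(items, start=None, end=None, place=None, tag=None):
    """Keep items that satisfy every active filter; inactive filters pass all."""
    out = []
    for it in items:
        if start and it["date"] < start:
            continue
        if end and it["date"] > end:
            continue
        if place and it["place"] != place:
            continue
        if tag and tag not in it["tags"]:
            continue
        out.append(it)
    return out

print(visget_filter(items, start=date(2008, 3, 2), place="Calgary"))
# -> only the March 9 Calgary item
```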


International World Wide Web Conference | 2008

A comparative analysis of web and peer-to-peer traffic

Naimul Basher; Aniket Mahanti; Anirban Mahanti; Carey L. Williamson; Martin F. Arlitt

Peer-to-Peer (P2P) applications continue to grow in popularity, and have reportedly overtaken Web applications as the single largest contributor to Internet traffic. Using traces collected from a large edge network, we conduct an extensive analysis of P2P traffic, compare P2P traffic with Web traffic, and discuss the implications of increased P2P traffic. In addition to studying the aggregate P2P traffic, we also analyze and compare the two main constituents of P2P traffic in our data, namely BitTorrent and Gnutella. The results presented in the paper may be used for generating synthetic workloads, gaining insights into the functioning of P2P applications, and developing network management strategies. For example, our results suggest that new models are necessary for Internet traffic. As a first step, we present flow-level distributional models for Web and P2P traffic that may be used in network simulation and emulation experiments.
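
As one hedged example of building such flow-level distributional models, the sketch below fits a candidate (lognormal) distribution to flow sizes and checks the fit; the flow sizes are synthetic and the lognormal family is only one plausible choice, not necessarily the model used in the paper.

```python
# Minimal sketch (not the paper's models): fitting a candidate flow-size
# distribution to observed flow sizes, as one might when building flow-level
# distributional models. The sample flow sizes are synthetic stand-ins.
import numpy as np
from scipy import stats

flow_bytes = np.random.lognormal(mean=9.0, sigma=2.0, size=5000)  # stand-in data

shape, loc, scale = stats.lognorm.fit(flow_bytes, floc=0)
print("fitted lognormal: sigma =", shape, "median =", scale)

# Goodness of fit via a Kolmogorov-Smirnov test against the fitted model.
ks_stat, p_value = stats.kstest(flow_bytes, "lognorm", args=(shape, loc, scale))
print("KS statistic:", ks_stat, "p-value:", p_value)
```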


Measurement and Modeling of Computer Systems | 2008

Analysis of BitTorrent-like protocols for on-demand stored media streaming

Nadim Parvez; Carey L. Williamson; Anirban Mahanti; Niklas Carlsson

This paper develops analytic models that characterize the behavior of on-demand stored media content delivery using BitTorrent-like protocols. The models capture the effects of different piece selection policies, including Rarest-First and two variants of In-Order. Our models provide insight into transient and steady-state system behavior, and help explain the sluggishness of the system with strict In-Order streaming. We use the models to compare different retrieval policies across a wide range of system parameters, including peer arrival rate, upload/download bandwidth, and seed residence time. We also provide quantitative results on the startup delays and retrieval times for streaming media delivery. Our results provide insights into the optimal design of peer-to-peer networks for on-demand media streaming.
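
To illustrate the piece-selection policies the models compare, here is a minimal sketch of strict In-Order versus Rarest-First selection; the availability counts are invented and this is not the paper's analytic model.

```python
# Minimal sketch (not the paper's analytic models) of two piece-selection
# policies the models compare: strict In-Order versus Rarest-First.
# The availability counts are invented for illustration.

def in_order_pick(needed):
    """Strict In-Order: always fetch the earliest missing piece."""
    return min(needed)

def rarest_first_pick(needed, availability):
    """Rarest-First: fetch the missing piece held by the fewest peers."""
    return min(needed, key=lambda p: availability[p])

needed = {0, 1, 2, 3, 4}
availability = {0: 9, 1: 7, 2: 2, 3: 5, 4: 8}   # copies of each piece in the swarm

print(in_order_pick(needed))                    # -> 0
print(rarest_first_pick(needed, availability))  # -> 2
```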

Collaboration


Top collaborators of Carey L. Williamson.

Top Co-Authors

Anirban Mahanti
French Institute for Research in Computer Science and Automation

Richard B. Bunt
University of Saskatchewan