Publication


Featured research published by Christian Kreibich.


ACM Special Interest Group on Data Communication | 2004

Honeycomb: creating intrusion detection signatures using honeypots

Christian Kreibich; Jon Crowcroft

This paper describes a system for automated generation of attack signatures for network intrusion detection systems. Our system applies pattern-matching techniques and protocol conformance checks on multiple levels in the protocol hierarchy to network traffic captured on a honeypot system. We present results of running the system on an unprotected cable modem connection for 24 hours. The system successfully created precise traffic signatures that would otherwise have required the skills and time of a security officer inspecting the traffic manually.
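
As a rough illustration of the pattern-matching idea described above (not the paper's actual implementation), one simple way to derive a candidate byte-pattern signature is to take the longest common substring of payloads captured for the same service; the payloads below are made-up examples.

# Sketch only: candidate signature from two hypothetical honeypot payloads.
def longest_common_substring(a: bytes, b: bytes) -> bytes:
    """Dynamic-programming longest common substring over raw payload bytes."""
    best_len, best_end = 0, 0
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1
                if cur[j] > best_len:
                    best_len, best_end = cur[j], i
        prev = cur
    return a[best_end - best_len:best_end]

# Two payloads observed on the same destination port (hypothetical examples):
p1 = b"GET /default.ida?XXXX HTTP/1.0\r\n"
p2 = b"GET /default.ida?NNNN HTTP/1.0\r\n"
candidate = longest_common_substring(p1, p2)
if len(candidate) >= 8:  # ignore trivially short matches
    print("candidate signature:", candidate)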


Internet Measurement Conference | 2010

Netalyzr: illuminating the edge network

Christian Kreibich; Nicholas Weaver; Boris Nechaev; Vern Paxson

In this paper we present Netalyzr, a network measurement and debugging service that evaluates the functionality provided by people's Internet connectivity. The design aims to prove both comprehensive in terms of the properties we measure and easy to employ and understand for users with little technical background. We structure Netalyzr as a signed Java applet (which users access via their Web browser) that communicates with a suite of measurement-specific servers. Traffic between the two then probes for a diverse set of network properties, including outbound port filtering, hidden in-network HTTP caches, DNS manipulations, NAT behavior, path MTU issues, IPv6 support, and access-modem buffer capacity. In addition to reporting results to the user, Netalyzr also forms the foundation for an extensive measurement of edge-network properties. To this end, along with describing Netalyzr's architecture and system implementation, we present a detailed study of 130,000 measurement sessions that the service has recorded since we made it publicly available in June 2009.
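
A minimal sketch of one class of check the abstract mentions, outbound port filtering: attempt TCP connections to a cooperating measurement server on several well-known ports and note which fail. The server name below is a placeholder, not an actual Netalyzr endpoint.

import socket

SERVER = "measurement.example.net"   # hypothetical echo/measurement server
PORTS = [25, 80, 135, 443, 445]      # SMTP, HTTP, RPC, HTTPS, SMB

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if an outbound TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in PORTS:
    status = "open" if port_reachable(SERVER, port) else "blocked or filtered?"
    print(f"outbound port {port}: {status}")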


Internet Measurement Conference | 2006

Unexpected means of protocol inference

Justin Ma; Kirill Levchenko; Christian Kreibich; Stefan Savage; Geoffrey M. Voelker

Network managers are inevitably called upon to associate network traffic with particular applications. Indeed, this operation is critical for a wide range of management functions ranging from debugging and security to analytics and policy support. Traditionally, managers have relied on application adherence to a well-established global port mapping: Web traffic on port 80, mail traffic on port 25, and so on. However, a range of factors - including firewall port blocking, tunneling, dynamic port allocation, and a bloom of new distributed applications - has weakened the value of this approach. We analyze three alternative mechanisms using statistical and structural content models for automatically identifying traffic that uses the same application-layer protocol, relying solely on flow content. In this manner, known applications may be identified regardless of port number, while traffic from one unknown application will be identified as distinct from another. We evaluate each mechanism's classification performance using real-world traffic traces from multiple sites.
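
To convey the flavor of a statistical content model (the paper's actual models are considerably richer), one can represent each flow by the byte-value distribution of its first payload bytes and assign it to the closest known protocol profile; the example payloads below are made up.

from collections import Counter

def byte_distribution(payload: bytes, prefix: int = 64) -> list[float]:
    """Normalized frequency of each byte value over the first prefix bytes."""
    data = payload[:prefix]
    counts = Counter(data)
    total = max(len(data), 1)
    return [counts.get(v, 0) / total for v in range(256)]

def l1(p: list[float], q: list[float]) -> float:
    """L1 distance between two byte distributions."""
    return sum(abs(a - b) for a, b in zip(p, q))

profiles = {
    "http": byte_distribution(b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n"),
    "smtp": byte_distribution(b"EHLO client.example.com\r\nMAIL FROM:<a@b>\r\n"),
}

unknown = b"POST /submit HTTP/1.1\r\nContent-Length: 12\r\n\r\nhello world!"
guess = min(profiles, key=lambda name: l1(profiles[name], byte_distribution(unknown)))
print("closest protocol profile:", guess)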


IEEE Symposium on Security and Privacy | 2011

Click Trajectories: End-to-End Analysis of the Spam Value Chain

Kirill Levchenko; Andreas Pitsillidis; Neha Chachra; Brandon Enright; Mark Felegyhazi; Chris Grier; Tristan Halvorson; Chris Kanich; Christian Kreibich; He Liu; Damon McCoy; Nicholas Weaver; Vern Paxson; Geoffrey M. Voelker; Stefan Savage

Spam-based advertising is a business. While it has engendered both widespread antipathy and a multi-billion dollar anti-spam industry, it continues to exist because it fuels a profitable enterprise. We lack, however, a solid understanding of this enterprise's full structure, and thus most anti-spam interventions focus on only one facet of the overall spam value chain (e.g., spam filtering, URL blacklisting, site takedown). In this paper we present a holistic analysis that quantifies the full set of resources employed to monetize spam email -- including naming, hosting, payment, and fulfillment -- using extensive measurements of three months of diverse spam data, broad crawling of naming and hosting infrastructures, and over 100 purchases from spam-advertised sites. We relate these resources to the organizations that administer them and then use this data to characterize the relative prospects for defensive interventions at each link in the spam value chain. In particular, we provide the first strong evidence of payment bottlenecks in the spam value chain: 95% of spam-advertised pharmaceutical, replica, and software products are monetized using merchant services from just a handful of banks.


Computer and Communications Security | 2009

Dispatcher: enabling active botnet infiltration using automatic protocol reverse-engineering

Juan Caballero; Pongsin Poosankam; Christian Kreibich; Dawn Song

Automatic protocol reverse-engineering is important for many security applications, including the analysis and defense against botnets. Understanding the command-and-control (C&C) protocol used by a botnet is crucial for anticipating its repertoire of nefarious activity and for enabling active botnet infiltration. Frequently, security analysts need to rewrite messages sent and received by a bot in order to contain malicious activity and to provide the botmaster with an illusion of successful and unhampered operation. To enable such rewriting, we need detailed information about the intent and structure of the messages in both directions of the communication, despite the fact that we generally only have access to the implementation of one endpoint, namely the bot binary. Current techniques cannot enable such rewriting. In this paper, we propose techniques to extract the format of protocol messages sent by an application that implements a protocol specification, and to infer the field semantics for messages both sent and received by the application. Our techniques enable applications such as rewriting the C&C messages for active botnet infiltration. We implement our techniques in Dispatcher, a tool to extract the message format and field semantics of both received and sent messages. We use Dispatcher to analyze MegaD, a prevalent spam botnet employing a hitherto undocumented C&C protocol, and show that the protocol information extracted by Dispatcher can be used to rewrite the C&C messages.
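
As a toy illustration of field-semantics inference over raw message bytes (Dispatcher itself infers formats by instrumenting the bot binary's execution, which this sketch does not attempt to reproduce), one simple heuristic flags byte positions whose value matches the number of bytes that follow, i.e. candidate length fields. The message below is hypothetical.

def candidate_length_fields(msg: bytes) -> list[int]:
    """Return offsets whose byte value equals the remaining message length."""
    hits = []
    for offset, value in enumerate(msg):
        if value == len(msg) - offset - 1 and value > 0:
            hits.append(offset)
    return hits

# Hypothetical C&C-style message: a 1-byte type, a 1-byte length, then payload.
message = bytes([0x01, 0x0b]) + b"hello world"
print("candidate length-field offsets:", candidate_length_fields(message))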


IEEE Symposium on Security and Privacy | 2012

Prudent Practices for Designing Malware Experiments: Status Quo and Outlook

Christian Rossow; Christian Dietrich; Chris Grier; Christian Kreibich; Vern Paxson; Norbert Pohlmann; Herbert Bos; Maarten van Steen

Malware researchers rely on the observation of malicious code in execution to collect datasets for a wide array of experiments, including generation of detection models, study of longitudinal behavior, and validation of prior research. For such research to reflect prudent science, the work needs to address a number of concerns relating to the correct and representative use of the datasets, presentation of methodology in a fashion sufficiently transparent to enable reproducibility, and due consideration of the need not to harm others. In this paper we study the methodological rigor and prudence in 36 academic publications from 2006 to 2011 that rely on malware execution; 40% of these papers appeared in the six highest-ranked academic security conferences. We find frequent shortcomings, including problematic assumptions regarding the use of execution-driven datasets (25% of the papers), absence of description of security precautions taken during experiments (71% of the articles), and often insufficient description of the experimental setup. Deficiencies occur in top-tier venues and elsewhere alike, highlighting a need for the community to improve its handling of malware datasets. In the hope of aiding authors, reviewers, and readers, we frame guidelines regarding transparency, realism, correctness, and safety for collecting and using malware datasets.


International Conference on Detection of Intrusions and Malware, and Vulnerability Assessment | 2005

Enhancing the accuracy of network-based intrusion detection with host-based context

Holger Dreger; Christian Kreibich; Vern Paxson; Robin Sommer

In the recent past, both network- and host-based approaches to intrusion detection have received much attention in the network security community. No approach, taken exclusively, provides a satisfactory solution: network-based systems are prone to evasion, while host-based solutions suffer from scalability and maintenance problems. In this paper we present an integrated approach, leveraging the best of both worlds: we preserve the advantages of network-based detection, but alleviate its weaknesses by improving the accuracy of the traffic analysis with specific host-based context. Our framework preserves a separation of policy from mechanism, is highly configurable and more flexible than sensor/manager-based architectures, and imposes a low overhead on the involved end hosts. We include a case study of our approach for a notoriously hard problem for purely network-based systems: the correct processing of HTTP requests.
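
A minimal sketch of the general idea of adding host context to network analysis: compare the request the network monitor reconstructed from the wire with the request the end host actually processed, and flag disagreements as possible evasion. The data structures and field names here are illustrative, not the paper's.

def check_http_view(network_view: dict, host_view: dict) -> list[str]:
    """Report fields where the wire-level view and the host's view disagree."""
    alerts = []
    for field in ("method", "uri", "host"):
        if network_view.get(field) != host_view.get(field):
            alerts.append(f"mismatch in {field}: "
                          f"network saw {network_view.get(field)!r}, "
                          f"host processed {host_view.get(field)!r}")
    return alerts

network_view = {"method": "GET", "uri": "/index.html", "host": "example.com"}
host_view = {"method": "GET", "uri": "/admin/../index.html", "host": "example.com"}
for alert in check_http_view(network_view, host_view):
    print(alert)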


Internet Measurement Conference | 2012

Fathom: a browser-based network measurement platform

Mohan Dhawan; Justin Samuel; Renata Teixeira; Christian Kreibich; Mark Allman; Nicholas Weaver; Vern Paxson

For analyzing network performance issues, there can be great utility in having the capability to measure directly from the perspective of end systems. Because end systems do not provide any external programming interface to measurement functionality, obtaining this capability today generally requires installing a custom executable on the system, which can prove prohibitively expensive. In this work we leverage the ubiquity of web browsers to demonstrate the possibilities of browsers themselves offering such a programmable environment. We present Fathom, a Firefox extension that implements a number of measurement primitives that enable websites or other parties to program network measurements using JavaScript. Fathom is lightweight, imposing < 3.2% overhead in page load times for popular web pages, and often provides 1 ms timestamp accuracy. We demonstrate Fathom's utility with three case studies: providing a JavaScript version of the Netalyzr network characterization tool, debugging web access failures, and enabling web sites to diagnose performance problems of their clients.
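
Fathom's primitives are exposed to JavaScript inside the browser; as a rough stand-in, the sketch below shows the kind of timed fetch such a measurement primitive could provide. The target URL is a placeholder.

import time
import urllib.request

def timed_fetch(url: str) -> tuple[int, float]:
    """Return (bytes received, elapsed seconds) for a single HTTP GET."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=10) as response:
        body = response.read()
    return len(body), time.monotonic() - start

size, elapsed = timed_fetch("http://example.com/")
print(f"fetched {size} bytes in {elapsed * 1000:.1f} ms")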


Passive and Active Network Measurement | 2012

Probe and pray: using UPnP for home network measurements

Lucas DiCioccio; Renata Teixeira; Martin May; Christian Kreibich

Network measurement practitioners increasingly focus their interest on understanding and debugging home networks. The Universal Plug and Play (UPnP) technology holds promise as a highly efficient way to collect and leverage measurement data and configuration settings available from UPnP-enabled devices found in home networks. Unfortunately, UPnP proves less available and reliable than one would hope. In this paper, we explore the usability of UPnP as a means to measure and characterize home networks. We use data from 120,000 homes, collected with the HomeNet Profiler and Netalyzr troubleshooting suites. Our results show that in the majority of homes we could not collect any UPnP data at all, and when we could, the results were frequently inaccurate or simply wrong. Whenever UPnP-supplied data proved accurate, however, we demonstrate that UPnP provides an array of useful measurement techniques for inferring home network traffic and losses, for identifying home gateway models with configuration or implementation issues, and for obtaining ground truth on access link capacity.
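
A minimal sketch of the UPnP discovery step such measurements rely on: send an SSDP M-SEARCH to the standard multicast address and print any responses from devices on the local network. The follow-up SOAP queries for gateway traffic counters are omitted here.

import socket

MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: upnp:rootdevice\r\n\r\n"
).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3.0)
sock.sendto(MSEARCH, ("239.255.255.250", 1900))
try:
    while True:
        data, addr = sock.recvfrom(65535)
        print(f"UPnP response from {addr[0]}:")
        print(data.decode(errors="replace"))
except socket.timeout:
    pass  # no more responses within the timeout
finally:
    sock.close()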


International Conference on Mobile Systems, Applications, and Services | 2015

Beyond the Radio: Illuminating the Higher Layers of Mobile Networks

Narseo Vallina-Rodriguez; Srikanth Sundaresan; Christian Kreibich; Nicholas Weaver; Vern Paxson

Cellular network performance is often viewed as primarily dominated by the radio technology. However, reality proves more complex: mobile operators deploy and configure their networks in different ways, and sometimes establish network sharing agreements with other mobile carriers. Moreover, regulators have encouraged newer operational models such as Mobile Virtual Network Operators (MVNOs) to promote competition. In this paper we draw upon data collected by the ICSI Netalyzr app for Android to characterize how operational decisions, such as network configurations, business models, and relationships between operators introduce diversity in service quality and affect user security and privacy. We delve in detail beyond the radio link and into network configuration and business relationships in six countries. We identify the widespread use of transparent middleboxes such as HTTP and DNS proxies, analyzing how they actively modify user traffic, compromise user privacy, and potentially undermine user security. In addition, we identify network sharing agreements between operators, highlighting the implications of roaming and characterizing the properties of MVNOs, including that a majority are simply rebranded versions of major operators. More broadly, our findings highlight the importance of considering higher-layer relationships when seeking to analyze mobile traffic in a sound fashion.
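
One simple check in the spirit of the middlebox analysis above: query a name that should not exist and see whether the configured resolver returns an address anyway, which would suggest NXDOMAIN rewriting by an in-path DNS proxy. The probe name is randomly generated and hypothetical.

import socket
import uuid

probe = f"nonexistent-{uuid.uuid4().hex[:12]}.example.com"
try:
    answer = socket.gethostbyname(probe)
    print(f"{probe} unexpectedly resolved to {answer}: possible DNS manipulation")
except socket.gaierror:
    print(f"{probe} correctly failed to resolve: no NXDOMAIN rewriting observed")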

Collaboration


Dive into Christian Kreibich's collaboration.

Top Co-Authors

Vern Paxson, University of California

Stefan Savage, University of California

Mark Allman, International Computer Science Institute

Chris Kanich, University of Illinois at Chicago