
Publication


Featured research published by Leigh Metcalf.


International Conference on Distributed Computing and Internet Technology | 2014

Cyber Security via Signaling Games: Toward a Science of Cyber Security

William Casey; Jose Andre Morales; Thomson Nguyen; Jonathan M. Spring; Rhiannon Weaver; Evan Wright; Leigh Metcalf; Bud Mishra

In March of 2013, what started as a minor dispute between Spamhaus and Cyberbunker quickly escalated into a distributed denial of service (DDoS) attack so massive that it was claimed to have slowed internet speeds around the globe. The attack clogged servers with dummy internet traffic at a rate of about 300 gigabits per second. By comparison, the largest DDoS attacks observed until then, typically against banks, had registered only 50 gigabits per second. The record-breaking Spamhaus/Cyberbunker conflict arose 13 years after the publication of best practices on preventing DDoS attacks, and it was not an isolated event. Recently, NYU's Courant Institute and Carnegie Mellon's Software Engineering Institute have collaboratively devised game-theoretic approaches to address various cyber security problems involving asymmetric exchange of information. This research aims to discover and understand complex structures of malicious use cases within the context of secure systems, with the goal of developing an incentives-based measurement system that ensures a high level of resilience to attack.


International Congress on Big Data | 2014

SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale

Mark Thomas; Leigh Metcalf; Jonathan M. Spring; Paul Krystosek; Katherine Prevost

A large organization can generate over ten billion network flow records per day, a high-velocity data source. Finding useful, security-related anomalies in this volume of data is challenging. Most large network flow tools sample the data to make the problem manageable, but sampling unacceptably reduces the fidelity of analytic conclusions. In this paper we discuss SiLK, a tool suite created to analyze this high-volume data source without sampling. SiLK's implementation and architectural design are optimized to manage this Big Data problem. SiLK provides not just network flow capture and analysis but also tools to analyze large sets and dictionaries that frequently relate to network flow data, incorporating higher-variety data sources. These tools integrate disparate data sources with SiLK analysis.
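As a rough illustration of the kind of unsampled, per-key aggregation the SiLK tools perform, the following Python sketch sums bytes per source address over every record; the flow schema and values are hypothetical, and this is not SiLK's actual API.

```python
# Minimal sketch (not SiLK itself): aggregate every flow record without sampling,
# illustrating the kind of per-key rollup performed over high-volume flow data.
from collections import Counter
from typing import Iterable, Tuple

# Hypothetical flow record schema: (src_ip, dst_ip, dst_port, bytes)
FlowRecord = Tuple[str, str, int, int]

def top_talkers(flows: Iterable[FlowRecord], n: int = 3):
    """Sum bytes per source IP over *all* records (no sampling) and return the top n."""
    byte_counts = Counter()
    for src_ip, _dst_ip, _dst_port, nbytes in flows:
        byte_counts[src_ip] += nbytes
    return byte_counts.most_common(n)

if __name__ == "__main__":
    sample_flows = [
        ("10.0.0.1", "192.0.2.10", 443, 15_000),
        ("10.0.0.2", "192.0.2.10", 443, 2_000),
        ("10.0.0.1", "192.0.2.20", 53, 120),
        ("10.0.0.3", "198.51.100.7", 80, 900_000),  # anomalously large transfer
    ]
    print(top_talkers(sample_flows))
```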


BICT '14: Proceedings of the 8th International Conference on Bio-inspired Information and Communications Technologies | 2014

Cyber security via minority games with epistatic signaling: invited paper

William Casey; Rhiannon Weaver; Leigh Metcalf; Jose Andre Morales; Evan Wright; Bud Mishra

We present a game-theoretic framework modeling strategic interactions among humans and things, which are assumed to be interconnected by a social-technological network, as in the Internet of Humans and Things (IOHT). Often a pair of agents in the network interact so that an informed sender-agent can signal an uninformed receiver-agent to take an action that benefits each of the players; the benefits to the pair of agents are modeled by two separate utility functions, both depending on the sender's private information, the signal exchanged, and the receiver's revealed (and unrevealed) action. In general, the two agents' utilities may not be aligned and may encourage deceptive behavior. For example, a sender, aware of his own private "state of ignorance," may seek useful information from a receiver who owns powerful computational resources to search a large corpus of webpages; the sender does so by sending a signal to the receiver in the form of a keyword. Obvious examples of deceptiveness here range from attempts to hide one's intentions to auctioning the keywords on an ad exchange through real-time bidding. A rather troublesome situation occurs when deceptions are employed to breach the security of the system, thus making the entire social-technological network unreliable. Earlier, we proposed a signaling-game-theoretic framework to alleviate this problem. This paper further enhances it by reconfiguring signals to possess more complex structures (epistatic signals that represent attack and defense options over a given set of vulnerabilities). We explore two augmentations to the original evolutionary signaling game: first, enhancing mutation bias toward strategies that performed well in previous populations, and second, allowing the parameters of the utility functions to depend on population preferences, giving rise to a minority game with epistatic signaling. The resulting game systems are studied empirically through extensive computer simulation.

1. GAMES AND CYBER CONFLICTS

At the core of many dynamic online strategic interactions are simple information-asymmetric games, which permit the agents to act deceptively to gain advantages. Take, for example, the flashlight app for smartphones that was discovered to open a GPS-tracking backdoor and gain private information by logging the device's physical locations (discovery reported in [6]). Whereas the producer (i.e., sender) of the flashlight app may advertise (i.e., signal) that the application is designed to provide a flashlight feature for smartphones, the sender creates the deceptive impression of respecting the user's privacy by giving the app a benign-sounding name: "Flashlight App." A typical user's expectations of privacy would proscribe the surveillance capabilities (physically tracking the user's device via GPS) and would not foresee encroachment by an app that is prima facie simple, benign, and desirable. In this case (and others like it) a typical consumer (receiver) may recognize that they have been deceived, may label the producer (sender) as a miscreant, and may tarnish the producer's reputation with a negative ranking and comments labeling the app as "backdoor," "Trojan," or "malware." Such verification processes are aimed at protecting future consumers. However, the encounter, concluded before the discovery of the attack, has its costs and benefits: the cost to the receiver is the loss of privacy, and the benefit to the sender is the ability to gain strategic informational advantages with unanticipated usages.
In considering signaling games for cyber security to model interactions such as the one above, we envision that security properties such as non-surveillance can be checked, dynamically and efficiently, via two additional mechanisms: namely, (i) a social-technological recommendation-verification system involving additional players, and (ii) a currency system, represented by M-Coins, certificates backing the proofs concerning the behavior of the agents. We also extend the receiver's strategy space by providing it with means to challenge the sender. Note that, without proof or certification that the app's behavior complies with reasonable security properties, the receiver is left with the options to either trust the sender or attempt to challenge them. Such challenges may seek their own or otherwise trusted proofs or certificates to let the receiver decide whether the sender is being deceptive. Motivated by biological systems, we provide another extension to allow our recommendation-verification system to address the many distinct attacks that a producer (sender) could use to deceptively ensnare a consumer (receiver). Here, we describe this extension of signaling games to include diverse attack vectors, and we term this extension epistatic signaling games. After defining epistatic signaling games we present experiments designed to understand their dynamics empirically and to show how such a system could operate in practice. We relegate a formal description of the system and proofs of its various properties to the full paper.

1.1 Signaling Games in Cyber Security

A signaling game is a dynamic game with two players, the sender (S) and the receiver (R). The sender is assumed to possess a certain type, t ∈ T, which is selected by nature (we think of it as the sender's private information; thus the sender observes his own type while the receiver does not). The sender chooses a message α from a set of possible messages M; in epistatic signaling games the message may contain attacks upon any subset of K distinct vulnerabilities, denoted V = {v_1, ..., v_K}, including the empty set, which we term a clean or benign signal. The receiver observes the message but not the type of the sender or the attacks implicitly encoded in the message. The receiver then chooses an action γ from a set of feasible actions C, which include challenges to various attacks. Letting c_i be the check for an attack on the i-th vulnerability v_i, the receiver's options are subsets of C (also denoted by the vulnerabilities v_i, i = 1, ..., K, being challenged), with the empty set representing the option of receiving messages with no challenge at all (either a trusting or an insouciant option). The utility functions are given by U_S : T × M × C → R for the sender and U_R : T × M × C → R for the receiver. In the context of cyber security, we always consider the symmetric game with repetitions (as opposed to one-shot), in which both players play both roles. Basic signaling games were introduced by In-Koo Cho and David M. Kreps in a 1987 article [5]. By considering a single attack option and a single checking action we explore the effects of deceptive agents in cyber security problems via simulations; these simulations reveal a range of outcomes for system behavior over the space of payoff parameters ([3]). Epistatic signaling games differ from the signaling games we originally introduced for cyber security in the following two ways.
First, in signaling games the strategic options for sender and receiver are limited to a single attack and a single challenge option; a signaling game is the special case of the general epistatic signaling formulation when K = 1. Higher, but bounded, values of K > 1 add realism to the model by constructing the attack surface as K vulnerable objects. (Vulnerabilities may be considered code objects which can be exploited and attacked by a malicious user.) The second way in which this approach differs from traditional signaling games is that we simplify the transitions between strategies in repeated games. In this approach we limit the agents to two transitions, based on whether or not a detection event occurred. While this constraint may appear limiting, it is more realistic, since agents are primarily interested in resolving an attack (i.e., a detection event); note in particular that in the case of an undetected attack, the user will not have immediate access to which attack succeeded. We briefly review the strategic options and payoffs of signaling games for cyber security to make clear the relation among classical signaling games, the signaling games previously introduced for cyber security, and the epistatic signaling games of this approach.

The strategic options: In signaling games (when K = 1) the sender may choose to send cooperatively (C) or to send an attack (D). Similarly, the options for the receiver are to accept trustingly (C) or to challenge (D). We encode all options of the symmetric game using strings where the first letter denotes the sender's type and the second the receiver's action. Using this encoding, the option space for a single round of the symmetric signaling game is the set {CC, CD, DC, DD}.

Game payoff: The payoff matrix for the symmetric signaling game (with K = 1) is defined over the product of row-player options and column-player options. In this matrix, d is the benefit of an attack for the sender (assumed to be a zero-sum quantity), e is the cost of getting caught attacking as sender, f is the prize for catching an attacker, and g is the cost of challenging a sender as receiver. The payoff matrix is:
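Assuming each round combines the row player's payoff as sender (against the column player's receiver action) with its payoff as receiver (against the column player's sender type), the parameter definitions above yield the following plausible reconstruction of the symmetric payoff matrix; the sign conventions and arrangement are assumptions, not taken verbatim from the paper.

```latex
% Hedged sketch: row player's total payoff (sender role + receiver role),
% rows and columns indexed by the strategies CC, CD, DC, DD defined above.
% Derived only from the stated meanings of d, e, f, g; conventions assumed.
\[
\begin{array}{c|cccc}
       & CC     & CD     & DC        & DD         \\ \hline
CC     & 0      & 0      & -d        & -d         \\
CD     & -g     & -g     & f - g     & f - g      \\
DC     & d      & -e     & 0         & -e - d     \\
DD     & d - g  & -e - g & d + f - g & -e + f - g
\end{array}
\]
```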


Cybersecurity and Applied Mathematics | 2016

Introduction to data analysis

Leigh Metcalf; William Casey

Statistics is the study of the collection, analysis, and interpretation of data. The process can be illustrated as the collection of data, exploratory data analysis, and the application of probability and random numbers to the data, followed by inference, which relates the analysis of the collected subset of data to the entire population. This chapter discusses data collection and exploratory data analysis, with an emphasis on visualizations as they pertain to cybersecurity data.
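As a small illustration of the exploratory step described here, the following Python sketch computes summary statistics over a hypothetical series of failed-login counts; the data and variable names are invented for the example.

```python
# Minimal sketch, assuming a hypothetical list of failed-login counts per hour;
# illustrates exploratory summary statistics before any formal inference.
import statistics

failed_logins_per_hour = [3, 5, 2, 4, 6, 3, 250, 4, 5, 2, 3, 4]  # synthetic data

mean = statistics.mean(failed_logins_per_hour)
median = statistics.median(failed_logins_per_hour)
stdev = statistics.stdev(failed_logins_per_hour)

print(f"mean={mean:.1f} median={median} stdev={stdev:.1f}")
# A mean far above the median flags the 250-login hour as worth investigating.
```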


Cybersecurity and Applied Mathematics | 2016

Metrics, similarity, and sets

Leigh Metcalf; William Casey

In this chapter we give an introduction to set theory, covering common operations such as subset, intersection, union, set difference, complement, and symmetric difference, with examples from cybersecurity data. Set functions are also discussed, which leads us directly to the definition of a metric. We cover variations of metrics, including pseudometrics, quasimetrics, and semimetrics. Similarities are also discussed. We then illustrate metrics on various objects, including strings and sets, as well as Internet- and cybersecurity-specific metrics.
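A minimal sketch of one such set-based measure is the Jaccard distance, shown below applied to hypothetical sets of domains contacted by two hosts; the hosts and domains are invented, and the chapter covers this family of metrics in general rather than this specific snippet.

```python
# Minimal sketch: Jaccard distance between sets of domains contacted by two hosts.
def jaccard_distance(a: set, b: set) -> float:
    """1 - |A ∩ B| / |A ∪ B|; 0 for identical sets, 1 for disjoint ones."""
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

# Hypothetical per-host domain sets
host_a = {"example.com", "update.example.net", "cdn.example.org"}
host_b = {"example.com", "tracker.example.info"}

print(jaccard_distance(host_a, host_b))  # 0.75: largely different behavior
print(jaccard_distance(host_a, host_a))  # 0.0: identical behavior
```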


Cybersecurity and Applied Mathematics | 2016

Visualizing cybersecurity data

Leigh Metcalf; William Casey

Cybersecurity has a history of not doing visualizations well. This chapter discusses the goals of a good visualization and how to create a visualization that best satisfies predetermined goals. Visualizations directly related to cybersecurity data are discussed and illustrated. Examples of both good and bad visualizations are given, along with a thorough discussion of their efficacy.
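As a toy example of the kind of basic chart such a chapter discusses, the following sketch uses matplotlib (assumed available) to draw a histogram of synthetic per-host connection counts; it is only an illustration, not an example taken from the chapter.

```python
# Minimal sketch: histogram of per-host connection counts, a common
# starting visualization for spotting outlying hosts in security data.
import matplotlib.pyplot as plt

connections_per_host = [12, 15, 9, 11, 300, 14, 10, 13, 8, 12]  # synthetic data

plt.hist(connections_per_host, bins=20)
plt.xlabel("Connections per host")
plt.ylabel("Number of hosts")
plt.title("Connection counts (one host is an outlier)")
plt.savefig("connections_hist.png")  # write to a file rather than assume a display
```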


Securing and Trusting Internet Names (SATIN 2011), National Physical Laboratory | 2011

Correlating domain registrations and DNS first activity in general and for malware

Jonathan M. Spring; Leigh Metcalf; E. Stoner


Proceedings of the 2nd ACM Workshop on Information Sharing and Collaborative Security | 2015

Blacklist Ecosystem Analysis: Spanning Jan 2012 to Jun 2014

Leigh Metcalf; Jonathan M. Spring


Archive | 2013

Passive Detection of Misbehaving Name Servers

Leigh Metcalf; Jonathan M. Spring


Archive | 2016

Cybersecurity and Applied Mathematics

Leigh Metcalf; William Casey

Collaboration


Dive into Leigh Metcalf's collaborations.

Top Co-Authors

William Casey (Software Engineering Institute)
Evan Wright (Software Engineering Institute)
Jose Andre Morales (Software Engineering Institute)
Rhiannon Weaver (Software Engineering Institute)
Katherine Prevost (Software Engineering Institute)
Mark Thomas (Software Engineering Institute)
Paul Krystosek (Software Engineering Institute)
Thomson Nguyen (Software Engineering Institute)