Rhiannon Weaver
Software Engineering Institute
Publications
Featured research published by Rhiannon Weaver.
Internet Measurement Conference | 2007
M. Patrick Collins; Timothy J. Shimeall; Sidney Faber; Jeff Janies; Rhiannon Weaver; Markus De Shon; Joseph B. Kadane
The increased use of botnets as an attack tool and the awareness attackers have of blocking lists lead to the question of whether we can effectively predict future bot locations. To that end, we introduce a network quality that we term uncleanliness: an indicator of the propensity for hosts in a network to be compromised by outside parties. We hypothesize that unclean networks will demonstrate two properties: spatial and temporal uncleanliness. Spatial uncleanliness is the tendency for compromised hosts to cluster within unclean networks. Temporal uncleanliness is the tendency for unclean networks to contain compromised hosts for extended periods. We test for these properties by collating data from multiple indicators (spamming, phishing, scanning and botnet IRC log monitoring). We demonstrate evidence for both spatial and temporal uncleanliness. We further show evidence of cross-relationships among the various datasets, showing that botnet activity predicts spamming and scanning, while phishing activity appears to be unrelated to the other indicators.
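As a rough illustration of how the two properties could be measured from pooled indicator data, here is a minimal sketch (the input format, field names, and summary statistics are assumptions for illustration; the paper's actual statistical tests are more involved):

    from collections import defaultdict

    def to_slash24(ip):
        # Map a dotted-quad IPv4 address to its covering /24 netblock.
        return ".".join(ip.split(".")[:3]) + ".0/24"

    def uncleanliness_summary(events):
        # events: iterable of (timestamp, ip) pairs flagged by any indicator
        # (spamming, phishing, scanning, botnet IRC monitoring).
        # Spatial uncleanliness ~ many distinct compromised hosts per block;
        # temporal uncleanliness ~ long spans between first and last sighting.
        hosts = defaultdict(set)
        first, last = {}, {}
        for ts, ip in events:
            block = to_slash24(ip)
            hosts[block].add(ip)
            first[block] = min(first.get(block, ts), ts)
            last[block] = max(last.get(block, ts), ts)
        return {b: {"distinct_hosts": len(hosts[b]),
                    "active_span": last[b] - first[b]} for b in hosts}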
Proceedings of the Anti-Phishing Working Group's 2nd Annual eCrime Researchers Summit | 2007
Rhiannon Weaver; M. Patrick Collins
We estimate the extent of phishing activity on the Internet via capture-recapture analysis of two major phishing-site reports. Capture-recapture analysis is a population estimation technique originally developed for wildlife conservation, but it is applicable in any environment wherein multiple independent parties collect reports of an activity. Generating a meaningful population estimate for phishing activity requires addressing complex relationships between phishers and phishing reports. Phishers clandestinely occupy machines and add evasive measures to phishing URLs in order to evade firewalls and other fraud-detection measures. Phishing reports, meanwhile, may demonstrate a preference toward certain classes of phish. We address these problems by estimating population in terms of netblocks and by clustering phishing attempts together into scams, which are phishes that demonstrate similar behavior on multiple axes. We generate population estimates using data from two different phishing reports over an 80-day period, and show that these reports capture approximately 40% of scams and 80% of CIDR/24 (256 contiguous addresses) netblocks involved in phishing.
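For intuition only, the textbook two-source (Lincoln-Petersen) capture-recapture estimator that underlies this kind of analysis, with illustrative numbers that are not taken from the paper: if the two reports capture n_1 and n_2 netblocks with an overlap of m, then

\hat{N} = \frac{n_1 \, n_2}{m}.

For example, n_1 = 400, n_2 = 500, and m = 250 would give \hat{N} = 400 \cdot 500 / 250 = 800, implying that the first report covers 400/800 = 50% of the population. The paper's estimates must additionally contend with non-independence of the reports and with clustering phishes into scams.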
International Conference on Distributed Computing and Internet Technology | 2014
William Casey; Jose Andre Morales; Thomson Nguyen; Jonathan M. Spring; Rhiannon Weaver; Evan Wright; Leigh Metcalf; Bud Mishra
In March of 2013, what started as a minor dispute between Spamhaus and Cyberbunker quickly escalated to a distributed denial of service (DDoS) attack so massive that it was claimed to have slowed internet speeds around the globe. The attack clogged servers with dummy internet traffic at a rate of about 300 gigabits per second. By comparison, the largest DDoS attacks observed thus far, typically against banks, had registered only 50 gigabits per second. The record-breaking Spamhaus/Cyberbunker conflict arose 13 years after the publication of best practices on preventing DDoS attacks, and it was not an isolated event. Recently, NYU's Courant Institute and the Carnegie Mellon Software Engineering Institute have collaboratively devised game-theoretic approaches to address various cyber security problems involving asymmetric exchange of information. This research aims to discover and understand complex structures of malicious use cases within the context of secure systems, with the goal of developing an incentives-based measurement system that ensures a high level of resilience to attack.
Passive and Active Network Measurement | 2010
Rhiannon Weaver
We estimate the number of active machines per hour infected with the Conficker-C worm, using a probability model of Conficker-C's UDP P2P scanning behavior. For an observer with access to a proportion δ of monitored IPv4 space, we derive the distribution of the number of times a single infected host is observed scanning the monitored space, based on a study of the P2P protocol and on network and behavioral variability by relative hour of the day. We use these distributional results in conjunction with the Lévy form of the Central Limit Theorem to estimate the total number of active hosts in a single hour. We apply the model to observed data from Conficker-C scans sent over a 51-day period (March 5th through April 24th, 2009) to a large private network.
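A simplified version of the estimation idea, under assumptions of my own for illustration (each active host emits M scans per hour with destinations uniform over IPv4 space; the paper's model lets scan rates vary with protocol state and hour of day): an observer monitoring a fraction δ of the address space sees X_i ~ Binomial(M, δ) scans from host i, so summing over N active hosts and applying the CLT,

\sum_{i=1}^{N} X_i \;\approx\; \mathcal{N}\big(N M \delta,\; N M \delta (1 - \delta)\big),
\qquad \text{hence} \qquad
\hat{N} \approx \frac{S}{M \delta}

for an observed scan total S, with a confidence interval following from the normal approximation.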
Ramanujan Journal | 2001
Rhiannon Weaver
Let p(n) denote the number of unrestricted partitions of a non-negative integer n. In 1919, Ramanujan proved that for every non-negative integer n,

\begin{gathered}
p(5n + 4) \equiv 0 \pmod{5}, \\
p(7n + 5) \equiv 0 \pmod{7}, \\
p(11n + 6) \equiv 0 \pmod{11}.
\end{gathered}

Recently, Ono proved for every prime m ≥ 5 that there are infinitely many congruences of the form p(An + B) ≡ 0 (mod m). However, his results are theoretical and do not lead to an effective algorithm for finding such congruences. Here we obtain such an algorithm for primes 13 ≤ m ≤ 31, which reveals 76,065 new congruences.
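The first congruence is easy to check numerically with Euler's pentagonal-number recurrence for p(n) (an illustrative brute-force check only; the paper's algorithm for finding new congruences works through the theory of modular forms):

    def partitions(n_max):
        # p(0..n_max) via Euler's pentagonal-number recurrence:
        # p(n) = sum_{k>=1} (-1)^(k+1) * [p(n - k(3k-1)/2) + p(n - k(3k+1)/2)]
        p = [1] + [0] * n_max
        for n in range(1, n_max + 1):
            total, k = 0, 1
            while k * (3 * k - 1) // 2 <= n:
                sign = 1 if k % 2 else -1
                total += sign * p[n - k * (3 * k - 1) // 2]
                if k * (3 * k + 1) // 2 <= n:
                    total += sign * p[n - k * (3 * k + 1) // 2]
                k += 1
            p[n] = total
        return p

    p = partitions(2000)
    assert all(p[5 * n + 4] % 5 == 0 for n in range(400))  # p(5n+4) = 0 mod 5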
IEEE Transactions on Information Forensics and Security | 2015
Rhiannon Weaver
Mobile Networks and Applications | 2016
William Casey; Rhiannon Weaver; Jose Andre Morales; Evan Wright; Bud Mishra
Cognitive Science | 2008
Rhiannon Weaver
Archive | 2007
Michael P. Collins; Timothy J. Shimeall; Sidney Faber; Jeff Janies; Rhiannon Weaver; Markus De Shon
BICT '14 Proceedings of the 8th International Conference on Bioinspired Information and Communications Technologies | 2014
William Casey; Rhiannon Weaver; Leigh Metcalf; Jose Andre Morales; Evan Wright; Bud Mishra
We present a game-theoretic framework modeling strategic interactions among humans and things, which are assumed to be interconnected by a social-technological network, as in the Internet of Humans and Things (IOHT). Often a pair of agents in the network interact so that an informed sender-agent can signal an uninformed receiver-agent to take an action that benefits each of the players; the benefits to the pair of agents are modeled by two separate utility functions, both depending on the sender's private information, the signal exchanged, and the receiver's revealed (and unrevealed) action. In general, the two agents' utilities may not be aligned and may encourage deceptive behavior. For example, a sender, aware of his own private "state of ignorance," may seek useful information from a receiver who owns powerful computational resources to search a large corpus of webpages; the sender does so by sending the receiver a signal in the form of a keyword. Obvious examples of deceptiveness here range from attempts to hide one's intentions to auctioning the keywords on an ad exchange through real-time bidding. A rather troublesome situation occurs when deceptions are employed to breach the security of the system, thus making the entire social-technological network unreliable. Earlier, we proposed a signaling-game-theoretic framework to alleviate this problem. This paper further enhances it by reconfiguring signals to possess more complex structures (epistatic signals representing attack and defense options over a given set of vulnerabilities). We explore two augmentations to the original evolutionary signaling game: first, by enhancing mutation bias toward strategies that performed well in previous populations, and second, by allowing the parameters of the utility functions to depend on population preferences, giving rise to a minority game with epistatic signaling. The resulting game systems are studied empirically through extensive computer simulation.

1. GAMES AND CYBER CONFLICTS

At the core of many dynamic online strategic interactions are simple information-asymmetric games, which permit the agents to act deceptively to gain advantages. Take, for example, the flashlight app for smartphones that was discovered to open a GPS-tracking backdoor, gaining private information by logging the device's physical locations (discovery reported in [6]). Whereas the producer (i.e., sender) of the flashlight app may advertise (i.e., signal) that the application is designed to provide a flashlight feature for smartphones, the sender creates the deceptive impression of respecting the user's privacy by giving the app a benign-sounding name: "Flashlight App." A typical user's expectations of privacy would proscribe the surveillance capabilities (physically tracking the user's device via GPS) and not foresee encroachment by an app that is prima facie simple, benign, and desirable. In this case (and others like it) a typical consumer (receiver) may recognize that they have been deceived, and may label the producer (sender) as a miscreant and tarnish the producer's reputation with a negative ranking and comments labeling the app as "backdoor," "Trojan," or "malware." Such verification processes are aimed at protecting future consumers. However, the encounter, concluded before the discovery of the attack, has its costs and benefits: the cost to the receiver is the loss of privacy, and the benefit to the sender is the ability to gain strategic informational advantages with unanticipated usages.
In considering signaling games for cyber security to model interactions such as the one above, we envision that security properties such as non-surveillance can be checked, dynamically and efficiently, via two additional mechanisms: namely, (i) a social-technological recommendation-verification system involving additional players, and (ii) a currency system, represented by M-Coins, certificates backing the proofs concerning the behavior of the agents. We also extend the receiver's strategy space by providing it means to challenge the sender. Note that, without proof or certification that the app's behavior complies with reasonable security properties, the receiver is left with the options to either trust the sender or attempt to challenge them. Such challenges may seek their own or otherwise trusted proofs or certificates to let the receiver decide whether the sender is being deceptive. Motivated by biological systems, we provide another extension to allow our recommendation-verification system to address the many distinct attacks that a producer (sender) could use to deceptively ensnare a consumer (receiver). Here, we describe this extension of signaling games to include diverse attack vectors, and we term the extension epistatic signaling games. After defining epistatic signaling games, we present experiments designed to understand their dynamics empirically and how such a system could operate in practice. We relegate a formal description of the system and proofs of its various properties to the full paper.

1.1 Signaling Games in Cyber Security

A signaling game is a dynamic game with two players, the sender (S) and the receiver (R). The sender is assumed to possess a certain type, t ∈ T, which is selected by nature (we think of it as the sender's private information; thus, the sender observes his own type while the receiver does not know the type of the sender). The sender chooses a message α from a set of possible messages M; in epistatic signaling games the message may contain attacks upon any subset of K distinct vulnerabilities, denoted V = {v_1, ..., v_K}, including the empty set, which we term a clean or benign signal. The receiver observes the message but not the type of the sender or the attacks implicitly encoded in the message. Then the receiver chooses an action γ from a set of feasible actions C, which includes challenges to the various attacks: letting c_i be the check for an attack on the i-th vulnerability v_i, the receiver's options are subsets of C (also denoted by the vulnerabilities v_i, i = 1, ..., K, being challenged), with the empty set representing the option of receiving messages with no challenge at all (either a trusting or an insouciant option). The utility functions are given by U_S : T × M × C → R for the sender and U_R : T × M × C → R for the receiver. In the context of cyber security, we always consider the symmetric game with repetitions (as opposed to one-shot), in which both players play both roles. Basic signaling games were introduced by In-Koo Cho and David M. Kreps in a 1987 article [5]. By considering a single attack option and a single checking action, we explored the effects of deceptive agents in cyber security problems via simulations; these simulations reveal a range of outcomes for system behavior over the space of payoff parameters ([3]). Epistatic signaling games differ from the signaling games we originally introduced for cyber security in the following two ways.
First, in those signaling games the strategic options for sender and receiver are limited to a single attack and a single challenge option; a signaling game is a special case of the general epistatic signaling formulation when K = 1. Higher, but bounded, values of K > 1 add realism to the model by constructing the attack surface as K vulnerable objects (vulnerabilities may be considered code objects that can be exploited and attacked by a malicious user). The second way in which this approach differs from traditional signaling games is that we simplify the transitions in strategies for repeated games: we limit the agents to two transitions, based on whether or not a detection event occurred. While this constraint may appear limiting, it is more realistic, since agents are primarily interested in resolving an attack (i.e., a detection event); note particularly that in the case of an undetected attack, the user will not have immediate access to which attack succeeded. We briefly review the strategic options and payoffs of signaling games for cyber security to demonstrate the relation among classical signaling games, the signaling games previously introduced for cyber security, and the epistatic signaling games of this approach.

The strategic options: in signaling games (when K = 1) the sender may select to send cooperatively (C) or to send an attack (D). Similarly, the options for the receiver are to accept trustingly (C) or to challenge (D). We encode all options of the symmetric game using strings where the first letter denotes the sender's type and the second the receiver's action. Using this encoding, the option space for a single round of the symmetric signaling game is the set {CC, CD, DC, DD}.

Game payoff: the payoff matrix for the symmetric signaling game (with K = 1) is then defined over the product of row-player options and column-player options. In this matrix, d is the benefit of an attack for the sender (assumed to be a zero-sum quantity), e is the cost of getting caught attacking as sender, f is the prize for catching an attacker, and g is the cost of challenging a sender as receiver. The payoff matrix is:
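As a hedged sketch, a per-role payoff bimatrix consistent with the definitions above (my reconstruction from the stated parameters d, e, f, g under the zero-sum-attack assumption; the paper's exact entries may differ), with rows the sender's play, columns the receiver's, and entries (sender payoff, receiver payoff):

\begin{array}{c|cc}
 & C & D \\ \hline
C & (0,\; 0) & (0,\; -g) \\
D & (d,\; -d) & (-e,\; f - g)
\end{array}

Under this reading, an unchallenged attack transfers d from receiver to sender; a challenged attack costs the sender e and nets the receiver the prize f less the challenge cost g; and challenging a cooperative sender wastes g. In the symmetric game, each player's per-round payoff is the sum of its sender-role and receiver-role entries.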