Tony Doyle
Hunter College
Publications
Featured research published by Tony Doyle.
Reference Services Review | 2006
Tony Doyle; John L. Hammond
Purpose – The purpose of this paper is to show how web sites can be a valuable research source for students if approached with due caution.
Design/methodology/approach – This article is the product of collaboration between a sociology professor and a librarian. The authors discuss the nature of their collaboration and present their views on web evaluation in the context of an extensive literature review.
Findings – Reputable print sources have numerous mechanisms to help ensure reliability: proven authors and editors, track record, and (sometimes) peer review. Obviously, the vast majority of web sites lack these features. Accordingly, the paper offers a critical look at the standard criteria of web evaluation with illustrations from two sites, one credible, one not.
Originality/value – Healthy skepticism regarding the internet is urged. It is suggested that web evaluation has costs and benefits. The chief benefit of careful web site evaluation is that the process makes it more likely than otherwise that one ...
Ethics and Information Technology | 2014
Tony Doyle; Judy Veranas
We defend public anonymity in the light of the threat posed by digital technology. Once people could reasonably assume that they were fairly anonymous when they left the house. They neither drove nor walked around with GPS devices; they paid their highway tolls in cash; they seldom bought on credit; and no cameras marked their strolls in the park or their walks down the street. Times have changed. We begin with a brief discussion of the concept of anonymity. We then argue that public anonymity helps promote privacy in public. Next, we argue that public anonymity is worth protecting insofar as it promotes autonomy. After that we attempt to show how digital technology threatens public anonymity in the context of CCTV and GPS devices. We argue for a significant scaling back of public surveillance. We close with some thoughts on what we take to be the gratuitous costs imposed on those who would attempt to preserve their anonymity in public.
IFLA Journal | 2018
Tony Doyle
As our digital wake ripples out, big data is standing by to ride it, applying its analytics to make unnerving inferences about our characters, preferences, and future behavior. This paper addresses the challenge that big data presents to privacy. I examine what are perhaps the two most promising attempts to repel big data’s attack on privacy: obfuscation and the “propertization” of personal information. Obfuscation attempts to throw data collectors off our digital trail by confusing or misleading them. Propertization calls for treating personal information as intellectual property and would require that data holders compensate data subjects for any secondary use. I try to show that both defenses largely fail. I conclude that privacy is a lost cause and that we should call off the attempts to defend it from the moral point of view. I close with some thoughts about what this all means for libraries.
The Information Society | 2017
Tony Doyle
Cathy O’Neil’s Weapons of Math Destruction is a timely reminder of the power and perils of predictive algorithms and model-driven decision processes. The book deals in some depth with eight case studies of the abuses she associates with WMDs: “weapons of math destruction.” The cases include the havoc wrought by value-added models used to evaluate teacher performance and by the college ranking system introduced by U.S. News and World Report; the collateral damage of online advertising and models devised to track and monetize “eyeballs”; the abuses associated with the recidivism models used in judicial decisions; the inequities perpetrated by the use of personality tests in hiring decisions; the burdens placed on low-wage workers by algorithm-driven attempts to maximize labor efficiency; the injustices written into models that evaluate creditworthiness; the inequities produced by insurance companies’ risk models; and the potential assault on the democratic process by the use of big data in political campaigns.

As this summary suggests, O’Neil had plenty of examples to choose from when she wrote the book, but since the publication of Weapons of Math Destruction, two more problems associated with model-driven decision procedures have surfaced, making O’Neil’s work even more essential reading. The first, the role played by fake news (much of it circulated on Facebook) in the 2016 election, has led to congressional investigations. The second, the failure of algorithm-governed oversight to recognize and delete gruesome posts on the Facebook Live streaming service, has caused CEO Mark Zuckerberg to announce the addition of 3,000 human screeners to the Facebook staff. While O’Neil’s book may seem too polemical to some readers and too cautious to others, it speaks forcefully to the cultural moment we share.

O’Neil weaves the story of her own credentials and work experience into her analysis because, as she explains, her training as a mathematician and her experience in finance shaped the way she now understands the world. O’Neil earned a PhD in mathematics from Harvard; taught at Barnard College, where her research area was algebraic number theory; and worked for the hedge fund D. E. Shaw, which uses mathematical analysis to guide investment decisions. When the financial crisis of 2008 revealed that even the most sophisticated models were incapable of anticipating risks associated with “black swans” (events whose rarity makes them nearly impossible to predict), O’Neil left the world of corporate finance to join the RiskMetrics Group, where she helped market risk models to financial institutions eager to rehabilitate their image. Ultimately, she became disillusioned with the financial industry’s refusal to take seriously the limitations of risk management models and left RiskMetrics. She rebranded herself a “data scientist” and took a job at Intent Media, where she helped design algorithms that would make big data useful for all kinds of applications. All the while, as O’Neil describes it, she “worried about the separation between technical models and real people, and about the moral repercussions of that separation” (page 48). O’Neil eventually left Intent Media to devote her energies to the concerns she takes up in Weapons of Math Destruction.
Ethics and Information Technology | 2016
Tony Doyle
In her highly influential 2010 book Privacy in Context, Helen Nissenbaum confronted the rising threat to privacy posed by information technology and its increased ability to gather, store, analyze, and distribute personal information. Information gathered for one purpose is routinely used for numerous others or shared with inappropriate third parties. The massive surveillance to which we are now routinely subject affects, for instance, the opportunities we are offered and the price we pay for goods and services. Privacy in Context offered not so much a theory of privacy as a novel conception of the morally appropriate flow of personal information. The norms governing its flow are context-relative, Nissenbaum argued, and violating them is prima facie wrong, putting the burden on transgressors to justify their practice. For instance, although my doctor might legitimately share my health information with relevant specialists without my consent, it would not be appropriate for him to pass the information along to my employer or to drug marketers unbeknownst to me. Quotidian or consequential, our actions are “newly enriched with information” (2010, p. 24), with serious implications for both privacy and autonomy. However, Privacy in Context said little about how data subjects might mitigate or even defeat pervasive digital monitoring. This is where Nissenbaum and her co-author Finn Brunton pick up in the book under review, presenting a lucid discussion and careful justification of obfuscation in a broad range of cases.

The authors define obfuscation as “the deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection” (p. 1). The tactic gums up the works, making the collection of data “more difficult to act on, and therefore less valuable... adding to the cost, trouble, and difficulty of doing the looking” (pp. 46–47). Effective obfuscation can confound adversaries by “transform[ing] something that is open to scrutiny into something ambiguous, full of false leads, mistaken identities, and unmet expectations” (p. 34).

The book canvasses many shades of the practice, both digital and non-digital. Consider the following ingenious pre-digital example. To hoodwink German radar, Allied bombers would litter their paths with chaff, that is, “strips of black paper backed with aluminum foil” (p. 8). This confetti would look for all the world like a gang of planes on the screen. Only a few of the blips would be true positives. But which? Each one is equally plausible, at least for the time being. The planes hide their signals in a brief cacophony. This is time-based obfuscation. Natural selection, not surprisingly, has been conjuring up such trickery from time out of mind. The authors adduce an orb-weaving family of spiders. When cruising their exposed webs, these spiders are vulnerable to attack from wasps. Their canny response is to strew the web with decoys. A false strike by a wasp buys the critters a second to dash for cover.

Of course, the authors’ main concern is digital obfuscation. Take Tor, which promotes anonymous internet searching by masking my requests for a given web page. Once I join the Tor network and agree to allow my machine to function as a relay, my requests are encrypted and come not from my IP address but from that of another “node” in the Tor relay network, hidden in a crowd of requests from other Tor users. The response comes back to me via other nodes, thereby shielding my identity. Not only can snoops not decrypt the message; they also will not see that the ...
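The layered relaying the review describes can be made concrete with a toy example. The following Python sketch illustrates onion-style layered encryption under stated assumptions: it uses the third-party cryptography package, the relay names and the wrap/peel helpers are invented for illustration, and real Tor negotiates ephemeral keys per circuit rather than sharing long-lived keys this way. The sender encrypts a request once per relay, and each relay can strip only its own layer, so no single relay sees both the sender and the plaintext request.

```python
# Toy illustration of onion-style layered encryption (not Tor's actual protocol).
# Requires the third-party "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet

# Each relay holds its own symmetric key (hypothetical setup; real Tor
# negotiates per-circuit keys with each relay).
relay_keys = {name: Fernet.generate_key() for name in ("entry", "middle", "exit")}

def wrap(request: bytes, path: list[str]) -> bytes:
    """Encrypt the request in layers, innermost layer for the last relay."""
    onion = request
    for name in reversed(path):
        onion = Fernet(relay_keys[name]).encrypt(onion)
    return onion

def peel(name: str, onion: bytes) -> bytes:
    """A relay removes only its own layer; inner layers stay opaque to it."""
    return Fernet(relay_keys[name]).decrypt(onion)

path = ["entry", "middle", "exit"]
onion = wrap(b"GET https://example.org/page", path)
for name in path:            # each hop peels exactly one layer
    onion = peel(name, onion)
print(onion)                 # only after the exit relay's peel is the request readable
```

In this sketch the entry relay's layer is outermost, so peeling proceeds in path order and only the exit relay ever recovers the plaintext, mirroring the review's point that each hop sees only ciphertext hidden in a crowd of similar traffic.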
Ethics and Information Technology | 2013
Tony Doyle
Over the last 25 years Anita Allen has established herself as a leading privacy theorist, with several books and numerous articles on the topic. The “unpopular” in Allen’s title refers to the modest paternalism she defends to protect privacy from the considerable digital and social threats that privacy now faces (p. xi). As a liberal, she thinks that the burden is on those who would restrict freedom. She is prepared to accept that burden regarding privacy, arguing that government must both “create strong privacy rights” and be prepared to impose “coercive privacy mandates” on those whose privacy would otherwise be lost or diminished (p. xii). People should not be permitted legally to give up all of the privacy that they might choose to surrender for convenience, savings, or on a whim, since privacy is a primary or foundational good, that is, a good “on which access to many other goods rests” (p. xii). When a good is foundational, government is required not only to protect it but also sometimes to impose it even when the intended beneficiaries might chafe. Compare legal bans on voluntary slavery. Allen is sensitive to the apparent tension between her advocacy of paternalism regarding privacy and the liberal’s conviction both that autonomy is a primary good and that autonomous individuals are better off pursuing their conceptions of the good life without government interference, everything else being equal. Nevertheless, when it comes to privacy, she thinks that some degree of paternalism is “required by” the “robust liberalism” that she defends (p. 197).

Today people routinely give away seemingly innocuous bits of personal information in myriad ways: on social media sites, through tracking software on their smart phones, with credit cards, and so on. As others have stressed, the resulting digital records are in principle everlasting. They can be aggregated and analyzed to paint rich portraits, accessible from a single point. This information can then be shared quickly and widely (see Moor 1997; Nissenbaum 1997, 1998, 2010; Reiman 1995; Tavani 1999). The apparent general indifference to this process alarms Allen. Privacy, she claims, “is too important to be left to chance and taste” (p. 196). Most of us don’t appreciate the risks of leaving crumbs of personal information behind in a digital world. Most people are unaware that this information can now be used to invade privacy in ways that were impossible in pre-digital times. Government needs to “stop the bleeding.” Consider an analogy with the US Food and Drug Administration (FDA). No one has the time or resources to test every drug or item of food that would come onto an unregulated market. Instead the majority of us are happy to let the FDA do the hard work of determining what is safe. So it is with disclosing personal information. Most of us are either too young or too busy to appreciate fully “the risks of data collection, sharing, and storage that come with this strange technology we enjoy” (p. 196).

As a liberal, Allen is wary of outright prohibitions on the surrender of personal information. Instead she favors offering incentives. She takes a clue from Cass Sunstein and Richard Thaler, who in their 2008 book Nudge reject overweening paternalism in favor of policies that motivate people to choose what is likely in their best interests. Don’t ban Twinkies and beef jerky but instead require supermarkets to place wholesome items at eye level and junk down below. Since privacy is a primary good, government should motivate people to husband it, rather than to give it away. Sometimes this assistance may be unbidden or even unwanted, but good liberal democratic governments will ...
Journal of Value Inquiry | 2011
Tony Doyle
Collection Management | 2002
Tony Doyle
Knowledge, Technology & Policy | 2010
Tony Doyle
Journal of Value Inquiry | 2012
Tony Doyle