Janice Y. Tsai
Carnegie Mellon University
Publications
Featured research published by Janice Y. Tsai.
ubiquitous computing | 2010
Eran Toch; Justin Cranshaw; Paul Hankes Drielsma; Janice Y. Tsai; Patrick Gage Kelley; James Springfield; Lorrie Faith Cranor; Jason I. Hong; Norman M. Sadeh
The rapid adoption of location tracking and mobile social networking technologies raises significant privacy challenges. Today our understanding of people's location sharing privacy preferences remains very limited, including how these preferences are impacted by the type of location tracking device or the nature of the locations visited. To address this gap, we deployed Locaccino, a mobile location sharing system, in a four-week-long field study, where we examined the behavior of study participants (n=28) who shared their location with their acquaintances (n=373). Our results show that users appear more comfortable sharing their presence at locations visited by a large and diverse set of people. Our study also indicates that people who visit a larger number of places tend to also be the subject of a greater number of requests for their locations. Over time these same people tend to also evolve more sophisticated privacy preferences, reflected by an increase in time- and location-based restrictions. We conclude by discussing the implications of our findings.
symposium on usable privacy and security | 2009
Michael Benisch; Patrick Gage Kelley; Norman M. Sadeh; Tuomas Sandholm; Janice Y. Tsai; Lorrie Faith Cranor; Paul Hankes Drielsma
A recent trend on the Web is a demand for higher levels of expressiveness in the mechanisms that mediate interactions such as the allocation of resources, matching of peers, or elicitation of opinions. In this paper, we demonstrate the need for greater expressiveness in privacy mechanisms, which control the conditions under which private information is shared on the Web. We begin by adapting our recent theoretical framework for characterizing expressiveness to this domain. By leveraging prior results, we are able to prove that any increase in allowed expressiveness for privacy mechanisms leads to a strict improvement in their efficiency (i.e., the ability of individuals to share information without violating their privacy constraints). We validate these theoretical results with a week-long human subject experiment, where we tracked the locations of 30 subjects. Each day we collected their stated ground truth privacy preferences regarding sharing their locations with different groups of people. Our results confirm that (1) most subjects had relatively complex privacy preferences, and (2) privacy mechanisms with higher levels of expressiveness are significantly more efficient in this domain.
ieee symposium on security and privacy | 2006
Janice Y. Tsai; Serge Egelman
A report of the second annual Symposium on Usable Privacy and Security (SOUPS 2006) held at Carnegie Mellon University (CMU) 12-14 July 2006.
symposium on usable privacy and security | 2009
Eran Toch; Ramprasad Ravichandran; Lorrie Faith Cranor; Paul Hankes Drielsma; Jason I. Hong; Patrick Gage Kelley; Norman M. Sadeh; Janice Y. Tsai
Privacy in location sharing applications is particularly important due to the sensitivity of users’ geographical location. Privacy in most location sharing applications is provided through relatively simple functionality where users have to specify which friends they are willing to share their location with. This approach may not be adequate to fully express users’ privacy requirements. In this short article, we present results obtained with a privacy preserving location sharing application, where users are given the option of specifying location disclosure preferences that reflect recurring scenarios, using attributes such as days of the week, times of the day or specific locations. Our results indicate that, while many users tend to start with relatively simple policies similar to those they could specify using today’s applications, over time they seem to increasingly refine these policies and take advantage of time and location restrictions.
human factors in computing systems | 2018
Joseph 'Jofish' Kaye; Joel E. Fischer; Jason I. Hong; Frank Bentley; Cosmin Munteanu; Alexis Hiniker; Janice Y. Tsai; Tawfiq Ammari
In this panel, we discuss the challenges that are faced by HCI practitioners and researchers as they study how voice assistants (VAs) are used on a daily basis. Voice has become a widespread and commercially viable interaction mechanism with the introduction of VAs such as Amazon's Alexa, Apple's Siri, the Google Assistant, and Microsoft's Cortana. Despite their prevalence, the design of VAs and their embeddedness with other personal technologies and daily routines have yet to be studied in detail. In a roundtable format, we will explore these issues through a number of VA use scenarios that panel members will discuss. Some of the issues that researchers will discuss in this panel include: (1) obtaining VA data and privacy concerns around the processing and storage of user data; (2) the personalization of VAs and the user value derived from this interaction; and (3) the relevant UX work that reflects on the design of VAs.
digital rights management | 2006
Janice Y. Tsai; Lorrie Faith Cranor; Scott Craver
In high-tech businesses ranging from Internet service providers to e-commerce websites and music stores like Apple iTunes, there is considerable potential for collecting personal information about customers, monitoring their usage habits, or even exerting control over their behavior - for example, restricting what can be done with a purchased song. A privacy ceiling is an effective limit to these privacy intrusions, created by the perceived or actual legal liability of possessing too much information or control. As we show in this paper, the risk is not simply that of customer backlash, but liability for a customer's actions, owing to the ability to identify, report, or prevent them from taking those actions. In some cases high-tech businesses have been obligated to divulge their store of personal information or to police their customers at the demand of third parties; this unwanted result derives from the possession of too much information or control for the company's own good. We argue that vicarious infringement liability in particular creates a privacy ceiling, a point beyond which there is no economic incentive to intrude on a user's privacy; and, indeed, there is an incentive to architect one's business so that such intrusions are difficult or impossible.
Information Systems Research | 2011
Janice Y. Tsai; Serge Egelman; Lorrie Faith Cranor; Alessandro Acquisti
Archive | 2009
Janice Y. Tsai; Patrick Gage Kelley; Lorrie Faith Cranor; Norman M. Sadeh
human factors in computing systems | 2009
Serge Egelman; Janice Y. Tsai; Lorrie Faith Cranor; Alessandro Acquisti
WEIS | 2007
Janice Y. Tsai; Serge Egelman; Lorrie Faith Cranor; Alessandro Acquisti