Serge Egelman
Carnegie Mellon University
Publication
Featured research published by Serge Egelman.
Human Factors in Computing Systems | 2008
Serge Egelman; Lorrie Faith Cranor; Jason I. Hong
Many popular web browsers now include active phishing warnings, after previous research showed that passive warnings are often ignored. In this laboratory study we examine the effectiveness of these warnings and investigate if, how, and why they fail users. We simulated a spear phishing attack to expose users to browser warnings. We found that 97% of our sixty participants fell for at least one of the phishing messages that we sent them. However, we also found that when presented with the active warnings, 79% of participants heeded them, which was not the case for the passive warning that we tested, where only one participant heeded the warnings. Using a model from the warning sciences, we analyze how users perceive warning messages and offer suggestions for creating more effective warning messages within the phishing context.
Human Factors in Computing Systems | 2011
Saranga Komanduri; Richard Shay; Patrick Gage Kelley; Michelle L. Mazurek; Lujo Bauer; Nicolas Christin; Lorrie Faith Cranor; Serge Egelman
Text-based passwords are the most common mechanism for authenticating humans to computer systems. To prevent users from picking passwords that are too easy for an adversary to guess, system administrators adopt password-composition policies (e.g., requiring passwords to contain symbols and numbers). Unfortunately, little is known about the relationship between password-composition policies and the strength of the resulting passwords, or about the behavior of users (e.g., writing down passwords) in response to different policies. We present a large-scale study that investigates password strength, user behavior, and user sentiment across four password-composition policies. We characterize the predictability of passwords by calculating their entropy, and find that a number of commonly held beliefs about password composition and strength are inaccurate. We correlate our results with user behavior and sentiment to produce several recommendations for password-composition policies that result in strong passwords without unduly burdening users.
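The entropy calculation mentioned in the abstract can be illustrated with a small sketch. The paper's exact estimator is not reproduced here; the following simply computes the Shannon entropy of an observed password distribution, with a hypothetical sample, as one way to quantify predictability.

import math
from collections import Counter

def empirical_entropy(passwords):
    # Shannon entropy, in bits, of the observed password distribution.
    counts = Counter(passwords)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical sample: a symbols-and-numbers policy can still yield
# predictable, frequently repeated choices.
sample = ["P@ssw0rd1", "P@ssw0rd1", "Summer2011!", "hunter2!!", "P@ssw0rd1"]
print(f"{empirical_entropy(sample):.2f} bits")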
IEEE Symposium on Security and Privacy | 2009
Stuart E. Schechter; A. J. Bernheim Brush; Serge Egelman
All four of the most popular webmail providers (AOL, Google, Microsoft, and Yahoo!) rely on personal questions as the secondary authentication secrets used to reset account passwords. The security of these questions has received limited formal scrutiny, almost all of which predates webmail. We ran a user study to measure the reliability and security of the questions used by all four webmail providers. We asked participants to answer these questions and then asked their acquaintances to guess their answers. Acquaintances with whom participants reported being unwilling to share their webmail passwords were able to guess 17% of their answers. Participants forgot 20% of their own answers within six months. What's more, 13% of answers could be guessed within five attempts by guessing the most popular answers of other participants, though this weakness is partially attributable to the geographic homogeneity of our participant pool.
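The five-guess attack against secret questions lends itself to a short sketch. This is an illustrative assumption rather than the study's analysis code, and for simplicity it does not exclude each participant's own answer from the popularity counts, as a leave-one-out analysis would.

from collections import Counter

def top_k_coverage(answers, k=5):
    # Fraction of answers that fall among the k most popular answers.
    counts = Counter(a.strip().lower() for a in answers)
    top_k = {answer for answer, _ in counts.most_common(k)}
    return sum(a.strip().lower() in top_k for a in answers) / len(answers)

# Hypothetical responses to "What is your favorite pet's name?"
answers = ["Max", "Bella", "max", "Charlie", "Rex", "Bella", "Luna", "Coco"]
print(f"{top_k_coverage(answers):.0%} guessable within 5 attempts")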
Human Factors in Computing Systems | 2009
Serge Egelman; Janice Y. Tsai; Lorrie Faith Cranor; Alessandro Acquisti
Many commerce websites post privacy policies to address Internet shoppers' privacy concerns. However, few users read or understand them. Iconic privacy indicators may make privacy policies more accessible and easier for users to understand: in this paper, we examine whether the timing and placement of online privacy indicators impact Internet users' browsing and purchasing decisions. We conducted a laboratory study where we controlled the placement of privacy information, the timing of its appearance, the privacy level of each website, and the price and items being purchased. We found that the timing of privacy information had a significant impact on how much of a premium users were willing to pay for privacy. We also found that timing had less impact when users were willing to examine multiple websites. Finally, we found that users paid more attention to privacy indicators when purchasing privacy-sensitive items than when purchasing items that raised minimal privacy concerns.
Human Factors in Computing Systems | 2009
Stuart E. Schechter; Serge Egelman; Robert W. Reeder
Backup authentication mechanisms help users who have forgotten their passwords regain access to their accounts, or at least try. Today's systems fall short in meeting both security and reliability requirements. We designed, built, and tested a new backup authentication system that employs a social-authentication mechanism. The system employs trustees previously appointed by the account holder to verify the account holder's identity. We ran three experiments to determine whether the system could (1) reliably authenticate account holders, (2) resist email attacks that target trustees by impersonating account holders, and (3) resist phone-based attacks from individuals close to account holders. Results were encouraging: seventeen of the nineteen participants who made the effort to call trustees authenticated successfully. However, we also found that users must be reminded of who their trustees are. While email-based attacks were largely unsuccessful, stronger countermeasures will be required to counter highly personalized phone-based attacks.
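As a rough illustration of social authentication, here is a generic k-of-n sketch under assumed details (it is not the system described in the paper): trustees appointed in advance each hold a one-time code, and a reset is granted only once enough of them vouch.

import secrets

class SocialRecovery:
    def __init__(self, trustees, threshold=3):
        self.threshold = threshold
        # Hypothetical mechanism: issue each trustee a one-time recovery code.
        self.codes = {t: secrets.token_hex(4) for t in trustees}
        self.vouched = set()

    def vouch(self, trustee, code):
        if self.codes.get(trustee) == code:
            self.vouched.add(trustee)

    def can_reset(self):
        return len(self.vouched) >= self.threshold

recovery = SocialRecovery(["alice", "bob", "carol", "dave"], threshold=3)
recovery.vouch("alice", recovery.codes["alice"])
recovery.vouch("bob", recovery.codes["bob"])
print(recovery.can_reset())  # False until a third trustee vouches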
Human Factors in Computing Systems | 2011
Serge Egelman; Andrew Oates; Shriram Krishnamurthi
We performed a study of Facebook users to examine how they coped with limitations of the Facebook privacy settings interface. Students graduating and joining the workforce create significant problems for all but the most basic privacy settings on social networking websites. We therefore created realistic scenarios exploiting work/play boundaries that required users to specify access control policies that were impossible to express because of limitations of the settings interface. We examined whether users were aware of these problems without being prompted and, once given feedback, what their coping strategies were. Overall, we found that simply alerting participants to potential errors was ineffective, but when choices were also presented, participants introduced significantly fewer errors. Based on our findings, we designed a privacy settings interface based on Venn diagrams, which we validated with a usability study. We conclude that this interface may be more effective than the current privacy settings interface.
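A minimal sketch of the kind of limitation such scenarios exploit, under assumed friend lists (not the study's materials): if an interface only lets a user grant access to whole lists, some desired audiences cannot be expressed as any union of those lists.

from itertools import combinations

lists = {                        # hypothetical friend lists
    "Friends":   {"ana", "ben", "cho", "dee"},
    "Coworkers": {"cho", "dee", "eli"},
}
desired = {"ana", "ben"}         # e.g. friends who are not coworkers

def expressible(desired, lists):
    # Can the desired audience be written as a union of whole lists?
    names = list(lists)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            union = set().union(*(lists[n] for n in combo))
            if union == desired:
                return True
    return False

print(expressible(desired, lists))  # False: the policy cannot be expressed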
Symposium on Usable Privacy and Security | 2006
Julia Gideon; Lorrie Faith Cranor; Serge Egelman; Alessandro Acquisti
While Internet users claim to be concerned about online privacy, their behavior rarely reflects those concerns. In this paper we investigate whether the availability of comparison information about the privacy practices of online merchants affects users' behavior. We conducted our study using Privacy Finder, a privacy-enhanced search engine that displays search results annotated with the privacy policy information of each site. The privacy information is garnered from computer-readable privacy policies found at the respective sites. We asked users to purchase one non-privacy-sensitive item and then one privacy-sensitive item using Privacy Finder, and observed whether the privacy information provided by our search engine impacted users' purchasing decisions (participants' costs were reimbursed in order to separate the effect of privacy policies from that of price). A control group was asked to make the same purchases using a search engine that produced the same results as Privacy Finder but did not display privacy information. We found that while Privacy Finder had some influence on non-privacy-sensitive purchase decisions, it had a more significant impact on privacy-sensitive purchases. The results suggest that when privacy policy comparison information is readily available, individuals may be willing to seek out more privacy-friendly websites, and perhaps even pay a premium for privacy, depending on the nature of the items to be purchased.
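The idea of annotating search results with machine-readable policy information can be approximated in a few lines. This is an assumption, not Privacy Finder's implementation: it only checks whether a site advertises a P3P policy reference file at the specification's well-known location, /w3c/p3p.xml, and labels hypothetical results accordingly.

import urllib.request

def has_p3p_reference(domain, timeout=5):
    # Look for a policy reference file at P3P's well-known location.
    url = f"https://{domain}/w3c/p3p.xml"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

for domain in ["example.com", "example.org"]:   # hypothetical search results
    label = "machine-readable policy found" if has_p3p_reference(domain) else "no policy advertised"
    print(f"{domain}: {label}")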
Electronic Commerce Research and Applications | 2008
Lorrie Faith Cranor; Serge Egelman; Steve Sheng; Aleecia M. McDonald; Abdur Chowdhury
We studied the deployment of computer-readable privacy policies encoded using the standard W3C Platform for Privacy Preferences (P3P) format to inform questions about P3P's usefulness to end users and researchers. We found that P3P adoption is increasing overall and that P3P adoption rates vary greatly across industries. We found that P3P had been deployed on 10% of the sites returned in the top-20 results of typical searches, and on 21% of the sites returned in the top-20 results of e-commerce searches. We examined a set of over 5,000 websites in both 2003 and 2006 and found that P3P deployment among these sites increased over that time period, although we observed decreases in some sectors. In the fall of 2007 we observed 470 new P3P policies created over a two-month period. We found high rates of syntax errors among P3P policies, but much lower rates of critical errors that prevent a P3P user agent from interpreting them. We also found that most P3P policies have discrepancies with their natural language counterparts. Some of these discrepancies can be attributed to ambiguities, while others cause the two policies to have completely different meanings. Finally, we show that the privacy policies of P3P-enabled popular websites are similar to the privacy policies of popular websites that do not use P3P.
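The distinction between syntax errors and critical errors can be sketched with a small check. The element names follow the W3C P3P 1.0 specification, but the policy snippet and the specific checks are illustrative assumptions rather than the study's evaluation tooling.

import xml.etree.ElementTree as ET

P3P_NS = "{http://www.w3.org/2002/01/P3Pv1}"

policy_xml = """<POLICIES xmlns="http://www.w3.org/2002/01/P3Pv1">
  <POLICY name="example" discuri="https://example.com/privacy">
    <STATEMENT>
      <PURPOSE><admin/><develop/></PURPOSE>
      <RECIPIENT><ours/></RECIPIENT>
      <RETENTION><indefinitely/></RETENTION>
      <DATA-GROUP><DATA ref="#dynamic.cookies"/></DATA-GROUP>
    </STATEMENT>
  </POLICY>
</POLICIES>"""

try:
    root = ET.fromstring(policy_xml)
except ET.ParseError as err:
    # Malformed XML keeps a user agent from interpreting the policy at all.
    print("critical error:", err)
else:
    for stmt in root.iter(f"{P3P_NS}STATEMENT"):
        for required in ("PURPOSE", "RECIPIENT", "RETENTION", "DATA-GROUP"):
            if stmt.find(f"{P3P_NS}{required}") is None:
                print("statement missing", required)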
Financial Cryptography | 2011
Nicholas Christin; Serge Egelman; Timothy Vidas; Jens Grossklags
We examine the cost for an attacker to pay users to execute arbitrary code (potentially malware). We asked users at home to download and run an executable we wrote, without being told what it did and without any way of knowing it was harmless. Each week, we increased the payment amount. Our goal was to examine whether users would ignore common security advice (not to run untrusted executables) if there was a direct incentive, and how much this incentive would need to be. We observed that for payments as low as $0.01, 22% of the people who viewed the task ultimately ran our executable. Once increased to
International Conference on Electronic Commerce | 2006
Serge Egelman; Lorrie Faith Cranor; Abdur Chowdhury