I Want My App That Way: Reclaiming Sovereignty Over Personal Devices
Konrad Kollnig∗, Siddhartha Datta∗, Max Van Kleek
{konrad.kollnig,siddhartha.datta,emax}@cs.ox.ac.uk
Department of Computer Science, University of Oxford
Oxford, United Kingdom
ABSTRACT
Dark patterns in mobile apps take advantage of cognitive biases of end-users and can have detrimental effects on people's lives. Despite growing research in identifying remedies for dark patterns and established solutions for desktop browsers, there exists no established methodology to reduce dark patterns in mobile apps. Our work introduces GreaseDroid, a community-driven app modification framework enabling non-expert users to selectively disable dark patterns in apps.
CCS CONCEPTS
• Human-centered computing → Systems and tools for interaction design; Accessibility systems and tools; • Security and privacy → Software reverse engineering.

KEYWORDS
dark patterns, digital distraction, digital self-control, mobile app, program repair
ACM Reference Format:
Konrad Kollnig, Siddhartha Datta, and Max Van Kleek. 2021. I Want My App That Way: Reclaiming Sovereignty Over Personal Devices. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (CHI '21 Extended Abstracts), May 8–13, 2021, Yokohama, Japan. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3411763.3451632
∗Both authors contributed equally to this research.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
CHI '21 Extended Abstracts, May 8–13, 2021, Yokohama, Japan
© 2021 Association for Computing Machinery.
ACM ISBN 978-1-4503-8095-9/21/05...$15.00
https://doi.org/10.1145/3411763.3451632

As our primary interfaces to both the digital world and to one another, mobile phones have become more than indispensable accessories; they are extensions to our very beings that allow us to operate as citizens within a distributed, global society [1]. Yet, the very mobile apps that now enable people to do so much are also being used as leverage by app developers, platforms, and services to exploit end-users, as evidenced by an increasing variety of user-hostile elements situated within services that provide this essential digital functionality [2]. Such exploitation ranges from behavioural design (nudging people to take certain actions, such as making systems addictive), to exploiting users' psychological weaknesses, to features that directly harm the user, such as by undermining their privacy, whether by having them give up their personal information directly or by tracking them behind the scenes [3–5]. Even if apps do not pose direct harm to the majority of users, they often fail to account for the diversity and sensitivities of the most vulnerable groups of users [6].

Unfortunately, the tools for users to change and challenge their essential technological infrastructure are currently limited. Much of the essential software infrastructure is developed and deployed by powerful international tech companies, with limited means for user participation and negotiation in software design. What if users were given the ability to overcome user-hostile features in technology, thereby putting power back into the hands of its users?

This article explores how one specific example of user-hostile design in mobile apps, namely dark patterns, can be alleviated. Dark patterns are carefully-designed interfaces that induce a user to act with the specific intent of the developer, and are widely used in mobile apps. Studies that inspect UI elements in apps indicate a high prevalence of dark patterns in web and software applications, typically covert, deceptive or information-hiding in nature [7–9]. Dark patterns can have detrimental effects on data protection and consumer protection through the systematic exploitation of users' cognitive biases [10–13].
This is particularly harmful to children, the aged, individuals with disabilities [14], and the broader population [15]. An intentionally-designed feature for one user may be considered a prohibitive bug for another.

This paper contributes to reducing the impact of dark patterns in mobile apps:
• Our conceptual contribution is a system for app modification as a way to reduce dark patterns in mobile apps. We believe that a community-driven approach can allow even non-expert users to modify apps in a way that fits their needs.
• Our technical contribution is a working prototype of a user-scripting toolkit that manifests our general contribution to minimize dark patterns. We introduce sample patches with a case study of the Twitter app and show that dark patterns can be easily removed.
Whilst this article evaluates dark patterns,
GreaseDroid is by far not limited to this example of user-hostile design. It can easily be used to modify other aspects of apps, including addictive design patterns and concerning privacy practices.

(a) Original Twitter app. (b) GreaseDroid-patched Twitter application with reduced dark patterns.
Figure 1: GreaseDroid enables the removal of dark patterns (highlighted in red) in Android apps. Compared to the default Twitter app (left), stories and notifications have been disabled to reduce distractions in the patched version of the app (right).
As researchers in HCI, we strive to understand users and craft interfaces and systems for them. Unfortunately, user-focused research can be exploited against our users, through the design of interfaces that abuse cognitive and psychological vulnerabilities. It has been shown that interfaces can change a user's cognitive functioning [16–18]. Maliciously-crafted interfaces can thus induce precise cognitive functions intended by the developer, manifesting as digital addiction, digital persuasion, digital nudging, gamification, data exploitation, and dark patterns [19].

This work focuses on dark patterns, which can be defined as carefully-designed interfaces that induce a user to act with the specific intent of the developer, based on the user's cognitive biases.
Gray et al. provide a five-class taxonomy of dark patterns [20]. (1) Interface Interference elements manipulate the user interface such that certain actions are induced upon the user compared to other possible actions. (2) Nagging elements interrupt the user's current task with out-of-focus tasks, usually in the form of a choice-based popup that redirects the user towards another goal. (3) Forced Action elements forcefully introduce sub-tasks before permitting a user to complete their desired task. (4) Obstruction elements introduce sub-tasks with the intention of dissuading a user from performing an operation in the desired mode. (5) Sneaking elements conceal or delay information relevant to the user in performing a task.

Given the prevalence of such interface patterns, more and more users wish to exercise digital self-control to protect themselves. Despite the increasing body of literature on digital self-control, along with a wide array of anti-distraction apps, many users still struggle to exert meaningful control over their digital device use [3]. It has been shown that interventions against harmful design patterns, such as hiding the Facebook feed in the desktop browser, can be effective at reducing usage time and making users feel more in control of their device use [4]. Yet, the limited means for intervention on mobile devices are a major limitation of current digital self-control research and apps.

There is a long history of tools that try to fix deficiencies in the design of programs on desktops, notably on websites. One of these tools is the browser extension Greasemonkey. This tool lets users choose from a wide array of userscripts to change the design or functionality of a given website to fit their needs. For instance, users can install userscripts to remove the feed from Facebook or increase the readability of Wikipedia.
Technologies similar to those used in Greasemonkey are the backbone of ad blockers, which are widely in use [21].

Greasemonkey and its variants can be considered examples of program repair tools. Program repair methods [22–24] are concerned with the improvement of software and the removal of bugs or errors after deployment. Dark patterns can pose functional hindrances, and could be considered as "bugs". Dark patterns may leverage the cognitive biases of individuals to an extent that exceeds, or was never part of, the developer's intention, hence requiring certain dark patterns to be selectively mitigated.

There exist some solutions for mobile devices to enhance the properties of apps. Methodologically, these solutions either 1) modify the operating system (e.g. Cydia Substrate [25], Xposed Framework [26], ProtectMyPrivacy [27], or TaintDroid [28]), 2) modify apps (e.g. Lucky Patcher [29], apk-mitm [30], Joel et al. [31], DroidForce [32], RetroSkeleton [33], AppGuard [34], I-ARM-Droid [35],
Aurasium [36]), or 3) use System APIs (e.g. VPN-based ad blockers, such as AdGuard [37] and TrackerControl [38]).

All of these solutions come with certain limitations. Whilst modifying the operating system can in principle make arbitrary modifications to the behaviour of apps, such modifications usually rely on device vulnerabilities and are a moving target for device manufacturers. Operating system modifications can pose security risks, potentially void the warranty, and are usually infeasible for non-expert users. By contrast, the use of System APIs might often be the most straightforward approach for a non-expert user, operating in the familiar space of the user's own smartphone. This also poses a major limitation, because only what is permitted by the smartphone operating system can be realised. For instance, the removal of the newsfeed from the Facebook app has not been accomplished through System APIs.

In app modification on Android, some transformation is applied to an app used by the user, e.g. the removal of the newsfeed from the Facebook app. App modification offers ease of use (installing custom apps on Android is supported by the operating system), and allows for arbitrary modifications of a provided app (limited only by the constraints of the operating system). As such, it combines the benefits of system modification and System APIs. Despite this, hardly any solutions using app modification exist in practice. One exception is the app cracking tool Lucky Patcher, which allows users to remove ads and paywalls from apps. Another exception is apk-mitm, which removes certificate pinning from apps and allows users to intercept encrypted network traffic. Some developers have published modified versions of popular apps, including Facebook [39] and YouTube [40] (removing ads and other distracting functions). Unfortunately, such modified apps rely on the continued support of the developers, may break over time (e.g. in case of server-side updates), and exist only for a few select apps.
To date, there exists no tool for non-expert smartphone users to remove dark patterns from their mobile apps. We propose a patching paradigm based upon app modification, called GreaseDroid, that enables user-scripting to reduce dark patterns in mobile apps.
GreaseDroid acts as a tool that contains the assets needed for users to act autonomously in improving apps for their own wellbeing. We retain a high level of abstraction for flexible implementation and additional modules; our implementation in code is covered in section 5. Users go through three main phases (see Figure 3a): a user (1) selects an app, (2) applies a set of patches, and then (3) re-installs the app on their phone. We share our code on GitHub (https://github.com/OxfordHCC/greasedroid).

(1) App selection. In the first step, the user selects the app they want to patch. GreaseDroid shall allow the user to select arbitrary apps in the form of apk files (the standard format for apps on Android). For increased ease of use, GreaseDroid could directly fetch the latest version of the app from Google Play or another third-party app store. This would ensure that the user only ever patches the latest version of an app, and minimise the unnecessary hassle of finding the apk file themselves. Due to potential violations of Google's Terms & Conditions, we have not yet implemented such an approach.

(a) Step 1: User selects an app. (b) Step 2: User selects patches. (c) Step 3: User installs patched app.
Figure 2: Our implementation enables non-expert users to remove dark patterns from apps in three simple steps.

(2) Patching.
In the second phase, the user selects a set of patches to apply to the chosen app. These patches are developed by expert community users that we call patch developers. Once the user has chosen a set of patches, these are applied to the selected app and modify the app code (e.g. smali assembly code and compiled libraries) or resources (e.g. xml layout files, images, sounds).

Patches can be created in at least two ways, either as patch scripts or byte masks. Patch developers can create patch scripts by first decoding the app with apktool. They then sift through assembly code (stored as smali files and compiled libraries) and other resources (e.g. xml layout files, images, sounds) to identify potential code modifications for dark patterns. If the patches are generalizable over a number of apps, they are considered app-agnostic, else app-specific. Our GreaseDroid implementation uses patching based upon such patch scripts. We will explain this method in more detail in the following sections.

(a) General architecture. (b) Technical implementation.
Figure 3: Overview of the GreaseDroid paradigm.

Byte masks might be a powerful alternative to patch scripts. These describe patches to apps that do not require decompilation, and instead directly modify the compiled bytecode of apps based on pattern matching. For instance, colours within an app can be changed by simply searching for their hexadecimal representation in the app binary. Similarly, the existing Lucky Patcher tool replaces byte patterns in apps' compiled code to circumvent piracy protections [41]. Since byte masks do not rely on decompilation, they might be easier to deploy to a client-side patching system on the user's phone and might avoid legal issues related to decompilation. We will not study them in more detail in this work because they are more difficult for patch developers to develop.

Ideally, patches crafted by patch developers should be designed to be executable on the patching device (e.g. a Linux-based patching environment may permit a variety of patching scripts, such as
Perl or bash). Patches should also take measures to be robust, including compatibility with other patches and with updates to the app or operating system.

(3) Re-deployment. After successful app patching, the app needs to be re-signed in order to be installed on the user's device. This re-signing process requires a user-generated certificate to ensure the authenticity of the app on the Android system. It is important that the certificate is unique to the user, to prevent attackers from installing malicious apps on the user's phone. One problem is that signing apps with a user-generated certificate makes it impossible to install app updates directly from the Google Play store. GreaseDroid offers an easy solution, because the whole patching process can be repeated for any app updates.
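For illustration, the byte-mask idea described above can be sketched in a few lines; the byte patterns here are invented for the example, not taken from any real app binary:

```python
# Minimal sketch of a byte-mask patch: modify a compiled app binary in
# place by replacing a known byte pattern, with no decompilation step.
# The patterns below are made-up illustrations, not real app bytecode.

def apply_byte_mask(binary: bytes, pattern: bytes, replacement: bytes) -> bytes:
    """Replace every occurrence of `pattern`; lengths must match so that
    code offsets in the surrounding binary stay valid."""
    if len(pattern) != len(replacement):
        raise ValueError("byte masks must preserve length")
    return binary.replace(pattern, replacement)

# Example: recolour a hard-coded ARGB value (0xFFFF0000, opaque red)
# to a muted grey wherever it appears in the binary.
app = b"\x00\x01" + b"\xff\xff\x00\x00" + b"\x02\x03"
patched = apply_byte_mask(app, b"\xff\xff\x00\x00", b"\xff\x80\x80\x80")
assert patched == b"\x00\x01\xff\x80\x80\x80\x02\x03"
```

The length-preservation check reflects why byte masks are harder to develop than patch scripts: arbitrary insertions would invalidate offsets in the surrounding bytecode.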
GreaseDroid relies on the availability of patches. To populate the patch database, we rely on a network of expert patch developers to craft the patches. Patches can be categorized by their degree of app-specificity (app-agnostic or app-specific patches) and by the kind of dark pattern they block (interface or control flow patches). These classes of patches are explained in the following.

Control flow patches vs. interface patches.
This category of patches is concerned with the types of dark patterns disabled in apps through patching. We refer to Gray et al.'s taxonomy of dark patterns [20]: Interface Interference, Nagging, Forced Action, Obstruction, Sneaking. Each type of dark pattern can be tackled by either control flow patches or interface patches. We expect most proposed patches to be interface patches, specifically targeting Interface Interference dark patterns. The distribution of dark patterns found in apps is skewed towards Interface Interference elements, thus we expect the distribution of patching efforts to tend towards interface patches [8, 42].
Interface patches [Interface Interference]. Some of our patches directly change the UI of an app. Specific instances such as pre-selection or hidden information leverage the attention scope of end-users, so possible patches include the exaggeration of attributes of UI components to draw attention. Another instance, aesthetic manipulation, might draw end-user attention to certain components, so possible patches include removing those components or aesthetically modifying them to be less distracting. We show in Figure 1 the concealment of elements that create distraction for users, such as the Stories function and the notification counter. Interface patches are often straightforward to develop, requiring the modification of the Android xml layout files only; for example, changing the visibility of elements, changing graphical asset colours to draw attention, or rendering UI components transparent or removing them. Such modification is similar to modifying html websites, and only requires expertise with app development on Android.
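A minimal sketch of such an interface patch over a decoded layout file follows; the layout and the view id (`@id/stories_bar`) are hypothetical stand-ins, not the real identifiers used by any particular app:

```python
# Sketch of an interface patch: hide a distracting view in a decoded
# Android layout file by setting android:visibility="gone". Layout and
# view ids are invented for illustration.
import xml.etree.ElementTree as ET

ANDROID = "http://schemas.android.com/apk/res/android"
ET.register_namespace("android", ANDROID)

layout = """<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android">
  <HorizontalScrollView android:id="@id/stories_bar" />
  <ListView android:id="@id/timeline" />
</LinearLayout>"""

root = ET.fromstring(layout)
for view in root.iter():
    if view.get(f"{{{ANDROID}}}id") == "@id/stories_bar":
        # The element stays in the tree but is no longer laid out or drawn.
        view.set(f"{{{ANDROID}}}visibility", "gone")

patched = ET.tostring(root, encoding="unicode")
assert 'visibility="gone"' in patched
```

Setting `visibility="gone"` rather than deleting the element keeps the rest of the layout intact, which makes the patch less likely to break code that looks up the view at runtime.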
Control flow patches [Nagging, Forced Action, Obstruction, Sneaking]. These patches require the modification of program control flow. Patches in this category are more difficult to create. They rely on the modification of high-level assembly code in the smali programming language. It is challenging to modify the obfuscated assembly code, as integrity-preserving decompilation into Java is not guaranteed for closed-source apps. Some dark patterns could be disabled by simply removing a function call to a pop-up function (for nagging). Some dark patterns are built deep into the function of the app, and may require rewriting how functions execute (e.g. removing token mechanisms for gamification). For example, forced actions are in-task interruptions that can only be removed by modifying the control flow logic and refactoring the source code.

Figure 4: Left: An example of an app-agnostic Control Flow patch – a Perl script to prevent apps from accessing the longitude and latitude of the current physical user location. Right: An app-specific Interface patch – a diff script to remove the Stories bar from the Twitter app.
App-agnostic vs. app-specific patches.
This category of patches is concerned with the range of apps that a patch can be applied to. With our provided toolkit, we were able to craft patches for several apps, both (1) app-agnostic patches (patches that can be applied to a wide range of applications) and (2) app-specific patches (patches that are tailored to a specific application) (Figure 4).
To test our proposed system of app patching to alleviate dark patterns, we constructed a prototype that enables end-users to apply pre-defined patches to Android apps of their choice. The following discusses this prototype implementation of our GreaseDroid paradigm, consisting of end-users and patch developers. Figure 2 shows screenshots of our implementation of the three-step patching process. Figure 3b illustrates our reference implementation of this flow in software.

End-users open the GreaseDroid website on their device and select from a pre-selected list of Android apps to patch (Figure 2a). They are shown a list of patches, then choose to apply their selected patches to the app (Figure 2b). When complete, the user is provided a re-signed installation file (Figure 2c) that can be installed on their phone.

Patch developers construct patches in the form of Linux scripts. These patch scripts describe how to modify the code and resources of an app. Each patch script comes with additional metadata, including name, description, author, and further robustness parameters (e.g. apps and app versions supported). The use of Linux scripts for patching has multiple advantages. First, it allows individuals with expertise in various programming languages to participate in user-scripting for GreaseDroid. Second, Linux-based patches can potentially be run on the Android device itself, reducing the need for an external server and increasing ease of use for the end-user. For example, one of our app-agnostic patch scripts applies a regular expression through Perl to remove location access from an app; see Figure 4.

In our GreaseDroid implementation, patching is executed on an external server. Patches can be created using common programming languages (e.g. python, Perl, bash), given the server's pre-requisites. Patch developers insert a formatted patching higher-order function into the main patching library that calls and runs the patch script to patch a designated app. We provide interface patches that require modifications to xml layout and smali code files, so we first decode the apk file with apktool; then, after calling the patch scripts, we run apktool to build a modified apk from source, and re-sign it with user details. We then return a download link for user installation (Figure 2c). To mitigate potential legal issues, we may consider switching from server-side to client-side patching on the user device in future work. To illustrate that
GreaseDroid can successfully remove dark patterns from apps, we consider the Twitter app on Android. In the main screen of the app, we identified two dark patterns that are used to encourage more user interaction; see Figure 1a. First, the top of the Twitter app shows so-called "Fleets". These are tweets disappearing after 24 hours, similar to the "Stories" functionality of Facebook, Instagram and Snapchat. Second, the bottom of the app contains a notification counter that informs users about recent activities of other users. Neither Fleets nor notifications can be fully deactivated within the Twitter app, so some users might want to remove this functionality with GreaseDroid to reduce time spent on Twitter.

We were able to remove the two identified dark patterns from the Twitter app with the help of GreaseDroid, by removing the relevant sections from the xml layout files of the decompiled Twitter app (i.e. from those files that describe the UI on Android). It was straightforward to identify the relevant sections of the xml files with one of the run-time app layout inspection tools from the Google Play Store. Additionally, with some more work, we injected code into the smali code of the decompiled Twitter app to refuse attempts to click the notification button, thereby disabling access to the notification view of the Twitter app.

The study of the Twitter app demonstrates that it can be straightforward to modify UI components of existing apps. Changing the program flow is more challenging than UI changes, because the source code is usually obfuscated, but it remains possible for patch developers with good programming expertise.
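As a rough illustration of an app-agnostic patch script of the kind shown in Figure 4 (left), the following uses Python in place of Perl; the method names mirror the public `android.location.Location` API, but the smali snippet itself is invented:

```python
# Sketch of an app-agnostic patch (cf. Figure 4, left): stub out calls to
# Location#getLatitude()/getLongitude() in decompiled smali so the app
# reads the constant 0.0 instead of the real coordinate. Illustrative only.
import re

smali = """invoke-virtual {v0}, Landroid/location/Location;->getLatitude()D
move-result-wide v1
invoke-virtual {v0}, Landroid/location/Location;->getLongitude()D
move-result-wide v3"""

# Replace each getter call *and* the move-result-wide that consumes its
# return value with a constant load into the same register pair.
patched = re.sub(
    r"invoke-virtual \{\w+\}, Landroid/location/Location;->get(?:Latitude|Longitude)\(\)D\n"
    r"\s*move-result-wide (v\d+)",
    r"const-wide/16 \1, 0x0",
    smali,
)
assert "getLatitude" not in patched and "getLongitude" not in patched
```

Because the pattern only mentions the Android framework's own class names, the same script can in principle be applied to any app that reads the location this way, which is what makes the patch app-agnostic.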
Benefits for app developers.
An ecosystem of patch developers and patch users is not isolated from the original app developers. Indeed, it may bring benefits for them. The development of patches can serve as a feedback loop, providing valuable suggestions for the original app developer. With an active network of patch developers and patch adopters, GreaseDroid can potentially speed up the app developer's development cycle and reduce costs through crowdsourced software engineering efforts. Indeed, the ability to create patches through GreaseDroid might create new financial incentives, wherein app developers reward patch developers, similar to existing bug bounty programs.

An open question is how app developers will perceive increased user patching: whether it will be a moving target leading to an arms race, or whether developers will engage with the community so as to make apps better.
Patch robustness across app updates.
App developers regularly release updates to their apps. This poses a particular challenge in designing patches, because an update may break a patched app. However, the amount of change across app updates will usually be rather small. UI resources in Android apps are usually organised around xml layout files, which are easier to modify than compiled source code, as demonstrated by our case study of the Twitter app. In addition, app-agnostic patches should not suffer from issues with app updates. A focus on UI and app-agnostic patches will help patch compatibility with app updates.
Protection against malicious patches.
While community-driven user-scripting (1) reduces duplicate patch development and increases patch development efficiency, (2) improves patch quality selection (e.g. higher-quality patches being given a higher rating by patch adopters), and (3) increases collaborative opportunities within cliques in the patch developer network (e.g. identifying new patches, fixing bugs), a critical challenge is the threat of malicious patches. Greasemonkey's script market suffered from the existence of a significant number of malicious scripts disguised as benign scripts with abusable vulnerabilities [43]. Though there have been recent developments in automated malware detection [44–50], due to the complexity and heterogeneity of patch scripts, automated approaches might not be sufficiently reliable to mitigate the threat of malicious patches [51–54]. A review system of expert users might be preferable over automated program analysis to reduce malicious scripts. Each patch uploaded to the patch database could be verified by a (group of) peer patch developer(s). To ensure the authenticity of reviews, patch reviewers could cryptographically sign patches after verification. The reliability of patch review by experts in practice may need to be studied further. Protecting users against malicious code is a great challenge in making GreaseDroid available to a wide array of end-users, and will require more consideration and practical experience in future work.
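A sketch of the sign-then-verify flow for reviewed patches follows. The paper envisions asymmetric signatures (e.g. Ed25519); since Python's standard library offers none, HMAC-SHA256 with a shared reviewer key stands in here purely to illustrate the verification step, and the patch content and key are invented:

```python
# Sketch of signed patch review: a reviewer attaches a signature to a patch
# script, and the client verifies it before applying. HMAC-SHA256 (stdlib)
# stands in for the asymmetric signatures a real deployment would use.
import hashlib
import hmac

REVIEWER_KEY = b"demo-reviewer-key"  # placeholder; in practice a private key

def sign_patch(patch_script: bytes, key: bytes = REVIEWER_KEY) -> str:
    return hmac.new(key, patch_script, hashlib.sha256).hexdigest()

def verify_patch(patch_script: bytes, signature: str, key: bytes = REVIEWER_KEY) -> bool:
    # Constant-time comparison avoids leaking signature bytes via timing.
    return hmac.compare_digest(sign_patch(patch_script, key), signature)

patch = b"perl -i -pe 's/showRatingNag/noop/g' classes.smali"  # hypothetical
sig = sign_patch(patch)
assert verify_patch(patch, sig)              # untampered patch verifies
assert not verify_patch(patch + b"x", sig)   # any modification is rejected
```

With real public-key signatures, the verifying client would need only the reviewer's public key, so a compromised client could not forge approvals.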
Legality.
Using GreaseDroid as an end-user might violate the law, particularly the DMCA in the US and the Computer Programs Directive in the EU. Two main issues arise: 1) the distribution of patched apps and 2) the decompilation and disassembly of apps. The decentralised approach of GreaseDroid might help to overcome such legal challenges. Whilst developers previously shared patched apps online, GreaseDroid separates the distribution of patches from the patching of apps. Patches are applied at install-time and on the user's device. There is no need to distribute patched apps. Such private modification of apps might be covered by the "fair use" and "right to repair" principles in the US, because GreaseDroid allows users to remove deficiencies from apps.

Our implementation of patching in GreaseDroid does not rely on decompilation of the program code, but rather on disassembly. This could reduce issues with legislation that bans decompilation. Byte masks – as used in Lucky Patcher – could even remove the need for disassembly, and instead make changes to the app binary directly. Under the EU Computer Programs Directive, decompilation is permitted when "necessary for the use of the computer program by the lawful acquirer in accordance with its intended purpose, including for error correction". Enabling disadvantaged users to use essential technologies, promoting research in the public interest, and complying with fundamental human rights laws, including privacy and democratic rights, may well fall within the intended purpose of many apps.
This work describes GreaseDroid, a framework to reduce dark patterns in mobile apps through app modification, and provides a functional prototype implementation. Using the example of the Twitter app, we illustrate the patch development process and demonstrate the functionality of our method. The successful deployment of community-driven user-scripting to minimize dark patterns in mobile apps can be further pursued with research in digital self-control, app privacy, software law, software engineering practices, and app security and reverse-engineering. Further work should study how our app modification framework can be deployed more widely, particularly mitigating the technical and legal challenges; why users would want to modify their apps; and how effective app modification is at helping users gain more agency in their app use. This work aims to open the debate around what choice users should have over their apps, and whether there should be a "right to fair programs", even if this requires changes to existing legislation. Given that users rely on their apps and currently have limited ways to change them, GreaseDroid introduces means for users to change their apps and negotiate the terms of their apps.
REFERENCES [1] Irina Shklovski, Scott D. Mainwaring, Halla Hrund Skúladóttir, and HöskuldurBorgthorsson. 2014. Leakiness and Creepiness in App Space: Perceptions ofPrivacy and Mobile App Use. In
Proceedings of the 32nd Annual ACM Conferenceon Human Factors in Computing Systems - CHI ’14 (Toronto, Ontario, Canada).ACM Press, New York, NY, United States, 2347–2356. https://doi.org/10.1145/2556288.2557421[2] Bundeskartellamt. 2019. B6-22/16 (Facebook v Bundeskartellamt).[3] Ulrik Lyngs, Kai Lukoff, Petr Slovak, Reuben Binns, Adam Slack, Michael Inzlicht,Max Van Kleek, and Nigel Shadbolt. 2019. Self-Control in Cyberspace: ApplyingDual Systems Theory to a Review of Digital Self-Control Tools. In
Proceedingsof the 2019 CHI Conference on Human Factors in Computing Systems (Glasgow,Scotland Uk) (CHI ’19) . Association for Computing Machinery, New York, NY,United States, 1–18. https://doi.org/10.1145/3290605.3300361[4] Ulrik Lyngs, Kai Lukoff, Petr Slovak, William Seymour, Helena Webb, MarinaJirotka, Jun Zhao, Max Van Kleek, and Nigel Shadbolt. 2020. ’I Just Want to HackMyself to Not Get Distracted’: Evaluating Design Interventions for Self-Controlon Facebook. In
Proceedings of the 2020 CHI Conference on Human Factors inComputing Systems . ACM, Honolulu HI USA, 1–15. https://doi.org/10.1145/3313831.3376672[5] Reuben Binns, Ulrik Lyngs, Max Van Kleek, Jun Zhao, Timothy Libert, and NigelShadbolt. 2018. Third Party Tracking in the Mobile Ecosystem. In
Proceedings ofthe 10th ACM Conference on Web Science - WebSci ’18 (Amsterdam, Netherlands).
Want My App
That
Way: Reclaiming Sovereignty Over Personal Devices CHI ’21 Extended Abstracts, May 8–13, 2021, Yokohama, Japan
ACM Press, New York, NY, USA, 23–31. https://doi.org/10.1145/3201064.3201089
[6] Ruha Benjamin. 2019. Race after Technology: Abolitionist Tools for the New Jim Code. Polity, Cambridge, United Kingdom.
[7] Arunesh Mathur, Gunes Acar, Michael J. Friedman, Elena Lucherini, Jonathan Mayer, Marshini Chetty, and Arvind Narayanan. 2019. Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites. Proceedings of the ACM on Human-Computer Interaction 3, CSCW, Article 81 (Nov. 2019), 32 pages. https://doi.org/10.1145/3359183
[8] Linda Di Geronimo, Larissa Braz, Enrico Fregnan, Fabio Palomba, and Alberto Bacchelli. 2020. UI Dark Patterns and Where to Find Them: A Study on Mobile Applications and User Perception. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI '20). Association for Computing Machinery, New York, NY, USA, 1–14. https://doi.org/10.1145/3313831.3376600
[9] Carol Moser, Sarita Y. Schoenebeck, and Paul Resnick. 2019. Impulse Buying: Design Practices and Consumer Needs. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (Glasgow, Scotland, UK) (CHI '19). Association for Computing Machinery, New York, NY, USA, 1–15. https://doi.org/10.1145/3290605.3300472
[10] Christine Utz, Martin Degeling, Sascha Fahl, Florian Schaub, and Thorsten Holz. 2019. (Un)Informed Consent: Studying GDPR Consent Notices in the Field. In Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security (London, United Kingdom) (CCS '19). Association for Computing Machinery, New York, NY, USA, 973–990. https://doi.org/10.1145/3319535.3354212
[11] Dominique Machuletz and Rainer Böhme. 2020. Multiple Purposes, Multiple Problems: A User Study of Consent Dialogs after GDPR. Proceedings on Privacy Enhancing Technologies
[12] Midas Nouwens, Ilaria Liccardi, Michael Veale, David Karger, and Lalana Kagal. 2020. Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating Their Influence. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI '20). Association for Computing Machinery, New York, NY, USA, 1–13. https://doi.org/10.1145/3313831.3376321
[13] Than Htut Soe, Oda Elise Nordberg, Frode Guribye, and Marija Slavkovik. 2020. Circumvention by design – dark patterns in cookie consents for online news outlets. arXiv:2006.13985 [cs.HC]
[14] Shaomei Wu, Lindsay Reynolds, Xian Li, and Francisco Guzmán. 2019. Design and Evaluation of a Social Media Writing Support Tool for People with Dyslexia. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
Frontiers in Psychology
Computers in Human Behavior 31 (2014), 373–383. https://doi.org/10.1016/j.chb.2013.10.047
[18] B.J. Fogg. 2003.
Persuasive Technology: Using Computers to Change What We Think and Do. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA.
[19] Yvonne Rogers, Paul Dourish, Patrick Olivier, Margot Brereton, and Jodi Forlizzi. 2020. The Dark Side of Interaction Design. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI EA '20). Association for Computing Machinery, New York, NY, USA, 1–4. https://doi.org/10.1145/3334480.3381070
[20] Colin M. Gray, Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L. Toombs. 2018. The Dark (Patterns) Side of UX Design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI '18)
Empirical Software Engineering 22, 4 (2016), 1936–1964. https://doi.org/10.1007/s10664-016-9470-4
[23] He Ye, Matias Martinez, and Martin Monperrus. 2019. Automated Patch Assessment for Program Repair at Scale. arXiv:1909.13694 [cs.SE]
[24] Edward K. Smith, Earl T. Barr, Claire Le Goues, and Yuriy Brun. 2015. Is the Cure Worse than the Disease? Overfitting in Automated Program Repair. In Proceedings of the 2015 10th Joint Meeting on Foundations of Software Engineering (Bergamo, Italy) (ESEC/FSE 2015)
Proceedings of the 11th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys '13). ACM Press, Taipei, Taiwan, 97. https://doi.org/10.1145/2462456.2464460
[28] William Enck, Peter Gilbert, Byung-Gon Chun, Landon P. Cox, Jaeyeon Jung, Patrick McDaniel, and Anmol N. Sheth. 2010. TaintDroid: An Information-Flow Tracking System for Realtime Privacy Monitoring on Smartphones. In Proceedings of the 9th USENIX Conference on Operating Systems Design and Implementation (OSDI '10)
Proceedings of the Second ACM Workshop on Security and Privacy in Smartphones and Mobile Devices (SPSM '12). ACM Press, Raleigh, North Carolina, USA, 3. https://doi.org/10.1145/2381934.2381938
[32] Siegfried Rasthofer, Steven Arzt, Enrico Lovat, and Eric Bodden. 2014. DroidForce: Enforcing Complex, Data-Centric, System-Wide Policies in Android. In 2014 Ninth International Conference on Availability, Reliability and Security (ARES). IEEE, Fribourg, Switzerland, 40–49. https://doi.org/10.1109/ARES.2014.13
[33] Benjamin Davis and Hao Chen. 2013. RetroSkeleton: Retrofitting Android Apps. In Proceedings of the 11th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys '13). ACM Press, Taipei, Taiwan, 181. https://doi.org/10.1145/2462456.2464462
[34] Michael Backes, Sebastian Gerling, Christian Hammer, Matteo Maffei, and Philipp von Styp-Rekowsky. 2014. AppGuard – Fine-Grained Policy Enforcement for Untrusted Android Applications. In Data Privacy Management and Autonomous Spontaneous Security, Joaquin Garcia-Alfaro, Georgios Lioudakis, Nora Cuppens-Boulahia, Simon Foley, and William M. Fitzgerald (Eds.). Lecture Notes in Computer Science, Vol. 8247. Springer Berlin Heidelberg, Berlin, Heidelberg, 213–231. https://doi.org/10.1007/978-3-642-54568-9_14
[35] Benjamin Davis, Ben Sanders, Armen Khodaverdian, and Hao Chen. 2012. I-ARM-Droid: A rewriting framework for in-app reference monitors for android applications. In Proceedings of the Mobile Security Technologies 2012 (MoST '12). IEEE, New York, NY, USA, 1–9.
[36] Rubin Xu, Hassen Saïdi, and Ross Anderson. 2012. Aurasium: Practical Policy Enforcement for Android Applications. In Proceedings of the 21st USENIX Security Symposium (USENIX Security '12). USENIX Association, Bellevue, WA, USA.
International Symposium on Ambient Intelligence and Embedded Systems. Springer, Heraklion, Crete, Greece, 1–6. http://amies-2016.international-symposium.org/proceedings_2016/Kannengiesser_Neutze_Baumgarten_Song_AmiEs_2016_Paper.pdf
[42] Madison Fansher, Shruthi Sai Chivukula, and Colin M. Gray. 2018. #darkpatterns: UX Practitioner Conversations About Ethical Design. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI EA '18). Association for Computing Machinery, New York, NY, USA, 1–6. https://doi.org/10.1145/3170427.3188553
[43] Steven Van Acker, Nick Nikiforakis, Lieven Desmet, Frank Piessens, and Wouter Joosen. 2014. Monkey-in-the-Browser: Malware and Vulnerabilities in Augmented Browsing Script Markets. In Proceedings of the 9th ACM Symposium on Information, Computer and Communications Security (Kyoto, Japan) (ASIA CCS '14). Association for Computing Machinery, New York, NY, USA, 525–530. https://doi.org/10.1145/2590296.2590311
[44] Lingwei Chen, Shifu Hou, and Yanfang Ye. 2017. SecureDroid: Enhancing Security of Machine Learning-Based Detection against Adversarial Android Malware Attacks. In Proceedings of the 33rd Annual Computer Security Applications Conference (Orlando, FL, USA) (ACSAC 2017). Association for Computing Machinery, New York, NY, USA, 362–372. https://doi.org/10.1145/3134600.3134636
[45] Wei Yang, Deguang Kong, Tao Xie, and Carl A. Gunter. 2017. Malware Detection in Adversarial Settings: Exploiting Feature Evolutions and Confusions in Android Apps. In Proceedings of the 33rd Annual Computer Security Applications Conference (Orlando, FL, USA) (ACSAC 2017). Association for Computing Machinery, New York, NY, USA, 288–302. https://doi.org/10.1145/3134600.3134642
[46] ElMouatez Billah Karbab, Mourad Debbabi, Abdelouahid Derhab, and Djedjiga Mouheb. 2018. MalDozer: Automatic framework for android malware detection using deep learning. Digital Investigation 24 (2018), S48–S59. https://doi.org/10.1016/j.diin.2018.01.007
[47] K. Xu, Yingjiu Li, R. Deng, and K. Chen. 2018. DeepRefiner: Multi-layer Android Malware Detection System Applying Deep Neural Networks. In 2018 IEEE European Symposium on Security and Privacy (EuroS&P). IEEE, 473–487. https://doi.org/10.1109/EuroSP.2018.00040
[48] Xin Su, Dafang Zhang, Wenjia Li, and Kai Zhao. 2016. A Deep Learning Approach to Android Malware Feature Learning and Detection. In 2016 IEEE Trustcom/BigDataSE/ISPA. IEEE, Tianjin, China, 244–251. https://doi.org/10.1109/TrustCom.2016.0070
[49] Hongliang Liang, Yan Song, and Da Xiao. 2017. An end-to-end model for Android malware detection. In 2017 IEEE International Conference on Intelligence and Security Informatics (ISI). IEEE, Beijing, 140–142. https://doi.org/10.1109/ISI.2017.8004891
[50] Shifu Hou, Aaron Saas, Lifei Chen, and Yanfang Ye. 2016. Deep4MalDroid: A Deep Learning Framework for Android Malware Detection Based on Linux Kernel System Call Graphs. In 2016 IEEE/WIC/ACM International Conference on Web Intelligence Workshops (WIW). IEEE, Omaha, NE, USA, 104–111. https://doi.org/10.1109/WIW.2016.040
[51] Kathrin Grosse, Nicolas Papernot, Praveen Manoharan, Michael Backes, and Patrick McDaniel. 2017. Adversarial examples for malware detection. In Computer Security – ESORICS 2017: 22nd European Symposium on Research in Computer Security, Proceedings (Lecture Notes in Computer Science), Simon N. Foley, Dieter Gollmann, and Einar Snekkenes (Eds.). Springer Verlag, Germany, 62–79. https://doi.org/10.1007/978-3-319-66399-9_4
[52] Sen Chen, Minhui Xue, Lingling Fan, Shuang Hao, Lihua Xu, Haojin Zhu, and Bo Li. 2018. Automated poisoning attacks and defenses in malware detection systems: An adversarial machine learning approach. Computers & Security
IEEE Trans. Dependable Secur. Comput. 16, 4 (July 2019), 711–724. https://doi.org/10.1109/TDSC.2017.2700270
[54] Xiao Chen, Chaoran Li, Derui Wang, Sheng Wen, Jun Zhang, Surya Nepal, Yang Xiang, and Kui Ren. 2020. Android HIV: A Study of Repackaging Malware for Evading Machine-Learning Detection.