A study on users' privacy perception with smart devices
Alan Ferrari and Silvia Giordano
Institute for Information Systems and Networking
University of Applied Sciences of Southern Switzerland (SUPSI)
Manno, Switzerland
fi[email protected]
5, 2018
Abstract
Nowadays, privacy has become a very serious issue on smart and mobile platforms. Users tend to allow intrusive apps access to much sensitive information without really knowing the potential threats. Several solutions (e.g., the GDPR) have been proposed to address this issue. Our claim is that users are currently not sufficiently involved in this process to be able to benefit from such solutions. To investigate this, we developed an application that provides a form of awareness to users and asked them to answer a set of questions. Our conclusion is that users must be better informed of the risks and value of their personal information.
In the past few years, smartphone applications have contributed greatly to the quality of the user experience on smart devices, giving users access to a huge set of functionalities, one that grows even larger when the user's personal information is used. At the same time, however, many applications fail to guarantee any kind of user privacy; this is especially true for Android smartphones, where the application market is self-regulated.

The mechanism that Android offers to increase user privacy is the use of system permissions: every time an app is downloaded from the market and the installation process is started, Android checks the app's permissions and then asks the user whether she is willing to allow the app to use that set of permissions. If the user does not agree, the installation process is interrupted. This system allows users to know what information is used and which entity processes it, but it does not indicate when and how the information is used inside an application. For instance, if an application offers a game that uses sensor data and secretly records it, the sensor data could be used for malicious purposes without the user's knowledge.

Novel regulations (i.e., the GDPR) require that the user be informed about the data access and collection performed by the app. However, it is our claim that users are not informed enough to decide whether a data access in an app is a potential threat or not.

With this work, we measure the impact on users of an awareness campaign about the potential risks of the apps they have installed on their own smart devices. We developed an Android application that monitors the behavior of the other apps installed on the user's device. The app also provides two distinct interactions with the device owner:

• It informs the owner about a potential risk a given app poses considering a specific permission.

• It asks the user whether he/she knows those risks and (in case he/she did not) whether his/her perception of the app has changed.
Figure 1: Screenshots of the two key windows of the application. (a) App Main Window: it allows the user to configure the app based on her personal needs; a delete button allows the user to delete the app and also erase all her data remotely in order to maintain full anonymization. (b) Question Window: it shows the user a set of information about the behavior of an app and asks the user a set of questions.

We perform the first round of experiments with a selected group of 17 users in order to obtain a first idea of the data we will be able to collect on a larger scale.
Privacy has been a hot topic in several domains and has been approached in different ways. In the context of smart and mobile devices, several key works have been proposed during the past years. Langheinrich [1] defines a set of challenges that any ubiquitous/pervasive application must satisfy in order to be privacy-safe. They can be summarized in the following list:

• Principle of Openness, or simply Notice: users should be aware of the nature of the data shared.

• Choice and Consent: users can choose to offer that information to the requester, and they have to give explicit consent.

• Anonymity and Pseudonymity: anonymization can be defined as the state of not being identifiable within a set of subjects.

• Proximity and Locality: in essence, it expresses the fact that information must not be disseminated indefinitely, not even across a larger geographic boundary.

• Adequate Security: network and disk security are fundamental.

• Access and Recourse: provide mechanisms for access and regulation, and eventually also penalties if someone breaks the rules.

Notable work on smart devices has been performed by Enck et al., who in [2] provide TaintDroid, a tool to track and monitor sensitive information inside an application. The idea is to mark each point where sensitive information is used inside the application and to track such points at runtime. It is shown that there exist 68 instances of information misuse in 20 out of 30 popular applications downloaded from the market, thus confirming the need for efficient and easily deployable privacy-preserving techniques.

A possible solution to this problem has been proposed by Ferrari et al. in [3], where the authors propose Mockingbird. Mocking is a traditional technique in software testing; its main goal is to mimic the real object's behavior in a controllable way.
Recently, mocking techniques have been used in mobile environments to increase user privacy; their goal is to allow users to select the kind of information they want to pass to the application (real or randomly generated). The Mockingbird framework is a mocking solution that uses recorded context traces instead of randomly generated data, which is easily detected by applications. It also provides a flexible methodology to mock an Android application that does not require any changes at the operating-system level or at the virtual-machine level. Mockingbird is a very promising solution; we are currently testing its performance and extending its functionality.

Many of those principles are nowadays included in the law; for instance, the European data protection law (GDPR) enforces the concepts of "Privacy by Design" and "Privacy by Default" in its regulation. Privacy by Design means data protection through technology design: data protection techniques are integrated into the data processing itself, with the final goal of minimizing privacy risks through technical and governance controls. Privacy by Default means that when a system or service includes choices for the individual on how much personal data he/she shares with others, the default settings should be the most privacy-friendly ones.

Even though a valid set of legal tools is nowadays available to the user, it is our claim that the lack of knowledge of the potential risks related to data access on mobile smart devices makes them less effective. To this extent, we provide the following study, whose final goal is to demonstrate that users must be better informed. Our goal is to study the privacy perception on the devices owned by a group of selected users.
To this extent, we developed an Android application that measures two key elements:

• The "privacy level" of the apps installed on the user's device: we collect the permissions granted to each app by the user and classify them according to their level of risk.

• The level of user awareness: periodically we show the user a potential risk connected to the data access of a given app (if there is any) and ask whether he/she is aware of it and whether, with this information, his/her perception of the app (in terms of safety) has changed.
The app is built for the Android operating system and is composed of three key entities:

• Background Service: it collects the following information:

– the list of the applications installed on the device and the permissions allowed by the user;

– the applications in execution in a specific time interval;

– the system status (CPU/memory) and context information (location and user's activity); it also triggers the requests for user input.

• Main Window (Figure 1a): it allows the user to configure the system settings. In other words, the user is able to decide whether to collect information or not, and at what times he/she is willing to answer our questions. This window also allows the user to delete the application and all the information we store about him/her in a privacy-safe manner.

• Awareness Window (Figure 1b): periodically the user receives information about the behavior of one of the applications installed on his/her phone and, afterward, is asked to answer a question about his/her perception of that application.

All the information is anonymized, and the user is only identifiable by his/her unique device id. The delete procedure is also secure, because we allow the user to delete all his/her information from within the app, thus without making any connection between user and user id. During the first run, we ask the users for their profile in the form of age range, gender, and IT knowledge.

Figure 2: Distribution of the number of permissions per app classified as dangerous.

To make the user aware, we focus our attention on the permissions the user grants to the applications on his/her phone. To define the risk of a permission we use the categorization provided by the Android developers team, where a pre-defined set of permissions is labeled as "Protection level: dangerous".
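The classification step above can be sketched as follows. This is a minimal illustration, not the app's actual code: the small permission set below is only an illustrative subset of the permissions Android labels as dangerous, and in the real app the granted-permission list would come from the Background Service rather than a hard-coded list.

```python
# Illustrative subset of Android permissions with protection level "dangerous".
# The full list is maintained by the Android developers team.
DANGEROUS_PERMISSIONS = {
    "android.permission.READ_CONTACTS",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_SMS",
    "android.permission.CAMERA",
}

def dangerous_count(granted_permissions):
    """Count how many of an app's granted permissions are dangerous."""
    return sum(1 for p in granted_permissions if p in DANGEROUS_PERMISSIONS)

# Hypothetical permission list for one monitored app.
app_permissions = [
    "android.permission.INTERNET",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
]
print(dangerous_count(app_permissions))  # -> 2
```

Per-app counts computed this way are what the distribution in Figure 2 aggregates.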
In order to raise the user's awareness, we show them the application, the permissions it accesses, and a text that describes the overall goal of each permission and its potential risks. To measure the impact of this awareness, every time the user sees the awareness notification we ask two key questions:

• Do you know that this app accesses this information?

• In case you did not know: has your perception of the app changed?

These questions allow us to measure whether the user is aware of the app's dangerous data accesses (perhaps because the app needs that information) and, with the help of the second question, to measure the impact of our awareness campaign.
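The aggregation behind the two questions can be sketched as follows. The representation is an assumption of ours: each answer is modeled as a pair (knew_access, perception_changed), where the second element is only meaningful when the user did not know about the access; the sample answers are hypothetical.

```python
def aggregate(answers):
    """Aggregate (knew_access, perception_changed) answer pairs.

    Returns the share of answers where the user already knew about the
    access, and, among the users who did not know, the share whose
    perception of the app changed.
    """
    aware = [a for a in answers if a[0]]
    unaware = [a for a in answers if not a[0]]
    changed = [a for a in unaware if a[1]]
    return {
        "aware_pct": 100.0 * len(aware) / len(answers),
        "changed_pct": 100.0 * len(changed) / len(unaware) if unaware else 0.0,
    }

# Hypothetical answers: perception_changed is None when the user already knew.
answers = [(True, None), (False, True), (False, True), (False, False), (True, None)]
result = aggregate(answers)
print(result)
```

The two percentages computed here correspond to the two panels of Figure 3.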
Our claim is that users are not sufficiently aware of the behavior of their applications. Users pay very little attention to the data requests made by apps; many reasons are behind this behavior, but we believe that the lack of knowledge of the risks is one of the key elements. To prove our claim, we asked 17 subjects to use our app and to answer as many questions as they could. The user profiles are shown in Table 1; we see that they span different ages, genders, and levels of IT knowledge, making the set we collected heterogeneous.

The next analysis we performed concerns the dangerous permissions required by the apps installed on the users' phones. The distribution is shown in Figure 2. We clearly see that the majority of the apps require between 0 and 7 dangerous permissions, with a mean of 4. However, there is a group of outliers that require an astonishing number of dangerous permissions, up to 16. (The list of dangerous permissions is available at https://developer.android.com/reference/android/Manifest.permission.html.) The key outcome of this first analysis is that the privacy of users' sensitive information may be endangered by the majority of the applications they have installed on their mobile devices.

Figure 3: Aggregate outcome of the users' answers. (a) Answers in which the user is aware of the app's access: Aware 41%, Not Aware 59%. (b) Changes in perception after the awareness phase: Changed 81%.

As said before, to determine whether the users are aware of the potential risks of the apps they have installed on their phones, we provide awareness in the form of a notification of a potential risk in a given application when we notice the application is under execution. The notification contains the app name, the permission it accesses, and a list of the potential risks connected to this access.
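The per-app statistics reported for Figure 2 can be reproduced with a short summary routine. The counts below are hypothetical stand-ins for the collected data, not our measurements; the 0-7 cutoff mirrors the bulk of the distribution described above.

```python
def summarize(counts):
    """Summarize dangerous-permission counts per app: mean, worst case,
    and the outliers beyond the 0-7 bulk of the distribution."""
    mean = sum(counts) / len(counts)
    outliers = [c for c in counts if c > 7]  # apps outside the 0-7 bulk
    return mean, max(counts), outliers

# Hypothetical per-app dangerous-permission counts.
counts = [0, 2, 3, 4, 4, 5, 6, 7, 16]
mean, worst, outliers = summarize(counts)
print(round(mean, 1), worst, outliers)  # -> 5.2 16 [16]
```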
Table 1: User profile categorization.

Age:           18-30: 7    31-40: 8    41-50: 4
Gender:        Female: 6   Male: 11
IT knowledge:  Advanced: 11   Some: 5   None: 3

To measure the impact of the awareness, we asked the users to reply to two key questions: whether he/she was aware of the app's dangerous accesses and, in case he/she was not aware, whether his/her perception of the app has changed. We clearly see that in the majority of the cases (59%) the user has no idea of the risks connected to the app.

Another important measure is to determine whether the awareness has an impact on the users; to do so, we asked the users whether their perception has changed in the case of a successful awareness notification. The outcome is presented in Figure 3; there, the results show that when the user is well informed, he/she is able to recognize the potential threats in the app and therefore changes his/her perception of the app.

With this work, we provide a first study on the perception of privacy among users of mobile smart devices. We developed an app that studies the permissions of the other apps installed on a smart device; if those apps access data that may lead to privacy damage, we informed the users through our awareness campaign. To measure the impact of the awareness, we asked the users two questions that let us understand whether they are aware of the risks connected to the apps' data accesses and, in case they are not, whether their perception of the app has changed. Currently, we performed an experiment with 17 users to get a first idea of the data we may be able to collect.
The users were selected to be heterogeneous, and from the results we clearly see that better awareness must be provided to the users before offering technological and legal solutions. As future work, we plan to extend our analysis to more users and also to study the relationship between user profile and the impact of the awareness, by collecting information about changes in permission grants and installs/uninstalls of applications on the users' devices.
References

[1] Marc Langheinrich. Privacy by design: principles of privacy-aware ubiquitous systems. In International Conference on Ubiquitous Computing, pages 273-291. Springer, 2001.

[2] William Enck, Peter Gilbert, Seungyeop Han, Vasant Tendulkar, Byung-Gon Chun, Landon P. Cox, Jaeyeon Jung, Patrick McDaniel, and Anmol N. Sheth. TaintDroid: an information-flow tracking system for realtime privacy monitoring on smartphones. ACM Transactions on Computer Systems (TOCS), 32(2):5, 2014.

[3] Alan Ferrari, Daniele Puccinelli, and Silvia Giordano. Managing your privacy in mobile applications with Mockingbird. In