Designing for Pragmatists and Fundamentalists: Privacy Concerns and Attitudes on the Internet of Things
Lesandro Ponciano, Pedro Barbosa, Francisco Brasileiro, Andrey Brito, Nazareno Andrade
Lesandro Ponciano
Pontifical Catholic University of Minas Gerais, Belo Horizonte, Brazil. [email protected]
Pedro Barbosa
Federal University of Campina Grande, Campina Grande, Brazil. [email protected]
Francisco Brasileiro
Federal University of Campina Grande, Campina Grande, Brazil. [email protected]
Andrey Brito
Federal University of Campina Grande, Campina Grande, Brazil. [email protected]
Nazareno Andrade
Federal University of Campina Grande, Campina Grande, Brazil. [email protected]
ABSTRACT
Internet of Things (IoT) systems have aroused enthusiasm and concerns. Enthusiasm comes from their utility in people's daily lives, and concerns may be associated with privacy issues. Using two IoT systems as case studies, we examine users' privacy beliefs, concerns, and attitudes. We focus on four major dimensions: the collection of personal data, the inference of new information, the exchange of information with third parties, and the risk-utility trade-off posed by the features. Altogether, 113 Brazilian individuals answered a survey about these dimensions. Although their perceptions seem to be dependent on the context, there are recurrent patterns. Our results suggest that IoT users can be classified into unconcerned, fundamentalists, and pragmatists. Most of them exhibit a pragmatist profile and believe in privacy as a right guaranteed by law. One of the most privacy-concerning aspects of IoT is the exchange of personal information with third parties. Also, individuals' perceived risk tends to be negatively correlated with the perceived utility of the features of the system. We discuss practical implications of these results and suggest heuristics to cope with privacy concerns when designing IoT systems.
ACM Classification Keywords
K.4.1 COMPUTERS AND SOCIETY: Public Policy Issues - Privacy; H.1.2 MODELS AND PRINCIPLES: User/Machine Systems - Human factors
Author Keywords
Privacy perceptions, Concerns about privacy, Internet of Things, Information Boundary, Face Keeping.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. IHC'17, Brazilian Symposium on Human Factors in Computing Systems. October 23-27, 2017, Joinville, SC, Brazil. Copyright 2017 SBC. ISBN XXX-XX-XXXX-XXX-X (pendrive).
INTRODUCTION
The Internet of Things (IoT) is composed of devices that people use in their daily life. Besides traditional desktops, many other devices, such as vehicles, buildings, and home appliances, are on the network [11]. The devices are embedded with network connectivity that enables them to collect and exchange data [10]. Systems based on such a network can aggregate data collected by various devices. The collected data may allow the system to derive a unique identifier for each user and to monitor data such as where users have been, with whom they have been, and what actions they have taken. Based on machine learning and data mining algorithms, systems can infer information that was not directly collected from the users and predict their behaviors, preferences, and choices [17, 26]. Such information can be used to provide features to them. This sort of system has aroused enthusiasm and concerns. Enthusiasm comes from their potential utility in people's daily lives, and concerns are mainly associated with their potential threats to people's privacy.

There is no simple and widely accepted definition of privacy [3, 24, 35]. Different notions of privacy have been objects of analysis in studies from various disciplines, such as psychology, sociology, and computer science. Even in computer science, privacy has been approached from a variety of perspectives depending on the type of system, such as online social networks [2, 8, 19], mobile apps [7, 10, 13, 34], healthcare applications [6, 29], and so on. In the context of information systems, one definition of privacy is the individual's control over the collection, disclosure, and use of her/his personal information [1, 7, 20]. If such control does not exist or is compromised, the privacy of the individuals is not being guaranteed [7, 20, 37]. When they perceive that the system invades their privacy or that it is not developed to prevent violations by third parties, users associate a risk with the use of the system. Faced with this risk, users may decide not to engage in the system or to restrict their level of engagement [7, 18, 22]. To successfully take into account the privacy of their users, the design of IoT systems must adequately cope with users' concerns about privacy.

Naturally, different people may not perceive privacy and react to issues in the same way. Individuals can be broadly categorized into profiles according to their levels of privacy concerns, their attitudes, and what the system can do using their information [3, 17, 25, 36]. One widely used categorization is based on three privacy attitude profiles: privacy unconcerned users, privacy pragmatist users, and privacy fundamentalist users [17, 36]. The unconcerned privacy attitude profile consists of individuals who are inclined to share information about themselves, regardless of actions taken by the system. Individuals who exhibit a pragmatist privacy attitude profile condition the data provision on actions taken by the system, such as adopting a privacy protection mechanism. Finally, individuals who exhibit a fundamentalist privacy attitude profile are not inclined to share information about themselves. Little is known so far about the occurrence of these profiles in IoT systems and how to develop IoT systems that meet their privacy needs.

We sought to investigate in depth people's privacy beliefs, concerns, and attitudes toward IoT systems.
To do so, we address the following research questions: 1) what are the occurrence and characteristics of fundamentalist, pragmatist, and unconcerned users in IoT systems; 2) which components of IoT systems can cause more concern in terms of privacy; 3) how do users perceive the risk-utility trade-off posed by features of IoT systems. To answer these questions, we surveyed 113 potential users of 2 IoT systems. The study was conducted with students and technology professionals in Brazil. We classified the users into the three privacy attitude profiles: unconcerned, fundamentalists, and pragmatists. Based on the classification, we analyze privacy beliefs, concerns, and perceptions.

Although several of the results seem to depend on the context of each system, there are some patterns across systems that can be highlighted. We find that most of the individuals exhibit a pragmatist privacy attitude profile. They also tend to believe in privacy as a right guaranteed by law. Regardless of the privacy profile, the most privacy-concerning aspect of IoT systems tends to be the exchange of information with third parties, showing average levels of concern higher than those of data collection and information inference. Regarding the risk-utility trade-offs posed by the systems, the perceived risk tends to be negatively associated with the perceived utility: the higher the perceived utility of a feature of the system, the lower the perceived risk in that feature.

We discuss several practical implications of these results and suggest heuristics to cope with privacy concerns in the design of IoT systems. In summary, we suggest that system designers 1) let users know what information the system has about them; 2) make clear the usefulness of the data for each feature; 3) make the exchange of data with third parties configurable; and 4) conduct empirical assessments of privacy to deal with cases that are system dependent.

The rest of this work is organized as follows. We first provide a background of key concepts related to privacy and discuss relevant previous work. Next, we discuss our approach to characterize privacy beliefs, concerns, and attitudes. It is followed by the evaluation study in two IoT systems; we then discuss the implications of the results and propose heuristics to cope with privacy concerns in the design of IoT systems. Finally, we summarize the conclusions of the study.

BACKGROUND AND RELATED WORK
In this section, we first briefly review relevant concepts of privacy. After that, we detail two theoretical constructs that explain people's privacy concerns and attitudes. Finally, we analyze what is known about privacy concerns and attitudes in the context of IoT systems.
What is privacy?
From a legislation viewpoint, considering the laws of the United States of America, privacy can be defined as "the right of an individual to be let alone" [30]. People, in turn, usually associate the word privacy with a diversity of meanings. Some people believe that privacy is the right to control what information about them may be made public [3, 24, 35]. Other people believe that someone who cares about privacy does so because he/she is involved in wrongdoing [4]. Privacy is also associated with the states of solitude, intimacy, anonymity, and reserve [16, 35]. Solitude means physical separation from other individuals. Intimacy is some kind of close relationship between individuals within which information is exchanged. Anonymity is the state of freedom from identification and surveillance. Finally, reserve means the creation of psychological protection against intrusion by other, unwanted individuals.

In information and communications technology (ICT), the concept of privacy is usually associated with the degree of control over the flow of personal information [7]. In this context, people associate privacy with their level of control over the collection of personal information, the usage of the collected information, and the third parties that can have access to the information, such as relatives, friends, hierarchical superiors, and government agencies [1, 7, 20].
Do people care about privacy?
Concerns about privacy usually arise from unauthorized collection of personal data, unauthorized secondary use of the data, errors in personal data, and improper access to personal data [27]. People's concerns are indeed associated with the possible consequences that these occurrences may have on their lives. Two relevant theoretical constructs that explore this view are face keeping and information boundary.
Face keeping
Face keeping considers that, to perform social interaction successfully, people must provide the others with some identity [16]. Providing some identity means sharing information, such as information about one's tastes, preferences, beliefs, and past actions. In doing so, individuals construct a public "face" of themselves made of the features that they are sharing. In daily life, people engage in a variety of activities. They do not necessarily use the same "face" in all activities. For example, in some instances they may want to play the role of a friendly individual, while in others they may want to play the role of an individual who is an expert in a specific domain. What people would say, do, and disclose when wearing these different faces may be quite different. In general, "faces" are a combination of roles and status.

People care about keeping a public face that pleases them. Thus, from a "face keeping" perspective, a privacy breach is a case of losing a face, in which people are forced to put on or endorse an unwanted face. It occurs, for example, when people want to maintain control over information about their conditions or past, such as people who carry certain diseases such as AIDS, ex-convicts, or members of some religious groups. To avoid stigma, they seek to prevent their information from being disseminated and becoming part of their faces. The privacy concerns come from the need to maintain a desired public face.
Information boundary
Information boundary explains the dynamics of individual privacy concern based on the metaphor of two inner states: "boundary opening" and "boundary closure" [33]. When the boundary is open, the individual reveals information freely. When the boundary is closed, the information flow is restricted. Individuals establish privacy rules to define when the boundary is open and when it is closed. Each individual has a mental calculus that is used to construct rules based on the risk-benefit calculation of information disclosure [9].

After individuals disclose their personal information to somebody or something, the information moves to a collective domain where all the co-owners of such information share a joint responsibility for keeping it private. The main concern with privacy is that information leaks beyond this domain. Privacy rules used in the decision to disclose personal information are based on expectations about who will have access to the information in the collective domain. Policies about the protection of the information in the collective domain are negotiated before the information is disclosed. When the policies are not met, individual boundary management becomes turbulent and the individual becomes more restrictive in their privacy rules, recalculating the risk-benefit of information disclosure.
What is known about individual privacy concerns and attitudes toward IoT systems?
Studies have reported a number of aspects of IoT systems that may cause privacy concern; among them, we highlight:

• The potential to enable systematic mass surveillance and to impinge on the personal privacy of users, especially their location privacy [10].

• The automatic collection of data from the network of devices, whose secure protocols cannot be verified by the data owners [32].

• The increase of privacy concerns because of three main factors: novel, heterogeneous, and constrained user interfaces; the ubiquitous presence of devices and their potential for vast data collection; and market forces and misaligned incentives [31].

About users' privacy attitudes, some studies discuss that, in the face of privacy concerns, people may refuse to provide their personal data or submit inaccurate data [32]. Other studies, however, identify that privacy concerns have a relatively weak effect on influencing the adoption of IoT systems [12], with other factors being more relevant in terms of adoption. In fact, in the literature on people's privacy perception, the association between privacy concerns and privacy attitudes is usually confusing and paradoxical. This phenomenon is known as the privacy paradox, in which people claim to care about privacy, but they are often perceived to act to the contrary [2, 21]. It is believed that this paradox must also occur in IoT [31].
CHARACTERIZING PRIVACY BELIEFS, CONCERNS AND ATTITUDES
Our approach to characterize users' privacy concerns and attitudes toward IoT systems consists of three dimensions: (i) demographics, general privacy beliefs, and attitudes; (ii) sources of privacy concerns in IoT systems; and (iii) risk-utility trade-offs posed by the features of the system. Throughout this section we discuss how we address each of these dimensions.
Demographics, general privacy beliefs and attitudes
This dimension focuses more on individuals' general characteristics than on characteristics they exhibit when using a specific system. We consider a set of characteristics that may be insightful about how users perceive privacy issues in IoT systems: their demographics, beliefs, and attitudes. In characterizing demographics, we consider gender, education, and age. In designing the survey instrument, we ask respondents to inform their gender (female, male, or other), education (basic school, high school, undergraduate, master's, or doctoral degree), and age (under 18 years old, 18–24 years old, 25–34 years old, 35–44 years old, 45–54 years old, 55 years or older).

In assessing individual beliefs, we consider a set of privacy beliefs reported in previous studies of privacy perceptions: privacy as a right guaranteed by law, privacy as an individual's responsibility, and privacy as a need associated with people who are involved in wrongdoing. We ask users to select the option that best describes their feelings: (a) I believe that privacy is a right guaranteed by law; (b) I believe that each person has the responsibility to protect his/her own privacy; and (c) I believe that people who care about privacy do so because they are involved in wrongdoing. These privacy beliefs may be insightful about how users perceive privacy issues in IoT systems.

Finally, privacy attitudes are investigated by asking users how they are likely to act when they are requested to provide private data to a system. We provide the respondents with the following instruction and ask them to select the option that is closest to the way they see the situation:
If you are using a system and it asks you to provide information that you consider personal, which of these behaviors do you adopt? (a) You provide the requested information. (b) You provide the requested information only if the system informs its privacy policy. (c) You provide the requested information only if you know that the system will give you something in return. (d) You do not provide the requested information.
This question is the core point in the categorization of users into the three profiles: unconcerned, pragmatist, and fundamentalist [17, 36]. The chosen option allows us to infer the user's privacy attitude profile: unconcerned (answer a), fundamentalist (answer d), and pragmatist (answer b or c), as illustrated in the sketch below.
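The mapping from a respondent's answer to a privacy attitude profile can be expressed directly in code. The following is a minimal Python sketch of this classification rule; the function and variable names are ours, not part of the survey instrument.

    # Mapping from the survey answer (a-d) to a privacy attitude profile.
    PROFILE_BY_ANSWER = {
        "a": "unconcerned",     # provides the information unconditionally
        "b": "pragmatist",      # requires the system to inform its privacy policy
        "c": "pragmatist",      # requires something in return
        "d": "fundamentalist",  # does not provide the information
    }

    def classify_privacy_attitude(answer: str) -> str:
        """Infer a respondent's privacy attitude profile from their answer."""
        return PROFILE_BY_ANSWER[answer.strip().lower()]

    # A respondent who shares data only when a privacy policy is shown:
    assert classify_privacy_attitude("b") == "pragmatist"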
Sources of privacy concerns in IoT systems

Three components of typical IoT systems can be sources of users' privacy concerns: the collection of personal data from users, the inference of richer information based on the collected data, and the exchange of users' information with third parties.
Data collection
Systems can collect data directly from users, by asking them to provide data, or indirectly, via their devices, such as smart phones and smart watches. In order to investigate the level of privacy concern caused by each collected datum, we ask users to inform their level of privacy concern on a 5-point Likert scale. For instance, about location information collected by mobiles, we can ask: Assuming that the system is able to collect the locations where your mobile has been throughout the day, how concerned would you feel about it? (a) Not concerned at all; (b) Slightly concerned; (c) Moderately concerned; (d) Very concerned; (e) Extremely concerned. We also ask for the level of concern caused by the collection of information about choices made by the user and information about the other users with whom the user meets.
Inference of richer information
The data collected by the system can be used to find new information about the users. Systems can perform information inference for a variety of purposes, such as identifying users' friends and recommending products and services to them. In order to investigate the level of privacy concern caused by each piece of information inferred by a system, we ask users to inform their level of privacy concern on a 5-point Likert scale. For instance, about the inference of favorite TV shows, we can ask: Assuming that when using the system it would be able to infer your favorite TV shows, how concerned would you feel about it? (a) Not concerned at all; (b) Slightly concerned; (c) Moderately concerned; (d) Very concerned; (e) Extremely concerned.
Exchange of information with third parties
As part of its operation or due to a failure of operation, a system can let users' data be accessible to third parties. Users consider this kind of situation an improper access to their personal data [16]. To whom the information is made accessible is relevant, for example, users' friends, relatives, employers, government agencies, and other systems [15, 28]. In order to investigate the level of privacy concern caused by information exchange, for each piece of information handled by the system, we ask users to inform the level of privacy concern caused by the data being provided to third parties on a 5-point Likert scale. For instance, about the data being provided to government agencies, we can ask: Enter the level of concern you feel if the data about the places where your mobile was throughout the day are provided to government agencies: (a) Not concerned at all; (b) Slightly concerned; (c) Moderately concerned; (d) Very concerned; (e) Extremely concerned.

All the questions about the sources of privacy concerns (data collection, inference of richer information, and exchange of information with third parties) are based on a 5-point Likert scale. It is an ordinal scale, so we can say, for example, that the "Extremely concerned" score is higher than the "Very concerned" score. When analyzing the answers, we assign numbers to the scale, so 1 is the lowest value, 3 is the neutral response, and 5 is the highest value. We assess the level of consensus in the answers provided by the users [14, 23]. To measure the level of consensus we use Krippendorff's alpha coefficient [14]. The maximum value of this metric is 1, indicating complete consensus among the respondents, and the minimum is -1, indicating complete lack of consensus. In this characterization of the sources of privacy concerns in IoT systems, we are usually interested in identifying whether users' average privacy concern lies below or above the neutral concern (3, "Moderately concerned"). In this case, the neutral concern serves as an arbitrary zero point.
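As an illustration of this analysis, the sketch below codes Likert answers as 1–5, compares the average concern with the neutral point, and estimates consensus. We use the krippendorff PyPI package as one possible implementation of the coefficient (the paper's computations are not tied to any specific library), and the response matrix is made up for the example.

    import numpy as np
    import krippendorff  # pip install krippendorff; assumed implementation choice

    # Rows are respondents, columns are survey questions; cells are Likert
    # scores: 1 = "Not concerned at all" ... 5 = "Extremely concerned".
    responses = np.array([[4, 5, 3],
                          [3, 4, 2],
                          [5, 5, 4]])

    NEUTRAL = 3  # "Moderately concerned", the arbitrary zero point

    mean_concern = responses.mean(axis=0)
    print("Average concern per question:", mean_concern)
    print("Lies above the neutral concern:", mean_concern > NEUTRAL)

    # Consensus among respondents: 1 means complete consensus and -1 complete
    # lack of it. Respondents play the role of "coders", questions of "units".
    alpha = krippendorff.alpha(reliability_data=responses,
                               level_of_measurement="ordinal")
    print("Krippendorff's alpha:", alpha)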
System information use trade-offs
In order to evaluate the risk-benefit trade-off of features in IoT systems, we also measure the level of utility perceived by users in the feature that uses such information. To do so, we ask users to inform the level of utility of the feature on a 5-point Likert scale. For example, in the case of the TV show schedule feature, we can ask users: How useful do you consider the feature that infers your favorite TV shows and automatically turns on the TV on their schedule? (a) Not useful at all; (b) Slightly useful; (c) Moderately useful; (d) Very useful; (e) Extremely useful.
Based on the measures of privacy concern caused by the data collection/inference and the measures of utility of the feature that uses the data, we measure the degree of correlation between the risk indicated by the concern and the utility reported by the users. When there is a correlation, we shape it into an equation by means of a regression. Thus, the trade-off can be evaluated as a model of how an increase in the utility of a feature impacts the risk perceived by the users.
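A minimal sketch of this risk-utility analysis with SciPy, assuming paired 1–5 scores per respondent for one feature (the values below are illustrative, not survey data):

    import numpy as np
    from scipy.stats import pearsonr, linregress

    utility = np.array([5, 4, 4, 3, 2, 1, 2])  # perceived utility of the feature
    risk = np.array([1, 2, 2, 3, 4, 5, 3])     # privacy concern about its data use

    r, p = pearsonr(utility, risk)
    print(f"Pearson r = {r:.2f}, p-value = {p:.4f}")

    # When the correlation is significant, shape it into an equation:
    # risk = intercept + slope * utility
    if p < 0.05:
        fit = linregress(utility, risk)
        print(f"risk = {fit.intercept:.2f} + ({fit.slope:.2f}) * utility")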
PRIVACY IN TWO IOT SYSTEMS
Using the approach discussed in the previous section, we characterize users' privacy beliefs, concerns, and attitudes in two IoT systems: Lumen and Pulso. We begin this section by introducing these systems. After that, we detail our survey, the recruitment of participants, and the results of our study.
The Pulso system measures the "pulse" of locations, i.e., how many people are there. It is based on people-counting sensors that are installed in specific locations, such as restaurants, public squares, and bus stops. The sensors work just like internet access points. When users arrive at the location with their smart phones with WiFi turned on, the smart phones probe for connections. A smart phone probing for connections is counted as a person in the location; the smart phones are uniquely identified by their Media Access Control address (MAC address). The number of people in a location is estimated by the number of smart phones at the location that have WiFi on. The Pulso system collects and stores the time and MAC address of each smart phone probing a connection. Such data is processed in order to provide features/services to its users. The features are accessible to the users via a mobile application. In the application, users can search for locations and, for each searched location, the system reports in real time how many people are there. Users can use it for different things. For instance, they can use it to know how many people are in the restaurant where they want to have lunch, and decide to go at a less crowded time.
The Lumen system provides notifications of disproportionate power consumption in an organization. The system combines information on the number of people with the energy consumption. The number of people is counted by Lumen in the same way as in Pulso. Sensors are installed in the organization. When people arrive with their smart phones with the WiFi connection turned on, the smart phones probe for WiFi connections. The number of people is the number of unique MAC addresses. The energy consumption of the organization is obtained by using a smart power meter. The collected data is processed and presented to users via a Web application. The application allows users to check the environmental impact history, that is, the ratio between the energy consumption and the number of people in the organization. It informs the moments in time when there is an above-average environmental impact. At such times, the system generates a notification that alerts users about power consumption that is disproportionate to the number of people in the organization. The notifications are made available on a notification board that can be visualized by everyone in the organization.
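Both systems thus estimate occupancy by the same principle: the number of people at a location is approximated by the number of distinct MAC addresses probing for WiFi within a time window. The sketch below illustrates the idea; the record format and the window length are our assumptions, not implementation details of Pulso or Lumen.

    from datetime import datetime, timedelta

    # Illustrative probe-request log collected by a sensor: (time, MAC) pairs.
    probes = [
        (datetime(2017, 7, 1, 12, 0), "aa:bb:cc:dd:ee:01"),
        (datetime(2017, 7, 1, 12, 3), "aa:bb:cc:dd:ee:02"),
        (datetime(2017, 7, 1, 12, 7), "aa:bb:cc:dd:ee:01"),  # same phone again
    ]

    def estimate_occupancy(probes, at, window=timedelta(minutes=10)):
        """Estimate how many people are present at time `at` as the number
        of unique MAC addresses seen probing within the preceding window."""
        return len({mac for ts, mac in probes if at - window <= ts <= at})

    print(estimate_occupancy(probes, datetime(2017, 7, 1, 12, 10)))  # -> 2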
Relevance of Pulso and Lumen to study privacy in IoT:
Both systems involve the face keeping concept. In Pulso, one of users' privacy concerns may be that information about places where they go routinely becomes public. In Lumen, the concern is that the system publicly assigns to users the responsibility for an energy consumption problem in the organization. Thus, in both cases, the system can potentially change the public face that the users wear. The two systems also involve the information boundary concept. In both cases, people seek to control what information about them may belong to the public domain. Also, the systems involve different spaces and forms of adoption. Pulso is designed for a public space, and anyone can join and leave the system at any time. Lumen, in turn, is designed for a corporate space, and members of the organization join the system as part of their activities.
Survey design and recruitment of participants:
To assess individuals' privacy perceptions and concerns toward the Pulso and Lumen systems, we built a survey with a set of questions derived from the proposed approach (Section 3). The dynamics of each system were explained to the respondents by presenting screens of the application and a video prototype that explains how the system works. The survey was designed using Google Forms.

After validating the survey in pilot tests, we collected a sample of answers from potential users of the systems. For the Pulso system, answers were collected from 58 respondents at a transportation hub in Campina Grande, Brazil. For the Lumen system, answers were collected from 55 participants who work in computer science research labs and in a software development company in Campina Grande, Brazil. In both studies, respondents answered the survey individually and without any external interference. (The forms are currently retired and permanently available at https://goo.gl/forms/2HpdQmvg9uHCL86f1 (Pulso) and https://goo.gl/forms/y9W1TVrxQ81mPI483 (Lumen).)
Results
As summarized by our research questions, our main objective is to characterize individuals' privacy concerns and attitudes toward IoT systems. The results of the characterization are organized in the three parts presented in the following. We first analyze users' demographics and privacy beliefs according to their privacy attitude profiles. Next, we discuss the sources of their privacy concerns in IoT systems (collection, inference, and exchange of information). Finally, we present the results on individuals' perceptions of the risk-benefit trade-offs.
Privacy profiles: distribution, demographics and consensus
We identified people who exhibited the three privacy attitude profiles: unconcerned, fundamentalists, and pragmatists. As Figure 1 shows, the pragmatist is the most frequent profile. Most respondents exhibit this privacy attitude profile in the Pulso system (40 individuals, or a proportion of 0.69) and in the Lumen system (39 individuals, or a proportion of 0.71). Table 1 and Table 2 show the statistical summary of the demographic characteristics and privacy beliefs of participants in the Pulso and Lumen systems, respectively. These statistics show that the demographics and beliefs of participants who fall in each privacy attitude profile change according to the set of respondents who participated in the analysis of each system. We summarize the profiles' demographics and beliefs as follows:
• Unconcerned. Most of the users who exhibit an unconcerned privacy attitude profile in the Pulso system are female, aged between 18 and 24 years old, and completed just high school. In the Lumen system, all of them are male respondents, and most of them are aged between 25 and 34 years old and completed an undergraduate degree. The unconcerned users are practically divided between those who believe in privacy as a right and those who believe in privacy as a personal responsibility of each user.
• Fundamentalists. In the Pulso system, most of them are female respondents, aged between 18 and 24 years old, who completed just high school. In the Lumen system, most of them are male respondents, aged between 25 and 34 years old, who completed an undergraduate degree. Users who exhibit a fundamentalist profile are divided between those who believe in privacy as a right and those who believe in privacy as a personal responsibility.
• Pragmatists. In the Pulso system, most of them are female respondents, aged between 18 and 24 years old, who completed an undergraduate degree. In the Lumen system, the majority of pragmatists are male respondents, aged between 25 and 34 years old, who completed an undergraduate degree. In both systems, the pragmatist users are more likely to believe in privacy as a right.

These results for each profile reproduce the demographic distribution of the whole sample of users of the systems.
Figure 1. The proportion of respondents who fall into each privacy attitude profile (unconcerned, fundamentalist, pragmatist): (a) Pulso; (b) Lumen.
                           Unconcern.   Fundament.   Pragmat.    All
Gender
  Male                     5 (0.45)     1 (0.14)     14 (0.35)   20 (0.34)
  Female                   6 (0.55)     6 (0.86)     26 (0.65)   38 (0.66)
  Sum                      11 (1.00)    7 (1.00)     40 (1.00)   58 (1.00)
Age
  Under 18 years old       1 (0.09)     -            7 (0.18)    8 (0.14)
  18–24 years old          7 (0.64)     5 (0.71)     26 (0.65)   38 (0.65)
  25–34 years old          2 (0.18)     2 (0.29)     5 (0.13)    9 (0.16)
  35–44 years old          1 (0.09)     -            2 (0.05)    3 (0.05)
  45 years or older        -            -            -           -
  Sum                      11 (1.00)    7 (1.00)     40 (1.01)   58 (1.00)
Education
  Basic education          -            -            2 (0.05)    2 (0.05)
  High school              6 (0.55)     4 (0.57)     17 (0.43)   27 (0.47)
  Undergraduate degree     5 (0.45)     3 (0.43)     19 (0.48)   27 (0.47)
  Master's degree          -            -            2 (0.05)    2 (0.03)
  Doctoral degree          -            -            -           -
  Sum                      11 (1.00)    7 (1.00)     40 (1.01)   58 (1.00)
Belief
  Guaranteed by law        5 (0.45)     3 (0.43)     27 (0.68)   35 (0.60)
  Personal responsibility  6 (0.55)     4 (0.57)     13 (0.32)   23 (0.40)
  Needed in wrongdoing     -            -            -           -
  Sum                      11 (1.00)    7 (1.00)     40 (1.00)   58 (1.00)

Table 1. Statistical summary of users' demographics and privacy beliefs in the Pulso system.

Users who exhibit a pragmatist privacy attitude profile stand out in the analysis.
They are the majority of users. Also, differently from users who exhibit the other profiles, users who exhibit a pragmatist privacy attitude profile are more inclined toward the belief in privacy as a right guaranteed by law.

We analyze the level of consensus observed in the answers provided by users to questions about privacy. The result is shown in Figure 2. By putting together the individuals who exhibit the same privacy attitude profile, we identify groups of individuals who exhibit a level of consensus among them higher than the level of consensus that exists among participants from all profiles. This is exemplified by the pragmatist profile in the case of the Pulso system and by the unconcerned and fundamentalist profiles in the case of the Lumen system. As these results show, the level of consensus among users about privacy issues is not something that remains invariant across systems.
                           Unconcern.   Fundament.   Pragmat.    All
Gender
  Male                     5 (1.00)     10 (0.90)    35 (0.90)   50 (0.90)
  Female                   -            1 (0.10)     4 (0.10)    5 (0.10)
  Sum                      5            11           39          55
Age
  Under 18 years old       -            -            -           -
  18–24 years old          1 (0.20)     2 (0.18)     14 (0.36)   17 (0.31)
  25–34 years old          3 (0.60)     9 (0.82)     24 (0.62)   36 (0.65)
  35–44 years old          1 (0.20)     -            1 (0.03)    2 (0.04)
  45 years or older        -            -            -           -
  Sum                      5            11           39          55
Education
  Basic education          -            -            -           -
  High school              -            -            4 (0.10)    4 (0.07)
  Undergraduate degree     4 (0.80)     10 (0.91)    31 (0.79)   45 (0.82)
  Master's degree          1 (0.20)     1 (0.09)     4 (0.10)    6 (0.10)
  Doctoral degree          -            -            -           -
  Sum                      5            11           39          55
Belief
  Guaranteed by law        1 (0.20)     5 (0.45)     28 (0.72)   34 (0.62)
  Personal responsibility  4 (0.80)     6 (0.55)     11 (0.28)   21 (0.38)
  Needed in wrongdoing     -            -            -           -
  Sum                      5            11           39          55

Table 2. Statistical summary of users' demographics and privacy beliefs in the Lumen system.
The sources of privacy concerns: Collection, inference and exchange of private information
We assess the average level of privacy concern of participants considering the collection of personal data, the inference of richer information, and the exchange of private information with third parties. Figure 3 shows the results according to users' privacy attitude profiles. There are high variations around the average level of concern, mainly in the levels reported by privacy unconcerned and fundamentalist users. The results also show a tendency of users to exhibit a higher level of concern about information exchange; sometimes the average lies above the level of privacy concern equivalent to "Moderately concerned." This tendency is observed in all privacy attitude profiles and is stronger in the case of the Lumen system. It indicates that users do not perceive much of a problem in the data collection and information inference carried out by the system, but they worry that this information will be made available to third parties.

Figure 2. Krippendorff's Alpha consensus coefficient according to users' privacy attitude profiles: (a) Pulso; (b) Lumen. The dashed line is the level of consensus when all users are put together.

Figure 3. Users' average privacy concern about data collection, information inference and information exchange in the system: (a) Pulso and (b) Lumen. The dashed line is the level of privacy concern equivalent to "Moderately concerned." Each error bar represents the 95% confidence interval.
To complement our understanding of this result, we analyze which third party users fear most in terms of privacy. As shown in Figure 4, the difference in concern among the investigated sources is small. In both case studies, the only clearly distinguishable high source of concern is government agencies. Users are concerned about their data being accessible to government agencies. The level of concern lies above the moderate level in both systems.
Perceptions on system’s data use trade-offs
We investigate the association between the level of utility that users perceive in a feature and the risk, in terms of the level of concern they feel about the personal data used by the feature. In the case of the Pulso system, the feature consists of showing the name and photo of people who are in a specific location. The utility is having access to this information provided by other users, and the risk is the level of privacy concern in providing this information to be available to other users. Table 3 shows the correlation between risk and utility in the Pulso and Lumen systems. In the whole set of Pulso users, the Pearson correlation between utility and risk is low and not significant. Analyzing the correlation per profile, we find that the fundamentalist users show a significant strong negative correlation. A simple linear regression was calculated to model this relationship between utility and risk exhibited by the fundamentalist users. A significant regression equation was found (F(1,5) = 6.80, p-value < .05), with an adjusted R² of 0.49. In this case, risk = 4.82 - 0.75 x utility: the risk perceived by users decreases by 0.75 for each unit increase in utility.

Privacy attitude profiles   Pulso system   Lumen system
Unconcerned                 -0.15          -0.37
Fundamentalists             -0.76*         -0.45
Pragmatists                  0.03          -0.57**
All together                -0.16          -0.50**

Significance codes: *p-value < .05; **p-value < .01.
Table 3. Correlation between risk and utility according to privacy attitude profiles in the Pulso and Lumen systems.
In the case of the Lumen system, the studied feature consists of showing the name of people who are related to an energy consumption problem in the organization. The utility is the usefulness of this feature, and the risk is having one's name exposed in this situation. In the whole set of users, the Pearson correlation between the level of perceived utility and the level of perceived concern is moderate and significant. Analyzing the correlation per profile, only pragmatist users show a significant correlation; the unconcerned and fundamentalist users exhibit no significant correlations. For the whole set of users, a significant regression equation was found (F(1,53) = 17.42, p-value < .001), with an adjusted R² of 0.23. In this case, risk = 4.74 - 0.52 x utility: users' perceived risk decreases by 0.52 for each unit of utility. A significant regression equation was also found for the pragmatist users (F(1,37) = 13.55, p-value < .001), with an adjusted R² of 0.24. In this case, risk = 4.75 - 0.52 x utility: the risk perceived by the users decreases by 0.52 for each unit increase in utility.

Figure 4. Users' average privacy concern about information exchange, per source of concern: (a) Pulso; (b) Lumen. The dashed line is the level of privacy concern equivalent to "Moderately concerned." Each error bar represents the 95% confidence interval.

In general, these results indicate a negative relation between utility and risk. This relation means that people tend to be more concerned about providing data to features whose usefulness they do not perceive or, alternatively, they tend to penalize the level of utility of features that make use of information that causes them much privacy concern.
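Written as a simple linear model, the fitted trade-off has the same form in both systems; the notation below is ours, with the coefficients reported above:

    \mathrm{risk}_i = \beta_0 + \beta_1 \, \mathrm{utility}_i + \varepsilon_i

with fitted coefficients (beta_0, beta_1) of (4.82, -0.75) for the fundamentalists in Pulso and (4.74, -0.52) for the whole set of Lumen users.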
DISCUSSIONS AND DESIGN IMPLICATIONS

Our research helps to answer the question about the underlying factors of users' perceptions in IoT systems. It emphasizes users' privacy beliefs and attitudes, the systems' sources of concern, and risk-utility trade-offs. The statistical analysis showed a diversity of privacy perceptions and differences among the individuals. In the following, we first discuss these results, then we propose a set of heuristics to cope with privacy concerns in the design of IoT systems.
Privacy in IoT systems
Our analysis allowed us to identify users who exhibit the three privacy attitude profiles: unconcerned, fundamentalists, and pragmatists. We show that, as in other contexts, most users are pragmatists. Our results suggest that users who exhibit a pragmatist profile tend to believe in privacy as a right guaranteed by law, while users who exhibit unconcerned and fundamentalist profiles balance their beliefs between privacy as a right and privacy as a personal responsibility.

Privacy is a subject on which people often have different opinions and low consensus, because it relies on several personal characteristics. Our results show that in IoT systems even users who fall in the same privacy attitude profile can diverge among themselves in terms of their privacy beliefs and concerns. Users' privacy attitude profiles have an effect on the level of consensus obtained in their responses to privacy issues. In our analysis, the level of consensus about privacy issues is not something that remains invariant across systems.

The major source of privacy concerns in IoT systems tends to be the exchange of user personal data with third parties. Depending on the context of the system, the level of privacy concern caused by data exchange exceeds the moderate level. The third party that caused the highest levels of concern is the government, but other third parties, such as hierarchical superiors and other systems, can also cause high levels of concern depending on the context.

Our analysis of the risk-utility trade-off indicates a negative correlation between utility and risk. The higher the perception of utility, the lower the perception of risk. This correlation means that users tend to penalize the level of utility of features that may cause privacy concerns.

Finally, we believe that some level of "concern about privacy" is an unavoidable consequence of sharing personal information. Even users who exhibit an unconcerned privacy attitude profile tend to exhibit some level of concern, though low. Concerns occur in people's daily lives, and it is natural that they also occur in their interaction with IoT systems. The design of the system should focus on reducing the concerns to levels that are comfortable for the users.
Heuristics to cope with privacy concerns in IoT
To summarize the insights gained from our analysis of users' privacy perceptions and attitudes, we present design heuristics that can be expected to support the design and configuration of IoT systems following a "design for privacy" perspective [5].
Let users know what information the system has about them.
Users may not properly understand the data collection process because data can be collected indirectly via their devices and such collection is intrinsically ubiquitous. One way for users to know what kind of data is collected and inferred about them is to allow them to see and download that data, as sketched below.
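As one possible realization of this heuristic, the system can expose a "my data" view from which the user can inspect and download everything collected and inferred about them. The Flask sketch below is purely illustrative; neither the endpoint nor the data layout comes from Pulso or Lumen, and authentication is omitted for brevity.

    from flask import Flask, jsonify

    app = Flask(__name__)

    def load_user_data(user_id):
        # Illustrative stand-in for the system's data store.
        return {
            "collected": {"mac_address_sightings": ["2017-07-01T12:00 at hub"]},
            "inferred": {"frequent_locations": ["transportation hub"]},
        }

    @app.route("/users/<user_id>/data")
    def download_my_data(user_id):
        """Let users see and download what the system has about them."""
        return jsonify(load_user_data(user_id))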
Make clear the usefulness of the data for each feature.
Users tend to be less concerned about privacy when they understand the utility of the feature that makes use of the collected data. If the utility of the feature is low, any data collected or inferred by the system will cause users high levels of concern.
Make the exchange of data with third parties explicit and configurable.
The provision of user data to third parties is one of the major sources of privacy concern. It cannot be done without the knowledge and explicit authorization of the user, so it cannot be a default system configuration. The system should also give users the possibility to specify to which third parties their data can be provided, if any. This makes it possible to address the variance in the level of concern that exists among users, for example as in the sketch below.
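One way to make the exchange configurable is to keep a per-user consent list and check it before any data leaves the system, with sharing denied by default. A minimal sketch, with hypothetical names:

    # Per-user consent settings; nothing is shared unless explicitly granted.
    consents = {
        "user42": {"government_agencies": False, "partner_systems": True},
    }

    def may_share(user_id: str, third_party: str) -> bool:
        """Return True only if the user explicitly authorized this third party."""
        return consents.get(user_id, {}).get(third_party, False)

    if may_share("user42", "partner_systems"):
        pass  # export the data to the authorized partner system
    # Exchange with government agencies remains blocked for this user.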
Conduct empirical assessments of privacy.
Our results show several privacy issues that cannot be generalized across systems. New systems may be based on features that pose significant privacy risks to their users. The effects of such features on users' privacy should be tested before the system is made available to its users.
Threats to the validity
This study was conducted in Brazil and had students and technology professionals as participants. The results are based on answers provided by the participants, which are subject to the social desirability effect. Responses provided by participants may differ from their behavior when using the system, which is known as the privacy paradox [2, 21, 31].

While the proposed heuristics will be helpful in ensuring the presence of basic privacy requirements, they should not lead designers to assume that they are enough to design privacy-friendly systems. We are convinced that such heuristics interplay with other initiatives, such as security mechanisms, compliance with legislation, and good data governance practices.
CONCLUSIONS
In this study, we sought to investigate in depth individuals' privacy perceptions about systems based on the Internet of Things. Building on previous studies, we proposed a survey approach to characterize users' perceptions. The characterization focuses on individuals' privacy beliefs, the components of IoT systems that cause privacy concern (data collection, information inference, and information exchange), and the perceived risk-benefit trade-offs in the features provided by the system. We surveyed 113 individuals about their privacy perceptions in two IoT systems: Lumen and Pulso. We classified the individuals into three privacy attitude profiles: unconcerned, fundamentalists, and pragmatists. We found that most of the individuals exhibit a pragmatist attitude toward privacy and tend to believe in privacy as a right guaranteed by law. The exchange of information with third parties tends to be the most concerning aspect compared to data collection and information inference. The perceived privacy risk tends to be negatively associated with utility: the higher the perception of utility, the lower the perceived risk. We derive design heuristics to help system designers cope with user privacy concerns.

As future work, we suggest extending this study to also investigate the effects of national, cultural, and economic factors. We also suggest conducting an observational study with users over their interaction with the systems. Besides investigating how privacy rules (concerns and attitudes) are formed in IoT systems, such a study would also characterize how the rules change over time.
ACKNOWLEDGMENTS
This research was partially funded by the EU-BRA SecureCloud project (MCTI/RNP 3rd Coordinated Call) and by CNPq, Brazil.
REFERENCES
1. D. Alhalafi. 2015. A New Methodology to Disambiguate Privacy. Acta Physica Polonica A.
2. Susan B. Barnes. 2006. A privacy paradox: Social networking in the United States. First Monday 11, 9 (2006).
3. Lemi Baruh and Zeynep Cemalcılar. 2014. It is more than personal: Development and validation of a multidimensional privacy orientation scale. Personality and Individual Differences 70 (2014), 165–170.
4. R. Beckwith. 2003. Designing for ubiquity: the perception of privacy. IEEE Pervasive Computing 2, 2 (April 2003), 40–46.
5. Victoria Bellotti and Abigail Sellen. 1993. Design for Privacy in Ubiquitous Computing Environments. In European Conference on Computer-Supported Cooperative Work. Kluwer Academic Publishers, MA, USA, 77–92.
6. Lei Chen, Ji-Jiang Yang, Qing Wang, and Yu Niu. 2012. A framework for privacy-preserving healthcare data sharing. In IEEE International Conference on e-Health Networking, Applications and Services. 341–346.
7. Hua Dai, Lakshmi Iyer, and Rahul Singh. 2007. An investigation of consumers' security and privacy perceptions in mobile commerce. Americas Conference on Information Systems (2007), 1–16.
8. Andrey Antonio de O. Rodrigues, Fabiane Aparecida Santos Clemente, and Antonio Alberto Sena dos Santos. 2016. An Information Window About Online Privacy Aspects Perceived by Social Networks Users. In Brazilian Symposium on Human Factors in Computer Systems (IHC '16). ACM, NY, USA, 18:1–18:10.
9. Tamara Dinev and Paul Hart. 2006. An extended privacy calculus model for e-commerce transactions. Information Systems Research 17, 1 (2006), 61–80.
10. M. Elkhodr, S. Shahrestani, and H. Cheung. 2012. A review of mobile location privacy in the Internet of Things. In International Conference on ICT and Knowledge Engineering. 266–272.
11. Jayavardhana Gubbi, Rajkumar Buyya, Slaven Marusic, and Marimuthu Palaniswami. 2013. Internet of Things (IoT): A vision, architectural elements, and future directions. Future Generation Computer Systems 29, 7 (2013), 1645–1660.
12. Chin-Lung Hsu and Judy Chuan-Chuan Lin. 2016. An empirical examination of consumer adoption of Internet of Things services: Network externalities and concern for information privacy perspectives. Computers in Human Behavior 62 (2016), 516–527.
13. Nancy J. King and Pernille Wegener Jessen. 2010. Profiling the mobile customer: Is industry self-regulation adequate to protect consumer privacy when behavioural advertisers target mobile phones? Part II. Computer Law & Security Review 26, 6 (2010), 595–612.
14. Klaus Krippendorff. 1970. Estimating the reliability, systematic error and random error of interval data. Educational and Psychological Measurement 30, 1 (1970), 61–70.
15. Michelle Kwasny, Kelly Caine, Wendy A. Rogers, and Arthur D. Fisk. 2008. Privacy and Technology: Folk Definitions and Perspectives. In CHI '08 Extended Abstracts on Human Factors in Computing Systems (CHI EA '08). ACM, New York, NY, USA, 3291–3296.
16. Saadi Lahlou. 2008. Identity, social status, privacy and face-keeping in digital society. Social Science Information 47, 3 (2008), 299–330.
17. Dong-Joo Lee, Jae-Hyeon Ahn, and Youngsok Bang. 2011. Managing consumer privacy concerns in personalization: a strategic analysis of privacy protection. MIS Quarterly 35, 2 (2011), 423–444.
18. Yuan Li. 2014. A Multi-level Model of Individual Information Privacy Beliefs. Electronic Commerce Research and Applications 13, 1 (Jan. 2014), 32–44.
19. Bárbara Gabrielle C. O. Lopes, Geanderson E. dos Santos, Maria Lúcia Villela, and Raquel O. Prates. 2016. Privacy Design Model Application on Sharing Pictures Apps. In Brazilian Symposium on Human Factors in Computer Systems (IHC '16). ACM, NY, USA, 46:1–46:4.
20. R. Mekovec and N. Vrcek. 2011. Factors that influence Internet users' privacy perception. In International Conference on Information Technology Interfaces. 227–232.
21. Patricia A. Norberg, Daniel R. Horne, and David A. Horne. 2007. The privacy paradox: Personal information disclosure intentions versus behaviors. Journal of Consumer Affairs 41, 1 (2007), 100–126.
22. Oded Nov and Sunil Wattal. 2009. Social Computing Privacy Concerns: Antecedents and Effects. In Proc. of the SIGCHI Conference on Human Factors in Computing Systems. ACM, New York, NY, USA, 333–336.
23. Lesandro Ponciano, Francisco Brasileiro, Nazareno Andrade, and Lívia Sampaio. 2014. Considering human aspects on strategies for designing and managing distributed human computation. Journal of Internet Services and Applications 5, 1 (2014), 10.
24. Kathy S. Schwaig, Albert H. Segars, Varun Grover, and Kirk D. Fiedler. 2013. A Model of Consumers' Perceptions of the Invasion of Information Privacy. Information & Management 50, 1 (Jan. 2013), 1–12.
25. Kim Bartel Sheehan. 2002. Toward a typology of Internet users and online privacy concerns. The Information Society 18, 1 (2002), 21–32.
26. Reza Shokri, George Theodorakopoulos, Jean-Yves Le Boudec, and Jean-Pierre Hubaux. 2011. Quantifying Location Privacy. In IEEE Symposium on Security and Privacy. IEEE, Washington, DC, USA, 247–262.
27. H. Jeff Smith, Sandra J. Milberg, and Sandra J. Burke. 1996. Information privacy: measuring individuals' concerns about organizational practices. MIS Quarterly (1996), 167–196.
28. Wouter M. P. Steijn and Anton Vedder. 2015. Privacy under Construction: A Developmental Perspective on Privacy Perception. Science, Technology & Human Values 40, 4 (2015), 615–637.
29. Mónica Tentori, Jesús Favela, and Victor M. González. 2006. Quality of Privacy (QoP) for the Design of Ubiquitous Healthcare Applications. Journal of Universal Computer Science 12, 3 (2006), 252–269.
30. Samuel D. Warren and Louis D. Brandeis. 1890. The right to privacy. Harvard Law Review (1890), 193–220.
31. M. Williams, J. R. C. Nurse, and S. Creese. 2016. The Perfect Storm: The Privacy Paradox and the Internet-of-Things. In International Conference on Availability, Reliability and Security (ARES). 644–652.
32. Kok-Seng Wong and Myung Ho Kim. 2014. Towards self-awareness privacy protection for Internet of things data collection. Journal of Applied Mathematics (2014).
33. Heng Xu, Tamara Dinev, H. Jeff Smith, and Paul Hart. 2011. Information privacy concerns: Linking individual perceptions with institutional privacy assurances. Journal of the Association for Information Systems 12, 12 (2011), 798.
34. Eduardo A. Yamauchi, Patricia C. de Souza, and Deógenes P. S. Junior. 2016. Prominent Issues for Privacy Establishment in Privacy Policies of Mobile Apps. In Brazilian Symposium on Human Factors in Computer Systems (IHC '16). ACM, NY, USA, 26:1–26:9.
35. Mike Z. Yao, Ronald E. Rice, and Kier Wallis. 2007. Predicting user concerns about online privacy. Journal of the American Society for Information Science and Technology 58, 5 (2007), 710–722.
36. Anne Yau. 2007. Measuring Levels of Privacy Concern: Context and Trade-offs Between Competing Desires. Proc. of the Pacific Asia Conference on Information Systems (2007), 72.
37. Wei Zhou and Selwyn Piramuthu. 2015. Information Relevance Model of Customized Privacy for IoT.