Towards an accountable Internet of Things: A call for reviewability
"towards_an_accountable_IoT-Norval" — 2021/2/17 — 2:19 — page 1
Chapter 1
Towards an accountable Internet of Things
A call for ‘reviewability’
Chris Norval, Jennifer Cobbe, Jatinder Singh

Cite as: Norval, C., Cobbe, J. and Singh, J. (2021). Towards an accountable Internet of Things: A call for 'reviewability'. In Privacy by Design for the Internet of Things. The Institution of Engineering and Technology.
As the IoT becomes increasingly ubiquitous, concerns are being raised about how IoT systems are being built and deployed. Connected devices will generate vast quantities of data, which drive algorithmic systems and result in real-world consequences. Things will go wrong, and when they do, how do we identify what happened, why it happened, and who is responsible? Given the complexity of such systems, where do we even begin?

This chapter outlines aspects of accountability as they relate to the IoT, in the context of the increasingly interconnected and data-driven nature of such systems. Specifically, we argue the urgent need for mechanisms—legal, technical, and organisational—that facilitate the review of IoT systems. Such mechanisms work to support accountability, by enabling the relevant stakeholders to better understand, assess, interrogate and challenge the connected environments that increasingly pervade our world.
(Author affiliation: Compliant and Accountable Systems Group, Department of Computer Science & Technology, University of Cambridge, UK. Contact: fi[email protected])

Our physical environments are becoming increasingly interconnected, automated, and data-driven as grand visions of the Internet of Things (IoT) move closer to becoming realised. The concept of the IoT involves a range of devices interacting and collaborating to achieve particular goals [1]. We already see many such examples being deployed, including within homes and vehicles [2, 3, 4], and across towns and cities [5, 6, 2, 7, 8, 4]. The claimed benefits afforded by so-called 'smart' devices have led to considerable interest, with one estimate suggesting that the number of IoT devices could grow from 8 billion in 2019 to 41 billion by 2027 [9].

There is, however, more to the IoT than just these 'smart' devices. The IoT comprises a socio-technical ecosystem of components, systems, and organisations involving both technical and human elements. In these interconnected IoT environments data flow drives everything, in that the flow of data between, through, and across systems and organisations works to both integrate them and deliver the overarching functionality [10]. An IoT deployment will often include vast quantities of data coming from a range of local and external sources, including from sensor readings, online services, and user inputs, in addition to the data flows across technical components (at various levels of technical abstraction) that drive particular functionalities.
These data streams help to form wider algorithmic systems (which include degrees of automation, and may leverage the use of machine learning (ML) models), from which real-world consequences result.

These interactions introduce important socio-technical considerations regarding the wider contexts in which these IoT systems are built and deployed. As these ecosystems become ever more pervasive and complex, a range of devices, manufacturers, data sources, models, and online service providers may be involved in automated outcomes that can have impact at scale. Decisions made by people—vendors, developers, users, and others—about which systems and services to build, deploy and connect with, how these operate, their broader aims and incentives, and so forth, all serve to determine these data flows, with a direct impact on system functionality. Moreover, the complexity of these ecosystems means that systems can exhibit emergent properties and behaviours, acting in ways that are not explained by the features of their component parts when considered individually. This raises important questions for how these systems should be governed, and what happens when things inevitably go wrong.
Algorithmic and data-driven technologies have come under increasing scrutiny in recent years. Reports of high-profile data misuse have made such systems increasingly the subject of legal and regulatory attention [11, 12], leading to calls for more accountability in the socio-technical systems which surround us. Put simply, accountability involves apportioning responsibility for a particular occurrence and determining from and to whom any explanation for that occurrence is owed [13]. However, this can prove challenging in an IoT context.

Accountability is often discussed in relation to a single organisation, or in a systems context as discrete systems, where the actors (individuals, organisations, etc.) involved are pre-known or defined. However, the various components of an IoT ecosystem—including sensors, actuators, hubs, mobile devices, software tools, cloud services, and so on—typically do not operate discretely or in isolation. Rather, they are employed as part of a system-of-systems [14]. That is, these systems involve interactions with others, forming an assemblage of many socio-technical systems [15]. In this way, they are increasingly part of and reliant upon a data-driven supply chain of other (modular) systems, joined together to bring overarching functionality. We already see many such systems-of-systems which are composed of interconnected components – the popularity of cloud ([anything]-as-a-service), which provides underpinning and supporting functionality on demand, is a case in point [16].

To illustrate, a given ecosystem may involve a range of devices streaming data to cloud-based ML services to produce new data outputs (a classification) which drive an automated decision; the decision itself then forms a (data) input to another system. This, in turn, results in that system producing an output (an actuation command), and so on.
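To make the shape of such a chain concrete, the following sketch composes the stages described above in code. All component names, thresholds, and organisational roles here are hypothetical illustrations; the point is simply that each stage consumes the previous stage's output, so a fault introduced anywhere propagates downstream.

```python
# Hypothetical sketch of an IoT system-of-systems pipeline. Each function
# stands in for a service operated by a different (imagined) organisation,
# and each consumes the output of the stage before it.

def classify_reading(reading_celsius):
    """Stand-in for a cloud-based ML classification service."""
    return "overheating" if reading_celsius > 40.0 else "normal"

def decide_action(classification):
    """Stand-in for an automated decision system."""
    return "shut_down" if classification == "overheating" else "continue"

def actuate(decision):
    """Stand-in for a downstream actuation system."""
    return {"shut_down": "power: OFF", "continue": "power: ON"}[decision]

# A single sensor reading flows through the whole chain to a physical consequence.
reading = 47.5
command = actuate(decide_action(classify_reading(reading)))
print(command)  # a single faulty reading is enough to trigger actuation
```

In a real deployment each function would sit behind a different organisation's service boundary, which is precisely what makes tracing a faulty reading back to its source difficult.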
The IoT operates as part of a broader socio-technical ecosystem. As IoT deployments become increasingly automated and consequential, it becomes ever more important that they are designed with accountability in mind. In the event of failure or harm, a natural first step is to look at what went wrong and why. In determining legal compliance, it is important to understand how systems are and have been operating. Yet the complexity, opacity, interconnectedness, and modularity of these ecosystems pose particular challenges for accountability [17]. Complex data flows across legal, technical, and organisational boundaries can produce emergent behaviours and can affect systems in unpredictable ways [10, 14]. An unexpected behaviour or failure
in one system or device can propagate throughout the ecosystem, with potentially serious consequences.

There has been some consideration by the technical community of issues of accountability, though these tend to focus on particular technical methods, components, or systems (see §5). Considering accountability by focusing only on particular technical aspects or components of the IoT misses the bigger picture of their context, contingencies, dependencies, and of the organisational and human processes around their design, deployment, and use. We therefore argue here for a more holistic view of transparency and accountability in these complex socio-technical ecosystems. We term this approach reviewability. This encompasses a targeted form of transparency involving technical and organisational logging and record-keeping mechanisms. The purpose is to expose the information necessary to review and assess the functioning and legal compliance of socio-technical systems in a meaningful way and to support accountability (and redress, where appropriate).
As we have discussed, an IoT ecosystem is driven by data flows and may involve data moving between any number of different components and entities. As data passes from one entity or component to another (which can involve crossing a technical or organisational boundary), the data flows that drive these interconnected systems will often become invisible or opaque [14, 16]. For example, it may not be clear to a developer how an external AI as a Service (AIaaS) model actually works or how reliable its predictions are. Similarly, it may not always be apparent where data provided by online services is coming from, or how sensors are calculating and pre-processing data before feeding it to the next component. This loss of contextual integrity [19] risks the inadvertent misuse of that data, perhaps losing crucial information regarding how it should be used, information about biases or sampling errors, or other potential problems that could continue to propagate throughout the wider ecosystem.

Though there are methods for logging and ongoing research efforts on 'explaining' technical components, their benefit may be limited in terms of improving accountability in an IoT context (§5). As we will discuss, there is a need for tracking data as it flows between systems and across boundaries, given the interconnected nature of the IoT [20]. Moreover, even where records of system behaviour (including data flow) are collected, the usability and utility of this information remains a recognised challenge [21, 22].

Having multiple components involved in a given deployment also increases the number of points of possible failure and the potential for unpredictable emergent properties and behaviour. This increases the difficulty of determining where issues are occurring. As data flows through an ecosystem, decisions and actions taken by one component might propagate widely, making it difficult to trace the source (organisational or technical) and all the consequential (flow-on) effects of an issue.
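One way to mitigate this loss of context, sketched below under assumed field names, is to wrap data with metadata about its origin, intended purpose, and known caveats before it crosses a boundary, so that the receiving component at least has the information needed to judge whether reuse is appropriate.

```python
# Illustrative sketch (hypothetical field names): wrapping a data value with
# contextual metadata before it crosses an organisational boundary, so that
# information about origin, purpose and caveats is not lost in transit.

def wrap_for_transfer(value, source, purpose, caveats):
    """Attach contextual metadata that would otherwise be lost in transit."""
    return {
        "value": value,
        "context": {
            "source": source,      # where the data came from
            "purpose": purpose,    # what it may be used for
            "caveats": caveats,    # known biases, sampling issues, etc.
        },
    }

packet = wrap_for_transfer(
    value=21.3,
    source="lobby-temp-sensor",
    purpose="climate-control",
    caveats=["sensor averages over 5 min", "accuracy within 0.5 degrees"],
)

# A receiving component can check the stated purpose before reusing the data.
assert packet["context"]["purpose"] == "climate-control"
```

This mirrors, in miniature, the idea of tracking data as it flows across boundaries [20]; real systems would need agreed schemas and tamper-resistant records rather than plain dictionaries.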
(Footnote: 'AI as a Service' refers to a machine learning model offered as a service, which customers can apply to their data on demand, typically as a request-response interaction. Example services include those for detecting objects, recognising faces, converting text-to-speech, etc. [18].)
We argue for the reviewability of IoT systems, components, and organisational processes – providing targeted transparency and accountability mechanisms that allow for a holistic view to be taken of the socio-technical ecosystem as a whole. Auditors and others assessing systems can then 'hone in' on specific systems, components, organisations and their practices where further investigation is required.
In view of these challenges to accountability in complex IoT ecosystems, we arguefor “reviewability” as an approach for shedding light on socio-technical processes.
Reviewability
Reviewability involves systematically implementing comprehensive technical and organisational transparency mechanisms that allow the design, deployment, and functioning of socio-technical systems and processes to be reviewed as a whole [24]. Reviewable systems, therefore, are those that are designed and operate in such a way as to record and expose (through means such as record-keeping and logging) the contextually appropriate information necessary to allow technical systems and organisational processes to be comprehensively interrogated and assessed (for legal compliance, for appropriate functioning, and so on). This targeted, holistic form of transparency supports meaningful accountability of systems, processes, and organisations to whomever is relevant in a given context (auditors, investigators, regulators, users, other developers, and so on).
From a legal point of view, accountability is often tied to notions of responsibility, liability, and transparency – indeed, transparency is often a regulatory requirement for identifying responsibility or liability and thus facilitating accountability, as we will discuss in §3. However, from a legal perspective, 'transparency' does not necessarily mean full transparency over the internal workings of a system, nor is it limited only to technical components and processes. Instead, the transparency required in law often involves information about the entities involved, high-level information about what is happening with data or about what systems are doing (rather than necessarily the specifics of how they function), and information about the risks of using those systems [17].

In other words, legal mechanisms to provide accountability depend upon the ability of stakeholders to meaningfully review technical and organisational systems and processes—partially or as a whole—in order to determine which person or organisation is responsible for a particular system, device, or process, its effects, and from (and to) whom an explanation or redress is owed [17]. This may or may not involve exploring the inner workings of particular technologies or algorithms, as tends to be the focus of the technical research community. It could equally involve examining the broader socio-technical processes in and around a technical system itself. Indeed, there is debate about the degree to which exposing the details of code and algorithmic models actually helps with accountability [25, 24].
Therefore, instead of exposing the inner workings of technical systems and devices, achieving reviewability of IoT ecosystems requires (i) technical and organisational mechanisms for making transparent the connections and data flows across these entire ecosystems, as well as (ii) information relating to decisions made in design, deployment, operation, and during investigations, so as to indicate the context in which they are operating, their effects, and the entities involved (of which there may be a number). To support meaningful accountability, the information recorded about these systems and processes should be contextually appropriate [24] – that is, information which is relevant to the kinds of accountability involved; accurate, in that it is correct, complete, and representative; proportionate to the level of transparency required; and comprehensible by those to whom an account will likely be owed.

In practice, the information provided by reviewable systems will include that indicating who is involved, the nature of their role, how data is being processed, and how data flows between systems for each component in the IoT ecosystem. In an IoT context, reviewability could mean device vendors and service providers keeping records of, for example, the systems they are connecting to and interacting with (in terms of input and output), what choices were made about which kinds of data to process and how it should be processed, information about assessments for security, data protection, and other legal obligations, and information about the design, training, and testing of models (where applicable). At its most basic, this kind of information may indicate that Organisation B used an online service (e.g. MLaaS) provided by Organisation C, and that the response from that prediction was then fed into a service provided by Organisation D.
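Such high-level flow records might, for illustration, look like the following sketch (the organisation names and record fields are hypothetical). Even these minimal entries support tracing which entities sit upstream of a misbehaving component:

```python
# Hypothetical sketch of 'high-level' flow records: each entry captures only
# which organisation sent what kind of data to which service, not the data.
flow_records = [
    {"from": "Org B", "to": "Org C", "what": "image", "via": "MLaaS predict"},
    {"from": "Org C", "to": "Org B", "what": "classification", "via": "MLaaS response"},
    {"from": "Org B", "to": "Org D", "what": "classification", "via": "actuation service"},
]

def upstream_of(entity, records):
    """Trace which entities supplied data (directly or indirectly) to `entity`."""
    sources, frontier = set(), {entity}
    while frontier:
        nxt = {r["from"] for r in records if r["to"] in frontier} - sources
        sources |= nxt
        frontier = nxt
    return sources

# If Org D's service misbehaves, the records show where to look upstream.
print(sorted(upstream_of("Org D", flow_records)))  # -> ['Org B', 'Org C']
```

A real deployment would need such records kept by each party and reconciled across organisational boundaries; the sketch only shows why even coarse records narrow an investigation.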
But an analysis of even this 'high-level' information could also provide an indication of where the source (either technical or otherwise) of a particular issue may lie. In this sense, such information can make it easier to identify where further investigation should be directed.
The complex, interconnected and data-driven nature of the IoT ecosystem poses various challenges for legal accountability [26, 17]. Broadly speaking, these challenges come from two sources: (1) the lack of visibility over data flows in interconnected environments, which makes it difficult to know where and with whom in a system a problem originates; and (2) the lack of visibility over the technical and organisational systems and processes of various entities within that ecosystem, which makes it difficult to assess their compliance with legal requirements and obligations.

From a legal point of view, each entity in an IoT ecosystem could potentially be accountable to various others for the functioning of their systems and devices – to customers and users, the vendors of other IoT devices and systems, to regulators and other oversight bodies, to law enforcement agencies, and to courts and dispute resolution bodies, among others. For each of these, providing 'transparency' as a general concept is unlikely to provide much benefit, nor is opening up code or the internal workings of technical systems likely to assist in every case (although it will, of course, in some [27]) [28]. Rather, the information necessary to meet accountability and disclosure requirements and obligations established in law is more likely to relate to technical and organisational processes, to data flows given the nature of the IoT, and to understanding what happened, when, why, and with what effect [17].

For example, data protection law, such as the EU's General Data Protection Regulation (GDPR) [29], establishes accountability mechanisms and requirements for various actors involved in processing personal data (defined broadly in GDPR to include any data from which an individual can be identified, whether directly
or indirectly, either from that data alone or when combined with other data). IoT ecosystems will often involve a flow of personal data, and processing means any operation or set of operations performed on that data. Under GDPR, certain information is required to be provided by data controllers (the entities that determine the purposes and means of processing) to data subjects (the individuals to whom personal data relates) about the processing of their personal data. Data processors (who process personal data on behalf and under the instruction of data controllers) are obliged to facilitate auditing of their processing by the relevant data controller. Data controllers can be obliged to provide information about processing to, and facilitate auditing by, supervisory authorities (national data protection regulators). Controllers will also need to know where personal data has been obtained from, under what circumstances, and on what conditions.

Beyond data protection law, accountability requirements could also arise from a variety of other sources [26] – product safety law, for instance, could require device vendors to supply customers and oversight bodies with information. The adjudication and enforcement of contractual disputes could require disclosure of information about technical and organisational processes to counterparties and to courts. Criminal investigations could require the provision of information to law enforcement agencies.

In each case, approaching transparency through the lens of reviewability could assist in meeting these obligations, and help communicate legally relevant, contextually appropriate information to customers, users, technical partners, as well as the appropriate legal, regulatory, and other oversight authorities.
Although reviewability could assist across various legal frameworks, we now explore three general ways in particular where a reviewability approach to accountability would be legally beneficial: compliance and obligation management; oversight and regulatory audit; and liability and legal investigation. It is worth noting that while we have delineated these benefits, they are interrelated. As much of the IoT ecosystem will involve personal data, we use the transparency and accountability requirements and obligations established in GDPR as examples throughout.

As alluded to above, a reviewability approach to accountability—such as by using record-keeping, logging or other information capture mechanisms for technical and organisational processes—can assist those responsible for socio-technical ecosystems such as the IoT to comply with their legal and regulatory (and other) obligations [30, 20].

Where IoT ecosystems process personal data (perhaps, for example, containing names, user profiles/accounts, photographs or video of people, voice recordings, or any other information that can be linked to an individual), and therefore come within the scope of GDPR. (Definitions above: GDPR, Art. 4(1) and recital 26; Arts. 4(2), 4(7), 4(8).)
In this context, reviewability would generally not involve recording the personal data itself—a potentially significant data protection issue and privacy violation—but metadata about organisational processes and technical processing. Organisations found to be in breach of the GDPR can face significant penalties, including warnings, fines of up to the greater of €
20m or 4% of annual global turnover, and bans on processing (GDPR, Art. 83).

In the first instance, data controllers are obliged to take technical and organisational measures to ensure, and be able to demonstrate, compliance with GDPR (Art. 24). Record-keeping and logging would assist data controllers with implementing GDPR's data protection principles (Art. 5) and with showing that they have taken steps to do so. For instance, the 'purpose limitation' principle (Art. 5(1)(b)) requires that personal data be collected for specific purposes and only processed in a way compatible with those purposes. The keeping of records, logs and other relevant information would provide controllers with relevant information on the purposes for which personal data was collected, and helps ensure that the data is only processed accordingly. Implementing such measures would also assist with fulfilling data subject rights, such as those to obtain copies of the personal data being processed (Art. 15) and, in some circumstances, to require its deletion (Art. 17). Moreover, maintaining records of technical and organisational processes around the design and deployment of systems would help controllers demonstrate to supervisory authorities that processing is in fact taking place in line with GDPR's requirements and that rights have been fulfilled.

In advance of processing, data controllers will in many situations need to undertake a Data Protection Impact Assessment (DPIA) (Art. 35). This consists of a broad assessment of the risks posed by the proposed processing to the rights and freedoms of data subjects and of the steps taken by the controller to mitigate those risks. Where the DPIA indicates a high risk, controllers are obliged to consult with the supervisory authority before proceeding and to provide them with the information contained in the DPIA (Art. 36).
Reviewability can greatly assist with undertaking the kind of holistic analysis of processing and its risks required for DPIAs, with demonstrating how and whether any mitigating measures have actually been implemented, and with assessing on an ongoing basis whether processing is in fact being undertaken in line with the DPIA.
It is worth noting that controllers and processors are also already obliged in many circumstances to keep records of processing (GDPR, Art. 30). Controllers should record, among other things, the purposes of processing, the categories of data subjects and of personal data, the categories of those to whom data will be disclosed, and a general description of technical and organisational security measures. Processors should record, among other things, the name and contact details of processors, the categories of processing, and a general description of their technical and organisational security measures. Through comprehensive logging and record-keeping of technical and organisational systems and processes, reviewability can help controllers and processors fulfil these obligations and others.

Reviewability may also be of use in assisting compliance with future regulations; for example, the proposed ePrivacy Regulation [31], which establishes some similar requirements on the use of some non-personal data. It can similarly assist in managing a broader range of legal, regulatory, and other obligations ('soft law', codes of practice, and so on). For instance, where contractual obligations exist between entities, knowledge of their respective technical and organisational processes and of the nature of data flow between parties could make it possible to ensure and demonstrate that data is processed in a way compatible with the contract governing that data flow or processing relationship. And information on the selection of datasets and on the sources and lineage of data used for analytics and ML might assist with issues of unfairness and discrimination [32].
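As an illustration only (not a compliance template, and with hypothetical organisation names), a controller's record-of-processing entry covering the kinds of fields listed above might be structured as follows, with a simple check for missing fields:

```python
# Illustrative sketch: a record-of-processing entry holding the kinds of
# fields the text lists for controllers. Names and values are hypothetical.
record_of_processing = {
    "controller": "ExampleCo Ltd",
    "purposes": ["smart-building climate control"],
    "categories_of_data_subjects": ["building occupants"],
    "categories_of_personal_data": ["badge ID", "room occupancy"],
    "recipients": ["HVAC-Cloud Inc (processor)"],
    "security_measures": "data encrypted in transit and at rest; access logged",
}

def missing_fields(record, required):
    """Flag required record fields that are absent or empty."""
    return [f for f in required if not record.get(f)]

required = ["purposes", "categories_of_data_subjects",
            "categories_of_personal_data", "recipients", "security_measures"]
print(missing_fields(record_of_processing, required))  # -> []
```

Keeping such records in machine-checkable form is one way the record-keeping obligations described above could feed directly into automated review.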
Reviewability also has much potential to aid the auditing and oversight activities of regulators, as reviewability entails generating detailed information for assisting oversight that may otherwise be unavailable. As noted previously, data controllers and processors are obliged to maintain extensive records of processing (GDPR, Art. 30). Data controllers are also obliged to undertake various assessments (in relation to data protection by design, for example, or for implementing security measures, carrying out DPIAs, and so on), to use only processors who can comply with GDPR, and to consult with supervisory authorities in certain circumstances. Supervisory authorities are themselves empowered to audit data controllers, to inspect their systems, to access their records, and to require from controllers any other information necessary to perform the supervisory authority's tasks, including to investigate controllers' compliance with GDPR (Arts. 57-58).

Engineering socio-technical systems to be reviewable through mechanisms for record-keeping, logging and otherwise capturing information about technical and organisational processes would therefore facilitate audit and oversight by supervisory authorities. Indeed, many of the potential legal benefits of reviewability discussed previously would also apply to regulatory oversight more generally. Moreover, recording information about data flows would allow data protection regulators to use this information to assess whether IoT ecosystems in practice match what has been described in the records provided.
IoT systems have real-world impacts, including direct physical-world consequences given that actuators can feature throughout. This means that questions of civil liability or criminal responsibility could arise where harm or injury is caused [26, 17]. Reviewability could therefore be particularly helpful in assessing liability and undertaking legal or criminal investigations. Where harm is caused by a system failure, comprehensive records and logs would allow the circumstances, causal behaviours and factors, and consequences of the failure to be reviewed and determined, and liability established. In complex, interconnected environments, records and logs about data flows to, from, and between devices and systems could also help identify which system caused that harm. This thereby helps to identify the component (and entity) responsible and potentially liable for that harm, and can assist in holding them to account. This kind of reviewability information may even help absolve system designers, operators or those otherwise involved from responsibility, by providing evidence demonstrating that the right decisions were made and actions were taken. For example, controllers and processors are exempt from liability for damage caused by infringements of GDPR if they can prove that they were not responsible for the infringement (GDPR, Art. 82(3)) – such as where a controller can demonstrate that their processor acted unlawfully, for instance.

As such, the information provided by implementing reviewability is useful for multiple actors in interconnected IoT ecosystems – to vendors, in helping them manage their liability exposure; to users, in taking action and obtaining redress where harm has been caused by a product; to courts, in adjudicating claims in tort or contract; as well as to regulators and law enforcement as they investigate potential regulatory or criminal violations.
Vendors and others could contractually require this kind of record-keeping and logging (of technical and organisational processes and data flows, both within organisations and across boundaries). This could give them a mechanism for monitoring how data is used, potentially in real-time, and enable action where data is being used in a way that is inappropriate or prohibited by law or by contract.
In an IoT context, there are clear benefits of reviewability in addition to the legal benefits just discussed. In §2, we described how the complex, interconnected, data-driven, and multi-stakeholder nature of the IoT can result in transparency challenges at the technical level, making it difficult to determine how systems are designed, deployed,
and functioning. The opacity of these complex systems-of-systems arrangements means that the information necessary to properly review their functioning may not be available, which can significantly hinder accountability processes.

Information about the organisations, components, and data flows driving IoT ecosystems therefore has an important role to play in supporting review processes. That is, certain details and related information about IoT systems and components and their design, deployment, and operation—which includes workflows, business processes, and other organisational practices—will often be necessary for enabling proper oversight, audit, interrogation, and inspection. Moreover, the ongoing review of these socio-technical systems is important. This is because the complexity, interconnectedness, and 'long-lived' nature of IoT ecosystems makes it more likely that they will exhibit emergent properties and behaviours, potentially giving rise to unforeseen or unforeseeable problems.

In short, information about the design, deployment, and functioning of the socio-technical systems that comprise the IoT facilitates their review, including for whether they are designed and behaving appropriately, and for issues to be unpacked and consequences dealt with when they arise. This brings additional benefits, beyond the legal aspects described in §3, for a range of actors in an IoT ecosystem.
Implementing technical and organisational measures for logging, record-keeping or otherwise capturing information about systems increases their reviewability. This brings a range of benefits for technologists, who have a clear interest in overseeing the systems they build and operate. This is because details on how systems are designed, deployed, function, behave, interact, operate, and so forth provide information relevant for testing, monitoring, maintaining and improving the quality of those systems, while helping technologists to meet and manage their responsibilities and obligations.

Specifically, greater visibility over the technical and organisational aspects of systems, and their interconnections with others, assists technologists with investigations, enhancing their ability to review errors and system failures when they (inevitably) occur. That is, such information helps support processes of repair and debugging. Moreover, information about run-time behaviour also enables more proactive steps to be taken, where details of system operation can allow the identification and mitigation of certain issues in advance of them becoming problematic; for example, by enabling the identification of abnormal deviations in system behaviour, or through run-time reaction mechanisms, alerts, or automatic countermeasures that are triggered where unexpected events or interactions occur. This is particularly important given that, as discussed previously, IoT systems can exhibit emergent properties and behaviours. Further, in the IoT, components have the potential to be used or reused in ways that were not envisaged by the original designers – a temperature sensor in a building designed for climate control could suddenly be used to influence health wearables, for instance.
It follows that the ability for technologists to undertake ongoing review will be crucial for ensuring appropriate system behaviour, while providing insight into (any) emergent properties.
As we have described, the lack of technical information in an IoT context has the propensity to hinder investigation. While reviewability promises benefits for legal and regulatory oversight bodies (see §3.2), information about technical and organisational systems can also assist the activities of other overseers, such as civil society organisations, trade-sector groups, or those internal to a technology firm. This is by giving (i) information on the nature of systems and their behaviour; and (ii) insight into the practices of their designers and operators.

In this way, information on systems helps accountability processes, by providing evidence or lines for investigation that can assist overseers in determining whether the technology-related actions and measures taken were appropriate (and thereby building the foundations for recourse where they are not). This can include information about design and development, as well as run-time system behaviour, and details of the reviews and audits undertaken. In addition to supporting (ex-post) investigation, technical processes that produce streams of data on system operation also pave the way for more active and responsive oversight regimes, whereby certain system behaviours or events could trigger alerts, warnings, or otherwise bring certain information to an overseer's attention, and, with timely action, may serve to mitigate the potential severity of outcomes.
The information derived from IoT systems can potentially also assist end-users by helping them to make better-informed decisions about whether or how they interact with a given deployment. Although overloading users with information and options is not a desirable or useful outcome [33], there may be opportunities for a greater degree of user empowerment. If a user could meaningfully and reliably know in advance that engaging with (having their data flow into) a particular system could, for example, (i) result in certain decisions being made that may have particular undesired consequences; or (ii) involve data flowing to an undesired entity (such as a certain advertisement network), then the user could take reasonable steps to avoid engaging with such a system [14]. Proactive measures are again possible, such as where user policies might work to constrain particular information flows [20], or to keep users informed of any change in circumstance, thereby enabling them (manually or perhaps automatically) to respond. One can imagine, for instance, users being automatically told of changes in policy or system operation – for example, where the records of data flows indicate an online service suddenly engaging a new advertising network.
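As a sketch of how such a user policy might constrain outbound flows, consider the following. The recipient names and categories are purely hypothetical; the point is that a previously unseen recipient (such as a newly engaged advertising network) is surfaced for user review rather than silently permitted:

```python
# Hypothetical user policy for outbound data flows from an IoT deployment.
KNOWN_RECIPIENTS = {"home-hub.local", "weather-api.example.com"}
BLOCKED_CATEGORIES = {"advertising"}

def check_flow(destination: str, category: str) -> tuple:
    """Return (allowed, reason) for a proposed outbound data flow."""
    if category in BLOCKED_CATEGORIES:
        return (False, f"policy blocks '{category}' flows")
    if destination not in KNOWN_RECIPIENTS:
        # An unrecognised recipient is surfaced to the user rather than
        # silently allowed.
        return (False, f"unrecognised recipient '{destination}': needs user review")
    return (True, "permitted by policy")
```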
Technology has a key role to play in supporting reviewability, as it provides the means to capture or otherwise produce information or records about systems, both about their design and during operation. As we have argued, reviews are assisted through mechanisms that can enable more (relevant) transparency over systems.
Though the idea of capturing information about systems is nothing new, much of the focus so far has been on the design and engineering phases. Logging and debugging frameworks, for example, are generally aimed at a technical audience, particularly those developing or administering systems. These tend to focus narrowly on particular technical specifics. There has been less consideration of recording information (i) to support broader accountability regimes (which, we argue, reviewability enables), and (ii) in a manner appropriate for the broad vision of the IoT.

That said, accountability is an area of increasing attention in the technical research community (e.g. see [21, 17]). 'Accountability' is a term that has gained traction, particularly in the machine learning community, where the focus was predominantly on the 'explainability' of machine learning models and decisions. While such work is important—perhaps particularly for machine learning (engineering/testing) processes [34]—a model doesn't operate in isolation, but rather exists as part of a broader socio-technical system [35, 24]. This broader aspect requires consideration, and work in the community is beginning to expand and recognise this [24]. One example is 'datasheets for datasets' [36], which involves capturing metadata that describes the nature of datasets, including how the data was collated, pre-processed, the legal basis for its processing (e.g. consent), etc. Similarly, 'model cards' [37] record various information about trained models, such as how they perform across a variety of different conditions and/or demographic groups, alongside the context in which they are intended to be used. The idea is to provide more information: for instance, to enable a fuller interrogation of model design (e.g. how it was trained), to help inform engineers and investigators as to whether the dataset or model is (or was) appropriate for a particular use, to give information about usage restrictions (e.g. consent), and so on.

Considering the broader systems context is important in the IoT. IoT ecosystems are complex, interconnected, dynamic, multi-actor, socio-technical environments, which are long-lived and can potentially elicit emergent behaviours – perhaps those not designed for or previously envisaged. As we have discussed, greater visibility over the interconnections and assemblages of systems is important for increasing accountability for them. Therefore, a natural starting point for technology aiming to assist reviewability is that which helps uncover, or 'map out', the (ongoing) nature of IoT ecosystems. This requires mechanisms for capturing information about the components, systems, and interactions (data flows) that comprise IoT systems.

Particularly important in an IoT context are the ongoing interactions (data exchanges) with other technical components, as these provide a foundation for describing both (i) system behaviour (the 'how' and 'why') and (ii) the actors involved (the 'who'). However, mapping this represents a particular challenge: while the IoT is data-driven, visibility over the movement of data currently tends to be limited to particular scopes; it is often difficult to trace what happens to data once it moves across a technical (e.g. between/across different software or devices) or administrative (to another organisation, department, etc.) boundary.

Towards such a mapping, provenance mechanisms show real potential [38, 23, 30, 20, 39].
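One minimal way to 'map out' such an ecosystem is to record data flows as a directed graph of components, with each flow annotated with the operator responsible; a traversal then reveals every downstream recipient of a component's data, including across administrative boundaries. The component and operator names below are purely illustrative:

```python
from collections import defaultdict

# A directed graph of data flows between components, each edge annotated
# with the operator ('who') responsible for the receiving component.
flows = defaultdict(list)

def record_flow(source, dest, operator):
    flows[source].append((dest, operator))

def downstream(component):
    """All components (and their operators) reachable from `component`."""
    seen, stack, reached = set(), [component], []
    while stack:
        for dest, operator in flows[stack.pop()]:
            if dest not in seen:
                seen.add(dest)
                reached.append((dest, operator))
                stack.append(dest)
    return reached

record_flow("temperature-sensor", "building-hub", operator="FacilitiesCo")
record_flow("building-hub", "climate-control", operator="FacilitiesCo")
record_flow("building-hub", "health-wearable-service", operator="WearableCorp")
# downstream("temperature-sensor") reveals the wearable service as a
# (perhaps unanticipated) recipient of the sensor's data.
```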
Provenance concerns capturing information describing data: it can involve recording the data's lineage, including where it came from and where it moves to.
To reiterate, the IoT is data-driven, where functionality is brought about through the interactions—the exchanges of data—between system components. As such, provenance methods can be useful for reviewability as they entail taking a 'follow the data' approach, whereby information regarding data flow in the IoT can indicate how systems behave, what led (or leads) to particular occurrences, as well as the components (and therefore the actors) involved [20, 23]. Data regarding the information flows driving systems also provides a foundation for new accountability-related tools, methods, and analysis [17].

In line with this, we have developed the concept of decision provenance [14], which can help in mapping out complex ecosystems. We argue this has real potential in the IoT space, accounting for its interconnectedness and the role of automation (including the use of ML) in such environments. It works by recording what (and how) data moves from one system (device, model, service, etc.) to another. We now present an overview of decision provenance to indicate the potential for technical methods that can assist review, and support accountability regimes more generally.
Provenance is an active area of research [41], and is commonly applied in a research context to assist reproducibility by recording the data, workflows and computation of scientific processes [41, 42]. The potential for provenance to assist with specific information management (compliance) obligations has previously been considered [38, 30, 39, 20, 23], though these efforts often focus on a particular technical aspect, be it representation or capture. However, the use of data provenance methods for general systems-related accountability concerns represents an emerging area warranting further consideration.

Decision provenance is a response to (i) the concerns around accountability for automated, algorithmic, and data-driven systems (which naturally comprise the IoT), (ii) the increasingly interconnected nature of technical deployments, where functionality is driven through data exchanges, (iii) the current lack of visibility over data as it moves across technical and administrative boundaries, and (iv) the potential of provenance methods in this space.

Specifically, decision provenance concerns recording information on the data flowing throughout a system, as relevant for accountability concerns. It involves capturing details and metadata relating to data, and the system components through which data moves. This can include how data was processed and used, who the data comes from or goes to (by virtue of the components involved), and other appropriate contextual information, such as system configurations, business processes, workflow state, etc. See [40, 41] for more details on data provenance.
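To make this concrete, a provenance record might minimally capture an entity (the data item), the activity that produced it, the responsible agent, and the inputs used. The sketch below is loosely inspired by the W3C PROV data model, but the field names and example values are our own illustrative assumptions (dedicated provenance libraries and serialisations exist for production use):

```python
import time
import uuid
from dataclasses import dataclass, field

@dataclass
class ProvenanceRecord:
    entity: str                  # the data item produced, e.g. a reading or aggregate
    activity: str                # what happened, e.g. "sensed", "aggregated"
    agent: str                   # the component/organisation responsible
    used: list = field(default_factory=list)        # input entities
    timestamp: float = field(default_factory=time.time)
    record_id: str = field(default_factory=lambda: str(uuid.uuid4()))

# A (hypothetical) hub aggregating two sensor readings into an average:
log = [
    ProvenanceRecord("reading-141", "sensed", "temperature-sensor"),
    ProvenanceRecord("reading-142", "sensed", "temperature-sensor"),
    ProvenanceRecord("avg-temp-07", "aggregated", "building-hub",
                     used=["reading-141", "reading-142"]),
]
```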
Decision Provenance
Decision provenance concerns the use of provenance mechanisms to assist accountability considerations in algorithmic systems. Specifically, decision provenance involves (i) providing information on the nature, contexts and processing of the data flows and interconnections leading up to a decision or action, and the flow-on effects; and (ii) how such information can be leveraged for better system design, inspection, validation and operational (run-time) behaviour. In this way, decision provenance helps expose decision pipelines, in order to make visible the nature of the inputs to, and cascading consequences of, any decision or action (at design or run-time), alongside the entities involved, systems-wide.

The broad aim is to help increase levels of accountability, both by providing the information and evidence for investigation, questioning, and recourse, and by providing information which can be used to proactively take steps towards reducing and mitigating risks and concerns, and facilitating legal compliance and user empowerment. For more details on decision provenance, see [14].
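A decision pipeline of this kind can be exposed by walking such records backwards from a decision to every input that contributed to it. The sketch below assumes a simple 'derived from' mapping between entities; all names are hypothetical:

```python
# Sketch: exposing a 'decision pipeline' by walking provenance records
# backwards from a decision to its contributing inputs.
derived_from = {
    "dim-lights-decision": ["busyness-estimate"],
    "busyness-estimate": ["mapping-feed"],
    "mapping-feed": ["phone-gps-data", "vehicle-gps-data"],
}

def pipeline(decision):
    """Return every upstream input that fed into `decision`."""
    inputs, stack = [], list(derived_from.get(decision, []))
    while stack:
        item = stack.pop()
        inputs.append(item)
        stack.extend(derived_from.get(item, []))
    return inputs
```

The same mapping, traversed forwards, would expose the cascading consequences of a given input.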
Figure 1: An example IoT application, whereby sensor readings work to trigger an actuation in a smart home, with some data feeding to a city council's planning service. Even a simple scenario involves data flows across a range of technical and organisational boundaries, as each arrow indicates. Decision provenance works to capture information about these data flows, thereby supporting review.
Decision provenance is predicated on the idea that it is the flow of data—the interactions between components in a system—that drives system functionality. Its name is in recognition that the actions or decisions taken in a systems context, be they manual (e.g. initiated by users) or automated (e.g. resulting from the application of an ML model), entail a series of steps leading up to a particular happening, as well as the cascading consequences. All of these steps entail data flow (at some level) about which information can be recorded. Indeed, this characterisation directly accords with the grand visions of the IoT.

The purpose of decision provenance is to expose decision pipelines, by providing records regarding the happenings leading up to a particular decision/action, and the cascading consequences. This can be at design time, e.g. by capturing information about a machine learning process, such as the data sources comprising a dataset for model training, or at run-time, e.g. capturing information about the data flowing to particular entities, and the decisions being made by algorithmic components.
Decision provenance will not by itself solve accountability challenges (nor will any technical measure). However, it does show potential in supporting review, by exposing and enabling the ongoing monitoring of the relations, interactions and dependencies of such systems, where such visibility may not otherwise exist.
We have outlined how information on the nature of systems is important for supporting review processes, with provenance representing but one way forward for capturing such information in an IoT context. Given that accountability is a nascent area of focus for the technical community, there are a number of research challenges and opportunities for tools and methods that support accountability regimes [17, 14].

One key consideration is performance and scalability. Record keeping has the potential to be onerous, not least as there may be a range of actors interested in reviewing such information, meaning that a wealth of technical detail will often be required to satisfy the aims of the variety of reviews that can potentially be undertaken. It is therefore important to consider carefully what and how much information is likely to actually be needed to provide meaningful accounts of system behaviour (the 'relevance' and 'proportionate' dimensions of contextually appropriate information). Here, considerations include the overheads imposed on devices, especially given the heterogeneous nature of components and their capabilities in the IoT space. Also crucial for deployment are architectural considerations, including policy
regimes that can flexibly determine what, when, where, and how data is captured, stored, and transmitted. Similarly, mechanisms are needed that are capable of working reliably and efficiently across both technical and administrative boundaries, and are able to aggregate or resolve data from across these.

Further, for a review to be meaningful, it must be based on complete and reliable information (the 'accuracy' dimension of contextually appropriate information). This can be a challenge in the IoT context, where components may be operated by different entities, with different aims. This means the information relevant for review might come from various sources, be they different technical components, potentially operating at different levels of technical abstraction, by different entities, and may require supplementation with non-technical (out-of-band) detail. Mechanisms for resolving and aligning captured information are an area requiring consideration. Similarly, the risks and incentives in an accountability context are complex, meaning that the records themselves can pose an organisational burden – given that the information might relate to an organisation's responsibilities, and has the potential to be used against them (as part of a recourse mechanism). Means for ensuring the integrity and validity of not only the technical information provided, but also the mechanisms that capture, manage and present that information, are therefore important considerations.

Relatedly, capturing information about systems is essentially a form of surveillance. This bears consideration, as the information captured might be inherently sensitive, be it personal data, representing confidential business processes, or generally requiring a stringent management regime.
Mechanisms that help ensure that only the appropriate parties are able to view the right information in the right circumstances are an important area for consideration and research.

Usability is another key consideration; as we have made clear, for a review to be effective, information about technical systems must be meaningful for reviewers (the 'comprehensible' dimension of contextually appropriate information). However, one challenge in this space is dealing with the complexity of the technical information (systems provenance information, for example, can quickly become extremely complex [43, 42, 44, 45]), as well as how detailed technical information can be made accessible and understandable. Tools that facilitate the interrogation and representation of technical details will be crucial for facilitating meaningful review in an IoT context, though further work is required to explore the presentation of technical details for accountability purposes.

As such, there are real opportunities for human-computer interaction (HCI) research methods to be employed and extended to assist with the usability and interpretability of audit information. Through such means, the tools and techniques available can better support the aims and literacies of the various actors (technical experts, regulators, users, etc.) in conducting a review. Though there has been some consideration of this in certain contexts (e.g. in some specific provenance cases [43, 46, 44, 47, 48, 22]), there appears real scope for more general explorations into such concerns. The development of standards and ontologies (e.g. [49] describes an ontology for the GDPR) could assist with both the management and interpretation of information for review.
We now use the example of a smart city, a common IoT scenario, to illustrate at a high level how methods supporting reviewability relate to broader accountability concerns.

Smart cities aim to make "better use of the public resources, increasing the quality of the services offered to the citizens" [50]. The IoT is often seen as integral to achieving such functionality [51], with commonly considered examples including the automation of home security [2], traffic rerouting [7, 8, 4], vehicle control [3, 4], and disaster management [5, 6, 2]. Such deployments may involve data and interactions from a wide range of sensors (infrastructural and/or mobile), devices and actuators, online services (cloud storage, MLaaS), etc., which may be personally, privately, or publicly operated. Naturally, a large number of actors may be involved, exchanging and processing data in order to achieve some overarching functionality.

Importantly, while the boundaries of an IoT application may appear clear and well-defined (at least to those that operate them), in practice they will often interact (communicate and exchange data) with other systems – either to provide the expected functionality or perhaps to be used and reused in ways beyond what may have been originally intended. For example, information may be collected independently by home, vehicle, and mobile sensors, which might interact with deployments within public and private spaces (e.g. for automation, security, insurance, billing), which may in turn feed data into broader algorithmic systems and services (energy grid, emergency services, public transport, etc.), all of which may result in decisions and actions with real-world consequences.

In these complex systems arrangements, any number of things could potentially go wrong; a data breach could see the release of personal data, automated actions could be erroneously taken, inferences could be made on incorrect information, and so on.
Yet, given the ability for such issues to cascade throughout an IoT ecosystem, the true extent of the impact may not always be clear. As we have argued, the ability to review such systems (and their role at each step in the process) may be instrumental in identifying who was involved, how such incidents occurred, and any knock-on effects of the issues. We now explore how such an investigation might take place, and how the ability to review IoT ecosystems can be leveraged to provide legal and operational benefits alike.
Consider a traffic incident occurring in a smart city. While walking to a sporting event, a pedestrian crosses the road at a busy set of traffic lights and is struck and injured by a vehicle. A witness immediately calls an ambulance, which is dispatched to the scene of the incident, but takes a very long time to arrive. Given the nature of the smart city ecosystem, a regulator commissions an investigation to discover what parts of the IoT infrastructure may have contributed to the incident, what went wrong, and who (if anyone) may be liable.
Speaking to those involved, the investigator gains three initial lines of inquiry: (i) the injured pedestrian claims that the driver ran a red light; (ii) one witness commented on how dark the streetlights were, despite it being a busy area with lots of vehicles and pedestrians, and questioned whether that might have affected visibility; (iii) the ambulance driver noted that they had been in that area earlier, but had been redirected away shortly before the incident occurred. In all cases, these three potential threads entail interactions with the smart city's infrastructure.
The injured pedestrian had indicated that the driver ran a red light. However, the driver claims that they were not at fault, the argument being that their vehicle is one by CarNet—a highly popular manufacturer of networked 'smart cars'—and could therefore not have breached the traffic rules. CarNet is known to use complex AI systems to detect and inform of hazards on the road, in collaboration with a mesh network of all nearby CarNet vehicles, and to autonomously apply the brakes and other countermeasures to avoid accidents. Since the vehicle did not detect the hazard (the pedestrian), the driver believed that they couldn't have been at fault, and that the pedestrian must have suddenly and unexpectedly attempted to cross the road at a dangerous time.

Given that CarNet have built their systems to be reviewable, they retain extensive records about the design of their cars' systems, as well as ensuring each car produces logs of its operation. Log files containing certain information on how systems are functioning are sent to CarNet (including whether systems are operating correctly and sensors are reporting in line with expected parameters); however, some data (such as personal data, including information on vehicle movements and camera feeds) is stored locally on each car. CarNet has also secured contractual obligations throughout their decision supply chain, ensuring that record keeping practices are also in place at the third-party suppliers on which their vehicles depend. The result of this is a high degree of oversight and reviewability over how CarNet's vehicles operate on the road.

The investigator reviews CarNet's records, which indicate that the car's systems and sensors were fully operational, and logs retrieved from the car in question indicate that there were four CarNet vehicles nearby at the time of the incident, which shared information about potential hazards at the particular area where the incident occurred.
Looking more in-depth at the operational logs retrieved from the car (including camera footage fed into the car's AI system, and network communications sent from other vehicles), the investigator determines that the traffic light was indeed red – however, none of the vehicles detected any pedestrians or other hazards near to or present on the road.

Through reviewing the data from the car, the investigator also learns that CarNet vehicles work by using an object recognition model which takes camera footage from their vehicles and determines the location of pedestrians and other vehicles on the road. This model is built and operated by CloudVision through their cloud-based AIaaS service, though the vehicle uses a locally stored version which is periodically updated from CloudVision's servers. The investigator notes that the model has not
The investigator learns that the management of the street lighting on the road of the incident was recently outsourced to SmartLight Inc., a supplier of automated street lighting solutions which vary the brightness of the lights according to levels of activity (busyness) in the area. On reviewing the street lighting system, the investigator discovers that the service works by retrieving external information about the number of pedestrians and vehicles present on the street, which is obtained through CloudMap's popular mapping service. Through SmartLight's log files, the investigator determines that the mapping service had indicated that no vehicles or pedestrians were present at the time of the incident, which led to SmartLight's system determining that the street lights should be dimmed in line with the Council's desire to conserve electricity. However, the camera footage from the vehicle clearly shows that the area was busy, and therefore CloudMap's information was incorrect. When asked about the reliability of CloudMap, SmartLight explain that they have never had any past problems with the service, and that they had therefore not implemented any backup data sources or contingency plans for when the mapping service fails to supply accurate information.

In considering CloudMap's mapping service, the investigator can see that their pedestrian and vehicle density information is 'crowdsourced' through the mobile devices and vehicles that have the mapping application installed. This is used, collectively, to determine where congestion is high and where delays are likely. CloudMap is approached regarding the discrepancy, who (after checking their release schedule) inform the investigator that they had temporarily rolled out an update which appears to have negatively affected their cloud databases and mapping tool. This resulted in incorrect information relating to the congestion of certain roads, including the one on which the incident occurred.
On probing, CloudMap reveal that their access logs indicate that the incorrect information was also retrieved by EmergenSolutions Ltd.—an organisation known to be responsible for coordinating and distributing emergency service vehicles across the city—and that they may have also been affected by the issue.
The investigator turns to how this information was used by EmergenSolutions. Decision provenance records help reveal that the area where the traffic incident occurred had been classified as 'low risk' for emergency services, leading to an automated decision to redirect ambulances away from the area shortly before the incident took place. This 'low risk' classification was the result of three data sources: (i) information from CloudMap reporting that there was low congestion in the area; (ii) data provided by CarNet indicating that few pedestrians or vehicles were present on the street; and (iii) historic information about emergency service call-outs showing that the street was not a 'hot spot' for incidents.

On contacting EmergenSolutions, the organisation emphasises that they have redundancy measures in place, and that their planning system distributes emergency response vehicles based on these disparate sources of information. Probing further, the investigator discovers that EmergenSolutions had a process in place to adjust the
This example indicates that even incidents that appear simple can be complex, and that deployments designed to facilitate review will not only be important, but necessary for supporting investigation. The ability to review the technical and organisational workings of complex IoT ecosystems can provide both legal and operational benefits to the entities involved.

From a legal standpoint, effective record keeping and logging will assist investigators in mapping out what happened and where issues originated. For instance, the scenario shows that reviewability allowed the investigator to determine that (i) CarNet's ability to detect pedestrians was affected by the low street lighting and questionable software update procedures; (ii) CloudMap's services had been temporarily affected by an update; and (iii) EmergenSolutions failed to follow their own operational procedures. As such, because systems and processes were designed to be reviewable, the investigator was able to identify multiple issues, with several entities contributing to the incident throughout the IoT ecosystem – reflecting that there may be several points of failure in any ecosystem and that apportioning blame will often be complex in reality. That said, mechanisms that shine light on the happenings of systems mean that the appropriate legal consequences, in terms of assigning liability and pursuing redress, are able to follow from the information obtained through review.

Operationally, reviewability has also allowed the organisations involved in this example to obtain insights into weaknesses or potential misuses of their systems. As such, they are able to improve their operational processes and prevent problems from re-occurring in future. Such information may also feed into their processes for ensuring and demonstrating legal compliance, by showing that they are quick to identify and act upon failures.
For example, based on the investigations that took place, CarNet may choose to change their processes around software updates to have their vehicles automatically look for updates every night. CloudMap may also re-assess their processes surrounding their deployment of updates – perhaps better staging releases or having a more rigorous testing process to prevent such issues from being deployed in production. Both CloudVision and CloudMap may also choose to more closely monitor what kinds of applications and clients are using their services, and work with those clients in more high-stakes scenarios to implement bespoke services or processes to ensure that a high level of functionality is maintained in critical situations. Similarly, SmartLight may review their role in the incident, and set up contingency plans for when the mapping service fails – perhaps using a second mapping
service, or other sources of information (such as historic data from past usage logs), to ensure that the lights aren't dimmed on streets that are known to likely be busy. EmergenSolutions may learn from the incident that relying too heavily on CloudMap's services may have implications for system resilience, and that their failure to follow internal procedures can lead to issues with potentially significant consequences.

In the cases above, reviewability plays an important role in identifying and addressing issues in socio-technical systems. Without access to the comprehensive records and logs needed to holistically review the operation and interaction of the component parts of IoT ecosystems, it would be extremely challenging to reliably identify where problems arise and where failures occur. Without oversight of the various entities involved (as well as their roles in the incident and their organisational processes), it would be difficult, perhaps impossible, to act on the investigation and take next steps. As we have shown, implementing reviewability—through comprehensive technical and organisational record keeping and logging mechanisms—offers significant benefits in terms of improving accountability in these complex IoT ecosystems.
There is a push for stronger governance regimes as the IoT continues to become increasingly interconnected, algorithmic, and consequential. However, the complexity of the socio-technical systems-of-systems that comprise the IoT introduces opacity, which poses real challenges for accountability.

We have argued that a key way forward is approaching accountability through the concept of reviewability. This involves mechanisms, such as logging and record-keeping, that provide the necessary information to allow technical systems and organisational processes to be comprehensively interrogated, assessed, and audited. The aim is to provide a targeted form of transparency that paves the way for meaningful accountability regimes.

There appears real potential for technical methods, such as those that are provenance-based, to capture information about the systems that form the IoT. Such information can work to facilitate processes of review – however, more research is required. Though here we have focused on some technical capture mechanisms, the concerns are socio-technical, meaning that reviewability will require broader record keeping measures, from the commissioning and design of the IoT infrastructure all the way to its continued operation. In other words, implementing effective reviewability requires a holistic approach, encompassing a range of perspectives that expose the contextually appropriate information necessary to assess the wider contexts in which the system is deployed. That is, there are legal, compliance and oversight considerations, issues around incentives and organisational processes, the usability and relevance of the information driving review, as well as technical aspects concerning system design, development and monitoring – to name but a few. Realising greater accountability involves considering the relationships and interplays between these concerns, and more.
That said, there are benefits to be gained from taking steps towards reviewability now. Even mechanisms that only support internal review (i.e. within an organisation or development project) can help in the management of systems, processes, and obligations, while assisting compliance and providing evidence demonstrating good practice. That is, any reviewability undertakings can work to pave the way towards an IoT that is more understandable, transparent, legally compliant, and therefore accountable.
We acknowledge the financial support of the Engineering & Physical Sciences Research Council (EP/P024394/1, EP/R033501/1), University of Cambridge, and Microsoft via the Microsoft Cloud Computing Research Centre.
References

[1] Miorandi D, Sicari S, De Pellegrini F, et al. Internet of things: Vision, applications and research challenges. Ad Hoc Networks. 2012;10(7):1497–1516.
[2] Dlodlo N, Gcaba O, Smith A. Internet of things technologies in smart cities. In: 2016 IST-Africa Week Conference. IEEE; 2016. p. 1–7.
[3] McKee DW, Clement SJ, Almutairi J, et al. Massive-Scale Automation in Cyber-Physical Systems: Vision Challenges. In: 2017 IEEE 13th International Symposium on Autonomous Decentralized System (ISADS). IEEE; 2017. p. 5–11.
[4] Xiong Z, Sheng H, Rong W, et al. Intelligent transportation systems for smart cities: A progress review. Science China Information Sciences. 2012;55(12):2908–2914.
[5] Asimakopoulou E, Bessis N. Buildings and crowds: Forming smart cities for more effective disaster management. In: 2011 Fifth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing. Springer; 2011. p. 229–234.
[6] Boukerche A, Coutinho RWL. Smart disaster detection and response system for smart cities. In: 2018 IEEE Symposium on Computers and Communications (ISCC). IEEE; 2018. p. 1102–1107.
[7] Misbahuddin S, Zubairi JA, Saggaf A, et al. IoT based dynamic road traffic management for smart cities. In: 2015 12th International Conference on High-capacity Optical Networks and Enabling/Emerging Technologies (HONET). IEEE; 2015. p. 1–5.