Designing for Contestation: Insights from Administrative Law
HENRIETTA LYONS,
The University of Melbourne
EDUARDO VELLOSO,
The University of Melbourne
TIM MILLER,
The University of Melbourne
ACM Reference Format:
Henrietta Lyons, Eduardo Velloso, and Tim Miller. 2021. Designing for Contestation: Insights from Administrative Law. In
Proceedings of 2019 CSCW Workshop on Contestability in Algorithmic Systems (Contestability’19).
ACM, New York, NY, USA, 8 pages.
Algorithmic decision-making systems are increasingly being deployed to make, or to support humans to make, decisions that impact people’s lives in significant ways. Yet, decision subjects, those affected by algorithmic decisions, can be limited in their ability to contest these decisions. For example, the Education Value-Added Assessment System (EVAAS), a statistical method used to predict academic growth, was used by the Houston Independent School District to evaluate teachers’ performance and, in a number of cases, to terminate teachers’ contracts. Twelve teachers and the Houston Federation of Teachers successfully argued in court that the teachers’ constitutional right to due process was violated because they were unable to contest, or ‘meaningfully challenge’, the termination of their contracts due to a ‘lack of sufficient information’: the private company that designed EVAAS would not release the source code or methodology used, as they were proprietary trade secrets [1].

Even for decision subjects who are able to understand why a decision has been made and are provided with means to contest that decision, contestation systems can be seen as severely lacking [12]. Sarah Myers West studied content moderation across a number of social media platforms, most of which offered a way for users to contest a decision to remove their content from the platform [12].
Myers West reported user dissatisfaction with the contestation systems for a number of reasons, including a lack of clear instruction about how to lodge an appeal, no reply being received, no resolution being reached after a challenge has been lodged, and a lack of access to human intervention.

These examples demonstrate that being able to challenge algorithmic decisions is important to decision subjects, yet numerous factors can limit a person’s ability to contest such decisions. We propose that administrative law systems, which were created to ensure that governments are kept accountable for their actions and decision-making in relation to individuals [5, 6], can provide guidance on how to design contestation systems for algorithmic decision-making.

There are similarities between government decision-making and algorithmic decision-making that suggest there is value in considering how the administrative law system enables contestation. For example, in both cases decision-making can be said to occur ‘behind closed doors’, which limits transparency, raises questions about accountability, and has the potential to impact public trust. In this paper, we focus on the specific case of the Australian administrative law system because this is the authors’ country of residence, and because of the lead author’s familiarity with the Australian legal system and her experience working within a variety of government departments and agencies within Australia. Further to this, the Australian administrative law system has been refined over the past 40 years: it provides a comprehensive and well-designed example of an established contestation system.

This paper contributes: (1) a methodological proposal for the development of a framework that can be used to guide the design of contestation systems for algorithmic decision-making, based on the Australian administrative law system; (2) a summary of key features of the Australian administrative law system that enable individuals to contest government decision-making; and (3) a list of considerations to help prompt design thinking in relation to contestation systems for algorithmic decision-making. While this paper raises more questions than it answers, these questions form a useful base from which to consider the development of contestation systems that allow decision subjects to meaningfully challenge algorithmic decisions.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
Contestability’19, November 9–13, Austin, TX, USA
© 2021 Copyright held by the owner/author(s).

Recent work highlights the benefits of designing ‘contestability’ into machine learning systems used by experts for decision support [10]. We view ‘designing for contestation’ as quite distinct from ‘designing for contestability’. ‘Designing for contestability’ enables expert users to engage interactively with a machine learning system, to explore it, and to provide it with feedback, with the aim of producing better decisions [10].
In contrast, ‘designing for contestation’ does not (necessarily) entail an interactive design component within the algorithmic decision-making system. Indeed, it may not be appropriate for a decision subject to interact and engage with the algorithmic system themselves, depending on the decision. The aim of designing for contestation is to enable decision subjects to appeal a decision by providing them with the means to do so: for example, as highlighted by the EVAAS court case [1], by providing enough information about the decision to allow decision subjects to ‘meaningfully challenge’ it, or by ensuring that an effective process to appeal a decision exists.
In a 2017 workshop paper on algorithmic appeals, Vaccaro and Karahalios highlight that established contestation systems can be used to inform contestation design for algorithmic decision-making systems [17]. Vaccaro and Karahalios explored three contexts where people can currently seek to review decisions (the court system, credit scoring, and insurance claims) to understand how these systems enable contestation and to reveal challenges that algorithmic appeal systems are likely to face. The authors argue that being able to contest a decision is not only about understanding, or being able to interpret, the decision made by an algorithm, although this is certainly a necessary element; there is also a need for a system to support appeals.

Myers West surveyed 519 social media platform users to understand their experience with content moderation systems on the platforms that they use [12]. Myers West found that a majority of the platforms allowed users to appeal decisions to remove content, but that users who chose to make an appeal (n=230) experienced difficulties with the process. For example, users reported that while the platform notified them of the option to appeal, they were not provided with any instructions about how to go about it. Other users were frustrated by the lack of response or resolution offered by the platform after they had lodged a complaint, and the lack of human intervention that was provided. Of those surveyed, Myers West reports that 245 users chose not to appeal, with some explaining that they did not know how to lodge an appeal or that they chose not to because they did not think that they would receive a response.

Almada considers algorithmic decision contestation in the context of the General Data Protection Regulation, in which Article 22(3) provides a right to contest a decision and a right of human intervention [4].
Almada argues that ‘[a]t a bare minimum, a system that is contestable by design should be built in ways that allow users and third parties to effectively seek human intervention in a given automated decision’, and he suggests that literature relating to privacy by design and designing for contestability can provide insights into designing for contestation.

While algorithmic decision-making has been occurring for decades, in recent years advances in machine learning have resulted in an increased use of these often "black box" systems to support, or to make, high-consequence decisions [2, 15, 19]. As a result, there have been calls for these algorithms to be made explainable, transparent, accountable, and fair to ensure that users are able to trust their decisions [2, 14, 15]. While these calls have resulted in the burgeoning research field of explainable artificial intelligence (XAI), limited attention has been given to contestation as a way to address these goals. There is potential here for XAI to be informed by contestation. But in addition, there is a real opportunity for contestation to be supported by XAI.

XAI is a dynamic field: technical solutions to produce explanations are being proposed, and a greater focus is now also being placed on the need for human-centred explanation design [2, 3, 11, 19], with more research being conducted into what users require from an explanation [15]. An explanation would provide decision subjects with valuable information that could form a basis from which they could contest a decision. Yet, few papers have proposed helping a decision subject to contest a decision as a goal for an explanation. Notable exceptions to this include Wachter et al., who suggest that a potential goal for an explanation is ‘to provide information that helps contest automated decisions when an adverse or otherwise undesired decision is received’ [18], and Binns et al.,
who studied the effect of explanation style on justice perceptions, but did not delve into contestation per se [8].
We propose that administrative law systems can provide guidance on how to design contestation systems for algorithmic decision-making. In this paper, we focus on the Australian administrative law system, which regulates commonwealth government decision-making that affects individuals [5]. This system enables decision subjects to contest decisions made by decision-makers in the executive arm of the government, namely officers in government departments. In Australia, the role of the executive government is to administer the laws created by parliament [13]. Figure 1 sets out the roles of the parliament, the executive, and the judiciary. In its role administering and applying laws, the executive has a direct impact on individual members of the public: it has the power to make decisions that affect a person’s rights, including whether a person is entitled to welfare, is granted a visa, or is granted a licence [16]. Bannister, Olijnyk, and McDonald state that ‘[t]he executive’s application of laws affects the day-to-day lives of individuals more often and more directly than the actions of the legislative and judicial branches of government’ [6].

Fig. 1. The parliament, executive and judiciary each have a separate role to play in Australian governance [Public domain], via The Parliamentary Education Office.

Australian administrative law is made up of a number of elements that each play a role in regulating the actions of the executive: it is a system of accountability. Bannister, Olijnyk, and McDonald describe in detail the elements of the system, which include:
• Mechanisms that enhance transparency by allowing access to information, for example through rules enabling freedom of information;
• A range of bodies that can carry out investigations relating to the actions or decisions of the executive, including the Commonwealth Ombudsman, the Administrative Review Council, and Royal Commissions; and
• Mechanisms that provide individuals with avenues to contest decisions that affect them, for example via an internal or external review [6].

While the last element listed above is the most informative for developing a contestation framework, understanding that administrative law is a system is important. There are numerous ways for the executive to breach its powers, and the administrative law mechanisms provide different avenues to hold the executive to account. For example, the contestation system supports individual rights and provides individual redress, whereas bodies such as the Ombudsman can investigate systemic issues with decision-making that may not be identifiable at the individual level. In a similar vein, in the context of algorithmic decision-making, providing a way for decision subjects to contest decisions will enable the review of individual decisions, but may not reveal systemic problems embedded in an algorithm, such as biases that result in discrimination. An auditing program may better suit that task by enabling third parties to measure how well the algorithm performs against predefined goals, which could include fairness and equity [7, 9, 17]. When designing a contestation system, designers need to understand what that system can and cannot feasibly achieve.
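To illustrate the kind of measurement such an auditing program might perform, the sketch below computes a demographic parity gap, i.e. the difference in positive-decision rates between groups, over a hypothetical system’s outputs. The decision data, group labels, and the 0.1 disparity threshold are invented for illustration; a real audit would apply the predefined goals agreed for the system in question.

```python
# Illustrative audit sketch: measure the demographic parity gap,
# the difference in positive-decision rates between groups.
# All data and the 0.1 threshold are hypothetical.

def positive_rate(decisions):
    """Fraction of decisions in a group that were positive (1)."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_by_group):
    """Largest difference in positive-decision rates across groups."""
    rates = [positive_rate(d) for d in decisions_by_group.values()]
    return max(rates) - min(rates)

decisions = {
    "group_a": [1, 1, 0, 1, 1, 1, 1, 0],  # 6/8 positive decisions
    "group_b": [0, 1, 0, 0, 1, 0, 0, 0],  # 2/8 positive decisions
}

gap = demographic_parity_gap(decisions)
print(f"Demographic parity gap: {gap:.2f}")  # 0.75 - 0.25 = 0.50
if gap > 0.1:  # illustrative threshold
    print("Audit flag: disparity exceeds threshold")
```

A third-party auditor running such checks over many decisions could surface systemic disparities that no individual contestation would reveal.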
We are in the process of developing a framework that can be used to guide the design of contestation systems for algorithmic decision-making. The first column in Table 1 contains a list of the key features of the Australian administrative law system that enable individuals to contest decisions made by the government that affect them. An informal approach was taken to compile this initial list of key features: the lead author drew on her legal training and experience working within government agencies to select the pertinent features. The criterion used to select these features was whether the feature limits, or supports, a decision subject’s ability to contest a decision. The features in column 1 will be augmented and refined with further research into the administrative law system as well as other established contestation systems.

The second column of Table 1 contains questions and considerations derived from the first column that will be useful to consider when designing contestation systems for algorithmic decision-making. Again, an informal approach was taken to develop these questions; they were the product of brainstorming prompted by reflection on the features in column 1. The questions contained in the second column are not exhaustive; they can (and should) be added to and refined.

Many jurisdictions, including the United States, the United Kingdom, Canada, and France, have administrative law systems that govern the actions and decision-making of their respective governments; however, the specific mechanisms and processes used to meet this aim differ. For example, while ‘fair hearing’ forms part of the Australian administrative law system, in the United States the equivalent right is constitutional: a citizen is entitled to due process of law if being deprived of life, liberty, or property.
The draft framework in Table 1 displays key elements that make up an established contestation system. Given its legal nature, the administrative law system is heavily rule-based and quite complex. Algorithmic decision-making contestation systems may not need to be as complicated as this. However, when decisions are being made that affect a person’s rights and interests, a comprehensive and considered approach to contestation may be required. A one-size-fits-all approach to designing a contestation system will not be appropriate given the range of decisions being made and influenced by algorithmic decision-making systems, and the various contexts these decisions are being made in. In addition to understanding user needs, the use case, and constraints, designers must be alive to the legal and regulatory contexts in which decisions are being made, to ensure that legal requirements are met. For example, Article 22(3) of the General Data Protection Regulation creates a right to contest certain automated decisions, and there is a constitutional right to due process in the United States.

Further, given that the administrative law system is designed to regulate human decision-makers, some of the elements in column 1 of Table 1 may not be relevant for algorithmic decision-making. Similarly, there may be additional, unique elements that are needed to cater for algorithmic decision-making. For example, a major challenge presented by "black box" algorithmic decision-making systems is that their opacity obscures the decision-making process. Thus, it is unlikely to be easy for an algorithmic decision-making system, or a person using such a system, to produce the same information contained in a ‘statement of reasons’ (which is an element of the Australian administrative law system). However, with advances in XAI research, methods are being developed to produce explanations for decisions.
An explanation that provides information about the decision and the decision-making process would provide decision subjects with a base from which they can meaningfully contest a decision that has been made, or influenced, by an algorithmic decision-making system. In fact, depending on the type of information that a decision subject wishes to contest, understanding the internal logic of an algorithm may not even be required; for example, counterfactual explanations can be used to provide valuable information to decision subjects without opening the "black box" [18]. Further research needs to be conducted into what type of information would allow a decision subject to meaningfully contest an algorithmic decision, and how this could be produced.

Table 1. A framework to consider when designing contestation systems for algorithmic decision-making

Feature 1 (pertinent feature of the administrative law system that enables individual contestation): Not all decisions are reviewable: a person needs ‘standing’ to appeal a decision, e.g. they need to have been ‘affected’ by the decision; the Administrative Appeals Tribunal can only review decisions if legislation provides that power.
Consideration 1 (for contestation systems for algorithmic decision-making): Should there be limits on who can contest a decision? Which decisions should be able to be contested? All decisions? Only decisions that affect certain rights or interests?

Feature 2: Best practice is to notify a person if the decision is likely to be adverse and to give them an opportunity to respond. This helps to ensure that a person is given a ‘fair hearing’.
Consideration 2: Is a ‘fair hearing’ required? What process would provide decision subjects with a ‘fair hearing’?

Feature 3: Once a decision is made, notice of that decision is provided to the person along with information about how to initiate a review.
Consideration 3: How will a person be notified of a decision? What information will be provided about review? How should an interface for contestation be designed?

Feature 4: When a decision subject has standing to appeal a decision, they can request a ‘statement of reasons’, which should contain: findings on material questions of fact; evidence or other material the findings were based on; and the reasons for the decision.
Consideration 4: What does a statement of reasons look like for an algorithm? What information does a decision subject need to be able to meaningfully contest a decision? Can this information be provided? What if the algorithm is a "black box"?

Feature 5: Various avenues of review are available, e.g. internal merits review, external merits review, and judicial review. There are processes dictating how to seek each type of review. Internal merits review (where the decision-making agency reviews the decision) is usually the first step.
Consideration 5: What types of review can be provided? If a decision subject can access an internal review, who will carry out the review? A human or an algorithm? Would an algorithmic review simply result in the same decision? Will further avenues of review be provided?

Feature 6: The review bodies are limited in their ability to review a decision, e.g. a merits review can determine whether the ‘right’ decision was made by looking at the relevant facts, law, and policy, whereas a judicial review can only consider whether the decision-making process was lawful.
Consideration 6: What aspects of a decision or decision-making process can be contested? Just the decision/output that pertains to the decision subject? The training data? The inputs? The algorithm’s decision rules? The process for deriving the decision-making model?

Feature 7: The remedies provided to a decision subject differ depending on the review body, e.g. a merits review can make a fresh decision, whereas a judicial review cannot make a new decision, but can order that a new decision be made by the original decision maker.
Consideration 7: What redress can be provided to a decision subject? Can a new decision be made? How will it be made? If it is determined that an input was used in error, can a new decision be made without considering that input?

We will continue to develop the framework to guide the design of contestation systems for algorithmic decision-making by taking into account other contestation systems and by refining and augmenting the key features using formal methodological approaches such as thematic coding and content expert interviews. We also propose to conduct research into user requirements for contestation systems, and in particular into what type of information users require in order to contest an algorithmic decision.
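To illustrate how a counterfactual explanation can inform contestation without opening the "black box", the sketch below searches for the smallest increase to a single input that flips a hypothetical decision function’s output. The loan-style features, the scoring rule, and the step size are all invented for illustration; Wachter et al. [18] develop the underlying idea in detail.

```python
# Illustrative counterfactual search over a black-box decision function.
# The decision function, features, and step sizes are hypothetical.

def decide(applicant):
    """A stand-in 'black box': approve if a simple score clears 1.0."""
    score = 0.004 * applicant["income"] + 0.2 * applicant["years_employed"]
    return "approve" if score >= 1.0 else "deny"

def counterfactual(applicant, feature, step, max_steps=100):
    """Increase one feature until the decision flips, if it ever does."""
    candidate = dict(applicant)
    for _ in range(max_steps):
        if decide(candidate) != decide(applicant):
            return candidate
        candidate[feature] += step
    return None  # no flip found within the search budget

applicant = {"income": 150, "years_employed": 1}
cf = counterfactual(applicant, "income", step=10)
if cf is not None:
    print(f"Decision would change at income = {cf['income']}")
```

A decision subject told that the outcome would have differed at a specific income level gains concrete information with which to contest or respond to the decision, even though the model’s internal logic is never revealed.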
For decision subjects who have been affected by algorithmic decision-making, such as the Houston teachers whose contracts were terminated, being able to challenge an adverse algorithmic decision is vital. Yet, contestation systems for algorithmic decisions are currently underdeveloped, with decision subjects left with limited information about how the decision was made and how to contest the decision. Existing contestation systems, such as administrative law systems, can provide valuable insights into how contestation systems for algorithmic decision-making can be designed so that decision subjects can meaningfully contest decisions that affect them.
ACKNOWLEDGMENTS
Henrietta Lyons’s work is supported by the Australian Government Research and Training Program scholarship and a Google Travel Scholarship. This research was partly funded by Australian Research Council Discovery Grant DP190103414 Explanation in Artificial Intelligence: A Human-Centred Approach. Eduardo Velloso is the recipient of an Australian Research Council Discovery Early Career Researcher Award (Project Number: DE180100315) funded by the Australian Government.
REFERENCES
[1] 2017. Houston Federation of Teachers, Local 2415, et al v Houston Independent School District 251 F.Supp.3d 1168.
[2] Ashraf Abdul, Jo Vermeulen, Danding Wang, Brian Y. Lim, and Mohan Kankanhalli. 2018. Trends and Trajectories for Explainable, Accountable and Intelligible Systems: An HCI Research Agenda. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). ACM, New York, NY, USA, Article 582, 18 pages. https://doi.org/10.1145/3173574.3174156
[3] Amina Adadi and Mohammed Berrada. 2018. Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI). IEEE Access.
[4] Marco Almada. 2019. Human Intervention in Automated Decision-Making. In Proceedings of the Seventeenth International Conference on Artificial Intelligence and Law (ICAIL ’19).
[6] Judith Bannister, Anna Olijnyk, and Stephen McDonald. Government Accountability: Australian Administrative Law (2nd ed.). Cambridge University Press, Port Melbourne, Vic.
[7] Tess Bennett. 2018. Governments Should Independently Audit AI Tools For Fairness: Analytics Expert. Retrieved October 3, 2019 from https://which-50.com/governments-should-independently-audit-ai-tools-for-fairness-analytics-expert/
[8] Reuben Binns, Max Van Kleek, Michael Veale, Ulrik Lyngs, Jun Zhao, and Nigel Shadbolt. 2018. It’s Reducing a Human Being to a Percentage. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). https://doi.org/10.1145/3173574.3173951
[9] James Guszcza, Iyad Rahwan, Will Bible, Manuel Cebrian, and Vic Katyal. 2018. Why We Need to Audit Algorithms. Retrieved October 3, 2019 from https://hbr.org/2018/11/why-we-need-to-audit-algorithms
[10] Tad Hirsch, Kritzia Merced, Shrikanth Narayanan, Zac E. Imel, and David C. Atkins. 2017. Designing Contestability: Interaction Design, Machine Learning, and Mental Health. In Proceedings of the 2017 Conference on Designing Interactive Systems (DIS ’17). ACM, New York, NY, USA, 95–99. https://doi.org/10.1145/3064663.3064703
[11] Tim Miller, Piers Howe, and Liz Sonenberg. 2017. Explainable AI: Beware of Inmates Running the Asylum. In Proceedings of the 2017 IJCAI Workshop on Explainable Artificial Intelligence (XAI) (IJCAI ’17). 36–42. http://people.eng.unimelb.edu.au/tmiller/pubs/explanation-inmates.pdf
[12] Sarah Myers West. 2018. Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms. New Media & Society.
[14] Marco Tulio Ribeiro, Sameer Singh, and Carlos Guestrin. 2016. “Why Should I Trust You?”: Explaining the Predictions of Any Classifier. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD ’16). ACM, New York, NY, USA, 1135–1144. https://doi.org/10.1145/2939672.2939778
[15] Mireia Ribera and Agata Lapedriza. 2016. Can we do better explanations? A proposal of User-Centred Explainable AI. In Joint Proceedings of the ACM IUI 2019 Workshop.
[17] Kristen Vaccaro and Karrie Karahalios. 2017. In Workshop on Trustworthy Algorithmic Decision-Making. https://s3.amazonaws.com/kvaccaro.com/documents/algappeal.pdf
[18] Sandra Wachter, Brent Mittelstadt, and Chris Russell. 2018. Counterfactual Explanations without Opening the Black Box: Automated Decisions and the GDPR. Harvard Journal of Law and Technology 31, 2 (2018), 841–887.
[19] Danding Wang, Qian Yang, Ashraf Abdul, and Brian Y. Lim. 2019. Designing Theory-Driven User-Centric Explainable AI. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19).