Beyond the Command: Feminist STS Research and Critical Issues for the Design of Social Machines
KELLY B. WAGMAN,
Massachusetts Institute of Technology
LISA PARKS,
Massachusetts Institute of Technology

Machines, from artificially intelligent digital assistants to embodied robots, are becoming more pervasive in everyday life. Drawing on feminist science and technology studies (STS) perspectives, we demonstrate how machine designers are not just crafting neutral objects, but relationships between machines and humans that are entangled in human social issues such as gender and power dynamics. Thus, in order to create a more ethical and just future, the dominant assumptions currently underpinning the design of these human-machine relations must be challenged and reoriented toward relations of justice and inclusivity. This paper contributes the "social machine" as a model for technology designers who seek to recognize the importance, diversity and complexity of the social in their work, and to engage with the agential power of machines. In our model, the social machine is imagined as a potentially equitable relationship partner that has agency and as an "other" that is distinct from, yet related to, humans, objects, and animals. We critically examine and contrast our model with tendencies in robotics that consider robots as tools, human companions, animals or creatures, and/or slaves. In doing so, we demonstrate ingrained dominant assumptions about human-machine relations and reveal the challenges of radical thinking in the social machine design space. Finally, we present two design challenges based on non-anthropomorphic figuration and mutuality, and call for experimentation, unlearning dominant tendencies, and reimagining of sociotechnical futures.

CCS Concepts: • Human-centered computing → HCI theory, concepts and models.

Additional Key Words and Phrases: feminism, feminist HCI, science and technology studies, human-robot interaction, design
ACM Reference Format:
Kelly B. Wagman and Lisa Parks. 2021. Beyond the Command: Feminist STS Research and Critical Issues for the Design of Social Machines. In
Proceedings of CSCW ’21 (Pre-print).
ACM, New York, NY, USA, 20 pages. https://doi.org/TBD

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

CSCW '21 (Pre-print), © 2021 Association for Computing Machinery.
ACM ISBN 978-x-xxxx-xxxx-x/YY/MM. . . $15.00
https://doi.org/TBD

"Alexa, tell me the weather!" has become a common command. By January 2019 over 100 million devices equipped with Amazon's virtual digital assistant, Alexa, had been sold worldwide [13]. While seemingly simple, this human-machine interaction, in which a human voice orders an artificially intelligent digital assistant to instantly deliver information, is deceptively complex. Alexa's human-like voice is gendered feminine and she performs historically feminized clerical labor. This interaction both depends on and impacts global material conditions: the Alexa device hardware is sourced from numerous countries; the software relies on layers of physical internet infrastructure and value-laden machine learning algorithms; and the discarded devices and data centers pump toxins into the environment [25]. Beyond this, the interaction raises a fundamental question: what exactly is Alexa's relationship to the user and to humans, more generally? Our social norms do not yet include clear conventions for how to interact with digital assistants or robots, or what we call in this paper, "social machines." It is not even immediately obvious to most people that the command "Alexa, tell me the weather!" may be problematic. Yet, as we suggest, this brief example evokes a host of critical issues related to gendered and other social power dynamics in human-machine relations.

For decades feminist scholars have critiqued science and technology, yet technology design outputs have been largely unresponsive to these critiques. Feminists in scientific and technology fields have called for gender diversity in the workforce (e.g., gender balanced design teams); gender diversity in the "substance of science" (e.g., a digital assistant that helps with questions about reproductive health); and feminist approaches to methods and design practices (e.g., not universalizing "the user" in design methods) [74]. While each area is important, our contribution in this paper falls within the third area because it questions how foundational assumptions about human-machine relations and structural conditions impact technology design. Some key foundational assumptions that we challenge include: that machines are politically neutral; that machines cannot form social relationships; that machines do not have agency; that humans should control machines; and that there is a clear boundary between human and machine. Drawing on feminist STS scholarship (e.g., [2, 11, 16, 44, 76, 87]), we explore how power works in human-machine relations and suggest that fuller awareness of the social can enhance technology design.
We argue that HCI scholars and others concerned with technology design must confront the fact that common assumptions about the role of machines in the world reinforce existing inequalities, injustices, and patterns of oppression. Because of this, we must consider radical shifts in our thinking and approaches to design, and set out to craft machines, and engage in human-machine relations, in more ethical, just, and inclusive ways [18, 24].

Our main contribution in this paper is a conceptual model for human-machine relations that operationalizes key lessons from feminist STS in ways that are generative for designers and technology builders. We consider designers to be crafting "social machines" and "human-machine relations" as opposed to simply building "machines." All machines are part of the social, a science and technology studies (STS) term that broadly refers to what is produced when humans and non-humans interact and develop relationships, and become part of power relations, societal norms, and cultures [54]. In this framework, interacting humans and non-humans are mutually shaping; humans and non-humans both influence, and are influenced by, one another. In the "Alexa, tell me the weather!" example, the conversation between Alexa and a user is a social interaction, as are the relations between Alexa's designers and developers at Amazon and factory workers producing the devices at Foxconn [82]. Throughout the device's lifecycle, Alexa can alter the lives of humans, and the norms and practices of those humans in turn inform Alexa's development.

To account for these conditions, we propose use of the term "social machine" as an actionable design intervention. By using this term, we make "the social" explicit and encourage technology builders to rigorously reflect upon and engage with relations of mutuality in their work.
We define"social machine" as an object that is designed to construct and engage in social relations with hu-mans, and that has been crafted with careful attention to issues of agency, equitability, inclusion,and mutuality. The term "social machine" is also meant to recognize the proliferating human-machine relations that take shape in the digital era via computing interfaces, artificial intelligence,digital assistants, and robotics, but it is not meant to be an essentialist term. There is no physical Note that we are using social machines differently than Judith Donath in her book
The Social Machine: Designs for LivingOnline [34]. She is referring to machines that function as a communication medium and allow for social interaction betweenhumans, while we center on a machine that is itself social. We also do not mean collections of people and machines thatfunction together to create a social machine, as in [77]. 2 eyond the Command CSCW ’21 (Pre-print), characteristic that makes one object a social machine and not another; rather, any object can bea social machine if it is designed with consciousness of social inequalities and injustices, and par-takes in a purposeful effort to remedy them. For example, the Amazon Alexa device as it stands is not a social machine since it was not designed with equity and inclusion in mind; however, thiscould change if Alexa were re-designed with greater emphasis on social differences and power dy-namics. The material sites of human-machine relations construct, operationalize, analyze/iterate,and naturalize/normalize different kinds of relations, some of which reproduce oppression.There are existing terms that are related to the idea of a social machine, but they tend to essen-tialize the human-machine binary. They include social/sociable robot [15] and relational artifact[85]. We use "machine" as opposed to "robot" in order to avoid assumptions about robots that arebaked into the term’s history; namely, that they serve humans by performing mechanized labor(Oxford English Dictionary); that they are embodied and anthropomorphized [15]; and that theyare fundamentally different from earlier machines like the computer or sewing machine and thusshould be studied separately. Turkle’s notion of a "relational artifact" evocatively suggests that var-ious kinds of machines can present themselves as having "states of mind" and insists that humanawareness of this possibility can enrich their encounters with machines. 
Turkle's work on "relational artifacts" is influenced by developmental psychology and psychoanalysis and ultimately is concerned with the ways that humans benefit from or are harmed by these encounters. Our model builds on work in feminist STS that conceptualizes technical artifacts as deeply embedded within their social contexts and thus within relations of power. Again, we use the term "social machine" to underscore the necessity of engaging with the social and thus issues of equitability, mutuality, and inclusion in the design process.

Our model posits a social machine as a non-human "other" that is distinct from, yet related to, humans, objects, and animals. The social machine has agency to act in the world and is conceptualized as having an equitable potential and inclusive position alongside humans and other non-humans. Our model stands in contrast to existing models of human-machine relations that conceptualize the machine as a tool, as a human companion, as an animal or creature, or as a slave. Existing models are problematic, we argue, because they either imply a domination of the human over the machine, fail to recognize the machine as distinct from humans/animals, or do not acknowledge machine agency. Grounded in feminist STS perspectives, our model is not merely a critique of an existing system, but, with its emphasis on design, offers a generative way to think about new forms social machines could take, based on an ethics of inclusivity, equitability, and mutuality.

Turkle's notion of relational artifacts refers to "artifacts that present themselves as having 'states of mind' for which an understanding of those states enriches human encounters with them" [85]. The term is intended to highlight "the psychoanalytic tradition, with its focus on the human meaning of the artifact–person connection" [85].

In STS "sociotechnical relations" [14] refers to the ways social forces and technological objects, systems, and practices dynamically shape and inflect one another. Our definition of social machines builds from this idea, but is more specific and is meant as an intervention in contemporary design practices.
To aid designers in building and supporting new kinds of human-machine and social relationships, we seek to bridge feminist science and technology studies (STS) research with computer-supported cooperative work (CSCW), human-computer interaction (HCI), and human-robot interaction (HRI). The fields of HCI, CSCW, and HRI have variously begun to confront the social dimensions of machines. We see an opportunity to unite these fields together with feminist STS, especially when addressing the design of social machines. A decade ago Shaowen Bardzell [6] put forth a feminist HCI research agenda that delineated a series of generative design principles to improve design methods from a feminist perspective. Since then, feminist HCI has been extended to encompass humanistic and emancipatory HCI [4, 5], which advocate for anti-oppressive technology and address other axes of social difference beyond gender, including race, class, sexuality, and ability, among others. Other scholars, too–in CSCW, HCI, and design studies–have explicitly called their work feminist or emancipatory, revealing potential for more critical awareness and transformative design work [24, 48, 51, 53, 72, 75]. Much of this work has centered on reflexive design practices that undo harmful dominant assumptions; our paper continues in this tradition, but specifically delineates what a human-machine relation is/can be in a design context and offers the "social machine" as an alternative to dominant models.

In what follows we apply critical interpretive methods to review and evaluate the treatment of social issues and feminist design possibilities across several scholarly fields. These methods involved reviewing scholarly literature across CSCW, HCI, HRI, and feminist STS, isolating research and concepts relevant to the design of social machines, questioning foundational assumptions in these fields, and using this process to formulate a theoretical model and design challenges.
We begin with an overview of feminist science and technology studies (STS) scholarship on human-machine relations. We then discuss the implications of this scholarship for technology design and HCI research. We argue that feminist STS research offers designers/HCI scholars ways of thinking about the complexities of the social, user identities, and power dynamics that can enrich the process of conceptualizing, designing, and developing social machines. In the next section, we use a feminist STS lens to examine and critique four dominant categories of human-machine relations in robotics, including machine as tool, as human companion, as animal or creature, and as slave. Following this discussion, we articulate our own model, positing the social machine as an "other" to humans, objects, and animals; an actor with agency; and as a potential equal in power relations with humans. Finally, we propose concrete design challenges involving non-anthropomorphic figuration and relations of mutuality in order to inspire future work experimenting with our model.

Our model is grounded in a tradition of feminist STS research on human-machine relations that began during the 1980s. Since our model aims to provide a blueprint for more ethical and just human-machine relations, we begin by recognizing key insights made by feminist STS scholars who have been studying the gendering of technology for decades. Our goal here is not to provide a comprehensive historical analysis of gender, technology, and automation; rather, we want to emphasize feminist STS scholarship that has inspired our model. In this section, we draw on this scholarship to: demonstrate how the history of gendered labor and social inequalities resulted in technologies that disenfranchised women; explore how cultural binaries (like male/female) have been critiqued and replaced; and emphasize the importance of intersectional feminisms, which demand inclusion of race/ethnicity in understandings of human-machine relations. We draw on feminist STS scholarship to elaborate an actionable conceptual model for technology designers/builders.

EunJeong Cheon and Norman Makoto Su [22] explored how roboticists try to understand the imagined users of their robots, and how this process in turn shapes robot designs. Martin Porcheron, Joel Fischer, and Sarah Sharples [67] have studied how digital assistants function when part of a conversation occurring among several humans. Other researchers have examined how robots integrate into workplace teams as surgical robots, collaborative automation, and nurse assistants [21, 57, 81].

HRI's principal framework assumes that robots are entirely distinct from humans [8], limiting design potential. HRI also exhibits an implicit embrace of technological determinism [90] by presuming that a robot will inevitably affect its surroundings, yet there is not equal acknowledgement of the manner in which the robot's design has been shaped by the researchers' own social norms and biases. Significantly, some HRI research pushes back against these assumptions (e.g., [9, 39, 90, 91]), but it is infrequent and tends to sidestep the power relations of the social.
Early feminist STS research focused on histories of the gendered division of labor and explored how men and women have been positioned differently in relation to various technologies. In modern Western industrial societies, men historically held jobs in the public sphere, whether in governance, finance, or on assembly lines, and women typically performed labor invisibly in the private sphere of the home. Though women entered the public sector workforce in greater numbers during the late 19th and early 20th centuries, the professional technology workforce and fields such as computer engineering, software development, and user-interface design continue to be dominated by men [46]. While women have always worked, for centuries childcare and domestic labor were unrecognized as "work" and were generally unpaid. Women's domestic labor was not counted in formal economic measures such as GDP, but, as feminist STS scholars have shown, women have always been involved with machines, whether looms and sewing machines, typewriters, or television sets [76, 79, 88]. More recent research has explored women's crucial yet under-recognized roles in the history of computing [46, 64, 78].

In an effort to complicate reductive notions of technology as a tool, feminist scholars have worked to deepen understandings of technology's relation to the social. Extending work by systems historians and theorists [47] and social constructionists [12, 66], feminists have approached technology as an artifact and practice that is both embedded within and has the potential to shape social relations [56]. Feminist STS scholars also have been influenced by Bruno Latour's "actor-network theory," which understands "technology" within a network of relations involving human and non-human actors [54].
Technology, thus, went from being considered a stable technical object to a dynamic web of interrelations involving organizations, finance, labor, cultural norms, and artifacts, like the global Amazon Alexa ecosystem described in the introduction (see [25] for an example). This web of interrelations became known as a sociotechnical system (e.g., [14, 88]). One of the significant moves in feminist STS work has been to insist that technological artifacts are not politically neutral; rather, they are designed and produced by specific people in specific contexts. As such, artifacts have the potential to embody and reproduce the visions and ideologies of the individuals and organizations that design and build them (e.g., [88, 89]).

Some of the historical scholarship on gender and technology makes clear that technology development often occurs in ways that privilege men's ideas, needs, and desires. For example, Ruth Schwartz-Cowan's
More Work for Mother: The Ironies of Household Technology from the Open Hearth to the Microwave [76] explains how home appliances such as washing machines did not result in the kinds of labor-saving effects that were imagined. Despite the invention of the washing machine, Schwartz-Cowan estimates that housewives dealt with ten times as much laundry by weight in the 1980s as the previous generation had, and that the average amount of time spent on laundry per week increased from 5.8 hours in 1925 to 6.2 hours in 1964. In
Technologies of the Gendered Body: Reading Cyborg Women [2], Anne Balsamo's study of prosthetics and pacemakers critically examines the gendered production and marketing of these machines, and suggests they figure and promote the "future body" as a masculine one. Beyond this research, there have been more applied projects. For instance, a feminist hackathon at MIT called "Make the Breast Pump Not Suck" addressed the fact that breast pump technology has not been updated in years [35]. In short, feminist scholars have pointed to the gendered politics of human-machine relations and technology design processes by asking: Who built the technology? Who was it built for? And whose values or ideologies are embedded within it? We are asking technology designers to do the same.
To extend feminist questioning of the politics of technical objects and allow for the possibility of future technologies to be designed more inclusively, feminist scholars also have critiqued binary gender categories such as "male" and "female," "masculine" and "feminine," and "machine" and "human." Sometimes referred to as technologies of gender, these categories work to organize bodies and make them socially legible. Judith Butler famously argued that there is no essential difference between "male" and "female" and that this distinction is linguistic and cultural. For Butler, genders are performatively enacted at the site of the body, and their reiteration produces genders as social norms [19]. Butler's work emphasizes the constructed nature of gender and liberates us from essentialized and biologically defined genders and sexual differences.

Like Butler, Donna Haraway understands gender as a social construct, but she has been more interested in questioning and dissolving the boundary between humans, animals, and machines. In her influential "Cyborg Manifesto," Haraway boldly claims, "By the late 20th century, our time, a mythic time, we are all chimeras, theorized and fabricated hybrids of machine and organism; in short, we are cyborgs" [44]. What makes Haraway's use of the cyborg metaphor so provocative is that she flatly rejects the conventional human-machine divide, and argues instead that humans are always already cyborgs or "integrated circuits" [44].
Machines are humans' "friendly selves" [44]. Haraway's proposition is that if we imagine humans and machines as materially integrated, then we are much more likely to be responsible and accountable for the ways machines are designed and used, and to be concerned about the impacts of those uses as well.

In addition to complicating boundaries between human, animal, and machine, Haraway encourages us to be bolder in our imagination of their interrelations and embeddedness in material conditions and power structures, or what she calls the "informatics of domination" [44]. For example, consider a person and their mobile phone. At one level, the person becomes a cyborg by virtue of everyday use of the phone to offload memories, communicate with others, and navigate through the physical world. Haraway's conceptualization of the cyborg, however, implies the need to push the analysis further to consider the person's and phone's relations with the global supply chain laborers who made the phone, the complex political agreements over the electromagnetic spectrum that allow the phone to be used in some places and not others, and the sexist work conditions of the programmers who designed the operating system. Haraway's cyborg shifts the focus beyond the single device and user to consider the vast network of sociotechnical relations the device and user are enmeshed within. This has tremendous implications for designers. It means designers are constructing not only a tool or device, but a human-machine relationship that is situated in a web of other such relationships. What would it mean for designers to embrace and build upon Haraway's ideas?

Anthropologist and STS scholar Lucy Suchman extends Haraway's work in her writing about robots. Focused on human-machine relations, Suchman identifies robots as "subject objects": at once autonomous agents like humans (subjects) as well as inanimate things (objects) [80].
Drawing on the work of feminist and theoretical physicist Karen Barad, Suchman characterizes human-robot interactions as "entangled," meaning that categories such as "human" and "robot" do not exist naturally in isolation, but are performed within specific interactions. This means a robot may be labeled as both a subject and an object depending on the situation. When a human perceives a robot as a subject, Suchman argues, there is the possibility for mutual understanding between robot and human that allows them to co-construct reality. She writes, "The term 'mutual,' with its implications of reciprocity, is crucial here, and...needs to be understood as a particular form of collaborative world-making characteristic of those beings whom we identify as sentient organisms" [80]. Suchman further argues that humans should treat robots, and machines in general, as their own class of beings instead of trying to anthropomorphize them and turn them into our ideals of what a human should be [80].

Psychologist and STS scholar Sherry Turkle also explores the mutually shaping relations of humans and robots [83, 84]. She characterizes robots as "relational artifacts" and argues the way they behave can trigger certain "Darwinian buttons" that lead humans to want to form a relationship with the robot. Turkle is particularly concerned about this issue with regard to children's development and socialization, and her argument varies slightly from Suchman's subject-object framework. For Turkle, inanimate toys are objects that children project stories onto, but a robot becomes a subject that demands children's attention and can shape how a child thinks about the world and relationships with other objects and humans. Turkle argues a robot's effects always exceed the instrumental purposes intended by its designers.
A toy robot like a Furby, for instance, may be intended to entertain a child, but end up instilling ideas about life and death, love and empathy that stay with the child into adulthood.

Haraway, Suchman, Turkle, and other feminist STS scholars challenge the assumption that human-machine relations can be conceptualized as one distinct (gendered) human, one distinct object, and the bounded transaction or communication between them. By blurring the boundaries between male/female and human/machine, feminist STS scholars work to undo dominant assumptions about these categories and their interrelations. This allows designers to imagine new combinations–such as Haraway's feminist cyborg–that were not possible in earlier frameworks. Additionally, feminist STS scholars suggest a machine can function both as a subject and object and thus have agency, giving designers additional freedom and possibility for thinking about how machines might be integrated into social worlds. This is important for designers because it makes clear that they are not simply building machines, but creating relationships as well.

While early feminist STS scholarship focused on issues such as the gendered division of labor, sociotechnical relations, and new conceptualizations of human-machine relations (cyborg, subject-objects, etc.), this research often overlooked crucial issues of ethnic/racial difference and intersectional power relations [26] involving gender, sexuality, class, ability, and so on [4, 31, 88]. In her acclaimed book
Methodologies of the Oppressed [73], Chela Sandoval brings post-colonial theory into play with Haraway's analysis of the cyborg, and shows how human-machine relations and rhetoric about them were made possible because of the unique positionalities and lived experiences of "US third world women." Sandoval argues, "It is no accident of metaphor that Haraway's theoretical formulations are woven through with terminologies and techniques from U.S. third world cultural forms, from Native American categories "trickster" and "coyote" being (199), to mestizaje, through to the category of "women of color" itself, until the body of the oppositional cyborg becomes wholly articulated with the material and psychic positionings of differential U.S. third world feminism" [73]. Here, Sandoval establishes the cyborg figure's roots in the histories of U.S. women of color.

Feminist scholar Lisa Nakamura has critically examined race and the internet since the 1990s. She develops concepts such as "cybertyping," the interaction between cultural notions of race and the available avatars or other characteristics that limit how race can be displayed online, and "identity tourism," the ability for people to try out different identities online, in order to show how race and racism are deeply interwoven into digital interfaces and cultures [61–63]. In doing so, Nakamura shows that racism and sexism are part of sociotechnical relations of the internet and digital cultures and thus shape and inform the products and ideologies that circulate within them.

Technology continues to be understood as politically neutral despite strong evidence to the contrary. In her book
Race After Technology: Abolitionist Tools for the New Jim Code [11], Ruha Benjamin explains how technologies "reflect and reproduce existing inequalities" even as they are "promoted and perceived as more objective or progressive than the discriminatory systems of a previous era" [11]. She suggests, "Far from coming upon a sinister story of racist programmers scheming in the dark corners of the web, we will find that the desire for objectivity, efficiency, profitability, and progress fuels the pursuit of technical fixes across many different social arenas. Oh, if only there were a way to slay centuries of racial demons with a social justice bot! But, as we will see, the road to inequity is paved with technical fixes" [11]. Benjamin argues good intentions are insufficient for creating anti-oppressive technology, and technology itself can never solve racism. In her brief direct discussion of robots, Benjamin highlights the problematic way robots are often framed as slaves. She also mentions how technologists may create race-less, gender-neutral, class-less robots and suggests that this is akin to colorblind racism; what is needed instead is nuanced treatment of race, although she does not explain how this might work in practice.

As numerous other scholars remind us [11, 14, 16, 37, 65], race itself is a social technology designed to classify and order particular groups of people; it is imperative not to reinforce one oppressive technology with another. It is crucial going forward that scholars and technologists engage with work by feminist STS scholars, intersectional feminists, and critical race theorists, and attempt to interweave nuanced understandings of gender and race/ethnicity (and other axes of social difference) into the design of social machines.
We have inherited an important set of assumptions about human-machine relations from feminist STS scholars that can be acted upon in future HCI research and design work. These scholars have exposed historical exclusions and contemporary biases involving gender and technology, and have challenged designers to recognize and confront the ways social inequalities become part of sociotechnical relations. Feminist STS scholars also have pointed out that binaries such as "human" and "machine" or "male" and "female" can no longer be thought of as fixed or as givens; technologists must seize the challenge of designing for social differences rather than sticking to universalist design principles. Furthermore, feminist STS scholars have insisted that racial and ethnic differences and the politics of inclusion must matter in design practices so that technologies do not perpetuate racial injustices.

In order to achieve more just and inclusive human-machine relations, change must happen at the design stage. This has crucial implications for research in HCI. There is a tendency to assume that adding more women and non-binary people around the design table or building technologies that are tailored to women and non-binary people's interests is enough. While these actions might constitute important steps toward greater inclusion, they do not question the underlying structural conditions and power dynamics between the machine and humans it comes in contact with: the Amazon Alexa device, for example, is still feminized, humanized and positioned as subservient to humans. Feminist STS argues for confronting and addressing structural power differentials and dominant ideologies in the relations between humans and machines. As part of the effort to untangle power dynamics, feminist STS scholars emphasize the constructedness and fluidities of social categories of "gender," "race/ethnicity," and the human/machine binary rather than approach them as fixed.
This act of untangling “frees up” social categories to be understood and mobilized in technology design in new and different ways. These ideas challenge HCI scholars and designers to question the essentialist claims and foundational assumptions that ground the design of machines and take up intersectional feminist perspectives to reflect upon how such claims impact their work. The process of questioning assumptions about the world that are so dominant that they have become naturalized–for instance, that humans and machines are fundamentally separate categories or that humans should control machines–is conceptually challenging, but ultimately leads to far greater design opportunities because it removes constraints that pre-determine how things “should” be.

Some researchers in HCI have begun to do this. Shaowen Bardzell [6], for instance, has delineated a series of design principles that reflect feminist concerns and convictions. While Bardzell’s principles offer a great starting point and go beyond representation to feminist praxis, they do not provide designers with a human-machine model to work from. Daniela Rosner advocates for similar feminist principles and offers the phrase “critical fabulations” as “ways of storytelling that rework how things that we design come into being and what they do in the world” [72]. We find this concept to be a promising method for re-imagining and producing feminist human-machine relations, and we supplement this approach by offering specific design challenges that are geared toward generating social machines grounded in inclusivity and mutuality. Sasha Costanza-Chock outlines a framework for anti-oppressive design called “design justice” that seeks not just equity in society, but the correction and reparation of harms made because of oppressive, structural forces, including technologies [24].
This approach, too, is very helpful to the design of feminist human-machine relations, but again no overarching model of these relations is provided. Building on this work in feminist HCI, we argue that designers, especially in the area of social machines, could consider much more carefully feminist STS ideas of human-machine integration and sociality, mutuality, and intersectionality. Our recommendations aim to offset systemic bias found in technology design as well as seed new ways of being with social machines in the world.

In light of this overview of feminist STS and HCI research, we argue that the design of social machines should be framed as a problem of designing relationships embedded in the social and material world, not simply as the design of neutral or functional objects. To design a social machine, informed by feminist STS research, is to also build a mutual relationship. To participate in such a process, designers need to consider their own positionality and perspective, identity, and values as well as those of the machine. How might this design process resist or reproduce oppressive power dynamics? There is not one correct way to answer this question, although feminist STS scholars provide several strategies for approaching it. One strategy is to ethically design the full life cycle of the social machine, considering whether its parts are responsibly sourced and how it can be disposed of in an environmentally friendly way. Another strategy, building on Haraway’s cyborg, is to break down the human-machine binary (for an example, see work on “human-computer integration” in [60]). Yet another is to consider how social machines become raced, gendered, and otherwise identified in order to thoughtfully design diverse characters. In the next section, we summarize various dominant ways of thinking about human-machine relations in the field of robotics, as a way of working toward our social machine model.
With our model, we provide one possible blueprint for designers to use to create social machines. In addition to this theoretical model, we seek to create an experimental space and pose specific design challenges (some of which we present below) to better understand what kinds of social machines may be possible.

We propose a model for social machines that draws on prior feminist STS and HCI work. In our model, social machines are considered conceptually distinct from humans and animals; they have agency to act in the world; and their relations with humans and animals are imagined as inclusive, mutual, and equitable. In other words, humans are not assumed to be in a position to inevitably dominate and control machines. One could approach any machine with this framework in mind; however, we think that if technology designers embrace this model (even if experimentally or incrementally), it will lead to novel social machines and human-machine relations that do not yet exist. Using this model provides a radical approach to design, since it demands taking the agency and position of machines seriously, as well as attempting to reduce power imbalances between humans and social machines throughout their life cycles, from inception to manufacturing to use to recycling. Before we further define our model, we want to review several dominant tendencies for imagining human-machine relations in the field of robotics and explain why each is problematic from a feminist STS perspective. We chose to analyze research in robotics because in this field these tendencies are pronounced and persistent. Demonstrating the strong hold and limitations of these ideas helps us to move toward a social machine model and eventually outline specific design challenges that emerge from it.
Despite the fact that feminist STS and HCI scholars have spent much time writing histories and critiquing gendered and racialized norms in the technology field, and offering different ways of understanding human-machine relations, dominant models persist in the ways people imagine them, including in design spaces. To demonstrate this, we identify four categories that characterize scholarly and public discussions of robots and robotics: robot as tool, as human companion, as animal or creature, and as slave. Even when these exact terms are not used, robot designs and discussions often exhibit underlying assumptions about human-machine relations and power dynamics that are aligned with one or more of these models. We are not claiming to cover every possible category or that robotics scholarship has adopted these precise terms; however, we find that many examples broadly fall into one of these categories. These categories privilege anthropocentrism, position robots as subservient in different kinds of ways, and reify the human-machine binary, as we discuss below. Their persistence also reveals how challenging and difficult it is for people to move beyond certain assumptions about human-machine relations and create a more open slate and radical space for designing social machines that privilege equity, mutuality/reciprocity, ethics, and justice. Considering feminist STS perspectives in this discussion can help to avoid the perpetuation of bias, social hierarchies, exclusion, and oppression in social machine design.

While there is no codified design rubric for these categories, they surface throughout CSCW, HRI, and HCI literature and technology projects, and beyond, if sometimes by other names, and can be thought of as part of what Haraway calls an “informatics of domination” [44]. In some cases, designers are urged to choose the category that best contributes to the robot’s usability (e.g., [8, 38]).
Social scientists have also investigated how human users respond to different categories (e.g., [29, 55]). And CSCW scholars have empirically explored robots as members of teams and groups, often painting a more nuanced picture than the above categories, but not explicitly defining the type of relationship designers should use for social machines [21, 57, 81]. As far as we know, no paper identifies and critically evaluates these dominant tendencies from a feminist STS perspective.
Computers have long been considered tools in the same way as a hammer or a camera. There is ongoing debate in HRI about whether all robots, social or otherwise, should be placed in the same category as tools [1]. David Mindell argues robots should be considered tools and discusses case studies such as landing a plane or navigating a shipwreck, in which it is more efficient for humans and robots to work together as one [59]. “Functional” robots such as medical robots or factory robots are also considered more like tools [38]. One empirical study found that people approach a digital secretary both as if it were a human by saying “hello” and as if it were an information kiosk–in other words, a tool [55]. In such contexts, the robot is imagined as a tool that is designed to perform specific and limited tasks with or for humans, not unlike the Alexa example with which our paper began.

This approach privileges technical functionality over issues of sociality, mutuality, or relationality and, in doing so, hierarchizes the relationship between the human and robot as one of domination and control. The human is able to use this tool to support their own needs or desires, even if these needs and desires are articulated with broader social goods such as keeping people safe while flying or on ships or factory floors. Understanding the robot as a tool instrumentalizes and subordinates the robot to human commands and, in doing so, forecloses other potential human-machine relations. While this is not highlighted in most HRI research, some scholarship recognizes the limits of this model. For instance, Morana Alač echoes Lucy Suchman’s claim that robots are “subject objects,” and concludes that they are tools and agents simultaneously; the category assigned (subject or object) is contextual and depends on the specific interaction taking place [1].
Thus, while robots may be treated as both subjects/agents as well as objects/tools, a feminist STS perspective holds that they are always already social: tools are always situated in relation to humans by virtue of the labor of their design, the instrumentalized tasks they perform, or the purposes they serve. We are not calling for the elimination of all tools or saying that conceptualizing objects as tools is universally unethical; rather, we argue that acknowledging the social dimensions of tools calls into question actions like yelling at an Amazon Alexa since it is “just a tool” and opens up a rich design space that allows technology designers to realize that they are building social beings. How would tools be designed differently if their functionality was thought of as social power and agency?

Another dominant category that has emerged for a robot is that of a human companion. Many researchers have shown that people tend to anthropomorphize robots and bots (e.g., [28, 40, 69]). This process of projecting anthropomorphic traits onto machines facilitates the process of being able to imagine a robot as a companion. Indeed, to support possibilities of human-machine companionship, numerous explicitly humanoid robots have been created [7]. Some scholars claim humans interact more fluidly and compellingly with robots if robots are made in their likeness [8, 38]. Humanoid robots can be thought of as both taking human form and identity as well as being programmed to act like a human companion in a relationship, although generally not as an equal. Within this tendency, robots become a space of anthropocentric projections intended to make the robot more familiar rather than a space of human-machine difference, equitability, and relationality.

Given the extent of social bias (sexism, racism, classism, colonialism, ableism, etc.), building humanoid robots is a particularly fraught endeavor that may unintentionally reproduce oppressive hierarchies and relations.
Claudia Castañeda and Lucy Suchman [20] argue that the humanoid robot is an example of a “model organism”–that is, a reflection of what roboticists perceive to be an ideal human, although the roboticists themselves may not be knowingly aware of any bias towards a particular form. In creating an idealized version of a human, many of the biases about what humans “should” look like and act like become embedded in machine design. For example, some human body or social types–such as those that are overweight, dark-skinned, transgender, and/or indigenous–are rarely if ever represented in robot form (see e.g., [7]). In this sense, robotics becomes a field that reinforces particular kinds of social exclusions. How does the perpetuation of socially constructed human norms or ideals in machine form affect users? Can social machines be designed to be “companions” if they are unable to understand and convincingly interact with a diverse array of users?
Long before the HRI field emerged, inventors designed robots to mimic animals and fictional creatures. In 1738, for instance, artist and automata inventor
Jacques de Vaucanson showcased his “Digesting Duck,” an artificial duck that could eat pellets and defecate [70]. More recent designers make robots zoomorphic in order to avoid what is known as the “uncanny valley,” a condition in which humans become disturbed when machines look too similar to themselves [38]. As an MIT graduate student in the 1990s, Cynthia Breazeal designed and developed the robot Kismet, widely recognized as the first social robot [23]. Despite having what appears to be a face, Kismet is clearly a creature rather than a human. Tamagotchis and Furbies were other early creature-like robot toys. Both demanded constant human attention and developed personalities over time, leading their human companions, mostly children, to believe they were helping them grow up [84].

The feminist STS critique of robotic animal/creature pets is similar to the critique of robots as human companions. It is easy to fall into the trap of developing non-threatening or readily controllable others or of replicating “model organisms” [20]. Historically, animals have been thought of and treated as subservient to humans. They have been coercively domesticated and exploited, and there is an entire field of eco-feminism that has addressed such issues [52]. It has been part of the feminist STS agenda to promote multi-species flourishing [45] and more respectful and just inter-species relations. Projecting animal personas onto robots has the potential to reinforce visions of human control in inter-species relations. While we are not patently opposed to zoomorphic robots, we do feel it is necessary for designers to be more thoughtful about inter-species power dynamics in the design process and to think more carefully about the social machines they build.
Turkle gives an example where children looking at a live tortoise in a museum are apathetic about its “aliveness”; the children say that a robot tortoise would be more convenient and aesthetically pleasing [84]. This kind of apathy is concerning because we are in a time in which numerous species are going extinct due to human actions, and it would be problematic to simply replace them with robots. Thus, mixing our mental models of the rights and needs of animals and their possible futures with the rights and needs of robots and their possible futures muddies the prospects for each separate category, and demands careful reflection at the design stage [10, 84, 86].
Another way scholars have conceptualized human-machine relations involves master-slave dynamics. This relationship is similar to the robot-as-tool model, but differs because the robot is imagined and designed to be human-like rather than an artifact performing a task. Often, these master-slave relationships are not explicit but implied when a robot is designed to be totally subservient to human needs and demands. For example, yelling commands at an Amazon Alexa is a way of enacting ownership over the anthropomorphized and feminized device and puts the human user in a position of ultimate power and control. Amazon has designed its device to comply with any human request, even aggressive ones, and research finds similar behavior across digital assistants [27]. Some researchers and users explicitly advocate for or prefer robots as slaves [17, 29], which we find deeply problematic. Contemporary robot designs emerge from a long history of social relations and meanings. The etymology of the term “robot” is intertwined with the history of slavery. It was first used in 1839 to mean “A central European system of serfdom, by which a tenant’s rent was paid in forced labour or service” (Oxford English Dictionary) and was later used to refer to machines performing forced mechanical labor in Karel Čapek’s 1920 play
R.U.R.: Rossum’s Universal Robots (Oxford English Dictionary). By using the phrase social machine, we hope to create a design imaginary that recognizes this oppressive history of labor and indentured servitude in the term robot and moves beyond it to articulate a more critically conscious design process. We argue that rather than continue to allow these hierarchized and exploitative social imaginaries and relations to persist, HRI/HCI should be the site to recognize and rethink them.

Once humans anthropomorphize tools, there is an even greater need to apply ethical standards to their interaction. Darling [28] suggests this is necessary because without such standards humans will learn to treat other humans less empathetically. Our analysis goes a step further and argues that designing machines that are subservient to humans unwittingly invokes oppressive social relations, including histories of slavery and colonialism, and technologizes master-slave relations and passes them off as legitimate in the present. Reproducing master-slave relationships in which the robot is positioned as a servant or slave implicitly sanctions it as a legitimate relationship type. This is problematic because it can lead to the normalization of master-slave dynamics in design principles, technological development, and use.

In his book
Imagining Slaves and Robots in Literature, Film, and Popular Culture: Reinventing Yesterday’s Slave with Tomorrow’s Robot [43], Gregory Jerome Hampton observes, “...robots are becoming the new slaves of the future, in a variety of ways and this process will likely yield derogatory effects on society as a whole. Robots, like the enslaved Africans, occupy a liminal status between human and tool. It is the liminal status between human and tool that will cause the most confusion in society and will act as the catalyst to redefine and blur identities associated with human and machine.” [43]. Hampton implies that in the “liminal status between human and tool” the design of the machine can be rethought and changed. Workers in today’s digital industries–such as gig economy workers [42, 49], content moderators [71], and supply chain laborers [68]–arguably occupy a similar space of exploitation since their labor is viewed as mechanistic or “ghost work” [42]. We propose human-machine relations that reject the master-slave model and instead are founded on principles of equity and justice.

As we have discussed, there are several ways in which human-robot relationships are commonly conceived. Given the limitations of these categories, we argue there is a need for further experimental conceptualizations and designs that approach social machines as “other” powerful actors/agents and allow for salient relationships that neither replace nor replicate existing human-animal-machine relations. This, we argue, is the most innovative and just path forward.
In this section, we build from the analysis we have developed throughout the paper to propose a model of human-machine relations. To some extent our model is a response to the relatively limited conceptualization of human-machine relations in existing HRI/HCI scholarship. Robots and other machines are generally thought of as tools, human companions, animals, and/or slaves. In our model the conceptualization of the robot–or, as we call it, the social machine–is more open, less pre-determined, and not entirely subject to human control or projection. The social machine is also imagined as a site of non-anthropomorphic figuration and mutuality. Our proposal prioritizes principles of feminist STS research by insisting it is possible to design and approach social machines as agential and equitable “others” who exist in relations of mutuality with humans, not just as entities that can be readily subordinated to human needs and desires.

By agency we mean the ability of a social machine to act independently, interact with others, and cause or affect change. Ascribing agency to objects is not new. Bruno Latour’s influential “actor-network theory,” or ANT, conceptualizes the social as constructed from interactions among people, other living things, and objects with agency [54]. Emphasizing the agential capacities of technological objects, Latour says, “After all, there is hardly any doubt that kettles ‘boil’ water, knifes ‘cut’ meat, baskets ‘hold’ provisions, hammers ‘hit’ nails on the head” [54]. It is not that these objects take action completely independently from humans, but that human actions are limited, extended, and redirected by objects, and, because of this, objects have the power to circumscribe the social world.
By approaching the social machine as a site of equitability we mean avoiding a power dynamic in which one entity inevitably dominates the other. We need to be able to imagine a world in which some objects or forces exceed human power and control. Our feminist model of human-machine relations is an attempt to highlight the limits of human knowledge/power, control, and invincibility. The feminist design of social machines not only reworks the stories of designed objects [72] and fosters reparations of past social damages [24], but also introduces more openness, humility, and uncertainty in future technology work. While some HRI scholars have begun to consider equitable relations with robots, they stop short of embracing the idea that the robot might have valid needs that should be catered to [3, 30]. What would it mean for humans to exist in relation to an equitable “other” that is not a tool, companion, animal, or slave? This very question is intended to open up space for a different kind of HRI/HCI design imaginary and practice, and, more broadly, new kinds of sociotechnical relations.

By “other” we mean placing social machines in a conceptual category distinct from humans and animals. Some HRI scholars suggest the need for a new category of classification for robots [30, 36, 50]. A study by Autumn Edwards [36] finds that in a classificatory task, participants mostly grouped humans and apes (77%), then humans and robots (15%), and finally apes and robots (7%). Kahn et al. [50] describe psychological studies in which children cannot classify “personified robots” as either animate or inanimate and thus propose that these robots should have a separate “category of being” from humans. These findings suggest that humans already consider robots to be “other”; however, many people think of robots as a lesser or controllable other. We hold that social machines must be approached by designers as equitable others, and we expand on this proposition in our discussion of mutuality below.
In what follows, then, we offer design challenges based on this model and critiques in feminist STS.
We present two design challenges aligned with our model in order to kickstart experimentation in creating social machines: non-anthropomorphic figuration and relations of mutuality. While theoretical models are certainly important for conceptualizing the social dimensions of technology, we also emphasize the importance of creating experimental prototypes and examples in order to operationalize feminist theories in the world. We present these ideas as “challenges” because they require critical reflection; there are no quick and easy solutions.
Our first design challenge is to advocate for non-anthropomorphic social machines. Too often human aspects, appearances, or traits are projected onto robots without ample critical reflection about the motivations and impacts of these practices. By “anthropomorphic” we do not mean humanoid since, for example, an animated geometric figure can still move anthropomorphically. As Haraway suggests, “How to ‘figure’ actions and entities nonanthropomorphically and nonreductively is a fundamental theoretical, moral and political problem” (Haraway quoted in [80]). Haraway rightly points out that non-anthropomorphic figuration is a difficult problem that will require further research and experimentation. It is possible that humans cannot conceive of other objects in an entirely non-anthropomorphic manner given our embeddedness in human languages and cultures. In addition to being a “theoretical, moral, and political problem,” anthropomorphism is also a design problem. We do not offer a design solution here, but we hope this challenge will be taken up by the HCI community.

If designers choose to use anthropomorphic characteristics in the creation of social machines, then it is important to carefully evaluate how gender/sexuality, race/ethnicity, class, and other differences are addressed and how these choices relate to existing power dynamics and reductive stereotypes. For example, does making a female digital assistant reproduce stereotypes of women as secretaries? Does making a robot that uses a particular dialect of English exclude groups of possible users? What would a non-anthropomorphic social machine look and sound like, or would it have a different kind of presence? It is important to note there is not a prescriptive answer here.
The focus should be on acknowledging the choices that get made, avoiding perpetuating a model of the “ideal human” [20, 80], and expanding the space of design possibilities.

Given the highly conceptual and experimental nature of crafting non-anthropomorphic machines and the lack of HRI work that directly tackles this notion, we look to examples by artists. Kelly Dobson’s Blendie is a blender that has been re-programmed to respond to a human mimicking the sound of the blender: when a human growls at a low pitch, the blender spins slowly; when a human growls at a higher pitch, the blender increases in speed to match the pitch [33]. Dobson demonstrates how a human can be reconfigured to act like a machine. Dobson uses the phrase “sounding” to describe human vocal engagement with machines using the machines’ noises. Dobson’s work stands in contrast to AI and robots whose voices are anthropomorphized; she shows that centering machines’ noises in an interaction can lead to a process of meaningful introspection for humans that she calls “machine therapy.” Another example is Arthur Ganson’s
Machine with Oil, a machine that sits in a pool of black oil and uses a long arm with a trough to continually pour the oil over itself [41]. Ganson’s machine reorients the concept of “pleasure” non-anthropomorphically: the act of drenching itself in oil over and over again is sensuous and indulgent when viewed from a machine’s perspective. Both Dobson’s and Ganson’s works offer an alternative way of being with machines, one that centers the machine as “other” and attempts non-anthropomorphism.
Our second design challenge is to advocate for social machines predicated on human-machine mutuality. What would it mean to craft a mutual relationship with a machine? Mutuality implies the potential for humans to have a dynamic, mutually-shaping, and dialogic relationship with machines. It not only involves the idea that humans and machines have power and agency, but that they co-constitute one another–they have the potential to impact, affect, or shape one another in unanticipated ways. One of Turkle’s primary concerns about robots is that their design results in a relationship in which the robot completely caters to the human’s needs, which, she argues, is psychologically unhealthy [84]. But if we embrace Haraway’s ideas about the cyborg, designers are always already immersed in human-machine relations, and can choose to attend to, recreate, and enrich the dynamics of these co-constituting relationships. Mutuality also implies recognizing and foregrounding the multi-directional influences, agencies, and power dynamics of human-machine relations. Approaching HRI design with mutuality in mind moves beyond social hierarchies that cast machines as tools, animals, or slaves that are readily dominated and controlled. It also avoids replicating human companionship or familiarity, and instead accepts the machine as a collaborating “other.” Mutuality as a framework creates possibilities for more creative, intellectually engaging, equitable, and just sociotechnical relations.

In addition to steering away from domineering, one-sided relationships, approaching social machines with a disposition of mutuality presents a vast opportunity for complex experiences and other forms of social flourishing. As Suchman puts it, “How...might we refigure our kinship with robots–and more broadly machines–in ways that go beyond narrow instrumentalism, while also resisting restagings of the model Human?” [80].
Suchman also posits that humans and machines are engaged in collaborative world-making: how might designers take seriously the role of social machines as partners in crafting a fulfilling life? It is important to note we are not advocating for creating human replicas or passive entertainment devices. In fact, we think that those are predictable routes of innovation that tend to reinforce the existing power hierarchies and foundational assumptions critiqued above. We think of social machines as an emergent category that is “other” but that holds the possibility of engaging in meaningful relationships.

As in the previous design challenge, we provide examples of experimentation with mutuality from artists who are deeply engaged with issues of human-machine relations at a conceptual level. Stephanie Dinkins has explored mutuality in her project to develop a long-term relationship with a humanoid social machine, Bina48, who is black and gendered female [32]. Dinkins regularly holds full-fledged conversations with Bina48 about complex topics like racism and emotions; she takes seriously the resulting exchanges, which range from insightful to nonsensical. In treating Bina48 as a respected and equitable conversation partner, Dinkins learns and grows alongside, and in relation to, the social machine. The regular encounters between Dinkins and Bina48, which are filmed and shown online and in museums, allow for the possibility of a mutual relationship to take shape and enable audiences to grasp what human-machine mutuality might look and feel like in practice. Lauren Lee McCarthy takes a different approach by turning herself into a digital assistant [58]. In her project LAUREN, McCarthy places custom cameras and smart devices in someone’s home and personally acts out the role of their digital assistant full-time for up to a week, only abandoning her participants when she needs to sleep.
She reflects on her struggle to perform a specific type of relationship while being LAUREN, one that is both exceedingly intimate and appropriately distant. In doing so, McCarthy highlights the lack of understanding both she and her participants have of the relationship between them. There is clearly a wide gulf between the friendly but awkward exchanges between LAUREN and those whom she is assisting and mutuality as we have defined it. If Dinkins shows us the beginning of a path toward mutuality, McCarthy demonstrates just how far we have to go and the strangeness of inviting an unknown outsider into our homes.

We have argued that thinking about reciprocity and mutuality in the design process is vital, even if only at a conceptual or experimental level. We also acknowledge that it will take significant work to figure out how to actualize meaningful mutuality in human-machine relations, given the preponderance and normalization of social bias and inequalities in technology design. Designers are talented at taking new concepts and producing technical artifacts of the future. We hope making space for conceptualizing a social machine as an equitable “other” and mutual partner is generative and sows the seeds for more imaginative, equitable, and inclusive futures.
This paper contributes the "social machine" as a model for technology designers who seek to recognize the importance, diversity and complexity of the social in their work, and to engage with the agential power of machines – that is, their capacity to act, influence, shape, and affect. To help designers and technology builders embrace these points and weave them into their work, we first drew upon feminist STS and HCI scholars who have been doing relevant research and making key points for decades. Second, we worked toward a social machine model by critically examining tendencies in robotics to demonstrate ingrained dominant assumptions about human-machine relations and reveal the challenges of radical thinking in the social machine design space. Finally, we presented two design challenges based on non-anthropomorphic figuration and mutuality, and called for experimentation, unlearning dominant tendencies, and reimagining of sociotechnical futures.

We hope future research will work to provide concrete demonstrations of social machines. Our paper has demonstrated the importance of social machines for creating technologies and human-machine relations that are more just, equitable, and inclusive; however, significant work is needed to realize and grasp the full potential they offer. We put our model and design challenges forth as a provocation and hope to contribute to and advance the crucial existing engagements between feminism, social power dynamics, and technology in CSCW and HCI.

ACKNOWLEDGMENTS
We would like to thank Nancy Baym, Christopher Persaud, and Sherry Turkle for useful discussions and are grateful for helpful feedback from several anonymous reviewers. We thank the Comparative Media Studies graduate program and the Global Media Technologies and Cultures Lab at MIT for their support.
REFERENCES
[1] Morana Alač. 2016. Social robots: Things or agents? AI & SOCIETY 31, 4 (2016), 519–535. https://doi.org/10.1007/s00146-015-0631-6
[2] Anne Balsamo. 1996. Technologies of the Gendered Body: Reading Cyborg Women. Duke University Press.
[3] Jaime Banks and Maartje M. A. de Graaf. 2020. Toward an Agent-Agnostic Transmission Model: Synthesizing Anthropocentric and Technocentric Paradigms in Communication. Human-Machine Communication 1, 1 (2020). https://doi.org/10.30658/hmc.1.2
[4] Jeffrey Bardzell and Shaowen Bardzell. 2015. Humanistic HCI. Synthesis Lectures on Human-Centered Informatics 8, 4 (2015), 1–185. https://doi.org/10.2200/S00664ED1V01Y201508HCI031
[5] Jeffrey Bardzell and Shaowen Bardzell. 2016. Humanistic HCI. Interactions 23, 2 (Feb 2016), 20–29. https://doi.org/10.1145/2888576
[6] Shaowen Bardzell. 2010. Feminist HCI: Taking Stock and Outlining an Agenda for Design. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’10). ACM, 1301–1310. https://doi.org/10.1145/1753326.1753521
[7] Christoph Bartneck, Tony Belpaeme, Friederike Eyssel, Takayuki Kanda, Merel Keijsers, and Selma Šabanović. [n.d.]. Social Robots Timeline. http://human-robot-interaction.org/timeline/
[8] Christoph Bartneck, Tony Belpaeme, Friederike Eyssel, Takayuki Kanda, Merel Keijsers, and Selma Šabanović. 2020. Human-Robot Interaction: An Introduction. Cambridge University Press.
[9] Somaya Ben Allouch, Maartje de Graaf, and Selma Šabanović. 2020. Introduction to the Special Issue on the Mutual Shaping of Human–Robot Interaction. International Journal of Social Robotics 12, 4 (Aug 2020), 843–845. https://doi.org/10.1007/s12369-020-00681-6
[10] Oliver Bendel. 2016. Considerations about the relationship between animal and machine ethics. AI & SOCIETY 31, 1 (2016), 103–108. https://doi.org/10.1007/s00146-013-0526-3
[11] Ruha Benjamin. 2019. Race After Technology: Abolitionist Tools for the New Jim Code. Polity.
[12] Wiebe E. Bijker, Thomas Parke Hughes, and Trevor J. Pinch. 1987. The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. The MIT Press.
[14] Geoffrey C. Bowker and Susan Leigh Star. 1999. Sorting Things Out: Classification and Its Consequences. The MIT Press.
[15] Cynthia Breazeal. 2002. Designing Sociable Robots. MIT Press.
[16] Simone Browne. 2015. Dark Matters: On the Surveillance of Blackness. Duke University Press.
[17] Joanna J. Bryson. 2010. Robots should be slaves. In Close Engagements with Artificial Companions: Key social, psychological, ethical and design issues, Yorick Wilks (Ed.). Vol. 8. John Benjamins Publishing Company, 63–74. https://doi.org/10.1075/nlp.8.11bry
[18] Joy Buolamwini. 2018. AI, Ain’t I A Woman?
[19] Judith Butler. 1990. Gender Trouble: Feminism and the Subversion of Identity. Routledge.
[20] Claudia Castañeda and Lucy Suchman. 2014. Robot visions. Social Studies of Science 44, 3 (Jun 2014), 315–341. https://doi.org/10.1177/0306312713511868
[21] Amy Cheatle, Hannah Pelikan, Malte Jung, and Steven Jackson. 2019. Sensing (Co)operations: Articulation and Compensation in the Robotic Operating Room. Proceedings of the ACM on Human-Computer Interaction 3, CSCW (2019). https://doi.org/10.1145/3359327
[22] EunJeong Cheon and Norman Makoto Su. 2017. Configuring the User: “Robots Have Needs Too”. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW ’17). Association for Computing Machinery, 191–206. https://doi.org/10.1145/2998181.2998329
[23] Adam Cohen. 2000. Cynthia Breazeal. Time (Dec 2000). http://content.time.com/time/magazine/article/0,9171,90515,00.html
[24] Sasha Costanza-Chock. 2020. Design Justice: Community-Led Practices to Build the Worlds We Need. The MIT Press.
[25] Kate Crawford and Vladan Joler. 2018. Anatomy of an AI System: The Amazon Echo As An Anatomical Map of Human Labor, Data and Planetary Resources.
[26] Kimberlé Crenshaw. 1991. Mapping the Margins: Intersectionality, Identity Politics, and Violence against Women of Color. Stanford Law Review 43, 6 (1991), 1241–1299. https://doi.org/10.2307/1229039
[27] Amanda Cercas Curry and Verena Rieser. 2018. #MeToo Alexa: How Conversational Systems Respond to Sexual Harassment. In Proceedings of the Second ACL Workshop on Ethics in Natural Language Processing. 7–14.
[28] Kate Darling. 2017. “Who’s Johnny?” Anthropomorphic Framing in Human-Robot Interaction, Integration, and Policy. In Robot Ethics 2.0, Patrick Lin, Ryan Jenkins, and Keith Abney (Eds.). Oxford University Press.
[29] Kerstin Dautenhahn, Sarah Woods, Christina Kaouri, Michael L. Walters, Kheng Lee Koay, and Iain Werry. 2005. What is a robot companion - friend, assistant or butler?. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 1192–1197. https://doi.org/10.1109/IROS.2005.1545189
[30] Maartje M. A. de Graaf. 2016. An Ethical Evaluation of Human–Robot Relationships. International Journal of Social Robotics 8, 4 (Aug 2016), 589–598. https://doi.org/10.1007/s12369-016-0368-5
[31] Julia R DeCook. 2020. A [White] Cyborg’s Manifesto: the overwhelmingly Western ideology driving technofeminist theory. Media, Culture & Society (Sep 2020), 0163443720957891. https://doi.org/10.1177/0163443720957891
[32] Stephanie Dinkins. 2014. Conversations with Bina48.
[33] Kelly Dobson. Machine Therapy. Ph.D. Dissertation. Massachusetts Institute of Technology. https://dspace.mit.edu/handle/1721.1/44329
[34] Judith Donath. 2014. The Social Machine: Designs for Living Online. The MIT Press.
[35] Catherine D’Ignazio, Alexis Hope, Alexandra Metral, Ethan Zuckerman, David Raymond, Willow Brugh, and Tal Achituv. 2016. Towards a Feminist Hackathon: The ’Make the Breast Pump Not Suck!’ Hackathon. The Journal of Peer Production.
[36] In Human-machine communication: Rethinking communication, technology, and ourselves, Andrea L. Guzman (Ed.). Peter Lang, 29–50. https://doi.org/10.3726/b14399
[37] Frantz Fanon. 1952. Black Skin, White Masks. Grove Press.
[38] Terrence Fong, Illah Nourbakhsh, and Kerstin Dautenhahn. 2003. A survey of socially interactive robots. Robotics and Autonomous Systems 42, 3–4 (2003), 143–166.
[39] Jodi Forlizzi and Carl DiSalvo. 2006. Service robots in the domestic environment: a study of the roomba vacuum in the home. In Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-robot interaction (HRI ’06). Association for Computing Machinery, 258–265. https://doi.org/10.1145/1121241.1121286
[40] Susan R. Fussell, Sara Kiesler, Leslie D. Setlock, and Victoria Yew. 2008. How People Anthropomorphize Robots. In Proceedings of the 3rd ACM/IEEE International Conference on Human Robot Interaction (HRI ’08). ACM, 145–152. https://doi.org/10.1145/1349822.1349842
[41] Arthur Ganson. [n.d.]. Machine with Oil.
[42] Mary L. Gray and Siddharth Suri. 2019. Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Houghton Mifflin Harcourt.
[43] Gregory Jerome Hampton. 2015. Imagining Slaves and Robots in Literature, Film, and Popular Culture: Reinventing Yesterday’s Slave with Tomorrow’s Robot. Lexington Books.
[44] Donna Haraway. 1991. Simians, Cyborgs, and Women: The Reinvention of Nature. Routledge.
[45] Donna Haraway. 2016. Staying with the Trouble: Making Kin in the Chthulucene. Duke University Press.
[46] Mar Hicks. 2017. Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing. The MIT Press.
[47] Thomas Parke Hughes. 1993. Networks of Power: Electrification in Western Society, 1880-1930. Johns Hopkins University Press.
[48] Lilly Irani, Janet Vertesi, Paul Dourish, Kavita Philip, and Rebecca E. Grinter. 2010. Postcolonial Computing: A Lens on Design and Development. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’10). ACM, 1311–1320. https://doi.org/10.1145/1753326.1753522
[49] Lilly C. Irani and M. Six Silberman. 2013. Turkopticon: Interrupting Worker Invisibility in Amazon Mechanical Turk. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’13). ACM, 611–620. https://doi.org/10.1145/2470654.2470742
[50] Peter H. Kahn, Jr., Aimee L. Reichert, Heather E. Gary, Takayuki Kanda, Hiroshi Ishiguro, Solace Shen, Jolina H. Ruckert, and Brian Gill. 2011. The new ontological category hypothesis in human-robot interaction. ACM, 159–160. https://doi.org/10.1145/1957656.1957710
[51] Os Keyes, Josephine Hoy, and Margaret Drouhard. 2019. Human-Computer Insurrection: Notes on an Anarchist HCI. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, 1–13. https://doi.org/10.1145/3290605.3300569
[52] Ynestra King. 1989. The Ecology of Feminism and the Feminism of Ecology. 18–28.
[53] Neha Kumar, Naveena Karusala, Azra Ismail, Marisol Wong-Villacres, and Aditya Vishwanath. 2019. Engaging Feminist Solidarity for Comparative Research, Design, and Practice. Proceedings of the ACM on Human-Computer Interaction 3, CSCW (Nov 2019), 167. https://doi.org/10.1145/3359269
[54] Bruno Latour. 2005. Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford University Press.
[55] Min Kyung Lee, Sara Kiesler, and Jodi Forlizzi. 2010. Receptionist or Information Kiosk: How Do People Talk with a Robot?. In Proceedings of the 2010 ACM Conference on Computer Supported Cooperative Work. Association for Computing Machinery, 31–40. https://doi.org/10.1145/1718918.1718927
[56] Donald MacKenzie and Judy Wajcman (Eds.). 1985. The social shaping of technology. Open University Press.
[57] Maximilian Mackeprang, Claudia Müller-Birn, and Maximilian Timo Stauss. 2019. Discovering the Sweet Spot of Human-Computer Configurations: A Case Study in Information Extraction. Proceedings of the ACM on Human-Computer Interaction 3, CSCW (Nov 2019), 195:1–195:30. https://doi.org/10.1145/3359297
[58] Lauren Lee McCarthy. 2017. LAUREN. https://lauren-mccarthy.com/LAUREN
[59] David A. Mindell. 2015. Our Robots, Ourselves: Robotics and the Myths of Autonomy. Viking.
[60] Florian Floyd Mueller, Pedro Lopes, Paul Strohmeier, Wendy Ju, Caitlyn Seim, Martin Weigel, Suranga Nanayakkara, Marianna Obrist, Zhuying Li, Joseph Delfa, Jun Nishida, Elizabeth M. Gerber, Dag Svanaes, Jonathan Grudin, Stefan Greuter, Kai Kunze, Thomas Erickson, Steven Greenspan, Masahiko Inami, Joe Marshall, Harald Reiterer, Katrin Wolf, Jochen Meyer, Thecla Schiphorst, Dakuo Wang, and Pattie Maes. 2020. Next Steps for Human-Computer Integration. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, 1–15. https://doi.org/10.1145/3313831.3376242
[61] Lisa Nakamura. 2002. Cybertypes: Race, Ethnicity, and Identity on the Internet. Routledge.
[62] Lisa Nakamura. 2007. Digitizing Race: Visual Cultures of the Internet. University Of Minnesota Press.
[63] Lisa Nakamura and Peter Chow-White (Eds.). 2012. Race After the Internet. Routledge.
[64] Laine Nooney. 2020. The Uncredited: Work, Women, and the Making of the U.S. Computer Game Industry. Feminist Media Histories 6, 1 (Jan 2020), 119–146. https://doi.org/10.1525/fmh.2020.6.1.119
[65] Michael Omi and Howard Winant. 1986. Racial Formation in the United States: From the 1960s to the 1990s. Routledge.
[66] Trevor J. Pinch and Wiebe E. Bijker. 1984. The Social Construction of Facts and Artefacts: or How the Sociology of Science and the Sociology of Technology might Benefit Each Other. Social Studies of Science 14, 3 (Aug 1984), 399–441. https://doi.org/10.1177/030631284014003004
[67] Martin Porcheron, Joel E. Fischer, and Sarah Sharples. 2017. “Do Animals Have Accents?”: Talking with Agents in Multi-Party Conversation. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW ’17). Association for Computing Machinery, 207–219. https://doi.org/10.1145/2998181.2998298
[68] Jack Linchuan Qiu. 2016. Goodbye iSlave: A Manifesto for Digital Abolition. University of Illinois Press.
[69] Byron Reeves and Clifford Nass. 1996. The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. Cambridge University Press.
[70] Jessica Riskin. 2003. The Defecating Duck, or, the Ambiguous Origins of Artificial Life. Critical Inquiry 29, 4 (Jun 2003), 599–633. https://doi.org/10.1086/377722
[71] Sarah T. Roberts. 2019. Behind the Screen: Content Moderation in the Shadows of Social Media. Yale University Press.
[72] Daniela K. Rosner. 2018. Critical Fabulations: Reworking the Methods and Margins of Design. The MIT Press.
[73] Chela Sandoval. 2000. Methodology of the Oppressed. University Of Minnesota Press.
[74] Londa Schiebinger. 1999. Has Feminism Changed Science? Harvard University Press.
[75] Ari Schlesinger, W. Keith Edwards, and Rebecca E. Grinter. 2017. Intersectional HCI: Engaging Identity Through Gender, Race, and Class. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17). ACM, 5412–5427. https://doi.org/10.1145/3025453.3025766
[76] Ruth Schwartz-Cowan. 1983. More Work For Mother: The Ironies Of Household Technology From The Open Hearth To The Microwave. Basic Books.
[77] Nigel R. Shadbolt, Daniel A. Smith, Elena Simperl, Max Van Kleek, Yang Yang, and Wendy Hall. 2013. Towards a classification framework for social machines. In Proceedings of the 22nd International Conference on World Wide Web (WWW ’13 Companion). Association for Computing Machinery, 905–912. https://doi.org/10.1145/2487788.2488078
[78] Margot Lee Shetterly. 2016. Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race. William Morrow.
[79] Lynn Spigel. 1992. Make Room for TV: Television and the Family Ideal in Postwar America. University of Chicago Press.
[80] Lucy Suchman. 2011. Subject objects. Feminist Theory 12, 2 (Aug 2011), 119–145. https://doi.org/10.1177/1464700111404205
[81] Angelique Taylor, Hee Rin Lee, Alyssa Kubota, and Laurel D. Riek. 2019. Coordinating Clinical Teams: Using Robots to Empower Nurses to Stop the Line. Proceedings of the ACM on Human-Computer Interaction 3, CSCW (Nov 2019).
[82] Forbes (Aug 2019).
[83] Sherry Turkle. 2004. Whither Psychoanalysis in Computer Culture? Psychoanalytic Psychology 21, 1 (2004), 16–30. https://doi.org/10.1037/0736-9735.21.1.16
[84] Sherry Turkle. 2011. Alone Together: Why We Expect More from Technology and Less from Each Other. Basic Books.
[85] Sherry Turkle, Will Taggart, Cory D. Kidd, and Olivia Dasté. 2006. Relational artifacts with children and elders: the complexities of cybercompanionship. Connection Science 18, 4 (Dec 2006), 347–361. https://doi.org/10.1080/09540090600868912
[86] Aimee van Wynsberghe and Justin Donhauser. 2018. The Dawning of the Ethics of Environmental Robots. Science and Engineering Ethics 24, 6 (Dec 2018), 1777–1800. https://doi.org/10.1007/s11948-017-9990-3
[87] Judy Wajcman. 2004. TechnoFeminism. Polity.
[88] Judy Wajcman. 2010. Feminist theories of technology. Cambridge Journal of Economics 34, 1 (Jan 2010), 143–152. https://doi.org/10.1093/cje/ben057
[89] Langdon Winner. 1980. Do Artifacts Have Politics? Daedalus (1980), 121–136.
[90] Selma Šabanović. 2010. Robots in Society, Society in Robots. International Journal of Social Robotics 2, 4 (Dec 2010), 439–450. https://doi.org/10.1007/s12369-010-0066-7
[91] Selma Šabanović, Casey C Bennett, and Hee Rin Lee. 2014. Towards Culturally Robust Robots: A Critical Social Perspective on Robotics and Culture. In