Design Judgment in Data Visualization Practice
Paul Parsons* Colin M. Gray† Ali Baigelenov‡ Ian Carr§
Purdue University

ABSTRACT
Data visualization is becoming an increasingly popular field of design practice. Although many studies have highlighted the knowledge required for effective data visualization design, their focus has largely been on formal knowledge and logical decision-making processes that can be abstracted and codified. Less attention has been paid to the more situated and personal ways of knowing that are prevalent in all design activity. In this study, we conducted semi-structured interviews with data visualization practitioners during which they were asked to describe the practical and situated aspects of their design processes. Using a philosophical framework of design judgment from Nelson and Stolterman [23], we analyzed the transcripts to describe the volume and complex layering of design judgments that are used by data visualization practitioners as they describe and interrogate their work. We identify aspects of data visualization practice that require further investigation beyond notions of rational, model- or principle-directed decision-making processes.
Index Terms: Human-centered computing—Visualization
INTRODUCTION
One of the goals of visualization research is to influence design practice. This aim is becoming increasingly relevant, as data visualization as a profession is rapidly growing in popularity [2, 24]. To effectively influence design practice, it is necessary to understand the ways in which practitioners structure their work in relation to the complexity they face in real-world settings [31]. While research focusing on tools and issues that practitioners face is gaining traction in InfoVis [2, 3, 13, 19, 24, 33], only a small number of studies have been grounded in real-world design practice (e.g., [2, 13, 24, 25, 33]). Although recent work has highlighted the messy, complex, and situated nature of visualization design [17, 18, 20] and the challenges in evaluating visualization authoring systems [27, 28], these efforts have focused on visualization researchers rather than practitioners. Unlike other design-oriented fields, InfoVis lacks a recognizable expansion of scholarly inquiry into design practice as an activity in its own right—distinct from research activity—in ways that can account for the complexity of everyday design situations.

Multiple design-related fields have undergone a turn to practice as a means of informing research, with a recognition that barriers to knowledge production and use have arisen between the academic and practitioner communities. The role of practice-led research is relevant both in emergent design fields such as interaction design and instructional design [4, 15], and in more traditional domains such as architecture and studio art. We position this paper as a means of motivating further work that is practice-focused and practice-led, drawing inspiration from these design fields in an effort to better understand the ways in which practitioners design and the knowledge that can inform future visualization design research.
* e-mail: [email protected] † e-mail: [email protected] ‡ e-mail: [email protected] § e-mail: [email protected] In this work, we focus on how designers think and make decisionsduring their design process. Although design decision and processmodels have previously been proposed in InfoVis (e.g., [18, 21,22, 29]), the limitations of such models in describing real-worlddesign practice have been noted by design scholars in other fields.[7, 11, 12, 16, 31]. These scholars highlight the personal and situatedfactors of design practice that are essential for making decisionsand moving through a design process—factors that tend to resistmodeling and codification.We use a philosophical framework from Nelson and Stolter-man [23,32] to guide our inquiry into the ways that data visualizationpractitioners make decisions during their design work. We buildupon decades-old interest in phronesis —or the practical knowledgethat underlies and supports the work of professionals [10, 30]—andinvestigate how design judgment , a particular aspect of that profes-sional engagement, aids a designer in confronting the complexity ofreal-world design situations and guides decision making [23]. Thiswork builds upon prior investigation of design judgments in instruc-tional design, where Boling and colleagues described the complexand layered nature of judgments in informing design practice [4, 12]and design education [9].The contribution of this paper is two-fold: First, we identify anddescribe the volume of design judgments that are used by data visu-alization practitioners as they describe and interrogate their work,laying the foundation for future study of the complexity of designpractices in pedagogy and industry contexts. 
Second, we characterize the complex layering of judgments both within discrete design acts and over time, demonstrating aspects of data visualization practices that require further investigation beyond notions of rational, model- or principle-directed decision-making processes.
UNDERSTANDING DESIGN PRACTICE
Early InfoVis research was largely tool- and technique-driven, with little emphasis placed on the design process and the role of the designer in that process. Over roughly the past decade, InfoVis researchers have embraced a more expansive view of design, turning attention to both the process designers go through and the decisions they make while designing. Often drawn from personal experience in designing visualization systems, researchers have proposed various models for visualization design (e.g., [6, 17, 18, 21, 22]). These models have especially focused on process (e.g., [29]) and decisions (e.g., [18, 21, 22]) that designers purportedly make while designing.

While design models have been influential in the research community, it is unclear how well they characterize and support design practice, especially in non-research settings where practitioners may ground their work and process in less theoretical and more pragmatic ways. Furthermore, as the development of these models did not involve the study of design practitioners, it is unclear how ecologically valid claims of process and decision-making might be "in the wild" of practice. In particular, extant investigations have not accounted for the personal, situated aspects of design that significantly influence designers' thinking and are inseparable from the designer. The personal, situated nature of design activity requires a more expansive vocabulary that goes beyond formal decision making to include judgments that are undertaken at every stage of visualization design in opportunistic and personal ways.

Decision Making and Complexity

Rational decision-making processes are often viewed as the standard for dealing with complex situations. However, the study of real-world decision-making has shown that rational processes are the exception rather than the rule when people are faced with uncertainty in complex situations [14].
Formal models offer neat and clear descriptions of how people make decisions, yet they do so by neglecting the contextual aspects of decision making that are not easy to model. Empirical investigations show that people instead rely on patterns of experience, making judgments that do not look much like formal decisions at all [1].

We use the concept of design judgment—a particular form of judgment—to investigate how data visualization practitioners engage with design complexity, recognizing that design situations are messy, complex, and situated in the real world. In this context, designers do not stand apart from the design process—they participate in it, shape it, and mold it using their lived experience and a range of knowledge types. Thus, particular judgments are subjectively positioned, not replicable, and generally resist codification. This subjectivity of practice links to broader understandings of decision-making, such as the role of personal experience [14] and other personal commitments in even the most "objective" situations [26]. Although judgments do not look like 'rational' decisions, they are not irrational or mere opinion; judgment relies on familiarity and prototypicality, in which patterns of experience are drawn from in a contextually bound manner that resists codification or objectivation [14]. It is this space in which we seek to make a contribution in the empirical work that follows.
METHOD
As part of a broader effort to investigate aspects of design complexity in data visualization practice, we interviewed practitioners and asked them questions about their design practice. Recruiting was done via social media, the DataVis Society's Slack workspace, the InfoVis email list, and our personal networks. To mitigate sampling bias, we also searched widely for professionals and agencies, ultimately contacting more than 200 individuals and more than 30 agencies.

Interviews were semi-structured and were conducted remotely via videoconferencing. For this paper, we selected 10 of the 20 transcripts to analyze, aiming for diversity across the self-reported characteristics (see Table 1). We selected one section of the transcripts in which we asked participants to describe: (1) their typical design process at a high level and (2) how they assessed their progress, including determining if they were on the right track and making decisions about what to do next in their process. Depending on the answers given, we asked a number of follow-up questions regarding how participants started their design process, what kinds of considerations they made, whether their process was more structured or unstructured, and how they would make specific decisions (e.g., choosing chart types or visual encoding channels). This section accounted for roughly 15 minutes near the beginning of the 60-75 minutes of each interview.

The transcripts were deductively coded using Nelson and Stolterman's framework on design judgment [23]. We operationalized the judgment types described in this framework based on prior work by one of the authors in another design discipline [12] and used these types as a priori codes in our top-down thematic analysis [5]. Table 2 lists these design judgments and their definitions. All judgment types were coded non-exclusively by two or more researchers, with the goal of reaching consensus and full agreement.
We regularly discussed our code application to ensure consistency of coding and a shared agreement regarding the meaning of each judgment type.
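To make the coding scheme concrete, non-exclusive application of a priori codes can be sketched as follows. This is an illustrative sketch only, not the authors' actual analysis tooling, and the coded segments shown are hypothetical; each transcript segment may carry several judgment-type codes at once, so one segment can contribute to multiple tallies.

```python
from collections import Counter

# A priori judgment types from Nelson and Stolterman's framework (Table 2).
JUDGMENT_TYPES = {
    "framing", "appreciative", "appearance", "quality",
    "instrumental", "navigational", "compositional", "connective",
}

def tally_codes(coded_segments):
    """Count how often each judgment type was applied across segments.

    Coding is non-exclusive: a single segment may carry multiple codes,
    so one segment can increment several tallies at once.
    """
    counts = Counter()
    for codes in coded_segments:
        unknown = set(codes) - JUDGMENT_TYPES
        if unknown:
            raise ValueError(f"unrecognized codes: {unknown}")
        counts.update(set(codes))  # de-duplicate codes within one segment
    return counts

# Hypothetical coded segments, for illustration only.
segments = [
    {"framing", "appreciative"},      # layered judgments in one excerpt
    {"instrumental"},
    {"navigational", "instrumental"},
]
print(tally_codes(segments))
```

Tallies of this kind underlie the per-type counts (e.g., n=25 framing judgments) reported in the findings.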
FINDINGS
We first provide an overview of how judgments emerged in the conversations with our participants, specifically highlighting instances where one type of judgment is foregrounded (i.e., brought into focus) and others exist in the background (i.e., related but not focused on). In the next section, we will further describe how these judgment types are complex and layered, with multiple judgment types emerging together.

ID   Job Title                    Exp. (yrs)  Highest Degree  G
P1   Sr. UX Design Lead           8-10        D               M
P2   Graphics Editor              8-10        B               F
P3   Data Communicator            2-4         M               M
P4   Sr. DataVis Dev              >10         D               F
P5   DataVis Designer             5-7         M               F
P6   Data Architect               >10         M               M
P7   DataVis Designer             5-7         B               F
P8   DataVis Designer             >10         M               M
P9   Senior UX Designer           5-7         —               F
P10  DataVis Journalist/Designer  5-7         M               M

Table 1: Our 10 participants and their self-reported characteristics: job title, years of experience, highest degree (Bachelors-B, Masters-M, and Doctoral-D), and gender.

Judgment Type   Description
Framing         Creating a working area for design activity to occur, often through the introduction of constraints.
Appreciative    Assigning importance to some things, while not to others, without the intervention of hierarchy.
Appearance      Making determinations relating to style, nature, character, and experience.
Quality         Making determinations relating to craftspersonship and connoisseurship.
Instrumental    Selecting or reflecting on the influence of tools, methods, or techniques.
Navigational    Identifying and reflecting on a path to achieve a specific design outcome in complex situations.
Compositional   Forming connections among multiple artifacts or concepts, creating a sense of holism or centrality.
Connective      Making connections or bridging design objects to address specific aspects of a design situation.

Table 2: Operationalized design judgment types used in our analysis (adapted from Nelson and Stolterman [23] and Gray et al. [12]).
Framing judgments (n=25) focus on identifying what elements are salient, often through the introduction of explicit or decisive constraints. In some instances, these judgments were more conceptual and philosophical in nature, such as P1's goal to "apply best practices and [. . .] basically trying to reduce cognitive load and pain points" or P3's focus on cognition: "And it's about what techniques improve cognition and behavior change, right?" For other participants, these judgments represented points they could use to begin their design work, representing an initial problem frame. For P7, this focus was on the client goal and the desired outcome: "So the first thing that I need to have clear once I know that I want to accept the project is what is the goal for the client? So what is the main thing that the visualization should do—be able to kind of say that in one or two sentences. What should the people learn and then having a feeling of the data, what variables are in there?"

Appreciative judgments (n=43) often occur alongside framing judgments, representing places where certain elements of the design situation were foregrounded, leading other elements to be backgrounded. For some participants, appreciative judgments represented their approach to a design project, such as P6's goal of "mak[ing] that easier, simpler, clearer" or P9's focus on "showing the data that the client wants to show off." Some examples were more precisely grounded in project decisions, such as this example from P1, where they try to foreground different elements of complexity to guide their decisions: "There's obviously some seasonal variation; the way it's portrayed now, you can't tell what that seasonal variation is. So I was thinking, okay, do we want to communicate seasonal variation? And if so, let me think about breaking it out this way. Do we want to communicate an overall decline? And if so, let me break it out this way instead."

Appearance judgments (n=11) focus on a personal sense of style, character, or experience. While discussing a minimalist style, P5 noted: "I wouldn't say I'm a minimalist [. . .] I need to grab the reader's eye. I find that [minimalism] doesn't really work for that. But in terms of embellishments, I always try and keep it very close to the data. So, instead of a color, I might make it a subtle gradient, I might give it a slight drop shadow to emphasize it. Instead of a straight line between two points, I might give it a small curve. So it's like these, I guess, smaller additions onto the basics. Things like going beyond the default in a more stylistic way." Appearance judgments do not always relate directly to visual characteristics, however, and may be concerned more with character, form, and material or temporal experience. For example, P2 had a desire to "try something cooler" if there was time to return to the original data. P7 alluded to a more experiential goal underlying their visual style: "I've always tried my best to not have fanciness for fanciness' sake [. . .] I would never be like, let me do a D3 force layout because it bounces around and people think it's fun or something like that. I have used a D3 force layout to make all the nodes burst as people scroll, because that actually got people to scroll. And I wanted them to keep scrolling. So I have some sort of a goal. [. . .] I think [engaging users] is a really important function."

Quality judgments (n=22) focus more on craftspersonship, in comparison to the more subjective qualities of appearance represented earlier. Some participants, such as P1, pointed towards the "craft" that is often connected with the science of visualization: "It's sort of, it's part science. It's part craft. I'd say it's actually more craft." In another example, P1 shared a more extensive example of how they connected their approach to existing design languages to improve the quality of their design, while perhaps reducing the felt creativity of this work: "these frameworks [. . .] in some way homogenize your work so that if you look less creative and more corporate on one hand, on the other hand, I think getting it proves usability because there's familiarity to it."

Instrumental judgments (n=48) point towards the influence of tools and their impact on the designer's work within the design process. While there were a substantial number of examples of specific tools that are commonly used in visualization work, some of the more nuanced examples pointed out the limitations and strengths of various tools and how they might relate to one another. For example, P8 assessed the visual vs. data-focused elements of different tools: "In Illustrator it's all about how does this look? [. . .] sometimes it's somewhat about the data because if I'm in Illustrator, I'm not going to worry about getting everything data-accurate. But if it's a complex visualization or it would just be too hard to execute in Illustrator and I go to code first, I literally might write the code to generate the visualization in D3, and in D3 I can export an SVG file and bring that back into Illustrator to then design it up." P10 also shared a more data-specific example, where the form that data takes might encourage certain design paths and discourage others: "in a lot of cases, data is shared in the form of Excel files and that's not really a format that I can work with. Well so what I usually do is try to get it into R [. . .]. So that's almost always the first step."

Navigational judgments (n=19) point towards the path the designer intends to take to execute their plan as they engage in complex situations. Most of our participant responses related to the navigational elements of their overall design process, where they foregrounded certain tools or sequences of tools they used to undertake their work. For P3, this process started by "look[ing] at the material, extract[ing] the data and then start sketching alternatives," and then later "pull[ing] it into a visualization platform and start manipulating the data." P10 described their navigational judgments as more situational, noting that they "don't really think that there's a typical design process or typical data visualization project," later describing their work as initially more focused on data cleaning. P7 described "asking questions" of the data to work through their process: "when I get the data, the first thing to do is [. . .] look at the metadata, the attributes [. . .] and then I start thinking about what might be interesting about that dataset or interesting questions to ask that dataset. [. . .] And I'll keep trying at it until I get a satisfactory answer or, if I don't, then I note that and I go on to the next answer, and I'll do that until I piece together some sort of a story, some sort of interesting things that I could potentially make into a story."

Compositional judgments (n=11) identify relationships among design elements, pointing toward an emergent whole as a central, foregrounded concern, with P1 discussing the role of visualization paradigms in helping them "think very structurally." P7 pointed towards a narrative focus in their process, describing how they iteratively work to "piece together some sort of a story—some sort of interesting things that I could potentially make into a story." In a similar fashion, P10 described creating a "design layer" by bringing together multiple concerns to tell a story: "First is figuring out what you can do with the data and, within those boundaries, select the best way to visualize it. And then it's more like a design layer that you put over the visualization to make it more compelling and then really letting it tell a story, so to speak, by highlighting different things, playing with colors, playing with fonts. And so yeah, that's the design layer you can play with."

Connective judgments (n=19) address the ways in which different elements of design work or designed artifacts are drawn together by the designer. Many of these judgments also pointed towards compositional assemblages as they were operationalized in navigational ways. In one example, P6 describes the ways they connect different judgments relating to the availability of data and its use in creating a visualization: "So the broad steps are fairly straightforward, right? So we need to acquire and explore the data. We need to derive any data that we need out of data that doesn't already exist [. . .]. We need to design the visualization and sort of get approval on that. And then we need to do the work of building the visualization and then promoting it into some environment where it can be consumed." P9 describes a functional assembly of connections that moves their design process: "And so they have done some data manipulation in either R or Excel or something like that. And then we work towards brainstorming how we convey the data and what sort of form it should take. And then I will usually do work on it in Adobe Illustrator. So take that data in vector format and pull it in and do the sort of graphic manipulation to really get their point across and create something unique."

Our second finding is that judgments typically occur in a complex, layered fashion, where one judgment is often foregrounded while others have influence in the background. Our thematic analysis regularly resulted in multiple codes being applied to statements in an overlapping manner. For instance, consider the excerpt from P9 shown in Figure 1. Here we see P9 engaging in multiple, overlapping judgments as a means of confronting the complexity of the design situation. There are framing and appreciative judgments to set the initial design space and potential outputs; connective and instrumental judgments about the tools being used and the ways in which they are connected while moving towards an outcome; navigational judgments that help deal with the complexity of the data in context; and quality and appearance judgments, relating to the creativity and craft of making a point and creating something unique. This complex layering demonstrates how situated, personal judgments are a means of moving through complexity toward a design outcome.

[Figure 1: Complex layering of judgments described by P9 as a means of confronting complexity in the design situation.]
DISCUSSION
When we asked practitioners to reflect on their design work, they described continuously relying on situated judgments in a layered and complex fashion. This was true across all aspects of their process, including how they start, how they assess their progress and decide what steps to take next, and how they select visual channels and chart types. Rarely did practitioners describe reaching conclusions as a result of rational decision-making processes in which options were weighed and specific steps were followed. This is not to suggest that practitioners are irrational or unprincipled in their work. Many of the practitioners we interviewed are experts in data visualization—they have written influential books, won awards for their work, and are highly active and respected in the practitioner community. Rather, they make judgments that draw from an accumulation of prior experience in combination with a personal design philosophy and various situational factors. Studies in naturalistic decision making show precisely this—that experts draw from familiar, prototypical patterns of experience to make good judgments, relying on a process that mimics intuition more than logical deliberation [1, 14].

Aside from a brief mention of design judgment [18], the visualization literature has largely promoted rational, model-driven forms of decision-making [21, 22] and moving through a design process [29]. While these popular models may be useful for researchers, our work suggests they are not sufficient for characterizing real-world design practice, especially in relation to the personal and situated ways practitioners confront complex design situations. Decision and process models suggest that design thinking is rational and structured, yet our analysis reveals that practitioners largely do not describe their design activity in this way.

The emphasis on rational models likely stems from the positivist foundations of the field of visualization [20].
Models and other abstract forms of knowledge ignore the complexity and messiness of particular situations in an effort to generalize and be "objective". While the objectivity of this kind of knowledge is questionable on its own [26], it is also questionable whether rational models have much descriptive or prescriptive value when it comes to decision making in complex environments. Both decision scientists (e.g., [1, 14]) and design scholars (e.g., [8, 31]) have largely abandoned attempts to model decision making in formal and rationalistic ways. At the very least, our findings indicate that we need more knowledge types—ones that are better able to characterize situated, personal knowledge—if we are to adequately understand and influence visualization design practice. Judgments are one such type of knowledge, which we have shown here to be valuable in characterizing the way designers think and work in practice.
SUMMARY
Our analysis shows how designers rely on personal and situated forms of knowledge that cannot be generalized and modeled in the ways that process and decision models tend to be. Our analysis also shows that judgments are complex and layered, rather than taking place in individual and disconnected ways, strengthening the claims made by Gray et al. [12] in another design context. This finding calls into question the adequacy of design models for characterizing design practice—generally speaking, but especially with respect to practitioners designing in non-research settings.

Our work contributes a new conceptual language and theoretical framing for studying visualization design, particularly in terms of design practice and the ways in which designers face complexity and move through their design process. This contribution can also be valuable for practitioners, as surfacing this way of knowing can help them become aware of their own judgment making and identify means to improve it. Our work has implications for data visualization pedagogy, as an appreciation of the personal and situated nature of design is critical for preparing designers to face the complexities of real-world practice. Future scholarship on data visualization pedagogy and practice would benefit from focusing attention beyond formal, objective knowledge and logical processes of decision making, allowing access to the rich nature of design expertise and the ways in which this expertise is developed over time.

ACKNOWLEDGMENTS
We would like to thank our interview participants and the anonymous reviewers of this paper. This work was supported in part by NSF award 1755957.

REFERENCES

[1] L. R. Beach and R. Lipshitz. Why classical decision theory is an inappropriate standard for evaluating and aiding most human decision making. In G. Klein, J. Orasanu, R. Calderwood, and C. E. Zsambok, eds., Decision Making in Action: Models and Methods, pp. 21–35. Ablex Publishing, 1993.
[2] A. Bigelow, S. Drucker, D. Fisher, and M. Meyer. Reflections on how designers design with data. In Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces (AVI '14), pp. 17–24. ACM Press, Como, Italy, 2014. doi: 10.1145/2598153.2598175
[3] A. Bigelow, S. Drucker, D. Fisher, and M. Meyer. Iterating between tools to create and edit visualizations. IEEE Transactions on Visualization and Computer Graphics, 23(1):481–490, 2016.
[4] E. Boling, H. Alangari, I. M. Hajdu, M. Guo, K. Gyabak, Z. Khlaif, R. Kizilboga, K. Tomita, M. Alsaif, A. Lachheb, H. Bae, F. Ergulec, M. Zhu, M. Basdogan, C. Buggs, A. Sari, and R. Techawitthayachinda. Core judgments of instructional designers in practice. Performance Improvement Quarterly, 30(3):199–219, 2017. doi: 10.1002/piq.21250
[5] V. Braun and V. Clarke. Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2):77–101, 2006.
[6] E. Chi. A taxonomy of visualization techniques using the data state reference model. In IEEE Symposium on Information Visualization 2000, pp. 69–75, 2000. doi: 10.1109/INFVIS.2000.885092
[7] N. Cross. Designerly ways of knowing. Design Studies, 3(4):221–227, 1982.
[8] N. Cross, J. Naughton, and D. Walker. Design method and scientific method. Design Studies, 2(4):195–201, 1981.
[9] M. Demiral-Uzan. Instructional design students' design judgment in action. Performance Improvement Quarterly, 28(3):7–23, 2015. doi: 10.1002/piq.21195
[10] J. Dunne. Professional judgment and the predicaments of practice. European Journal of Marketing, 33(7/8):707–720, 1999. doi: 10.1108/03090569910274339
[11] E. Goodman, E. Stolterman, and R. Wakkary. Understanding interaction design practices. In Proceedings of the 2011 SIGCHI Conference on Human Factors in Computing Systems (CHI '11), p. 1061. ACM Press, Vancouver, BC, Canada, 2011. doi: 10.1145/1978942.1979100
[12] C. M. Gray, C. Dagli, M. Demiral-Uzan, F. Ergulec, V. Tan, A. A. Altuwaijri, K. Gyabak, M. Hilligoss, R. Kizilboga, K. Tomita, and E. Boling. Judgment and instructional design: How ID practitioners work in practice. Performance Improvement Quarterly, 28(3):25–49, 2015. doi: 10.1002/piq.21198
[13] J. Hoffswell, W. Li, and Z. Liu. Techniques for flexible responsive visualization design. In Proceedings of the 2020 SIGCHI Conference on Human Factors in Computing Systems (CHI '20), pp. 1–13, 2020.
[14] G. A. Klein. Sources of Power: How People Make Decisions. MIT Press, 2 ed., 2017.
[15] K. Kuutti and L. J. Bannon. The turn to practice in HCI: Towards a research agenda. In Proceedings of the 2014 SIGCHI Conference on Human Factors in Computing Systems (CHI '14), pp. 3543–3552. ACM Press, Toronto, Canada, 2014. doi: 10.1145/2556288.2557111
[16] B. Lawson. How Designers Think: The Design Process Demystified. Routledge, 2006.
[17] N. McCurdy, J. Dykes, and M. Meyer. Action design research and visualization design. In Proceedings of the 6th Biannual Workshop on Evaluation and Beyond - Methodological Approaches for Visualization (BELIV), pp. 10–18. ACM Press, New York, NY, USA, 2016. doi: 10.1145/2993901.2993916
[18] S. McKenna, D. Mazur, J. Agutter, and M. Meyer. Design activity framework for visualization design. IEEE Transactions on Visualization and Computer Graphics, 20(12):2191–2200, 2014. doi: 10.1109/TVCG.2014.2346331
[19] G. G. Méndez, U. Hinrichs, and M. A. Nacenta. Bottom-up vs. top-down: Trade-offs in efficiency, understanding, freedom and creativity with InfoVis tools. In Proceedings of the 2017 SIGCHI Conference on Human Factors in Computing Systems (CHI '17), pp. 841–852, 2017.
[20] M. Meyer and J. Dykes. Criteria for rigor in visualization design study. IEEE Transactions on Visualization and Computer Graphics, 26(1):87–97, 2020. doi: 10.1109/TVCG.2019.2934539
[21] M. Meyer, M. Sedlmair, P. S. Quinan, and T. Munzner. The nested blocks and guidelines model. Information Visualization, 14(3):234–249, 2015. doi: 10.1177/1473871613510429
[22] T. Munzner. A nested model for visualization design and validation. IEEE Transactions on Visualization and Computer Graphics, 15(6):921–928, 2009. doi: 10.1109/TVCG.2009.111
[23] H. G. Nelson and E. Stolterman. The Design Way: Intentional Change in an Unpredictable World. The MIT Press, 2 ed., 2012.
[24] P. Parsons, A. Baigelenov, Y.-H. Hung, and C. Schrank. What design methods do DataVis practitioners know and use? In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1–8. ACM, Honolulu, HI, USA, 2020. doi: 10.1145/3334480.3383048
[25] P. Parsons and P. Shukla. Data visualization practitioners' perspectives on chartjunk. In Proceedings of the 2020 IEEE Visualization Conference (VIS), short papers, 2020.
[26] M. Polanyi. The Tacit Dimension. University of Chicago Press, 1966.
[27] D. Ren, B. Lee, M. Brehmer, and N. H. Riche. Reflecting on the evaluation of visualization authoring systems: Position paper. pp. 86–92. IEEE, 2018.
[28] A. Satyanarayan, B. Lee, D. Ren, J. Heer, J. Stasko, J. Thompson, M. Brehmer, and Z. Liu. Critical reflections on visualization authoring systems. IEEE Transactions on Visualization and Computer Graphics, 26(1):461–471, 2019.
[29] M. Sedlmair, M. Meyer, and T. Munzner. Design study methodology: Reflections from the trenches and the stacks. IEEE Transactions on Visualization and Computer Graphics, 18(12):2431–2440, 2012. doi: 10.1109/TVCG.2012.213
[30] J. Shotter and H. Tsoukas. Performing phronesis: On the way to engaged judgment. Management Learning, 2014.
[31] E. Stolterman. The nature of design practice and implications for interaction design research. International Journal of Design, 2(1):55–65, 2008.
[32] S. G. Vickers. Judgment. In The Vickers Papers, pp. 230–245. Harper & Row, London, 1984.
[33] J. Walny, C. Frisson, M. West, D. Kosminsky, S. Knudsen, S. Carpendale, and W. Willett. Data changes everything: Challenges and opportunities in data visualization design handoff.