Publications


Featured research published by Dan M. Kahan.


Nature | 2010

Fixing the Communications Failure

Dan M. Kahan

There is a culture war in America over science. Why? And what should be done to promote the ability of culturally diverse citizens to agree on how science can inform their common interests in health, security, and prosperity? This article uses the findings of Cultural Cognition Project studies to address these questions.


Nature | 2012

Why we are poles apart on climate change

Dan M. Kahan

Understandably anxious to explain persistent controversy over climate change, the media have discovered a new culprit: the public. By piecing together bits of psychological research, many news reporters, opinion writers and bloggers have concluded that people are simply too irrational to recognize the implications of climate-change science. This conclusion gets it half right. Studying things from a psychological angle does help to make sense of climate-change scepticism. But the true source of the problem, research suggests, is not that people are irrational. Instead, it is that their reasoning powers have become disabled by a polluted science-communication environment.

Social-science research indicates that people with different cultural values — individualists compared with egalitarians, for example — disagree sharply about how serious a threat climate change is. People with different values draw different inferences from the same evidence. Present them with a PhD scientist who is a member of the US National Academy of Sciences, for example, and they will disagree on whether he really is an ‘expert’, depending on whether his view matches the dominant view of their cultural group (D. M. Kahan et al. J. Risk Res. 14, 147–174; 2011). The positions on climate change of both groups track their impressions of recent weather. Yet their impressions of what the recent weather has been are polarized, too, and bear little relationship to reality (K. Goebbert et al. Weath. Clim. Soc. 4, 132–144; 2012).

But is this sort of cultural polarization evidence of irrationality? If it is, then how can we explain the fact that members of the lay public who are the most science literate, and the most proficient at technical reasoning, are also the most culturally polarized (D. M. Kahan et al. Nature Clim. Change http://dx.doi.org/10.1038/nclimate1547; 2012)? If anything, social science suggests that citizens are culturally polarized because they are, in fact, too rational — at filtering out information that would drive a wedge between themselves and their peers.

For members of the public, being right or wrong about climate-change science will have no impact. Nothing they do as individual consumers or as individual voters will meaningfully affect the risks posed by climate change. Yet the impact of taking a position that conflicts with their cultural group could be disastrous. Take a barber in a rural town in South Carolina. Is it a good idea for him to implore his customers to sign a petition urging Congress to take action on climate change? No. If he does, he will find himself out of a job, just as his former congressman, Bob Inglis, did when he himself proposed such action. Positions on climate change have come to signify the kind of person one is. People whose beliefs are at odds with those of the people with whom they share their basic cultural commitments risk being labelled as weird and obnoxious in the eyes of those on whom they depend for social and financial support.

So, if the cost of having a view of climate change that does not conform with the scientific consensus is zero, and the cost of having a view that is at odds with members of one’s cultural community can be high, what is a rational person to do? In that situation, it is perfectly sensible for individuals to be guided by modes of reasoning that connect their beliefs to ones that predominate in their group. Even people of modest scientific literacy will pick up relevant cues. Those who know more and who can reason more analytically will do a still better job, even if their group is wrong on the science.

So whom should we ‘blame’ for the climate-change crisis? To borrow a phrase, it’s the ‘science-communication environment, stupid’ — not stupid people. People acquire their scientific knowledge by consulting others who share their values and whom they therefore trust and understand. Usually, this strategy works just fine. We live in a science-communication environment richly stocked with accessible, consequential facts. As a result, groups with different values routinely converge on the best evidence for, say, the value of adding fluoride to water, or the harmlessness of mobile-phone radiation.

The trouble starts when this communication environment fills up with toxic partisan meanings — ones that effectively announce that ‘if you are one of us, believe this; otherwise, we’ll know you are one of them’. In that situation, ordinary individuals’ lives will go better if their perceptions of societal risk conform with those of their group. Yet when all citizens simultaneously follow this individually rational strategy of belief formation, their collective well-being will certainly suffer. Culturally polarized democracies are less likely to adopt policies that reflect the best available scientific evidence on matters — such as climate change — that profoundly affect their common interests.

Overcoming this dilemma requires collective strategies to protect the quality of the science-communication environment from the pollution of divisive cultural meanings. Psychology — along with anthropology, sociology, political science and economics — will play a part. But to apply the insights that social science has already given us, we will have to be smart enough to avoid reducing what we learn to catchy simplifications.


University of Chicago Law Review | 2000

Gentle Nudges vs. Hard Shoves: Solving the Sticky Norms Problem

Dan M. Kahan

The resistance of law enforcers sometimes confounds the efforts of lawmakers to change social norms. Thus, as legislators expand liability for date rape, domestic violence, and drunk driving, police become less likely to arrest, prosecutors to charge, jurors to convict, and judges to sentence severely. The conspicuous resistance of these decisionmakers in turn reinforces the norms that lawmakers intended to change. Can this “sticky norms” pathology be effectively treated? It can be, this article argues, if lawmakers apply “gentle nudges” rather than “hard shoves.” When the law embodies a relatively mild degree of condemnation, the desire of most decisionmakers to discharge their civic duties will override their reluctance to enforce a law that attacks a widespread social norm. The willingness of most decisionmakers to enforce can initiate a self-reinforcing wave of condemnation, thereby allowing lawmakers to increase the severity of the law in the future without prompting resistance from most decisionmakers. The article presents a formal model of this strategy for norm reform, illustrates it with real-world examples, and identifies its normative and prescriptive implications.


Archive | 2011

The Tragedy of the Risk-Perception Commons: Culture Conflict, Rationality Conflict, and Climate Change

Dan M. Kahan; Maggie Wittlin; Ellen Peters; Paul Slovic; Lisa Larrimore Ouellette; Donald Braman; Gregory N. Mandel

The conventional explanation for controversy over climate change emphasizes impediments to public understanding: Limited popular knowledge of science, the inability of ordinary citizens to assess technical information, and the resulting widespread use of unreliable cognitive heuristics to assess risk. A large survey of U.S. adults (N = 1540) found little support for this account. On the whole, the most scientifically literate and numerate subjects were slightly less likely, not more, to see climate change as a serious threat than the least scientifically literate and numerate ones. More importantly, greater scientific literacy and numeracy were associated with greater cultural polarization: Respondents predisposed by their values to dismiss climate change evidence became more dismissive, and those predisposed by their values to credit such evidence more concerned, as science literacy and numeracy increased. We suggest that this evidence reflects a conflict between two levels of rationality: The individual level, which is characterized by citizens’ effective use of their knowledge and reasoning capacities to form risk perceptions that express their cultural commitments; and the collective level, which is characterized by citizens’ failure to converge on the best available scientific evidence on how to promote their common welfare. Dispelling this “tragedy of the risk-perception commons,” we argue, should be understood as the central aim of the science of science communication.


The Journal of Legal Studies | 1998

Social Meaning and the Economic Analysis of Crime

Dan M. Kahan

This essay examines the importance of social meaning for the economic analysis of crime. Against the background of social norms, the actions of individuals and communities convey information about what they value. Individuals take these meanings into account when they are responding to the incentives created by criminal law; communities take them into account when they decide what to punish, how to punish it, and how severely. Because meaning matters in these ways, economic analyses of criminal law that abstract from meaning—by, say, considering only how various policies affect the expected penalty for wrongdoing—produce unreliable predictions and prescriptions. The essay makes out this claim by considering a number of concrete examples, including tax evasion, juvenile gun possession, gang criminality, alternative sanctions (such as shaming penalties), and corporate criminal liability.


Science | 2013

A Risky Science Communication Environment for Vaccines

Dan M. Kahan

Neglecting the science of science communication puts the value of decision-relevant science at risk. Controversy over childhood vaccinations is an instance of what might be styled the “science communication problem”—the failure of compelling scientific evidence to resolve public dispute over risks and similar facts (1). This problem itself has been the focus of scientific study since the 1970s, when psychologists began to investigate the divergence between expert and public opinion on nuclear power. Indeed, the science of science communication that this body of work comprises can now be used not just to explain controversy over risk but also to predict, manage, and in theory avoid conditions likely to trigger it. The example of childhood vaccinations illustrates these points—and teaches an important practical lesson.


Archive | 2007

The Second National Risk and Culture Study: Making Sense of - and Making Progress In - The American Culture War of Fact

Dan M. Kahan; Donald Braman; Paul Slovic; John Gastil; Geoffrey L. Cohen

Cultural Cognition refers to the disposition to conform one's beliefs about societal risks to one's preferences for how society should be organized. Based on surveys and experiments involving some 5,000 Americans, the Second National Risk and Culture Study presents empirical evidence of the effect of this dynamic in generating conflict about global warming, school shootings, domestic terrorism, nanotechnology, and the mandatory vaccination of school-age girls against HPV, among other issues. The Study also presents evidence of risk-communication strategies that counteract cultural cognition. Because nuclear power affirms rather than threatens the identity of persons who hold individualist values, for example, proposing it as a solution to global warming makes persons who hold such values more willing to consider evidence that climate change is a serious risk. Because people tend to impute credibility to people who share their values, persons who hold hierarchical and egalitarian values are less likely to polarize when they observe people who hold their values advocating unexpected positions on the vaccination of young girls against HPV. Such techniques can help society to create a deliberative climate in which citizens converge on policies that are both instrumentally sound and expressively congenial to persons of diverse values.


Annals of The American Academy of Political and Social Science | 2015

Geoengineering and Climate Change Polarization

Dan M. Kahan; Hank C. Jenkins-Smith; Tor Tarantola; Carol L. Silva; Donald Braman

The cultural cognition thesis posits that individuals rely extensively on cultural meanings in forming perceptions of risk. The logic of the cultural cognition thesis suggests that a two-channel science communication strategy, combining information content (“Channel 1”) with cultural meanings (“Channel 2”), could promote open-minded assessment of information across diverse communities. We test this kind of communication strategy in a two-nation (United States, n = 1,500; England, n = 1,500) study, in which scientific information content on climate change was held constant while the cultural meaning of that information was experimentally manipulated. We found that cultural polarization over the validity of climate change science is offset by making citizens aware of the potential contribution of geoengineering as a supplement to restriction of CO2 emissions. We also tested the hypothesis, derived from a competing model of science communication, that exposure to information on geoengineering would lead citizens to discount climate change risks generally. Contrary to this hypothesis, we found that subjects exposed to information about geoengineering were slightly more concerned about climate change risks than those assigned to a control condition.


The Journal of Law and Economics | 1999

Shaming White-Collar Criminals: A Proposal for Reform of the Federal Sentencing Guidelines

Dan M. Kahan; Eric A. Posner

From stigmatizing publicity to coerced gestures of public contrition to ritualized debasement ceremonies, shaming penalties are on the rise in American law. This paper considers the feasibility and value of such penalties for federal white‐collar offenders. It develops a theoretical model that connects the deterrent efficacy of such penalties to their power to signal the undesirable propensities of wrongdoers and the desirable propensities of citizens who shun wrongdoers. It also considers how the efficiency of such penalties is affected by their power to express publicly valued social meanings. Finally, it examines practical issues relating to the incorporation of shaming penalties into the Federal Sentencing Guidelines.


PS Political Science & Politics | 2011

The Cultural Orientation of Mass Political Opinion

John Gastil; Donald Braman; Dan M. Kahan; Paul Slovic

Most Americans lack any substantial degree of ideological sophistication (Kinder 1998), yet they often manage to express coherent views across a range of issues. The conventional explanation for this is that people rely on judgmental shortcuts (e.g., Sniderman, Brody, and Tetlock 1991). These “heuristics” permit individuals with sufficient political sophistication to sort and filter incoming messages to form relatively consistent views that align with preexisting values (Zaller 1992). If the key cueing device in such models is the source credibility heuristic (Mondak 1993), how do people who lack the time and ability to become actual policy experts have the time and capacity to figure out which policy experts are credible? How does this theory explain the coherence some have found in the views of those with limited political knowledge (Goren 2004)? We approach these two questions with the perspective offered by Mary Douglas (1982) and Aaron Wildavsky’s (1987) cultural theory. In brief, we argue that most people neither have the time, inclination, or ability to derive policy positions from abstract ideological principles, nor do they have the inclination or resources at hand to sort through the empirical claims advanced in technical policy debates. Instead, as Wildavsky (1987, 8) said, “ordinary folk” use the orienting force of culture “to generate miles of preferences” from only “inches of fact.” To make the case for this conception of public opinion, we begin with a theoretical overview of how this process, which we call the Wildavsky Heuristic Model, relates to existing accounts of mass political opinion, particularly those featuring ideology. Then, we test some of this model’s core propositions using original national survey data, and finally, we draw out the theoretical and practical implications of those results.

Collaboration


Dive into Dan M. Kahan's collaboration. Top co-authors:

Donald Braman (George Washington University)
John Gastil (Pennsylvania State University)
David A. Hoffman (University of Pennsylvania)
Asheley R. Landrum (University of Texas at Dallas)