
Publication


Featured research published by Liane Young.


Nature | 2007

Damage to the prefrontal cortex increases utilitarian moral judgements

Michael Koenigs; Liane Young; Ralph Adolphs; Daniel Tranel; Fiery Cushman; Marc D. Hauser; Antonio R. Damasio

The psychological and neurobiological processes underlying moral judgement have been the focus of many recent empirical studies. Of central interest is whether emotions play a causal role in moral judgement, and, in parallel, how emotion-related areas of the brain contribute to moral judgement. Here we show that six patients with focal bilateral damage to the ventromedial prefrontal cortex (VMPC), a brain region necessary for the normal generation of emotions and, in particular, social emotions, produce an abnormally ‘utilitarian’ pattern of judgements on moral dilemmas that pit compelling considerations of aggregate welfare against highly emotionally aversive behaviours (for example, having to sacrifice one person’s life to save a number of other lives). In contrast, the VMPC patients’ judgements were normal in other classes of moral dilemmas. These findings indicate that, for a selective set of moral dilemmas, the VMPC is critical for normal judgements of right and wrong. The findings support a necessary role for emotion in the generation of those judgements.


Psychological Science | 2006

The Role of Conscious Reasoning and Intuition in Moral Judgment Testing Three Principles of Harm

Fiery Cushman; Liane Young; Marc D. Hauser

Is moral judgment accomplished by intuition or conscious reasoning? An answer demands a detailed account of the moral principles in question. We investigated three principles that guide moral judgments: (a) Harm caused by action is worse than harm caused by omission, (b) harm intended as the means to a goal is worse than harm foreseen as the side effect of a goal, and (c) harm involving physical contact with the victim is worse than harm involving no physical contact. Asking whether these principles are invoked to explain moral judgments, we found that subjects generally appealed to the first and third principles in their justifications, but not to the second. This finding has significance for methods and theories of moral psychology: The moral principles used in judgment must be directly compared with those articulated in justification, and doing so shows that some moral principles are available to conscious reasoning whereas others are not.


Proceedings of the National Academy of Sciences of the United States of America | 2007

The neural basis of the interaction between theory of mind and moral judgment

Liane Young; Fiery Cushman; Marc D. Hauser; Rebecca Saxe

Is the basis of criminality an act that causes harm, or an act undertaken with the belief that one will cause harm? The present study takes a cognitive neuroscience approach to investigating how information about an agent's beliefs and an action's consequences contribute to moral judgment. We build on prior developmental evidence showing that these factors contribute differentially to the young child's moral judgments, coupled with neurobiological evidence suggesting a role for the right temporoparietal junction (RTPJ) in belief attribution. Participants read vignettes in a 2 × 2 design: protagonists produced either a negative or neutral outcome based on the belief that they were causing the negative outcome (“negative” belief) or the neutral outcome (“neutral” belief). The RTPJ showed significant activation above baseline for all four conditions but was modulated by an interaction between belief and outcome. Specifically, the RTPJ response was highest for cases of attempted harm, where protagonists were condemned for actions that they believed would cause harm to others, even though the harm did not occur. The results not only suggest a general role for belief attribution during moral judgment, but also add detail to our understanding of the interaction between these processes at both the neural and behavioral levels.


Proceedings of the National Academy of Sciences of the United States of America | 2010

Disruption of the right temporoparietal junction with transcranial magnetic stimulation reduces the role of beliefs in moral judgments

Liane Young; Joan A. Camprodon; Marc D. Hauser; Alvaro Pascual-Leone; Rebecca Saxe

When we judge an action as morally right or wrong, we rely on our capacity to infer the actor's mental states (e.g., beliefs, intentions). Here, we test the hypothesis that the right temporoparietal junction (RTPJ), an area involved in mental state reasoning, is necessary for making moral judgments. In two experiments, we used transcranial magnetic stimulation (TMS) to disrupt neural activity in the RTPJ transiently before moral judgment (experiment 1, offline stimulation) and during moral judgment (experiment 2, online stimulation). In both experiments, TMS to the RTPJ led participants to rely less on the actor's mental states. A particularly striking effect occurred for attempted harms (e.g., actors who intended but failed to do harm): Relative to TMS to a control site, TMS to the RTPJ caused participants to judge attempted harms as less morally forbidden and more morally permissible. Thus, interfering with activity in the RTPJ disrupts the capacity to use mental states in moral judgment, especially in the case of attempted harms.


Psychological Inquiry | 2012

Mind Perception Is the Essence of Morality.

Kurt Gray; Liane Young; Adam Waytz

Mind perception entails ascribing mental capacities to other entities, whereas moral judgment entails labeling entities as good or bad or actions as right or wrong. We suggest that mind perception is the essence of moral judgment. In particular, we suggest that moral judgment is rooted in a cognitive template of two perceived minds—a moral dyad of an intentional agent and a suffering moral patient. Diverse lines of research support dyadic morality. First, perceptions of mind are linked to moral judgments: dimensions of mind perception (agency and experience) map onto moral types (agents and patients), and deficits of mind perception correspond to difficulties with moral judgment. Second, not only are moral judgments sensitive to perceived agency and experience, but all moral transgressions are fundamentally understood as agency plus experienced suffering—that is, interpersonal harm—even ostensibly harmless acts such as purity violations. Third, dyadic morality uniquely accounts for the phenomena of dyadic completion (seeing agents in response to patients, and vice versa), and moral typecasting (characterizing others as either moral agents or moral patients). Discussion also explores how mind perception can unify morality across explanatory levels, how a dyadic template of morality may be developmentally acquired, and future directions.


Proceedings of the National Academy of Sciences of the United States of America | 2011

Impaired theory of mind for moral judgment in high-functioning autism

Joseph M. Moran; Liane Young; Rebecca Saxe; Su Mei Lee; Daniel R. O'Young; Penelope L. Mavros; John D. E. Gabrieli

High-functioning autism (ASD) is characterized by real-life difficulties in social interaction; however, these individuals often succeed on laboratory tests that require an understanding of another person's beliefs and intentions. This paradox suggests a theory of mind (ToM) deficit in adults with ASD that has yet to be demonstrated in an experimental task eliciting ToM judgments. We tested whether ASD adults would show atypical moral judgments when they need to consider both the intentions (based on ToM) and outcomes of a person's actions. In experiment 1, ASD and neurotypical (NT) participants performed a ToM task designed to test false belief understanding. In experiment 2, the same ASD participants and a new group of NT participants judged the moral permissibility of actions, in a 2 (intention: neutral/negative) × 2 (outcome: neutral/negative) design. Though there was no difference between groups on the false belief task, there was a selective difference in the moral judgment task for judgments of accidental harms, but not neutral acts, attempted harms, or intentional harms. Unlike the NT group, which judged accidental harms less morally wrong than attempted harms, the ASD group did not reliably judge accidental and attempted harms as morally different. In judging accidental harms, ASD participants appeared to show an underreliance on information about a person's innocent intention and, as a direct result, an overreliance on the action's negative outcome. These findings reveal impairments in integrating mental state information (e.g., beliefs, intentions) for moral judgment.


Neuron | 2010

Damage to ventromedial prefrontal cortex impairs judgment of harmful intent.

Liane Young; Antoine Bechara; Daniel Tranel; Hanna Damasio; Marc D. Hauser; Antonio R. Damasio

Moral judgments, whether delivered in ordinary experience or in the courtroom, depend on our ability to infer intentions. We forgive unintentional or accidental harms and condemn failed attempts to harm. Prior work demonstrates that patients with damage to the ventromedial prefrontal cortex (VMPC) deliver abnormal judgments in response to moral dilemmas and that these patients are especially impaired in triggering emotional responses to inferred or abstract events (e.g., intentions), as opposed to real or actual outcomes. We therefore predicted that VMPC patients would deliver abnormal moral judgments of harmful intentions in the absence of harmful outcomes, as in failed attempts to harm. This prediction was confirmed in the current study: VMPC patients judged attempted harms, including attempted murder, as more morally permissible relative to controls. These results highlight the critical role of the VMPC in processing harmful intent for moral judgment.


Journal of Cognitive Neuroscience | 2009

An fMRI investigation of spontaneous mental state inference for moral judgment

Liane Young; Rebecca Saxe

Human moral judgment depends critically on “theory of mind,” the capacity to represent the mental states of agents. Recent studies suggest that the right TPJ (RTPJ) and, to a lesser extent, the left TPJ (LTPJ), the precuneus (PC), and the medial pFC (MPFC) are robustly recruited when participants read explicit statements of an agent's beliefs and then judge the moral status of the agent's action. Real-world interactions, by contrast, often require social partners to infer each other's mental states. The current study uses fMRI to probe the role of these brain regions in supporting spontaneous mental state inference in the service of moral judgment. Participants read descriptions of a protagonist's action and then either (i) “moral” facts about the action's effect on another person or (ii) “nonmoral” facts about the situation. The RTPJ, PC, and MPFC were recruited selectively for moral over nonmoral facts, suggesting that processing moral stimuli elicits spontaneous mental state inference. In a second experiment, participants read the same scenarios, but explicit statements of belief preceded the facts: Protagonists believed their actions would cause harm or not. The response in the RTPJ, PC, and LTPJ was again higher for moral facts but also distinguished between neutral and negative outcomes. Together, the results illuminate two aspects of theory of mind in moral judgment: (1) spontaneous belief inference and (2) stimulus-driven belief integration.


Neuropsychologia | 2009

Innocent intentions: A correlation between forgiveness for accidental harm and neural activity

Liane Young; Rebecca Saxe

Contemporary moral psychology often emphasizes the universality of moral judgments. Across age, gender, religion and ethnicity, people's judgments on classic dilemmas are sensitive to the same moral principles. In many cases, moral judgments depend not only on the outcome of the action, but on the agent's beliefs and intentions at the time of action. For example, we blame agents who attempt but fail to harm others, while generally forgiving agents who harm others accidentally and unknowingly. Nevertheless, as we report here, there are individual differences in the extent to which observers exculpate agents for accidental harms. Furthermore, we find that the extent to which innocent intentions are taken to mitigate blame for accidental harms is correlated with activation in a specific brain region during moral judgment. This brain region, the right temporo-parietal junction, has been previously implicated in reasoning about other people's thoughts, beliefs, and intentions in moral and non-moral contexts.


Social Neuroscience | 2012

Where in the brain is morality? Everywhere and maybe nowhere

Liane Young; James Dungan

The neuroscience of morality has focused on how morality works and where it is in the brain. In tackling these questions, researchers have taken both domain-specific and domain-general approaches—searching for neural substrates and systems dedicated to moral cognition versus characterizing the contributions of domain-general processes. Where in the brain is morality? On one hand, morality is made up of complex cognitive processes, deployed across many domains and housed all over the brain. On the other hand, no neural substrate or system that uniquely supports moral cognition has been found. In this review, we will discuss early assumptions of domain-specificity in moral neuroscience as well as subsequent investigations of domain-general contributions, taking emotion and social cognition (i.e., theory of mind) as case studies. Finally, we will consider possible cognitive accounts of a domain-specific morality: Does uniquely moral cognition exist?

Collaboration


Dive into Liane Young's collaborations.

Top Co-Authors


Rebecca Saxe

Massachusetts Institute of Technology
