Walter Sinnott-Armstrong
Duke University
Publications
Featured research published by Walter Sinnott-Armstrong.
Journal of Cognitive Neuroscience | 2006
Jana Schaich Borg; Catherine Hynes; John D. Van Horn; Scott T. Grafton; Walter Sinnott-Armstrong
The traditional philosophical doctrines of Consequentialism, Doing and Allowing, and Double Effect prescribe that moral judgments and decisions should be based on consequences, action (as opposed to inaction), and intention, respectively. This study uses functional magnetic resonance imaging to investigate how these three factors affect brain processes associated with moral judgments. We find the following: (1) Moral scenarios involving only a choice between consequences with different amounts of harm elicit activity in areas of the brain similar to those engaged by analogous nonmoral scenarios; (2) Compared to analogous nonmoral scenarios, moral scenarios in which action and inaction result in the same amount of harm elicit more activity in areas associated with cognition (such as the dorsolateral prefrontal cortex) and less activity in areas associated with emotion (such as the orbitofrontal cortex and temporal pole); (3) Compared to analogous nonmoral scenarios, conflicts between goals of minimizing harm and of refraining from harmful action elicit more activity in areas associated with emotion (orbitofrontal cortex and temporal pole) and less activity in areas associated with cognition (including the angular gyrus and superior frontal gyrus); (4) Compared to moral scenarios involving only unintentional harm, moral scenarios involving intentional harm elicit more activity in areas associated with emotion (orbitofrontal cortex and temporal pole) and less activity in areas associated with cognition (including the angular gyrus and superior frontal gyrus). These findings suggest that different kinds of moral judgment are preferentially supported by distinguishable brain systems.
Philosophy and Phenomenological Research | 1987
Walter Sinnott-Armstrong; Simon Blackburn
This work provides a comprehensive introduction to the major philosophical theories that attempt to explain the workings of language.
Proceedings of the National Academy of Sciences of the United States of America | 2013
Eyal Aharoni; Gina M. Vincent; Carla L. Harenski; Vince D. Calhoun; Walter Sinnott-Armstrong; Michael S. Gazzaniga; Kent A. Kiehl
Identification of factors that predict recurrent antisocial behavior is integral to the social sciences, criminal justice procedures, and the effective treatment of high-risk individuals. Here we show that error-related brain activity elicited during performance of an inhibitory task prospectively predicted subsequent rearrest among adult offenders within 4 years of release (N = 96). The odds that an offender with relatively low anterior cingulate activity would be rearrested were approximately double those of an offender with high activity in this region, holding constant other observed risk factors. These results suggest a potential neurocognitive biomarker for persistent antisocial behavior.
Journal of Cognitive Neuroscience | 2011
Carolyn Parkinson; Walter Sinnott-Armstrong; Philipp E. Koralus; Angela Mendelovici; Victoria McGeer; Thalia Wheatley
Much recent research has sought to uncover the neural basis of moral judgment. However, it has remained unclear whether “moral judgments” are sufficiently homogeneous to be studied scientifically as a unified category. We tested this assumption by using fMRI to examine the neural correlates of moral judgments within three moral areas: (physical) harm, dishonesty, and (sexual) disgust. We found that the judgment of moral wrongness was subserved by distinct neural systems for each of the different moral areas and that these differences were much more robust than differences in wrongness judgments within a moral area. Dishonest, disgusting, and harmful moral transgressions recruited networks of brain regions associated with mentalizing, affective processing, and action understanding, respectively. Dorsal medial pFC was the only region activated by all scenarios judged to be morally wrong in comparison with neutral scenarios. However, this region was also activated by dishonest and harmful scenarios judged not to be morally wrong, suggestive of a domain-general role that is neither peculiar to nor predictive of moral decisions. These results suggest that moral judgment is not a wholly unified faculty in the human brain but rather is instantiated in dissociable neural systems that are engaged differentially depending on the type of transgression being judged.
Journal of Abnormal Psychology | 2012
Eyal Aharoni; Walter Sinnott-Armstrong; Kent A. Kiehl
A prominent view of psychopathic moral reasoning suggests that psychopathic individuals cannot properly distinguish between moral wrongs and other types of wrongs. The present study evaluated this view by examining the extent to which 109 incarcerated offenders with varying degrees of psychopathy could distinguish between moral and conventional transgressions, both relative to one another and relative to nonincarcerated healthy controls. Using a modified version of the classic Moral/Conventional Transgressions task that uses a forced-choice format to minimize strategic responding, the present study found that total psychopathy score did not predict performance on the task. Task performance was explained by some individual subfacets of psychopathy and by other variables unrelated to psychopathy, such as IQ. The authors conclude that, contrary to earlier claims, insufficient data exist to infer that psychopathic individuals cannot know what is morally wrong.
Review of Philosophy and Psychology | 2010
Joshua May; Walter Sinnott-Armstrong; Jay G. Hull; Aaron Zimmerman
In defending his interest-relative account of knowledge, Jason Stanley relies heavily on intuitions about several bank cases. We experimentally test the empirical claims that Stanley seems to make concerning our common-sense intuitions about these cases. Additionally, we test the empirical claims that Jonathan Schaffer seems to make, regarding the salience of an alternative, in his critique of Stanley. Our data indicate that neither raising the possibility of error nor raising stakes moves most people from attributing knowledge to denying it. However, the raising of stakes (but not alternatives) does affect the level of confidence people have in their attributions of knowledge. We argue that our data impugn what both Stanley and Schaffer claim about our common-sense judgments in such cases.
Annals of the New York Academy of Sciences | 2008
Eyal Aharoni; Chadd M. Funk; Walter Sinnott-Armstrong; Michael S. Gazzaniga
Can neurological evidence help courts assess criminal responsibility? To answer this question, we must first specify legal criteria for criminal responsibility and then ask how neurological findings can be used to determine whether particular defendants meet those criteria. Cognitive neuroscience may speak to at least two familiar conditions of criminal responsibility: intention and sanity. Functional neuroimaging studies in motor planning, awareness of actions, agency, social contract reasoning, and theory of mind, among others, have recently targeted a small assortment of brain networks thought to be instrumental in such determinations. Advances in each of these areas bring specificity to the problems underlying the application of neuroscience to criminal law.
Behavior Research Methods | 2015
Scott Clifford; Vijeth Iyengar; Roberto Cabeza; Walter Sinnott-Armstrong
Research on the emotional, cognitive, and social determinants of moral judgment has surged in recent years. The development of moral foundations theory (MFT) has played an important role, demonstrating the breadth of morality. Moral psychology has responded by investigating how different domains of moral judgment are shaped by a variety of psychological factors. Yet, the discipline lacks a validated set of moral violations that span the moral domain, creating a barrier to investigating influences on moral judgment and how the neural bases of such judgments might vary across the moral domain. In this paper, we aim to fill this gap by developing and validating a large set of moral foundations vignettes (MFVs). Each vignette depicts a behavior violating a particular moral foundation and not others. The vignettes are controlled on many dimensions, including syntactic structure and complexity, making them suitable for neuroimaging research. We demonstrate the validity of our vignettes by examining respondents’ classifications of moral violations, conducting exploratory and confirmatory factor analyses, and demonstrating the correspondence between the extracted factors and existing measures of the moral foundations. We expect that the MFVs will be beneficial for a wide variety of behavioral and neuroimaging investigations of moral cognition.
Neuropsychologia | 2010
Michael B. Miller; Walter Sinnott-Armstrong; Liane Young; Danielle King; Aldo Paggi; Mara Fabri; Gabriele Polonara; Michael S. Gazzaniga
Recent neuroimaging studies suggest lateralized cerebral mechanisms in the right temporal parietal junction are involved in complex social and moral reasoning, such as ascribing beliefs to others. Based on this evidence, we tested 3 anterior-resected and 3 complete callosotomy patients along with 22 normal subjects on a reasoning task that required verbal moral judgments. All 6 patients based their judgments primarily on the outcome of the actions, disregarding the beliefs of the agents. The similarity in performance between complete and partial callosotomy patients suggests that normal judgments of morality require full interhemispheric integration of information critically supported by the right temporal parietal junction and right frontal processes.
Episteme | 2008
Walter Sinnott-Armstrong; Adina L. Roskies; Teneille R. Brown; Emily R. Murphy
This paper explores whether brain images may be admitted as evidence in criminal trials under Federal Rule of Evidence 403, which weighs probative value against the danger of being prejudicial, confusing, or misleading to fact finders. The paper summarizes and evaluates recent empirical research relevant to these issues. We argue that currently the probative value of neuroimages for criminal responsibility is minimal, and there is some evidence of their potential to be prejudicial or misleading. We also propose experiments that will directly assess how jurors are influenced by brain images.