Publication


Featured research published by Joshua D. Greene.


Trends in Cognitive Sciences | 2002

How (and where) does moral judgment work?

Joshua D. Greene; Jonathan Haidt

Moral psychology has long focused on reasoning, but recent evidence suggests that moral judgment is more a matter of emotion and affective intuition than deliberate reasoning. Here we discuss recent findings in psychology and cognitive neuroscience, including several studies that specifically investigate moral judgment. These findings indicate the importance of affect, although they allow that reasoning can play a restricted but significant role in moral judgment. They also point towards a preliminary account of the functional neuroanatomy of moral judgment, according to which many brain areas make important contributions to moral judgment although none is devoted specifically to it.


Nature | 2012

Spontaneous giving and calculated greed

David G. Rand; Joshua D. Greene; Martin A. Nowak

Cooperation is central to human social behaviour. However, choosing to cooperate requires individuals to incur a personal cost to benefit others. Here we explore the cognitive basis of cooperative decision-making in humans using a dual-process framework. We ask whether people are predisposed towards selfishness, behaving cooperatively only through active self-control; or whether they are intuitively cooperative, with reflection and prospective reasoning favouring ‘rational’ self-interest. To investigate this issue, we perform ten studies using economic games. We find that across a range of experimental designs, subjects who reach their decisions more quickly are more cooperative. Furthermore, forcing subjects to decide quickly increases contributions, whereas instructing them to reflect and forcing them to decide slowly decreases contributions. Finally, an induction that primes subjects to trust their intuitions increases contributions compared with an induction that promotes greater reflection. To explain these results, we propose that cooperation is intuitive because cooperative heuristics are developed in daily life where cooperation is typically advantageous. We then validate predictions generated by this proposed mechanism. Our results provide convergent evidence that intuition supports cooperation in social dilemmas, and that reflection can undermine these cooperative impulses.


Proceedings of the National Academy of Sciences of the United States of America | 2009

Patterns of neural activity associated with honest and dishonest moral decisions

Joshua D. Greene; Joseph M. Paxton

What makes people behave honestly when confronted with opportunities for dishonest gain? Research on the interplay between controlled and automatic processes in decision making suggests 2 hypotheses: According to the “Will” hypothesis, honesty results from the active resistance of temptation, comparable to the controlled cognitive processes that enable the delay of reward. According to the “Grace” hypothesis, honesty results from the absence of temptation, consistent with research emphasizing the determination of behavior by the presence or absence of automatic processes. To test these hypotheses, we examined neural activity in individuals confronted with opportunities for dishonest gain. Subjects undergoing functional magnetic resonance imaging (fMRI) gained money by accurately predicting the outcomes of computerized coin-flips. In some trials, subjects recorded their predictions in advance. In other trials, subjects were rewarded based on self-reported accuracy, allowing them to gain money dishonestly by lying about the accuracy of their predictions. Many subjects behaved dishonestly, as indicated by improbable levels of “accuracy.” Our findings support the Grace hypothesis. Individuals who behaved honestly exhibited no additional control-related activity (or other kind of activity) when choosing to behave honestly, as compared with a control condition in which there was no opportunity for dishonest gain. In contrast, individuals who behaved dishonestly exhibited increased activity in control-related regions of prefrontal cortex, both when choosing to behave dishonestly and on occasions when they refrained from dishonesty. Levels of activity in these regions correlated with the frequency of dishonesty in individuals.


Trends in Cognitive Sciences | 2007

Why are VMPFC Patients More Utilitarian? A Dual-process Theory of Moral Judgment Explains

Joshua D. Greene

Koenigs, Young and colleagues [1] recently tested patients with emotion-related damage in the ventromedial prefrontal cortex (VMPFC) using moral dilemmas used in previous neuroimaging studies [2,3]. These patients made unusually utilitarian judgments (endorsing harmful actions that promote the greater good). My collaborators and I have proposed a dual-process theory of moral judgment [2,3] that we claim predicts this result. In a Research Focus article published in this issue of Trends in Cognitive Sciences, Moll and de Oliveira-Souza [4] challenge this interpretation.


Journal of Experimental Psychology: General | 2012

Divine intuition: Cognitive style influences belief in God

Amitai Shenhav; David G. Rand; Joshua D. Greene

Some have argued that belief in God is intuitive, a natural (by-)product of the human mind given its cognitive structure and social context. If this is true, the extent to which one believes in God may be influenced by one's more general tendency to rely on intuition versus reflection. Three studies support this hypothesis, linking intuitive cognitive style to belief in God. Study 1 showed that individual differences in cognitive style predict belief in God. Participants completed the Cognitive Reflection Test (CRT; Frederick, 2005), which employs math problems that, although easily solvable, have intuitively compelling incorrect answers. Participants who gave more intuitive answers on the CRT reported stronger belief in God. This effect was not mediated by education level, income, political orientation, or other demographic variables. Study 2 showed that the correlation between CRT scores and belief in God also holds when cognitive ability (IQ) and aspects of personality were controlled. Moreover, both studies demonstrated that intuitive CRT responses predicted the degree to which individuals reported having strengthened their belief in God since childhood, but not their familial religiosity during childhood, suggesting a causal relationship between cognitive style and change in belief over time. Study 3 revealed such a causal relationship over the short term: Experimentally inducing a mindset that favors intuition over reflection increases self-reported belief in God.


Cognitive Science | 2012

Reflection and reasoning in moral judgment.

Joseph M. Paxton; Leo Ungar; Joshua D. Greene

While there is much evidence for the influence of automatic emotional responses on moral judgment, the roles of reflection and reasoning remain uncertain. In Experiment 1, we induced subjects to be more reflective by completing the Cognitive Reflection Test (CRT) prior to responding to moral dilemmas. This manipulation increased utilitarian responding, as individuals who reflected more on the CRT made more utilitarian judgments. A follow-up study suggested that trait reflectiveness is also associated with increased utilitarian judgment. In Experiment 2, subjects considered a scenario involving incest between consenting adult siblings, a scenario known for eliciting emotionally driven condemnation that resists reasoned persuasion. Here, we manipulated two factors related to moral reasoning: argument strength and deliberation time. These factors interacted in a manner consistent with moral reasoning: A strong argument defending the incestuous behavior was more persuasive than a weak argument, but only when increased deliberation time encouraged subjects to reflect.


Topics in Cognitive Science | 2010

Moral Reasoning: Hints and Allegations

Joseph M. Paxton; Joshua D. Greene

Recent research in moral psychology highlights the role of emotion and intuition in moral judgment. In the wake of these findings, the role and significance of moral reasoning remain uncertain. In this article, we distinguish among different kinds of moral reasoning and review evidence suggesting that at least some kinds of moral reasoning play significant roles in moral judgment, including roles in abandoning moral intuitions in the absence of justifying reasons, applying both deontological and utilitarian moral principles, and counteracting automatic tendencies toward bias that would otherwise dominate behavior. We argue that little is known about the psychology of moral reasoning and that it may yet prove to be a potent social force.


Psychological Science | 2012

You see, the ends don't justify the means: visual imagery and moral judgment

Elinor Amit; Joshua D. Greene

We conducted three experiments indicating that characteristically deontological judgments—here, disapproving of sacrificing one person for the greater good of others—are preferentially supported by visual imagery. Experiment 1 used two matched working memory tasks—one visual, one verbal—to identify individuals with relatively visual cognitive styles and individuals with relatively verbal cognitive styles. Individuals with more visual cognitive styles made more deontological judgments. Experiment 2 showed that visual interference, relative to verbal interference and no interference, decreases deontological judgment. Experiment 3 indicated that these effects are due to people’s tendency to visualize the harmful means (sacrificing one person) more than the beneficial end (saving others). These results suggest a specific role for visual imagery in moral judgment: When people consider sacrificing someone as a means to an end, visual imagery preferentially supports the judgment that the ends do not justify the means. These results suggest an integration of the dual-process theory of moral judgment with construal-level theory.


Ethics | 2014

Beyond Point-and-Shoot Morality: Why Cognitive (Neuro)Science Matters for Ethics*

Joshua D. Greene

In this article I explain why cognitive science (including some neuroscience) matters for normative ethics. First, I describe the dual-process theory of moral judgment and briefly summarize the evidence supporting it. Next I describe related experimental research examining influences on intuitive moral judgment. I then describe two ways in which research along these lines can have implications for ethics. I argue that a deeper understanding of moral psychology favors certain forms of consequentialism over other classes of normative moral theory. I close with some brief remarks concerning the bright future of ethics as an interdisciplinary enterprise.


The Journal of Neuroscience | 2014

Integrative moral judgment: dissociating the roles of the amygdala and ventromedial prefrontal cortex.

Amitai Shenhav; Joshua D. Greene

A decade's research highlights a critical dissociation between automatic and controlled influences on moral judgment, which are subserved by distinct neural structures. Specifically, negative automatic emotional responses to prototypically harmful actions (e.g., pushing someone off of a footbridge) compete with controlled responses favoring the best consequences (e.g., saving five lives instead of one). It is unknown how such competitions are resolved to yield “all things considered” judgments. Here, we examine such integrative moral judgments. Drawing on insights from research on self-interested, value-based decision-making in humans and animals, we test a theory concerning the respective contributions of the amygdala and ventromedial prefrontal cortex (vmPFC) to moral judgment. Participants undergoing fMRI responded to moral dilemmas, separately evaluating options for their utility (Which does the most good?), emotional aversiveness (Which feels worse?), and overall moral acceptability. Behavioral data indicate that emotional aversiveness and utility jointly predict “all things considered” integrative judgments. Amygdala response tracks the emotional aversiveness of harmful utilitarian actions and overall disapproval of such actions. During such integrative moral judgments, the vmPFC is preferentially engaged relative to utilitarian and emotional assessments. Amygdala-vmPFC connectivity varies with the role played by emotional input in the task, being lowest for pure utilitarian assessments and highest for pure emotional assessments. These findings, which parallel those of research on self-interested economic decision-making, support the hypothesis that the amygdala provides an affective assessment of the action in question, whereas the vmPFC integrates that signal with a utilitarian assessment of expected outcomes to yield “all things considered” moral judgments.

Collaboration


Top co-authors of Joshua D. Greene include:


Jonathan Baron

University of Pennsylvania
