Network


Latest external collaborations at the country level.

Hotspot


Research topics in which Joshua Klayman is active.

Publication


Featured research published by Joshua Klayman.


Psychology of Learning and Motivation | 1995

Varieties of Confirmation Bias

Joshua Klayman

This chapter reviews research concerning a variety of confirmation biases and discusses what they have in common and where they differ. The overall picture is one of heterogeneous, complex, and inconsistent phenomena, from which it is nevertheless possible to discern a general direction: a tendency for people to believe too much in their favored hypotheses. The chapter discusses ideas about how to reconcile the apparent heterogeneity and the apparent generality of confirmation biases. There has been considerable interest among cognitive and social psychologists in the idea that people tend to hang on to their favored hypotheses with unwarranted tenacity and confidence. This tendency has been referred to as perseverance of beliefs, hypothesis preservation, and confirmation bias. Research in this area presents a rather heterogeneous collection of findings: a set of confirmation biases rather than one unified confirmation bias. There are often substantial task-to-task differences in the observed phenomena, their consequences, and the underlying cognitive processes. There is no consensus on such basic questions as what counts as a favored hypothesis, against what norm a belief is unwarranted, and under what circumstances people are or are not susceptible to a bias.


Memory & Cognition | 1992

Information selection and use in hypothesis testing: what is a good question, and what is a good answer?

Louisa M. Slowiaczek; Joshua Klayman; Steven J. Sherman; Richard B. Skov

The process of hypothesis testing entails both information selection (asking questions) and information use (drawing inferences from the answers to those questions). We demonstrate that although subjects may be sensitive to diagnosticity in choosing which questions to ask, they are insufficiently sensitive to the fact that different answers to the same question can have very different diagnosticities. This can lead subjects to overestimate or underestimate the information in the answers they receive. This phenomenon is demonstrated in two experiments using different kinds of inferences (category membership of individuals and composition of sampled populations). In combination with certain information-gathering tendencies, demonstrated in a third experiment, insensitivity to answer diagnosticity can contribute to a tendency toward preservation of the initial hypothesis. Results such as these illustrate the importance of viewing hypothesis-testing behavior as an interactive, multistage process that includes selecting questions, interpreting data, and drawing inferences.
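
The central point, that the two possible answers to a single question can carry very different amounts of evidence, can be illustrated with a small Bayesian calculation. The Python sketch below is illustrative only; the extravert/introvert question and the 0.90/0.30 answer probabilities are hypothetical assumptions, not figures from the paper.

# Hypothetical illustration of answer diagnosticity: the same yes/no question
# can yield answers of very different evidential strength.
import math

def posterior(prior_h, p_ans_given_h, p_ans_given_not_h):
    # Bayesian update of P(H) after observing one answer.
    joint_h = prior_h * p_ans_given_h
    joint_not_h = (1 - prior_h) * p_ans_given_not_h
    return joint_h / (joint_h + joint_not_h)

# Hypothesis H: "this person is an extravert"; question: "Do you like parties?"
# Made-up answer likelihoods:
p_yes_h, p_yes_not_h = 0.90, 0.30
p_no_h, p_no_not_h = 0.10, 0.70

for answer, p_h, p_not_h in [("yes", p_yes_h, p_yes_not_h),
                             ("no", p_no_h, p_no_not_h)]:
    llr = abs(math.log(p_h / p_not_h))   # evidential strength of this answer
    post = posterior(0.5, p_h, p_not_h)  # starting from a 50/50 prior
    print(f"answer={answer!r}: |log LR| = {llr:.2f}, P(H | answer) = {post:.2f}")

Under these assumed probabilities, a "yes" moves P(H) from .50 to .75 while a "no" moves it to .125, so the two answers to the same question are far from equally diagnostic; treating them as equivalent is exactly the insensitivity the experiments document.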


Journal of Experimental Child Psychology | 1982

A Hierarchy of Information Sources for Inferring Emotional Reactions.

Jackie Gnepp; Joshua Klayman; Tom Trabasso

The ability of children and adults to infer emotional reactions from different sources of information was investigated. Preschoolers (mean age, 4–9), second graders (mean age, 4–9), and college students were asked to infer emotional reactions from verbally presented stories in which sources of information occurred either singly, in conflict, or in congruence. The sources investigated were situational (the protagonist's social or physical situation), normative (the dispositions of a group to which the protagonist belongs), and personal (the dispositions of the protagonist in particular). Subjects indicated the protagonist's emotional reaction to an event in the story by choosing one of three facial expressions, representing happy, upset, and afraid. A hierarchy of sources for making emotional inferences was found at all three grade levels: personal information was preferred over normative information, and both were dominant over situational information. The relevance of the present methods and findings to the study of the development of “empathy” is discussed.


Acta Psychologica | 1984

Learning from feedback in probabilistic environments

Joshua Klayman

Probability-learning studies have documented people's inability to learn probabilistic relationships from outcome feedback. However, the real world is inherently probabilistic, and people do seem to be able to develop some understanding of their environment. It is proposed that typical probability-learning tasks fail to capture some important learning processes because they focus only on the perception of the shapes and magnitudes of cue-criterion functions, and because they have characteristics that encourage an inappropriate deterministic mental set. It is hypothesized that learning in natural environments takes place primarily through the discovery of new valid predictive cues and the incorporation of these new cues into the learner's predictive model. Results of a recent study provide evidence of this model-building process in a suitable experimental setting. Further research into model-building processes can elucidate the role of learning from feedback in the development of real-world expertise.


Advances in psychology | 1983

Analysis of Predecisional Information Search Patterns

Joshua Klayman

Recent interest in complex and varied decision strategies has highlighted the need for more sophisticated process-tracing analyses, e.g., in analyzing information-gathering patterns. Earlier studies have classified strategies as high/low proportion of available information used, constant/variable amount of search across alternatives, and intra-/interdimensional direction of search. However, more powerful analyses are needed, since the search characteristics of a given strategy may be variable and highly task-dependent. Two major means of improving search analysis are discussed: (a) the use of task-specific simulations to establish the search characteristics expected from different strategies; and (b) the analysis of additional search characteristics, such as the extent to which future information search is controlled by prior information (contingency), and different types of search variability. An experimental example of the use of these techniques is presented. Applications are proposed in three areas: (a) the study of sequential combinations of decision rules and multi-phase decision making; (b) exploration of the possibility that there is continuous variation among strategies along various parameters, rather than a set of discrete rules; and (c) the investigation of how decision makers adapt strategy to task.


Advances in psychology | 1988

On the How and Why (Not) of Learning from Outcomes

Joshua Klayman

Human judgment in a wide array of areas can be captured, measured, analyzed, and facilitated. Human judgment in a given domain is shaped by the feedback received over time concerning how judgments compare to the standards of accuracy in that domain. The information received may be in the form of cognitive feedback, in which case the judgmental policy can be analyzed and compared to a known optimal judgmental model. More often, though, outcome feedback is obtained, that is, information only about how accurate each of one's judgments turned out to be. Learning from outcome feedback is described as “the hard way”: it is difficult even in a deterministic environment. The outcomes observed are prone to inscrutable variation due to unknown controlling variables, inaccurate feedback, and perhaps even some truly random error. The basic Brunswikian view of learning gives rise to a paradigm for laboratory studies of learning in probabilistic environments, usually referred to as multiple cue probability learning (MCPL). The MCPL paradigm captures several important features of learning from feedback in natural situations.
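
A rough sense of the MCPL paradigm can be conveyed with a small simulation. The Python sketch below is an illustration under stated assumptions, not the chapter's materials: the two cues, their weights, the noise level, and the delta-rule learner are all made-up choices.

# A minimal sketch of a multiple cue probability learning (MCPL) style task.
# On each trial the learner sees cue values, makes a judgment, and receives
# only the (noisy) outcome as feedback.
import random

random.seed(0)
TRUE_WEIGHTS = [0.7, 0.3]   # hypothetical validities of two cues
NOISE_SD = 1.0              # outcome variability no learner can predict

def trial():
    cues = [random.gauss(0, 1) for _ in TRUE_WEIGHTS]
    outcome = sum(w * c for w, c in zip(TRUE_WEIGHTS, cues)) + random.gauss(0, NOISE_SD)
    return cues, outcome

# A simple learner that adjusts its cue weights from outcome feedback (delta rule).
weights = [0.0, 0.0]
LEARNING_RATE = 0.02
for _ in range(5000):
    cues, outcome = trial()
    prediction = sum(w * c for w, c in zip(weights, cues))
    error = outcome - prediction
    weights = [w + LEARNING_RATE * error * c for w, c in zip(weights, cues)]

print("estimated cue weights:", [round(w, 2) for w in weights])

In this toy setting, outcome feedback alone lets the learner approximate the cue weights, but the irreducible noise both slows learning and keeps trial-by-trial predictions imperfect, which is the sense in which outcome feedback is “the hard way” to learn.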


Psychological Review | 1987

Confirmation, Disconfirmation, and Information in Hypothesis Testing

Joshua Klayman; Young-won Ha


Organizational Behavior and Human Decision Processes | 1999

Overconfidence: It Depends on How, What, and Whom You Ask

Joshua Klayman; Jack B. Soll; Claudia González-Vallejo; Sema Barlas


Journal of Experimental Psychology: Learning, Memory and Cognition | 2004

Overconfidence in interval estimates.

Jack B. Soll; Joshua Klayman


Journal of Experimental Psychology: Learning, Memory and Cognition | 1989

Hypothesis testing in rule discovery: Strategy, structure, and content

Joshua Klayman; Young-won Ha

Collaboration


An overview of Joshua Klayman's collaborations.

Top Co-Authors

Jackie Gnepp

Northern Illinois University

Kate O'Connor

National Kidney Foundation

Willa Lang

National Kidney Foundation
