Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Gerd Gigerenzer is active.

Publication


Featured research published by Gerd Gigerenzer.


Psychological Review | 1991

Probabilistic mental models: A Brunswikian theory of confidence

Gerd Gigerenzer; Ulrich Hoffrage; Heinz Kleinbölting

Research on people's confidence in their general knowledge has to date produced two fairly stable effects, many inconsistent results, and no comprehensive theory. We propose such a comprehensive framework, the theory of probabilistic mental models (PMM theory). The theory (a) explains both the overconfidence effect (mean confidence is higher than percentage of answers correct) and the hard-easy effect (overconfidence increases with item difficulty) reported in the literature and (b) predicts conditions under which both effects appear, disappear, or invert. In addition, (c) it predicts a new phenomenon, the confidence-frequency effect, a systematic difference between a judgment of confidence in a single event (i.e., that any given answer is correct) and a judgment of the frequency of correct answers in the long run. Two experiments are reported that support PMM theory by confirming these predictions, and several apparent anomalies reported in the literature are explained and integrated into the present framework.
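For readers who want the abstract's key quantity pinned down, here is a minimal sketch (not from the paper; the data and function name are illustrative) of how the overconfidence measure described above is computed from per-item judgments:

```python
# Overconfidence as defined in the abstract: mean confidence minus the
# proportion of answers actually correct (positive values = overconfidence).
def overconfidence(confidences, correct):
    """confidences: per-item confidence judgments in [0, 1];
    correct: per-item True/False for whether the answer was right."""
    mean_conf = sum(confidences) / len(confidences)
    pct_correct = sum(correct) / len(correct)
    return mean_conf - pct_correct

# Illustrative data: mean confidence ~.79 on items answered 50% correctly
print(round(overconfidence([0.9, 0.7, 0.8, 0.75], [True, False, True, False]), 2))  # 0.29
```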


Psychological Review | 2002

Models of ecological rationality: The recognition heuristic

Daniel G. Goldstein; Gerd Gigerenzer

One view of heuristics is that they are imperfect versions of optimal statistical procedures considered too complicated for ordinary minds to carry out. In contrast, the authors consider heuristics to be adaptive strategies that evolved in tandem with fundamental psychological mechanisms. The recognition heuristic, arguably the most frugal of all heuristics, makes inferences from patterns of missing knowledge. This heuristic exploits a fundamental adaptation of many organisms: the vast, sensitive, and reliable capacity for recognition. The authors specify the conditions under which the recognition heuristic is successful and when it leads to the counterintuitive less-is-more effect in which less knowledge is better than more for making accurate inferences.
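The decision rule itself is simple enough to state in a few lines. A minimal sketch, assuming a recognition test that returns True or False per object (the function names and the recognition set are illustrative, not from the paper):

```python
def recognition_heuristic(a, b, recognized):
    """Infer which of two objects scores higher on a criterion
    (e.g., which city is larger) from recognition alone.
    recognized: a function mapping an object to True/False."""
    ra, rb = recognized(a), recognized(b)
    if ra and not rb:
        return a            # infer the recognized object scores higher
    if rb and not ra:
        return b
    return None             # heuristic does not apply: guess or use other cues

# Example with an illustrative recognition set
known = {"Munich", "Hamburg"}
print(recognition_heuristic("Munich", "Herne", known.__contains__))  # -> "Munich"
```

The less-is-more effect mentioned above arises because someone who recognizes every object can never apply this rule, whereas partial ignorance can be exploited whenever recognition correlates with the criterion.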


Psychological Science in the Public Interest | 2007

Helping Doctors and Patients Make Sense of Health Statistics

Gerd Gigerenzer; Wolfgang Gaissmaier; Elke Kurz-Milcke; Lisa M. Schwartz; Steven Woloshin

Many doctors, patients, journalists, and politicians alike do not understand what health statistics mean or draw wrong conclusions without noticing. Collective statistical illiteracy refers to the widespread inability to understand the meaning of numbers. For instance, many citizens are unaware that higher survival rates with cancer screening do not imply longer life, or that the statement that mammography screening reduces the risk of dying from breast cancer by 25% in fact means that 1 less woman out of 1,000 will die of the disease. We provide evidence that statistical illiteracy (a) is common to patients, journalists, and physicians; (b) is created by nontransparent framing of information that is sometimes an unintentional result of lack of understanding but can also be a result of intentional efforts to manipulate or persuade people; and (c) can have serious consequences for health. The causes of statistical illiteracy should not be attributed to cognitive biases alone, but to the emotional nature of the doctor–patient relationship and conflicts of interest in the healthcare system. The classic doctor–patient relation is based on (the physician's) paternalism and (the patient's) trust in authority, which make statistical literacy seem unnecessary; so does the traditional combination of determinism (physicians who seek causes, not chances) and the illusion of certainty (patients who seek certainty when there is none). We show that information pamphlets, Web sites, leaflets distributed to doctors by the pharmaceutical industry, and even medical journals often report evidence in nontransparent forms that suggest big benefits of featured interventions and small harms. Without understanding the numbers involved, the public is susceptible to political and commercial manipulation of their anxieties and hopes, which undermines the goals of informed consent and shared decision making.

What can be done? We discuss the importance of teaching statistical thinking and transparent representations in primary and secondary education as well as in medical school. Yet this requires familiarizing children early on with the concept of probability and teaching statistical literacy as the art of solving real-world problems rather than applying formulas to toy problems about coins and dice. A major precondition for statistical literacy is transparent risk communication. We recommend using frequency statements instead of single-event probabilities, absolute risks instead of relative risks, mortality rates instead of survival rates, and natural frequencies instead of conditional probabilities. Psychological research on transparent visual and numerical forms of risk communication, as well as training of physicians in their use, is called for.

Statistical literacy is a necessary precondition for an educated citizenship in a technological democracy. Understanding risks and asking critical questions can also shape the emotional climate in a society so that hopes and anxieties are no longer as easily manipulated from outside and citizens can develop a better-informed and more relaxed attitude toward their health.
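The mammography example in the abstract is simple arithmetic. A short sketch, assuming baseline figures (4 vs. 3 deaths per 1,000 women) that are consistent with the abstract's "1 less woman out of 1,000" illustration but not stated in it, shows why relative and absolute risk figures feel so different:

```python
# Illustrative numbers: if about 4 in 1,000 women die of breast cancer
# without screening and 3 in 1,000 with screening, the absolute risk
# reduction is 1 in 1,000, yet the relative reduction is 25%.
deaths_without_screening = 4 / 1000
deaths_with_screening = 3 / 1000

absolute_risk_reduction = deaths_without_screening - deaths_with_screening
relative_risk_reduction = absolute_risk_reduction / deaths_without_screening

print(f"Absolute risk reduction: {absolute_risk_reduction:.4f} (1 in 1,000)")
print(f"Relative risk reduction: {relative_risk_reduction:.0%}")   # 25%
```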


Psychological Bulletin | 1989

Do studies of statistical power have an effect on the power of studies?

Peter Sedlmeier; Gerd Gigerenzer

The long-term impact of studies of statistical power is investigated using J. Cohen's (1962) pioneering work as an example. We argue that the impact is nil; the power of studies in the same journal that Cohen reviewed (now the Journal of Abnormal Psychology) has not increased over the past 24 years. In 1960 the median power (i.e., the probability that a significant result will be obtained if there is a true effect) was .46 for a medium size effect, whereas in 1984 it was only .37. The decline of power is a result of alpha-adjusted procedures. Low power seems to go unnoticed: only 2 out of 64 experiments mentioned power, and it was never estimated. Nonsignificance was generally interpreted as confirmation of the null hypothesis (if this was the research hypothesis), although the median power was as low as .25 in these cases. We discuss reasons for the ongoing neglect of power.
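For readers unfamiliar with the power figures quoted above, here is a hedged sketch of how the power of a two-sample t test for a "medium" effect (Cohen's d = 0.5) can be approximated; the sample size is illustrative, and the calculation uses a standard normal approximation rather than the exact noncentral t distribution:

```python
# Approximate power of a two-sided, two-sample t test (normal approximation).
# d is the standardized effect size (Cohen's d); n is the per-group sample size.
from math import sqrt
from statistics import NormalDist

def approx_power(d=0.5, n=30, alpha=0.05):
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)      # critical value for a two-sided test
    ncp = d * sqrt(n / 2)                  # noncentrality for two groups of size n
    return 1 - z.cdf(z_crit - ncp) + z.cdf(-z_crit - ncp)

print(round(approx_power(d=0.5, n=30), 2))  # ~0.49 with n = 30 per group
```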


Psychological Review | 2006

The priority heuristic: Making choices without trade-offs

Eduard Brandstätter; Gerd Gigerenzer; Ralph Hertwig

Bernoulli's framework of expected utility serves as a model for various psychological processes, including motivation, moral sense, attitudes, and decision making. To account for evidence at variance with expected utility, the authors generalize the framework of fast and frugal heuristics from inferences to preferences. The priority heuristic predicts (a) the Allais paradox, (b) risk aversion for gains if probabilities are high, (c) risk seeking for gains if probabilities are low (e.g., lottery tickets), (d) risk aversion for losses if probabilities are low (e.g., buying insurance), (e) risk seeking for losses if probabilities are high, (f) the certainty effect, (g) the possibility effect, and (h) intransitivities. The authors test how accurately the heuristic predicts people's choices, compared with previously proposed heuristics and 3 modifications of expected utility theory: security-potential/aspiration theory, transfer-of-attention-exchange model, and cumulative prospect theory.
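The abstract does not spell out the heuristic's steps, so the following is a rough sketch reconstructed from the published description for two-outcome gain gambles; the reason order and the thresholds (1/10 of the maximum gain, 0.1 on the probability scale) are recalled from the paper rather than stated here, so treat them as assumptions to be checked against the original:

```python
# Hedged sketch of the priority heuristic for two-outcome gain gambles.
# Each gamble is (min_gain, p_min, max_gain); reasons are checked in a fixed
# order and the first discriminating reason decides, with no trade-offs.
def priority_heuristic(g1, g2):
    (min1, pmin1, max1), (min2, pmin2, max2) = g1, g2
    aspiration = 0.1 * max(max1, max2)          # assumed: 1/10 of the maximum gain
    # Reason 1: minimum gains
    if abs(min1 - min2) >= aspiration:
        return g1 if min1 > min2 else g2
    # Reason 2: probabilities of the minimum gains (lower is better)
    if abs(pmin1 - pmin2) >= 0.1:               # assumed: 0.1 on the probability scale
        return g1 if pmin1 < pmin2 else g2
    # Reason 3: maximum gains
    return g1 if max1 > max2 else g2

# A sure 3,000 versus an 80% chance of 4,000 (certainty-effect-style choice)
print(priority_heuristic((3000, 1.0, 3000), (0, 0.2, 4000)))  # picks the sure 3,000
```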


Behavioral and Brain Sciences | 2000

Précis of simple heuristics that make us smart.

Peter M. Todd; Gerd Gigerenzer

How can anyone be rational in a world where knowledge is limited, time is pressing, and deep thought is often an unattainable luxury? Traditional models of unbounded rationality and optimization in cognitive science, economics, and animal behavior have tended to view decision-makers as possessing supernatural powers of reason, limitless knowledge, and endless time. But understanding decisions in the real world requires a more psychologically plausible notion of bounded rationality. In Simple heuristics that make us smart (Gigerenzer et al. 1999), we explore fast and frugal heuristics--simple rules in the mind's adaptive toolbox for making decisions with realistic mental resources. These heuristics can enable both living organisms and artificial systems to make smart choices quickly and with a minimum of information by exploiting the way that information is structured in particular environments. In this précis, we show how simple building blocks that control information search, stop search, and make decisions can be put together to form classes of heuristics, including: ignorance-based and one-reason decision making for choice, elimination models for categorization, and satisficing heuristics for sequential search. These simple heuristics perform comparably to more complex algorithms, particularly when generalizing to new data--that is, simplicity leads to robustness. We present evidence regarding when people use simple heuristics and describe the challenges to be addressed by this research program.
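As an illustration of the "one-reason decision making" mentioned above, here is a minimal sketch of a take-the-best-style lexicographic rule; the cue names, cue values, and validity ordering are invented for the example and are not code or data from the book:

```python
# One-reason decision making: look up cues in order of validity and decide
# on the first cue that discriminates between the two alternatives.
def take_the_best(a, b, cues):
    """cues: list of functions, ordered from most to least valid, each
    returning 1, 0, or None (cue value for an object, None = unknown)."""
    for cue in cues:
        va, vb = cue(a), cue(b)
        if va is not None and vb is not None and va != vb:
            return a if va > vb else b   # one discriminating reason decides
    return None                          # no cue discriminates: guess

# Illustrative cues for "which German city is larger?"
has_top_league_team = {"Munich": 1, "Herne": 0}
is_state_capital = {"Munich": 1, "Herne": 0}
cues = [lambda c: has_top_league_team.get(c), lambda c: is_state_capital.get(c)]
print(take_the_best("Munich", "Herne", cues))   # -> "Munich"
```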


Perspectives on Psychological Science | 2008

Why heuristics work

Gerd Gigerenzer

The adaptive toolbox is a Darwinian-inspired theory that conceives of the mind as a modular system that is composed of heuristics, their building blocks, and evolved capacities. The study of the adaptive toolbox is descriptive and analyzes the selection and structure of heuristics in social and physical environments. The study of ecological rationality is prescriptive and identifies the structure of environments in which specific heuristics either succeed or fail. Results have been used for designing heuristics and environments to improve professional decision making in the real world.


Psychopharmacology | 1999

Specific attentional dysfunction in adults following early start of cannabis use

Hannelore Ehrenreich; Thomas Rinn; Hanns Jürgen Kunert; M. R. Moeller; Wolfgang Poser; Lothar Schilling; Gerd Gigerenzer; Margret R. Hoehe

Rationale and objective: The present study tested the hypothesis that chronic interference by cannabis with endogenous cannabinoid systems during peripubertal development causes specific and persistent brain alterations in humans. As an index of cannabinoid action, visual scanning, along with other attentional functions, was chosen. Visual scanning undergoes a major maturation process around age 12–15 years and, in addition, the visual system is known to react specifically and sensitively to cannabinoids. Methods: From 250 individuals consuming cannabis regularly, 99 healthy pure cannabis users were selected. They were free of any other past or present drug abuse, or history of neuropsychiatric disease. After an interview, physical examination, analysis of routine laboratory parameters, plasma/urine analyses for drugs, and MMPI testing, users and respective controls were subjected to a computer-assisted attention test battery comprising visual scanning, alertness, divided attention, flexibility, and working memory. Results: Of the potential predictors of test performance within the user group, including present age, age of onset of cannabis use, degree of acute intoxication (THC+THCOH plasma levels), and cumulative toxicity (estimated total life dose), an early age of onset turned out to be the only predictor, predicting impaired reaction times exclusively in visual scanning. Early-onset users (onset before age 16; n = 48) showed a significant impairment in reaction times in this function, whereas late-onset users (onset after age 16; n = 51) did not differ from controls (n = 49). Conclusions: These data suggest that beginning cannabis use during early adolescence may lead to enduring effects on specific attentional functions in adulthood. Apparently, vulnerable periods during brain development exist that are subject to persistent alterations by interfering exogenous cannabinoids.


Journal of Experimental Psychology: Human Perception and Performance | 1988

Presentation and content: The use of base rates as a continuous variable

Gerd Gigerenzer; Wolfgang Hell; Hartmut Blank

Do subjects, in probability revision experiments, generally neglect base rates due to the use of a representativeness heuristic, or does the use of base rates depend on what we call the internal problem representation? In Experiment 1, we used Kahneman and Tversky's (1973) engineer-lawyer problem, where random sampling of descriptions is crucial to the internal representation of the problem as one in probability revision. If random sampling was performed and observed by the subjects themselves, then their judgments conformed more to Bayesian theory than to the representativeness hypothesis. If random sampling was only verbally asserted, judgments followed the representativeness heuristic. In Experiment 2 we used the soccer problem, which has the same formal structure but which the subjects' everyday experience already represents as a probability revision problem. With this change in content, subjects' judgments were indistinguishable from Bayesian performance. We conclude that by manipulating presentation and content, one can elicit either base rate neglect or base rate use, as well as points in between. This result suggests that representativeness is neither an all-purpose mental strategy nor even a tendency, but rather a function of the content and the presentation of crucial information.

From its origins circa 1660 until the mid-nineteenth century, probability theory was closely identified with rational thinking. In Laplace's famous phrase, probability theory was believed to be "only common sense reduced to calculus" (Laplace, 1814/1951, p. 196). For the classical probabilists, their calculus codified the intuitions of an elite of reasonable men in the face of uncertainty. And if these reasonable intuitions deviated from the laws of probability theory, it was the latter that were cast into doubt. Such discrepancies actually influenced the way in which probability theory developed mathematically (Daston, 1980). In the early decades of the nineteenth century, probability theory shifted from being a description of the intuitions of rational individuals to one of the behavior of the irrational masses (Porter, 1986). But in the 1960s and 1970s experimental psychology reestablished the link between probability theory and rational thinking under uncertainty. However, the new alliance differed from the old in two important respects. First, it was now probability theory, rather than intuitive judgments, that was the normative standard. Although probabilists have from time to time doubted whether the additivity law holds in all cases (Shafer, 1978), and although there is evidence that different statistical approaches suggest different answers to the same problem (Birnbaum, 1983), psychologists have generally assumed that statistics spoke with one voice, a necessary assumption for the new normative approach. Second, the link between probability theory and human thinking has become the subject of experimental research. First, by using urn-and-balls problems (e.g., Edwards, 1968; Phillips & Edwards, 1966) and then more
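A short worked sketch of the Bayesian benchmark against which base-rate neglect is assessed in problems like the engineer-lawyer task; the 30/70 base rates correspond to one of the conditions in the original problem, while the likelihoods for the personality description are purely illustrative:

```python
# Bayesian posterior for "the described person is an engineer", combining
# the base rate with the diagnosticity of the personality description.
def posterior_engineer(base_rate, p_desc_given_eng, p_desc_given_law):
    prior_eng = base_rate
    prior_law = 1 - base_rate
    num = p_desc_given_eng * prior_eng
    return num / (num + p_desc_given_law * prior_law)

# 30 engineers out of 100; illustrative likelihoods of 0.8 vs. 0.4.
# A judgment based on the description alone would be 0.8 / (0.8 + 0.4) = 0.67;
# the base rate pulls the Bayesian answer down.
print(round(posterior_engineer(0.30, 0.8, 0.4), 2))   # 0.46
```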


Academic Medicine | 1998

Using natural frequencies to improve diagnostic inferences

Ulrich Hoffrage; Gerd Gigerenzer

PURPOSE: To test whether physicians' diagnostic inferences can be improved by communicating information using natural frequencies instead of probabilities. Whereas probabilities and relative frequencies are normalized with respect to disease base rates, natural frequencies are not normalized. METHOD: The authors asked 48 physicians in Munich and Düsseldorf to determine the positive predictive values (PPVs) of four diagnostic tests. Information presented in the four problems appeared either as probabilities (the traditional way) or as natural frequencies. RESULTS: When the information was presented as probabilities, the physicians correctly estimated the PPVs in only 10% of cases. When the same information was presented as natural frequencies, that percentage increased to 46%. CONCLUSION: Representing information in natural frequencies is a fast and effective way of facilitating diagnostic insight, which in turn helps physicians to better communicate risks to patients, and patients to better understand these risks.
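A minimal sketch of the contrast the study tests, using illustrative numbers (a 1% base rate, 80% sensitivity, and a 10% false-positive rate, not the figures from the study's four problems): in the natural-frequency format, the positive predictive value can almost be read off the counts.

```python
# Probability format: PPV via Bayes' rule.
base_rate, sensitivity, false_positive_rate = 0.01, 0.80, 0.10
ppv = (sensitivity * base_rate) / (
    sensitivity * base_rate + false_positive_rate * (1 - base_rate)
)
print(f"PPV from probabilities: {ppv:.2f}")                      # ~0.07

# Natural-frequency format: the same information as counts of people.
population = 1000
sick = round(population * base_rate)                              # 10 people
true_positives = round(sick * sensitivity)                        # 8 test positive
false_positives = round((population - sick) * false_positive_rate)  # 99 test positive
print(f"PPV from counts: {true_positives}/{true_positives + false_positives}")  # 8/107
```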

Collaboration


Dive into Gerd Gigerenzer's collaborations.

Top Co-Authors
Peter M. Todd

Indiana University Bloomington


Lorenz Krüger

University of Göttingen


John Beatty

University of British Columbia
