Arthur Paul Pedersen
Max Planck Society
Publications
Featured research published by Arthur Paul Pedersen.
Journal of Symbolic Logic | 2010
Itaï Ben Yaacov; Arthur Paul Pedersen
Continuous first-order logic has found interest among model theorists who wish to extend the classical analysis of “algebraic” structures (such as fields, groups, and graphs) to various natural classes of complete metric structures (such as probability algebras, Hilbert spaces, and Banach spaces). With research in continuous first-order logic preoccupied with studying the model theory of this framework, a natural question calls for attention: Is there an interesting set of axioms yielding a completeness result? The primary purpose of this article is to show that a certain, interesting set of axioms does indeed yield a completeness result for continuous first-order logic. In particular, we show that in continuous first-order logic a set of formulae is (completely) satisfiable if (and only if) it is consistent. From this result it follows that continuous first-order logic also satisfies an approximated form of strong completeness, whereby Σ ⊨ φ (if and) only if Σ ⊢ φ ∸ 2^{-n} for all n. This approximated form of strong completeness asserts that if Σ ⊨ φ, then proofs from Σ, being finite, can provide arbitrarily good approximations of the truth of φ. Additionally, we consider a different kind of question traditionally arising in model theory: that of decidability. When is the set of all consequences of a theory (in a countable, recursive language) recursive? Say that a complete theory T is decidable if for every sentence φ, the value φ^T is a recursive real, and moreover, uniformly computable from φ. If T is incomplete, we say it is decidable if for every sentence φ the real number φ^T_o is uniformly recursive from φ, where φ^T_o is the maximal value of φ consistent with T. As in classical first-order logic, it follows from the completeness theorem of continuous first-order logic that if a complete theory admits a recursive (or even recursively enumerable) axiomatization, then it is decidable.
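The approximated strong completeness statement in the abstract can be set out as a single formula (here ∸ is truncated subtraction on [0,1]-valued formulae, so the right-hand side says that φ is provable to within 2^{-n}):

```latex
\[
\Sigma \models \varphi
\quad\Longleftrightarrow\quad
\Sigma \vdash \varphi \mathbin{\dot-} 2^{-n}
\ \text{ for every } n \in \mathbb{N}.
\]
```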
International Journal of Approximate Reasoning | 2012
Horacio L. Arló-Costa; Arthur Paul Pedersen
This paper considers varieties of probabilism capable of distilling paradox-free qualitative doxastic notions (e.g., full belief, expectation, and plain belief) from a notion of probability taken as a primitive. We show that core systems, collections of nested propositions expressible in the underlying algebra, can play a crucial role in these derivations. We demonstrate how the notion of a probability core can be naturally generalized to high probability, giving rise to what we call a high probability core, a notion that when formulated in terms of classical monadic probability coincides with the notion of stability proposed by Hannes Leitgeb [32]. Our work continues earlier work by one of us in collaboration with Rohit Parikh [7]. In turn, the latter work was inspired by the seminal work of Bas van Fraassen [46]. We argue that the adoption of dyadic probability as a primitive (as articulated by van Fraassen [46]) admits a smoother connection with the standard theory of probability cores as well as a better model in which to situate doxastic notions like full belief. We also illustrate how the basic structure underlying a system of cores naturally leads to alternative probabilistic acceptance rules, like the so-called ratio rule initially proposed by Isaac Levi [34]. Core systems in their various guises are ubiquitous in many areas of formal epistemology (e.g., belief revision, the semantics of conditionals, modal logic, etc.). We argue that core systems can also play a natural and important role in Bayesian epistemology and decision theory.
In fact, the final part of the article shows that probabilistic core systems are naturally derivable from basic decision-theoretic axioms which incorporate only qualitative aspects of core systems; that the qualitative aspects of core systems alone can be naturally integrated in the articulation of coherence of primitive conditional probability; and that the guiding idea behind the primary qualitative features of a core system gives rise to the formulation of lexicographic decision rules.
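Leitgeb's stability notion mentioned in the abstract admits a direct finite sketch: an event A is stable (with threshold 1/2) when P(A | B) > 1/2 for every positive-probability event B compatible with A. The brute-force check below runs over a toy four-point distribution; the states and numbers are invented for illustration and are not from the article. Note that the stable sets it finds come out nested, as a system of cores requires.

```python
from itertools import combinations

def powerset(states):
    """All non-empty subsets of a finite state space."""
    s = sorted(states)
    return [frozenset(c) for r in range(1, len(s) + 1)
            for c in combinations(s, r)]

def prob(p, event):
    """Probability of an event under a pointwise distribution p."""
    return sum(p[w] for w in event)

def is_stable(p, a, states):
    """A is stable (threshold 1/2) iff P(A | B) > 1/2 for every event B
    of positive probability that is compatible with A (B ∩ A nonempty)."""
    return all(prob(p, a & b) > prob(p, b) / 2
               for b in powerset(states)
               if b & a and prob(p, b) > 0)

# Toy four-state distribution (invented for illustration).
p = {"w1": 0.54, "w2": 0.2, "w3": 0.2, "w4": 0.06}
states = frozenset(p)
stable_sets = [set(a) for a in powerset(states) if is_stable(p, a, states)]
# The stable sets are nested: {w1} ⊂ {w1,w2,w3} ⊂ {w1,w2,w3,w4},
# i.e. they form a system of cores.
```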
Belief Revision meets Philosophy of Science | 2010
Horacio L. Arló-Costa; Arthur Paul Pedersen
This chapter elaborates on foundational issues in the social sciences and their impact on the contemporary theory of belief revision. Recent work in the foundations of economics has focused on the role external social norms play in choice. Amartya Sen (Econometrica 61(3): 495–521, 1993) has argued that the traditional rationalizability approach used in the theory of rational choice has serious problems accommodating the role of social norms. Sen’s more recent work (Language, world and reality 1996, pp. 19–31; Econometrica 65 (4): 745–779, 1997) proposes how one might represent social norms in the theory of choice, and in a very recent article Walter Bossert and Kotaro Suzumura (Social norms and rationality of choice, preprint, 2007) develop Sen’s proposal, offering an extension of the classical theory of choice that is capable of dealing with social norms.
Minds and Machines | 2016
Ralph Hertwig; Arthur Paul Pedersen
In November 2013, we held an interdisciplinary workshop at the Max Planck Institute for Human Development in Berlin entitled ‘‘Finding Foundations for Bounded and Adaptive Rationality.’’ The invited speakers and discussants included psychologists and cognitive, computer, and decision scientists, as well as philosophers; the late Patrick Suppes gave a video presentation from his office at Stanford University. Each presentation had two discussants, one from philosophy and one from the sciences. The discourse that ensued among the workshop’s participants was intensive and constructive, resulting in the eight articles that comprise this special issue of Minds and Machines. In organizing the workshop, we pursued two interrelated goals. The first was to facilitate critical discussion about old and new problems in the study of rationality, particularly those raised and addressed by the simple-heuristics program, a research paradigm that, since the mid-1990s, has pursued a novel vision of bounded rationality. The second goal was to transcend the conventional division of labor between behavioral decision scientists and philosophers. Over many decades, the two sets of researchers appear to have agreed on a labor contract. Philosophers are to explicate the nature of rationality and articulate its normative standards. Taking these normative standards lock, stock, and barrel, behavioral decision scientists are then to empirically investigate people’s behavior to ascertain the extent to which
Studia Logica | 2014
Arthur Paul Pedersen
I introduce a mathematical account of expectation based on a qualitative criterion of coherence for qualitative comparisons between gambles (or random quantities). The qualitative comparisons may be interpreted as an agent’s comparative preference judgments over options or more directly as an agent’s comparative expectation judgments over random quantities. The criterion of coherence is reminiscent of de Finetti’s quantitative criterion of coherence for betting, yet it does not impose an Archimedean condition on an agent’s comparative judgments, it does not require the binary relation reflecting an agent’s comparative judgments to be reflexive, complete or even transitive, and it applies to an absolutely arbitrary collection of gambles, free of structural conditions (e.g., closure, measurability, etc.). Moreover, unlike de Finetti’s criterion of coherence, the qualitative criterion respects the principle of weak dominance, a standard of rational decision making that obliges an agent to reject a gamble that is possibly worse and certainly no better than another gamble available for choice. Despite these weak assumptions, I establish a qualitative analogue of de Finetti’s Fundamental Theorem of Prevision, from which it follows that any coherent system of comparative expectations can be extended to a weakly ordered coherent system of comparative expectations over any collection of gambles containing the initial set of gambles of interest. The extended weakly ordered coherent system of comparative expectations satisfies familiar additivity and scale invariance postulates (i.e., independence) when the extended collection forms a linear space. In the course of these developments, I recast de Finetti’s quantitative account of coherent prevision in the qualitative framework adopted in this article. 
I show that comparative previsions satisfy qualitative analogues of de Finetti’s famous bookmaking theorem and his Fundamental Theorem of Prevision. The results of this article complement those of another article (Pedersen, Strictly coherent preferences, no holds barred, Manuscript, 2013). I explain how those results entail that any coherent weakly ordered system of comparative expectations over a unital linear space can be represented by an expectation function taking values in a (possibly non-Archimedean) totally ordered field extension of the system of real numbers. The ordered field extension consists of formal power series in a single infinitesimal, a natural and economical representation that provides a relief map tracing numerical non-Archimedean features to qualitative non-Archimedean features.
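The principle of weak dominance invoked in the abstract is easy to state concretely for gambles on a finite state space: one gamble weakly dominates another when it is certainly no worse and possibly better. The sketch below is a toy illustration with invented payoffs, not material from the article.

```python
def weakly_dominates(x, y, states):
    """x weakly dominates y iff x pays at least as much as y in every
    state and strictly more in at least one state."""
    return (all(x[w] >= y[w] for w in states)
            and any(x[w] > y[w] for w in states))

states = ["heads", "tails"]
x = {"heads": 1.0, "tails": 0.0}   # pays 1 if heads, 0 if tails
y = {"heads": 1.0, "tails": -0.5}  # same if heads, strictly worse if tails
# y is possibly worse and certainly no better than x, so a coherent agent
# in the article's sense must reject y when x is available.
```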
Archive | 2011
Horacio Arló Costa; Arthur Paul Pedersen
Herb Simon pioneered the study of bounded models of rationality. Simon famously argued that decision makers typically satisfice rather than optimize. According to Simon, a decision maker normally chooses an alternative that meets or exceeds specified criteria, even when this alternative is not guaranteed to be unique or in any sense optimal. For example, Simon argued that an organism – instead of scanning all the possible alternatives, computing each probability of every outcome of each alternative, calculating the utility of each alternative, and thereupon selecting the optimal option with respect to expected utility – typically chooses the first option that satisfies its “aspiration level.”
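Simon's contrast between satisficing and optimizing can be sketched in a few lines. The options, utilities, and aspiration level below are hypothetical, invented purely for illustration.

```python
def satisfice(options, utility, aspiration):
    """Simon-style satisficing: return the first option whose utility
    meets or exceeds the aspiration level, or None if none does."""
    for option in options:
        if utility(option) >= aspiration:
            return option
    return None

def optimize(options, utility):
    """The classical ideal: scan every option and maximize utility."""
    return max(options, key=utility)

# Hypothetical job offers, encountered in this order.
offers = ["clerk", "teacher", "engineer", "pilot"]
salary = {"clerk": 30, "teacher": 45, "engineer": 80, "pilot": 95}

first_good_enough = satisfice(offers, salary.get, aspiration=40)  # "teacher"
best = optimize(offers, salary.get)                               # "pilot"
```

The satisficer stops at the first option meeting its aspiration level; the optimizer must inspect every alternative before choosing.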
Philosophy of Science | 2012
Arthur Paul Pedersen; Clark Glymour
In an essay recently published in this journal, Branden Fitelson argues that a variant of Miller’s argument for the language dependence of the accuracy of predictions can be applied to Joyce’s notion of accuracy of credences formulated in terms of scoring rules, resulting in a general potential problem for Joyce’s argument for probabilism. We argue that no relevant problem of the sort Fitelson supposes arises since his main theorem and his supporting arguments presuppose the validity of nonlinear transformations of credence functions that Joyce’s theory, charitably construed, would identify as invalid on the basis of the principle of simple dominance.
Synthese | 2013
Horacio L. Arló-Costa; Arthur Paul Pedersen
Gerd Gigerenzer and Thomas Sturm have recently proposed a modest form of what they describe as a normative, ecological and limited naturalism. The basic move in their argument is to infer that certain heuristics we tend to use should be used in the right ecological setting. To address this argument, we first consider the case of a concrete heuristic called Take the Best (TTB). There are at least two variants of the heuristic, which we study by making explicit the choice functions they induce, extending these variants of TTB beyond binary choice. We argue that the naturalistic argument can be applied to only one of the two variants of the heuristic; we also argue that the argument for the extension requires paying attention to other “rational” virtues of heuristics aside from efficacy, speed, and frugality. This notwithstanding, we show that there is a way of extending the right variant of TTB to obtain a very well behaved heuristic that could be used to offer a stronger case for the naturalistic argument (in the sense that if this heuristic is used, it is also a heuristic that we should use). The second part of the article considers attempts to extend the naturalistic argument from algorithms dealing with inference to heuristics dealing with choice. Our focus is the so-called Priority Heuristic, which we extend from risk to uncertainty. In this setting, the naturalist argument seems more difficult to formulate, if it remains feasible at all. Normativity seems in this case extrinsic to the heuristic, whose main virtue seems to be its ability to describe actual patterns of choice. But it seems that a new version of the naturalistic argument used with partial success in the case of inference is unavailable to solve the normative problem of whether we should exhibit the patterns of choice that we actually display.
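In its standard binary-choice form, Take the Best scans cues in order of validity and decides on the first cue that discriminates between the two options. The sketch below is the textbook form of TTB, not either of the article's two variants, and the cue profiles are hypothetical.

```python
def take_the_best(a, b, cues):
    """Textbook Take the Best for binary choice: go through the cues in
    order of validity and decide on the first cue that discriminates
    between the options; if no cue discriminates, guess (here: None)."""
    for cue in cues:
        va, vb = cue.get(a, 0), cue.get(b, 0)
        if va != vb:
            return a if va > vb else b
    return None

# Which city is larger? Hypothetical binary cue profiles.
is_capital = {"Berlin": 1, "Munich": 0}
has_team = {"Berlin": 1, "Munich": 1}
cues = [is_capital, has_team]  # ordered by (assumed) validity

# The capital cue discriminates first, so TTB stops there and never
# consults the remaining cues.
choice = take_the_best("Berlin", "Munich", cues)  # "Berlin"
```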
Proceedings of the 9th International Symposium on Imprecise Probability: Theories and Applications (ISIPTA 2015) | 2015
Arthur Paul Pedersen; Gregory R. Wheeler
Theoretical Aspects of Rationality and Knowledge | 2013
Eric Pacuit; Arthur Paul Pedersen; Jan-Willem Romeijn