Roberto Festa
University of Trieste
Publications
Featured research published by Roberto Festa.
Erkenntnis | 1997
Roberto Festa
An important problem in inductive probability theory is the design of exchangeable analogical methods, i.e., of exchangeable inductive methods that take into account certain considerations of analogy by similarity for predictive inferences. Here a precise reformulation of the problem of predictive analogy is given and a new family of exchangeable analogical methods is introduced.
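For readers unfamiliar with exchangeable inductive methods, the sketch below shows a classical example, Carnap's lambda-continuum, whose predictive probabilities depend only on the observed frequencies. It is offered purely to fix ideas: Festa's analogical methods add similarity-sensitive components whose exact form is not reproduced here.

# Carnap's lambda-continuum: a classical exchangeable inductive method.
# Predictive probability that the next observation is of type j, given that
# counts[j] of the first n observations were of type j, with k possible types:
#     P(next = j | evidence) = (counts[j] + lam / k) / (n + lam)
# The rule is exchangeable: it depends only on observed frequencies, not on
# the order of observations. (Analogy-by-similarity terms are omitted here.)

def carnap_predictive(counts, j, lam=2.0):
    """Predictive probability of type j under the lambda-continuum."""
    k = len(counts)      # number of possible types
    n = sum(counts)      # total number of observations so far
    return (counts[j] + lam / k) / (n + lam)

# Example: 3 types with observed frequencies (4, 1, 0).
print(carnap_predictive([4, 1, 0], j=0))  # close to the observed relative frequency
print(carnap_predictive([4, 1, 0], j=2))  # still positive, thanks to the prior term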
Synthese | 2012
Roberto Festa
Confirmation of a hypothesis by evidence can be measured by any of the currently known incremental measures of confirmation. As we show, incremental measures can be formally defined as the measures of confirmation satisfying a certain small set of basic conditions. Moreover, several kinds of incremental measure may be characterized on the basis of appropriate structural properties. In particular, we focus on the so-called Matthew properties: we introduce a family of six Matthew properties including the reverse Matthew effect; we further prove that incremental measures endowed with the reverse Matthew effect are possible; finally, we briefly consider the problem of the plausibility of Matthew properties.
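As a rough illustration of what "incremental" means here (not drawn from the paper itself, which characterizes the measures axiomatically), two standard incremental measures are sketched below; both are positive exactly when the evidence raises the probability of the hypothesis.

import math

# Two standard incremental measures of confirmation (illustrative only):
# both are > 0 iff P(H|E) > P(H), = 0 iff P(H|E) = P(H), and < 0 otherwise.

def difference_measure(p_h, p_h_given_e):
    """d(H, E) = P(H|E) - P(H)."""
    return p_h_given_e - p_h

def log_ratio_measure(p_h, p_h_given_e):
    """r(H, E) = log(P(H|E) / P(H))."""
    return math.log(p_h_given_e / p_h)

# Example: evidence raising P(H) from 0.2 to 0.5 confirms H on both measures.
print(difference_measure(0.2, 0.5))   # 0.3
print(log_ratio_measure(0.2, 0.5))    # about 0.92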
Synthese | 2013
Gustavo Cevolani; Roberto Festa; Theodorus Kuipers
In this paper, we address the problem of truth approximation through theory change, asking whether revising our theories by newly acquired data leads us closer to the truth about a given domain. More particularly, we focus on “nomic conjunctive theories”, i.e., theories expressed as conjunctions of logically independent statements concerning the physical or, more generally, nomic possibilities and impossibilities of the domain under inquiry. We define both a comparative and a quantitative notion of the verisimilitude of such theories, and identify suitable conditions concerning the (partial) correctness of acquired data, under which revising our theories by data leads us closer to “the nomic truth”, construed as the target of scientific inquiry. We conclude by indicating some further developments, generalizations, and open issues arising from our results.
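To make the idea of a quantitative verisimilitude measure for conjunctive theories concrete, here is a toy, count-based sketch in the spirit of the feature approach: a theory scores higher the more of its conjuncts agree with the truth and the fewer disagree. This is an assumption-laden illustration, not the definition given in the paper.

# Toy count-based verisimilitude for conjunctive theories (illustrative sketch).
# A theory is a conjunction of literals over n atomic claims; `truth` fixes the
# actual truth value of every atom. True conjuncts are rewarded, false ones
# penalized:  vs(T) = (t - f) / n.

def verisimilitude(theory, truth):
    """theory: dict atom -> claimed truth value; truth: dict atom -> actual value."""
    n = len(truth)
    t = sum(1 for atom, val in theory.items() if truth[atom] == val)
    f = sum(1 for atom, val in theory.items() if truth[atom] != val)
    return (t - f) / n

truth = {"a": True, "b": False, "c": True, "d": True}
print(verisimilitude({"a": True, "b": False}, truth))             # 0.5: two correct claims
print(verisimilitude({"a": True, "b": True, "c": False}, truth))  # -0.25: one hit, two misses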
Archive | 2009
Vincenzo Crupi; Roberto Festa; Carlo Buttasi
The foundations of a detailed grammar of Bayesian confirmation are presented as a theoretical tool for the formal analysis of reasoning in epistemology and philosophy of science. After a discussion of core intuitions grounding the measurement of confirmation in probabilistic terms, a number of basic, derived and structural properties of Bayesian incremental confirmation are defined, distinguished and investigated in their logical relationships. Illustrations are provided that a thorough development of this line of research would yield an appropriate general framework of inquiry for several analyses and debates surrounding confirmation and Bayesian confirmation in particular.
Synthese | 1986
Roberto Festa
The problem of distance from the truth, and more generally distance between hypotheses, is considered here with respect to the case of quantitative hypotheses concerning the value of a given scientific quantity. Our main goal consists in the explication of the concept of distance D(I, θ) between an interval hypothesis I and a point hypothesis θ. In particular, we attempt to give an axiomatic foundation of this notion on the basis of a small number of adequacy conditions. Moreover, the distance function introduced here is employed for the reformulation of the approach to scientific inference developed by Hintikka, Levi, and other scholars, labelled "cognitive decision theory". In this connection, we supply a concrete illustration of the rules for inductive acceptance of interval hypotheses that can be obtained on the basis of D(I, θ). Lastly, our approach is compared with other proposals made in the literature about verisimilitude and distance from the truth.
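Purely as an illustration of the kind of function at issue (the paper derives D(I, θ) axiomatically, and its exact form may differ), one simple candidate is the mean absolute distance of the points of the interval from θ:

# Hypothetical candidate for a distance D(I, theta) between an interval
# hypothesis I = [a, b] and a point hypothesis theta: the mean of |x - theta|
# for x ranging uniformly over [a, b]. Illustrative only.

def interval_point_distance(a, b, theta):
    if theta <= a:                      # theta lies left of the interval
        return (a + b) / 2 - theta
    if theta >= b:                      # theta lies right of the interval
        return theta - (a + b) / 2
    # theta inside the interval: split the integral of |x - theta| at theta
    return ((theta - a) ** 2 + (b - theta) ** 2) / (2 * (b - a))

print(interval_point_distance(0.0, 1.0, 0.5))   # 0.25: theta at the midpoint of I
print(interval_point_distance(0.0, 1.0, 2.0))   # 1.5: theta far outside I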
Philosophy of Science | 2017
Roberto Festa; Gustavo Cevolani
We explore the grammar of Bayesian confirmation by focusing on some likelihood principles, including the Weak Law of Likelihood. We show that none of the likelihood principles proposed so far is satisfied by all incremental measures of confirmation, and we argue that some of these measures indeed obey new, prima facie strange, antilikelihood principles. To prove this, we introduce a new measure that violates the Weak Law of Likelihood while satisfying a strong antilikelihood condition. We conclude by hinting at some relevant links between the likelihood principles considered here and other properties of Bayesian confirmation recently explored in the literature.
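The flavour of such principles can be conveyed with a small numerical experiment (a sketch under stated assumptions, not the paper's analytical results): taking the Weak Law of Likelihood to say that E favours H1 over H2 whenever P(E|H1) > P(E|H2) and P(E|not-H1) <= P(E|not-H2), one can randomly search for probability assignments on which a chosen measure violates it.

import random

# Brute-force search for counterexamples to the Weak Law of Likelihood (WLL)
# for the difference measure d(H, E) = P(H|E) - P(H). The joint distribution
# ranges over the 8 atoms generated by H1, H2, E (bits 0, 1, 2 of the index).

def random_joint():
    w = [random.random() for _ in range(8)]
    s = sum(w)
    return [x / s for x in w]

def prob(p, bits):
    """Probability of the conjunction specified by {bit: value}."""
    return sum(pi for i, pi in enumerate(p)
               if all(((i >> b) & 1) == v for b, v in bits.items()))

def cond(p, bits, given):
    denom = prob(p, given)
    return prob(p, {**bits, **given}) / denom if denom > 0 else None

def difference_measure(p, h_bit):
    return cond(p, {h_bit: 1}, {2: 1}) - prob(p, {h_bit: 1})

for _ in range(20000):
    p = random_joint()
    e_h1, e_h2 = cond(p, {2: 1}, {0: 1}), cond(p, {2: 1}, {1: 1})
    e_nh1, e_nh2 = cond(p, {2: 1}, {0: 0}), cond(p, {2: 1}, {1: 0})
    if None in (e_h1, e_h2, e_nh1, e_nh2):
        continue
    if e_h1 > e_h2 and e_nh1 <= e_nh2:                 # WLL antecedent holds
        if difference_measure(p, 0) <= difference_measure(p, 1):
            print("Candidate WLL violation found for the difference measure")
            break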
The British Journal for the Philosophy of Science | 2008
Vincenzo Crupi; Roberto Festa; Tommaso Mastropasqua
Bayesian epistemology postulates a probabilistic analysis of many sorts of ordinary and scientific reasoning. Huber ([2005]) has provided a novel criticism of Bayesianism, whose core argument involves a challenging issue: confirmation by uncertain evidence. In this paper, we argue that under a properly defined Bayesian account of confirmation by uncertain evidence, Huber's criticism fails. By contrast, our discussion will highlight what we take as some new and appealing features of Bayesian confirmation theory.
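For context (this illustrates a common Bayesian treatment of uncertain evidence, not necessarily the exact account defended in the paper), uncertain evidence is often handled by Jeffrey conditionalization: when experience shifts the probability of E to a new value q without making E certain, the hypothesis H is updated as P'(H) = q*P(H|E) + (1 - q)*P(H|not-E), and confirmation can then be gauged by comparing P'(H) with P(H).

# Jeffrey conditionalization sketch (illustrative; parameter values are made up).

def jeffrey_update(p_h_given_e, p_h_given_not_e, q):
    """Posterior P'(H) after the probability of E is shifted to q."""
    return q * p_h_given_e + (1 - q) * p_h_given_not_e

p_h = 0.3                                    # prior P(H), coherent with P(E) = 0.25
p_h_given_e, p_h_given_not_e = 0.6, 0.2
new_p_h = jeffrey_update(p_h_given_e, p_h_given_not_e, q=0.7)
print(new_p_h - p_h)                         # 0.18 > 0: the uncertain evidence confirms H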
Synthese | 2018
Gustavo Cevolani; Roberto Festa
Popper’s original definition of truthlikeness relied on a central insight: that truthlikeness combines truth and information, in the sense that a proposition is closer to the truth the more true consequences and the fewer false consequences it entails. As intuitively compelling as this definition may be, it is untenable, as proved long ago; still, one can arguably rely on Popper’s intuition to provide an adequate account of truthlikeness. To this aim, we mobilize some classical work on partial entailment in defining a new measure of truthlikeness which satisfies a number of desiderata. The resulting account has some interesting and surprising connections with other accounts on the market, thus shedding new light on current attempts at systematizing different approaches to verisimilitude.
Archive | 2012
Roberto Festa
Tendency hypotheses (T-hypotheses, for short), such as “the individuals of the kind Y tend to be X”, are used within several empirical sciences and play an important role in some of them, for instance in the social sciences. However, so far T-hypotheses have received little or no attention from philosophers of science and statisticians. An exception is the work done in the 1970s by the statisticians and social scientists David K. Hildebrand, James D. Laing, and Howard Rosenthal, who worked out, under the label of prediction logic, an interesting approach to the analysis of T-hypotheses.
Erkenntnis | 2011
Gustavo Cevolani; Vincenzo Crupi; Roberto Festa