Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Peter Milne is active.

Publication


Featured research published by Peter Milne.


The British Journal for the Philosophy of Science | 1997

Bruno de Finetti and the logic of conditional events

Peter Milne

This article begins by outlining some of the history—beginning with brief remarks of Quine's—of work on conditional assertions and conditional events. The upshot of the historical narrative is that diverse works from various starting points have circled around a nexus of ideas without convincingly tying them together. Section 3 shows how ideas contained in a neglected article of de Finetti's lead to a unified treatment of the topics based on the identification of conditional events as the objects of conditional bets. The penultimate section explores some of the consequences of the resulting logic of conditional events while the last defends it.
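As a gloss on the betting identification (a standard rendering of de Finetti's idea, not a quotation from the article), the conditional event B|A can be read off the fate of a bet on B made conditionally on A:

\[
\begin{array}{l|l}
\text{outcome} & \text{bet on } B \text{ conditional on } A \\
\hline
A \wedge B & \text{won} \quad (B \mid A \text{ true}) \\
A \wedge \neg B & \text{lost} \quad (B \mid A \text{ false}) \\
\neg A & \text{called off} \quad (B \mid A \text{ void})
\end{array}
\]

On this reading the fair betting quotient for the conditional bet is the conditional probability P(B | A), and the "called off" case is what generates the kind of three-valued treatment of conditional events the abstract alludes to.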


Synthese | 1994

Classical harmony: rules of inference and the meaning of the logical constants

Peter Milne

The thesis that, in a system of natural deduction, the meaning of a logical constant is given by some or all of its introduction and elimination rules has recently been developed in the work of Dummett, Prawitz, Tennant, and others by the addition of harmony constraints: introduction and elimination rules for a logical constant must be in harmony. By deploying harmony constraints, these authors have arrived at logics no stronger than intuitionist propositional logic. Classical logic, they maintain, cannot be justified from this proof-theoretic perspective. This paper argues that, while classical logic can be formulated so as to satisfy a number of harmony constraints, the meanings of the standard logical constants cannot all be given by their introduction and/or elimination rules; negation, in particular, comes under close scrutiny.
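For orientation, the conjunction rules are the stock textbook example of introduction and elimination rules in harmony (an illustration of the constraint, not an excerpt from the paper): the eliminations extract exactly what the introduction required, no more and no less.

\[
\frac{A \qquad B}{A \wedge B}\;(\wedge\mathrm{I})
\qquad
\frac{A \wedge B}{A}\;(\wedge\mathrm{E}_1)
\qquad
\frac{A \wedge B}{B}\;(\wedge\mathrm{E}_2)
\]

A characteristically classical rule such as double negation elimination, from \(\neg\neg A\) infer \(A\), is not matched to the negation introduction rule in this tidy way, which is the sort of mismatch behind the paper's close scrutiny of negation.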


Thinking & Reasoning | 2012

Indicative conditionals, conditional probabilities, and the “defective truth-table”: A request for more experiments

Peter Milne

While there is now considerable experimental evidence that, on the one hand, participants assign to the indicative conditional as probability the conditional probability of consequent given antecedent and, on the other, they assign to the indicative conditional the “defective truth-table” in which a conditional with false antecedent is deemed neither true nor false, these findings do not in themselves establish which multi-premise inferences involving conditionals participants endorse. A natural extension of the truth-table semantics pronounces as valid numerous inference patterns that do seem to be part of ordinary usage. However, coupled with something the probability account gives us—namely that when conditional-free ϕ entails conditional-free ψ, “if ϕ then ψ” is a trivial, uninformative truth—we have enough logic to derive the paradoxes of material implication. It thus becomes a matter of some urgency to determine which inference patterns involving indicative conditionals participants do endorse. Only thus will we be able to arrive at a realistic, systematic semantics for the indicative conditional.
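Schematically, the two experimental findings cited above are (restated here for convenience, not additional claims of the paper):

\[
P(\text{if } A \text{ then } B) \;=\; P(B \mid A)
\qquad\text{and}\qquad
\begin{array}{cc|c}
A & B & \text{if } A \text{ then } B \\
\hline
T & T & T \\
T & F & F \\
F & T & \text{neither true nor false} \\
F & F & \text{neither true nor false}
\end{array}
\]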


Archive | 2015

Inversion principles and introduction rules

Peter Milne

Following Gentzen’s practice, borrowed from intuitionist logic, Prawitz takes the introduction rule(s) for a connective to show how to prove a formula with the connective dominant. He proposes an inversion principle to make more exact Gentzen’s talk of deriving elimination rules from introduction rules. Here I look at some recent work pairing Gentzen’s introduction rules with general elimination rules. After outlining a way to derive Gentzen’s own elimination rules from his introduction rules, I give a very different account of introduction rules in order to pair them with general elimination rules in such a way that elimination rules can be read off introduction rules, introduction rules can be read off elimination rules, and both sets of rules can be read off classical truth-tables. Extending to include quantifiers, we obtain a formulation of classical first-order logic with the subformula property.
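To illustrate the pairing at issue (a standard example from the general-elimination literature, not a quotation from the chapter), compare Gentzen's elimination rules for conjunction with the general elimination rule, whose conclusion C may be anything derivable from the two conjuncts taken together as assumptions:

\[
\frac{A \wedge B}{A}\;(\wedge\mathrm{E}_1)
\qquad
\frac{A \wedge B}{B}\;(\wedge\mathrm{E}_2)
\qquad\text{versus}\qquad
\frac{A \wedge B \qquad \begin{array}{c}[A]\;[B] \\ \vdots \\ C\end{array}}{C}\;(\wedge\mathrm{GE})
\]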


Review of Symbolic Logic | 2010

Subformula and separation properties in natural deduction via small Kripke models

Peter Milne

Various natural deduction formulations of classical, minimal, intuitionist, and intermediate propositional and first-order logics are presented and investigated with respect to satisfaction of the separation and subformula properties. The technique employed is, for the most part, semantic, based on general versions of the Lindenbaum and Lindenbaum-Henkin constructions. Careful attention is paid (i) to which properties of theories result in the presence of which rules of inference, and (ii) to restrictions on the sets of formulas to which the rules may be employed, restrictions determined by the formulas occurring as premises and conclusion of the invalid inference for which a counterexample is to be constructed. We obtain an elegant formulation of classical propositional logic with the subformula property and a singularly inelegant formulation of classical first-order logic with the subformula property, the latter, unfortunately, not a product of the strategy otherwise used throughout the article. Along the way, we arrive at an optimal strengthening of the subformula results for classical first-order logic obtained as consequences of normalization theorems by Dag Prawitz and Gunnar Stålmarck.

§1. Introduction. Although what follows can be seen as containing a contribution to the debate over proof-theoretic semantics and the revisionist challenge posed by such authors as Dag Prawitz, Michael Dummett, and Neil Tennant, its original motivation lies in two technical concerns. One of these starts out from the observation that in providing counterexamples to invalid inference patterns one does not follow the sequence set out in standard completeness proofs employing the Lindenbaum and Henkin-Lindenbaum constructions: when we come to construct Kripke models with a view to showing a certain sequent underivable in intuitionist logic, be it propositional or first order, we attend at most to the behavior of subformulas of the formulas in the sequent and, in the first-order case, instances of quantified formulas, subformulas of these, instances of these, and so on, at nodes in a suitable model; we do not attend to all formulas of the language in question; tacitly we let our model take care of the others—soundness guarantees that nothing untoward happens. Put another way, we can see this as a delimiting of the language of interest, a very narrow delimiting indeed, to the formulas of the sequent in question and their subformulas. This ties the first motive to the second: the fact that, given completeness of, say, intuitionist propositional logic with respect to a semantics, the separation property yields completeness results for various subsets of the rules when applied to semantically valid sequents composed of formulas containing only connectives governed by those rules, and the subformula property yields completeness under an even greater restriction, namely that the rules governing connectives that occur in a valid sequent be confined in application
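To illustrate the first observation with a stock example (mine, not the article's): to show that ¬¬p ⊬ p in intuitionist propositional logic one builds a two-node Kripke model and checks only the subformulas involved.

\[
w_0 \le w_1, \qquad w_0 \nVdash p, \qquad w_1 \Vdash p .
\]

Since p holds at w_1, ¬p fails at both nodes, so ¬¬p holds at w_0 while p does not; only the subformulas p, ¬p, and ¬¬p ever needed checking.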


Journal of Philosophical Logic | 2004

Algebras of Intervals and a Logic of Conditional Assertions

Peter Milne

Intervals in Boolean algebras enter into the study of conditional assertions (or events) in two ways: directly, either from intuitive arguments or from Goodman, Nguyen and Walker's representation theorem, as suitable mathematical entities to bear conditional probabilities, or indirectly, via a representation theorem for the family of algebras associated with de Finetti's three-valued logic of conditional assertions/events. Further representation theorems forge a connection with rough sets. The representation theorems and an equivalent of the Boolean prime ideal theorem yield an algebraic completeness theorem for the three-valued logic. This in turn leads to a Henkin-style completeness theorem. Adequacy with respect to a family of Kripke models for de Finetti's logic, Łukasiewicz's three-valued logic and Priest's Logic of Paradox is demonstrated. The extension to first-order yields a short proof of adequacy for Körner's logic of inexact predicates.
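In outline, and only as an illustration of the interval idea rather than a statement of the paper's theorems, the conditional event b|a is identified with an interval of the underlying Boolean algebra, bounded below by the conjunction and above by the material conditional:

\[
(b \mid a) \;\longleftrightarrow\; [\,a \wedge b,\; \neg a \vee b\,] \;=\; \{\, x : a \wedge b \le x \le \neg a \vee b \,\}.
\]

The interval collapses to a single point exactly when a is the top element, in which case (b | a) behaves as b itself.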


International Journal of Approximate Reasoning | 1996

Avoiding Triviality: Fuzzy Implications and Conditional Events

Peter Milne

This paper investigates four questions. What are the logical presuppositions underlying classical probability that have a role to play in David Lewis's proof of triviality concerning probabilities of conditionals and conditional probabilities? To what extent and how are they avoided in fuzzy logics when we treat semantic evaluations as the analogues of probability distributions? The introduction into the classical setting of conditional events (or assertions)—as opposed to implications—as a class of objects whose probabilities are equated with conditional probabilities has been the object of much recent investigation. To what extent, if any, can fuzzy logics accommodate the analogues of conditional events? How is triviality avoided in conditional event algebras?
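For readers without the background result to hand, the shape of Lewis's argument is roughly this (an outline of the standard proof, assuming the class of probability functions is closed under conditionalisation and that P(A → B) = P(B | A) holds throughout that class):

\[
\begin{aligned}
P(A \to B) &= P(A \to B \mid B)\,P(B) + P(A \to B \mid \neg B)\,P(\neg B) \\
           &= P(B \mid A \wedge B)\,P(B) + P(B \mid A \wedge \neg B)\,P(\neg B) \\
           &= 1 \cdot P(B) + 0 \cdot P(\neg B) \;=\; P(B),
\end{aligned}
\]

so P(B | A) = P(B) whenever the relevant terms are defined. The paper's opening questions concern which presuppositions of an argument of this shape are in play and how fuzzy logics and conditional event algebras avoid them.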


Analysis | 2003

Bayesianism v. scientific realism

Peter Milne

Scientific realism holds that we have good reason to regard our current best scientific theories as approximately true. Faced with the threat that corresponding to any given theory there can be inelegant, ad hoc, gerrymandered alternatives that accommodate the data of observation and experiment equally well, the scientific realist takes that which makes for the bestness of our current best theories, not just their empirical adequacy, as evidence for their (approximate) truth. Possession of theoretical virtues becomes an evidential consideration. In Stathis Psillos's book on scientific realism we read:


History and Philosophy of Logic | 1995

On the completeness of non-Philonian Stoic logic

Peter Milne

The majority of formal accounts attribute to Stoic logicians the classical truth-functional understanding of the material conditional and exclusive disjunction. These interpretations were disputed, some Stoic logicians favouring modal and/or temporal analyses; moreover, what comes down to us of Stoic logic fails to secure the classical interpretations on purely formal grounds. It is therefore of some interest to see how the non-classical interpretations fare. I argue that the strongest logic we have good grounds to attribute to Stoic logicians is not complete with respect to the non-classical interpretations of disjunction and the conditional.


Journal of Logic, Language and Information | 2012

Probability as a Measure of Information Added

Peter Milne

Some propositions add more information to bodies of propositions than do others. We start with intuitive considerations on qualitative comparisons of information added. Central to these are considerations bearing on conjunctions and on negations. We find that we can discern two distinct, incompatible, notions of information added. From the comparative notions we pass to quantitative measurement of information added. In this we borrow heavily from the literature on quantitative representations of qualitative, comparative conditional probability. We look at two ways to obtain a quantitative conception of information added. One, the most direct, mirrors Bernard Koopman’s construction of conditional probability: by making a strong structural assumption, it leads to a measure that is, transparently, some function of a function P which is, formally, an assignment of conditional probability (in fact, a Popper function). P reverses the information added order and mislocates the natural zero of the scale so some transformation of this scale is needed but the derivation of P falls out so readily that no particular transformation suggests itself. The Cox–Good–Aczél method assumes the existence of a quantitative measure matching the qualitative relation, and builds on the structural constraints to obtain a measure of information that can be rescaled as, formally, an assignment of conditional probability. A classical result of Cantor’s, subsequently strengthened by Debreu, goes some way towards justifying the assumption of the existence of a quantitative scale. What the two approaches give us is a pointer towards a novel interpretation of probability as a rescaling of a measure of information added.
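Purely by way of illustration, and writing Γ for the background body of propositions (the abstract itself notes that no particular transformation is singled out), a rescaling with the right formal features would be something like

\[
\mathrm{inf}(a \mid \Gamma) \;=\; 1 - P(a \mid \Gamma),
\]

which reverses the order induced by P and places the zero of the scale at propositions already entailed by Γ, since these add no information.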

Collaboration


Dive into Peter Milne's collaborations.
