Neil Tennant
Ohio State University
Publications
Featured research published by Neil Tennant.
Journal of Symbolic Logic | 1989
W. D. Hart; Neil Tennant
Anti-realism is a doctrine about logic, language, and meaning with roots in the work of Wittgenstein and Frege. In this book, the author clarifies Dummett's case for anti-realism and develops his arguments further. He concludes by advocating a radical reform of our logical practices.
Studia Logica | 1984
Neil Tennant
This paper treats entailment as a subrelation of classical consequence and deducibility. Working with a Gentzen set-sequent system, we define an entailment as a substitution instance of a valid sequent all of whose premisses and conclusions are necessary for its classical validity. We also define a sequent Proof as one in which there are no applications of cut or dilution. The main result is that the entailments are exactly the Provable sequents. There are several important corollaries. Every unsatisfiable set is Provably inconsistent. Every logical consequence of a satisfiable set is Provable therefrom. Thus our system is adequate for ordinary mathematical practice. Moreover, transitivity of Proof fails upon accumulation of Proofs only when the newly combined premisses are inconsistent anyway, or the conclusion is a logical truth; in either case, Proofs that show this can be effectively determined from the Proofs given. Thus transitivity fails where it least matters; arguably, where it ought to fail! We show also that entailments hold by virtue of logical form that is insufficient either to render the premisses inconsistent or to render the conclusion logically true. The Lewis paradoxes are not Provable. Our system is distinct from Anderson and Belnap's system of first-degree entailment, and from Johansson's minimal logic. Although the Curry set paradox is still Provable within naive set theory, our system offers the prospect of a more sensitive paraconsistent reconstruction of mathematics. It may also find applications within the logic of knowledge and belief.
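A small illustration of the key definition (our example, not drawn from the paper itself): the sequent A, A → B : B is perfectly valid, since deleting any premiss, or the conclusion, destroys its classical validity. By contrast, the Lewis sequent A, ∼A : B is classically valid but fails the test: its premiss set is already unsatisfiable, so the conclusion B is not necessary for the sequent's validity and could be replaced by anything. On this account the first sequent is Provable and the second is not.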
Archive | 1999
Neil Tennant
I argue for a rule-based account of negation answering to both constructivist and relevantist demands. We can give such an account in terms of basic contrarieties, and by co-inductively defining proofs and disproofs, without having to make explicit appeal to the absurdity constant ⊥. If we do make such an appeal, it is to ⊥ only as a structural punctuation marker within deductions, a device that allows us to assimilate disproofs to the general class of proofs. ⊥ does not, in this role, need to be governed by any ‘introduction’ or ‘elimination’ rules of its own. Nor does ⊥ need to be treated as a propositional constant eligible for embedding within other sentences. But even if we do treat ⊥ as an embeddable propositional constant, it does not follow that negation can, let alone should, be defined in terms of it. Negation should be taken as primitive, and one should explain how a grasp of its sense arises from one’s prior grasp of primitive metaphysical contrarieties within an interpreted language.
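For concreteness, the rules in question can be rendered in the familiar natural-deduction style (our sketch, with ⊥ serving purely as punctuation for a dead end):

  (∼I)  If the assumption A leads to ⊥, conclude ∼A, discharging A.
  (∼E)  From A and ∼A, conclude ⊥.

Here ⊥ is licensed directly by primitive contrarieties, as when one concludes ⊥ from ‘a is red all over’ and ‘a is green all over’. On this rendering ⊥ needs no introduction or elimination rules of its own, and it never occurs embedded within a sentence.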
Australasian Journal of Philosophy | 2001
Neil Tennant
Michael Hand and Jonathan Kvanvig claim that ‘Tennant grants the validity of the reductio and agrees with realists that (1) should be denied.’ This is not so. It is one thing to analyse in detail the precise logical structure of the would-be reductio; yet it is another thing entirely to grant its validity (in a way that carries with it recognition that a crisis of thought has been precipitated). To be clear for the record: I do not grant the validity of the would-be reductio in the important sense just clarified. Even though I grant the formal correctness of the reductio, I am more inclined to refrain from asserting (2) than I am inclined to refrain from asserting (1). The advantage of setting out in detail the precise formal structure of the would-be reductio as I did in the text is that one has a complete inventory of every premiss and step of inference that might be called into question by one who refuses to accept the result. Among those premisses and steps, in the present instance, are the following.
The British Journal for the Philosophy of Science | 1994
Neil Tennant
The theory of theory change has contraction and revision as its central notions. Of these, contraction is the more fundamental. The best-known theory, due to Alchourrón, Gärdenfors, and Makinson, is based on a few central postulates. The most fundamental of these is the principle of recovery: if one contracts a theory with respect to a sentence, and then adds that sentence back again, one recovers the whole theory. Recovery is demonstrably false. This paper shows why, and investigates how one can nevertheless characterize contraction in a theoretically fruitful way. The theory proposed lends itself to implementation, which in turn could yield new theoretical insights. The main proposal is a ‘staining algorithm’ which identifies which sentences to reject when contracting a theory. The algorithm requires one to be clear about the structure of reasons one has for including sentences within one's theory.
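A standard counterexample of the kind at issue (our illustration): let K be the logical closure of {p}. To contract K with respect to p ∨ q one must give up p, since p logically implies p ∨ q. Recovery then demands that re-adding p ∨ q yields all of K, hence p itself; equivalently, it forces the contracted theory to retain the conditional (p ∨ q) → p. But a theory that has genuinely given up p ∨ q has no business retaining that conditional, and believing p ∨ q alone is no reason to believe p.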
Journal of Symbolic Logic | 1987
Neil Tennant
Relevance logic began in an attempt to avoid the so-called fallacies of relevance. These fallacies can take implicational form or deductive form. For example, Lewis's first paradox can beset a system in implicational form, in that the system contains as a theorem the formula (A & ∼A) → B; or it can beset it in deductive form, in that the system allows one to deduce B from the premisses A, ∼A. Relevance logic in the tradition of Anderson and Belnap has been almost exclusively concerned with characterizing a relevant conditional. Thus it has attacked the problem of relevance in its implicational form. Accordingly, for a relevant conditional → one would not have as a theorem the formula (A & ∼A) → B. Other theorems even of minimal logic would also be lacking; perhaps the most important among these is A → (B → A). It is also a well-known feature of their system R that it lacks the intuitionistically valid formula ((A ∨ B) & ∼A) → B (disjunctive syllogism). But it is not the case that any relevance logic worth the title even has to concern itself with the conditional, and hence with the problem in its implicational form. The problem arises even for a system without the conditional primitive. It would still be an exercise in relevance logic, broadly construed, to formulate a deductive system free of the fallacies of relevance in deductive form, even if this were done in a language whose only connectives were, say, &, ∨ and ∼. Solving the problem of relevance in this more basic deductive form is arguably a precondition for solving it for the conditional, if we suppose (as is reasonable) that the relevant conditional is to be governed by anything like the rule of conditional proof.
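The deductive form of the problem is vivid in C. I. Lewis's own independent argument for the deduction of B from A, ∼A (our rendering):

  1. A          (premiss)
  2. A ∨ B      (from 1, by ∨-introduction)
  3. ∼A         (premiss)
  4. B          (from 2 and 3, by disjunctive syllogism)

The Anderson and Belnap response is to reject step 4, disjunctive syllogism itself. An alternative, in the deductive spirit just described, is to keep each step but restrict the unrestricted transitivity of deduction that allows the steps to be chained together.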
Logic Journal of the IGPL / Bulletin of the IGPL | 2002
Neil Tennant
The system of natural deduction that originated with Gentzen (1934–5), and for which Prawitz (1965) proved a normalization theorem, is re-cast so that all elimination rules are in parallel form. This enables one to prove a very exigent normalization theorem. The normal forms that it provides have all disjunction-eliminations as low as possible, and have no major premisses for eliminations standing as conclusions of any rules. Normal natural deductions are isomorphic to cut-free, weakening-free sequent proofs. This form of normalization theorem renders unnecessary Gentzen’s resort to sequent calculi in order to establish the desired metalogical properties of his logical system. Ultimate normal forms are well-adapted to the needs of the computational logician, affording valuable constraints on proof-search. They also provide an analysis of deductive relevance. There is a deep isomorphism between natural deductions and sequent proofs in the relevantized system.
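For illustration, here is the parallel (or ‘general’) form of conjunction-elimination, in our rendering: from a major premiss A & B, together with a derivation of C from the assumptions A and B, infer C, discharging those assumptions. The familiar serial rules are recovered as the special cases C := A and C := B. In this format every elimination behaves like ∨-elimination, and the requirement that major premisses of eliminations stand proud as assumptions, never as conclusions of other rules, is what yields the exacting normal forms described above.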
Review of Symbolic Logic | 2012
Neil Tennant
The motivation for Core Logic is explained. Its system of proof is set out. It is then shown that, although the system has no Cut rule, its relation of deducibility obeys Cut with epistemic gain.
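In outline (our paraphrase of the property, not the paper's exact statement): given a core proof of the sequent Δ : A and a core proof of A, Γ : θ, one can effectively find a core proof of some subsequent of Δ, Γ : θ, that is, a proof of Θ : θ, or even of Θ : ⊥, for some Θ ⊆ Δ ∪ Γ. The epistemic gain is that the target of the eliminated cut may be overshot: the resulting proof may use fewer premisses, or reveal the premisses to be outright inconsistent, rather than merely delivering θ from all of Δ ∪ Γ.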
Journal of Applied Non-Classical Logics | 1997
Neil Tennant
The well-known AGM theory of theory-contraction and theory-revision, due to Alchourrón, Gärdenfors and Makinson, relies heavily on the so-called postulate of recovery. This postulate is supposed to capture the requirement of “minimum mutilation”; but it does not. Recovery can be satisfied even when there is more mutilation than is necessary. Recovery also ensures that very often too little is given up in a contraction. In this paper I bring out clearly the deficiencies of the AGM-theory in these two regards, showing how it is doubly off-beam. I show that some of the most serious inadequacies of the AGM-theory derive from early claims in some of its founding contributions, claims that have not been seriously questioned within the tradition since. The upshot of these investigations is that recovery cannot, and should not, be recovered. Theory contraction is hysteretic. Whether the AGM-theory can now recover is a good question.
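For reference, the postulate at issue, in the usual AGM notation (with K − A the contraction of the theory K with respect to A, and + expansion, so that (K − A) + A = Cn((K − A) ∪ {A})):

  (Recovery)  K ⊆ (K − A) + A

That is, everything given up in contracting by A must flow back once A is re-adopted.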
Journal of Philosophical Logic | 2006
Neil Tennant
AGM-theory, named after its founders Carlos Alchourrón, Peter Gärdenfors and David Makinson, is the leading contemporary paradigm in the theory of belief-revision. The theory is reformulated here so as to deal with the central relational notions ‘J is a contraction of K with respect to A’ and ‘J is a revision of K with respect to A’. The new theory is based on a principal-case analysis of the domains of definition of the three main kinds of theory-change (expansion, contraction and revision). The new theory is stated by means of introduction and elimination rules for the relational notions. In this new setting one can re-examine the relationship between contraction and revision, using the appropriate versions of the so-called Levi and Harper identities. Among the positive results are the following. One can derive the extensionality of contraction and revision, rather than merely postulating it. Moreover, one can demonstrate the existence of revision-functions satisfying a principle of monotonicity. The full set of AGM-postulates for revision-functions allows for completely bizarre revisions. This motivates a Principle of Minimal Bloating, which needs to be stated as a separate postulate for revision. Moreover, contractions obtained in the usual way from the bizarre revisions, by using the Harper identity, satisfy Recovery. This provides a new reason (in addition to several others already adduced in the literature) for thinking that the contraction postulate of Recovery fails to capture the Principle of Minimal Mutilation. So the search is still on for a proper explication of the notion of minimal mutilation, to do service in both the theory of contraction and the theory of revision. The new relational formulation of AGM-theory, based on principal-case analysis, shares with the original, functional form of AGM-theory the idealizing assumption that the belief-sets of rational agents are to be modelled as consistent, logically closed sets of sentences. The upshot of the results presented here is that the new relational theory does a better job of making important matters clear than does the original functional theory. A new setting has been provided within which one can profitably address two pressing questions for AGM-theory: (1) how is the notion of minimal mutilation (by both contractions and revisions) best analyzed? and (2) how is one to rule out unnecessary bloating by revisions?
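For reference, the functional forms of the two identities mentioned (the paper works with their relational analogues), where −, * and + are contraction, revision and expansion respectively:

  (Levi)    K * A = (K − ∼A) + A
  (Harper)  K − A = K ∩ (K * ∼A)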