We compare Tarski’s notion of logical consequence (preservation of truth) with that of Prawitz (transformability of warrants for assertion). The latter is our point of departure for a definition of consequence in terms of the transformability of truthmakers (verifications) relative to all models. A sentence’s Tarskian truth-in-M coincides with its having an M-relative truthmaker. An M-relative truthmaker serves as a winning strategy or game plan for player T in the ‘material game’ played on that sentence against the background of the model M. We enter conjectures about soundness and completeness of Classical Core Logic with respect to the notion of consequence that results when the domain is required to be decidable. We consider whether the truthmaker semantics threatens a slide to realism. We work with examples of core proofs whose premises are given M-relative truthmakers; and show how these can be systematically transformed into a truthmaker for the proof’s conclusion.
This is the first logically precise, computationally implementable, book-length account of rational belief revision. It explains how a rational agent ought to proceed when adopting a new belief: a difficult matter if the new belief contradicts the agent's old beliefs. Belief systems are modeled as finite dependency networks. So one can attend not only to what the agent believes, but also to the variety of reasons the agent has for so believing. The computational complexity of the revision problem is characterized. Algorithms for belief revision are formulated, and implemented in Prolog. The implementation tests well on a range of simple belief-revision problems that pose a variety of challenges for any account of belief revision. The notion of 'minimal mutilation' of a belief system is explicated precisely for situations when the agent is faced with conflicting beliefs. The proposed revision methods are invariant across different global justificatory structures (foundationalist, coherentist, etc.). They respect the intuition that, when revising one's beliefs, one should not hold on to any belief that has lost all its former justifications. The limitation to finite dependency networks is shown not to compromise theoretical generality. This account affords a novel way to argue that there is an inviolable core of logical principles. These principles, which form the system of Core Logic, cannot be given up, on pain of not being able to carry out the reasoning involved in rationally revising beliefs. The book ends by comparing and contrasting the new account with some major representatives of earlier alternative approaches, from the fields of formal epistemology, artificial intelligence and mathematical logic.
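The intuition that no belief should survive the loss of all its former justifications can be illustrated with a toy sketch. This is not the book's Prolog implementation; it is a minimal, hypothetical Python rendering of a finite dependency network, where each justificatory step is a pair (premises, conclusion), and contraction cascades until every surviving non-axiomatic belief retains at least one intact justification.

```python
# Illustrative sketch only, not the book's algorithm. The network encoding
# (a set of beliefs plus a list of (premises, conclusion) steps) is a
# hypothetical simplification of a finite dependency network.

def contract(beliefs, steps, discard):
    """Give up `discard`, then repeatedly give up any belief whose
    every justifying step has lost a premise. Beliefs with no inward
    steps are treated as axiomatic and stand on their own."""
    surviving = set(beliefs) - {discard}
    changed = True
    while changed:
        changed = False
        for b in list(surviving):
            justifications = [prems for prems, concl in steps if concl == b]
            if justifications and not any(set(p) <= surviving for p in justifications):
                surviving.remove(b)  # lost all its former justifications
                changed = True
    return surviving

# 'c' is believed only because of 'a' and 'b' together; giving up 'a'
# therefore forces giving up 'c' as well.
print(sorted(contract({"a", "b", "c"}, [(("a", "b"), "c")], "a")))  # ['b']
```

The cascade implements minimal mutilation only in the crude sense that nothing is given up except beliefs stripped of all support; the book's actual methods are considerably more refined.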
University of Pittsburgh Press eBooks, Jan 15, 1994
… Other important Kantian synthetic a priori truths were discredited and demoted to a … and insufficiency, from verbal solutions, from bad a priori reasons, from fixed principles, closed systems … Being nothing essentially new, it harmonizes with many ancient philosophic tendencies. …
The logicizing trend of Dedekind and Frege arose in the wake of the arithmetization of analysis begun by Gauss and consummated by Weierstraß. In Kantian terms, these mathematicians and foundationalists sought to account for number theory as the product of our faculty of understanding, and not of a priori forms of intuition. The discussion turns to Frege’s class theory and his ill-fated Basic Law V, which was prey to Russell’s Paradox; and describes the later development of modern set theory as the ‘carrier theory’ for mathematical foundations.
Numerosity statements are statements of the form ‘There are exactly n Φs’, where the number-word in place of n is understood adjectivally in English. A numerosity statement of this form does not commit one to the existence of the natural number n, or of any numbers at all. The question is how best to regiment these numerosity statements in first-order logical notation. Four different methods for doing so are presented. Each method involves so defining a regimenting formal sentence that, given any natural number n, one can effectively determine which formal sentence at first order regiments the numerosity statement, in English, that there are exactly n Φs. The four different regimentations, which are abbreviated as ∇nxΦ(x), ♢nxΦ(x), ƱnxΦ(x), and ♡nxΦ(x), are interdeducible.
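The abstract does not spell out the four regimentations, but the general idea can be illustrated with the most familiar textbook method (not necessarily one of the four): for each n, ‘There are exactly n Φs’ is rendered by a first-order sentence asserting n pairwise distinct Φs and no others. For n = 2:

```latex
\exists x\,\exists y\,
  \bigl(\Phi x \;\wedge\; \Phi y \;\wedge\; x \neq y
        \;\wedge\; \forall z\,(\Phi z \rightarrow z = x \vee z = y)\bigr)
```

Note that the sentence quantifies only over the Φs themselves; no numeral occurs in it, which is why such regimentations carry no commitment to numbers as objects.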
The chapter seeks to show how rational numbers—fractions—are entities to which rational thinkers are logically entitled. It begins by introducing mereology as part of our conceptual and methodological framework for eventually attaining a grasp of fractions. The basic concepts of mereology are laid out, with their formal notations. The treatment of identity is discussed. Mereology is the theory of the part-whole relation, and is of very low consistency strength; indeed, it is arguably analytically true. Mereological fusion is treated here as an abstraction operator, in this study’s preferred single-barreled fashion, with introduction and elimination rules in free Core Logic. The notion of a dimensional magnitude (such as length) is introduced, along with the correlative notion of dimensional equality (such as line-segment congruence). One can define whole-number multiples of dimensional magnitudes (such as lengths of line segments; or ‘identical’ pizzas). This enables one to address the problem of characterizing what constitutes an equal share of p dimensionally equal items when these are to be divvied up into q dimensionally equal shares. An easy application of fractional abstraction then delivers the rational number p/q. The introduction and elimination rules for fractional abstraction furnish easy proof of all instances of Schema Q. One learns too why q cannot be 0, and why the rational number n/1 is the natural number n.
Natural Logicism is a new species of logicism. It is based on Gentzenian rules of natural deduction, including ones governing logico-mathematical expressions that Gentzen himself did not treat. Natural Logicism could be developed for any branch of mathematics. The aim would be to determine how much of its foundation is ‘logical’, or analytic; and in what sense the objects dealt with might be logical objects. Here the focus is on Natural Logicism for the numbers. The two main traditional aims of logicism are pursued. The numbers are shown to be ‘logical objects’, and deeper principles about them are formulated from which one can then deduce as theorems the usual ‘first principles’ that mathematicians lay down. This requires that one develop the free logic for number-abstraction operators (natural, rational, and real). The logical treatment of these operators can be called ‘single-barreled abstractionism’. It is an important advance on, and is in contrast with, the ‘double-barreled’ abstractionism of other neo-Fregeans. Another contrast with these others is that the underlying logic employed—Core Logic—is not only free but also constructive and relevant. An important methodological condition of adequacy is imposed on the natural logicist account: it must show how the different kinds of number are applicable in our wider thought and talk about the world.
Contemporary foundationalists have examined extensively whether certain results in a branch of mathematics depend for their proof on principles governing notions not embedded in those results themselves. A pure proof of a theorem ϕ is one whose premises are axioms embedding only non-logical primitives that are embedded in ϕ. With reference to famous results in number theory (such as the Prime Number Theorem, Dirichlet’s Theorem, and Fermat’s Last Theorem) and to foundational results about Gödelian incompleteness of theories, it is questioned whether Bolzano was justified in insisting on purity of proof. The investigation therefore underscores, by tacit implication, the likelihood that any ‘purely logical’ re-capture of various mathematical theories is bound to be of rather limited extent. This lends heightened interest to the question of what, exactly, the ‘purely logical’ parts of these theories are.
This chapter states and explains all the formal rules of inference that are involved in the Constructive Logicist account of the natural numbers. Natural-deduction rules of introduction and elimination govern the primitives 0, s, and #, as well as various pasigraphs (such as Nx, for ‘x is a natural number’) that are inferentially definable in terms of the primitives. We set out important inferences about 1‒1 mappings, and define ancestrals of one-place functions by means of special introduction and elimination rules. The ancestral of the successor function affords a definition of Nx. All the reasoning involving these notions, by means of these rules, will be both constructive and relevant.
The investigations in Chapter 18 revealed that there is no dispositive argument against the exercise of geometric intuition in the logico-genesis of real analysis. This chapter sets about resisting that downgrading of geometric intuition. For it is crucially involved in one’s attaining a grasp of the real numbers in a form that is enlightened enough to meet the four conditions of adequacy on which this work has laid such stress. The concern here is to identify exactly how much has to be vouchsafed by synthetic a priori intuition in order for logic to ‘do the rest’ in ensuring grasp of the reals. The view to be developed is that a very simple and elemental geometric intuition is all one needs. Moreover, it is one that is not shaken by the development of non-Euclidean geometries. The intuition is this. Any sequence of points on a line that proceeds strictly left-to-right and all of whose points are to the left of some bound has a leftmost right bound. This is an a priori intuitive conviction that finds no expression in Euclid. But it would be shared immediately by any Euclidean who is brought to understand the proposition involved. (It is implicit in Archimedes’ famous method of exhaustion.) What the de-geometrizers failed to anticipate is how the logical tradition would at last deliver, in the form of Gentzen’s system of natural deduction, a way of ensuring absolute formal rigor in one’s deductive reasoning. The logico-geneticist can take the aforementioned continuity principle as axiomatic, and then proceed with purely logical reasoning. One will be able to see how and why one can generate infinite bicimals (say) as canonical representations of real numbers; and why it is that one talks, nowadays almost unwittingly, of the real line.
The chapter closes by revisiting Bolzano and Dedekind, and examining the contributions of Cauchy and Weierstraß in laying a more rigorous foundation for real analysis. It is a belated irony that ‘Weierstraß’sche Strenge’ can be co-opted by the natural logicist so as to ensure the ultimate foundational rigor of an elemental intuition about the continuity of a line. For it was Weierstraß who ensured genuine rigor in defining central mathematical concepts by using iterated quantifiers. All one needs thereafter is Gentzenian rigor in unpacking them.
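In modern order-theoretic dress (a gloss, not the author's own formulation), the ‘leftmost right bound’ intuition is the least-upper-bound principle: every non-empty set of points bounded on the right has a least right bound. With ≤ as the left-to-right ordering of the line:

```latex
\varnothing \neq S \;\wedge\; \exists b\,\forall x\,(x \in S \rightarrow x \le b)
\;\;\longrightarrow\;\;
\exists \ell\,\bigl(\forall x\,(x \in S \rightarrow x \le \ell)
\;\wedge\; \forall b\,(\forall x\,(x \in S \rightarrow x \le b) \rightarrow \ell \le b)\bigr)
```

Taking this single principle as axiomatic, the iterated-quantifier definitions in the Weierstraßian style can then be unpacked by purely Gentzenian means.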
This chapter picks up the threads of the discussion in Chapter 23 of bicimal expansions for points on a line. There is a natural way of ordering such expansions. One defines a progressive matrix as a list of non-decreasing bicimals. Every progressive matrix β has a characteristic bicimal expansion, which will be called β∞. The matrix β of bicimal expansions β1, β2, … for a strictly increasing sequence of points γ1, γ2, … is a progressive matrix. The expansion β∞ is that for the right limit point of the point-sequence γ1, γ2, … It is shown that every canonical bicimal expansion α corresponds to the limit point of an increasing sequence of points γ1, γ2, …, where each γi is the point whose bicimal representation is given by the first i digits of α (followed thereafter by 0s). Distinct geometric points receive distinct bicimal expansions. To every bicimal expansion there corresponds a unique point, namely, the limit of the increasing sequence of points that correspond to the ever longer ‘finite truncations’ of that bicimal expansion. To distinct bicimal expansions there correspond distinct such limit points. The author wraps up and hangs up his spurs after explaining the operation of addition on (dimensionless) real numbers in terms of a notion of addition-of-line-segments as the ur-operation, in geometric intuition.
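The finite truncations of a bicimal can be computed exactly. The following is an illustrative sketch (reading a bicimal as a binary expansion in [0, 1); the function name is hypothetical): the truncations form a non-decreasing, bounded sequence whose limit is the value of the full expansion.

```python
# Illustrative sketch: each truncation is the dyadic rational named by the
# first n digits of the bicimal, followed thereafter by 0s.
from fractions import Fraction

def truncation_value(bicimal: str, n: int) -> Fraction:
    """Exact value of the point whose bicimal expansion is the first
    n digits of `bicimal`, padded with 0s."""
    return sum((Fraction(int(d), 2 ** (i + 1))
                for i, d in enumerate(bicimal[:n])), Fraction(0))

alpha = "1010101010"  # an initial segment of the bicimal for 2/3
gammas = [truncation_value(alpha, n) for n in range(1, len(alpha) + 1)]
assert all(g1 <= g2 for g1, g2 in zip(gammas, gammas[1:]))  # non-decreasing
print(gammas[-1])  # a dyadic rational approaching 2/3 from below
```

Since each truncation appends a digit 0 or 1, the sequence can never decrease, which is exactly what makes the associated matrix progressive.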
Gentzen’s pioneering method of natural deduction had a considerable impact. There are important meaning-theoretic ideas underlying its design. Gentzen’s rules of natural deduction afforded a clear logical distinction between constructive and non-constructive reasoning. The rules of Core Logic and its classicized extension are explained. The Core systems capture the further logical property of relevance (between premises and conclusions of proofs) that is a basic feature of all mathematical reasoning. The details of free logic are laid out. It is vital for the proper handling of various singular terms in mathematics that fail to denote. Important examples of such terms are ones formed by variable-binding operators—definite descriptive terms, number-abstraction terms, and set-abstraction terms. Rules of natural deduction are formulated, governing the introduction and elimination of these abstraction operators in the context of canonical identity statements. These are the single-barreled rules on which this study lays great stress.
Papers by Neil Tennant