
Natural deduction

In logic and proof theory, natural deduction is a kind of proof calculus in which logical reasoning is expressed by inference rules closely related to the "natural" way of reasoning.[1] This contrasts with Hilbert-style systems, which instead use axioms as much as possible to express the logical laws of deductive reasoning.

History

Natural deduction grew out of a context of dissatisfaction with the axiomatizations of deductive reasoning common to the systems of Hilbert, Frege, and Russell (see, e.g., Hilbert system). Such axiomatizations were most famously used by Russell and Whitehead in their mathematical treatise Principia Mathematica. Spurred on by a series of seminars in Poland in 1926 by Łukasiewicz that advocated a more natural treatment of logic, Jaśkowski made the earliest attempts at defining a more natural deduction, first in 1929 using a diagrammatic notation, and later updating his proposal in a sequence of papers in 1934 and 1935.[2] His proposals led to different notations such as Fitch-style calculus (or Fitch's diagrams) and Suppes' method, for which Lemmon gave a variant called system L.

Natural deduction in its modern form was independently proposed by the German mathematician Gerhard Gentzen in 1933, in a dissertation delivered to the faculty of mathematical sciences of the University of Göttingen.[3] The term natural deduction (or rather, its German equivalent natürliches Schließen) was coined in that paper.

Gentzen was motivated by a desire to establish the consistency of number theory. He was unable to prove the main result required for the consistency result, the cut elimination theorem—the Hauptsatz—directly for natural deduction. For this reason he introduced his alternative system, the sequent calculus, for which he proved the Hauptsatz both for classical and intuitionistic logic. In a series of seminars in 1961 and 1962 Prawitz gave a comprehensive summary of natural deduction calculi, and transported much of Gentzen's work with sequent calculi into the natural deduction framework. His 1965 monograph Natural deduction: a proof-theoretical study[5] was to become a reference work on natural deduction, and included applications for modal and second-order logic.

In natural deduction, a proposition is deduced from a collection of premises by applying inference rules repeatedly. The system presented in this article is a minor variation of Gentzen's or Prawitz's formulation, but with a closer adherence to Martin-Löf's description of logical judgments and connectives.[6]

History of notation styles

Natural deduction has been presented in a large variety of notation styles,[7] which can make a proof difficult to recognize for readers unfamiliar with the style in use. The § Notation section explains how to read all the notation that this article actually uses. The present section traces the historical evolution of notation styles, most of which cannot be shown here because no illustrations are available under a public copyright license; the reader is referred to the SEP and IEP for pictures.

  • Gentzen invented natural deduction using tree-shaped proofs – see § Gentzen's tree notation for details.
  • Jaśkowski changed this to a notation that used various nested boxes.[7]
  • Fitch changed Jaśkowski's method of drawing the boxes, creating Fitch notation.[7]
  • 1940: In a textbook, Quine[8] indicated antecedent dependencies by line numbers in square brackets, anticipating Suppes' 1957 line-number notation.
  • 1950: In a textbook, Quine (1982, pp. 241–255) demonstrated a method of using one or more asterisks to the left of each line of proof to indicate dependencies. This is equivalent to Kleene's vertical bars. (It is not totally clear if Quine's asterisk notation appeared in the original 1950 edition or was added in a later edition.)
  • 1957: Suppes (1999, pp. 25–150) gave a textbook introduction to practical logic theorem proving, indicating dependencies (i.e. antecedent propositions) by line numbers at the left of each line.
  • 1963: Stoll (1979, pp. 183–190, 215–219) uses sets of line numbers to indicate antecedent dependencies of the lines of sequential logical arguments based on natural deduction inference rules.
  • 1965: The entire textbook by Lemmon (1965) is an introduction to logic proofs using a method based on that of Suppes, now known as Suppes–Lemmon notation.
  • 1967: In a textbook, Kleene (2002, pp. 50–58, 128–130) briefly demonstrated two kinds of practical logic proofs, one system using explicit quotations of antecedent propositions on the left of each line, the other system using vertical bar-lines on the left to indicate dependencies.[a]

Notation

The reader should be familiar with common symbols for logical connectives. Here is a table with the most common notational variants.

Notational variants of the connectives[9][10]
Connective     Symbols
AND            ∧, ·, &
equivalent     ≡, ⇔, ↔
implies        ⊃, ⇒, →
NAND           ⊼, ↑, |
nonequivalent  ≢, ⇎, ↮
NOR            ⊽, ↓
NOT            ¬, −, ~, !
OR             ∨, +, ∥
XNOR           ⊙
XOR            ⊕, ⊻
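On the two truth values, these connectives are ordinary truth functions; the following Python sketch spells out the less familiar ones (the function names are our own; conjunction, disjunction, and negation are Python's built-in and, or, not):

```python
# Truth-functional readings of the connectives on bool.
def impl(p: bool, q: bool) -> bool:   # material implication
    return (not p) or q

def nand(p: bool, q: bool) -> bool:   # negated conjunction
    return not (p and q)

def nor(p: bool, q: bool) -> bool:    # negated disjunction
    return not (p or q)

def xor(p: bool, q: bool) -> bool:    # true when exactly one argument is true
    return p != q

def xnor(p: bool, q: bool) -> bool:   # negation of xor; the "equivalent" connective
    return p == q
```

On bool, xnor coincides with the biconditional (equivalence).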

Gentzen's tree notation

Gentzen, who invented natural deduction, had his own notation style for arguments. This will be exemplified by a simple argument in propositional logic: "if it's raining then it's cloudy; it is raining; therefore it's cloudy", an instance of modus ponens. Writing P for "it is raining" and Q for "it is cloudy", and representing this as a list of propositions, as is common, we would have:

P → Q
P
∴ Q

In Gentzen's notation,[7] this would be written like this:

 P → Q    P
────────────
     Q

The premises are shown above a line, called the inference line,[11][12] separated by a comma, which indicates combination of premises.[13] The conclusion is written below the inference line.[11] The inference line represents syntactic consequence,[11] sometimes called deductive consequence,[14] which is also symbolized with ⊢.[15][14] So the above can also be written in one line as P → Q, P ⊢ Q. (The turnstile, for syntactic consequence, is of a higher level than the comma, which represents premise combination, which in turn is of a higher level than the arrow, used for material implication; so no parentheses are needed to interpret this formula.)[13]
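Such a one-line sequent can also be checked mechanically in a proof assistant. For instance, writing P for "it is raining" and Q for "it is cloudy", the following Lean sketch is accepted:

```lean
-- P → Q, P ⊢ Q : modus ponens as a single application step
example (P Q : Prop) (h : P → Q) (hp : P) : Q := h hp
```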

Syntactic consequence is contrasted with semantic consequence,[16] which is symbolized with ⊧.[15][14] In this case, the conclusion follows syntactically because natural deduction is a syntactic proof system, which assumes inference rules as primitives.

Gentzen's style will be used in much of this article. Gentzen's discharging annotations used to internalise hypothetical judgments can be avoided by representing proofs as a tree of sequents Γ ⊢ A instead of a tree of A true judgments.

Suppes–Lemmon notation

Many textbooks use Suppes–Lemmon notation,[7] so this article will also give that – although as of now, this is only included for propositional logic, and the rest of the coverage is given only in Gentzen style. A proof, laid out in accordance with the Suppes–Lemmon notation style, is a sequence of lines containing sentences,[17] where each sentence is either an assumption, or the result of applying a rule of proof to earlier sentences in the sequence.[17] Each line of proof is made up of a sentence of proof, together with its annotation, its assumption set, and the current line number.[17] The assumption set lists the assumptions on which the given sentence of proof depends, which are referenced by the line numbers.[17] The annotation specifies which rule of proof was applied, and to which earlier lines, to yield the current sentence.[17] Here's an example proof:

Suppes–Lemmon style proof (first example)
Assumption set  Line number  Sentence of proof  Annotation
1               1            P → Q              A
2               2            ¬Q                 A
3               3            P                  A
1, 3            4            Q                  1, 3 →E
1, 2            5            ¬P                 2, 4 RAA (3)

This proof will become clearer when the inference rules and their appropriate annotations are specified – see § Propositional inference rules (Suppes–Lemmon style).

Propositional language syntax

This section defines the formal syntax for a propositional logic language, contrasting the common ways of doing so with a Gentzen-style way of doing so.

Common definition styles

The formal language of a propositional calculus is usually defined by a recursive definition, such as this one, from Bostock:[18]

  1. Each sentence-letter is a formula.
  2. "⊤" and "⊥" are formulae.
  3. If φ is a formula, so is ¬φ.
  4. If φ and ψ are formulae, so are (φ ∧ ψ), (φ ∨ ψ), (φ → ψ), and (φ ↔ ψ).
  5. Nothing else is a formula.

There are other ways of doing it, such as the BNF grammar φ ::= p | ⊤ | ⊥ | ¬φ | (φ ∧ φ) | (φ ∨ φ) | (φ → φ) | (φ ↔ φ).[19][20]

Gentzen-style definition

§ Gentzen's tree notation can be used to give a syntax definition,[20] by introducing the notation "prop" to mean "is a proposition" (in this context, meaning a well-formed formula). The rules of the syntax may be called "formation rules" for the connectives, as contrasted with the "inference rules" of proof. Here's the formation rule for conjunction:

 A prop    B prop
────────────────── ∧F
   (A ∧ B) prop

or, omitting the parentheses, for simplicity:

 A prop    B prop
────────────────── ∧F
    A ∧ B prop

And here are the other formation rules:

 A prop    B prop
────────────────── ∨F
    A ∨ B prop

 A prop    B prop
────────────────── ⊃F
    A ⊃ B prop

  A prop
────────── ¬F
  ¬A prop

────────── ⊥F
  ⊥ prop

The rest of this article will focus on the natural deduction proof system itself, using "true" (or nothing) instead of "prop" to show that it is giving inference rules, rather than formation rules.

Gentzen-style propositional logic

Gentzen-style inference rules

The following is a complete list of primitive inference rules for natural deduction in classical propositional logic:[20]

Rules for classical propositional logic

Introduction rules:

 φ    ψ
──────── ∧I
 φ ∧ ψ

   φ                 ψ
──────── ∨I1     ──────── ∨I2
 φ ∨ ψ             φ ∨ ψ

 [φ]            [φ]
  ⋮              ⋮
  ψ              ⊥
────── ⊃I      ───── ¬I
φ ⊃ ψ           ¬φ

Elimination rules:

 φ ∧ ψ               φ ∧ ψ
─────── ∧E1         ─────── ∧E2
   φ                   ψ

          [φ]   [ψ]
           ⋮     ⋮
 φ ∨ ψ     χ     χ
─────────────────── ∨E
         χ

 φ ⊃ ψ    φ          φ    ¬φ           ⊥
──────────── ⊃E     ───────── ¬E     ───── ⊥E
     ψ                  ⊥              φ

 ¬¬φ
───── ¬¬E (double-negation elimination, the distinctively classical rule)
  φ

This table follows the custom of using Greek letters as schemata, which may range over any formulas, rather than only over atomic propositions. The name of a rule is given to the right of its formula tree. For instance, the first introduction rule is named ∧I, which is short for "conjunction introduction".

Gentzen-style example proofs

As an example of the use of inference rules, consider commutativity of conjunction. If A ∧ B is true, then B ∧ A is true; this derivation can be drawn by composing inference rules in such a fashion that premises of a lower inference match the conclusion of the next higher inference.

 A ∧ B true             A ∧ B true
──────────── ∧E2       ──────────── ∧E1
   B true                 A true
──────────────────────────────────── ∧I
             B ∧ A true

As a second example, consider the derivation of "A ⊃ (B ⊃ (A ∧ B)) true":

 ────── u      ────── w
 A true        B true
─────────────────────── ∧I
     (A ∧ B) true
─────────────────────── ⊃I^w
   B ⊃ (A ∧ B) true
─────────────────────────── ⊃I^u
 A ⊃ (B ⊃ (A ∧ B)) true

This full derivation has no unsatisfied premises; however, sub-derivations are hypothetical. For instance, the derivation of "B ⊃ (A ∧ B) true" is hypothetical with antecedent "A true" (named u).
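Both example derivations can be replayed as proof terms in Lean, where ∧I is the anonymous constructor ⟨·, ·⟩, ∧E1 and ∧E2 are the projections, and each ⊃I is a fun binder discharging a hypothesis. A sketch:

```lean
-- Commutativity of conjunction: from A ∧ B, derive B ∧ A.
example (A B : Prop) (h : A ∧ B) : B ∧ A := ⟨h.right, h.left⟩

-- A ⊃ (B ⊃ (A ∧ B)): two ⊃I steps (discharging u and w), then ∧I.
example (A B : Prop) : A → (B → (A ∧ B)) := fun u w => ⟨u, w⟩
```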

Suppes–Lemmon-style propositional logic

Suppes–Lemmon-style inference rules

Natural deduction inference rules, due ultimately to Gentzen, are given below.[21] There are ten primitive rules of proof, which are the rule assumption, plus four pairs of introduction and elimination rules for the binary connectives, and the rule reductio ad absurdum.[17] Disjunctive Syllogism can be used as an easier alternative to the proper ∨-elimination,[17] and MTT and DN are commonly given rules,[21] although they are not primitive.[17]

List of Inference Rules
Rule Name Alternative names Annotation Assumption set Statement
Rule of Assumptions[21] Assumption[17] A[21][17] The current line number.[17] At any stage of the argument, introduce a proposition as an assumption of the argument.[21][17]
Conjunction introduction Ampersand introduction,[21][17] conjunction (CONJ)[17][22] m, n &I[17][21] The union of the assumption sets at lines m and n.[17] From φ and ψ at lines m and n, infer φ & ψ.[21][17]
Conjunction elimination Simplification (S),[17] ampersand elimination[21][17] m &E[17][21] The same as at line m.[17] From φ & ψ at line m, infer φ and ψ.[17][21]
Disjunction introduction[21] Addition (ADD)[17] m ∨I[17][21] The same as at line m.[17] From φ at line m, infer φ ∨ ψ, whatever ψ may be.[17][21]
Disjunction elimination Wedge elimination,[21] dilemma (DL)[22] j,k,l,m,n ∨E[21] The lines j,k,l,m,n.[21] From φ ∨ ψ at line j, and an assumption of φ at line k, and a derivation of χ from φ at line l, and an assumption of ψ at line m, and a derivation of χ from ψ at line n, infer χ.[21]
Disjunctive Syllogism Wedge elimination (∨E),[17] modus tollendo ponens (MTP)[17] m,n DS[17] The union of the assumption sets at lines m and n.[17] From φ ∨ ψ at line m and ¬φ at line n, infer ψ; from φ ∨ ψ at line m and ¬ψ at line n, infer φ.[17]
Arrow elimination[17] Modus ponendo ponens (MPP),[21][17] modus ponens (MP),[22][17] conditional elimination m, n →E[17][21] The union of the assumption sets at lines m and n.[17] From φ → ψ at line m, and φ at line n, infer ψ.[17]
Arrow introduction[17] Conditional proof (CP),[22][21][17] conditional introduction n, →I (m)[17][21] Everything in the assumption set at line n, excepting m, the line where the antecedent was assumed.[17] From ψ at line n, following from the assumption of φ at line m, infer φ → ψ.[17]
Reductio ad absurdum[21] Indirect Proof (IP),[17] negation introduction (−I),[17] negation elimination (−E)[17] m, n RAA (k)[17] The union of the assumption sets at lines m and n, excluding k (the denied assumption).[17] From a sentence and its denial[b] at lines m and n, infer the denial of any assumption appearing in the proof (at line k).[17]
Double arrow introduction[17] Biconditional definition (Df ↔),[21] biconditional introduction m, n ↔ I[17] The union of the assumption sets at lines m and n.[17] From φ → ψ and ψ → φ at lines m and n, infer φ ↔ ψ.[17]
Double arrow elimination[17] Biconditional definition (Df ↔),[21] biconditional elimination m ↔ E[17] The same as at line m.[17] From φ ↔ ψ at line m, infer either φ → ψ or ψ → φ.[17]
Double negation[21][22] Double negation elimination m DN[21] The same as at line m.[21] From ¬¬φ at line m, infer φ.[21]
Modus tollendo tollens[21] Modus tollens (MT)[22] m, n MTT[21] The union of the assumption sets at lines m and n.[21] From φ → ψ at line m, and ¬ψ at line n, infer ¬φ.[21]
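The interplay of arrow elimination and RAA in these rules can be checked in a proof assistant. For instance, modus tollendo tollens is derivable in Lean (a sketch; Lean's ¬P unfolds to P → False):

```lean
-- MTT: from P → Q and ¬Q, infer ¬P.
example (P Q : Prop) (h1 : P → Q) (h2 : ¬Q) : ¬P :=
  fun hp => h2 (h1 hp)
```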

Suppes–Lemmon-style example proof

Recall that an example proof was already given when introducing § Suppes–Lemmon notation. This is a second example.

Suppes–Lemmon style proof (second example)
Assumption set Line number Sentence of proof Annotation
1 1   A
2 2   A
3 3   A
2, 3 4   2, 3 →E
1, 2, 3 5   1, 4 &E
1, 3 6   2, 5 RAA(2)
2, 3 7   2, 3 RAA(1)

Consistency, completeness, and normal forms

A theory is said to be consistent if falsehood is not provable (from no assumptions) and is complete if every theorem or its negation is provable using the inference rules of the logic. These are statements about the entire logic, and are usually tied to some notion of a model. However, there are local notions of consistency and completeness that are purely syntactic checks on the inference rules, and require no appeals to models. The first of these is local consistency, also known as local reducibility, which says that any derivation containing an introduction of a connective followed immediately by its elimination can be turned into an equivalent derivation without this detour. It is a check on the strength of elimination rules: they must not be so strong that they include knowledge not already contained in their premises. As an example, consider conjunctions. (Here π1 and π2 stand for the derivations of the two premises.)

   π1        π2
 A true    B true
────────────────── ∧I
   A ∧ B true                  π1
────────────────── ∧E1   ⟹   A true
     A true

Dually, local completeness says that the elimination rules are strong enough to decompose a connective into the forms suitable for its introduction rule. Again for conjunctions:

                          π               π
      π              A ∧ B true      A ∧ B true
  A ∧ B true   ⟹   ──────────── ∧E1 ──────────── ∧E2
                       A true          B true
                    ──────────────────────────────── ∧I
                              A ∧ B true

These notions correspond exactly to β-reduction (beta reduction) and η-conversion (eta conversion) in the lambda calculus, using the Curry–Howard isomorphism. By local completeness, we see that every derivation can be converted to an equivalent derivation where the principal connective is introduced. In fact, if the entire derivation obeys this ordering of eliminations followed by introductions, then it is said to be normal. In a normal derivation all eliminations happen above introductions. In most logics, every derivation has an equivalent normal derivation, called a normal form. The existence of normal forms is generally hard to prove using natural deduction alone, though such accounts do exist in the literature, most notably by Dag Prawitz in 1961.[23] It is much easier to show this indirectly by means of a cut-free sequent calculus presentation.
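Under the Curry–Howard reading just mentioned, the conjunction detour is β-reduction for pairs and the expansion is η-expansion; with Python tuples standing in for pairs, the two directions look like this (a sketch):

```python
# Local soundness for conjunction: an introduction (pairing) followed
# immediately by an elimination (projection) reduces away -- beta-reduction.
def beta_detour(x, y):
    return (x, y)[0]      # reduces to just x

# Local completeness: any pair can be decomposed by the two eliminations and
# reassembled by the introduction -- eta-expansion.
def eta_expand(p):
    return (p[0], p[1])   # equal to p itself
```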

First and higher-order extensions

 
Summary of first-order system

The logic of the earlier section is an example of a single-sorted logic, i.e., a logic with a single kind of object: propositions. Many extensions of this simple framework have been proposed; in this section we will extend it with a second sort of individuals or terms. More precisely, we will add a new kind of judgment, "t is a term" (or "t term") where t is schematic. We shall fix a countable set V of variables, another countable set F of function symbols, and construct terms with the following formation rules:

  v ∈ V
─────────
  v term

and

  f ∈ F    t1 term  ⋯  tn term
────────────────────────────────
      f(t1, …, tn) term

For propositions, we consider a third countable set P of predicates, and define atomic predicates over terms with the following formation rule:

  φ ∈ P    t1 term  ⋯  tn term
────────────────────────────────
      φ(t1, …, tn) prop

The first two rules of formation provide a definition of a term that is effectively the same as that defined in term algebra and model theory, although the focus of those fields of study is quite different from natural deduction. The third rule of formation effectively defines an atomic formula, as in first-order logic, and again in model theory.
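These formation rules correspond to a pair of small datatypes; in the Python sketch below the class names are our own:

```python
# First-order terms and atomic formulae as small datatypes.
from dataclasses import dataclass


@dataclass(frozen=True)
class Var:    # a variable is a term
    name: str


@dataclass(frozen=True)
class Fn:     # a function symbol applied to terms is a term
    symbol: str
    args: tuple   # tuple of terms


@dataclass(frozen=True)
class Pred:   # a predicate symbol applied to terms is an atomic formula
    symbol: str
    args: tuple   # tuple of terms


# e.g. the atomic formula P(f(x), y):
sample_atom = Pred("P", (Fn("f", (Var("x"),)), Var("y")))
```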

To these are added a pair of formation rules, defining the notation for quantified propositions; one for universal (∀) and existential (∃) quantification:

   A prop
──────────── ∀F
 ∀x. A prop

   A prop
──────────── ∃F
 ∃x. A prop

The universal quantifier has the introduction and elimination rules:

   [a term]
      ⋮
 [a/x] A true
────────────── ∀I^a
  ∀x. A true

 ∀x. A true    t term
────────────────────── ∀E
     [t/x] A true

The existential quantifier has the introduction and elimination rules:

 [t/x] A true    t term
──────────────────────── ∃I
      ∃x. A true

               ─────── u    ───────────── v
               a term       [a/x] A true
                         ⋮
 ∃x. A true           C true
────────────────────────────── ∃E^{a,u,v}
            C true

In these rules, the notation [t/x] A stands for the substitution of t for every (visible) instance of x in A, avoiding capture.[c] As before the superscripts on the name stand for the components that are discharged: the term a cannot occur in the conclusion of ∀I (such terms are known as eigenvariables or parameters), and the hypotheses named u and v in ∃E are localised to the second premise in a hypothetical derivation. Although the propositional logic of earlier sections was decidable, adding the quantifiers makes the logic undecidable.
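The quantifier rules can likewise be exercised in Lean: instantiating a universal (∀E), packaging a witness (∃I), and unpacking an existential by naming a fresh witness (∃E). A sketch:

```lean
-- ∀E: instantiate a universal statement at a particular term.
example (P : Nat → Prop) (h : ∀ x, P x) : P 0 := h 0

-- ∃I: supply a witness together with a proof of the instance.
example : ∃ n : Nat, n + n = 4 := ⟨2, rfl⟩

-- ∃E: use an existential by naming a fresh witness a and its proof ha.
example (P Q : Nat → Prop) (h : ∃ x, P x) (hpq : ∀ x, P x → Q x) : ∃ x, Q x :=
  match h with
  | ⟨a, ha⟩ => ⟨a, hpq a ha⟩
```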

So far, the quantified extensions are first-order: they distinguish propositions from the kinds of objects quantified over. Higher-order logic takes a different approach and has only a single sort of propositions. The quantifiers have as the domain of quantification the very same sort of propositions, as reflected in the formation rules:

 [p prop]
    ⋮
  A prop
──────────── ∀F
 ∀p. A prop

 [p prop]
    ⋮
  A prop
──────────── ∃F
 ∃p. A prop

A discussion of the introduction and elimination forms for higher-order logic is beyond the scope of this article. It is possible to be in-between first-order and higher-order logics. For example, second-order logic has two kinds of propositions, one kind quantifying over terms, and the second kind quantifying over propositions of the first kind.

Proofs and type theory

The presentation of natural deduction so far has concentrated on the nature of propositions without giving a formal definition of a proof. To formalise the notion of proof, we alter the presentation of hypothetical derivations slightly. We label the antecedents with proof variables (from some countable set V of variables), and decorate the succedent with the actual proof. The antecedents or hypotheses are separated from the succedent by means of a turnstile (⊢). This modification sometimes goes under the name of localised hypotheses. The following diagram summarises the change.

 ──── u1    ──── u2   ⋯   ──── un
  J1         J2             Jn
                ⋮
                J

becomes

 u1:J1, u2:J2, ..., un:Jn ⊢ J

The collection of hypotheses will be written as Γ when their exact composition is not relevant. To make proofs explicit, we move from the proof-less judgment "A true" to a judgment: "π is a proof of (A true)", which is written symbolically as "π : A true". Following the standard approach, proofs are specified with their own formation rules for the judgment "π proof". The simplest possible proof is the use of a labelled hypothesis; in this case the evidence is the label itself.

  u ∈ V
────────── proof-F
 u proof

─────────────────────── hyp
 u:A true ⊢ u : A true

For brevity, we shall leave off the judgmental label true in the rest of this article, i.e., write "Γ ⊢ π : A". Let us re-examine some of the connectives with explicit proofs. For conjunction, we look at the introduction rule ∧I to discover the form of proofs of conjunction: they must be a pair of proofs of the two conjuncts. Thus:

 π1 proof    π2 proof
────────────────────── pair-F
    (π1, π2) proof

 Γ ⊢ π1 : A    Γ ⊢ π2 : B
──────────────────────────── ∧I
   Γ ⊢ (π1, π2) : A ∧ B

The elimination rules ∧E1 and ∧E2 select either the left or the right conjunct; thus the proofs are a pair of projections—first (fst) and second (snd).

   π proof
───────────── fst-F
 fst π proof

 Γ ⊢ π : A ∧ B
──────────────── ∧E1
 Γ ⊢ fst π : A

   π proof
───────────── snd-F
 snd π proof

 Γ ⊢ π : A ∧ B
──────────────── ∧E2
 Γ ⊢ snd π : B

For implication, the introduction form localises or binds the hypothesis, written using a λ; this corresponds to the discharged label. In the rule, "Γ, u:A" stands for the collection of hypotheses Γ, together with the additional hypothesis u.

   π proof
────────────── λ-F
 λu. π proof

 Γ, u:A ⊢ π : B
───────────────────── ⊃I
 Γ ⊢ λu. π : A ⊃ B

 π1 proof    π2 proof
────────────────────── app-F
     π1 π2 proof

 Γ ⊢ π1 : A ⊃ B    Γ ⊢ π2 : A
─────────────────────────────── ⊃E
        Γ ⊢ π1 π2 : B
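Read through Curry–Howard, these proof terms are ordinary programs: pairing is ∧I, the projections are ∧E1 and ∧E2, λ-abstraction is ⊃I, and application is ⊃E. The earlier derivation of A ⊃ (B ⊃ (A ∧ B)) then becomes a curried pairing function; a Python sketch:

```python
# The proof term for A > (B > (A ^ B)): two lambda-abstractions and a pair,
# matching the two implication-introduction steps and the pairing step.
pair_proof = lambda u: lambda w: (u, w)

# Implication elimination is function application; the conjunction
# eliminations are the projections.
def recover_first(u, w):
    return pair_proof(u)(w)[0]
```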

With proofs available explicitly, one can manipulate and reason about proofs. The key operation on proofs is the substitution of one proof for an assumption used in another proof. This is commonly known as a substitution theorem, and can be proved by induction on the depth (or structure) of the second judgment.

Substitution theorem

If Γ ⊢ π1 : A and Γ, u:A ⊢ π2 : B, then Γ ⊢ [π1/u] π2 : B.

So far the judgment "Γ ⊢ π : A" has had a purely logical interpretation. In type theory, the logical view is exchanged for a more computational view of objects. Propositions in the logical interpretation are now viewed as types, and proofs as programs in the lambda calculus. Thus the interpretation of "π : A" is "the program π has type A". The logical connectives are also given a different reading: conjunction is viewed as product (×), implication as the function arrow (→), etc. The differences are only cosmetic, however. Type theory has a natural deduction presentation in terms of formation, introduction and elimination rules; in fact, the reader can easily reconstruct what is known as simple type theory from the previous sections.

The difference between logic and type theory is primarily a shift of focus from the types (propositions) to the programs (proofs). Type theory is chiefly interested in the convertibility or reducibility of programs. For every type, there are canonical programs of that type which are irreducible; these are known as canonical forms or values. If every program can be reduced to a canonical form, then the type theory is said to be normalising (or weakly normalising). If the canonical form is unique, then the theory is said to be strongly normalising. Most non-trivial type theories fail to be normalising, which is a big departure from the logical world. (Recall that almost every logical derivation has an equivalent normal derivation.) To sketch the reason: in type theories that admit recursive definitions, it is possible to write programs that never reduce to a value; such looping programs can generally be given any type. In particular, the looping program has type ⊥, although there is no logical proof of "⊥ true". For this reason, the propositions-as-types, proofs-as-programs paradigm only works in one direction, if at all: interpreting a type theory as a logic generally gives an inconsistent logic.
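The looping programs mentioned above are easy to produce once general recursion is available; a Python sketch of a fixed-point combinator that yields both a terminating program and a program that never returns:

```python
# General recursion via a fixed-point combinator (eta-delayed, so the
# recursive unfolding only happens when the function is applied).
def fix(f):
    return lambda *args: f(fix(f))(*args)

# A terminating use of fix:
factorial = fix(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))

# A looping program: calling loop() recurses forever and never produces a
# value, yet nothing in the typing discipline rules it out.
loop = fix(lambda rec: lambda: rec())
```

Because the looping program inhabits every type, the logic read off from such a language proves everything, which is why the correspondence only works for suitably restricted type theories.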

Example: Dependent Type Theory

Like logic, type theory has many extensions and variants, including first-order and higher-order versions. One branch, known as dependent type theory, is used in a number of computer-assisted proof systems. Dependent type theory allows quantifiers to range over programs themselves. These quantified types are written as Π and Σ instead of ∀ and ∃, and have the following formation rules:

 Γ ⊢ A type    Γ, x:A ⊢ B type
───────────────────────────────── Π-F
       Γ ⊢ Πx:A. B type

 Γ ⊢ A type    Γ, x:A ⊢ B type
───────────────────────────────── Σ-F
       Γ ⊢ Σx:A. B type

These types are generalisations of the arrow and product types, respectively, as witnessed by their introduction and elimination rules.

 Γ, x:A ⊢ π : B
────────────────────── ΠI
 Γ ⊢ λx. π : Πx:A. B

 Γ ⊢ π1 : Πx:A. B    Γ ⊢ π2 : A
───────────────────────────────── ΠE
      Γ ⊢ π1 π2 : [π2/x] B

 Γ ⊢ π1 : A    Γ, x:A ⊢ π2 : B
──────────────────────────────── ΣI
    Γ ⊢ (π1, π2) : Σx:A. B

 Γ ⊢ π : Σx:A. B
────────────────── ΣE1
  Γ ⊢ fst π : A

 Γ ⊢ π : Σx:A. B
───────────────────────── ΣE2
 Γ ⊢ snd π : [fst π/x] B

Dependent type theory in full generality is very powerful: it is able to express almost any conceivable property of programs directly in the types of the program. This generality comes at a steep price — either typechecking is undecidable (extensional type theory), or extensional reasoning is more difficult (intensional type theory). For this reason, some dependent type theories do not allow quantification over arbitrary programs, but rather restrict to programs of a given decidable index domain, for example integers, strings, or linear programs.
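In Lean, for example, a Π-type is written as a dependent function type and a Σ-type as a dependent pair; a minimal sketch (the names zeroBelow and somePair are our own):

```lean
-- Π-type: the result type Fin (n + 1) depends on the argument n.
def zeroBelow : (n : Nat) → Fin (n + 1) :=
  fun n => ⟨0, Nat.succ_pos n⟩

-- Σ-type: a pair whose second component's type depends on the first.
def somePair : (n : Nat) × Fin (n + 1) :=
  ⟨3, zeroBelow 3⟩
```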

Since dependent type theories allow types to depend on programs, a natural question to ask is whether it is possible for programs to depend on types, or any other combination. There are many kinds of answers to such questions. A popular approach in type theory is to allow programs to be quantified over types, also known as parametric polymorphism; of this there are two main kinds: if types and programs are kept separate, then one obtains a somewhat more well-behaved system called predicative polymorphism; if the distinction between program and type is blurred, one obtains the type-theoretic analogue of higher-order logic, also known as impredicative polymorphism. Various combinations of dependency and polymorphism have been considered in the literature, the most famous being the lambda cube of Henk Barendregt.

The intersection of logic and type theory is a vast and active research area. New logics are usually formalised in a general type theoretic setting, known as a logical framework. Popular modern logical frameworks such as the calculus of constructions and LF are based on higher-order dependent type theory, with various trade-offs in terms of decidability and expressive power. These logical frameworks are themselves always specified as natural deduction systems, which is a testament to the versatility of the natural deduction approach.

Classical and modal logics

For simplicity, the logics presented so far have been intuitionistic. Classical logic extends intuitionistic logic with an additional axiom or principle of excluded middle:

For any proposition p, the proposition p ∨ ¬p is true.

This statement is not obviously either an introduction or an elimination; indeed, it involves two distinct connectives. Gentzen's original treatment of excluded middle prescribed one of the following three (equivalent) formulations, which were already present in analogous forms in the systems of Hilbert and Heyting:

─────────────── XM1
  A ∨ ¬A true

 ¬¬A true
────────── XM2
  A true

 ──────── u
 ¬A true
    ⋮
  p true
────────── XM3^{u,p}
  A true

(XM3 is merely XM2 expressed in terms of E.) This treatment of excluded middle, in addition to being objectionable from a purist's standpoint, introduces additional complications in the definition of normal forms.

A comparatively more satisfactory treatment of classical natural deduction in terms of introduction and elimination rules alone was first proposed by Parigot in 1992 in the form of a classical lambda calculus called λμ. The key insight of his approach was to replace a truth-centric judgment A true with a more classical notion, reminiscent of the sequent calculus: in localised form, instead of Γ ⊢ A, he used Γ ⊢ Δ, with Δ a collection of propositions similar to Γ. Γ was treated as a conjunction, and Δ as a disjunction. This structure is essentially lifted directly from classical sequent calculi, but the innovation in λμ was to give a computational meaning to classical natural deduction proofs in terms of a callcc or a throw/catch mechanism seen in LISP and its descendants. (See also: first class control.)
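Python has no first-class continuations, but the throw/catch mechanism mentioned above gives the flavour of this computational reading: raising an exception transfers control to an enclosing handler, abandoning the rest of the computation. A loose sketch (not λμ itself):

```python
# An early exit in the style of throw/catch: raising transfers control
# directly to the enclosing handler, discarding the rest of the computation.
class Throw(Exception):
    def __init__(self, value):
        self.value = value

def double_or_zero(n: int) -> int:
    try:
        if n < 0:
            raise Throw(0)   # "throw": abandon the computation below
        return n * 2
    except Throw as t:       # "catch": resume here with the thrown value
        return t.value
```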

Another important extension was for modal and other logics that need more than just the basic judgment of truth. These were first described, for the alethic modal logics S4 and S5, in a natural deduction style by Prawitz in 1965,[5] and have since accumulated a large body of related work. To give a simple example, the modal logic S4 requires one new judgment, "A valid", that is categorical with respect to truth:

If "A true" under no assumptions of the form "B true", then "A valid".

This categorical judgment is internalised as a unary connective ◻A (read "necessarily A") with the following introduction and elimination rules:

  A valid
─────────── ◻I
  ◻A true

  ◻A true
─────────── ◻E
  A true

Note that the premise "A valid" has no defining rules; instead, the categorical definition of validity is used in its place. This mode becomes clearer in the localised form when the hypotheses are explicit. We write "Ω;Γ ⊢ A true" where Γ contains the true hypotheses as before, and Ω contains valid hypotheses. On the right there is just a single judgment "A true"; validity is not needed here since "Ω ⊢ A valid" is by definition the same as "Ω;⋅ ⊢ A true". The introduction and elimination forms are then:

 Ω;⋅ ⊢ π : A true
──────────────────────── ◻I
 Ω;⋅ ⊢ box π : ◻A true

 Ω;Γ ⊢ π : ◻A true
──────────────────────── ◻E
 Ω;Γ ⊢ unbox π : A true

The modal hypotheses have their own version of the hypothesis rule and substitution theorem.

──────────────────────────────────── valid-hyp
 Ω, u: (A valid) ; Γ ⊢ u : A true

Modal substitution theorem

If Ω;⋅ ⊢ π1 : A true and Ω, u: (A valid) ; Γ ⊢ π2 : C true, then Ω;Γ ⊢ [π1/u] π2 : C true.

This framework of separating judgments into distinct collections of hypotheses, also known as multi-zoned or polyadic contexts, is very powerful and extensible; it has been applied for many different modal logics, and also for linear and other substructural logics, to give a few examples. However, relatively few systems of modal logic can be formalised directly in natural deduction. To give proof-theoretic characterisations of these systems, extensions such as labelling or systems of deep inference are necessary.

The addition of labels to formulae permits much finer control of the conditions under which rules apply, allowing the more flexible techniques of analytic tableaux to be applied, as has been done in the case of labelled deduction. Labels also allow the naming of worlds in Kripke semantics; Simpson (1993) presents an influential technique for converting frame conditions of modal logics in Kripke semantics into inference rules in a natural deduction formalisation of hybrid logic. Stouppa (2004) surveys the application of many proof theories, such as Avron and Pottinger's hypersequents and Belnap's display logic to such modal logics as S5 and B.

Comparison with sequent calculus

The sequent calculus is the chief alternative to natural deduction as a foundation of mathematical logic. In natural deduction the flow of information is bi-directional: elimination rules flow information downwards by deconstruction, and introduction rules flow information upwards by assembly. Thus, a natural deduction proof does not have a purely bottom-up or top-down reading, making it unsuitable for automation in proof search. To address this fact, Gentzen in 1935 proposed his sequent calculus, though he initially intended it as a technical device for clarifying the consistency of predicate logic. Kleene, in his seminal 1952 book Introduction to Metamathematics, gave the first formulation of the sequent calculus in the modern style.[24]

In the sequent calculus all inference rules have a purely bottom-up reading. Inference rules can apply to elements on both sides of the turnstile. (To differentiate from natural deduction, this article uses a double arrow ⇒ instead of the right tack ⊢ for sequents.) The introduction rules of natural deduction are viewed as right rules in the sequent calculus, and are structurally very similar. The elimination rules on the other hand turn into left rules in the sequent calculus. To give an example, consider disjunction; the right rules are familiar:

Γ ⇒ A ───────── ∨R1 Γ ⇒ A ∨ B 
Γ ⇒ B ───────── ∨R2 Γ ⇒ A ∨ B 

On the left:

Γ, u:A ⇒ C Γ, v:B ⇒ C ─────────────────────────── ∨L Γ, w: (A ∨ B) ⇒ C 

Recall the ∨E rule of natural deduction in localised form:

Γ ⊢ A ∨ B Γ, u:A ⊢ C Γ, v:B ⊢ C ─────────────────────────────────────── ∨E Γ ⊢ C 

The proposition A ∨ B, which is the succedent of a premise in ∨E, turns into a hypothesis of the conclusion in the left rule ∨L. Thus, left rules can be seen as a sort of inverted elimination rule. This observation can be illustrated as follows:

natural deduction:

 ────── hyp
   |
   |  elim. rules
   ↓
 ──────  ↑↓ meet
   ↑
   |  intro. rules
   |
 conclusion

sequent calculus:

 ─────────────── init
    ↑        ↑
    |        |
 left rules  right rules
    |        |
     conclusion
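To make the picture concrete, here is a small worked instance, added for illustration in the notation of this article: the sequent A ∨ A ⇒ C with C taken to be A itself. A single use of ∨L closes it, and both premises are initial sequents:

Γ, u:A ⇒ A Γ, v:A ⇒ A ─────────────────────── ∨L Γ, w: (A ∨ A) ⇒ A 

Read bottom-up, ∨L deconstructs the hypothesis w, and each branch immediately reaches init; this meeting point is exactly where, in natural deduction, the hypothesis rule would sit between the ∨E elimination and the conclusion.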

In the sequent calculus, the left and right rules are performed in lock-step until one reaches the initial sequent, which corresponds to the meeting point of elimination and introduction rules in natural deduction. These initial rules are superficially similar to the hypothesis rule of natural deduction, but in the sequent calculus they describe a transposition or a handshake of a left and a right proposition:

────────── init Γ, u:A ⇒ A 

The correspondence between the sequent calculus and natural deduction is a pair of soundness and completeness theorems, which are both provable by means of an inductive argument.

Soundness of ⇒ wrt. ⊢
If Γ ⇒ A, then Γ ⊢ A.
Completeness of ⇒ wrt. ⊢
If Γ ⊢ A, then Γ ⇒ A.

It is clear from these theorems that the sequent calculus does not change the notion of truth, because the same collection of propositions remains true. Thus, one can use the same proof objects as before in sequent calculus derivations. As an example, consider conjunction. The right rule is virtually identical to the introduction rule:

sequent calculus natural deduction
Γ ⇒ π1 : A Γ ⇒ π2 : B ─────────────────────────── ∧R Γ ⇒ (π1, π2) : A ∧ B 
Γ ⊢ π1 : A Γ ⊢ π2 : B ───────────────────────── ∧I Γ ⊢ (π1, π2) : A ∧ B 

The left rule, however, performs some additional substitutions that are not performed in the corresponding elimination rules.

sequent calculus natural deduction
Γ, u:A ⇒ π : C ──────────────────────────────── ∧L1 Γ, v: (A ∧ B) ⇒ [fst v/u] π : C 
Γ ⊢ π : A ∧ B ───────────── ∧E1 Γ ⊢ fst π : A 
Γ, u:B ⇒ π : C ──────────────────────────────── ∧L2 Γ, v: (A ∧ B) ⇒ [snd v/u] π : C 
Γ ⊢ π : A ∧ B ───────────── ∧E2 Γ ⊢ snd π : B 
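The bracketed substitutions in ∧L1 and ∧L2 can be made concrete with a small proof-term representation. The following sketch is an illustration only, not part of the article's formal development; the encoding of proof terms as nested tuples is an assumption. It computes [π1/u] π2 for a language of variables, pairs, and projections:

```python
# Illustrative proof terms for conjunction, encoded as nested tuples
# (this encoding is an assumption, not the article's notation):
#   ('var', u)             a hypothesis variable u
#   ('pair', p, q)         the introduction form (p, q)
#   ('fst', p), ('snd', p) the elimination forms fst p, snd p

def subst(pi1, u, pi2):
    """Compute [pi1/u] pi2: replace occurrences of variable u in pi2."""
    kind = pi2[0]
    if kind == 'var':
        return pi1 if pi2[1] == u else pi2
    if kind == 'pair':
        return ('pair', subst(pi1, u, pi2[1]), subst(pi1, u, pi2[2]))
    if kind in ('fst', 'snd'):
        return (kind, subst(pi1, u, pi2[1]))
    raise ValueError(f'unknown proof term: {kind!r}')

# The ∧L1 rule turns a proof pi that uses u : A into one that uses
# v : A ∧ B, by substituting fst v for u:
pi = ('pair', ('var', 'u'), ('var', 'u'))        # proves A ∧ A from u : A
result = subst(('fst', ('var', 'v')), 'u', pi)
# result == ('pair', ('fst', ('var', 'v')), ('fst', ('var', 'v')))
```

The same function also expresses the cut principle: cutting π1 for u in π2 is exactly subst(pi1, 'u', pi2).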

The kinds of proofs generated in the sequent calculus are therefore rather different from those of natural deduction. The sequent calculus produces proofs in what is known as the β-normal η-long form, which corresponds to a canonical representation of the normal form of the natural deduction proof. If one attempts to describe these proofs using natural deduction itself, one obtains what is called the intercalation calculus (first described by John Byrnes), which can be used to formally define the notion of a normal form for natural deduction.

The substitution theorem of natural deduction takes the form of a structural rule or structural theorem known as cut in the sequent calculus.

Cut (substitution)

If Γ ⇒ π1 : A and Γ, u:A ⇒ π2 : C, then Γ ⇒ [π1/u] π2 : C.

In most well-behaved logics, cut is unnecessary as an inference rule, though it remains provable as a meta-theorem; the superfluousness of the cut rule is usually presented as a computational process, known as cut elimination. This has an interesting application for natural deduction; usually it is extremely tedious to prove certain properties directly in natural deduction because of an unbounded number of cases. For example, consider showing that a given proposition is not provable in natural deduction. A simple inductive argument fails because of rules like ∨E or ⊥E, which can introduce arbitrary propositions. However, we know that the sequent calculus is complete with respect to natural deduction, so it is enough to show this unprovability in the sequent calculus. Now, if cut is not available as an inference rule, then all sequent rules either introduce a connective on the right or the left, so the depth of a sequent derivation is fully bounded by the connectives in the final conclusion. Thus, showing unprovability is much easier, because there are only a finite number of cases to consider, and each case is composed entirely of sub-propositions of the conclusion. A simple instance of this is the global consistency theorem: "⋅ ⊢ ⊥ true" is not provable. In the sequent calculus version, this is manifestly true because there is no rule that can have "⋅ ⇒ ⊥" as a conclusion! Proof theorists often prefer to work on cut-free sequent calculus formulations because of such properties.
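The bounded, bottom-up search that cut-free systems permit can be sketched in a few lines of code. The following is an illustration only: it implements a classical, multiple-succedent variant (in the style of G3c) rather than the single-succedent calculus displayed above, because every rule of that variant is invertible, so greedy decomposition decides provability; the tuple encoding of formulas is likewise an assumption.

```python
# Illustrative backward proof search in a cut-free classical sequent
# calculus (G3c style).  Formula encoding (an assumption):
#   ('atom', name), ('bot',), ('and', A, B), ('or', A, B), ('imp', A, B)

def provable(gamma, delta):
    """Decide the sequent gamma => delta by bottom-up rule application."""
    # init: some atom occurs on both sides of the sequent
    if any(f[0] == 'atom' and f in delta for f in gamma):
        return True
    # a branch with falsum as a hypothesis is closed
    if ('bot',) in gamma:
        return True
    # left rules: decompose the first compound hypothesis
    for i, f in enumerate(gamma):
        rest = gamma[:i] + gamma[i + 1:]
        if f[0] == 'and':
            return provable(rest + [f[1], f[2]], delta)
        if f[0] == 'or':   # two premises, as in the ∨L rule
            return (provable(rest + [f[1]], delta) and
                    provable(rest + [f[2]], delta))
        if f[0] == 'imp':
            return (provable(rest, delta + [f[1]]) and
                    provable(rest + [f[2]], delta))
    # right rules: decompose the first compound conclusion
    for i, f in enumerate(delta):
        rest = delta[:i] + delta[i + 1:]
        if f[0] == 'and':
            return (provable(gamma, rest + [f[1]]) and
                    provable(gamma, rest + [f[2]]))
        if f[0] == 'or':
            return provable(gamma, rest + [f[1], f[2]])
        if f[0] == 'imp':
            return provable(gamma + [f[1]], rest + [f[2]])
    # only atoms (and falsum on the right) remain, and init failed
    return False

A, B = ('atom', 'A'), ('atom', 'B')
assert not provable([], [('bot',)])              # global consistency
assert provable([('and', A, B)], [('and', B, A)])
```

Every recursive call strictly reduces the number of connectives in the sequent, so the search always terminates; this is precisely the bounded-depth property described above, and the failing search for "⋅ ⇒ ⊥" is immediate because no rule applies.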

See also

Notes

  1. ^ A particular advantage of Kleene's tabular natural deduction systems is that he proves the validity of the inference rules for both propositional calculus and predicate calculus. See Kleene 2002, pp. 44–45, 118–119.
  2. ^ To simplify the statement of the rule, the word "denial" here is used in this way: the denial of a formula φ that is not a negation is ¬φ, whereas a negation, ¬φ, has two denials, viz., φ and ¬¬φ.[17]
  3. ^ See the article on lambda calculus for more detail about the concept of substitution.

References

General references

  • Barker-Plummer, Dave; Barwise, Jon; Etchemendy, John (2011). Language Proof and Logic (2nd ed.). CSLI Publications. ISBN 978-1575866321.
  • Gallier, Jean (2005). "Constructive Logics. Part I: A Tutorial on Proof Systems and Typed λ-Calculi". Retrieved 12 June 2014.
  • Gentzen, Gerhard Karl Erich (1934). "Untersuchungen über das logische Schließen. I". Mathematische Zeitschrift. 39 (2): 176–210. doi:10.1007/BF01201353. S2CID 121546341. (English translation Investigations into Logical Deduction in M. E. Szabo. The Collected Works of Gerhard Gentzen. North-Holland Publishing Company, 1969.)
  • Gentzen, Gerhard Karl Erich (1935). "Untersuchungen über das logische Schließen. II". Mathematische Zeitschrift. 39 (3): 405–431. doi:10.1007/bf01201363. S2CID 186239837.
  • Girard, Jean-Yves (1990). Proofs and Types. Cambridge Tracts in Theoretical Computer Science. Cambridge University Press, Cambridge, England. Archived from the original on 2016-07-04. Retrieved 2006-04-20. Translated and with appendices by Paul Taylor and Yves Lafont.
  • Jaśkowski, Stanisław (1934). On the rules of suppositions in formal logic. Reprinted in Polish logic 1920–39, ed. Storrs McCall.
  • Kleene, Stephen Cole (1980) [1952]. Introduction to metamathematics (Eleventh ed.). North-Holland. ISBN 978-0-7204-2103-3.
  • Kleene, Stephen Cole (2009) [1952]. Introduction to metamathematics. Ishi Press International. ISBN 978-0-923891-57-2.
  • Kleene, Stephen Cole (2002) [1967]. Mathematical logic. Mineola, New York: Dover Publications. ISBN 978-0-486-42533-7.
  • Lemmon, Edward John (1965). Beginning logic. Thomas Nelson. ISBN 978-0-17-712040-4.
  • Martin-Löf, Per (1996). "On the meanings of the logical constants and the justifications of the logical laws" (PDF). Nordic Journal of Philosophical Logic. 1 (1): 11–60. Lecture notes to a short course at Università degli Studi di Siena, April 1983.
  • Pfenning, Frank; Davies, Rowan (2001). "A judgmental reconstruction of modal logic" (PDF). Mathematical Structures in Computer Science. 11 (4): 511–540. CiteSeerX 10.1.1.43.1611. doi:10.1017/S0960129501003322. S2CID 16467268.
  • Prawitz, Dag (1965). Natural deduction: A proof-theoretical study. Acta Universitatis Stockholmiensis, Stockholm studies in philosophy 3. Stockholm, Göteborg, Uppsala: Almqvist & Wicksell.
  • Prawitz, Dag (2006) [1965]. Natural deduction: A proof-theoretical study. Mineola, New York: Dover Publications. ISBN 978-0-486-44655-4.
  • Quine, Willard Van Orman (1981) [1940]. Mathematical logic (Revised ed.). Cambridge, Massachusetts: Harvard University Press. ISBN 978-0-674-55451-1.
  • Quine, Willard Van Orman (1982) [1950]. Methods of logic (Fourth ed.). Cambridge, Massachusetts: Harvard University Press. ISBN 978-0-674-57176-1.
  • Simpson, Alex (1993). The proof theory and semantics of intuitionistic modal logic (PDF). University of Edinburgh. PhD thesis.
  • Stoll, Robert Roth (1979) [1963]. Set Theory and Logic. Mineola, New York: Dover Publications. ISBN 978-0-486-63829-4.
  • Stouppa, Phiniki (2004). The Design of Modal Proof Theories: The Case of S5. University of Dresden. CiteSeerX 10.1.1.140.1858. MSc thesis.
  • Suppes, Patrick Colonel (1999) [1957]. Introduction to logic. Mineola, New York: Dover Publications. ISBN 978-0-486-40687-9.
  • Van Dalen, Dirk (2013) [1980]. Logic and Structure. Universitext (5 ed.). London, Heidelberg, New York, Dordrecht: Springer. doi:10.1007/978-1-4471-4558-5. ISBN 978-1-4471-4558-5.

Inline citations

  1. ^ "Natural Deduction | Internet Encyclopedia of Philosophy". Retrieved 2024-05-01.
  2. ^ Jaśkowski 1934.
  3. ^ Gentzen 1934, Gentzen 1935.
  4. ^ Gentzen 1934, p. 176.
  5. ^ Prawitz 1965, Prawitz 2006.
  6. ^ Martin-Löf 1996.
  7. ^ Pelletier, Francis Jeffry; Hazen, Allen (2024), "Natural Deduction Systems in Logic", in Zalta, Edward N.; Nodelman, Uri (eds.), The Stanford Encyclopedia of Philosophy (Spring 2024 ed.), Metaphysics Research Lab, Stanford University, retrieved 2024-03-22
  8. ^ Quine (1981). See particularly pages 91–93 for Quine's line-number notation for antecedent dependencies.
  9. ^ Plato, Jan von (2013). Elements of logical reasoning (1st ed.). Cambridge: Cambridge University Press. p. 9. ISBN 978-1-107-03659-8.
  10. ^ Weisstein, Eric W. "Connective". mathworld.wolfram.com. Retrieved 2024-03-22.
  11. ^ Plato, Jan von (2013). Elements of logical reasoning (1st ed.). Cambridge: Cambridge University Press. pp. 9, 32, 121. ISBN 978-1-107-03659-8.
  12. ^ "Propositional Logic". www.cs.miami.edu. Retrieved 2024-03-22.
  13. ^ Restall, Greg (2018), "Substructural Logics", in Zalta, Edward N. (ed.), The Stanford Encyclopedia of Philosophy (Spring 2018 ed.), Metaphysics Research Lab, Stanford University, retrieved 2024-03-22
  14. ^ "Compactness | Internet Encyclopedia of Philosophy". Retrieved 2024-03-22.
  15. ^ "Lecture Topics for Discrete Math Students". math.colorado.edu. Retrieved 2024-03-22.
  16. ^ Paseau, Alexander; Pregel, Fabian (2023), "Deductivism in the Philosophy of Mathematics", in Zalta, Edward N.; Nodelman, Uri (eds.), The Stanford Encyclopedia of Philosophy (Fall 2023 ed.), Metaphysics Research Lab, Stanford University, retrieved 2024-03-22
  17. ^ Allen, Colin; Hand, Michael (2022). Logic primer (3rd ed.). Cambridge, Massachusetts: The MIT Press. ISBN 978-0-262-54364-4.
  18. ^ Bostock, David (1997). Intermediate logic. Oxford; New York: Clarendon Press; Oxford University Press. p. 21. ISBN 978-0-19-875141-0.
  19. ^ Hansson, Sven Ove; Hendricks, Vincent F. (2018). Introduction to formal philosophy. Springer undergraduate texts in philosophy. Cham: Springer. p. 38. ISBN 978-3-030-08454-7.
  20. ^ Ayala-Rincón, Mauricio; de Moura, Flávio L. C. (2017). Applied Logic for Computer Scientists. Undergraduate Topics in Computer Science. Springer. pp. 2, 20. doi:10.1007/978-3-319-51653-0. ISBN 978-3-319-51651-6.
  21. ^ Lemmon, Edward John (1998). Beginning logic. Boca Raton, FL: Chapman & Hall/CRC. pp. passim, especially 39–40. ISBN 978-0-412-38090-7.
  22. ^ Arthur, Richard T. W. (2017). An introduction to logic: using natural deduction, real arguments, a little history, and some humour (2nd ed.). Peterborough, Ontario: Broadview Press. ISBN 978-1-55481-332-2. OCLC 962129086.
  23. ^ See also his book Prawitz 1965, Prawitz 2006.
  24. ^ Kleene 2009, pp. 440–516. See also Kleene 1980.

External links

  • Laboreo, Daniel Clemente (August 2004). "Introduction to natural deduction" (PDF).
  • "Domino On Acid". Retrieved 10 December 2023. Natural deduction visualized as a game of dominoes
  • Pelletier, Francis Jeffry. "A History of Natural Deduction and Elementary Logic Textbooks" (PDF).
  • "Natural Deduction Systems in Logic" entry by Pelletier, Francis Jeffry; Hazen, Allen in the Stanford Encyclopedia of Philosophy, 29 October 2021
  • Levy, Michel. "A Propositional Prover".

natural, deduction, logic, proof, theory, natural, deduction, kind, proof, calculus, which, logical, reasoning, expressed, inference, rules, closely, related, natural, reasoning, this, contrasts, with, hilbert, style, systems, which, instead, axioms, much, pos. In logic and proof theory natural deduction is a kind of proof calculus in which logical reasoning is expressed by inference rules closely related to the natural way of reasoning 1 This contrasts with Hilbert style systems which instead use axioms as much as possible to express the logical laws of deductive reasoning Contents 1 History 1 1 History of notation styles 2 Notation 2 1 Gentzen s tree notation 2 2 Suppes Lemmon notation 3 Propositional language syntax 3 1 Common definition styles 3 2 Gentzen style definition 4 Gentzen style propositional logic 4 1 Gentzen style inference rules 4 2 Gentzen style example proofs 5 Suppes Lemmon style propositional logic 5 1 Suppes Lemmon style inference rules 5 2 Suppes Lemmon style example proof 6 Consistency completeness and normal forms 7 First and higher order extensions 8 Proofs and type theory 8 1 Substitution theorem 8 2 Example Dependent Type Theory 9 Classical and modal logics 9 1 Modal substitution theorem 10 Comparison with sequent calculus 10 1 Cut substitution 11 See also 12 Notes 13 References 13 1 General references 13 2 Inline citations 14 External linksHistory editNatural deduction grew out of a context of dissatisfaction with the axiomatizations of deductive reasoning common to the systems of Hilbert Frege and Russell see e g Hilbert system Such axiomatizations were most famously used by Russell and Whitehead in their mathematical treatise Principia Mathematica Spurred on by a series of seminars in Poland in 1926 by Lukasiewicz that advocated a more natural treatment of logic Jaskowski made the earliest attempts at defining a more natural deduction first in 1929 using a diagrammatic notation and later updating his proposal in a sequence of papers in 
1934 and 1935 2 His proposals led to different notations such as Fitch style calculus or Fitch s diagrams or Suppes method for which Lemmon gave a variant called system L Natural deduction in its modern form was independently proposed by the German mathematician Gerhard Gentzen in 1933 in a dissertation delivered to the faculty of mathematical sciences of the University of Gottingen 3 The term natural deduction or rather its German equivalent naturliches Schliessen was coined in that paper Ich wollte nun zunachst einmal einen Formalismus aufstellen der dem wirklichen Schliessen moglichst nahe kommt So ergab sich ein Kalkul des naturlichen Schliessens 4 First I wished to construct a formalism that comes as close as possible to actual reasoning Thus arose a calculus of natural deduction Gentzen was motivated by a desire to establish the consistency of number theory He was unable to prove the main result required for the consistency result the cut elimination theorem the Hauptsatz directly for natural deduction For this reason he introduced his alternative system the sequent calculus for which he proved the Hauptsatz both for classical and intuitionistic logic In a series of seminars in 1961 and 1962 Prawitz gave a comprehensive summary of natural deduction calculi and transported much of Gentzen s work with sequent calculi into the natural deduction framework His 1965 monograph Natural deduction a proof theoretical study 5 was to become a reference work on natural deduction and included applications for modal and second order logic In natural deduction a proposition is deduced from a collection of premises by applying inference rules repeatedly The system presented in this article is a minor variation of Gentzen s or Prawitz s formulation but with a closer adherence to Martin Lof s description of logical judgments and connectives 6 History of notation styles edit Natural deduction has had a large variety of notation styles 7 which can make it difficult to recognize a 
proof if you re not familiar with one of them To help with this situation this article has a Notation section explaining how to read all the notation that it will actually use This section just explains the historical evolution of notation styles most of which cannot be shown because there are no illustrations available under a public copyright license the reader is pointed to the SEP and IEP for pictures Gentzen invented natural deduction using tree shaped proofs see Gentzen s tree notation for details Jaskowski changed this to a notation that used various nested boxes 7 Fitch changed Jaskowski method of drawing the boxes creating Fitch notation 7 1940 In a textbook Quine 8 indicated antecedent dependencies by line numbers in square brackets anticipating Suppes 1957 line number notation 1950 In a textbook Quine 1982 pp 241 255 demonstrated a method of using one or more asterisks to the left of each line of proof to indicate dependencies This is equivalent to Kleene s vertical bars It is not totally clear if Quine s asterisk notation appeared in the original 1950 edition or was added in a later edition 1957 An introduction to practical logic theorem proving in a textbook by Suppes 1999 pp 25 150 This indicated dependencies i e antecedent propositions by line numbers at the left of each line 1963 Stoll 1979 pp 183 190 215 219 uses sets of line numbers to indicate antecedent dependencies of the lines of sequential logical arguments based on natural deduction inference rules 1965 The entire textbook by Lemmon 1965 is an introduction to logic proofs using a method based on that of Suppes what is now known as Suppes Lemmon notation 1967 In a textbook Kleene 2002 pp 50 58 128 130 briefly demonstrated two kinds of practical logic proofs one system using explicit quotations of antecedent propositions on the left of each line the other system using vertical bar lines on the left to indicate dependencies a Notation editThe reader should be familiar with common symbols for 
logical connectives Here is a table with the most common notational variants Notational variants of the connectives 9 10 Connective Symbol AND A B displaystyle A land B nbsp A B displaystyle A cdot B nbsp A B displaystyle AB nbsp A amp B displaystyle A amp B nbsp A amp amp B displaystyle A amp amp B nbsp equivalent A B displaystyle A equiv B nbsp A B displaystyle A Leftrightarrow B nbsp A B displaystyle A leftrightharpoons B nbsp implies A B displaystyle A Rightarrow B nbsp A B displaystyle A supset B nbsp A B displaystyle A rightarrow B nbsp NAND A B displaystyle A overline land B nbsp A B displaystyle A mid B nbsp A B displaystyle overline A cdot B nbsp nonequivalent A B displaystyle A not equiv B nbsp A B displaystyle A not Leftrightarrow B nbsp A B displaystyle A nleftrightarrow B nbsp NOR A B displaystyle A overline lor B nbsp A B displaystyle A downarrow B nbsp A B displaystyle overline A B nbsp NOT A displaystyle neg A nbsp A displaystyle A nbsp A displaystyle overline A nbsp A displaystyle sim A nbsp OR A B displaystyle A lor B nbsp A B displaystyle A B nbsp A B displaystyle A mid B nbsp A B displaystyle A parallel B nbsp XNOR A displaystyle A nbsp XNOR B displaystyle B nbsp XOR A B displaystyle A underline lor B nbsp A B displaystyle A oplus B nbsp Gentzen s tree notation edit Gentzen who invented natural deduction had his own notation style for arguments This will be exemplified by a simple argument below Let s say we have a simple example argument in propositional logic such as if it s raining then it s cloudly it is raining therefore it s cloudy This is in modus ponens Representing this as a list of propositions as is common we would have 1 P Q displaystyle 1 P to Q nbsp 2 P displaystyle 2 P nbsp Q displaystyle therefore Q nbsp In Gentzen s notation 7 this would be written like this P Q P Q displaystyle frac P to Q P Q nbsp The premises are shown above a line called the inference line 11 12 separated by a comma which indicates combination of premises 13 
The conclusion is written below the inference line 11 The inference line represents syntactic consequence 11 sometimes called deductive consequence 14 which is also symbolized with 15 14 So the above can also be written in one line as P Q P Q displaystyle P to Q P vdash Q nbsp The turnstile for syntactic consequence is of a higher level than the comma which represents premise combination which in turn is of a higher level than the arrow used for material implication so no parentheses are needed to interpret this formula 13 Syntactic consequence is contrasted with semantic consequence 16 which is symbolized with 15 14 In this case the conclusion follows syntactically because natural deduction is a syntactic proof system which assumes inference rules as primitives Gentzen s style will be used in much of this article Gentzen s discharging annotations used to internalise hypothetical judgments can be avoided by representing proofs as a tree of sequents G A instead of a tree of A true judgments Suppes Lemmon notation edit Many textbooks use Suppes Lemmon notation 7 so this article will also give that although as of now this is only included for propositional logic and the rest of the coverage is given only in Gentzen style A proof laid out in accordance with the Suppes Lemmon notation style is a sequence of lines containing sentences 17 where each sentence is either an assumption or the result of applying a rule of proof to earlier sentences in the sequence 17 Each line of proof is made up of a sentence of proof together with its annotation its assumption set and the current line number 17 The assumption set lists the assumptions on which the given sentence of proof depends which are referenced by the line numbers 17 The annotation specifies which rule of proof was applied and to which earlier lines to yield the current sentence 17 Here s an example proof Suppes Lemmon style proof first example Assumption set Line number Sentence of proof Annotation 1 1 P Q displaystyle 
P to Q nbsp A 2 2 Q displaystyle Q nbsp A 3 3 P displaystyle P nbsp A 1 3 4 Q displaystyle Q nbsp 1 3 E 1 2 5 P displaystyle P nbsp 2 4 RAA This proof will become clearer when the inference rules and their appropriate annotations are specified see Propositional inference rules Suppes Lemmon style Propositional language syntax editMain article Propositional calculus SyntaxThis section defines the formal syntax for a propositional logic language contrasting the common ways of doing so with a Gentzen style way of doing so Common definition styles edit The formal language L displaystyle mathcal L nbsp of a propositional calculus is usually defined by a recursive definition such as this one from Bostock 18 Each sentence letter is a formula displaystyle top nbsp and displaystyle bot nbsp are formulae If f displaystyle varphi nbsp is a formula so is f displaystyle neg varphi nbsp If f displaystyle varphi nbsp and ps displaystyle psi nbsp are formulae so are f ps displaystyle varphi land psi nbsp f ps displaystyle varphi lor psi nbsp f ps displaystyle varphi to psi nbsp f ps displaystyle varphi leftrightarrow psi nbsp Nothing else is a formula There are other ways of doing it such as the BNF grammar ϕ a 1 a 2 ϕ ϕ amp ps ϕ ps ϕ ps ϕ ps displaystyle phi a 1 a 2 ldots neg phi phi amp psi phi vee psi phi rightarrow psi phi leftrightarrow psi nbsp 19 20 Gentzen style definition edit Interestingly Gentzen s tree notation can be used to give a syntax definition 20 by introducing the notation prop to mean is a proposition in this context meaning a well formed formula The rules of the syntax may be called formation rules for the connectives as contrasted with the inference rules of proof Here s the formation rule for conjunction A prop B prop A B prop F displaystyle frac A hbox prop qquad B hbox prop A wedge B hbox prop wedge F nbsp or omitting the parentheses for simplicity A prop B prop A B prop F displaystyle frac A hbox prop qquad B hbox prop A wedge B hbox prop wedge F nbsp 
And here are the other formation rules A prop B prop A B prop F A prop B prop A B prop F displaystyle frac A hbox prop qquad B hbox prop A vee B hbox prop vee F qquad frac A hbox prop qquad B hbox prop A supset B hbox prop supset F nbsp prop F prop F A prop A prop F displaystyle frac hbox top hbox prop top F qquad frac hbox bot hbox prop bot F qquad frac A hbox prop neg A hbox prop neg F nbsp The rest of this article will focus on the natural deduction proof system itself using true or nothing instead of prop to show that it is giving inference rules rather than formation rules Gentzen style propositional logic editGentzen style inference rules edit The following is a complete list of primitive inference rules for natural deduction in classical propositional logic 20 Rules for classical propositional logic Introduction rules Elimination rules f ps f ps i displaystyle begin array c varphi qquad psi hline varphi wedge psi end array wedge i nbsp f ps f e displaystyle begin array c varphi wedge psi hline varphi end array wedge e nbsp f f ps i displaystyle begin array c varphi hline varphi vee psi end array vee i nbsp f ps f u x ps v x x E u v displaystyle cfrac varphi vee psi quad begin matrix varphi u vdots chi end matrix quad begin matrix psi v vdots chi end matrix chi vee E u v nbsp f u ps f ps i u displaystyle begin array c varphi u vdots psi hline varphi to psi end array to i u nbsp f f ps ps e displaystyle begin array c varphi qquad varphi to psi hline psi end array to e nbsp f u f i u displaystyle begin array c varphi u vdots bot hline neg varphi end array neg i u nbsp f f e displaystyle begin array c varphi qquad neg varphi hline bot end array neg e nbsp f u f PBC u displaystyle begin array c neg varphi u vdots bot hline varphi end array text PBC u nbsp This table follows the custom of using Greek letters as schemata which may range over any formulas rather than only over atomic propositions The name of a rule is given to the right of its formula tree For 
instance the first introduction rule is named i displaystyle wedge i nbsp which is short for conjunction introduction Gentzen style example proofs edit As an example of the use of inference rules consider commutativity of conjunction If A B is true then B A is true this derivation can be drawn by composing inference rules in such a fashion that premises of a lower inference match the conclusion of the next higher inference A B true B true E 2 A B true A true E 1 B A true I displaystyle cfrac cfrac A wedge B hbox true B hbox true wedge E2 qquad cfrac A wedge B hbox true A hbox true wedge E1 B wedge A hbox true wedge I nbsp As a second example consider the derivation of A B A B true A true u B true w A B true I B A B true A B A B true I u I w displaystyle cfrac cfrac cfrac A hbox true u quad cfrac B hbox true w A wedge B hbox true wedge I cfrac B supset left A wedge B right hbox true A supset left B supset left A wedge B right right hbox true supset I u supset I w nbsp This full derivation has no unsatisfied premises however sub derivations are hypothetical For instance the derivation of B A B true is hypothetical with antecedent A true named u Suppes Lemmon style propositional logic editSuppes Lemmon style inference rules edit Natural deduction inference rules due ultimately to Gentzen are given below 21 There are ten primitive rules of proof which are the rule assumption plus four pairs of introduction and elimination rules for the binary connectives and the rule reductio ad adbsurdum 17 Disjunctive Syllogism can be used as an easier alternative to the proper elimination 17 and MTT and DN are commonly given rules 21 although they are not primitive 17 List of Inference Rules Rule Name Alternative names Annotation Assumption set Statement Rule of Assumptions 21 Assumption 17 A 21 17 The current line number 17 At any stage of the argument introduce a proposition as an assumption of the argument 21 17 Conjunction introduction Ampersand introduction 21 17 conjunction 
CONJ 17 22 m n amp I 17 21 The union of the assumption sets at lines m and n 17 From f displaystyle varphi nbsp and ps displaystyle psi nbsp at lines m and n infer f amp ps displaystyle varphi amp psi nbsp 21 17 Conjunction elimination Simplification S 17 ampersand elimination 21 17 m amp E 17 21 The same as at line m 17 From f amp ps displaystyle varphi amp psi nbsp at line m infer f displaystyle varphi nbsp and ps displaystyle psi nbsp 17 21 Disjunction introduction 21 Addition ADD 17 m I 17 21 The same as at line m 17 From f displaystyle varphi nbsp at line m infer f ps displaystyle varphi lor psi nbsp whatever ps displaystyle psi nbsp may be 17 21 Disjunction elimination Wedge elimination 21 dilemma DL 22 j k l m n E 21 The lines j k l m n 21 From f ps displaystyle varphi lor psi nbsp at line j and an assumption of f displaystyle varphi nbsp at line k and a derivation of x displaystyle chi nbsp from f displaystyle varphi nbsp at line l and an assumption of ps displaystyle psi nbsp at line m and a derivation of x displaystyle chi nbsp from ps displaystyle psi nbsp at line n infer x displaystyle chi nbsp 21 Disjunctive Syllogism Wedge elimination E 17 modus tollendo ponens MTP 17 m n DS 17 The union of the assumption sets at lines m and n 17 From f ps displaystyle varphi lor psi nbsp at line m and f displaystyle varphi nbsp at line n infer ps displaystyle psi nbsp from f ps displaystyle varphi lor psi nbsp at line m and ps displaystyle psi nbsp at line n infer f displaystyle varphi nbsp 17 Arrow elimination 17 Modus ponendo ponens MPP 21 17 modus ponens MP 22 17 conditional elimination m n E 17 21 The union of the assumption sets at lines m and n 17 From f ps displaystyle varphi to psi nbsp at line m and f displaystyle varphi nbsp at line n infer ps displaystyle psi nbsp 17 Arrow introduction 17 Conditional proof CP 22 21 17 conditional introduction n I m 17 21 Everything in the assumption set at line n excepting m the line where the antecedent was assumed 17 
From ps displaystyle psi nbsp at line n following from the assumption of f displaystyle varphi nbsp at line m infer f ps displaystyle varphi to psi nbsp 17 Reductio ad absurdum 21 Indirect Proof IP 17 negation introduction I 17 negation elimination E 17 m n RAA k 17 The union of the assumption sets at lines m and n excluding k the denied assumption 17 From a sentence and its denial b at lines m and n infer the denial of any assumption appearing in the proof at line k 17 Double arrow introduction 17 Biconditional definition Df 21 biconditional introduction m n I 17 The union of the assumption sets at lines m and n 17 From f ps displaystyle varphi to psi nbsp and ps f displaystyle psi to varphi nbsp at lines m and n infer f ps displaystyle varphi leftrightarrow psi nbsp 17 Double arrow elimination 17 Biconditional definition Df 21 biconditional elimination m E 17 The same as at line m 17 From f ps displaystyle varphi leftrightarrow psi nbsp at line m infer either f ps displaystyle varphi to psi nbsp or ps f displaystyle psi to varphi nbsp 17 Double negation 21 22 Double negation elimination m DN 21 The same as at line m 21 From f displaystyle varphi nbsp at line m infer f displaystyle varphi nbsp 21 Modus tollendo tollens 21 Modus tollens MT 22 m n MTT 21 The union of the assumption sets at lines m and n 21 From f ps displaystyle varphi to psi nbsp at line m and ps displaystyle psi nbsp at line n infer f displaystyle varphi nbsp 21 Suppes Lemmon style example proof edit Recall that an example proof was already given when introducing Suppes Lemmon notation This is a second example Suppes Lemmon style proof second example Assumption set Line number Sentence of proof Annotation 1 1 P Q displaystyle P lor Q nbsp A 2 2 P displaystyle neg P nbsp A 3 3 P Q displaystyle neg P to neg Q nbsp A 2 3 4 Q displaystyle neg Q nbsp 2 3 E 1 2 3 5 P displaystyle P nbsp 1 4 amp E 1 3 6 P displaystyle P nbsp 2 5 RAA 2 2 3 7 P displaystyle neg P nbsp 2 3 RAA 1 Consistency completeness and 
normal forms

A theory is said to be consistent if falsehood is not provable (from no assumptions), and is complete if every theorem or its negation is provable using the inference rules of the logic. These are statements about the entire logic, and are usually tied to some notion of a model. However, there are local notions of consistency and completeness that are purely syntactic checks on the inference rules, and that require no appeals to models. The first of these is local consistency, also known as local reducibility, which says that any derivation containing an introduction of a connective followed immediately by its elimination can be turned into an equivalent derivation without this detour. It is a check on the strength of elimination rules: they must not be so strong that they include knowledge not already contained in their premises. As an example, consider conjunctions:

$$\cfrac{\cfrac{A\ \text{true}^{\,u} \qquad B\ \text{true}^{\,w}}{A \wedge B\ \text{true}}\ \wedge I}{A\ \text{true}}\ \wedge E_1 \quad\Longrightarrow\quad A\ \text{true}^{\,u}$$

Dually, local completeness says that the elimination rules are strong enough to decompose a connective into the forms suitable for its introduction rule. Again for conjunctions:

$$A \wedge B\ \text{true}^{\,u} \quad\Longrightarrow\quad \cfrac{\cfrac{A \wedge B\ \text{true}^{\,u}}{A\ \text{true}}\ \wedge E_1 \qquad \cfrac{A \wedge B\ \text{true}^{\,u}}{B\ \text{true}}\ \wedge E_2}{A \wedge B\ \text{true}}\ \wedge I$$

These notions correspond exactly to β-reduction (beta reduction) and η-conversion (eta conversion) in the lambda calculus, using the Curry–Howard isomorphism. By local completeness, we see
that every derivation can be converted to an equivalent derivation in which the principal connective is introduced. In fact, if the entire derivation obeys this ordering of eliminations followed by introductions, then it is said to be normal. In a normal derivation, all eliminations happen above introductions. In most logics, every derivation has an equivalent normal derivation, called a normal form. The existence of normal forms is generally hard to prove using natural deduction alone, though such accounts do exist in the literature, most notably by Dag Prawitz in 1961.[23] It is much easier to show this indirectly by means of a cut-free sequent calculus presentation.

First- and higher-order extensions

[Figure: summary of the first-order system.]

The logic of the earlier section is an example of a single-sorted logic, i.e., a logic with a single kind of object: propositions. Many extensions of this simple framework have been proposed; in this section we will extend it with a second sort of individuals or terms. More precisely, we will add a new kind of judgment, "t is a term" (or "t term"), where t is schematic. We shall fix a countable set V of variables, another countable set F of function symbols, and construct terms with the following formation rules:

$$\frac{v \in V}{v\ \text{term}}\ \text{var}_F \qquad\qquad \frac{f \in F \qquad t_1\ \text{term} \qquad t_2\ \text{term} \qquad \cdots \qquad t_n\ \text{term}}{f(t_1, t_2, \ldots, t_n)\ \text{term}}\ \text{app}_F$$

For propositions, we consider a third countable set P of predicates, and define atomic predicates over terms with the following formation rule:

$$\frac{\phi \in P \qquad t_1\ \text{term} \qquad t_2\ \text{term} \qquad \cdots \qquad t_n\ \text{term}}{\phi(t_1, t_2, \ldots, t_n)\ \text{prop}}\ \text{pred}_F$$

The first two rules of formation provide a definition of a term that is effectively the same as that defined in term algebra and model theory, although the focus of those fields of study is quite different from natural deduction. The third rule of formation effectively defines an atomic formula, as in first-order logic, and again in model theory. To these are added a pair of formation rules defining the notation for quantified propositions, one for universal (∀) and one for existential (∃) quantification:

$$\frac{x \in V \qquad A\ \text{prop}}{\forall x.\, A\ \text{prop}}\ \forall_F \qquad\qquad \frac{x \in V \qquad A\ \text{prop}}{\exists x.\, A\ \text{prop}}\ \exists_F$$

The universal quantifier has the introduction and elimination rules:

$$\cfrac{\begin{array}{c}\cfrac{}{a\ \text{term}}\ u \\ \vdots \\ [a/x]A\ \text{true}\end{array}}{\forall x.\, A\ \text{true}}\ \forall I^{u,a} \qquad\qquad \frac{\forall x.\, A\ \text{true} \qquad t\ \text{term}}{[t/x]A\ \text{true}}\ \forall E$$

The existential quantifier has the introduction and elimination rules:

$$\frac{[t/x]A\ \text{true}}{\exists x.\, A\ \text{true}}\ \exists I \qquad\qquad \cfrac{\exists x.\, A\ \text{true} \qquad \begin{array}{c}\cfrac{}{a\ \text{term}}\ u \quad \cfrac{}{[a/x]A\ \text{true}}\ v \\ \vdots \\ C\ \text{true}\end{array}}{C\ \text{true}}\ \exists E^{a,u,v}$$

In these rules, the notation [t/x] A stands for the substitution of t for every (visible) instance of x in A, avoiding capture.[c] As before, the superscripts on the rule name stand for the components that are discharged: the term a cannot occur in the conclusion of ∀I (such terms are known as eigenvariables or parameters), and the hypotheses named u and v in ∃E are localised to the second premise in a hypothetical derivation. Although the propositional logic of earlier sections was decidable, adding the
quantifiers makes the logic undecidable.

So far, the quantified extensions are first-order: they distinguish propositions from the kinds of objects quantified over. Higher-order logic takes a different approach and has only a single sort of propositions. The quantifiers have as the domain of quantification the very same sort of propositions, as reflected in the formation rules:

$$\cfrac{\begin{matrix}\cfrac{}{p\ \text{prop}}\ u \\ \vdots \\ A\ \text{prop}\end{matrix}}{\forall p.\, A\ \text{prop}}\ \forall_F^{\,u} \qquad\qquad \cfrac{\begin{matrix}\cfrac{}{p\ \text{prop}}\ u \\ \vdots \\ A\ \text{prop}\end{matrix}}{\exists p.\, A\ \text{prop}}\ \exists_F^{\,u}$$

A discussion of the introduction and elimination forms for higher-order logic is beyond the scope of this article. It is possible to be in between first-order and higher-order logics. For example, second-order logic has two kinds of propositions: one kind quantifying over terms, and the second kind quantifying over propositions of the first kind.

Proofs and type theory

The presentation of natural deduction so far has concentrated on the nature of propositions without giving a formal definition of a proof. To formalise the notion of proof, we alter the presentation of hypothetical derivations slightly: we label the antecedents with proof variables (from some countable set V of variables), and decorate the succedent with the actual proof. The antecedents, or hypotheses, are separated from the succedent by means of a turnstile (⊢). This modification sometimes goes under the name of localised hypotheses. The following diagram summarises the change:

$$\begin{array}{c}\cfrac{}{J_1}\ u_1 \quad \cfrac{}{J_2}\ u_2 \quad \cdots \quad \cfrac{}{J_n}\ u_n \\ \vdots \\ J\end{array} \quad\Longrightarrow\quad u_1 : J_1, u_2 : J_2, \ldots, u_n : J_n \vdash J$$

The collection of hypotheses will be written as Γ when their exact composition is not relevant. To make proofs explicit, we move from
the proof-less judgment "A true" to a judgment "π is a proof of (A true)", which is written symbolically as "π : A true". Following the standard approach, proofs are specified with their own formation rules for the judgment "π proof". The simplest possible proof is the use of a labelled hypothesis; in this case the evidence is the label itself:

$$\frac{u \in V}{u\ \text{proof}}\ \text{proof}_F \qquad\qquad \frac{}{u : A\ \text{true} \vdash u : A\ \text{true}}\ \text{hyp}$$

For brevity, we shall leave off the judgmental label true in the rest of this article, i.e., write "Γ ⊢ π : A". Let us re-examine some of the connectives with explicit proofs. For conjunction, we look at the introduction rule ∧I to discover the form of proofs of conjunction: they must be a pair of proofs of the two conjuncts. Thus:

$$\frac{\pi_1\ \text{proof} \qquad \pi_2\ \text{proof}}{(\pi_1, \pi_2)\ \text{proof}}\ \text{pair}_F \qquad\qquad \frac{\Gamma \vdash \pi_1 : A \qquad \Gamma \vdash \pi_2 : B}{\Gamma \vdash (\pi_1, \pi_2) : A \wedge B}\ \wedge I$$

The elimination rules ∧E1 and ∧E2 select either the left or the right conjunct; thus the proofs are a pair of projections, first (fst) and second (snd):

$$\frac{\pi\ \text{proof}}{\mathbf{fst}\ \pi\ \text{proof}}\ \text{fst}_F \qquad \frac{\Gamma \vdash \pi : A \wedge B}{\Gamma \vdash \mathbf{fst}\ \pi : A}\ \wedge E_1 \qquad\qquad \frac{\pi\ \text{proof}}{\mathbf{snd}\ \pi\ \text{proof}}\ \text{snd}_F \qquad \frac{\Gamma \vdash \pi : A \wedge B}{\Gamma \vdash \mathbf{snd}\ \pi : B}\ \wedge E_2$$

For implication, the introduction form localises or binds the hypothesis, written using a λ; this corresponds to the discharged label. In the rule, "Γ, u : A" stands for the collection of hypotheses Γ together with the additional hypothesis u:

$$\frac{\pi\ \text{proof}}{\lambda u.\, \pi\ \text{proof}}\ \lambda_F \qquad \frac{\Gamma, u : A \vdash \pi : B}{\Gamma \vdash \lambda u.\, \pi : A \supset B}\ \supset I \qquad\qquad \frac{\pi_1\ \text{proof} \qquad \pi_2\ \text{proof}}{\pi_1\ \pi_2\ \text{proof}}\ \text{app}_F \qquad \frac{\Gamma \vdash \pi_1 : A \supset B \qquad \Gamma \vdash \pi_2 : A}{\Gamma \vdash \pi_1\ \pi_2 : B}\ \supset E$$

With proofs available explicitly, one can manipulate and reason about proofs. The key operation on proofs is the substitution of one proof for an assumption used in another proof. This is commonly known as a substitution theorem, and can be proved by induction on the depth (or structure) of the second judgment.

Substitution theorem

If Γ ⊢ π₁ : A and Γ, u : A ⊢ π₂ : B, then Γ ⊢ [π₁/u] π₂ : B.

So far the judgment "Γ ⊢ π : A" has had a purely logical interpretation. In type
theory, the logical view is exchanged for a more computational view of objects. Propositions in the logical interpretation are now viewed as types, and proofs as programs in the lambda calculus. Thus the interpretation of "π : A" is "the program π has type A". The logical connectives are also given a different reading: conjunction is viewed as product, implication as the function arrow, etc. The differences are only cosmetic, however. Type theory has a natural deduction presentation in terms of formation, introduction and elimination rules; in fact, the reader can easily reconstruct what is known as simple type theory from the previous sections.

The difference between logic and type theory is primarily a shift of focus from the types (propositions) to the programs (proofs). Type theory is chiefly interested in the convertibility or reducibility of programs. For every type, there are canonical programs of that type which are irreducible; these are known as canonical forms or values. If every program can be reduced to a canonical form, then the type theory is said to be normalising (or weakly normalising). If the canonical form is unique, then the theory is said to be strongly normalising. Normalisability is a rare feature of most non-trivial type theories, which is a big departure from the logical world. (Recall that almost every logical derivation has an equivalent normal derivation.) To sketch the reason: in type theories that admit recursive definitions, it is possible to write programs that never reduce to a value; such looping programs can generally be given any type. In particular, the looping program has type ⊥, although there is no logical proof of "⊥ true". For this reason, the propositions-as-types, proofs-as-programs paradigm only works in one direction, if at all: interpreting a type theory as a logic generally gives an inconsistent logic.

Example: Dependent Type Theory

Like logic, type theory has many extensions and variants, including first-order and higher-order versions. One branch, known as dependent type theory, is used in a number of computer-assisted proof systems. Dependent type theory allows quantifiers to range over programs themselves. These quantified types are written as Π and Σ instead of ∀ and ∃, and have the following formation rules:

$$\frac{\Gamma \vdash A\ \text{type} \qquad \Gamma, x : A \vdash B\ \text{type}}{\Gamma \vdash \Pi x : A.\, B\ \text{type}}\ \Pi_F \qquad\qquad \frac{\Gamma \vdash A\ \text{type} \qquad \Gamma, x : A \vdash B\ \text{type}}{\Gamma \vdash \Sigma x : A.\, B\ \text{type}}\ \Sigma_F$$

These types are generalisations of the arrow and product types, respectively, as witnessed by their introduction and elimination rules:

$$\frac{\Gamma, x : A \vdash \pi : B}{\Gamma \vdash \lambda x.\, \pi : \Pi x : A.\, B}\ \Pi I \qquad\qquad \frac{\Gamma \vdash \pi_1 : \Pi x : A.\, B \qquad \Gamma \vdash \pi_2 : A}{\Gamma \vdash \pi_1\ \pi_2 : [\pi_2/x]\, B}\ \Pi E$$

$$\frac{\Gamma \vdash \pi_1 : A \qquad \Gamma \vdash \pi_2 : [\pi_1/x]\, B}{\Gamma \vdash (\pi_1, \pi_2) : \Sigma x : A.\, B}\ \Sigma I \qquad \frac{\Gamma \vdash \pi : \Sigma x : A.\, B}{\Gamma \vdash \mathbf{fst}\ \pi : A}\ \Sigma E_1 \qquad \frac{\Gamma \vdash \pi : \Sigma x : A.\, B}{\Gamma \vdash \mathbf{snd}\ \pi : [\mathbf{fst}\ \pi/x]\, B}\ \Sigma E_2$$

Dependent type theory in full generality is very powerful: it is able to express almost any conceivable property of programs directly in the types of the program. This generality comes at a steep price: either typechecking is undecidable (extensional type theory), or extensional reasoning is more difficult (intensional type theory). For this reason, some dependent type theories do not allow quantification over arbitrary programs, but rather restrict to programs of a given decidable index domain, for example integers, strings, or linear programs.

Since dependent type theories allow types to depend on programs, a natural question to ask is whether it is possible for programs to depend on types, or any other combination. There are many kinds of answers to such questions. A popular approach in type theory is to allow programs to be quantified over types, also known as parametric polymorphism; of this there are two main kinds: if types and programs are kept separate, then one obtains a somewhat more well-behaved system called predicative polymorphism; if the distinction between program and type is blurred, one obtains the type-theoretic analogue of higher-order logic, also known as
impredicative polymorphism. Various combinations of dependency and polymorphism have been considered in the literature, the most famous being the lambda cube of Henk Barendregt.

The intersection of logic and type theory is a vast and active research area. New logics are usually formalised in a general type-theoretic setting, known as a logical framework. Popular modern logical frameworks such as the calculus of constructions and LF are based on higher-order dependent type theory, with various trade-offs in terms of decidability and expressive power. These logical frameworks are themselves always specified as natural deduction systems, which is a testament to the versatility of the natural deduction approach.

Classical and modal logics

For simplicity, the logics presented so far have been intuitionistic. Classical logic extends intuitionistic logic with an additional axiom or principle of excluded middle:

For any proposition p, the proposition p ∨ ¬p is true.

This statement is not obviously either an introduction or an elimination; indeed, it involves two distinct connectives. Gentzen's original treatment of excluded middle prescribed one of the following three (equivalent) formulations, which were already present in analogous forms in the systems of Hilbert and Heyting:

$$\frac{}{A \vee \neg A\ \text{true}}\ \text{XM}_1 \qquad\qquad \frac{\neg\neg A\ \text{true}}{A\ \text{true}}\ \text{XM}_2 \qquad\qquad \cfrac{\begin{matrix}\cfrac{}{\neg A\ \text{true}}\ u \\ \vdots \\ p\ \text{true}\end{matrix}}{A\ \text{true}}\ \text{XM}_3^{\,u,p}$$

(XM3 is merely XM2 expressed in terms of E.) This treatment of excluded middle, in addition to being objectionable from a purist's standpoint, introduces additional complications in the definition of normal forms.

A comparatively more satisfactory treatment of classical natural deduction in terms of introduction and elimination rules alone was first proposed by Parigot in 1992, in the form of a classical lambda calculus called λμ. The key insight of his
approach was to replace a truth-centric judgment "A true" with a more classical notion, reminiscent of the sequent calculus: in localised form, instead of Γ ⊢ A, he used Γ ⊢ Δ, with Δ a collection of propositions similar to Γ. Γ was treated as a conjunction, and Δ as a disjunction. This structure is essentially lifted directly from classical sequent calculi, but the innovation in λμ was to give a computational meaning to classical natural deduction proofs in terms of a callcc or a throw/catch mechanism seen in LISP and its descendants. (See also: first-class control.)

Another important extension was for modal and other logics that need more than just the basic judgment of truth. These were first described, for the alethic modal logics S4 and S5, in a natural deduction style by Prawitz in 1965,[5] and have since accumulated a large body of related work. To give a simple example, the modal logic S4 requires one new judgment, "A valid", that is categorical with respect to truth:

If "A true" under no assumptions of the form "B true", then "A valid".

This categorical judgment is internalised as a unary connective ◻A (read "necessarily A"), with the following introduction and elimination rules:

$$\frac{A\ \text{valid}}{\Box A\ \text{true}}\ \Box I \qquad\qquad \frac{\Box A\ \text{true}}{A\ \text{true}}\ \Box E$$

Note that the premise "A valid" has no defining rules; instead, the categorical definition of validity is used in its place. This mode becomes clearer in the localised form, when the hypotheses are explicit. We write "Ω; Γ ⊢ A true", where Γ contains the true hypotheses as before, and Ω contains valid hypotheses. On the right there is just a single judgment "A true"; validity is not needed here, since "Ω ⊢ A valid" is by definition the same as "Ω; ⋅ ⊢ A true". The introduction and elimination forms are then:

$$\frac{\Omega; \cdot \vdash \pi : A\ \text{true}}{\Omega; \Gamma \vdash \mathbf{box}\ \pi : \Box A\ \text{true}}\ \Box I \qquad\qquad \frac{\Omega; \Gamma \vdash \pi : \Box A\ \text{true}}{\Omega; \Gamma \vdash \mathbf{unbox}\ \pi : A\ \text{true}}\ \Box E$$

The modal hypotheses have their own version of the hypothesis rule and substitution theorem:

$$\frac{}{\Omega, u : A\ \text{valid}; \Gamma \vdash u : A\ \text{true}}\ \text{valid-hyp}$$

Modal substitution theorem

If Ω; ⋅ ⊢ π₁ : A true and Ω, u : A valid; Γ ⊢ π₂ : C true, then Ω; Γ ⊢ [π₁/u] π₂ : C true.

This framework of separating judgments into distinct collections of hypotheses, also known as multi-zoned or polyadic contexts, is very powerful and extensible; it has been applied for many different modal logics, and also for linear and other substructural logics, to give a few examples. However, relatively few systems of modal logic can be formalised directly in natural deduction. To give proof-theoretic characterisations of these systems, extensions such as labelling or systems of deep inference are required.

The addition of labels to formulae permits much finer control of the conditions under which rules apply, allowing the more flexible techniques of analytic tableaux to be applied, as has been done in the case of labelled deduction. Labels also allow the naming of worlds in Kripke semantics; Simpson (1993) presents an influential technique for converting frame conditions of modal logics in Kripke semantics into inference rules in a natural deduction formalisation of hybrid logic. Stouppa (2004) surveys the application of many proof theories, such as Avron and Pottinger's hypersequents and Belnap's display logic, to such modal logics as S5 and B.

Comparison with sequent calculus

Main article: Sequent calculus

The sequent calculus is the chief alternative to natural deduction as a foundation of mathematical logic. In natural deduction the flow of information is bi-directional: elimination rules flow information downwards by deconstruction, and introduction rules flow information upwards by assembly. Thus, a natural deduction proof does not have a purely bottom-up or top-down reading, making it unsuitable for automation
in proof search. To address this fact, Gentzen in 1935 proposed his sequent calculus, though he initially intended it as a technical device for clarifying the consistency of predicate logic. Kleene, in his seminal 1952 book Introduction to Metamathematics, gave the first formulation of the sequent calculus in the modern style.[24]

In the sequent calculus all inference rules have a purely bottom-up reading. Inference rules can apply to elements on both sides of the turnstile. (To differentiate from natural deduction, this article uses a double arrow ⇒ instead of the right tack ⊢ for sequents.) The introduction rules of natural deduction are viewed as right rules in the sequent calculus, and are structurally very similar. The elimination rules, on the other hand, turn into left rules in the sequent calculus. To give an example, consider disjunction; the right rules are familiar:

$$\frac{\Gamma \Rightarrow A}{\Gamma \Rightarrow A \vee B}\ {\vee}R_1 \qquad\qquad \frac{\Gamma \Rightarrow B}{\Gamma \Rightarrow A \vee B}\ {\vee}R_2$$

On the left:

$$\frac{\Gamma, u : A \Rightarrow C \qquad \Gamma, v : B \Rightarrow C}{\Gamma, w : A \vee B \Rightarrow C}\ {\vee}L$$

Recall the ∨E rule of natural deduction in localised form:

$$\frac{\Gamma \vdash A \vee B \qquad \Gamma, u : A \vdash C \qquad \Gamma, v : B \vdash C}{\Gamma \vdash C}\ \vee E$$

The proposition A ∨ B, which is the succedent of a premise in ∨E, turns into a hypothesis of the conclusion in the left rule ∨L. Thus, left rules can be seen as a sort of inverted elimination rule. This observation can be illustrated as follows: in natural deduction, elimination rules work downwards from the hypotheses and introduction rules work upwards from the conclusion, meeting in the middle; in the sequent calculus, the left and right rules are performed in lock-step until one reaches the initial sequents, which correspond to the meeting point of elimination and introduction rules in natural deduction. These initial rules are superficially similar to the hypothesis rule of natural deduction, but in the sequent calculus they describe a transposition or a handshake of a left and a right proposition:

$$\frac{}{\Gamma, u : A \Rightarrow A}\ \text{init}$$

The correspondence between the sequent calculus and natural deduction is a pair of soundness and completeness theorems, which are both provable by means of an inductive argument.

Soundness of ⇒ with respect to ⊢: if Γ ⇒ A, then Γ ⊢ A.
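The purely bottom-up reading of sequent rules is what makes them amenable to automated proof search. As a minimal sketch (the tuple encoding, the restriction to a conjunction-only fragment, and the function name `provable` are illustrative assumptions of this example, not part of the calculus itself), backward search over the ∧R, ∧L, and init rules can be written as:

```python
# Backward proof search over a tiny conjunction-only sequent calculus.
# Atoms are strings; ("and", A, B) encodes the conjunction A ∧ B.
# A sequent Γ ⇒ C is represented by (gamma, goal), gamma a frozenset.

def provable(gamma, goal):
    """Read the rules bottom-up: each step removes one connective,
    so the search depth is bounded and the fragment is decidable."""
    if goal in gamma:                                   # init: Γ, A ⇒ A
        return True
    if isinstance(goal, tuple) and goal[0] == "and":
        # ∧R: to show Γ ⇒ A ∧ B, show Γ ⇒ A and Γ ⇒ B
        return provable(gamma, goal[1]) and provable(gamma, goal[2])
    for h in gamma:
        if isinstance(h, tuple) and h[0] == "and":
            # ∧L: replace the hypothesis A ∧ B by A and B.
            # ∧L is invertible, so eager decomposition loses nothing.
            return provable((gamma - {h}) | {h[1], h[2]}, goal)
    return False
```

For instance, `provable(frozenset({("and", "A", "B")}), ("and", "B", "A"))` holds, since the search decomposes the hypothesis on the left and the goal on the right until only initial sequents remain.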
Completeness of ⇒ with respect to ⊢: if Γ ⊢ A, then Γ ⇒ A.

It is clear by these theorems that the sequent calculus does not change the notion of truth, because the same collection of propositions remains true. Thus, one can use the same proof objects as before in sequent calculus derivations. As an example, consider the conjunctions. The right rule is virtually identical to the introduction rule:

$$\frac{\Gamma \Rightarrow \pi_1 : A \qquad \Gamma \Rightarrow \pi_2 : B}{\Gamma \Rightarrow (\pi_1, \pi_2) : A \wedge B}\ {\wedge}R \qquad\text{(sequent calculus)} \qquad\qquad \frac{\Gamma \vdash \pi_1 : A \qquad \Gamma \vdash \pi_2 : B}{\Gamma \vdash (\pi_1, \pi_2) : A \wedge B}\ \wedge I \qquad\text{(natural deduction)}$$

The left rule, however, performs some additional substitutions that are not performed in the corresponding elimination rules:

sequent calculus:

$$\frac{\Gamma, u : A \Rightarrow \pi : C}{\Gamma, v : A \wedge B \Rightarrow [\mathbf{fst}\ v / u]\, \pi : C}\ {\wedge}L_1 \qquad\qquad \frac{\Gamma, u : B \Rightarrow \pi : C}{\Gamma, v : A \wedge B \Rightarrow [\mathbf{snd}\ v / u]\, \pi : C}\ {\wedge}L_2$$

natural deduction:

$$\frac{\Gamma \vdash \pi : A \wedge B}{\Gamma \vdash \mathbf{fst}\ \pi : A}\ \wedge E_1 \qquad\qquad \frac{\Gamma \vdash \pi : A \wedge B}{\Gamma \vdash \mathbf{snd}\ \pi : B}\ \wedge E_2$$

The kinds of proofs generated in the sequent calculus are therefore rather different from those of natural deduction. The sequent calculus produces proofs in what is known as the β-normal η-long form, which corresponds to a canonical representation of the normal form of the natural deduction proof. If one attempts to describe these proofs using natural deduction itself, one obtains what is called the intercalation calculus (first described by John Byrnes), which can be used to formally define the notion of a normal form for natural deduction.

The substitution theorem of natural deduction takes the form of a structural rule or structural theorem known as cut in the sequent calculus.

Cut (substitution)

If Γ ⇒ π₁ : A and Γ, u : A ⇒ π₂ : C, then Γ ⇒ [π₁/u] π₂ : C.

In most well-behaved logics, cut is unnecessary as an inference rule, though it remains provable as a meta-theorem; the superfluousness of the cut rule is usually presented as a computational process, known as cut elimination. This has an interesting application for natural deduction: usually it is extremely tedious to
prove certain properties directly in natural deduction because of an unbounded number of cases. For example, consider showing that a given proposition is not provable in natural deduction. A simple inductive argument fails because of rules like ∨E or ⊥E, which can introduce arbitrary propositions. However, we know that the sequent calculus is complete with respect to natural deduction, so it is enough to show this unprovability in the sequent calculus. Now, if cut is not available as an inference rule, then all sequent rules either introduce a connective on the right or the left, so the depth of a sequent derivation is fully bounded by the connectives in the final conclusion. Thus, showing unprovability is much easier, because there are only a finite number of cases to consider, and each case is composed entirely of sub-propositions of the conclusion. A simple instance of this is the global consistency theorem: "⋅ ⊢ ⊥ true" is not provable. In the sequent calculus version, this is manifestly true, because there is no rule that can have "⋅ ⇒ ⊥" as a conclusion. Proof theorists often prefer to work on cut-free sequent calculus formulations because of such properties.

See also

- Mathematical logic
- Sequent calculus
- Gerhard Gentzen
- System L (tabular natural deduction)
- Argument map (the general term for tree-like logic notation)

Notes

a. A particular advantage of Kleene's tabular natural deduction systems is that he proves the validity of the inference rules for both propositional calculus and predicate calculus. See Kleene 2002, pp. 44–45, 118–119.
b. To simplify the statement of the rule, the word "denial" here is used in this way: the denial of a formula φ that is not a negation is ¬φ, whereas a negation ¬φ has two denials, viz. φ and ¬¬φ.[17]
c. See the article on lambda calculus for more detail about the concept of substitution.

References

General references:

- Barker-Plummer, Dave; Barwise, Jon; Etchemendy, John (2011). Language, Proof and Logic (2nd ed.). CSLI Publications. ISBN 978-1575866321.
- Gallier, Jean (2005). "Constructive Logics. Part I: A Tutorial on Proof Systems and Typed λ-Calculi". Retrieved 12 June 2014.
- Gentzen, Gerhard Karl Erich (1934). "Untersuchungen über das logische Schließen. I". Mathematische Zeitschrift. 39 (2): 176–210. doi:10.1007/BF01201353. English translation: "Investigations into Logical Deduction", in M. E. Szabo, The Collected Works of Gerhard Gentzen, North-Holland Publishing Company, 1969.
- Gentzen, Gerhard Karl Erich (1935). "Untersuchungen über das logische Schließen. II". Mathematische Zeitschrift. 39 (3): 405–431. doi:10.1007/bf01201363.
- Girard, Jean-Yves (1990). Proofs and Types. Cambridge Tracts in Theoretical Computer Science. Cambridge, England: Cambridge University Press. Translated and with appendices by Paul Taylor and Yves Lafont.
- Jaśkowski, Stanisław (1934). On the Rules of Suppositions in Formal Logic. Reprinted in Polish Logic 1920–39, ed. Storrs McCall.
- Kleene, Stephen Cole (1980) [1952]. Introduction to Metamathematics (11th ed.). North-Holland. ISBN 978-0-7204-2103-3.
- Kleene, Stephen Cole (2009) [1952]. Introduction to Metamathematics. Ishi Press International. ISBN 978-0-923891-57-2.
- Kleene, Stephen Cole (2002) [1967]. Mathematical Logic. Mineola, New York: Dover Publications. ISBN 978-0-486-42533-7.
- Lemmon, Edward John (1965). Beginning Logic. Thomas Nelson. ISBN 978-0-17-712040-4.
- Martin-Löf, Per (1996). "On the meanings of the logical constants and the justifications of the logical laws" (PDF). Nordic Journal of Philosophical Logic. 1 (1): 11–60. Lecture notes to a short course at Università degli Studi di Siena, April 1983.
- Pfenning, Frank; Davies, Rowan (2001). "A judgmental reconstruction of modal logic" (PDF). Mathematical Structures in Computer Science. 11 (4): 511–540. doi:10.1017/S0960129501003322.
- Prawitz, Dag (1965). Natural Deduction: A Proof-Theoretical Study. Acta Universitatis Stockholmiensis, Stockholm Studies in Philosophy 3. Stockholm, Göteborg, Uppsala: Almqvist & Wicksell.
- Prawitz, Dag (2006) [1965]. Natural Deduction: A Proof-Theoretical Study. Mineola, New York: Dover Publications. ISBN 978-0-486-44655-4.
- Quine, Willard Van Orman (1981) [1940]. Mathematical Logic (Revised ed.). Cambridge, Massachusetts: Harvard University Press. ISBN 978-0-674-55451-1.
- Quine, Willard Van Orman (1982) [1950]. Methods of Logic (4th ed.). Cambridge, Massachusetts: Harvard University Press. ISBN 978-0-674-57176-1.
- Simpson, Alex (1993). The Proof Theory and Semantics of Intuitionistic Modal Logic (PDF). PhD thesis, University of Edinburgh.
- Stoll, Robert Roth (1979) [1963]. Set Theory and Logic. Mineola, New York: Dover Publications. ISBN 978-0-486-63829-4.
- Stouppa, Phiniki (2004). The Design of Modal Proof Theories: The Case of S5. MSc thesis, University of Dresden.
- Suppes, Patrick Colonel (1999) [1957]. Introduction to Logic. Mineola, New York: Dover Publications. ISBN 978-0-486-40687-9.
- Van Dalen, Dirk (2013) [1980]. Logic and Structure. Universitext (5th ed.). London, Heidelberg, New York, Dordrecht: Springer. doi:10.1007/978-1-4471-4558-5. ISBN 978-1-4471-4558-5.

Inline citations:

1. "Natural Deduction". Internet Encyclopedia of Philosophy. Retrieved 2024-05-01.
2. Jaśkowski 1934.
3. Gentzen 1934; Gentzen 1935.
4. Gentzen 1934, p. 176.
5. Prawitz 1965; Prawitz 2006.
6. Martin-Löf 1996.
7. Pelletier, Francis Jeffry; Hazen, Allen (2024). "Natural Deduction Systems in Logic". In Zalta, Edward N.; Nodelman, Uri (eds.). The Stanford Encyclopedia of Philosophy (Spring 2024 ed.). Metaphysics Research Lab, Stanford University. Retrieved 2024-03-22.
8. Quine 1981. See particularly pages 91–93 for Quine's line-number notation for antecedent dependencies.
9. Plato, Jan von (2013). Elements of Logical Reasoning (1st publ. ed.). Cambridge: Cambridge University Press. p. 9. ISBN 978-1-107-03659-8.
10. Weisstein, Eric W. "Connective". mathworld.wolfram.com. Retrieved 2024-03-22.
11. Plato, Jan von (2013). Elements of Logical Reasoning (1st publ. ed.). Cambridge: Cambridge University Press. pp. 9, 32, 121. ISBN 978-1-107-03659-8.
12. "Propositional Logic". www.cs.miami.edu. Retrieved 2024-03-22.
13. Restall, Greg (2018). "Substructural Logics". In Zalta, Edward N. (ed.). The Stanford Encyclopedia of Philosophy (Spring 2018 ed.). Metaphysics Research Lab, Stanford University. Retrieved 2024-03-22.
14. "Compactness". Internet Encyclopedia of Philosophy. Retrieved 2024-03-22.
15. "Lecture Topics for Discrete Math Students". math.colorado.edu. Retrieved 2024-03-22.
16. Paseau, Alexander; Pregel, Fabian (2023). "Deductivism in the Philosophy of Mathematics". In Zalta, Edward N.; Nodelman, Uri (eds.). The Stanford Encyclopedia of Philosophy (Fall 2023 ed.). Metaphysics Research Lab, Stanford University. Retrieved 2024-03-22.
17. Allen, Colin; Hand, Michael (2022). Logic Primer (3rd ed.). Cambridge, Massachusetts: The MIT Press. ISBN 978-0-262-54364-4.
18. Bostock, David (1997). Intermediate Logic. Oxford; New York: Clarendon Press; Oxford University Press. p. 21. ISBN 978-0-19-875141-0.
19. Hansson, Sven Ove; Hendricks, Vincent F. (2018). Introduction to Formal Philosophy. Springer Undergraduate Texts in Philosophy. Cham: Springer. p. 38. ISBN 978-3-030-08454-7.
20. Ayala-Rincón, Mauricio; de Moura, Flávio L. C. (2017). Applied Logic for Computer Scientists. Undergraduate Topics in Computer Science. Springer. pp. 2, 20. doi:10.1007/978-3-319-51653-0. ISBN 978-3-319-51651-6.
21. Lemmon, Edward John (1998). Beginning Logic. Boca Raton, FL: Chapman & Hall/CRC. pp. passim, especially 39–40. ISBN 978-0-412-38090-7.
22. Arthur, Richard T. W. (2017). An Introduction to Logic: Using Natural Deduction, Real Arguments, a Little History, and Some Humour (2nd ed.). Peterborough, Ontario: Broadview Press. ISBN 978-1-55481-332-2. OCLC 962129086.
23. See also his book: Prawitz 1965; Prawitz 2006.
24. Kleene 2009, pp. 440–516. See also Kleene 1980.

External links

- Clemente Laboreo, Daniel (August 2004). "Introduction to natural deduction" (PDF). Retrieved 10 December 2023.
- Domino On Acid. Natural deduction visualized as a game of dominoes.
- Pelletier, Francis Jeffry. "A History of Natural Deduction and Elementary Logic Textbooks" (PDF).
- "Natural Deduction Systems in Logic". Entry by Francis Jeffry Pelletier and Allen Hazen in the Stanford Encyclopedia of Philosophy, 29 October 2021.
- Levy, Michel. "A Propositional Prover".