Propositional formula

In propositional logic, a propositional formula is a type of syntactic formula which is well formed and has a truth value. If the values of all variables in a propositional formula are given, it determines a unique truth value. A propositional formula may also be called a propositional expression, a sentence, or a sentential formula.

A propositional formula is constructed from simple propositions, such as "x is greater than three", or propositional variables such as "P" and "Q", using connectives such as NOT, AND, OR, and IMPLIES; for example:

:("x" = 2 AND "y" = 4) IMPLIES "x" + "y" = 6.

In mathematics, a propositional formula is often more briefly referred to as a "proposition", but, more precisely, a propositional formula is not a proposition but a formal expression that "denotes" a proposition, a formal object under discussion, just like an expression such as "x" + "y" is not a value, but denotes a value. In some contexts, maintaining the distinction may be of importance.

Propositions

For the purposes of the propositional calculus, propositions (utterances, sentences, assertions) are considered to be either simple or compound. [Hamilton 1978:1] Compound propositions are considered to be linked by sentential connectives, some of the most common of which are AND, OR, "IF ... THEN ...", "NEITHER ... NOR ...", "... IS EQUIVALENT TO ..." . The linking semicolon " ; ", and the connective BUT, are considered to be expressions of AND. A sequence of discrete sentences is considered to be linked by ANDs, and formal analysis applies a recursive "parenthesis rule" with respect to sequences of simple propositions (see more below about well-formed formulas).

: For example, the assertion "This cow is blue. That horse is orange but this horse here is purple." is actually a compound proposition linked by ANDs: ( ("This cow is blue" AND "that horse is orange") AND "this horse here is purple" ).

Simple propositions are declarative in nature: they make assertions about the condition or nature of a "particular" object of sensation, e.g. "This cow is blue", "There's a coyote!" ("That coyote IS "there", behind the rocks."). [PM p. 91 eschews "the" because they require a clear-cut "object of sensation"; they stipulate the use of "this" (p. 91).] Thus the simple "primitive" assertions must be about specific objects or specific states of mind. Each must have at least a subject (an immediate object of thought or observation), a verb (in the active voice and present tense preferred), and perhaps an adjective or adverb. "Dog!" probably implies "I see a dog" but should be rejected as too ambiguous.

: Example: "That purple dog is running", "This cow is blue", "Switch M31 is closed", "This cap is off", "Tomorrow is Friday".

For the purposes of the propositional calculus a compound proposition can usually be reworded into a series of simple sentences, although the result will probably sound stilted.

Relationship between propositional and predicate formulas

The predicate calculus goes a step further than the propositional calculus to an "analysis of the "inner structure" of propositions". [(italics added) Reichenbach p. 80.] It breaks a simple sentence down into two parts: (i) its subject (the object (singular or plural) of discourse) and (ii) a predicate (a verb or possibly verb-clause that asserts a quality or attribute of the object(s)). The predicate calculus then generalizes the "subject|predicate" form (where | symbolizes concatenation (stringing together) of symbols) into a form with the following blank-subject structure " ___|predicate ", and the predicate in turn is generalized to all things with that property.

: Example: "This blue pig has wings" becomes two sentences in the propositional calculus: "This pig has wings" AND "This pig is blue." The first sentence breaks into "pig" as subject and "has wings" as predicate; thus it asserts that object "pig" is a member of the class (set, collection) of "winged things". The second sentence asserts that object "pig" has an attribute "blue" and thus is a member of the class of "blue things". One might choose to write the two sentences connected with AND as:

:: p|W AND p|B

The generalization of "pig" to a (potential) member of two classes "winged things" and "blue things" means that it has a truth-relationship with both of these classes. In other words, given a domain of discourse "wingled things", either we find p to be a member of this domain or not. Thus we have a relationship W (wingedness) between p (pig) and { T, F }, W(p) evaluates to { T, F }. Likewise for B (blueness) and p (pig) and { T, F }: B(p) evaluates to { T, F }. So we now can analyze the connected assertions "B(p) AND W(p)" for its overall truth-value, i.e.: : ( B(p) AND W(p) ) evaluates to { T, F }

In particular, simple sentences that employ notions of "all", "some", "a few", "one of", etc. are treated by the predicate calculus. Along with the new function symbolism "F(x)", two new symbols are introduced: ∀ (For all) and ∃ (There exists ..., At least one of ... exists, etc.). The predicate calculus can handle the following statement:

: "All blue pigs have wings but some winged pigs are not blue".
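Restricted to a finite domain of discourse, ∀ and ∃ can be evaluated by exhaustive inspection. A minimal Python sketch of the statement just quoted; the domain and the two attribute sets are invented for illustration:

    # "All blue pigs have wings but some winged pigs are not blue"
    pigs = ["pig1", "pig2", "pig3"]     # assumed finite domain of discourse
    blue = {"pig1"}                     # B(x): x is blue
    winged = {"pig1", "pig2"}           # W(x): x has wings

    # For all x: B(x) IMPLIES W(x), i.e. NOT B(x) OR W(x)
    all_blue_have_wings = all((x not in blue) or (x in winged) for x in pigs)

    # There exists x: W(x) AND NOT B(x)
    some_winged_not_blue = any((x in winged) and (x not in blue) for x in pigs)

    print(all_blue_have_wings and some_winged_not_blue)   # True for this toy domain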

Identity

Tarski asserts that the notion of IDENTITY (as distinguished from LOGICAL EQUIVALENCE) lies outside the propositional calculus; however, he notes that if a logic is to be of use for mathematics and the sciences it must contain a "theory" of IDENTITY. [Tarski p.54-68. Suppes calls IDENTITY a "further rule of inference" and has a brief development around it; Robbin, Bender and Williamson, and Goodstein introduce the sign and its usage without comment or explanation. Hamilton p. 37 employs two signs ≠ and = with respect to the valuation of a formula in a formal calculus. Kleene p. 70 and Hamilton p. 52 place it in the predicate calculus, in particular with regards to the arithmetic of natural numbers.] Some authors refer to "predicate logic with identity" to emphasize this extension. See more about this below.

An algebra of propositions, the propositional calculus

An algebra (and there are many different ones), loosely defined, is a method by which a collection of symbols called variables together with some other symbols such as parentheses (, ) and some sub-set of symbols such as *, +, ~, &, V, =, ≡, ⋀, ¬ are manipulated within a system of rules. These symbols, and well-formed strings of them, are said to represent objects, but in a specific algebraic system these objects do not have meanings. Thus work inside the algebra becomes an exercise in obeying certain laws (rules) of the algebra's syntax (symbol-formation) rather than in semantics (meaning) of the symbols. The meanings are to be found outside the algebra.

For a well-formed sequence of symbols in the algebra -- a formula -- to have some usefulness outside the algebra the symbols are assigned meanings and eventually the variables are assigned values; then by a series of rules the formula is evaluated.

When the values are restricted to just two and applied to the notion of simple sentences (e.g. spoken utterances or written assertions) linked by propositional connectives this whole algebraic system of symbols and rules and evaluation-methods is usually called the propositional calculus or the sentential calculus.

While some of the familiar rules of arithmetic algebra continue to hold in the algebra of propositions (e.g. the commutative and associative laws for AND and OR), some behave quite differently (e.g. the distributive laws for AND, OR and NOT; see De Morgan's laws and the laws of absorption below).

Usefulness of propositional formulas

Analysis: In deductive reasoning, philosophers, rhetoricians and mathematicians reduce arguments to formulas and then study them (usually with truth tables) for correctness (soundness). For example: Is the following argument sound?

: "Given that consciousness is sufficient for an artificial intelligence and only conscious entities can pass the Turing test, before we can conclude that a robot is an artificial intelligence the robot must pass the Turing test."

Engineers analyze the logic circuits they have designed using synthesis techniques and then apply various reduction and minimization techniques to simplify their designs.

Synthesis: Engineers in particular synthesize propositional formulas (that eventually end up as circuits of symbols) from truth tables. For example, one might write down a truth table for how binary addition should behave given the addition of variables "b" and "a" and "carry_in" "ci", and the results "carry_out" "co" and "sum" Σ.

: Example: in row 5, ( (b+a) + ci ) = ( (1+0) + 1 ) = the number "2". Written as a binary number this is 10₂, where "co" = 1 and Σ = 0, as shown in the right-most columns of the truth table (sketched below).
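The table itself can be generated mechanically. A minimal Python sketch follows; the row numbering from 0 and the column layout are assumptions, but the values come from the arithmetic fact that (b + a + ci) equals 2·co + Σ:

    from itertools import product

    # Full-adder truth table: co is the twos place, s (= Σ) the ones place.
    print("row  b  a  ci | co sum")
    for row, (b, a, ci) in enumerate(product([0, 1], repeat=3)):
        co, s = divmod(b + a + ci, 2)    # e.g. row 5: (1+0)+1 = 2 = binary 10
        print(f"{row:3}  {b}  {a}   {ci} |  {co}   {s}")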

CASE connective: IF ... THEN ... ELSE ...

The IF ... THEN ... ELSE ... connective appears as the simplest form of CASE operator of recursion theory and computation theory and is the connective responsible for conditional gotos (jumps, branches). From this one connective all other connectives can be constructed (see more below). Although " IF c THEN b ELSE a " sounds like an implication it is, in its most reduced form, a switch that makes a decision and offers as outcome only one of two alternatives "a" or "b" (hence the name switch statement in the C programming language). [A careful look at its Karnaugh map shows that IF...THEN...ELSE can also be expressed, in a rather round-about way, in terms of exclusive-OR: ( a XOR (c AND (b XOR a)) ) = d.]

The following three propositions are equivalent (as indicated by the logical equivalence sign ≡ ):

:* (1) ( IF 'counter is zero' THEN 'go to instruction "b" ' ELSE 'go to instruction "a" ' ) ≡
:* (2) ( (c → b) & (~c → a) ) ≡ ( ( IF 'counter is zero' THEN 'go to instruction "b" ' ) AND ( IF 'it is NOT the case that counter is zero' THEN 'go to instruction "a" ' ) ) ≡
:* (3) ( (c & b) V (~c & a) ) ≡ ( ( 'counter is zero' AND 'go to instruction "b" ' ) OR ( 'it is NOT the case that counter is zero' AND 'go to instruction "a" ' ) )

Thus IF ... THEN ... ELSE -- unlike implication -- does not evaluate to an ambiguous "TRUTH" when the first proposition is false, i.e. c = F in (c → b). For example, most people would reject the following compound proposition as a nonsensical "non sequitur" because the second sentence is "not connected in meaning" to the first. [Robbin p. 3.]

: Example: The proposition " IF 'Winston Churchill was Chinese' THEN 'The sun rises in the east' " evaluates as a TRUTH given that 'Winston Churchill was Chinese' is a FALSEHOOD and 'The sun rises in the east' evaluates as a TRUTH.

In recognition of this problem, the sign → of formal implication in the propositional calculus is called material implication to distinguish it from the everyday, intuitive implication. [Rosenbloom p. 30 and p. 54ff discusses this problem of implication at some length. Most philosophers and mathematicians just accept the material definition as given above. But some do not, including the intuitionists; they consider it a form of the law of excluded middle misapplied.]

The use of the IF ... THEN ... ELSE construction avoids controversy because it offers a completely deterministic choice between two stated alternatives; it offers two "objects" (the two alternatives b and a), and it "selects" between them exhaustively and unambiguously. [Indeed, exhaustive selection between alternatives -- mutual exclusion -- is required by the definition that Kleene gives the CASE operator (Kleene 1952:229).] In the truth table below (regenerated in the sketch that follows), d1 is the formula ( (IF c THEN b) AND (IF NOT-c THEN a) ). Its fully-reduced form d2 is the formula ( (c AND b) OR (NOT-c AND a) ). The two formulas are equivalent, as shown by the columns "=d1" and "=d2". Electrical engineers call the fully-reduced formula the AND-OR-SELECT operator. The CASE (or SWITCH) operator is an extension of the same idea to "n" possible, but mutually-exclusive, outcomes. Electrical engineers call the CASE operator a multiplexer.
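A brute-force truth-table comparison confirms that d1 and d2 agree. A minimal Python sketch; the helper name implies is invented for illustration:

    from itertools import product

    # d1 = (IF c THEN b) AND (IF NOT-c THEN a); d2 = (c AND b) OR (NOT-c AND a)
    def implies(p, q):
        return (not p) or q    # material implication: p -> q is ~p V q

    for c, b, a in product([False, True], repeat=3):
        d1 = implies(c, b) and implies(not c, a)
        d2 = (c and b) or ((not c) and a)
        assert d1 == d2        # the columns "=d1" and "=d2" agree on every row
    print("d1 and d2 agree on all 8 rows")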

Enterprising readers might challenge themselves to invent an "axiomatic system" that uses the symbols { V, &, ~, (, ), variables a, b, c }, the formation rules specified above, as few as possible of the laws listed below, and then derive as theorems the others as well as the truth-table valuations for V, &, and ~. One set attributed to Huntington (1904) (Suppes:204) uses 8 of the laws defined below.

Note that if used in an axiomatic system, the symbols 1 and 0 (or T and F) are considered to be wffs and thus obey all the same rules as the variables. Thus the laws listed below are actually axiom schemas, that is, they stand in place of an infinite number of instances. Thus ( x V y ) ≡ ( y V x ) might be used in one instance as ( p V 0 ) ≡ ( 0 V p ) and in another instance as ( 1 V q ) ≡ ( q V 1 ), etc.

Connective seniority (symbol rank)

In general, to avoid confusion during analysis and evaluation of propositional formulas, make liberal use of parentheses. However, quite often authors leave them out. To parse a complicated formula one first needs to know the seniority, or rank, that each of the connectives (excepting *) has over the other connectives. To "well-form" a formula, start with the connective with the highest rank and add parentheses around its components, then move down in rank (paying close attention to the scope over which the connective is working). From most- to least-senior, with the predicate signs ∀x and ∃x, the IDENTITY = and the arithmetic signs added for completeness [Rosenbloom 1950:32. Kleene 1952:73-74 ranks all 11 symbols.]:

: ≡ (LOGICAL EQUIVALENCE), → (IMPLICATION), & (AND), V (OR), ~ (NOT), ∀x (FOR ALL x), ∃x (THERE EXISTS AN x), = (IDENTITY), + (arithmetic sum), * (arithmetic multiply), ' (arithmetic successor).

Thus formulas can be parsed -- but note that, because NOT does not obey the distributive law, parentheses around an inner formula such as ~(c & ~d) are mandatory.

: Example: " d & c V w " rewritten is ( (d & c) V w )

: Example: " a & a → b ≡ a & ~a V b " rewritten (rigorously) is:
:* ≡ has seniority: ( ( a & a → b ) ≡ ( a & ~a V b ) )
:* → has seniority: ( ( a & (a → b) ) ≡ ( a & ~a V b ) )
:* & has seniority both sides: ( ( (a) & (a → b) ) ≡ ( (a) & (~a V b) ) )
:* ~ has seniority: ( ( (a) & (a → b) ) ≡ ( (a) & (~(a) V b) ) )
:* check: 8 ( -parentheses and 8 ) -parentheses

: Example: " d & c V p & ~(c & ~d) ≡ c & d V p & c V p & ~d " rewritten is ( ( (d & c) V (p & ~(c & ~(d))) ) ≡ ( (c & d) V (p & c) V (p & ~(d)) ) )
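Whether the parenthesizing was done correctly can itself be checked by truth table. A minimal Python sketch verifying the second worked example, ( a & (a → b) ) ≡ ( a & (~a V b) ); the helper name implies is invented for illustration:

    from itertools import product

    def implies(p, q):
        return (not p) or q       # material implication: p -> q is ~p V q

    # After parenthesizing by seniority, the two sides agree on every row.
    for a, b in product([False, True], repeat=2):
        lhs = a and implies(a, b)       # a & (a → b)
        rhs = a and ((not a) or b)      # a & (~a V b)
        assert lhs == rhs
    print("equivalence holds on all 4 rows")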

Commutative and associative laws

Both AND and OR obey the commutative law and associative law, as verified by the truth-table sketch following this list:
* Commutative law for OR: ( a V b ) ≡ ( b V a )
* Commutative law for AND: ( a & b ) ≡ ( b & a )
* Associative law for OR: (( a V b ) V c ) ≡ ( a V (b V c) )
* Associative law for AND: (( a & b ) & c ) ≡ ( a & (b & c) )
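A brute-force truth-table check of these four laws, as a minimal Python sketch:

    from itertools import product

    for a, b, c in product([False, True], repeat=3):
        assert (a or b) == (b or a)                      # commutative law for OR
        assert (a and b) == (b and a)                    # commutative law for AND
        assert ((a or b) or c) == (a or (b or c))        # associative law for OR
        assert ((a and b) and c) == (a and (b and c))    # associative law for AND
    print("all four laws hold on all 8 rows")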

Omitting parentheses in strings of AND and OR: The connectives are considered to be unary (one-variable, e.g. NOT) or binary (i.e. two-variable AND, OR, IMPLIES). For example:

: ( (c & d) V (p & c) V (p & ~d) ) above should have been written ( ((c & d) V (p & c)) V (p & ~(d)) ) or possibly ( (c & d) V ( (p & c) V (p & ~(d)) ) )

However, a truth-table demonstration shows that the form without the extra parentheses is perfectly adequate.

Omitting parentheses with regards to a single-variable NOT: While ~(a), where a is a single variable, is perfectly clear, ~a is adequate and is the usual way this literal would appear. When the NOT is over a formula with more than one symbol, the parentheses are mandatory, e.g. ~(a V b).

Distributive laws

OR distributes over AND, and AND distributes over OR. NOT does not distribute over AND or OR; see De Morgan's laws below:
* Distributive law for OR: ( c V ( a & b) ) ≡ ( (c V a) & (c V b) )
* Distributive law for AND: ( c & ( a V b) ) ≡ ( (c & a) V (c & b) )

De Morgan's laws

NOT, when distributed over OR or AND, does something peculiar (again, these laws can be verified with a truth table, as in the sketch following this list):
* De Morgan's law for OR: ~(a V b) ≡ (~a & ~b)
* De Morgan's law for AND: ~(a & b) ≡ (~a V ~b)
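The same brute-force check covers both the distributive laws above and De Morgan's laws; a minimal Python sketch:

    from itertools import product

    for a, b, c in product([False, True], repeat=3):
        assert (c or (a and b)) == ((c or a) and (c or b))    # OR distributes over AND
        assert (c and (a or b)) == ((c and a) or (c and b))   # AND distributes over OR
        assert (not (a or b)) == ((not a) and (not b))        # De Morgan for OR
        assert (not (a and b)) == ((not a) or (not b))        # De Morgan for AND
    print("distributive and De Morgan laws verified on all 8 rows")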

Laws of absorption

Absorption, in particular the first law, causes the "laws" of logic to differ from the "laws" of arithmetic:
* Absorption (idempotency) for OR: (a V a) ≡ a
* Absorption (idempotency) for AND: (a & a) ≡ a

Laws of evaluation: Identity, nullity, and complement

The sign " = " (as distinguished from logical equivalence ≡, alternately ↔ or ⇔) symbolizes the assignment of value or meaning. Thus the string (a & ~(a)) symbolizes "1", i.e. it means the same thing as symbol "1" ". In some "systems" this will be an axiom (definition) perhaps shown as ( (a & ~(a)) =Df 1 ) ; in other systems, it may be derived in the truth table below:

Reduced sets of connectives

A set of logical connectives is called complete if every propositional formula is tautologically equivalent to a formula with just the connectives in that set. There are many complete sets of connectives, including { &, ~ }, { V, ~ }, and { →, ~ }. There are two binary connectives that are complete on their own, corresponding to NAND and NOR, respectively. [As well as the first three, Hamilton pp. 19-22 discusses logics built from only | (NAND), and ↓ (NOR).] Some pairs are not complete, for example { &, V }.

The stroke (NAND)

The binary connective corresponding to NAND is called the Sheffer stroke, and written with a vertical bar | or vertical arrow ↑. The completeness of this connective was noted in "Principia Mathematica" (1927:xvii). Since it is complete on its own, all other connectives can be expressed using only the stroke. For example, where the symbol " ≡ " represents "logical equivalence":

: ~p ≡ p|p
: p → q ≡ p|~q
: p V q ≡ ~p|~q
: p & q ≡ ~(p|q)

In particular, the zero-ary connectives ⊤ (representing truth) and ⊥ (representing falsity) can be expressed using the stroke:

: ⊤ ≡ ( a|(a|a) )
: ⊥ ≡ ( ⊤ | ⊤ )
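Each of these identities can be checked mechanically. A minimal Python sketch; the function name stroke is invented for illustration:

    from itertools import product

    def stroke(p, q):
        return not (p and q)    # the Sheffer stroke p|q, i.e. NAND

    for p, q in product([False, True], repeat=2):
        assert (not p) == stroke(p, p)                          # ~p ≡ p|p
        assert ((not p) or q) == stroke(p, stroke(q, q))        # p → q ≡ p|~q
        assert (p or q) == stroke(stroke(p, p), stroke(q, q))   # p V q ≡ ~p|~q
        assert (p and q) == stroke(stroke(p, q), stroke(p, q))  # p & q ≡ ~(p|q)

    for a in (False, True):
        top = stroke(a, stroke(a, a))   # ⊤ ≡ (a|(a|a)), for either value of a
        bot = stroke(top, top)          # ⊥ ≡ (⊤|⊤)
        assert top is True and bot is False
    print("all stroke identities verified")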

IF ... THEN ... ELSE

This connective together with { 0, 1 } (or { F, T } or { ⊥, ⊤ }) forms a complete set. In the following, the IF...THEN...ELSE relation (c, b, a) = d represents ( (c → b) & (~c → a) ) ≡ ( (c & b) V (~c & a) ) = d:

: (c, 0, 1) ≡ ~c
: (c, b, 1) ≡ (c → b)
: (c, c, a) ≡ (c V a)
: (c, b, c) ≡ (c & b)
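A truth-table check of these four identities, as a minimal Python sketch; the helper name ite is invented for illustration:

    from itertools import product

    def ite(c, b, a):
        return (c and b) or ((not c) and a)   # (c, b, a) in its reduced form

    for c, b, a in product([False, True], repeat=3):
        assert ite(c, False, True) == (not c)       # (c, 0, 1) ≡ ~c
        assert ite(c, b, True) == ((not c) or b)    # (c, b, 1) ≡ (c → b)
        assert ite(c, c, a) == (c or a)             # (c, c, a) ≡ (c V a)
        assert ite(c, b, c) == (c and b)            # (c, b, c) ≡ (c & b)
    print("all four identities hold on all 8 rows")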

Example: The following shows how a theorem-based proof of "(c, b, 1) ≡ (c → b)" would proceed; below the proof is its truth-table verification. ( Note: (c → b) is "defined" to be (~c V b). )

:* Begin with the reduced form: ( (c & b) V (~c & a) )
:* Substitute "1" for a: ( (c & b) V (~c & 1) )
:* Identity (~c & 1) = ~c: ( (c & b) V (~c) )
:* Law of commutation for V: ( (~c) V (c & b) )
:* Distribute "~c V" over (c & b): ( ((~c) V c) & ((~c) V b) )
:* Law of excluded middle ( ((~c) V c) = 1 ): ( (1) & ((~c) V b) )
:* Distribute "(1) &" over ((~c) V b): ( ((1) & (~c)) V ((1) & b) )
:* Commutativity and Identity ( (1 & ~c) = (~c & 1) = ~c, and (1 & b) ≡ (b & 1) ≡ b ): ( ~c V b )
:* ( ~c V b ) is defined as c → b. Q.E.D.

In the following truth table the column labelled "taut" for tautology evaluates logical equivalence (symbolized here by ≡) between the two columns labelled d. Because all four rows under "taut" are 1's, the equivalence indeed represents a tautology.
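The verification can be regenerated mechanically; a minimal Python sketch prints the four rows, with d1 = (c, b, 1) in reduced form, d2 = (c → b), and the "taut" column:

    from itertools import product

    print("c  b | d1 d2 taut")
    for c, b in product([0, 1], repeat=2):
        d1 = int((c and b) or (not c))   # (c & b) V (~c & 1)
        d2 = int((not c) or b)           # (c → b), defined as (~c V b)
        taut = int(d1 == d2)             # 1 on every row: a tautology
        print(f"{c}  {b} |  {d1}  {d2}   {taut}")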

(2) Create the formula's Karnaugh map

Use the values of the formula (e.g. "p") found by the truth-table method and place them into their respective (associated) Karnaugh squares (these are numbered per the Gray code convention). If values of "d" for "don't care" appear in the table, this adds flexibility during the reduction phase.

(3) Reduce minterms

Minterms of adjacent (abutting) 1-squares (T-squares) can be reduced with respect to the number of their literals, and the number of terms also will be reduced in the process. Two abutting squares (2 x 1 horizontal or 1 x 2 vertical; even the edges represent abutting squares) lose one literal, four squares in a 4 x 1 rectangle (horizontal or vertical) or 2 x 2 square (even the four corners represent abutting squares) lose two literals, eight squares in a rectangle lose three literals, etc. (One seeks out the largest squares or rectangles and ignores the smaller squares or rectangles contained totally within them.) This process continues until all abutting squares are accounted for, at which point the propositional formula is minimized.

For example, squares #3 and #7 abut, so these two squares can lose one literal (here "p", which appears negated in #3 and unnegated in #7).

Example: The map method usually is done by inspection. The following example expands the algebraic method to show the "trick" behind the combining of terms on a Karnaugh map.

Observe that by the Idempotency law (A V A) = A we can create more terms. Then by the associative and distributive laws the variables that are to disappear can be paired off, and then "disappeared" with the Law of excluded middle (x V ~x) = 1. The following uses brackets [ and ] only to keep track of the terms; they have no special significance. (The end result is verified by the truth-table sketch following this list.)
* Put the formula to be reduced in disjunctive normal form:
::: q = ( (~p & d & c) V (p & d & c) V (p & d & ~c) V (p & ~d & ~c) ) = ( #3 V #7 V #6 V #4 )
* Idempotency (absorption) (A V A) = A:
::: ( #3 V [ #7 V #7 ] V [ #6 V #6 ] V #4 )
* Associative law (x V (y V z)) = ( (x V y) V z ):
::: ( [ #3 V #7 ] V [ #7 V #6 ] V [ #6 V #4 ] )
::: = [ (~p & d & c) V (p & d & c) ] V [ (p & d & c) V (p & d & ~c) ] V [ (p & d & ~c) V (p & ~d & ~c) ]
* Distributive law, applied in reverse, ( (x & y) V (x & z) ) = ( x & (y V z) ):
::: ( [ (d & c) & (~p V p) ] V [ (p & d) & (~c V c) ] V [ (p & ~c) & (~d V d) ] )
* Commutative law and law of excluded middle (x V ~x) = (~x V x) = 1:
::: ( [ (d & c) & 1 ] V [ (p & d) & 1 ] V [ (p & ~c) & 1 ] )
* Law of identity ( x & 1 ) = x, leading to the reduced form of the formula:
::: q = ( (d & c) V (p & d) V (p & ~c) )
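That the reduction preserved the formula can be confirmed by truth table; a minimal Python sketch:

    from itertools import product

    # Compare the original sum of minterms (#3 V #7 V #6 V #4) with the
    # reduced formula q = ( (d & c) V (p & d) V (p & ~c) ).
    for p, d, c in product([False, True], repeat=3):
        original = (((not p) and d and c) or (p and d and c)
                    or (p and d and (not c)) or (p and (not d) and (not c)))
        reduced = (d and c) or (p and d) or (p and (not c))
        assert original == reduced
    print("reduction verified on all 8 rows")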

(4) Verify reduction with a truth table

Memory

Without delay, inconsistencies must be eliminated from a truth table analysis. With the notion of "delay", this condition presents itself as a momentary inconsistency between the fed-back output variable q and p = q_delayed.

A truth table reveals the rows where inconsistencies occur between p = q_delayed at the input and q at the output. After "breaking" the feed-back [McCluskey p. 194-5 discusses "breaking the loop" and inserts "amplifiers" to do this; Wickes (p. 118-121) discusses inserting delays. McCluskey p. 195ff discusses the problem of "races" caused by delays.], the truth table construction proceeds in the conventional manner. But afterwards, in every row the output q is compared to the now-independent input p and any inconsistencies between p and q are noted (i.e. p = 0 together with q = 1, or p = 1 and q = 0); when the "line" is "remade" both are rendered impossible by the Law of contradiction ~(p & ~p). Rows revealing inconsistencies are either considered transient states or just eliminated as inconsistent and hence "impossible".

Once-flip memory

About the simplest memory results when the output of an OR feeds back to one of its inputs, in this case output "q" feeding back into "p". Given that the formula is first evaluated (initialized) with p = 0 & q = 0, it will "flip" once when "set" by s = 1. Thereafter, output "q" will sustain "q" in the "flipped" condition (state q = 1). This behavior, now time-dependent, is shown by the state diagram of the once-flip (and simulated in the sketch below).
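The once-flip's time-dependent behavior can be simulated directly. A minimal Python sketch, with one unit of delay modelled by computing each new q from the previous q; the input pulse sequence for s is invented for illustration:

    # Output q of an OR gate feeds back to its input p; s is the "set" input.
    q = 0                                        # initialized with p = 0 & q = 0
    for t, s in enumerate([0, 0, 1, 0, 0, 0]):   # s pulses high once, at t = 2
        p = q                                    # p = q delayed one time step
        q = int(s or p)                          # the OR gate: q = (s V p)
        print(f"t={t} s={s} p={p} q={q}")
    # once "set" by s = 1, q remains 1 even after s returns to 0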

Historical development

Bertrand Russell (1912:74) lists three laws of thought that derive from Aristotle: (1) The law of identity: "Whatever is, is.", (2) The law of contradiction: "Nothing can both be and not be", and (3) The law of excluded middle: "Everything must either be or not be."

: Example: Here O is an expression about an object's BEING or QUALITY:
:: (1) Law of Identity: O = O
:: (2) Law of contradiction: ~(O & ~(O))
:: (3) Law of excluded middle: (O V ~(O))

The use of the word "everything" in the law of excluded middle renders Russell's expression of this law open to debate. If restricted to an expression about BEING or QUALITY with reference to a finite collection of objects (a finite "universe of discourse") -- the members of which can be investigated one after another for the presence or absence of the assertion -- then the law is considered intuitionistically appropriate. Thus an assertion such as: "This object must either BE or NOT BE (in the collection)", or "This object must either have this QUALITY or NOT have this QUALITY (relative to the objects in the collection)" is acceptable. See more at Venn diagram.

Although a propositional calculus originated with Aristotle, the notion of an "algebra" applied to propositions had to wait until the early 1800s. In an (adverse) reaction to the 2000-year tradition of Aristotle's syllogisms, John Locke's "Essay concerning human understanding" (1690) used the word semiotics (theory of the use of symbols). By 1826 Richard Whately had critically analyzed the syllogistic logic with a sympathy toward Locke's semiotics. George Bentham's work (1827) resulted in the notion of "quantification of the predicate" (nowadays symbolized as ∀ ≡ "for all"). A "row" instigated by William Hamilton over a priority dispute with Augustus De Morgan "inspired George Boole to write up his ideas on logic, and to publish them as MAL [Mathematical Analysis of Logic] in 1847" (Grattan-Guinness and Bornet 1997:xxviii).

About his contribution Grattan-Guinness and Bornet comment:

: "Boole's principal single innovation was [the] law [ x^n = x ] for logic: it stated that the mental acts of choosing the property x and choosing x again and again is the same as choosing x once... As consequence of it he formed the equations x•(1-x)=0 and x+(1-x)=1 which for him expressed respectively the law of contradiction and the law of excluded middle" (p. xxviiff).

For Boole "1" was the universe of discourse and "0" was nothing.

Gottlob Frege's massive undertaking (1879) resulted in a formal calculus of propositions, but his symbolism is so daunting that it had little influence except on one person: Bertrand Russell. First as the student of Alfred North Whitehead he studied Frege's work and suggested a (famous and notorious) emendation with respect to it (1904) around the problem of an antinomy that he discovered in Frege's treatment (cf. Russell's paradox). Russell's work led to a collaboration with Whitehead that, in the year 1910, produced the first volume of "Principia Mathematica" (PM). It is here that what we consider "modern" propositional logic first appeared. In particular, PM introduces NOT and OR and the assertion symbol ⊦ as primitives. In terms of these notions they define IMPLICATION → (def. *1.01: ~p V q), then AND (def. *3.01: ~(~p V ~q) ), then EQUIVALENCE p ←→ q (def. *4.01: (p → q) & (q → p) ).

* Henry M. Sheffer (1921) and Jean Nicod demonstrate that only one connective, the "stroke" |, is sufficient to express all propositional formulas.
* Emil Post (1921) develops the truth-table method of analysis in his "Introduction to a general theory of elementary propositions". He notes Nicod's stroke | .
* Whitehead and Russell add an introduction to their 1927 re-publication of PM adding, in part, a favorable treatment of the "stroke".

Computation and switching logic:
* William Eccles and F. W. Jordan (1919) describe a "trigger relay" made from a vacuum tube.
* George Stibitz (1937) invents the binary adder using mechanical relays. He builds this on his kitchen table.
: Example: Given binary bits a_i and b_i and carry-in c_in_i, their summation Σ_i and carry-out c_out_i are:
:* ( ( a_i XOR b_i ) XOR c_in_i ) = Σ_i
:* ( ( a_i & b_i ) V ( c_in_i & ( a_i XOR b_i ) ) ) = c_out_i
* Alan Turing builds a multiplier using relays (1937-1938). He has to hand-wind his own relay coils to do this.
* Textbooks about "switching circuits" appear in early 1950s.
* Willard Quine (1952, 1955), E. W. Veitch (1952), and M. Karnaugh (1953) develop map-methods for simplifying propositional functions.
* George Mealy (1955) and Edward F. Moore (1956) address the theory of sequential (i.e. switching-circuit) "machines".
* E. J. McCluskey and H. Shorr develop a method for simplifying propositional (switching) circuits (1962).

Footnotes

References

* Bender, Edward A. and Williamson, S. Gill, 2005, "A Short Course in Discrete Mathematics", Dover Publications, Mineola NY, ISBN 0-486-43946-1. This text is used in a "lower division two-quarter [computer science] course" at UC San Diego.

* Enderton, H. B., 2002, "A Mathematical Introduction to Logic." Harcourt/Academic Press. ISBN 0-12-238452-0

* Goodstein, R. L., (Pergamon Press 1963), 1966, (Dover edition 2007), "Boolean Algebra", Dover Publications, Inc., Mineola, NY, ISBN 0-486-45894-6. Emphasis on the notion of "algebra of classes" with set-theoretic symbols such as ∩, ∪, ' (NOT), ⊂ (IMPLIES). Later Goodstein replaces these with &, ∨, ¬, → (respectively) in his treatment of "Sentence Logic" pp. 76-93.

* Ivor Grattan-Guinness and Gérard Bornet 1997, "George Boole: Selected Manuscripts on Logic and its Philosophy", Birkhäuser Verlag, Basel, ISBN 0-876-5456-9 (Boston).

* A. G. Hamilton 1978, "Logic for Mathematicians", Cambridge University Press, Cambridge UK, ISBN 0 521 21838 1.

* E. J. McCluskey 1965, "Introduction to the Theory of Switching Circuits", McGraw-Hill Book Company, NY. No ISBN. Library of Congress Catalog Card Number 65-17394. McCluskey was a student of Willard Quine and developed some notable theorems with Quine and on his own. For those interested in the history, the book contains a wealth of references.

*Paul C. Rosenbloom 1950, Dover edition 2005, "The Elements of Mathematical Logic", Dover Publications, Inc., Mineola, NY, ISBN 0-486-44617-4.

*Joel W. Robbin 1969, 1997, "Mathematical Logic: A First Course", Dover Publications, Inc., Mineola, NY, ISBN 0-486-45018-X (pbk.).

* Patrick Suppes 1957 (1999 Dover edition), "Introduction to Logic", Dover Publications, Inc., Mineola, New York. ISBN 0-486-40687-3 (pbk.). This book is in print and readily available. On his page 204, in a footnote, he references his set of axioms to E. V. Huntington, "Sets of Independent Postulates for the Algebra of Logic", Transactions of the American Mathematical Society, Vol. 5 (1904), pp. 288-309.

* Alfred Tarski 1941 (1995 Dover edition), "Introduction to Logic and to the Methodology of Deductive Sciences", Dover Publications, Inc., Mineola, New York. ISBN 0-486-28462-X (pbk.). This book is in print and readily available.

* Jean van Heijenoort 1967, 3rd printing with emendations 1976, "From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931", Harvard University Press, Cambridge, MA. ISBN 0-674-32449-8 (pbk.) Translation/reprints of Frege (1879), Russell's letter to Frege (1902) and Frege's letter to Russell (1902), Richard's paradox (1905), Post (1921) can be found here.

* Alfred North Whitehead and Bertrand Russell 1927 2nd edition, paperback edition to *53 1962, "Principia Mathematica", Cambridge University Press, no ISBN. In the years between the first edition and the 2nd edition of 1927, H. M. Sheffer 1921 and M. Jean Nicod (no year cited) brought to Russell's and Whitehead's attention that what they considered their primitive propositions (connectives) could be reduced to a single one, |, nowadays known as the "stroke" or NAND (NOT-AND). Russell and Whitehead discuss this in their "Introduction to the Second Edition" and make the definitions as discussed above.

* William E. Wickes 1968, "Logic Design with Integrated Circuits", John Wiley & Sons, Inc., NY. No ISBN. Library of Congress Catalog Card Number: 68-21185. Tight presentation of engineering's analysis and synthesis methods, references McCluskey 1965. Unlike Suppes, Wickes' presentation of "Boolean algebra" starts with a set of postulates of a truth-table nature and then derives the customary theorems of them (p. 18ff).

