Classical Propositional Logic:
A two-valued logic for statements involving the logical terms 'not', 'and' and
'or' that infers that 'p' is true if 'not not p' is true.

1. Introduction:
This article gives a truth-valuational semantics for
Classical Propositional Logic, briefly CPL.
In the present
treatment I try to be clear about the assumptions made, and presume only
English and basic mathematics, including the ordinary algebra of real numbers,
known to every reader who knows English and has finished secondary education.

This allows a very simple presentation of classical
propositional logic in terms of basic mathematics and English.
Part of the reason for this is that there is
a far-reaching parallel between propositional logic and ordinary algebra
that was first noticed by George Boole.
Ordinary algebra is a formal language with statements
about numbers and variable terms like "x", "y" for
numbers, that enable one to write statements like "x >= x", i.e.
"x is greater than or equal to x", etc. Constant terms in algebra are
terms for numbers like "0", "1" etc. If the term
"x" is uniformly replaced by respectively "y" and "0" in
the statement "x>=x", the statements "y>=y" and
"0>=0" result.
It should be noted that variables are terms that do not
occur normally in ordinary language, for ordinary language contains no
distinction between variables and constants.
A formal language is characterized by its containing
variables, and by having an explicitly assumed precise grammar, that allows
one to decide what is and what is not a well-formed expression of that formal
language.
Formal logic is here characterized as
a formal language with
explicitly assumed rules of inference that all have the property of leading
from true assumptions to only true conclusions. Rules
of inference which have this property of leading
from true assumptions to only true conclusions (a property that will be
further explained below) will be called logically
valid, or simply logical, rules of inference.

Again, in ordinary algebra one such rule of inference
is known as the symmetry of identity, and says that if it is true that x=y,
then it may be concluded that y=x. Another such rule in algebra is the asymmetry
of smaller than: if it is true that x<y, then it is true that not y<x. This
last rule allows one to conclude in ordinary algebra that if 0<1, then it
is not true that 1<0.
Ordinary algebra also shows the reason for the
introduction of variables: with them one can write statements and rules that
are true of any numbers, such as "x>=x" and "if x>y then
not y>x", in the end simply on the basis of the assumption that one
may uniformly replace number-variables in algebraic statements by any
number-constants.
A propositional
logic is a
formal logic the variables
of which stand for propositions, and that has, apart from propositional
variables, only propositional logical
constants and punctuation signs.
A proposition is a statement that is true or false.
The fundamental logical constants
in CPL are normally taken to be
"and", "or" and "not", and
one of the main ends of setting up a logic is to be clear about what does
and does not follow from propositions with those terms, and in what sense.
The punctuation signs that will be used to mark grouping are brackets and spaces.

So a propositional logic differs from ordinary algebra in
being about propositions, where propositions are statements that are either
true or false, and the "being about" is because a propositional
logic is a formal language with variables for propositions, in the same sort
of way algebra has variables for numbers, and hence is about numbers.
The property that characterizes propositions (that a
proposition is a statement that is either true or false) makes propositions
differ from other kinds of statements (or sentences) like questions,
exclamations, exhortations, orders, excuses etc.
One interesting feature of basic propositional logic is
that it presumes the notion of truth and
falsity, and does not define them: all that is assumed is that each and any
proposition has the property of being true or false, and that these
properties can be used to define further aspects of propositions.
The fundamental constants of ordinary algebra are +, *, -,
: and =, <, allowing statements like "x+y=y+x", "x+0=x"
etc. that the reader must have some familiarity with. And in algebra, spaces
and brackets are also used to mark grouping, as in "(x+(y+z)) = ((x+y)+z)".
2. Propositional Logic:
The fundamental constants of CPL are as in the following
table, in which "iff" means "if and only if":

Logical constant   Read as   Name          Property
&                  and       conjunction   "p&q" is true iff p is true and q is true
V                  or        disjunction   "pVq" is true iff p is true or q is true
~                  not       denial        "~p" is true iff p is not true
So we get the possibility of writing "p&q"
reading it as "p is true and q is true" (presuming that
"p" and "q" may be uniformly replaced by any
proposition), and similarly as indicated in the above table.
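The truth conditions just given can be sketched as ordinary Boolean functions. This is an illustrative sketch only: the function names neg, conj and disj are chosen here for the exposition and are not part of CPL.

```python
# The three constants of CPL as Boolean truth functions.
# The names neg, conj and disj are chosen here for illustration only.

def neg(p: bool) -> bool:
    """"~p" is true iff p is not true."""
    return not p

def conj(p: bool, q: bool) -> bool:
    """"p&q" is true iff p is true and q is true."""
    return p and q

def disj(p: bool, q: bool) -> bool:
    """"pVq" is true iff p is true or q is true."""
    return p or q
```

For example, disj(conj(True, False), neg(False)) computes the truth of "(p&q)V~q" when p is true and q is false.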
It is possible to add other logical constants, such as
"==>" which is named implication and "p==>q" is
read as "if p then q", but one can define implication in terms of
& and ~ or V and ~, as will be shown below, and in any case the notions
of and terms for "and", "or" and "not" are very
fundamental for any language.
Now just as with ordinary algebra, we need a syntax to
write our formulas and know which expressions are, and which are not,
formulas of CPL. This comes about as follows, and has been illustrated and
used already:
The syntax of CPL is
derived from the syntax of natural language by the following abstractions
and simplifications:
We assume, for the purposes of
Classical Propositional Logic, that:

1. natural language consists of statements that are either true or false;
2. each statement may be fronted by the one-place operator "it is not true
that", briefly "not", which produces a new statement called its denial;
3. each two statements may be joined into one statement by "and",
producing a new statement called the conjunction of the statements;
4. each two statements may be joined into one statement by "or",
producing a new statement called the disjunction of the statements.

One reason to be so explicit about this is to show what
CPL is not about:
This abstracts from (in the sense: knowingly disregards)
non-statements, that are not true or false, like questions and exclamations,
and it also abstracts from other operators, like "it may be the case
that", "if", and many others.
What remains is very simple and very general indeed, and
consists of:
simple statements, namely those without any operators, that are either true or false;
the denials of simple statements; and
the compound statements that may be formed by joining
already formed statements by "and" or "or".

Again, all of these compound statements are assumed to be
either true or false, as in Natural Language, and as indeed will follow from
the assumptions we will make.
What we now have abstracted may be considered as a rather
natural core of any reasoning with statements, at least when this happens
using the logical constants "and", "or" and
"not" to form statements. It may be extended by further
assumptions, and may be further clarified and made amenable to a formal
treatment by the following assumptions, that introduce notation and
terminology:
Operators of CPL

"~" is a 1-place operator.
"&" and "V" are 2-place
operators.

"~" corresponds to "it is not true
that". It is said to be 1-place because when it is written before a
statement "X" a new statement "~X" is formed.
"&" corresponds to "and" and is said to be 2-place
because when it is written between two statements "X" and
"Y" a new statement "X&Y" is formed, and similarly for
"V", which corresponds to "or".
One reason to introduce a new notation is to be able to
write formulas that are as brief as possible, and are free of any irrelevant
symbols.
Another reason is that the new terms will receive a more
precise meaning by further assumptions than the terms in natural language to
which they correspond.
It is possible, and for some purposes convenient, to
write "&" and "V" not between but before the two
statements they are meant to connect into one new proposition, but as the
notation adopted is much easier to read than the alternative considered in
this paragraph I merely mention it.
Next, we introduce notation for statements or
propositions:
Propositional variables:

X is a propositional variable.
If X is a propositional variable, X' is another propositional variable.

A propositional variable is a term that may be uniformly
substituted in a formula by a proposition. A proposition is a statement that
is either true or false. Formulas are built from propositional variables and
operators according to the rules given below. A uniform substitution of a
propositional variable "P" in a propositional schema S by a
proposition such as "it rains" is a replacement of ALL occurrences
of "P" in S by "it rains": uniform substitution means
that copies of the same variable are replaced by copies of the same constant
(or of the same other variable: "relettering").
According to the two clauses above there are the propositional
variables X, X', X'', X''' etc. It is often convenient to use any letter of the
alphabet, and write (A&(B&C)) rather than (X&(X'&X'')), and
this convention will be used without mention from now on. The reason to make
this a convention rather than a postulate that e.g. a,b,...z are
propositional variables is to have simple postulates not cluttered by what
may be made conventional.
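Uniform substitution as described above replaces every occurrence of a variable by the same proposition. A minimal sketch over formulas represented as strings (the function name substitute is mine; since variables like X and X' share a prefix, a careful treatment would substitute on parse trees, so single-letter variables like A, B, C are assumed here):

```python
def substitute(schema: str, variable: str, proposition: str) -> str:
    """Uniform substitution: replace ALL occurrences of `variable` in
    `schema` by `proposition`.  A string-based sketch; with variables
    like X and X', which share a prefix, one would substitute on parse
    trees instead, so single letters A, B, C are assumed here."""
    return schema.replace(variable, proposition)
```

For example, substituting "it rains" for "A" in "(A&(B&A))" replaces both occurrences of "A", yielding "(it rains&(B&it rains))".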
Formulas:

1. If X is a propositional variable, (X) is a
formula.
2. If X is a formula and s is a 1-place operator, (sX) is a formula.
3. If X is a formula and X' is a formula and c is a 2-place
operator, (XcX') is a formula.
4. Nothing else is a formula.

According to 1..4 and using conventional relettering,
((~(A)&(B))V~(C)) is a formula. This can be seen as follows: (X) is a formula
by (1), and therefore ~(X) is by (2). (X') is a formula by (1) and so (~(X)&(X'))
is by (3). ~(X'') is a formula as before, and therefore ((~(X)&(X'))V~(X'')) is,
by (3). Relettering X by A, X' by B and X'' by C yields the desired result.
The example shows that we can delete the brackets
produced by rule (1) without ambiguity, i.e. write ((~A&B)V~C) instead,
and also that we can delete the outermost brackets, resulting in
(~A&B)V~C. These simplifications will be made in what follows without
mention. Rules for the systematic deletion of brackets without introducing
ambiguity may be given, but this will not be done since we will not deal with
very long formulas.
Note that (4) has a positive function: It enables us to
conclude that whatever cannot be formed by (1)..(3) is not a formula, which
we could not do without it.
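Rules (1)..(4), together with the bracket-deletion conventions, can be turned into a small decision procedure. A sketch in Python (the names is_variable and is_formula are mine; by the relettering convention it treats any capital letter, primed or not, as a propositional variable, and it is permissive about the bracket deletions described above):

```python
def is_variable(t: str) -> bool:
    # X, X', X'', ... or, by the relettering convention, any capital letter.
    return (len(t) >= 1 and t[0].isalpha() and t[0].isupper()
            and all(c == "'" for c in t[1:]))

def is_formula(s: str) -> bool:
    """Decide well-formedness by rules (1)-(4), allowing the bracket
    deletions described in the text."""
    if is_variable(s):                                # rule (1), brackets dropped
        return True
    if s.startswith("~") and is_formula(s[1:]):       # rule (2), brackets dropped
        return True
    if s.startswith("(") and s.endswith(")") and is_formula(s[1:-1]):
        return True                                   # strip one bracket pair
    depth = 0                                         # rule (3): split at a
    for i, c in enumerate(s):                         # connective not enclosed
        if c == "(":                                  # in brackets
            depth += 1
        elif c == ")":
            depth -= 1
        elif depth == 0 and c in "&V":
            if is_formula(s[:i]) and is_formula(s[i + 1:]):
                return True
    return False                                      # rule (4): nothing else
```

So is_formula("((~A&B)V~C)") accepts the example formula, while strings like "(A&)" are rejected by falling through to rule (4).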
3. Formal Semantics of CPL: A
semantics for a language states the meanings
of its terms. A formal semantics for CPL states
the conditions under which its formulas are true or false. Accordingly, it
involves a functional assignment of the terms "true" and
"false", or abbreviations thereof, to propositions, and one of
the things that makes a formal semantics differ from a semantics for a
natural language is that in a formal semantics the non-logical terms
may be, and usually are,
variables.
We shall use the following:
Conventional notation:

If "p" is a proposition, "[p]" is the truth-value
of "p".

This abbreviates the more normal functional expression
"value(p)" and is introduced merely for convenience.
In semantical postulates the "p" in
"[p]" is ANY formula, whether simple or compound, i.e. with or
without operators. Uniform substitution is presumed, i.e. if one substitutes
e.g. "q" for "p" in a formula and "p" occurs
elsewhere in the formula it too must be substituted by "q". (If one
wanted to invoke set theory at this point, one would say that it is assumed
here that there is a function from the set of propositions to the set of
truth-values, and that if "p" is any proposition, "[p]"
is the value the function assigns to "p".)
The basic postulate is BV:
The postulate of bivalence:

Every proposition has the value 1 or the value 0.

This corresponds to "every
proposition is true or not", but with the difference that using
the numbers 1 and 0 instead of the predicate "is true" and its
denial enables us to use arithmetic and algebra later on.
The reason it is said "every proposition is true or
not" (rather than: "true or false") is to be logical and to
reserve the use of "false" for
possible other uses than a synonym of "not true".
The use of arithmetic can be avoided, but to do so tends
to complicate arguments, and obscures a definite analogy between
arithmetic and propositional logic that was first noted and developed by Boole.
Besides, in the present treatment propositional logic is
founded on one's knowledge of English and basic mathematics. The reader is assumed to know his
or her "three Rs": reading, writing and arithmetic, and
propositional logic is based on that knowledge.
The postulate of bivalence can be formulated as follows,
introducing algebra, it being presumed that the "p" in "[p]" may be any
proposition of CPL, with or without operators, and of arbitrary (finite)
length:

BV    [p]=1 or [p]=0    Postulate of Bivalence

The three postulates for CPL deal with the three
operators ~, & and V. The postulates are as follows, with the same
presumption as before:

P~    [p]+[~p]=1              Negation Postulate
P&    [p&q]=[p][q]            Conjunction Postulate
PV    [pVq]=[p]+[q]-[p][q]    Disjunction Postulate

P~ enables us
to find the value of a denial of a proposition from the value of the
proposition and conversely, because, using arithmetical laws, which we
assumed known at the outset, it amounts to [~p]=1-[p]. P& defines a
conjunction as the product of the
values of the conjuncts. PV defines a
disjunction as the sum of the values of the disjuncts diminished by the product
of the values of the disjuncts.
The reason for the occurrence of "-[p][q]" in PV can be seen when considering
all possible values:

                   [p]+[q]-[p][q]
[p]=1   [q]=1      1 + 1 - 1·1 = 1
[p]=1   [q]=0      1 + 0 - 1·0 = 1
[p]=0   [q]=1      0 + 1 - 0·1 = 1
[p]=0   [q]=0      0 + 0 - 0·0 = 0

The subtrahend "[p][q]" makes no difference in the last three
lines, and takes care that the sum does not exceed 1 in the first line, thus
preserving BV.
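This case-by-case check can be done mechanically. A small sketch (the function name disjunction_value is mine) that runs through all values allowed by BV and compares [p]+[q]-[p][q] with the truth condition for "or":

```python
def disjunction_value(p: int, q: int) -> int:
    """PV: [pVq] = [p]+[q]-[p][q]."""
    return p + q - p * q

# Exhaustive check over the values permitted by BV: the arithmetic
# agrees with the truth condition for "or" in every case.
for p in (0, 1):
    for q in (0, 1):
        assert disjunction_value(p, q) == (1 if (p == 1 or q == 1) else 0)
```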
Note that PV can be avoided by using instead De Morgan's
definition of V:

(1)  [pVq] = [~(~p&~q)]              Definition
(2)        = 1-[(~p&~q)]             P~, (1)
(3)        = 1-[~p][~q]              P&, (2)
(4)        = 1-(1-[p])(1-[q])        P~, (3)
(5)        = 1-(1-[p]-[q]+[p][q])    Algebra, (4)
(6)        = [p]+[q]-[p][q]          Algebra, (5)

This also shows how we can reason about the values of
propositions (which values propositions have by our axiom and postulates),
simply using the definitions, uniform substitutions and algebra: here in fact
we reduce CPL to a small and simple part of standard algebra, that is presumed
known to the reader.
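The same reduction can be checked by computation: composing the arithmetic of P~ and P& according to De Morgan's definition yields exactly the Disjunction Postulate. A sketch (the function names are mine):

```python
def neg_value(p: int) -> int:
    """P~ rearranged: [~p] = 1-[p]."""
    return 1 - p

def conj_value(p: int, q: int) -> int:
    """P&: [p&q] = [p][q]."""
    return p * q

def disj_via_de_morgan(p: int, q: int) -> int:
    """De Morgan's definition: [pVq] = [~(~p&~q)] = 1-(1-[p])(1-[q])."""
    return neg_value(conj_value(neg_value(p), neg_value(q)))

# The definition yields exactly the Disjunction Postulate PV.
for p in (0, 1):
    for q in (0, 1):
        assert disj_via_de_morgan(p, q) == p + q - p * q
```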
4. Semantical Theorems of CPL: We have explained the grammar and semantical postulates
of CPL, and can proceed to give some theorems with proof.
Indeed, what we in fact have achieved at this point
(barring some unclarities, the avoidance of which would require too much fuss)
is a relation between formulas of CPL and algebra, that allows us
to reason algebraically about formulas of CPL.
So the proofs that follow are again simply arguments in
English and algebra about the values of the propositions of CPL. And
established theorems (in fact algebraic identities) are used later on in the
same way as original assumptions, again according to common algebraic
conventions.
Also, I shall supply commonly used names in many cases,
but leave most comments out as CPL is very well known. The reason to list
these theorems is to have a foundation for the system of Extended
Propositional Logic, to be founded on top of CPL and indeed extending it.
T1   [~~p]=[p]   Double denial

(1)  [~p]+[~~p] = 1           P~ (putting "~p" for "p")
(2)  [~p]+[~~p] = [p]+[~p]    P~, (1)
(3)  [~~p] = [p]              Algebra, (2)

T1 says that the double denial of a proposition has the
value the proposition has.
T2   [p&p]=[p]=[p][p]   Idempotence of &

(1)  [p&p] = [p][p]   P&
(2)        = [p]      Algebra and BV, as 1·1=1 and 0·0=0

T2 says that the conjunction of a statement and itself
has the value of the statement.
T3   [p][~p]=0=[p&~p]   Contradiction

(1)  [p][~p] = [p](1-[p])    P~
(2)          = [p]-[p][p]    Algebra, (1)
(3)          = [p]-[p] = 0   Algebra, T2
(4)  [p][~p] = [p&~p]        P&

T3 says that the conjunction of a proposition and its
denial is always false.
T4   [pV~p]=1   Tautology

(1)  [pV~p] = [p]+[~p]-[p][~p]   PV
(2)         = [p]+[~p]           T3, (1)
(3)         = 1                  P~, (2)

T4 says that the disjunction of a proposition and its
denial is always true.
T5   [~(pV~p)]=[p&~p]   De Morgan's rule

T5 is a version of De Morgan's theorem in one variable.
It is proved by PV, P~ and T3, and says when a conjunction and a disjunction
amount to the same.
Using these theorems about only one propositional
variable we can easily restate their import in a table:

Possibilities   p   ~p   ~~p   p&p   p&~p   ~(p&~p)   pV~p   ~(pV~p)
p               1   0    1     1     0      1         1      0
~p              0   1    0     0     0      1         1      0

The first row apart from the first column displays
propositional schemes. The first column ("Possibilities")
is unusual in this kind of tabular representation of propositional logic,
since it is usually absent and replaced by a copy of the second and third
columns that list the possible truth-values of p. However, explicitly stating
the possibilities, that may be conceived in the present case as respectively
the fact represented by the formula p and the fact represented by the formula
~p, will turn out to be very helpful later (since it preempts the need for a
third truth-value in such tables, as we shall see).
In any case, the first column(s) in all the tables that
follow will intuitively display all possible facts that may be represented by
the propositional variables used, and truth-values accordingly are supposed
to be given by a function from facts represented by simple propositions to
values of propositional schemes that contain some or all of these simple
propositions and no other propositions.
T6   [p]=[p&q]+[p&~q]   Expansion

(1)  [p&q]+[p&~q] = [p][q]+[p][~q]   P&
(2)               = [p]([q]+[~q])    Algebra, (1)
(3)               = [p]              P~, (2)

T6 says that any proposition can be expanded in terms of
any other proposition.
T7   [p&q]=[q&p]   Commutativity of &

(1)  [p&q] = [p][q]   P&
(2)        = [q][p]   Algebra, (1)
(3)        = [q&p]    P&, (2)

T7 says that the order of conjuncts doesn't matter for
the truth of a conjunction.
T8   [(p&q)&r]=[p&(q&r)]   Associativity of &

(1)  [(p&q)&r] = [p&q][r]    P&
(2)            = [p][q][r]   P&, (1)
(3)            = [p][q&r]    P&, (2)
(4)            = [p&(q&r)]   P&, (3)

T8 extends T7.
T9   [p&q]=[p&(p&q)]

(1)  [p&q] = [p][q]      P&
(2)        = [p][p][q]   T2, (1)
(3)        = [p][p&q]    P&, (2)
(4)        = [p&(p&q)]   P&, (3)

T9 is used in the proof of T10, that follows:
T10   [pVq]=[p&q]+[p&~q]+[~p&q]   Analysis of disjunction

(1)  [pVq] = [p]+[q]-[p][q]                     PV
(2)        = [p&q]+[p&~q]+[q&p]+[q&~p]-[p&q]    T6, (1)
(3)        = [p&q]+[p&~q]+[~p&q]                Algebra and T7, (2)

T10 analyses a disjunction of p and q into one of
precisely three conjunctions. (T7 is needed to get from (2) to (3).)
T11   [p&q]+[p&~q]+[~p&q]+[~p&~q]=1   All binary classical possibilities

(1)  [p&q]+[p&~q]+[~p&q]+[~p&~q]
(2)   = [p][q]+[p][~q]+[~p][q]+[~p][~q]   P&, (1)
(3)   = [p]([q]+[~q])+[~p]([q]+[~q])      Algebra, (2)
(4)   = [p]+[~p] = 1                      P~, (3)

T11 gives all four possibilities involved in two
alternatives.
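T11 can also be confirmed by direct computation: for every assignment of values, the four conjunctions sum to exactly 1. A sketch using the postulates as arithmetic (the function name t11_sum is mine):

```python
def t11_sum(p: int, q: int) -> int:
    """[p&q]+[p&~q]+[~p&q]+[~p&~q], computed via P& and P~."""
    not_p, not_q = 1 - p, 1 - q                            # P~
    return p * q + p * not_q + not_p * q + not_p * not_q   # P&

# T11: the four conjunctions always sum to exactly 1.
for p in (0, 1):
    for q in (0, 1):
        assert t11_sum(p, q) == 1
```

Intuitively, exactly one of the four "possibilities" obtains in each case, so exactly one product is 1 and the rest are 0.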
T12   [~(p&q)]=[~pV~q]   De Morgan for &

(1)  [~(p&q)] = 1-[p&q]                 P~
(2)           = [p&~q]+[~p&q]+[~p&~q]   T11, (1)
(3)           = [~pV~q]                 T10 and T1, (2)

T12 is the law of De Morgan (already known in Antiquity,
incidentally) mentioned earlier.
Here are some of the theorems just proved in terms of
truth tables:
Possibilities   p&q   pVq   pVq iff ~(~p&~q)   p&qVp&~qV~p&qV~p&~q
p q             1     1     1                  1
p ~q            0     1     1                  1
~p q            0     1     1                  1
~p ~q           0     0     1                  1

Finally, here are two fundamental theorems about &
and V in CPL:
T13   [p&(qVr)]=[(p&q)V(p&r)]   Distribution of & over V

(1)  [p&(qVr)] = [p][qVr]                   P&
(2)            = [p]([q]+[r]-[q][r])        PV, (1)
(3)            = [p][q]+[p][r]-[p][q][r]    Algebra, (2)
(4)            = [p&q]+[p&r]-[p][q][p][r]   P& and T2, (3)
(5)            = [p&q]+[p&r]-[p&q][p&r]     P&, (4)
(6)            = [(p&q)V(p&r)]              PV, (5)

T13 is the law of distributing & over V. This is
fairly intuitive, but in CPL V also distributes over &:
T14   [pV(q&r)]=[(pVq)&(pVr)]   Distribution of V over &

(1)  [(pVq)&(pVr)] = [pVq][pVr]             P&
(2)  = ([p]+[q]-[p][q])([p]+[r]-[p][r])     PV, (1)
(3)  = [p][p]+[p][q]-[p][p][q]
       +[p][r]+[q][r]-[p][q][r]
       -[p][p][r]-[p][q][r]+[p][p][q][r]    Algebra, (2)
(4)  = [p]+[p][q]-[p][q]
       +[p][r]+[q][r]-[p][q][r]-[p][r]
       -[p][q][r]+[p][q][r]                 T2, (3)
(5)  = [p]+[q][r]-[p][q][r]                 Algebra, (4)
(6)  = [p]+[q&r]-[p][q&r]                   P&, (5)
(7)  = [pV(q&r)]                            PV, (6)
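Both distribution laws can be verified exhaustively over the eight possible assignments of values to p, q and r. A sketch using the postulates as arithmetic (the function names val_and and val_or are mine):

```python
def val_and(p: int, q: int) -> int:
    """P&: [p&q] = [p][q]."""
    return p * q

def val_or(p: int, q: int) -> int:
    """PV: [pVq] = [p]+[q]-[p][q]."""
    return p + q - p * q

# T13 and T14: in CPL each of & and V distributes over the other.
for p in (0, 1):
    for q in (0, 1):
        for r in (0, 1):
            assert val_and(p, val_or(q, r)) == val_or(val_and(p, q), val_and(p, r))  # T13
            assert val_or(p, val_and(q, r)) == val_and(val_or(p, q), val_or(p, r))   # T14
```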

At this point it is possible to prove that all rules of
propositional logic stated in the previous section are valid, provided one
uses the following definitions of other propositional operators:

>     [(~pVq)] = [(p>q)]             Def >
IFF   [(p>q) & (q>p)] = [p iff q]    Def iff

The proofs all follow the same format: Take the premisses
of the assumed rule as a conjunction and show that given the semantical
assumptions each assumed rule that starts with a true premiss must have a
true conclusion. Of course, the reason valid rules of inference are so
desirable is that once one knows a rule of inference is valid, one knows one
can use it to infer new truths from known or assumed truths, precisely
because a valid rule of inference can infer only truths from truths.
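The proof format just described can itself be sketched as an exhaustive check: a rule is valid iff in every case where the conjunction of its premisses has value 1, its conclusion has value 1. Here is such a check for modus ponens (from p>q and p, infer q), using Def > (the function names are mine):

```python
def val_not(p: int) -> int:
    return 1 - p                       # P~

def val_and(p: int, q: int) -> int:
    return p * q                       # P&

def val_or(p: int, q: int) -> int:
    return p + q - p * q               # PV

def val_implies(p: int, q: int) -> int:
    return val_or(val_not(p), q)       # Def >: [p>q] = [~pVq]

def modus_ponens_is_valid() -> bool:
    """Valid iff whenever the conjunction of the premisses p>q and p
    has value 1, the conclusion q has value 1."""
    for p in (0, 1):
        for q in (0, 1):
            if val_and(val_implies(p, q), p) == 1 and q != 1:
                return False
    return True
```

The same pattern applies to any of the usual rules: conjoin the premisses, run through all assignments, and check that a true premiss never leads to a false conclusion.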
