Lecture Notes on Propositional Logic (6)
Declaration
This note was prepared based on the book Logic for Computer Science by A. Singha.
1 Introduction
Our goal is to model reasoning as it is used in mathematics and computer science, taking
cues from that found in day-to-day communication. We start with the simplest kind of
reasoning, called reasoning with propositions and connectives. Here are some proposi-
tions:
• No bachelor is married.
• The woman who committed the crime did not have three legs.
• Goldbach’s Conjecture: Every even number bigger than 2 is a sum of two prime
numbers.
As of now, we do not have any way of showing the truth or falsity of these propositions.
However, each of them is either true or false.
We are not defining here what a proposition is. We are only getting familiarized with
the kind of objects in question. A safer way to describe a proposition is to see whether
the question, “Is it true that X?” is meaningful or not. If it is, then X is a proposition;
otherwise, X is not a proposition.
The sentences which are not propositions include questions, orders, exclamations, etc., with which we would not like to associate a truth value. For example, we do not know how to say whether "Is the night sky beautiful?" is true or false. Similarly, we may not assert that "How beautiful is the morning sky!" is true or false.
Our building blocks here are propositions; we will not try to go beyond them. It is not our concern to determine whether "Each bachelor is married" is true, for we pretend not to know the meanings of the words uttered in the proposition. Our units here are propositions, nothing less and nothing more. However, we seem to know that two propositions such as "I know logic" and "You know logic" can be composed to get another proposition such as "I and you know logic."
We are only interested in propositions and how they are composed to yield other propo-
sitions. This is what we mean when we say that propositions are our building blocks.
Thus, we are interested in the forms rather than the meanings of propositions. Since
propositions can be true or false, we must know how to assign truth values to compound
propositions.
If indeed I like logic and you like logic, then we must agree that the proposition ”I and
you like logic” is true. But what about the proposition:
I like logic and you like logic or you do not like logic?
This is problematic, for we do not know exactly how this compound proposition has been
composed or formed. Which of the following ways must we parse it?
(I like logic and you like logic) or (you do not like logic),
(I like logic) and (you like logic or you do not like logic).
We will use parentheses to disambiguate compound propositions. Moreover, we will start with some commonly used connectives; and if the need arises, we will enrich our formalization by adding new ones. Later, we will also explain what it means for a proposition to "follow from" other propositions.
In the sequel, we will shorten the phrase ”if and only if” to ”iff,” and denote the set of
natural numbers {0, 1, 2, 3, . . . } by N.
Exercises
1. Do the following pairs of sentences mean the same thing? Explain.
(b) Children and senior citizens get concession. Children or senior citizens get
concession.
2. In Smullyan’s island, there are two types of people: knights, who always tell the
truth, and knaves, who always lie. A person there asserts: “This is not the first
time I have said what I am now saying.” Is the person a knight or a knave?
3. In Smullyan's island, a person says A and B, where A and B are two separate sentences. (For instance, A is "I have a brother" and B is "I have a sister.") The same person later asserts A, and then, after a minute, asserts B. Did the person convey the same thing as earlier?
2 Syntax of PL
For any simple proposition, called a propositional variable, we will use any of the symbols
p0 , p1 , . . .. For connectives ‘not’, ‘and’, ‘or’, ‘if . . . then . . . ’, ‘. . . if and only if . . . ’, we use
the symbols ¬, ∧, ∨, →, ↔, respectively; their names are negation, conjunction, disjunc-
tion, conditional, biconditional. We use the parentheses ‘)’ and ‘(’ as punctuation marks.
We also have the special propositions ⊤ and ⊥, called propositional constants; they
stand for propositions which are ‘true’ and ‘false’, respectively. Read ⊤ as top, and ⊥
as bottom or falsum. Both propositional variables and propositional constants are com-
monly called atomic propositions or atoms. So, the alphabet of Propositional Logic,
PL, is the set:
{), (, ¬, ∧, ∨, →, ↔, ⊤, ⊥, p0 , p1 , p2 , . . .}.
(Do not read the commas and dots as part of the alphabet.) Any expression over this alphabet is a string of these symbols, but only some such expressions are propositional formulas, or propositions; and it is only in these that we are interested. The propositions (in
PL) are defined by the following grammar:
w ::= ⊤ | ⊥ | p | ¬w | (w ∧ w) | (w ∨ w) | (w → w) | (w ↔ w)
Here p stands for any generic propositional variable, and w stands for any generic propo-
sition. The symbol ::= is read as ‘can be’; and the vertical bar ‘|’ describes alternate
possibilities (read it as ‘or’). The same symbol w may be replaced by different proposi-
tions; that is what the word ”generic” means. This way of writing the grammatical rules
is called the Backus-Naur form, or BNF for short. The grammar can equivalently be written in terms of the following formation rules of propositions:
(1) ⊤ and ⊥ are propositions.
(2) Each of the propositional variables p0 , p1 , p2 , . . . is a proposition.
(3) If x is a proposition, then ¬x is a proposition.
(4) If x and y are propositions, then (x ∧ y), (x ∨ y), (x → y), and (x ↔ y) are propositions.
(5) Nothing else is a proposition, unless it can be generated by applying the rules (1)-(4) finitely many times.
The set of all propositions is written as PROP. The formation rule (5) is called the closure rule. It states that PROP is the smallest set that satisfies (1)-(4). The 'smallest' is in the sense that A is smaller than B iff A ⊆ B.
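To see how the formation rules work in a program, here is a minimal illustrative sketch in Python (not part of the text): propositions are represented as nested tuples that mirror the grammar, and all the names used (neg, conj, and so on) are our own choices.

# Illustrative sketch only: propositions as nested Python tuples.
# Atoms are the strings 'top', 'bot', 'p0', 'p1', ...; a compound
# proposition is a tuple whose first entry names its top connective.

def neg(w):          # ¬w
    return ('not', w)

def conj(x, y):      # (x ∧ y)
    return ('and', x, y)

def disj(x, y):      # (x ∨ y)
    return ('or', x, y)

def imp(x, y):       # (x → y)
    return ('imp', x, y)

def iff(x, y):       # (x ↔ y)
    return ('iff', x, y)

# The proposition ((p0 ∧ ¬p1) → (p2 ∨ (p3 ↔ ¬p4))) is built bottom-up,
# exactly as the formation rules prescribe:
w = imp(conj('p0', neg('p1')), disj('p2', iff('p3', neg('p4'))))
print(w)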
Propositions are also called PL-formulas and well-formed formulas, wff for short. The
non-atomic propositions are also called compound propositions.
The key fact is that any object that has been formed (generated) by this grammar can
also be parsed. That is, you can always find out the last rule that has been applied to
form a proposition and then proceed backward. Such an unfolding of the formation of a
proposition can be depicted as a tree, called a parse tree.
Consider, for example, the proposition ¬((p0 ∧ ¬p1 ) → (p2 ∨ (p3 ↔ ¬p4 ))). For this proposition, the last rule applied was w ::= ¬w. This means that it is a proposition provided the expression ((p0 ∧ ¬p1 ) → (p2 ∨ (p3 ↔ ¬p4 ))) is also a proposition. Look at the root and its child in the left tree of Figure 1.1.
Further, ((p0 ∧¬p1 ) → (p2 ∨(p3 ↔ ¬p4 ))) is a proposition if both the expressions (p0 ∧¬p1 )
and (p2 ∨ (p3 ↔ ¬p4 )) are propositions (the rule w ::= (w → w)). If you proceed further,
you would arrive at the left parse tree in Figure 1.1. The corresponding abbreviated tree
is on the right.
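To illustrate this unfolding concretely, here is a short recursive routine in Python. It is only a sketch under our own conventions (the input is a fully parenthesized proposition given as a list of tokens, and ~, &, |, ->, <-> are ASCII stand-ins for ¬, ∧, ∨, →, ↔), not something prescribed by the text.

# Illustrative sketch: build the parse tree of a fully parenthesized
# proposition by recursive descent, following the grammar
#   w ::= T | F | p | ~w | (w & w) | (w | w) | (w -> w) | (w <-> w)
BINARY = {'&', '|', '->', '<->'}

def parse(tokens, i=0):
    """Return (parse_tree, next_position) for the proposition starting at i."""
    t = tokens[i]
    if t == '~':                              # last rule applied: w ::= ~w
        sub, j = parse(tokens, i + 1)
        return ('~', sub), j
    if t == '(':                              # last rule applied: w ::= (w o w)
        left, j = parse(tokens, i + 1)
        op = tokens[j]
        if op not in BINARY:
            raise ValueError('expected a binary connective, got ' + op)
        right, k = parse(tokens, j + 1)
        if tokens[k] != ')':
            raise ValueError('missing closing parenthesis')
        return (op, left, right), k + 1
    if t in ('T', 'F') or t.startswith('p'):  # an atom
        return t, i + 1
    raise ValueError('cannot parse at token ' + t)

# ¬((p0 ∧ ¬p1) → (p2 ∨ (p3 ↔ ¬p4))) as a token list:
tokens = ['~', '(', '(', 'p0', '&', '~', 'p1', ')', '->',
          '(', 'p2', '|', '(', 'p3', '<->', '~', 'p4', ')', ')', ')']
tree, _ = parse(tokens)
print(tree)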
Example 1.3. Consider the string (∨(p1 ∧ p2 ) → (¬p1 ↔ p2 )). We cannot apply the rule
for ∨, since to its left is just a parenthesis. But we find that by taking x as ∨(p1 ∧ p2 )
and y as (¬p1 ↔ p2 ), the string appears as (x → y), which can be parsed. Look at the
left tree in Figure 1.2. We cannot parse ∨(p1 ∧ p2 ) any further.
Similarly, the string (∨ → ¬p1 ↔ p2 ) can be parsed in two ways: first with →, and next with ↔. Look at the middle and the right trees in Figure 1.2. None of the expressions ∨, ¬p1 ↔ p2 , and ∨ → ¬p1 can be parsed any further.
Notice that the leaves of the trees of Figure 1.1 are atomic propositions, while the leaves
of the trees in Figure 1.2 are not. The corresponding expressions in the latter cases are
not propositions.
Exercises
1. Which of the following strings are in PROP and which are not? Why?
(a) p0 ∨ (p1 → ¬p2 )
(b) ((p3 ↔ p4 ) ∧ ¬p1 )
(c) ((p5 ) → (p2 ↔ p3 ))
(d) ((p3 ↔ p4 ) ∧ ¬p1 )
(e) (((p0 ∧ ¬(p1 ∨ p2 )) → (p3 ↔ ¬p4 )) ∨ (¬(p5 → p4 ) → ¬p1 ) ∧ p2 )
(f) ((p1 ∧ ¬p1 ) ∨ (p0 → p1 )) ∧ (¬(p0 ∧ ¬¬p1 ) → ((¬p3 ∨ ¬p1 ) ↔ p2 ))
3. Construct parse trees for the following strings, and then determine which of them
are propositions.
Given a proposition, we aim to determine in which way it has been formed. If the
proposition is of the form ¬x, then the x here is uniquely determined from the proposition.
If it is in the form (x ∨ y) for propositions x and y, can it also be in the form (z ∧ u) for
some (other) propositions z and u? Consider the proposition
w = ((p1 → p2 ) ∧ (p1 ∨ (p0 ↔ ¬p2 ))).
We see that w = (x ∧ y), where x = (p1 → p2 ) and y = (p1 ∨ (p0 ↔ ¬p2 )). If we try to write it in the form (z ∨ u), then we are forced to take z = (p1 → p2 ) ∧ (p1 and u = (p0 ↔ ¬p2 )). Here, z and u are not propositions.
Recall that a prefix of a string is obtained by reading the string symbol by symbol from left to right and stopping somewhere. For instance, ((p1 → p2 ) ∧ (p1 is a prefix of the proposition w above. Notice that no proper prefix of w is itself a proposition, since in every proper prefix the parentheses do not match.
Theorem 2.1 (Unique Parsing). Let w be a proposition. Then exactly one of the following happens:
1. w ∈ {⊤, ⊥, p0 , p1 , . . .}.
2. w = ¬x for a unique proposition x.
3. w = (x ∗ y) for a unique connective ∗ ∈ {∧, ∨, →, ↔} and unique propositions x and y.
The theorem is so named because it asserts that each proposition has a unique parse tree,
which demonstrates how it has been formed by applying the rules of the grammar. Of
course, unique parsing does not give any clue as to how to determine whether a given
string over the alphabet of PROP is a proposition or not.
Observe that the unique parse tree for a proposition has only atomic propositions at its leaves, whereas if an expression is not a proposition, then in any parse tree for the expression some leaf will contain an expression other than an atomic proposition.
Algorithm 1 Procedure PropDet
1: Input: Any string x over the alphabet of PL.
2: Output: ’yes’, if x is a proposition, else, ’no’.
3: if x is a (single) symbol and x ∉ {), (, ¬, ∧, ∨, →, ↔} then
4:     Report ’yes’; and stop
5: end if
6: Scan x from left to right to find a substring w in one of the forms
7:     ¬p, (p ∧ q), (p ∨ q), (p → q), (p ↔ q)
8:     where p, q are symbols not in the set {), (, ¬, ∧, ∨, →, ↔}.
9: if no such substring w is found then
10:     Report ’no’; and stop
11: else
12:     Replace the substring w in x by p0
13:     Go to Line 3
14: end if
Let us trace PropDet on the two strings of Example 1.3.

String 1: (∨(p1 ∧ p2 ) → (¬p1 ↔ p2 ))
The string is not a single symbol. Scanning it from left to right for a substring of the form ¬p, (p ∧ q), (p ∨ q), (p → q), or (p ↔ q), we find that (p1 ∧ p2 ) matches the form (p ∧ q); replacing it by p0 gives (∨p0 → (¬p1 ↔ p2 )). In the next pass, ¬p1 matches the form ¬p, and the string becomes (∨p0 → (p0 ↔ p2 )). Then (p0 ↔ p2 ) matches the form (p ↔ q), and the string becomes (∨p0 → p0 ). Now no substring of the required forms exists, since the only ‘(’ is followed by ∨, and the string is not a single symbol. Therefore, the output is ’no’: the string is not a proposition, in agreement with Example 1.3.

String 2: (∨ → ¬p1 ↔ p2 )
The string is not a single symbol. Scanning from left to right, ¬p1 matches the form ¬p; replacing it by p0 gives (∨ → p0 ↔ p2 ). In the next pass, no substring of the required forms is found. Therefore, the output is ’no’: this string is not a proposition either.
Correctness of PropDet:
PropDet works by identifying a substring of the given string that is a proposition of one of the minimal forms, and then replacing that substring by the propositional variable p0 . Each such replacement preserves proposition-hood: the string before a replacement is a proposition if and only if the string after it is. This is the invariant of the loop executed by PropDet. Moreover, each replacement strictly reduces the length of the string, so the procedure terminates. It follows that the given string is a proposition if and only if the final string is a single symbol which is an atom; and this is exactly the criterion checked at the beginning of each pass. This completes the correctness argument for PropDet.
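As an illustration, the procedure can be mimicked almost literally in Python by string rewriting. The sketch below is ours, not the text's: to keep the scanning simple it assumes every atom is a single character (T, F, or a lowercase letter), and writes ~ & | > = for ¬ ∧ ∨ → ↔.

import re

# Illustrative sketch of PropDet by literal string rewriting.
SPECIAL = set('()~&|>=')

# a substring of one of the forms ~p, (p&q), (p|q), (p>q), (p=q),
# where p and q are single symbols other than the special ones
PATTERN = re.compile(r'~[^()~&|>=]|\([^()~&|>=][&|>=][^()~&|>=]\)')

def prop_det(x):
    """Return 'yes' if the string x is a proposition, else 'no'."""
    while True:
        if len(x) == 1 and x not in SPECIAL:    # a single atomic symbol
            return 'yes'
        m = PATTERN.search(x)                   # scan for a reducible substring
        if m is None:                           # nothing to reduce: not a proposition
            return 'no'
        x = x[:m.start()] + 'a' + x[m.end():]   # replace it by an atom (the analogue of p0)

print(prop_det('(~(a&b)>(~a=b))'))   # 'yes' -- this string is a proposition
print(prop_det('(|(a&b)>(~a=b))'))   # 'no'  -- analogous to String 1 of Example 1.3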
To write less, we put down some precedence rules and omit the outer parentheses. Recall the precedence rule of arithmetic that multiplication has higher precedence than addition. This means that the expression x × y + z is read as ((x × y) + z), and not as (x × (y + z)). Similarly, for propositions we assign ¬ the highest precedence; ∧ and ∨ the next lower precedence; and → and ↔ the lowest precedence. With these conventions, the proposition
((p1 ∨ (p3 ∧ p6 )) → (p100 ↔ ¬p1 ))
can be abbreviated to
p1 ∨ (p3 ∧ p6 ) → (p100 ↔ ¬p1 ).
Using abbreviations p, q, r, s for p1 , p3 , p6 , p100 , respectively, the abbreviated proposition
becomes:
p ∨ (q ∧ r) → (s ↔ ¬p).
However, you must not stretch the notion of abbreviated propositions too far. For instance, p0 → p1 is not a sub-proposition of p0 → p1 ∧ p2 . The reason is that the abbreviated proposition in full form looks like (p0 → (p1 ∧ p2 )), and the propositions p0 → p1 and (p0 → p1 ) are not even substrings of (p0 → (p1 ∧ p2 )).
Exercises
(ii) ((¬p ↔ (¬q ∧ r)) → (¬p → (r → s)))
(iii) (((¬p ↔ (¬q ∧ r))) → (¬p → ((r → s) ↔ ¬q)))
3 Semantics of PL
3.1 Interpretation
The meaning associated with any proposition is of two kinds, called true and false, for
convenience. In what follows we write 0 for false, and 1 for true. These two tokens,
true and false, or 0 and 1, are called the truth values. Propositions are built from
the atomic propositions with the help of connectives. The propositional constants are
special; ⊤ always receives the truth value true, and ⊥ always receives the truth value
false. Depending on situations, the propositional variables may receive either of the truth
values. We must then prescribe how to deal with connectives.
The common-sense meanings of the connectives ¬, ∧, ∨, →, and ↔ are, respectively, not, and, or, implies, and if and only if. Thus ¬ reverses truth values: if x is true, then ¬x is false; and if x is false, then ¬x is true. When both x and y are true, x ∧ y is true; and when at least one of x or y is false, x ∧ y is false. If at least one of x or y is true, then x ∨ y is true; and if both x and y are false, then x ∨ y is false. Similarly, x ↔ y is true when x and y are true together, or when x and y are false together; x ↔ y is false if one of x, y is true and the other is false. The problematic case is x → y. We will consider some examples to see what we mean by the phrase implies, or, as it is commonly written, if . . . then . . ..
The sentence "if x then y" is called a conditional sentence with antecedent x and consequent y. In mainstream mathematics, the meaning of a conditional sentence is fixed by accepting it as false only when its antecedent is true but its consequent is false. This is problematic in the sense that people normally think of a conditional statement as expressing a causal connection, which this reading does not capture. However, the meaning so given to the conditional statement is not as counter-intuitive as philosophers sometimes project it to be. Let us take some examples.
Your friend asks you whether you have got an umbrella, and you answer, "If I have got an umbrella, then I would not have been wet." Suppose you do not have an umbrella. Then is your statement true or false? Certainly it is not false if you had really been wet. It is also not false even if you had not been wet, because it did not rain at all. That is, the statement is not false whenever its antecedent "I have got an umbrella" is false.
Since this is tricky, we consider one more example. At the bank, you asked me for a pen to fill up a form. Before searching, I just replied, "If I have a pen, I will oblige you." I searched my pockets and bag, but could not find a pen. Looking around, I spotted a friend, and borrowed a pen from him for you. Did I contradict my own statement? Certainly not. I would have done so only if I had a pen and did not lend it to you. Even though I did not have a pen, I obliged you; and that did not make my statement false. That is, the sentence "if x then y" is true whenever its antecedent x is false.
Formally, the assumed association of truth values to the propositional variables is called
a truth assignment. That is, a truth assignment is any function from {p0 , p1 , . . .} to
{0, 1}. An extension of a truth assignment to the set of all propositions that evaluates the
connectives in the above manner is called an interpretation. That is, an interpretation
is any function i : PROP → {0, 1} satisfying the following properties for all x, y ∈ PROP:
1. i(⊤) = 1.
2. i(⊥) = 0.
3. i(¬x) = 1 if i(x) = 0; otherwise, i(¬x) = 0.
4. i(x ∧ y) = 1 if i(x) = i(y) = 1; otherwise, i(x ∧ y) = 0.
5. i(x ∨ y) = 0 if i(x) = i(y) = 0; otherwise, i(x ∨ y) = 1.
6. i(x → y) = 0 if i(x) = 1 and i(y) = 0; otherwise, i(x → y) = 1.
7. i(x ↔ y) = 1 if i(x) = i(y); otherwise, i(x ↔ y) = 0.
The same conditions are also exhibited in Table 1.1, where the symbols u,x,y stand for
propositions. The conditions are called boolean conditions; and such a table is called
a truth table. You must verify that these conditions and the truth table convey the
same thing.
Table 1.1: Truth table for the connectives

⊤   ⊥        u   ¬u        x   y   x ∧ y   x ∨ y   x → y   x ↔ y
1   0        1    0        1   1     1       1       1       1
             0    1        1   0     0       1       0       0
                           0   1     0       1       1       0
                           0   0     0       0       1       1
Alternatively, the boolean conditions can be specified in the following way:
i(⊤) = 1, i(⊥) = 0,
i(¬x) = 1 − i(x),
i(x ∧ y) = min{i(x), i(y)},
i(x ∨ y) = max{i(x), i(y)},
i(x → y) = max{1 − i(x), i(y)},
i(x ↔ y) = 1 − |i(x) − i(y)|.
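These conditions translate directly into a recursive evaluation procedure. The sketch below is only an illustration (the tuple encoding of propositions and all names are our own): it computes i(w) from the truth values that an assignment, given as a Python dict, attaches to the propositional variables.

# Illustrative sketch: compute i(w) by the boolean conditions above.
# Propositions are nested tuples such as ('imp', ('not', 'p0'), ('or', 'p1', 'bot'));
# the atoms are 'top', 'bot', 'p0', 'p1', ...

def interpret(w, i):
    """Return the truth value (0 or 1) of proposition w under the assignment i."""
    if w == 'top':
        return 1
    if w == 'bot':
        return 0
    if isinstance(w, str):                     # a propositional variable
        return i[w]
    op = w[0]
    if op == 'not':
        return 1 - interpret(w[1], i)
    x, y = interpret(w[1], i), interpret(w[2], i)
    if op == 'and':
        return min(x, y)
    if op == 'or':
        return max(x, y)
    if op == 'imp':
        return max(1 - x, y)
    if op == 'iff':
        return 1 - abs(x - y)
    raise ValueError('unknown connective: ' + str(op))

# With i(p0) = 1 and i(p1) = 0, the proposition (¬p0 ∨ p1) gets the value 0:
print(interpret(('or', ('not', 'p0'), 'p1'), {'p0': 1, 'p1': 0}))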
Is the bottom-up way of extending a function from propositional variables to all proposi-
tions well-defined by the required properties? Can there be two different interpretations
that agree on all propositional variables?
Theorem 3.1. Let f : {p0 , p1 , . . .} → {0, 1} be any function. There exists a unique
interpretation g such that g(pj ) = f (pj ) for each j ∈ N.
Convention 1.3. Due to Theorem 3.1, we write the interpretation that agrees with a
truth assignment i as i itself.
The following result implies that if a propositional variable does not occur in a proposition,
then changing the truth value of that propositional variable does not change the truth
value of that proposition.
Theorem 3.2 (Relevance Lemma). Let w be a proposition. Let i and j be two interpre-
tations. If i(p) = j(p) for each propositional variable p occurring in w, then i(w) = j(w).
The Relevance Lemma shows that in order to assign a truth value to a proposition,
it is enough to know how an interpretation assigns the truth values to the propositional
variables occurring in it. We do not need to assign truth values to the propositional
variables which do not occur in the proposition.
Example 3.1. The truth table for (p → (¬p → p)) → (p → (p → ¬p)) is shown in Table
1.2, where we write u = p → (¬p → p), v = p → (p → ¬p), and the given proposition as
u → v.
p ¬p p → ¬p ¬p → p u v u→v
0 1 1 0 1 1 1
1 0 0 1 1 0 0
The first row is the interpretation that extends the truth assignment i with i(p) = 0 to the
propositions ¬p, p → ¬p, ¬p → p, u, v, and u → v. The second row is the interpretation
j with j(p) = 1. We see that i(u → v) = 1 while j(u → v) = 0.
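A table such as Table 1.2 can be generated mechanically by running over all combinations of truth values of the variables occurring in the proposition, which is all the Relevance Lemma requires. The following Python sketch is illustrative only; the proposition is supplied as a 0/1-valued function, and imp encodes the boolean condition for →.

from itertools import product

def imp(x, y):                       # the boolean condition for →
    return max(1 - x, y)

def truth_table(variables, w):
    """Print one row per assignment to the listed variables."""
    print(*variables, 'value', sep='\t')
    for values in product((0, 1), repeat=len(variables)):
        print(*values, w(*values), sep='\t')

# (p → (¬p → p)) → (p → (p → ¬p)), the proposition u → v of Example 3.1
truth_table(['p'], lambda p: imp(imp(p, imp(1 - p, p)),
                                 imp(p, imp(p, 1 - p))))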
Example 3.2. The truth table for u = ¬(p ∧ q) → (p ∨ (r ↔ ¬q)) is given in Table 1.3.

Table 1.3: Truth table for u = ¬(p ∧ q) → (p ∨ (r ↔ ¬q))
p  q  r  p ∧ q  ¬(p ∧ q)  ¬q  r ↔ ¬q  p ∨ (r ↔ ¬q)  u
0  0  0    0       1       1     0          0        0
1  0  0    0       1       1     0          1        1
0  1  0    0       1       0     1          1        1
1  1  0    1       0       0     1          1        1
0  0  1    0       1       1     1          1        1
1  0  1    0       1       1     1          1        1
0  1  1    0       1       0     0          0        0
1  1  1    1       0       0     0          1        1

The first row is the assignment i with i(p) = i(q) = i(r) = 0. This is extended to the interpretation i, where i(p ∧ q) = 0, i(¬(p ∧ q)) = 1, i(¬q) = 1, i(r ↔ ¬q) = 0, i(p ∨ (r ↔ ¬q)) = 0, and hence i(u) = 0.
Exercises
(a) i(r ↔ p ∨ q)
(c) i(r → p ∨ s)
(d) i(¬s ∨ q) ↔ (r ∨ p)
(a) p → (q → (p → q))
(b) ¬p ∨ q → (q → p)
(c) ¬(p ↔ q)
(d) (p ↔ q) ↔ (p → q)
(e) p → (p → q)
(f) (p → ⊥) ↔ ¬p
3. For an interpretation i, we know that i(p) = 1. Then which of the following can be
determined uniquely by i with this information only?
(a) (p → q) ↔ (r → ¬p)
(b) (q → r) → (q → ¬p)
(c) p → (q ↔ (r → p))
(d) p ↔ (q → (r ↔ p))
• p: It is normal,
• q: It is cold,
• r: It is hot,
• s: It is small,
(a) p ∨ q ∧ s
(b) p ↔ q
(c) p → ¬(q ∨ r)
(d) p ∨ (s → q)
(e) p → ¬(q ∧ r)
3.2 Models
Let w be a proposition. An interpretation i is called a model of w, written i |= w, iff i(w) = 1; we also say that i satisfies w, or that w is true under i. If i(w) = 0, we write i ̸|= w and say that i falsifies w.

Example 3.3. In Table 1.3, let i be the interpretation given in the first row; that is, i(p) = i(q) = i(r) = 0. The table says that i ̸|= u. Check that indeed i(u) = 0.
The interpretation j with j(p) = 1, j(q) = j(r) = 0 is a model of u. Which line in Table
1.3 is the interpretation j?
A proposition w is called:
• valid, written |= w, iff every interpretation is a model of w;
• invalid iff it is not valid, that is, some interpretation falsifies it;
• satisfiable iff it has a model;
• unsatisfiable iff it has no model.
Valid propositions are also called tautologies, and unsatisfiable propositions are called contradictions. A proposition that is satisfiable but invalid is called contingent.
Notice that ⊤ is valid, ⊥ is unsatisfiable, and each propositional variable is contingent.
Example 3.4. The proposition p ∨ ¬p is valid, i.e., |= p ∨ ¬p, since each interpretation
evaluates it to 1. Each of its interpretations is its model.
Let u = ¬(p ∧ q) → (p ∨ (r ↔ ¬q)). Look at Table 1.3. Let i, j be interpretations of u
with i(p) = 1, i(q) = i(r) = 0 and j(p) = j(q) = j(r) = 0. It is clear that i |= u whereas
j ̸|= u. Therefore, u is contingent.
Example 3.5. Categorize the following propositions into valid, invalid, satisfiable, and
unsatisfiable:
You can construct a truth table to answer both (a) and (b). Here is an approach that
avoids the construction of a truth table.
In general, each valid proposition is satisfiable and each unsatisfiable proposition is invalid. Validity and unsatisfiability are dual concepts: if i is an interpretation, then i(w) = 1 iff i(¬w) = 0. This proves the following statement.

Theorem 3.3. Let w be a proposition. Then w is valid if and only if ¬w is unsatisfiable; and w is satisfiable if and only if ¬w is invalid.
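The same brute-force idea classifies a proposition by checking all of its interpretations. The sketch below is only an illustration; the proposition is supplied as a 0/1-valued Python function of its variables, and the three answers correspond to the categories just defined.

from itertools import product

def classify(n_vars, w):
    """Classify a proposition given as a 0/1-valued function of n_vars variables."""
    values = [w(*vals) for vals in product((0, 1), repeat=n_vars)]
    if all(values):
        return 'valid (hence satisfiable)'
    if not any(values):
        return 'unsatisfiable (hence invalid)'
    return 'contingent: satisfiable but invalid'

print(classify(1, lambda p: max(p, 1 - p)))       # p ∨ ¬p : valid
print(classify(1, lambda p: min(p, 1 - p)))       # p ∧ ¬p : unsatisfiable
print(classify(2, lambda p, q: max(1 - p, q)))    # p → q  : contingent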
Exercises
1. Categorize the following propositions into valid, invalid, satisfiable, and unsatisfiable:
1. p → (q → p)
3. p ∧ (p → q) → q
4. (¬p → ¬q) → ((¬p → q) → p)
5. p ∨ q ↔ ((p → q) → q)
6. p ∧ q ↔ ((q → p) → q)
7. (¬p ∨ q) → ((p ∨ r) ↔ r)
8. (p ∧ q ↔ p) → (p ∨ q ↔ q)
9. (p ∨ q → p) → (q → p ∧ q)
11. (q → p) → p
13. ((p ∧ q) ↔ p) → q
In the context of reasoning, it is important to determine whether saying this is the same as saying that; this is the notion of equivalence. Along with it, we must also specify the meaning of "follows from." An argument is conventionally written in the form:
w1 , w2 , . . . , wn . Therefore, w.
The propositions wi may not be valid. The argument compels us to imagine a world
where all of w1 , w2 , . . . , wn become true. In any such world, it is to be checked whether
w is also true. In order for the argument to be declared correct, all those interpretations
which are simultaneously satisfying all the propositions w1 , w2 , . . . , wn must also satisfy
w.
Let Σ be a set of propositions and let w be a proposition. An interpretation i is called a model of Σ iff i is a model of each proposition in Σ. The set Σ is called satisfiable iff Σ has a model. Σ semantically entails w, written Σ |= w, iff each model of Σ is a model of w. Σ |= w is also read as "w follows from Σ" and also as "the consequence Σ !⊢ w is valid." For a consequence Σ !⊢ w, the propositions in Σ are called the premises or hypotheses, and w is called the conclusion.
Thus, Σ |= w if and only if for each interpretation i, if i falsifies w, then i falsifies some
proposition from Σ. Moreover, {w1 , w2 , . . . , wn } |= w if and only if w1 ∧w2 ∧· · ·∧wn |= w.
It also follows that propositions x and y are equivalent, written x ≡ y (that is, i(x) = i(y) for every interpretation i), if and only if x |= y and y |= x.
For example, let us show that {p → q, ¬q} |= ¬p. We try out each model of the set of premises and check whether it is also a model of the
conclusion. So, let i |= p → q and i |= ¬q. We see that i(q) = 0. Since i(p → q) = 1, we
have i(p) = 0. The only model of all premises is the interpretation i with i(p) = i(q) = 0.
Now, this i is also a model of ¬p. Therefore, the consequence is valid.
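Since only the variables occurring in the premises and the conclusion matter, Σ |= w can be checked by examining finitely many interpretations. Here is an illustrative Python sketch (the representation and names are our own); premises and conclusion are 0/1-valued functions of the same variables.

from itertools import product

def imp(x, y):                       # the boolean condition for →
    return max(1 - x, y)

def entails(n_vars, premises, conclusion):
    """Return True iff every model of all premises is a model of the conclusion."""
    for vals in product((0, 1), repeat=n_vars):
        if all(p(*vals) == 1 for p in premises) and conclusion(*vals) == 0:
            return False
    return True

# p → q, ¬q |= ¬p : the consequence just discussed
print(entails(2, [lambda p, q: imp(p, q), lambda p, q: 1 - q],
              lambda p, q: 1 - p))   # True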
Example 3.8. Show that the following argument is correct (Stoll (1963)):
If the band performs, then the hall will be full provided that the tickets are not
too costly. However, if the band performs, the tickets will not be too costly.
Therefore, if the band performs, then the hall will be full.
We identify the simple declarative sentences in the above argument and build a vocabulary
for translation:
p : the band performs, q : the hall is (will be) full, r : tickets are not too costly.
Then the hypotheses are the propositions p → (r → q), p → r, and the conclusion is
p → q. We check the following consequence for validity:
p → (r → q), p→r !⊢ p → q.
Since there are only three propositional variables, by the Relevance Lemma, there are
23 = 8 interpretations. These are given in the second, third, and fourth columns of the
table below.
Row No. p q r p→r r→q p → (r → q) p→q
1 0 0 0 1 1 1 1
2 1 0 0 0 1 1 0
3 0 1 0 1 1 1 1
4 1 1 0 0 1 1 1
5 0 0 1 1 0 1 1
6 1 0 1 1 0 0 0
7 0 1 1 1 1 1 1
8 1 1 1 1 1 1 1
For the time being, do not read the column for p → q in the above table. You must find
out all (common) models of both p → (r → q) and p → r. They are in rows 1, 3, 5, 7,
and 8. In order for the argument to be correct, you must check whether p → q is true
(evaluated to 1) in all these rows. This is the case.
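The row-by-row check above can also be delegated to a few lines of Python. The sketch below is illustrative only; it enumerates all 2^3 interpretations and confirms that every common model of the premises is a model of the conclusion.

from itertools import product

def imp(x, y):                       # the boolean condition for →
    return max(1 - x, y)

# every common model of p → (r → q) and p → r is a model of p → q
ok = all(imp(p, q) == 1
         for p, q, r in product((0, 1), repeat=3)
         if imp(p, imp(r, q)) == 1 and imp(p, r) == 1)
print(ok)                            # True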
As another example, let us show that {x, x → y} |= y for arbitrary propositions x and y. Here we use the definition of |= directly. So, let i be a model of x and also of x → y.
If i is not a model of y, then i(x) = 1 and i(y) = 0. Consequently, i(x → y) = 0, which
is not possible. Therefore, i is a model of y. Hence, {x, x → y} |= y.
Theorem 3.4. Let u and v be propositions. Then the following are true:
1. u |= v if and only if |= u → v.
2. u ≡ v if and only if |= u ↔ v.
We also observe that if Σ has no model, then Σ |= w for every proposition w.
Proof. Suppose Σ has no model. Then there is no interpretation that satisfies all the propositions in Σ and falsifies a given proposition w. Thus, Σ ̸|= w never holds, which means that Σ |= w.
Theorem 3.7 (RA: Reductio ad Absurdum). Let Σ be a set of propositions, and let w be a proposition. Then:
1. Σ |= w iff Σ ∪ {¬w} is unsatisfiable.
2. Σ |= ¬w iff Σ ∪ {w} is unsatisfiable.
Example 3.11. We may use Monotonicity, Reductio ad Absurdum, and/or the Deduction Theorem to justify
p → (r → q), p → r !⊢ p → q.
By the Deduction Theorem, it is enough to show that p → (r → q), p → r, p |= q; we show the latter. Let i be an interpretation such that i(p → r) = 1, i(p → (r → q)) = 1, and i(p) = 1. If i(r) = 0, then this contradicts i(p → r) = 1; so i(r) = 1. Similarly, from the second and third of these conditions, we have i(r → q) = 1, and hence i(q) = 1. Thus p → (r → q), p → r, p |= q, and therefore the consequence p → (r → q), p → r !⊢ p → q is valid.
Exercises
(b) p ∨ ¬q, p → ¬r |= q → ¬r
(c) p ∨ q → r ∧ s, t ∧ s → u |= p → u
(d) p ∨ q → r ∧ s, s ∨ t → u, p ∨ ¬u |= p → (q → r)
(e) p → q ∧ r, q → s, d → t ∧ u, q → p ∧ ¬t |= q → t
The Indian economy will decline unless cashless transaction is accepted by the public. If
the Indian economy declines, then Nepal will be more dependent on China. The public
will not accept cashless transaction. So, Nepal will be more dependent on China.