CHAPTER 4-2
REASONING DEDUCTIVELY
Knowing the location of all the tourist attractions, and the
distances between every pair of important locations in the
country, is not very valuable knowledge if you cannot put it
together so as to plan a trip efficiently and enjoyably. Some
people have a vast range of information at their finger tips, yet
they do not arrive at sensible conclusions. Though they know a
lot, they reason poorly, and their knowledge may cause more
trouble than benefit.
This chapter explains the nature and use of deductive
methods -- which include both logic and mathematics -- to draw
new and useful conclusions from existing information and
classifications of it.
Formal deductive logic is the first topic. Then we move on
to the logic of mathematics, and then to probability, the branch
of mathematics that is particularly useful in decision-making.
Last, we shall discuss the use of multi-variable analyses to
refine the meaning of the available information.
The "mathematical" methods discussed in the chapter include
the use of simple tables to draw out the meaning of the data so
as to explain the deeper connections that account for the
patterns which we see. One does not even need to use arithmetic
to appreciate the process.
Explaining what is going on is interesting for its own sake.
Additionally, however, it leads into the interpretive processes
of prediction and causal analysis, discussed in the following
chapter, which enable us to deal more effectively with the world
around us.
CLASSICAL DEDUCTIVE LOGIC
The simplest kind of formal logic goes like this: All boys
get their shoes muddy. (We might get this assertion -- call it
"A" -- from informal or from scientific observation.) Shmegeggi
is a boy. (This statement -- call it "B" -- derives from
observation plus the creation and use of our classification
scheme about boys and other creatures.) Therefore, Shmegeggi
gets his shoes muddy. (This conclusion C is a logical deduction
from the two prior premises.)
Deductive logic is the system that enables us to conclude if
A is true and if B is true, then under some specified conditions
one can safely say that C is true. Deduction need not be as
simple as the case of Shmegeggi's shoes. The chains of reasoning
can be long, complex, and probabilistic, but the same sort of
system is at work to draw conclusions based on logic.
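The two-premise pattern can be sketched in a few lines of code. This is a hypothetical toy, assuming we represent the classification scheme as a set; it is not part of the original text:

```python
# Toy sketch of the syllogism above; the set and function names
# are hypothetical illustrations.
boys = {"Shmegeggi", "Tom"}            # premise B: our classification

def gets_shoes_muddy(person):
    """Premise A: all boys get their shoes muddy.
    Premise B: membership in `boys` tells us whether `person` is a boy.
    Conclusion C follows only when both premises apply to `person`."""
    return person in boys

print(gets_shoes_muddy("Shmegeggi"))   # True: C follows from A and B
print(gets_shoes_muddy("Alice"))       # False: premise B fails
```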
Consider this example: You want to know whether it is
raining outside. Your previous experience tells you that if
people are holding open umbrellas over their heads rain is
falling; this observation is major premise A. Minor premise B is
what you see from your window -- that almost everyone who is
carrying an umbrella has it open. You deduce your conclusion C,
that it is indeed raining outside.
Notice that whether your conclusion C is correct depends on
the correctness (truth) of your premises. If you are in a
country where people hold umbrellas over them to shade the sun,
then premise A is wrong in that case, and your conclusion will
probably be in error. Similarly, if your eyes deceive you about
whether people are holding open umbrellas, then premise B is
incorrect, and your conclusion will probably be incorrect too.
In one sense, there is nothing "new" in the conclusion C;
all the information contained in the conclusion is already
contained in the premises. Nevertheless, deduction helps us
understand the world about us because it shows us the connections
among some chunks of information that we already have. Like all
logical-mathematical analysis, deduction is a device for
discovery of the truths that are latent within a set of
statements.
In the case of the umbrellas and the rain, the deductive
method might or might not be more effective than an empirical
investigation. The empirical method of going outside and
extending an upturned palm has the advantage of rendering you
safe from most false premises. (But the premise that water
coming down from above means rain might be incorrect; someone
might be spraying a hose out a window. In a way, all knowledge
is deduced, and all knowledge depends on various premises.) The
advantage of the deductive method in this case is that you do not
have to go outside and get wet to obtain an answer.
We constantly use deduction to help us deal with the world.
You might have run through this line of thought this
morning: Classes meet Monday to Friday; today is Monday;
therefore classes meet today. And off you go to class on the
basis of your deduction, without telephoning to check that
classes are indeed meeting.
Deduction is particularly useful for improving our
understanding of the world in two situations. The first is where
we believe we know the logic of some phenomenon, but have little
information about it. Thus, scientists deduced much knowledge
about nuclear fission from other knowledge about physics, long
before they were able to run actual fission experiments. Their
deductions helped guide the subsequent experiments, making them
much more efficient than pure trial-and-error procedures would
have been; the deductions also warned the scientists to be
careful lest they accidentally blow up their laboratories.
The second situation is when we have more than two premises,
or when the premises themselves, or the relationships among
them, are more complex; then the deductions are not so obvious.
Sophisticated logical analysis can then help us draw conclusions.
Logic's main job is to assist in determining a) whether a given
deduced conclusion is or is not consistent with the premises, and
b) whether we are able to determine if the conclusion is or is
not true -- that is, whether we have enough information to
determine whether Shmegeggi really is a boy.
Until recent decades many people considered logic an arid
subject, useless for practical purposes. Then, as is so often
the way with ideas, computers came along and made excellent use
of the fundamental principles of logic. This is because, like
the true-false nature of logic, a computer only has yes-no
states, with nothing in between.
About theory now: the label "theory" applies to a set of
logically-interrelated propositions that allow the user to deduce
a variety of propositions. That is, for a field to possess
theory it must have a body of interrelated propositions which
covers a substantial portion of the material and problems in the
field, and which is
systematically organized so that it all hangs together and the
various parts are consistent with each other. In this sense, not
all fields have theory; economics and physics do, but sociology
does not.
What is called a "model" is usually a small chunk of theory.
A model is a mini-theory which, like theory generally, focuses
on a few elements abstracted from the totality of reality.
People differ in willingness to construct and believe long chains
of reasoning. Mathematical arguments are the most prominent
example of such long chains. And some mathematicians pride
themselves on their ability to hold long chains of logic in their
minds, as if this is a sign of general intellectual power. I
suggest that this is more like a talent for tightrope walking --
spectacular but seldom practical, and extremely dangerous at all
times. Especially with respect to the study of human behavior,
long chains of reasoning seldom produce reliable knowledge.
Every step in the reasoning depends upon empirical evidence.
Given that all such evidence is at least a bit shaky, the
unreliability quickly mounts up.
Furthermore, one quickly loses one's intuition -- that is,
one's sense of what is going on in the real-life situation in
which one is interested -- when following a long logical chain.
One can then reach the most amazingly wrong conclusions,
unchecked by common sense. (I do not mean to suggest that
science should reach conclusions consistent with what is
considered common sense; indeed, science is only interesting and
useful when it reaches conclusions that differ from common
belief. But a counter-intuitive conclusion usually must be
backed with direct empirical test of the theoretical
proposition.)
Deduction is a method of getting knowledge, just as
gathering data is a method of getting knowledge. An empirically
established fact has a stronger claim to the title "knowledge",
however, than does a conclusion arrived at deductively, because
whenever a deduction and an empirically established fact collide,
the deduction must give way; the empirical demonstration is the
ultimate test. As someone has said, many a beautiful theory has
been slain by an ugly fact. But caution is needed here.
Empirical "facts" that conflict with sound theory too often turn
out to be non-facts.
For example, to explain the alternation of periods of
economic prosperity and depression, W. Stanley Jevons, the famous
British nineteenth-century economist, deduced that business
cycles are caused by sunspots. Business cycles, he calculated,
had a duration of 10.46 years, and sunspots had a cycle of 10.45
years. He reasoned that sunspots caused weather cycles, which
caused rain cycles, which caused crop cycles, which caused
business cycles (Heilbroner, 1968, p. 375). The logic was sound.
But new data showed that the sunspot cycles were longer than
previously believed, and they did not coincide with business
cycles. A new theory of the business cycle therefore had to be
sought.
The contest between deduction and empirical knowledge is not
always easily settled, however. If the empirical knowledge is
shaky, a strong deductive argument may be more persuasive,
particularly when policy decisions must be made on the basis of
available knowledge, or where people concerned about an issue
disagree strongly about premises.
As an example of the policy decision: when considering the
effect of population growth on economic development, for many
years most people put more stock in deduction than in empirical
evidence. Research showed an absence of relationship between the
two variables. But the empirical analyses were somewhat weak,
for many reasons. And the theoretical reasons to believe that
increased population causes lower income seemed very strong.
Therefore, most people advocated population control as a means of
speeding economic development, and many nations (China most
intensely) followed this recommendation, on the basis of
deduction and without empirical support.
Sometimes different people interested in the same phenomenon
try to explain it on the basis of quite different premises, and
consequently they arrive at different explanations. Where the
data are less than conclusive, both sides are likely to continue
relying on their deductions. An example is the question of why
managers and supervisors usually make much more money than the
people they supervise. On the basis of their theoretical
premises, most economists conclude that managers are paid more
because they contribute more to their businesses and are rewarded
accordingly. Some sociologists and economists believe, however,
that managers are paid more because they have more power than
ordinary workers and use the power to take more income than they
deserve. There is some evidence on each side of this question,
but none of it is conclusive (see Wright, 1979; Keyfitz, 1981).
As a result, people on neither side change their minds.
In short, when the accepted theory and the apparent facts
conflict, you should not automatically prefer one to the other.
Instead, you should examine both until you determine which -- or
both -- are in error.
Note that "deduction" is not the same as "theory." Theory
is an interlaced body of existing systematic knowledge, important
because it is a source of premises for deduction.
Let's end this section with the usual admonition: Do not
ignore deduction when it is useful. But do not limit yourself to
its use and neglect other methods.
MATHEMATICS
A group of three people has fifteen absurd coconut pies.
How many pies does each person get if they are divided equally?
Five, of course. The phrase "of course" simply shows how
accustomed we are to using the convenient language of mathematics
to manipulate the original information -- premises of 15 pies,
and 3 people in the group -- into a new and useful form, the
conclusion of 5 pies per person.
The information expressed in the conclusion is latent within
the premises. In a logical sense the conclusion contains no
"new" information. But to the person who does not do arithmetic
division automatically, the conclusion of five pies is quite new,
and very useful. That is, the mathematical manipulation and
rearrangement of the information can bestow information that is
psychologically new even though it is not logically "new" -- new
to us. The fact that mathematics is all a
logical tautology does not make it less valuable.
More generally, the manner in which information is presented
to us matters greatly. Two sides of an equation may be equal to
each other and therefore identical logically, but it matters for
our understanding which half of the equation is written on the
left and which is written on the right.
COMPLEX PROBABILITIES
Before you can assess the value to you of an alternative,
you must estimate the chance that the event will occur. But the
probability of an event often is not obvious. One complication
is the existence of several rather than just two possible
outcomes for a choice. Another complication is that you may be
interested in a combination of outcomes -- for example: What is
the probability of getting a total of eight in a roll of two
dice, each with six sides numbered 1 through 6? Or, what is the
probability that you will strike oil with your next well, and the
price of oil will remain high?
One's untutored intuition often is a poor guide to the
probability of an event. For example, what is the chance that
two or more people among a party of 25 will have the same month-
and-day birthday? People commonly guess that the probability is
1 in 50, or 1 in 100, or even less. In fact the probability is
greater than 1 in 2. Or, what is the probability of both the
mother and father dying in a given year if the probability is 1
in 200 that either one will die? If both the mother and the
father have paying jobs which can support their children, this
probability is relevant to the purchase of life insurance for
them, because only the death of both would be financially
disastrous for the family.
In recent centuries, mathematicians have invented many
formulae to calculate such compound probabilities. But it is
always possible -- and often much less subject to error -- to
estimate compound probabilities by trial-and-error methods, just
as gamblers did before mathematicians made their discoveries.
For example, you can simply roll a pair of dice a hundred or a
thousand times to satisfactorily estimate the chance of rolling
an eight. And you can estimate the likelihood of two people
having the same birthday by putting 365 dated slips into a hat,
then a) drawing a slip 25 times, replacing it after each draw,
b) recording each result, and c) checking whether any birthday
appeared twice. To demonstrate this example to a large class, I
start at one end of the room and ask the students to state their
birthdays in order. The class listens until a duplication
occurs. Then, starting with the next student, the exercise is
repeated until a duplicate birthday is found, and so on. It is
quickly apparent that on average it takes only 23 people for
there to be a duplication. These simulation trials can be done
in jig time with a personal computer nowadays, to as high a level
of accuracy as you wish. 1
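A modern version of the hat-and-slips procedure is easy to run. The sketch below is a hypothetical illustration using Python's standard random module (not the RESAMPLING STATS program of the footnote); it estimates both probabilities by trial and error:

```python
import random

def chance_of_eight(trials=100_000):
    """Estimate the probability that two six-sided dice total eight."""
    hits = sum(1 for _ in range(trials)
               if random.randint(1, 6) + random.randint(1, 6) == 8)
    return hits / trials

def chance_of_shared_birthday(people=25, trials=100_000):
    """Estimate the probability that at least two of `people`
    share a month-and-day birthday (365 equally likely days)."""
    hits = 0
    for _ in range(trials):
        days = [random.randint(1, 365) for _ in range(people)]
        if len(set(days)) < len(days):   # a duplication occurred
            hits += 1
    return hits / trials

print(chance_of_eight())             # close to 5/36, about .14
print(chance_of_shared_birthday())   # greater than 1/2, about .57
```

Each run will differ slightly, but with this many trials the estimates settle close to the exact answers, just as the text promises.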
For simple situations two easy formulae are handy, however:
1. The probability of both event A and event B occurring when they are independent of each other -- for example the
probability of both the mother and the father dying from illness (the chances of each dying in the same auto accident are not
independent, of course) -- is the product of the two probabilities. In the example above, the probability is (1/200) x (1/200) =
1/40,000. Or, what is the chance that a basketball player who on average misses 1 foul shot in 10 (that is, a 90% success rate)
misses three foul shots in a row? Answer: .1 x .1 x .1 = .001, or 1/1,000.
2. The probability of one or the other of two mutually exclusive outcomes of a "trial" is the sum of the probabilities. For
example, if the chance of rain tomorrow is .1, and the chance of snow is .05, the chance of rain or snow is .15. For more complex
probabilities, use trial-and-error simulation instead of formulae, I suggest. You are less likely to go wrong.
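The two rules can be checked in a few lines. The figures below are the ones from the text, and the closing simulation follows the trial-and-error advice just given (a sketch for illustration, not a prescribed procedure):

```python
import random

# Rule 1: independent events -- multiply the probabilities.
p_mother = p_father = 1 / 200
p_both_die = p_mother * p_father        # 1/40,000 = .000025

p_miss = 0.1                            # a 90% foul shooter
p_three_misses = p_miss ** 3            # .001, or 1/1,000

# Rule 2: mutually exclusive outcomes -- add the probabilities.
p_rain, p_snow = 0.10, 0.05
p_rain_or_snow = p_rain + p_snow        # .15

# Trial-and-error cross-check of the three-misses figure:
trials = 1_000_000
hits = sum(1 for _ in range(trials)
           if all(random.random() < p_miss for _ in range(3)))
print(hits / trials)                    # close to .001
```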
CORRELATIONS AND EXPLANATIONS
Let's say that you think you discern a connection between the performance of the Dutch stock market and the monthly number of
housing starts in the United States. You plot these data on a graph, and the connection does seem clear cut. You could push the
analysis further with the mathematics of statistical correlation to compute the extent of the relationship between the two series
of events, but this seems unnecessary because the graph shows the connection so vividly to your eye. The question is: What
meaning(s) should you attach to the observed connection? Should you conclude that you can predict one of these factors with the
other, and if so, which? Or, can you explain one with the other in a causal fashion?
Deciding which interpretation to place upon such a relationship is a very tricky matter. We can distinguish two kinds
of reasoning that we may bring to bear upon such a connection -- a) purely deductive refinement of the data with additional
variables which help bring out their meaning, which we shall proceed to discuss in this chapter, and b) reasoning which goes
beyond pure deduction and enters into the domain of interpretation; this is discussed in Chapter 4-3.
Take a look at the data in Table 4-2-1 on candy eating and marital status. Clearly there is a connection. But what should
one conclude from these data? Does being single cause one to eat candy regularly? Does eating a lot of candy cause one to stay
single? Are there other interpretations that ought to be considered?
Table 4-2-1 (Table 25.1 from Simon and Burstein, p. 312)
Perhaps age is involved. To examine that possibility, look at Table 4-2-2. There it appears as if the important connection
is between age and eating candy, with marital status no longer an interesting connection.
These data may be interpreted in several ways. Should we say that youth causes candy eating? (Pretty clearly, candy eating
does not cause youth.) Should we predict that young people will eat more candy than older people in India? And 50 years from now
in the United States? Such interpretative extensions of the data will be discussed in the following chapter.
Table 4-2-2 (Table 25.2 in Simon and Burstein, p. 313)
Consider another example, the relationship of education to salaries of economists, in Table 4-2-3. Can it really be that
getting more education reduces your income?
Table 4-2-3
Now look at Table 4-2-4. Mystery solved. Within each category of employment, more education is connected with more income.
That connection seems more reasonable, and more satisfying, than a higher salary being connected to less education.
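The education-and-salary reversal is an instance of what statisticians call Simpson's paradox, and it is easy to reproduce with made-up numbers. The counts and salaries below are invented purely for illustration; they are not the data in the book's tables:

```python
# Hypothetical data: within each employment category the more
# educated group earns more, yet pooled together it earns less.
# Format: {category: {education: (number of people, average salary)}}
groups = {
    "academic": {"PhD": (80, 60_000), "MA": (20, 50_000)},
    "business": {"PhD": (20, 90_000), "MA": (80, 80_000)},
}

def mean_salary_by_education(groups):
    """Pool the categories and average salary by education alone."""
    totals = {}
    for category in groups.values():
        for edu, (n, avg) in category.items():
            people, money = totals.get(edu, (0, 0))
            totals[edu] = (people + n, money + n * avg)
    return {edu: money / people
            for edu, (people, money) in totals.items()}

print(mean_salary_by_education(groups))
# {'PhD': 66000.0, 'MA': 74000.0} -- the aggregate reverses
```

The reversal happens because most PhDs sit in the lower-paying academic category; holding the category constant, as Table 4-2-4 does, removes the confusion.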
Refining the data to bring out their meaning is often done with fancy statistical methods. But "elegant" and "sophisticated"
methods use essentially the same principles for augmenting simple relationships with additional variables as is done in these
tables.
FOOTNOTE
1 The language and program RESAMPLING STATS performs all the
Monte Carlo simulations of probability and statistics in an
intuitively clear and simple fashion. For more information,
contact the author.