CHAPTER 4-6 MY FAVORITE WORST SOURCES OF ERRORS

Causes of error abound, as the long list of pitfalls in Chapter 4-5 demonstrates. But some pitfalls are so frequent and so serious that they deserve special attention here. This chapter presents my personal selection of "favorite" pitfalls.

1. Assuming that the source of a problem is the nature of the individual or group involved, when it is mainly the circumstances that cause the probability of the problem to be high ("others" file).

In the Wall Street Journal of June 26, 1989, a front-page story told how a 33-year-old woman, Deborah Dean, ran a corrupt operation involving billions of dollars at the U.S. Department of Housing and Urban Development (HUD). The story referred to her and her associates as "young adults" and a "kiddie corps", and asked: "Where were the grown-ups?" Two pages later in the very same paper, a story about the transfer of power after the Tiananmen Square massacre was headlined "China's Old Men"; it referred to the Chinese leaders as "old guys in their 80s". Both stories imply that the officials' ages are important determinants of their activities and decisions, and suggest that if people of more "appropriate" ages were put into office, the public would be better served. These stories illustrate a recurrent fallacy in human thinking. In their book Human Inference, Richard Nisbett and Lee Ross review data showing that people are likely to err by explaining events with recourse to the characteristics of the actors instead of the structural conditions of the situation. In both China and HUD, the key element is the existence of a governmental entity that puts money and/or power at the disposal of officials. As long as this condition exists, the likelihood of abuse exists. Sometimes the public gets lucky and the officials are honest and humble. But inevitably some officials will take advantage of the opportunities.
The solution, then, is not to call for better officials -- of the right age, say, at which people are free of the urge to turn opportunities for money and power to their own advantage. The solution is to change the system. It is interesting to wonder what Dean and Deng might have done had government jobs not been available and had they applied their talents and energies in other fields instead. Deng might have clambered to the top of a huge firm in a swashbuckling industry such as ocean shipping. And Dean might have started a bang-up temporary employment firm and franchised it across the country -- both enterprises to the great benefit of the public. Perhaps I give both of them too much credit on the skills side. But there is little doubt that the same people who choose the anti-social route in one set of circumstances will choose socially beneficial activities in another set of circumstances. One of the findings of modern psychology is that people tend not to be consistent about whether they are "honest" or "dishonest". An experience of mine illustrates the principle. After I got out of the Navy I took a summer course in organic chemistry to complete my qualifications for entering medical school in the fall; most of the other 200 students also were pre-meds. There were two hours of classes and six hours of laboratory work every day -- forty hours a week, with lots of homework. The instructor put tough competitive pressure on the students to obtain high yields in their lab experiments. The tension in the laboratory rose so palpably that it became obvious that students would begin to cheat. I passed on that observation to a lab assistant, but nothing was changed. Two-thirds of the way through the course the cheating began, and then the system broke down completely. The wholesale cheating was not due mainly to the characters of the students, but rather to the structure of the system.
Indeed, sometimes the only difference is whether the society calls an activity legal and ethical or illegal and despicable. In the United States, a supermarket manager wins profit and praise for stocking delicacies and supplying them to customers who desire them and are willing to pay a handsome price. In the Soviet Union the outcome is different, as this incident shows: a few years back the director of Moscow's Gastronom No. 1 -- a grocery store that sells gourmet foods to the elite -- was put to death for selling delicacies like black caviar and wild boar out the back door. In the U.S., he could have used the front door, and he would have gotten a bonus for his initiative. (One of the great advantages of a market system is that people are enabled to make a wider variety of voluntary exchanges legally than in a system where prices are fixed centrally and where supply does not equal demand at the market price.) Of course individuals' values matter, and people will differ in how they behave in particular circumstances. I suggest only that we realize that the underlying dispositions of people are far from the whole story, and that if we change the circumstances sufficiently we often can alter behavior considerably. Perhaps one reason that so many people prefer to find the cause of corruption in the individuals involved is that this allows them to cluck their tongues at how immoral other people are, which by implication makes them feel very moral themselves because they are not involved in those dirty dealings. Attributing poverty or corruption or other undesired behavior to the "cultures" of particular ethnic groups (while attributing positive characteristics to one's own culture, of course) is a related error. This error -- the "Protestant Ethic" theory of Max Weber is an example -- has long thrived among scholars.
These beliefs are again and again discredited by the extraordinary differences in behavior of the same cultural group in different circumstances, as for example:

1) The low Catholic and Jewish fertility in the United States now, compared to the high fertility of these groups in 19th-century Europe, which was supposedly due to religious doctrine.

2) The lack of the Protestant Ethic among Chinese and Indians, which supposedly accounted for their poverty in China and India but is discredited by how these same people flourish economically outside of China and India.

3) Once in the Indian city of Jabalpur I met Sara Israel, a member of the small Jewish community there. When I asked her what the various persons in the community did for a living, she mentioned people working for the government railway, the army, and several other civil servants. I asked if any were in business and she answered, "It is well known that Jews do not have a talent for business." So much for that cultural stereotype.

Too often the "culture" explanation results from some combination of ahistoricity and mental laziness.

2. My Personal Favorite Among Errors: Not Recognizing the Amount of Variability in a System, and Therefore Jumping to a Conclusion About What Is Going On From Just a Few Observations.

I first noticed this error when I was in the Navy. When the men aboard ship exchanged observations about the European ports at which we called, after every port there were some men who said it was the greatest place ever, whereas others said it was the worst. Nor was this simply a tendency of some men to criticize and of others to praise, because the same man might have a very good experience in one port and then a very bad one in the next, or vice versa. The obvious explanation was that a person could have only a very limited amount of experience in a given port, and the luck of the draw might therefore influence whether the overall impression was positive or negative.
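The port story can be put to a quick numerical check. In the sketch below, all the numbers -- a 0-to-10 quality scale, three experiences per visit, the mean and spread -- are my own illustrative assumptions, not data. Every sailor samples the very same underlying port, yet a noticeable fraction come away calling it wonderful and a similar fraction calling it dreadful:

```python
import random
import statistics

random.seed(42)

# Hypothetical model: every sailor's experience in the same port is a draw
# from one underlying distribution of experience quality (mean 5, sd 2,
# on a 0-to-10 scale). Each sailor gets only a few experiences per visit.
PORT_MEAN, PORT_SD = 5.0, 2.0
EXPERIENCES_PER_VISIT = 3

def visit():
    """Average quality of one sailor's handful of experiences in port."""
    return statistics.mean(
        random.gauss(PORT_MEAN, PORT_SD) for _ in range(EXPERIENCES_PER_VISIT)
    )

impressions = [visit() for _ in range(10_000)]
loved_it = sum(1 for x in impressions if x > 7) / len(impressions)
hated_it = sum(1 for x in impressions if x < 3) / len(impressions)

print(f"Fraction calling the port great: {loved_it:.1%}")
print(f"Fraction calling the port awful: {hated_it:.1%}")
```

The verdicts diverge purely because each observer sees only a handful of draws; raise the experiences per visit from three to thirty and the extreme verdicts nearly vanish.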
The fallacy of "regression to the mean" -- "regression fallacy" for short -- is an interesting special case. The child of very tall parents is likely to be shorter than its parents, though taller than average, and a child of very short parents is likely to be taller than its parents. This is because height is the result of a chance combination of the genes that a person carries, and even if a person carries a set of genes that would on average produce a person somewhat taller than average, a very tall person is an unusual result even for such a person's genes. (Leave aside the complication of there being two parents; the principle is the same.) By analogy, if you know that people on average have jars containing half red and half black jelly beans, while you know that you have a jar containing 60 percent red jelly beans, you predict that the next ten jelly beans you draw are likely to be more than 50 percent red. But it would be unlikely -- though possible -- for you to draw ten that are all red. If you do not know the composition of your jar but you do know that the average for everyone taken together is half and half, and you draw a handful of ten red jelly beans, your best guess would not be 100 percent red for your next handful, but rather something over 50 percent. That models the situation for the heights of children of tall parents. Have you noticed that, when you return to a restaurant where you had a remarkably good meal the first time you ate there, the second meal is seldom as good as the first? That's because the remarkable first meal is likely to have been so good at least partly by chance, and the chance element is unlikely to be repeated. Another example: the Rookie of the Year in professional sports often produces disappointing results the second year in the league. (From now on, perhaps you will not be surprised when this happens.)
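The parent-child height argument can be sketched the same way. The model below is deliberately crude -- one parent rather than two, as in the text's simplification, and invented centimeter figures -- but it shows regression to the mean arising from nothing except chance:

```python
import random

random.seed(0)

# Hypothetical model: each family has a genetic "target" height drawn from
# the population; any individual's height is that target plus chance noise.
# All figures are illustrative, not measurements.
POP_MEAN, GENE_SD, NOISE_SD = 175.0, 5.0, 5.0  # centimeters

pairs = []
for _ in range(100_000):
    target = random.gauss(POP_MEAN, GENE_SD)
    parent = target + random.gauss(0, NOISE_SD)  # one parent, as in the text
    child = target + random.gauss(0, NOISE_SD)   # same genes, fresh chance
    pairs.append((parent, child))

# Look only at very tall parents (well above the population mean).
tall = [(p, c) for p, c in pairs if p > 185]
avg_parent = sum(p for p, _ in tall) / len(tall)
avg_child = sum(c for _, c in tall) / len(tall)

print(f"Average very-tall parent:  {avg_parent:.1f} cm")
print(f"Average of their children: {avg_child:.1f} cm")
```

The children of the very tall group average taller than the population but shorter than their parents: a very tall parent is partly tall genes and partly good luck, and only the genes are passed on.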
An interesting twist on the regression fallacy was observed by Daniel Kahneman in the training of pilots. When the student made a good maneuver, the trainer would say "good" and move on to the next part of the training. When the student made a bad maneuver, the trainer would say "bad" and have the student repeat the maneuver. On average, the next trial of that maneuver would be less bad -- closer to the average just by chance, because only bad maneuvers were followed by another trial. The trainers therefore concluded that criticism had a better effect on performance than praise, and this unwarranted conclusion led them to rely on criticism rather than praise.

3. Overestimating Your Mental Powers Compared to Other People's.

Most of us believe that we know our own situations so much better than another person can know them that we quickly reject advice and ideas about our situations. Sometimes this is well warranted, because the advice-giver really does know too little about the situation at hand to offer good advice; this disregard of dispersed knowledge is one of the main reasons that socialist centralized economic planning fails. But sometimes the advice-giver does know enough about the situation to offer useful advice on the basis of general knowledge, yet is rejected without sufficient hearing. "You don't understand my industry" or "... my family" is too often an automatic response, and often a costly one. That was the universal reaction when I proposed a volunteer auction plan to the airline industry as a way of dealing with the overbooking problem. And it is the reaction of my close friend T when I suggest to him that he do a bit of market research about what consumers like in the garments he sells; he is sure that his accumulated knowledge is sufficiently great that it is not worth spending even a bit of money and time on systematic investigation, though this seems never to have been done in his industry.
Perhaps such self-assurance is necessary, and the absence of it damaging. I probably would have been a better father if, during my children's adolescent years, I had been more sure that I knew what I was doing and that my decisions were good ones. Instead, my lack of confidence led me to indecision and weakness in the face of the kids' determined claim that my wife and I should have the same rules as other parents (except where our rules were more lenient, of course), and that the kids knew best what was good for them; this indecision and weakness were, I now believe, unhelpful and unnecessarily painful. Even if I really did not know well what I was doing, it probably would have been better if I had thought that I did, and acted so. More generally, people tend to have an inflated assessment of their own capacities relative to those of other people. For example, surveys in several countries show that about 80% of the respondents believe that they are better-than-average drivers (Paul Slovic in Against All Odds). Charming recent research by cognitive psychologists has documented how people tend to be overconfident in their mental abilities in a wide variety of test circumstances. (When their actual chances of being right are low -- below, say, 20% -- they tend to be underconfident, however. For a review of this literature see Baron, 1988, pp. 200-202.)

Corollary 3a) Thinking that you can figure out and control processes and events that are unknowable and uncontrollable.

Our belief in our capacities to decipher patterns feeds on the assumption that there must be a pattern to be discovered. (And this assumption in turn is intertwined with a world view which sees the cosmos as a structure of inherent "laws"; see Chapter 6-7 on World Views.) This belief in our pattern-finding abilities emerged strikingly in an experiment I once conducted. I presented a series of randomly-generated numbers to people, saying that they were stock prices.
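Such an experiment is easy to recreate. The sketch below -- an illustrative toy of my own, not the experiment's actual procedure -- builds a "price" series out of nothing but coin flips and then scores one of the pattern-hunters' favorite rules, that an up move is "due" after a down move:

```python
import random

random.seed(7)

# Hypothetical illustration: a "price" series built purely from coin flips.
# Charts of such series are remarkably hard to tell apart from price charts.
def random_walk(n, start=100.0):
    prices = [start]
    for _ in range(n):
        prices.append(prices[-1] + random.choice([-1.0, 1.0]))
    return prices

prices = random_walk(100_000)

# A pattern-hunter's rule: "after a down move, an up move is due."
hits = trials = 0
for i in range(2, len(prices)):
    if prices[i - 1] < prices[i - 2]:      # yesterday was a down move
        trials += 1
        hits += prices[i] > prices[i - 1]  # did an up move follow?

print(f"Rule's success rate: {hits / trials:.3f}")
```

Because each step is an independent coin flip, the rule succeeds about half the time, no better than guessing; and any other rule applied to such a series fares the same in the long run.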
(They might just as well have been actual stock prices, which look just about identical to randomly-generated numbers; see Chapter 8-0 on portfolio selection.) People tried an extraordinary variety of devices to impose pattern on the numbers -- drawing curves, taking averages, making assumptions about which directional movements must follow which (e.g. an up must follow a down), and so on -- even though they had no sound reason in experience or theory to believe that patterns could be found. Even more striking, after I ended the experiment and told the subjects that the numbers were randomly generated, many wanted badly to continue playing and testing their pattern-deciphering systems against other people's results. Another piece of relevant evidence is the resistance of almost all sports fans to the idea that there are no patterns in baseball batting or basketball shooting that allow the coach to identify "slumps" or "hot hands" (see Chapter 1-4[?]). Perhaps it is good that we are not aware of the frailty of our thinking, lest we walk about in fear and trembling -- an idea that has come up earlier in other contexts (see the introduction). Overconfidence in our capacities may also have the beneficial effect of making us optimistic about the future, and of contributing toward our being in a good mood. Studies show that depressives have a less positive view of their capacities in laboratory tests than do "normal" people -- but the depressives' assessments are more accurate in this regard than are those of normal people [cite from Depression]. The good feeling may be worth the damage caused by the overconfidence.

Corollary 3b) Believing that there is a connection between morality and education or "intelligence".

Many people assume that the more "intelligent" people are (whatever is meant by "intelligence"), and the more education they get, the better will be their character. But to my knowledge there are no data showing this to be the case.
One of the reasons that the horrors of World War II had such a crushing effect on many people's world views and their assessment of human progress was that Germany had earlier been thought by many to be the most "enlightened" country in Europe. Yet Germany behaved in ways that were worse than primitive or barbaric. Moral expectations about Germany based on its intellectual accomplishments were simply founded on an unsound assumption. (This is not to say that on average moral behavior does not improve somewhat as civilization evolves cultural patterns.) One of the most painful errors in human conduct is to assume that the people with whom one shares political ideas are of higher character than the others. This especially afflicted the Communists in the 1930s. The disappointing let-downs by supposed friends in that party were devastating to many.

Corollary 3c) Underestimating the power of "ordinary" .

4. Overestimating the Power of "Clever" Individuals -- Others and Ourselves -- to Create Sound New Social Systems and Feasible Radical Solutions for Real-Life Economic and Political Problems.

Overestimating one's own capacity, and the capacity of others, to decipher complex patterns of relationships in society and in nature has been the source of many of the costliest blunders of the human race. Individuals' blunders are magnified greatly when the blunders are made by officials on behalf of governments. Since the beginning of history, military leaders have led armies to slaughter with a combination of fancy reasoning and slick rhetoric. In modern times the possibilities for catastrophe have increased as countries have developed the totalitarian organization necessary for regimes to rule by force even into the household.
The combination of dazzling Marxist theory and equally dazzling Marxian oratory promised that the "rationality" of Communism must surely produce greater efficiency and equity than the "chaotic" and "unplanned" system of capitalism, as discussed in Chapter 6-4. The proponents of socialism have often been persons with extraordinary intellectual credentials, persons considered clever and intelligent by all the usual tests -- from Marx and Engels and Lenin, to Einstein and Russell and Sakharov. The doctrine sounded good. It still sounds good to those unacquainted with its results in practice. Many other new theories of society also sound good. And many of them deserve to be tried. The tragedy is that people can become so convinced of the correctness of a scheme on the basis of speculation and "logic" that they are willing to make vast changes without conducting small-scale experiments first, or without examining the evidence that is available on the working of the scheme. One of the key elements in Marxian theory is that under socialist conditions officials could be expected to act mainly for the good of the public. Another key element of the theory is that ownership of an enterprise such as a farm is unnecessary to motivate people to work hard. Much could have been learned about the validity of these propositions from the behavior of officials and farmers in a variety of other circumstances. If citizens had been less ready to trust in political leaders on the basis of their intellect, and if they had -- in William F. Buckley, Jr.'s immortal phrase -- preferred to be governed by the first thousand people in the Boston telephone directory rather than by the Harvard faculty, these huge errors in governance would have been less likely to occur. While not much intellect is necessary to drag a country into war, considerable intellect usually is required to persuade people to put their trust in new economic schemes.
Aside from the Marxians, the most successful economic doctrine of this century came from John Maynard Keynes, in the eyes of his contemporaries the most brilliant of humans -- a characteristic which contributed enormously to his success. Bertrand Russell said of Keynes' intellect that it was "the sharpest and clearest that I have ever known. When I argued with him, I felt that I took my life in my hands, and I seldom emerged without feeling something of a fool" (Robert Skidelsky, John Maynard Keynes: Hopes Betrayed, 1883-1920 (New York: Viking, 1983), p. 124). But the Keynesian doctrine has now been discredited as a remedy in most sets of economic circumstances, and it is more of a burden than a benefit to the human community. If people had been less impressed by Keynes' intellect, economic science and economic life almost surely would now be better off. Many children are awesomely willing to sally forth on the basis of their own reasonings, and to be so sure of their reasoning that they have only contempt for the advice of adults whose experience contradicts those reasonings. The fallacy of long chains of logic (see Chapter 4-2) goes hand in glove with the fallacy of cleverness and intellectual dexterity. One of the frequent characteristics of people considered clever is that they master mathematics more quickly than others do, and they often purport to prove conclusions with complex mathematical arguments that less-clever people cannot follow. This is wrongly taken as a sign of the likely soundness of the long logical argument.

5. Not Taking Into Account the Indirect and Far-Off Consequences of a Change, Along With the Direct and Immediate Effects.

Frederic Bastiat was the great expositor of the "seen-unseen" error (see Chapter 00), which occurs frequently in all kinds of cost-benefit analysis.
This error of neglecting indirect and long-run consequences is especially common in analyses of social policy, in which public opinion is susceptible to the fallacy of giving undue weight to vivid evidence. Consider, for example, social policy with respect to drugs. Legislators vie with each other to propose ever more draconian sanctions against drug use because people are so frightened of the terrible effects of drug use in their families and communities. Horrifying assessments are made of the effects of drug use, which prohibition of drugs is hoped to prevent. But less notice is given to the ill effects of the sanctions themselves. The deaths due to drugs being illegal -- not only from drug-related killings, but also from unsafe drug use because quality-controlled drugs are not available legally -- are perhaps 8,000 per year (James Ostrowski, "Thinking About Drug Legalization", Cato Policy Analysis, May 25, 1989). A sound overall cost-benefit analysis in terms of lives or resources might well show drug legalization -- or just legalization of marijuana -- in a positive light, as the public seems to have decided with respect to alcohol legalization. Another example is the belief that the number of jobs is fixed, and that immigration and labor-saving technology therefore necessarily imply more unemployment. This misconception is a classic case of taking into account only "the seen" and not taking account of "the unseen" -- the latter being the process of creation of new jobs. And if economic logic alone were not enough to refute this notion, the history of the human enterprise should demonstrate conclusively that new technology on average creates more jobs than it eliminates. Oscar Wilde is said to have said that an economist is someone who knows the price of everything and the value of nothing. But a good economist -- or a good thinker with respect to any cost-benefit matter -- is a person who calculates value well.
The key hurdle to be overcome in calculating well is including the indirect and far-off costs and benefits in the reckoning. Bastiat effects occur not only in human affairs but also in the biological and physical realms. For example, biologists have "recently raised the surprising possibility that very low doses of ionizing radiation may not be harmful after all or may even have net benefits" (Leonard Sagan, "On Radiation, Paradigms, and Hormesis", Science, 11 Aug. 1989, pp. 574, 621 [sic]; see also Sheldon Wolff, "Are Radiation-Induced Effects Hormetic?", ibid., pp. 575-621). But this should not be very "surprising". Low-level radiation has always been part of the human environment, and our relationship with our environment is so complex and intimate that it seems reasonable that just about every facet of the environment has mixed effects upon us. Indeed, the same author tells us that "The stimulatory effect of low doses of a wide variety of chemical agents on the growth of organisms had been noted by Rudolf Arndt and Hugo Schulz...in the 19th century. They considered the phenomenon to be universal" (Sagan, p. 574). The "surprise" comes only if you expect to happen only what you have already observed within the range of your observations, and do not expect there to be an extraordinary number of effects that we have not yet learned about. Similarly, based on the physics that has developed within the last hundred years, many people worry that the world will eventually run out of energy.
But this assumes that within the next 7 billion years humans will not find new suns as sources of energy, or new ways to get energy from nature that we do not now comprehend (just as nuclear energy was not even imagined a hundred years ago), and that our needs for energy will not diminish rapidly, perhaps faster than a fixed supply of energy (as Freeman Dyson has suggested) -- a very strong assumption indeed about the state of our current knowledge, it would seem to me, and one which likely illustrates the error of overestimating our own knowledge and mental powers. This general point of view influenced my choice of career. I had planned to go to medical school after I got out of the Navy in 1956. But I had an intuitively-based belief that people should not take medical drugs except when the need to do so is clear-cut and strong. I feared that that view would cause me to be badly adjusted to the medical profession, which at that time dispensed drugs quite freely, as if drugs had only direct effects on a given disease and no side effects except those already known about -- and especially, no delayed side effects in the far future. One basis of my view was experiments showing that rats and babies often can choose their diets wisely from a cafeteria of possibilities, and also Walter Cannon's notion of "the wisdom of the body". Additionally, my maternal grandmother's face was faintly blue. I had heard that this was due to a physician prescribing the drug Argyrol to her on a steady basis many decades earlier; only later had it emerged that Argyrol had delayed effects. So I had somehow come to the belief that tampering with a complex system that we understand so little is inherently dangerous. Medical practice nowadays is closer to this view. And ecologists' conception of nature is similar to this conception of the human body, if I understand the ecologists correctly.
They believe that we ought not to make changes in the existing natural order unless there is pressing reason, and unless our knowledge of the likely consequences is extensive, because so many undesirable effects may be indirectly caused by a particular alteration. The same view applied to nutrition suggests that one should eat a diet containing a wide variety of foods, because we may have needs that science has not yet discovered, and a varied diet is more likely to fill those needs than is a narrow diet, no matter how healthy the narrow diet may be deemed on the basis of current knowledge. And the same should hold in reverse: too much of any food may be dangerous because it may contain something that we have not yet identified as harmful. Furthermore, a diet that corresponds roughly to human diets over the millennia is probably safer than a diet that is a large break with human history, because we have evolved in consonance with that traditional diet. Without further knowledge, there would seem to be a reasonable presumption that a traditional diet is reasonably safe. Is this an argument against any intervention in human as well as non-human matters? This charge has been brought against such writers as Hume, Smith, and Hayek. I do not think that it is accurate, however. This view does imply that we should study the existing state of affairs very closely before changing it, to check for complex relationships that have heretofore escaped our notice. That is, we should find out why the fence was put up before we take it down, in Robert Frost's words. But knowledge need not be perfect, and never can be. When we are satisfied that we understand the system as best we can, then we should proceed to make changes. This point of view differs from a more interventionist point of view only in emphasis, not in logic. 
I do not, however, share the more extreme non-interventionist, anti-growth point of view of those ecologists who are convinced that any alteration might knock our whole system out of whack, which would then induce a series of changes to compensate, which would then have even worse effects, and so on until the whole system comes apart or explodes; this was the viewpoint underlying the famous book Silent Spring by Rachel Carson (New York: Houghton Mifflin, 1962; Crest reprint, esp. pp. 16, 17, 18, 22). Interestingly, many persons who espouse this view with respect to natural systems are very ready to intervene in human systems.

6. Misjudging the Role of Motives.

Considering the motives of the source of information can cause error in both directions: a) ignoring motives when it is appropriate to consider them, and hence not making proper allowance for bias; and b) discounting valuable information on the basis of an assumed motive on the part of the person providing the information, when it is not appropriate to do so, and hence rendering the information less valuable. The great African boxer Battling Siki fought Mike McTigue for the light-heavyweight championship in Dublin on St. Patrick's Day 1923 and lost by a close decision. One would not have to be paranoid to wonder if there might have been some bias in the judging. Even very young children learn that they should not accept at face value all the impressions that advertisers seek to leave with television audiences. The children understand that the advertisers have a stake in their believing that the advertised products are better than the competition. The desire of journalists to come up with a scoop, together with the power of scare stories to sell newspapers and attract television audiences, contributes to the propensity of journalists to seek out supposed social problems for stories. It also contributes to their propensity to put a bad-news twist on what is mainly a good-news situation.
For example, a story about an improvement in the worldwide food supply, though at a somewhat slower rate than the year before -- it is inevitable that the improvement in some years will be less than in others -- may be headlined "Food Supply Lags". Another case -- this one for real, from the esteemed Washington Post -- "Good Crops a Bad Sign". It is necessary to understand the journalists' motives in order to make sense of a world which is getting better in most material respects and yet is described in horrifying terms every morning and evening. It helps to keep perspective if you notice the subjects of the stories. When the worst problems the journalists can dig up are local scandals and scares about the shortage of places to dump disposable diapers, you can infer that nothing very bad is happening elsewhere in the world. Judging information in the light of the supposed motives of the source when it is not appropriate to do so probably leads to more error than neglecting to do so, however. It is said of one of the main human groups that if you say to a Z, "Everything you tell a Z, the Z takes personally", the Z will reply, "I do not". Perhaps this remark should simply be applied to the entire human race rather than to just one group, however. Journalists habitually ask "Why is she saying that?" and "What's his angle?" Journalists who work in the very political environment of a capital city find it especially difficult to accept that anyone makes certain statements simply out of love of truth or concern for the public welfare or other enduring value, independent of any personal or institutional benefit; rather, the journalists assume that there must be some short-run motive. (Politicians tend to share this attitude even though many of them love truth for its own sake.) This attitude leads to faulty evaluation of information. The flip side of this fallacy is the attempt to prove that something did not happen because the person had no motive to do it.
Consider the statement of Mayor Marion Barry of Washington, D.C., who had been accused of using cocaine: Barry said the allegations of convicted crack dealer Charles Lewis were too implausible for anyone to believe. "Anybody who knows anything about America knows any mayor who really wants to get dope can get it without going out on the streets," the mayor said. He also said it would be "idiocy" for any politician to use drugs "around four or five other people", adding: "I've never been accused of being an idiot". (Washington Post, September 16, 1989, p. B1.)

Journalists are not without reason to inquire into motive, however. Politicians routinely deny that their taking fees from a trade association, or owning shares in a corporation, could possibly affect their votes in Congress or their administrative actions. But AT&T long ago understood that owning even a few shares of the corporation's stock could make the owner favorably disposed toward its fortunes, and it therefore set out to sell AT&T shares to as many Americans as possible, with successful results. It is quite amazing how strongly a person's self-interest can affect the person's interpretation of the facts. A group of South Florida physicians wanted to prevent the very large Cleveland Clinic, which has been a pioneer in heart surgery, from entering their market. The physicians arranged that "Carl C. Gill, the chief executive of the Clinic's Florida enterprise, and a surgeon who has done 4,000 [note: an astonishingly large number] open-heart operations, was denied local hospital privileges. The credentials committee of Broward General Medical Center said he lacked enough experience" (Wall Street Journal, August 18, 1989, p. 1). Who could believe that such a wild charge would stick? But it did. How could anyone truly believe such a proposition? Every one of us must have come across some religious beliefs (of other groups, of course, not of ours) that seem to us outlandish, surpassing any possible belief by a sensible person.
Yet such beliefs are held by many sensible persons. Chapter 8-0 on religion [to come] offers some explanation. And there is no obvious difference between those "outlandish" religious views and believing so strongly that a conventional idea is correct that you are impervious to contrary evidence. Even when presented with irrefutable data on the scarcity of raw materials over the centuries, as measured by their prices, many people cannot or will not divest themselves of the belief that raw materials have become more scarce and are likely to become more scarce still; the same is true of the cleanliness of urban air and of the drinking-water supply in the United States.

Not only may people disbelieve the truth when it suits their interests, they may even "kill the messenger", as in this case: John Kartak...was assigned to an Army recruiting station in Minneapolis. When he discovered that the office forged high school diplomas and concealed criminal records in order to meet recruiting goals, Kartak called the Army hotline. The response from the Army was to order two psychological evaluations of Kartak, with one superior telling hospital officials, "He has lodged numerous complaints recently...I find his behavior highly unstable. I am concerned that he may do something to harm himself or others". Kartak's superior was indeed afraid that Kartak would do something to harm others -- more exactly, harm the careers of his superiors. And the Army inspector general later found at least 58 people in the office guilty of illegal acts. (Washington Post, August 25, 1989, p. A19.)

But again, there are some people with the courage and integrity to fight for the truth -- John Kartak, in this case. How can you sort out the courageous truth-lovers from the gutless finks who will bend the truth any which way for their own self-interest? I wish I knew.
My only advice is to hold your judgment in abeyance until you see a person tested by events, and try not to let your judgment be swayed by whether the person is pleasant or seems sincere.

It is prudent to attempt to design a fact-finding system so that it is as free as possible of interests which may bias the result. The jury is perhaps the most successful example of such a system. The heart of the jury system is that the individual jurors have no personal stake in the outcome. Neither their reputations nor their personal economic positions are at risk. And because there are twelve jurors, individual prejudices tend to offset one another. Research (R. Simon, 1967?) shows resoundingly that jurors generally do behave in a sensible, thoughtful, disinterested fashion -- better than in any other institution that I know of.

Even the jury system can be perverted, however. When I served as defense counsel aboard a Navy destroyer -- a small warship with a complement of about 300 officers and men, most of whom knew each other personally -- trials were sometimes kangaroo courts, with the verdicts arranged in the officers' wardroom in advance. (The "jurors" were three officers, unless the enlisted man on trial asked for enlisted personnel, and few did.) The defense counsel was "supposed" to be an orchestrated part of this arranged justice, though I naively did not fully understand the system until years later: I assumed that a defense counsel was supposed to give the accused the best defense possible. In one case the entire summation of the "trial counsel" (the prosecutor), his face angrily flushed, was: "I know he's guilty. You guys [the three judges acting as jurors] know he's guilty. Even Simon knows he's guilty [not quite so]. So find him guilty." He then sat down. And of course the verdict was "guilty".
(On appeal the verdict was reversed -- at which time I found that appealing a verdict off the ship was an even worse breach of proper behavior than providing a vigorous defense at the trial.) Perhaps involved here is a belief by many that -- to paraphrase Mao Tse-tung's remark that power comes from the barrel of a gun -- truth and justice are whatever the majority, or a powerful political bloc, says they are. This is a view of politics which asserts that whatever the majority or an effective bloc is able to wrest from the governmental machinery and pocketbook is fair game. Indeed, the view of many is that that struggle is the whole of politics. And in this struggle the only appropriate limitation on rhetoric is whether it will be sufficiently credible to advance the cause. (It should be noted that this view is inconsistent with the Constitution of the United States, which asserts limits upon a majority's power to turn events in its favor.)

7. Assuming that effects are either-or when they are actually more-or-less. Ask the person sitting next to you: What would be the effect on the smoking rate of raising the tax on cigarettes? Your neighbor will probably say "A raise in the price will never get people to stop", or perhaps, "Yes, raise it good and high and they will have to stop." In fact, an increase of 10 percent will lead to some people stopping -- perhaps 6 percent -- and another increase of 10 percent will cause a decrease of another 6 percent, and so on -- but many will continue to smoke. The point is that the answer to the question should be phrased in how-much terms and not in yes-no terms. Converting the discussion from qualitative to quantitative often has a powerful calming effect upon bitter arguments between people with locked-in positions. When people are asked for a how-much estimate, they often find that they are much less opposed to each other than they had thought, and they recognize how imprecise everyone's knowledge is.
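The cigarette arithmetic above can be sketched as a quick calculation. The 6-percent-drop-per-10-percent-price-rise figure is the text's own illustrative number (it corresponds to a price elasticity of roughly -0.6, not a figure from any particular study), and the starting smoking rate below is an assumed value chosen only for the example:

```python
# A how-much answer, not a yes-no answer: each 10% price increase
# trims the smoking rate by about 6%, per the text's illustrative figure.

def smoking_rate(initial_rate, price_increases, drop_per_increase=0.06):
    """Fraction of the population still smoking after each 10% price rise."""
    rate = initial_rate
    history = [rate]
    for _ in range(price_increases):
        rate *= 1 - drop_per_increase  # 6% of the remaining smokers quit
        history.append(rate)
    return history

# Assume 30% of adults smoke; apply five successive 10% tax increases.
for i, r in enumerate(smoking_rate(0.30, 5)):
    print(f"after {i} increase(s): {r:.1%} still smoke")
```

Run it and the either-or framings both fail: the rate falls with every increase, yet even after five increases more than a fifth of the population still smokes.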
The parties then find it easier to work toward a reasoned and satisfying conclusion. The effect of most social policies on people's behavior is best viewed quantitatively -- the effects of welfare on the propensity to work, of public transportation upon auto driving, of seatbelt laws upon the use of seatbelts. This is because there is dispersion among people's responses to particular incentives.

8. Focusing on blame rather than on improvement. The human animal has an extraordinary propensity for making itself feel good by charging someone else with being bad. Newspaper journalists spend much of their careers ferreting out who is to blame for public ills, rather than analyzing the underlying causes which may make the bad thing happen again and again, or discussing how to improve the situation. And the rest of us are forever attributing responsibility to others for all the unwelcome events in life. We even curse the chair when it clumsily gets in the way of our shins.

This urge to blame, and to ask people to justify themselves, is sunk deep into our language. Try speaking for a week without using an expression such as "Why didn't you...?" or "Why can't you...?" instead of the less blameful "I suggest that next time you..." We even use a related form toward the future: "Why don't you...?" All of these negative terms -- don't, can't, didn't -- are literally downers, and they put people on the defensive rather than evoking their constructive thoughts. Maybe the satisfaction that we get out of blaming makes it worth the doing. But there is every other reason for not doing it. Most important, it distracts our attention from looking for solutions to our problems.

***

Following all this discussion of errors in thinking in Chapters 4-5 and 4-6, you may despair of our ever thinking well enough about any matter to avoid everyday catastrophe.
And yet there is irrefutable evidence that civilization advances in material ways, and we live longer and healthier and more comfortably with every decade. How come? There are several partial explanations:

1. Many of the laboratory demonstrations of people's incapacity to think clearly are based on quite subtle situations -- for example, people's inability to choose correctly among gambles that are not at all obvious even to trained statisticians. But the decisions that really matter are -- by their very definition -- those where there is a large difference between the alternatives, and such large differences are harder to overlook. The same is true in scientific investigations. If fancy statistical tests and subtle interpretations are necessary, the conclusion usually -- though not always -- is not earthshaking. Where the results of a study are powerful and important, they can usually be seen in simple tables or graphs.

2. More than one person -- sometimes an entire committee -- is likely to be involved in important decisions, and the quirky personal aspects of one person's thinking are likely to be neutralized by others' quirks.

3. Many of our decisions are made repeatedly, and we therefore have a chance to learn what is sound and what is erroneous. (Sometimes, however, there is no way to check on the results and therefore learn by experience, and hence we may never improve. This is the problem with choosing a dentist; we have almost no way to determine whether she or he is doing good work.)

4. We don't need to be anywhere near perfect in order to do well. This is apparent in our competitions with other persons and organizations. Even the best tennis player in the world, on the best day she or he ever has, makes lots of errors. Think how many more errors a good club player can make and still win. It is much the same in business or politics. And it is true of our struggles with nature, too. Weather forecasting does not need to be perfect to be useful.

5.
The worst performers in a situation tend to leave that situation. The weather forecasters with the worst records eventually must find other lines of work. This is not true in politics or religion or even in some parts of science; the persons who prophesied resource scarcities in the late 1960s and early 1970s have been confounded by subsequent events, but they continue to offer the same prophecies, and continue to be received respectfully. But in most situations that are not so emotionally charged, failure does not persist so successfully.