Cognitive Dissonance (Part 1 of 4) - S. McNerney
The Sartre Fallacy, or Being Irrational About Reason

 

Sam McNerney, 02/22/13

 

Consider the story of my first encounter with Sartre.

 

I read Being and Nothingness in college. The professor, a Nietzsche aficionado, explained Sartre’s adage that existence precedes essence. After two years of ancient philosophy, the idea struck me as profound. If it were true, then Plato and Aristotle were wrong: there are no Forms, essences or final causes. Meaning isn’t a fundamental abstract quality; it emerges from experience.

 

But that’s not what has remained with me. We might simply say that the flamboyant French existentialist believed that we ought to live in “good faith” in order to live authentically. To the psychologist the authentic life is a life without cognitive dissonance; the realist might say it means you don’t bullshit yourself. For Leon Festinger it’s a world in which his doomsayers admit that the destruction of Earth is not, in fact, imminent. For Aesop it’s a world in which the fox admits the grapes are ripe.

 

Sartre’s philosophy of authenticity remains with me because of a relationship I had with a classmate. I have in mind an ironically inclined peer who, I recall, got high after class and talked about living authentically. Mind you, he did this while wearing clothes that referenced an earlier generation and drinking beer marketed to a much different demographic than his own. He was doubling down on irony.

 

You can imagine my horror when I realized that I was not unlike him. I was a college student, after all: I was selective with my clothes and I too engaged in pseudo-intellectual conversations with my friends. Worse, I used Being and Nothingness to confirm my authenticity and his phoniness. If the role of the conscious mind is to reconcile the pressures of the external world with the true self, then I was hopelessly inauthentic. I was a poseur living an unexamined life. The joke was on me.

 

It was only recently that the lesson of my time with Sartre and the hipster truly registered. There are times, I realized, when people (myself included) behave or decide counter to recently acquired knowledge. It’s worse than simply misinterpreting or not learning; it’s doing the opposite of what you’ve learned – you’re actually worse off. I’m tempted to term this the “Hipster Mistake” after my existential experience, in which reading Sartre caused me to live less authentically. But let’s call this counter-intuitive phenomenon the Sartre Fallacy, after my friend Monsieur Sartre.

 

After the Sartre Fallacy fully registered, a perfect example from cognitive psychology came to mind. Change blindness occurs when we don’t notice a change in our visual field. Sometimes this effect is dramatic. In one experiment researchers created a brief movie of a conversation between two friends, Sabina and Andrea, who discuss throwing a surprise party for a mutual friend. The camera cuts back and forth between the two women, sometimes focusing on one and at other times on both. After the participants watched the minute-long movie the researchers asked,

 

Did you notice any unusual difference from one shot to the next where objects, body positions, or clothing suddenly changed?

 

The researchers – Dan Simons and Daniel Levin – are wily psychologists. There were, in fact, nine differences deliberately inserted throughout the movie to test change blindness. They were not subtle, either. In one scene Sabina is wearing a scarf around her neck and in the next she isn’t; the color of the plates in front of Sabina and Andrea changes from one scene to the next. Nearly every participant said nothing of the alterations.

 

In a follow-up experiment Levin explained the premise of the Sabina-Andrea study (including the results) to undergraduates. Seventy percent reported that they would have noticed the changes, even though no one from the original study did. Let this detonate in your brain:

 

the undergrads concluded that they would notice the changes even knowing that the participants in the original study did not.

 

The lesson here is not that change blindness exists. It is that we do not reconsider how much of the world we miss after we learn about how much of the world we miss – the follow-up experiment actually made participants more confident in their visual cortices. They suffered from change blindness blindness, so to speak (the visual equivalent of the Sartre Fallacy), even though the sensible conclusion would be to downgrade confidence. Here we see the Sartre Fallacy in action (and if you doubt the results of the second experiment you’re in real trouble).

 

I committed a similar error as an undergraduate about one year after falling prey to the Sartre Fallacy. My interest in philosophy peaked around the time I finished Being and Nothingness. The class continued with de Beauvoir’s The Second Sex but it was not, I quickly noticed, all that sexy; The Fall and Nietzsche’s bellicosity were much more seductive. In fact, with a few exceptions (Popper and the later Wittgenstein) philosophy becomes incredibly boring after Nietzsche. The reason, simply, is that the old prevails over the new and not enough time has passed to determine who is worth reading. It’s a safe bet that the most popular 20th-century philosopher of the 23rd century is currently unpopular.* This is one reason why death is usually the best career move for philosophers. Good ideas strengthen with time.

 

With this in mind I turned to psychology, where experiments like the ones the Dans conducted reignited my neurons. I especially enjoyed the literature on decision-making, now a well-known domain thanks to at least three more psychologists named Dan: Kahneman, Ariely, and Gilbert. I read Peter Wason’s original studies from the 1960s and subsequent experiments, including Tom Gilovich’s papers on the hot hand and clustering illusions. Confirmation bias, the first lesson in this domain, stood out:

 

the mind drifts towards anything that confirms its intuitions and preconceptions while ignoring anything that looks, sounds or even smells like opposing evidence.

 

An endless stream of proverbs, stretching back to the emergence of language, underlines this systematic and predictable bias, but the best contemporary description comes from Gilbert:

 

“The brain and the eye may have a contractual relationship in which the brain has agreed to believe what the eye sees, but in return the eye has agreed to look for what the brain wants.”

 

I still remember reading this gem and immediately uncapping my pen. 

 

But then something unexpected occurred. Suddenly and paradoxically, I saw the world only through the lens of this research. I made inaccurate judgments, drew illogical conclusions and was irrational about irrationality because I filtered my beliefs through the literature on decision-making – the same literature, I remind you, that warns against the power of latching onto beliefs. Meanwhile, I naively believed that knowledge of cognitive biases made me epistemically superior to my peers (just as knowing Sartre made me more authentic).

 

Only later did I realize that learning about decision-making gives rise to what I term the confirmation bias bias, or the tendency for people to generate a narrow (and pessimistic) conception of the mind after reading literature on how the mind thinks narrowly. Biases restrict our thinking but learning about them should do the opposite. Yet you’d be surprised how many decision-making graduate students and overzealous Kahneman, Ariely and Gilbert enthusiasts do not understand this. Knowledge of cognitive biases, perhaps the most important thing to learn for good thinking, actually increases ignorance in some cases. This is the Sartre Fallacy – we think worse after learning how to think better.

 

Speaking of Thomas Gilovich, right around the time I became a victim of confirmation bias bias he gave a lecture in our psychology department. I sat in on a class he taught before his lecture and listened to him explain a few studies that I already knew from his book, which I had bought and read. Of course, I took pride in this much like a hipster at a concert bragging about knowing the band “before everyone else.” It was Sartre part II. Somehow I had learned nothing.

 

After his class I jumped on an opportunity to spend a few hours with him while he killed time before his lecture to the department. I’d been studying his work for months, so you might imagine how exciting this was for me. I asked him about Tversky (his adviser) and Kahneman, practically deities in my world at the time, as well as other researchers in the field. I regret not remembering a thing he said. But thanks to pen and paper I will remember the note he wrote after I asked him to sign my copy of his book: “To a kindred spirit in the quest for more rational thinking.” How foolish, I thought. Didn’t Gilovich know that given confirmation bias “more rational thinking” is impossible? After all, if biases plagued the mind how could we think rationally about irrationality?

 

Again, the joke was on me.    

 

*Likewise, it’s unlikely a new ancient Greek philosopher will become popular, unless they dig up some more charred papyrus fragments at Oxyrhynchus or Herculaneum. Another example: the youngest book in the top ten of the Modern Library’s 100 best English-language novels of the 20th century is Catch-22, published in 1961. The AFI top 100 movies list shows a similar effect.

 

http://bigthink.com/insights-of-genius/the-sartre-fallacy-or-being-irrational-about-reason



Replies
A Brief Explanation of “Cognitive Dissonance” - Wikipedia

Cognitive Dissonance

 

From Wikipedia

 

[Image caption: The Fox and the Grapes, from Aesop. When the fox fails to reach the grapes, he decides he does not want them after all. According to cognitive dissonance theory, rationalization (making excuses) is often involved in reducing anxiety about conflicting cognitions.]

In modern psychology, cognitive dissonance is the feeling of discomfort when simultaneously holding two or more conflicting cognitions: ideas, beliefs, values or emotional reactions. In a state of dissonance, people may sometimes feel “disequilibrium”: frustration, hunger, dread, guilt, anger, embarrassment, anxiety, etc.[1] The phrase was coined by Leon Festinger in his 1956 book When Prophecy Fails, which chronicled the followers of a UFO cult as reality clashed with their fervent belief in an impending apocalypse.[2][3] Festinger subsequently (1957) published A Theory of Cognitive Dissonance, in which he outlines the theory. Cognitive dissonance is one of the most influential and extensively studied theories in social psychology.

 

The theory of cognitive dissonance in social psychology proposes that people have a motivational drive to reduce dissonance by altering existing cognitions, adding new ones to create a consistent belief system, or alternatively by reducing the importance of any one of the dissonant elements.[1] It is the distressing mental state that people feel when they "find themselves doing things that don't fit with what they know, or having opinions that do not fit with other opinions they hold." [4] A key assumption is that people want their expectations to meet reality, creating a sense of equilibrium.[5] Likewise, another assumption is that a person will avoid situations or information sources that give rise to feelings of uneasiness, or dissonance.[1]

 

Cognitive dissonance theory explains human behavior by positing that people have a bias to seek consonance between their expectations and reality. According to Festinger, people engage in a process he termed "dissonance reduction", which can be achieved in one of three ways:

lowering the importance of one of the discordant factors,

adding consonant elements, or

changing one of the dissonant factors.[6]

This bias sheds light on otherwise puzzling, irrational, and even destructive behavior.

 

http://en.wikipedia.org/wiki/Cognitive_dissonance



Cognitive Dissonance (Part 4 of 4) - S. McNerney

Sartre Fallacy Part IV: Philosophizing Without a Ph.D.

 

Sam McNerney, 03/08/13

 

Consider one last autobiographical note before I answer the question: “How do we avoid the Sartre Fallacy?”

 

I conducted an independent study my senior year that focused on biases and heuristics. I lived in this literature for months, covering what seemed like every study demonstrating how we humans screw up; it was difficult not to conclude that we are hopelessly irrational. What occurred to me, however, was that the word “rational” was misleading. What’s rational is usually relative to the homo economicus perspective, in which good decisions are about surveying alternatives, estimating probabilities and optimizing self-interest. But what looks rational in one setting might be deeply irrational in another if you attribute deviations in judgments not to deficits of the mind but to the homo economicus perspective itself. That is, it might be better to think about how biases and heuristics relate to the environment rather than to logic and optimization. 

 

As with every idea I’ve ever had, I discovered that I was not the first person to think it. Gerd Gigerenzer, one of the most important decision-making theorists of the last few decades, has made a career pursuing this line of reasoning. I remember buying his book Rationality for Mortals and finding a passage that crystallized my insight:

 

“Violations of logical reasoning [are] interpreted as cognitive fallacies, yet what appears to be a fallacy can often also be seen as adaptive behavior, if one is willing to rethink the norm.”

 

Consequently, I realized that if we judge behavior not relative to logic and optimization but to the physical and social environments we exist in, then I could not understand judgment by studying the decision-making literature alone. (We’ve seen that doing so actually increases your bias.)

 

It’s fitting that Gigerenzer’s insight struck me during an especially bibulous weekend, not when I was bingeing on the judgment literature but on top of a couch dancing to Katy Perry. I still remember the idea arriving, clearly, into my conscious mind despite the inebriation. In that moment I pledged to step outside the body of research I had been living in. I needed to go out and see how we decide in reality. Doing so would be the only way I could think clearly about biases and heuristics.

 

Much later, after I graduated, I realized that an overlooked source of wisdom in college is the double life: one in the classroom and the other socializing. That is, the library and the party are not opposites but complements:

 

you don’t know what you know until you’ve tested your knowledge in the real world.

 

Therefore, the first step in avoiding the Sartre Fallacy is not to stop thinking but to start experiencing.

 

Nietzsche, Buridan & Elliot  

 

The second step is balancing the two. The young Nietzsche outlined a strategy in The Birth of Tragedy. Recall his two central characters: the Dionysian is the chaotic, untamed partygoer; the Apollonian is measured, rational, and self-controlled. Nietzsche argued that ancient Athens thrived by balancing the two until the playwright Euripides, influenced by Socrates’ stubborn commitment to truth and reason, turned the spotlight on the Apollonian, thereby suffocating the city’s lifeblood.

 

As an illustration, consider the thought experiment “Buridan’s Donkey,” named after the medieval philosopher and priest Jean Buridan. An equally thirsty and hungry donkey stands exactly midway between a stack of hay and a pail of water. The donkey is rational, so he needs a reason to pick one over the other. But both are equally far away, so he idles between the two until he dies. I’ll translate: a high dose of the Apollonian kills; a deadlocked mind requires a degree of chaos and randomness.*

 

If this sounds too philosophical, here’s an equivalent example from neuroscience. Perhaps you know Elliot, a centerpiece of Antonio Damasio’s Descartes’ Error. Elliot damaged his frontal lobes and subsequently lost his capacity to make trivial decisions. In a now famous example, Damasio suggested two alternative dates, both in the coming month and just a few days apart from each other:

[Elliot] pulled out his appointment book and began consulting the calendar. The behavior that ensued, which was witnessed by several investigators, was remarkable. For the better part of a half hour, the patient enumerated reasons for and against each of the two dates: previous engagements, proximity to other engagements, possible meteorological conditions, virtually anything that one could reasonably think about concerning a simple date... He was now walking us through a tiresome cost-benefit analysis, an endless outlining and fruitless comparison of options and possible consequences. It took enormous discipline to listen to all of this without pounding on the table and telling him to stop.

 

Damasio concludes that rationality is impossible without emotion – we need the Dionysian to push us one way or another or else we will remain stuck analyzing the pros and cons, just like Buridan’s Donkey.

 

Research on creativity is rife with supporting examples. Insights occur not in stressful moments but during warm showers, long walks, weekends, and even exercise. The unconscious mind needs time to incubate ideas before delivering them to the conscious mind. I’m not a sucker for counter-intuitive research suggesting that alcohol and drowsiness are the “keys” to creativity, but there’s no doubt that a clenched state of mind is a bane to creativity. Stepping away from a problem is often the secret to solving it. Without the Dionysian, creative output will cease, like Monsieur Buridan’s donkey and Elliot’s capacity to decide.

 

Nassim’s Barbell

 

For another example, consider two reappearing characters in Nassim Taleb’s books on uncertainty.

 

The first, Nero Tulip, is a risk-averse scholar who lives in books. His background is in statistics and probability and he has a particular interest in medical texts. He is a disciplined thinker – an erudite autodidact-philosopher at heart. The second, Fat Tony, is a street-smart Brooklyn type (although he lives in New Jersey) who does not read and despises name-dropping intellectuals. He’s classy. He likes fine wine and good food; I imagine him to be like Tony Soprano without the violence. Fat Tony benefits from chaos and unpredictability. Nero does not, but he is not fragile either. Both hate boredom and avoid being “the turkey.” Nassim’s characters, likely disguised reflections of his experiences, are, in many ways, extensions of Nietzsche’s Apollonian (Nero) and Dionysian (Fat Tony).

 

Nassim writes about the “Barbell Strategy.” As a financial strategy this means playing it safe most of the time (e.g., buying Treasury bills) and making “extremely speculative” bets the rest of the time. As a lifestyle it means hating middlebrow stuff and moderation. Consider stuffing yourself with fat and carbs and then fasting; taking long leisurely walks and then sprinting for short periods; reading gossip magazines on one hand and a Platonic dialogue, in the original Greek, on the other; staying sober six days a week and drinking excessively on the seventh. Wittgenstein, for example, switched between writing philosophical texts and teaching elementary school. Why combine extremes and avoid the middle?

 

The human body hates moderation and loves stressors (to a point). What happens when you lie in bed all day long? You become exhausted and lazy. The best way to get over a hangover? Go running. Vaccines harm the body a little bit so it becomes stronger in the long run. Bones become denser when subjected to episodes of stress; lifting weights strengthens your muscles. Think of the Hydra, the mythological serpent that gained two heads every time it lost one, or the airline industry: every plane crash makes flying safer. The barbell strategy benefits from the Dionysian but, importantly, does not abandon the Apollonian. For Nassim it’s a means towards an “antifragile” lifestyle, in which you live part Fat Tony, part Nero Tulip. For our purposes it means not spending too much time in the library.

 

Conclusion

 

Imagine a student at the University of Chicago. Let’s call him John. This enlightened individual deprives himself of the weekend, thereby forgoing late night dorm room bull sessions - an unaccredited source of wisdom. John has a promising but boring future as a philosophy professor. He agrees that all X’s are Y’s but not all Y’s are X’s, yet cringes with fear when someone from the South Side approaches him. He wants to learn Chinese - he figures it will look good on his resume - so he studies grammar books and buys a Rosetta Stone. Of course, if he really wanted to learn the language he would move to Beijing and find a Mandarin-speaking girlfriend. Unfortunately, his lack of social skills makes this difficult. In fact, John wonders why the unofficial slogan of his alma mater is “Where fun comes to die.” In the future, John will write a dissertation on Kant’s categorical imperative, but he will not live more virtuously or ethically than anybody else. Despite a successful life, John will be boring.

 

How do you avoid the Sartre Fallacy? Don’t be like John. But don’t be a college dropout either. Balance the Apollonian with the Dionysian – Fat Tony with Nero. Learn about biases and then commit them in the real world. Understanding how the mind decides and makes judgments requires experience in physical and social environments, in the same way bones require stress and the Hydra requires volatility. This is the virtue of adopting the barbell strategy; it forces you to step outside the library and test your theories in real life – in the midst of chaos. Doing so will help you think about your irrationality, rationally.

 

* Fittingly, I learned about Buridan’s Donkey in one of the most boring classes I ever took: modern philosophy. It didn’t help that it was in the morning and the professor, a Descartes specialist, was dreadfully dry.

 

This is the final installment of the Sartre Fallacy Series.

 

http://bigthink.com/insights-of-genius/sartre-fallacy-part-iv-philosophizing-without-a-phd



Cognitive Dissonance (Part 3 of 4) - S. McNerney

The Sartre Fallacy Part III: The (Real) Socratic Problem

 

Sam McNerney, 03/01/13

 

This brings me to an ancient Greek, the master himself, Socrates of Athens. In a segment of Gorgias that foresees decades of modern psychological research, the erudite interlocutor observes that we always act out of a belief that what we are doing is good. This delusion is self-reinforcing. Time, experience, and more information (even opposing information) strengthen beliefs. Alas, the snub-nosed Greek erroneously concluded that if we do wrong it must be because we are ignorant.

 

Consider a recent article published in the New York Times Magazine, an excerpt from Michael Moss’ book Salt Sugar Fat. Moss reports on a study published in 2011 that highlights Americans’ growing obesity problem. It tracked 120,877 subjects from 1986 to 2010 and found that “every four years, the participants exercised less, watched TV more and gained an average of 3.35 pounds.” Who were these people? Each of the 120,877 participants was, in fact, a health care professional, presumably well informed about nutrition and how to live a healthy lifestyle. This toxic demonstration of cognitive dissonance is not unusual: smokers know smoking is bad; husbands know the detriments of an affair; thieves know the consequences of stealing; alcoholics know the perils of drinking. Phaedra was right:

 

“We do sometimes know the good and fail to do it.”

 

The Socratic problem usually refers to the fact that our accounts of Socrates are second hand, a problem exacerbated by his stubborn refusal to document, with pen and papyrus, his philosophy. I want to rebrand the Socratic problem to describe his false belief – a belief that pervades the Western world today – that knowledge is necessarily a panacea.

 

Go back to Moss. Will readers learn that information about nutrition does not make people healthier? No. Will they realize, when his book does nothing to curb our diets, that the mind struggles to act on helpful information? No. They will learn that too much salt, sugar and fat is bad for you. And that’s it. We’re good at learning and retaining information. We’re pitiful at abstracting rules and changing behavior accordingly. Worse, we don’t learn that we don’t learn.

 

With this in mind, imagine your pre-bike riding days. Would a complete bike-riding manual have helped? Probably not. You needed experience. The Greeks made a similar distinction. We know from the Roman physician Galen that Menodotus of Nicomedia, a member of the Empirical School, sought practitioners who embodied techne (know-how) and not epistêmê (know-that). I’m with Menodotus on this one. I don’t want a doctor who knows how the heart works; I want a doctor who knows how to operate on the heart.

 

Perhaps you’re familiar with this distinction, so I’ll draw the conclusion for you: the body is much smarter than the brain. I realize that this might sound strange in an era preoccupied with “ideas.” I have in mind our obsequious attachment to those dreadful articles that promise to reveal the secrets to human creativity or intelligence and those 20-minute video clips that turn scientists into rock stars. We love simple explanations of complex things, especially brain functions. The next time you’re in a bookstore, navigate to the cognitive science section and count the number of books that begin with the words “how” or “why.” Don’t read these books. 

 

I wouldn’t deny the mind’s capacities. But pause, for a moment, and think about the body. At one point in your life you took up a sport or performed a simple task like throwing a dart. Carrying out these skills was, believe it or not, a tremendous accomplishment, a marvel of evolution. Shakespeare observed that we humans are noble in reason, but the principal achievement of the brain is not Romeo and Juliet, or General Relativity, or the Sistine Chapel; it is movement. That a baseball player can hit a 90 mph curveball and catch a line drive on the run is more impressive than lucid prose, scientific theorems or aesthetic prowess. Here’s Pinker again:

 

I would pay a lot for a robot that would put away the dishes or run simple errands, but I can’t, because all of the little problems that you’d need to solve to build a robot to do that, like recognizing objects, reasoning about the world, and controlling hands and feet, are unsolved engineering problems. They’re much harder than putting a man on the moon or sequencing the human genome. But a four-year-old solves them every time she runs across the room to carry out an instruction from her mother.

 

The body also remembers better than the mind. In Moonwalking With Einstein Joshua Foer makes the point that we’re horrible at remembering lists but excellent at remembering routes. Notice, for example, how easy it is to mentally traverse the route you took to elementary school. Compare that to what you remember of high school chemistry. The body doesn’t forget skills either, hence the proverb “it’s like riding a bike,” while the mind forgets the vast majority of the information it obtains. The body even “remembers” the experience of being sick, which is why you’ll never contract mono, polio, or the chicken pox twice. (The wily Richard Dawkins calls a vaccination a “false memory.”)

 

There is an evolutionary component to this. We existed as non-thinking animals throughout most of our evolutionary history. During that time natural selection focused on how the brain might get the body to move through space safely. Using your legs to chase after food (or run from it) and your hands to eat it was more important than philosophizing. Only in the last blip of time did consciousness, abstract thought and language – the real game changer – emerge from our frontal lobes. Today Homo sapiens is the most creative and intelligent species on Earth. But it’s important to remember that the ability to move our limbs in concert is our principal biological achievement. We forget this, of course, when we pause to think about it.

 

Let’s return to Gorgias, the Platonic dialogue named after the Greek sophist. Socrates asks Gorgias if a person who’s come to understand building is a builder and if a person who’s come to understand music is a musician. Yes, Gorgias replies.

 

Socrates: And a person who’s come to understand medicine is a doctor, and so on and so forth. By the same token, anyone who has come to understand a given subject is described in accordance with the particular character his brand of knowledge confers. Do you agree?

Gorgias: Yes.

Socrates: So doesn’t it follow that someone who has come to understand morality is moral? 

 

It appears that Socrates had the wrong user’s manual. The brain was designed to do – not to think and introspect. Expertise in a domain requires 10,000 hours of deliberate practice; spend a lifetime studying epistemology and you’ll be no wiser. Yet, thanks to Socrates and his Greek cohorts, we still believe that acquiring theoretical, conceptual or factual knowledge necessarily improves judgment and behavior. And with respect to the relationship between understanding morality and acting morally, well, he was clearly unfamiliar with the Catholic Church.

 

The purpose of this section is to answer the question,

 

Why is the Sartre Fallacy difficult to avoid?

 

We’ve seen that humans did not evolve to be philosophers, which is fine except when we act like it. Every time we provide a premise, supply the inferences and come to a conclusion we expect the audience to change their minds. This style of reasoning, a direct result of the ancients, rarely changes anything. There are no guarantees that acquiring more knowledge – reading more self-help, diet, business, or decision-making books – will improve life. We’re cognitive misers; we only use System 2 if we have to, and even then we’re reluctant. The intellect is, evolutionarily speaking, the youngest addition to the human condition. We don’t know how to use it very well. 

 

This is why the Sartre Fallacy is difficult to avoid: we don’t “do” knowledge well. Natural selection focused on the body, not the mind.

 

http://bigthink.com/insights-of-genius/the-sartre-fallacy-part-iii-the-real-socratic-problem-or-the-bodys-acuity



Cognitive Dissonance (Part 2 of 4) - S. McNerney

The Sartre Fallacy Part II: Is It Inevitable?

 

Sam McNerney, 02/27/13

 

Years before my run-in with Monsieur Sartre I landed a summer job in the painting business. If you’ve painted houses, perhaps you ran into the same problem I did: poor planning. One summer I discovered that a one-week job took closer to two weeks; a three-week job lasted about a month and a half, and so on. I devised a rule of thumb: double your completion date. The problem is that I didn’t stick to this heuristic even though I knew it worked. Why? Experience and knowledge do not necessarily improve judgment; we’ve seen, in fact, that sometimes the opposite occurs. The mind is stubborn – we stick to our intuitions despite the evidence.

 

Let’s go beyond the anecdote. In the spring of 2005 Bent Flyvbjerg, Mette K. Skamris Holm, and Søren L. Buhl published an article in the Journal of the American Planning Association that presented “results from the first statistically significant study of traffic forecasts in transportation infrastructure projects.” The paper gathered data from rail and road projects undertaken worldwide between 1969 and 1998. They found that ridership was overestimated in over 90 percent of rail projects and that 90 percent of rail and road projects fell victim to cost overruns. Worse, although it became obvious that most planners underestimate the required time and money, their accuracy actually declined over the years. Today a sizable engineering feat completed on time and within budget is an imaginary one.

 

In Thinking, Fast and Slow Daniel Kahneman describes the planning fallacy as

 

“plans or forecasts that are unrealistically close to best-case scenarios.”

 

Two dramatic examples come to mind. In 1957 the Sydney Opera House was estimated to cost $7 million (Australian dollars), with a completion date set for early 1963. It opened in 1973 with a price tag of $102 million. Boston’s Big Dig was nearly a decade late and $12 billion over budget. The one exception that I can think of from the engineering world is New York’s Empire State Building, completed in 410 days, several months ahead of schedule, at $24.7 million, close to half of the projected $43 million.

 

Around the time I was painting houses I discovered more examples of the planning fallacy in other domains. I eventually landed on this question:

 

If we know that we are bad at predicting and can account for the underlying psychology then why do we continue to make bad predictions?

 

Kahneman suggests that to improve predictions we should consult “the statistics of similar cases.” However, I realized that the two biases that contribute to the planning fallacy, overconfidence and optimism, also distort any effort to use similar cases to generate more objective projections. Even when we have access to the knowledge required to make a reasonable estimate, we choose to ignore it and focus instead on illusory best-case scenarios.

 

This idea returns me to my last post, where I coined the term the Sartre Fallacy to describe cases in which acquiring information that warns or advocates against X influences us to do X. I named the fallacy after de Beauvoir’s lover because I acted like a pseudo-intellectual, thereby living less authentically, after reading Being and Nothingness. I noticed other examples from cognitive psychology. Learning about change blindness caused participants in one study to underestimate their vulnerability to the visual mistake. They suffered from change blindness blindness. The planning fallacy provides another example. When planners notice poor projections made in similar projects they become more confident instead of making appropriate adjustments (“We’ll never be that over budget and that late”). This was my problem. When I imagined the worst-case scenario my confidence in the best-case scenario increased.

 

After I posted the article I was happy to notice an enthusiastic response in the comment section. Thanks to the sagacity of my commenters I identified a problem with the Sartre Fallacy. Here it is; follow closely.

 

If you concluded from the previous paragraph that you would not make the same mistake as the participants who committed change blindness blindness then you’ve committed what I cheekily term the Sartre Fallacy Fallacy (or change blindness x3). If you conclude from the previous sentence that you would not commit the Sartre Fallacy Fallacy (or change blindness x3) then, mon ami, you’ve committed the Sartre Fallacy Fallacy Fallacy (or change blindness x4). I’ll stop there. The idea, simply, is that we tend to read about biases and conclude that we are immune from them because we know they exist. This is of course itself a bias and as we’ve seen it quickly leads to an ad infinitum problem.

 

The question facing my commenters and me is whether the Sartre Fallacy is inevitable. For the automatic, effortless, stereotyping, overconfident, quick-judging System 1, the answer is yes. Even the most assiduous thinkers will jump to the conclusion that they are immune to innate biases after reading about innate biases, if only for a split second. Kahneman himself notes that after more than four decades researching human error he (his System 1) still commits the mistakes his research demonstrates.

 

But this does not imply that the Sartre Fallacy is unavoidable. Consider a study published in 1996. Lyle Brenner and two colleagues gave students from San Jose State University and Stanford fake legal scenarios. There were three groups: one heard from one lawyer, the second heard from another lawyer, and the third, a mock jury, heard both sides. The bad news is that even though the participants were aware of the setup (they knew that they were only hearing one side or the entire story), those who heard one-sided evidence provided more confident judgments than those who saw both sides. However, the researchers also found that simply prompting participants to consider the other side’s story reduced their bias. The deliberate, effortful, calculating System 2 is capable of rational analysis; we simply need a reason to engage it.

 

A clever study by Ivan Hernandez and Jesse Lee Preston provides another reason for optimism. In one experiment liberal and conservative participants read a short pro-capital-punishment article. There were two conditions: the fluent condition read the article in 12-point Times New Roman font; the disfluent condition read it in an italicized Haettenschweiler font presented in light gray. It was difficult to read, and that was the point. Hernandez and Preston found that participants in the latter condition “with prior attitudes on an issue became less extreme after reading an argument on the issues in a disfluent format.” We run on autopilot most of the time. Sometimes offsetting biases means pausing and giving System 2 a chance to assess the situation more carefully.

 

One last point. If the Sartre Fallacy were inevitable then we could not account for moral progress. The Yale psychologist Paul Bloom observes, in a brief but cogent article for Nature, that rational deliberation played a large part in eliminating “beliefs about the rights of women, racial minorities and homosexuals… [held] in the late 1800s.” Bloom’s colleague Steven Pinker similarly argues that reason is one of our “better angels” that helped reduce violence over the millennia:

 

Reason is… an open-ended combinatorial system, an engine for generating an unlimited number of new ideas. Once it is programmed with a basic self-interest and an ability to communicate with others, its own logic will impel it, in the fullness of time, to respect the interest of ever-increasing numbers of others. It is reason too that can always take note of the shortcomings of previous exercises of reasoning, and update and improve itself in response. And if you detect a flaw in this argument, it is reason that allows you to point it out and defend an alternative.

 

When Hume noted that “reason is, and ought only to be, the slave of the passions” he was not suggesting that since irrationality is widespread we should lie back and enjoy the ride. He was making the psychological observation that our emotions mostly run the show, and advising a counter-strategy: we should use reason to evaluate the world more accurately in order to decide and behave better. The Sartre Fallacy is not inevitable, just difficult to avoid.

 

http://bigthink.com/insights-of-genius/the-sartre-fallacy-part-ii-is-it-inevitable


