Chapter 4: Intelligence vs rationality
Einstellung effect: The tendency to apply tried-and-tested solutions to problems
which seem to fit a pattern of previously encountered problems, instead of
evaluating each problem from first principles. It is also described as the
development of a mechanized state of mind.
Why does it seem impossible
to convince people by citing facts?
Let’s start with a
description of rationality. It is an incomplete description but still useful.
A rational person exhibits
“fully disjunctive reasoning”. That means reasoning that considers all
possibilities. The human mind has a built-in tendency to trade off accuracy (“truth”) for efficiency (“quick decisions”), effectively taking shortcuts to conclusions. In the past this helped our ancestors survive in the wild. In fact, some complex tasks
that require skill (like driving or playing a musical instrument) are better
accomplished by not thinking consciously the whole time. But
while fully disjunctive reasoning may be computationally expensive, and not
practical to apply all the time, it is usually necessary when attempting to
reach accurate conclusions from a large volume of facts.
A rational person possesses
certain “mindware”, or cognitive tools if you like. One of them is the scientific method — starting with the “facts”, seeking tangible evidence
for other people’s claims, forming and testing
one’s own hypotheses, and so on. Also very useful are tools that help to account for uncertainty (or randomness), like the concept of probability. More tools were described in Chapter 1 (in Part 1, the previous post in this series).
Does this mean that only
scientists (or statisticians) can be “rational”? No. We would like concepts
like “hypothesis” and “probability” to be practical, commonsensical tools that
we can use, rather than rigorous, mathematical notions for geeks to bore us
with.
Now that we have a working
understanding of what rationality means to us, we may expect intelligent people
to be (on average) more rational than less intelligent people. But studies actually find quite a low correlation (less than 0.3 on a scale that runs from -1 to 1) between intelligence, as measured by IQ tests, and rationality as described here. Is there an explanation for this?
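To get a feel for what that number means, here is a minimal sketch in Python. The IQ and rationality scores are simulated purely for illustration (they are not data from any study); the point is only to show how a correlation coefficient is computed and how weak a value around 0.3 is.

```python
# A rough sketch of what "a correlation below 0.3" means in practice.
# The IQ and rationality scores below are simulated, purely for illustration;
# they are not data from any real study.
import random
import statistics

def pearson_correlation(xs, ys):
    """Pearson's r: covariance of x and y divided by the product of their standard deviations."""
    mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (statistics.stdev(xs) * statistics.stdev(ys))

random.seed(42)
iq = [random.gauss(100, 15) for _ in range(1000)]
# Rationality here is only weakly tied to IQ, with plenty of independent noise mixed in.
rationality = [0.3 * (q - 100) / 15 + random.gauss(0, 1) for q in iq]

print(round(pearson_correlation(iq, rationality), 2))  # roughly 0.3
```

With a correlation this low, knowing someone’s IQ score tells you very little about how rational their everyday judgements are.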
We can investigate in terms
of a few known cognitive obstacles to rationality.
Confirmation bias: The tendency to interpret
new information and recall old information in a way that confirms one’s
pre-existing beliefs. It can also manifest as a tendency to seek out facts
which support pre-existing beliefs and ignore facts which contradict them.
Implicit bias: The ability to recognize patterns and make generalizations is critical to the way our minds make sense of the world. But the same thought
processes that make people smart also tend to make them biased. Most people
are implicitly biased even if they do not think of themselves as prejudiced
(e.g. racist or sexist). This becomes clear, for instance, when comparing “stated” preferences against “revealed” preferences.
Groupthink: Occurs when a group makes
suboptimal (or irrational) decisions because individuals within the group want
to minimize conflict and maximize consensus. The more cohesive a group, the
more prone it is to groupthink. In the play (and movie) Twelve Angry Men, members of a jury under the spell of groupthink are ready to convict an innocent man of murder.
A combination of these
factors can partly explain why ideas like evolution, climate change (and the
idea of an earth that is not flat!) continue to have large numbers of deniers.
As we saw in Chapter 3, no amount of evidence is sufficient to “prove” a theory right. But the issue here is not a lack of credible evidence; it is the denial that evidence is even relevant to the discussion.
Many of the ideas presented
in this chapter so far can be found in the book Return to Reason — The Science of Thought published by Scientific
American.
In addition to these, I feel
that age and indoctrination may be factors that strongly influence rationality,
or lack of it. Why age? Younger people are more responsive to new information — children wouldn’t be able to pick up languages so quickly otherwise.
But they are less exposed to diverse viewpoints through peer interactions,
though this is less true now thanks to the internet. Conversely, older people
are less responsive to new information for reasons already described. But they
are less dependent on authority figures and more exposed to diverse viewpoints.
Some societies (e.g. Islamic
countries) are more ideologically rigid than others, so even adults could be
less exposed to diverse viewpoints. Statements made by authority figures are
expected to be believed by fiat. We’ve already seen how evidence in conflict with core beliefs tends to be ignored or discredited, while evidence that supports them tends to be filtered in. What do I mean by “core beliefs”? These would
be beliefs in which an individual is heavily invested — essentially one’s ideology. They may be beliefs in which one’s
identity (family, community, nation…) is anchored or beliefs imbibed from
authority figures (parents, teachers, employers…). When new ideas start to
spread, those most invested in old belief systems are expected to resist them,
much as the petroleum lobby does its best to block renewable energy.
While some beliefs prop up our identity, others derive from our moral sense; e.g. if injustice ought to be corrected, someone must do the correcting. This can give rise to the paradox of self-fulfilling beliefs: trial by ordeal, as practised in medieval times, actually worked in many cases, presumably because the accused genuinely believed it would reveal the truth. Studies leave little doubt that placebo effects are real and measurable, though the precise mechanism remains unknown.
Finally, there is another
reason for myths to be preferred over scientific theories, and it is well
encapsulated in this poetic statement — “the universe is made of
stories, not atoms”. We understand best through analogy. Stories go beyond analogy — they allow the reader to
insert themselves into the centre of the narrative. But this is possible only because stories have human (or anthropomorphic) characters. Unfortunately,
a poor
analogy always confounds more than it clarifies.
Fundamental concepts are often best understood from “first principles” even if
it means wading through jargon and math. Otherwise you end up with Quantum Woo
and New Age Science.
Am I suggesting that the
Rationalist is immune to, or has somehow been able to transcend, these biases?
Certainly not. Everyone is biased. All we can do is be aware of our inherent
biases and try to prevent them from shaping our beliefs — precisely what the methods
outlined in Chapter 1 are designed to do.
Chapter 5: Social media
Could social media be
creating cultural and ideological bubbles?
While the internet has got to
be the most powerful way for people across the world to get exposed to diverse
knowledge and ideas from outside their community, it can also have a perverse,
polarizing influence. For instance, take social media. Regardless of the
reasons for each individual user to be on a social media platform (Facebook,
Twitter, WhatsApp, YouTube…), the “goal” of the platform is the same: to retain the maximum number of active users and to ensure that each user spends as much time as possible on the platform. Both of these goals are tied to revenue from digital advertising.
And the platform, through a set of robotic algorithms running in the background, pursues these goals blindly and relentlessly, exploiting whatever it “knows” about your tastes, preferences and ideological leanings.
One of these algorithms is
the Recommender Engine — it feeds the user suggestions
for content that it predicts the user will be “interested” in. So YouTube might
show you recommendations based on what other users clicked on
right after watching the same video that you just watched. And it may use your
clicks to similarly generate recommendations for other users.
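Real recommendation systems are far more sophisticated, but the basic “people who watched this also clicked that” idea can be sketched in a few lines of Python. All the video IDs and watch histories below are invented for illustration.

```python
# Toy sketch of item-to-item recommendations: count which videos tend to be
# watched by the same user, then suggest the most frequent companions.
# All video IDs and watch histories are invented for illustration.
from collections import defaultdict
from itertools import permutations

watch_histories = {
    "user_1": ["cat_video", "flat_earth_talk", "cat_video_2"],
    "user_2": ["cat_video", "cat_video_2"],
    "user_3": ["flat_earth_talk", "moon_hoax_talk"],
    "user_4": ["flat_earth_talk", "moon_hoax_talk"],
}

# co_clicks[a][b] = number of users who watched both a and b
co_clicks = defaultdict(lambda: defaultdict(int))
for history in watch_histories.values():
    for a, b in permutations(set(history), 2):
        co_clicks[a][b] += 1

def recommend(video, k=2):
    """The k videos most often watched by people who also watched `video`."""
    companions = co_clicks[video]
    return sorted(companions, key=companions.get, reverse=True)[:k]

# Watchers of one conspiracy video tend to get shown another.
print(recommend("flat_earth_talk"))  # ['moon_hoax_talk', ...]
```

Notice that nothing in this logic checks whether the suggested video is accurate or fair; it only reflects what similar users tended to click next.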
Google Search may sort
or filter your search results based on information that it
has about you (your location, search history etc.) and not
just your search phrase, effectively hiding results that it “thinks” might not
be of interest to you.
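In the same spirit, here is a toy sketch of what personalized re-ranking could look like. This is not Google’s actual algorithm (which is proprietary); the URLs, scores, topics and user profile are all made up.

```python
# Toy sketch of personalized re-ranking: boost results whose topics overlap
# with what the engine already "knows" about the user. Not a real search
# engine's algorithm; all values below are invented for illustration.
results = [
    {"url": "example.com/climate-science",      "base_score": 0.90, "topics": {"science", "climate"}},
    {"url": "example.com/climate-skeptic-blog", "base_score": 0.85, "topics": {"opinion", "climate"}},
    {"url": "example.com/gardening-tips",       "base_score": 0.80, "topics": {"hobby"}},
]

# What the engine has inferred about the user from location, history, clicks...
user_profile = {"opinion", "politics"}

def personalized_score(result, profile, boost=0.2):
    """Add a bonus for every topic a result shares with the user's profile."""
    overlap = len(result["topics"] & profile)
    return result["base_score"] + boost * overlap

for r in sorted(results, key=lambda r: personalized_score(r, user_profile), reverse=True):
    print(r["url"])
# For this user the skeptic blog now outranks the science page,
# even though its "neutral" relevance score is lower.
```

A result that would otherwise rank lower can rise to the top simply because it matches what the engine already believes about you, which is exactly how other results quietly disappear from view.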
There is another way in which
content on social media is different from traditional media: most of the
content is crowd-sourced (or “user-generated”). That means the cost, for an individual or small group, of publishing content to anyone who cares to consume it is much lower than it used to be with traditional media, where content was filtered (in effect, moderated) by a publishing industry. Though this new, democratic process of content generation is exciting
and in some sense fair, it does have a side-effect. Relatively extreme views
can now be broadcast via mainstream platforms, to huge audiences, where earlier
those wanting to express any views (extreme or otherwise) had to go through the
effort of writing books or magazine columns or speaking at public forums. To
add to that, there is no need to maintain even a modicum of factual accuracy
when it comes to user-generated content. This is more than evident from many
YouTube videos and the problem of fake
news on WhatsApp.
All these factors combine to
create filter
bubbles and echo
chambers. This may partly explain the manipulation of voter sentiment with rumours, the growing animosity between the right wing and liberals across the world, and the radicalization of youth through online propaganda.
The Rationalist can hope to
avoid the manipulative aspects of the internet and social media just by being
aware of them. Every time YouTube appears to be leading you down a narrow
alley, ignore the “recommendations” and enter a fresh search. Use search
engines that respect your privacy. Don’t read YouTube comments; they are mostly spam anyway. These are my humble tips. A little bit
can be a lot.
Written by Ambar Nag
ambarnag@gmail.com
(Continued in Part 3)