7 days 7 ways: Take these daily steps to a more optimistic outlook.
One goal for positive thinking for each day of the week.
Turn your negative thinking around and live a happier, more optimistic life. Start with one goal for positive thinking each day.
Monday
Start with the simple things and smile. Even this basic act will make you feel happier.
Researchers at Barnard College in the US studied people who had undergone Botox and found the lack of facial expressions influenced their ability to feel emotions. Starting the day with a smile makes you happier.
Tuesday
Identify your strengths.
Psychologist Dr Tim Sharp, from The Happiness Institute, says, "We all need to work on and/or manage our weaknesses and limitations, but there's no doubt those who spend more time building on what they're already good at tend to be happier, healthier and more successful."
Wednesday
Choose to be happy.
Philosopher Bertrand Russell said: "Happiness must be, for most men and women, an achievement rather than a gift of the gods."
Eat a lunch that makes you smile, put on music that brings back happy memories or call a friend who always makes you feel good.
Thursday
Find a solution.
There's one for every problem if you look hard enough. Rather than risking disturbing your sleep by thinking late at night, Dr Sharp recommends finding the time of day when you're most creative and thinking most clearly to find solutions.
"We're all different so find what time works for you."
Friday
Be grateful. We tend to forget all the simple things we have to be grateful for, so take some time to write them down. Start with the basics (health, family, shelter) and then move on to more specific items. Starting a gratitude journal can also help hone your focus on all the good things in your life.
Saturday
Turn negative talk into optimism. "Real optimism includes an active search for and focus on positive things, but it's also grounded in realism," Dr Sharp says. "Aim for flexible thinking that focuses on positives as often as possible, but that also focuses on challenges when necessary, in a constructive way."
Sunday
Keep focused. Plan your week around what's going to make you feel positive. Having a short-term, mid-term and long-term goal to improve your life will see you focused on a better future. If you know what you want, and deal with any issues that come up along the way, you'll have a more positive frame of mind.
The yearly examination results announcement season is here again and, as usual, the nation celebrates the achievements of top scorers, not realizing that Malaysians' preoccupation with "scoring" in school examinations does no one any favours.
For the high-achieving students themselves, it instills the perception that straight A’s are the be-all and end-all of school life. Co-curricular activities and simply socialising with friends -- so important in developing a child's social skills -- may thereby be neglected. Moreover, the pressure to keep on getting top marks could prove overbearing, and if the student should fare less well in a subsequent exam, there might be adverse effects on his or her emotional health, sometimes resulting in depression or even suicide.
On their part, the non-top-scorers may feel as if they are left by the wayside amid the glorification of good grades, and end up having a sense of low self-worth and an inferiority complex.
Nor is society as a whole best served by the race for A’s. The prevailing exam culture has fostered a dependence on uncritical rote learning which will not help the cause of promoting creativity and innovation in the long run. And as has been well documented, many leading lights in business, the arts and science, such as Albert Einstein, James Cameron and Steve Jobs, were in fact dropouts. The examination-based education system can in fact stifle creativity and original, critical thinking, as these modes of thought usually produce varied answers.
All this is not, of course, to celebrate mediocrity; on the contrary, we should always strive to improve ourselves and pursue high achievement. At the same time, we must also recognise that achievement comes in myriad forms, not just a string of A’s on the exam results slip. While some people may be academically inclined, others may be good with their hands, have innate artistic abilities, be natural people persons ... and the list goes on. Although these life skills do not feature in our examinations, they are often more important than academic skills because working life demands these communication, interpersonal, leadership and other qualities, often more than the technical skills.
A rethink of our priorities may thus be in order. Instead of emphasising A’s at all costs, let us work towards an education system that nurtures well-rounded individuals and offers each student the opportunity to be the best they can be -- now that would be something we can really be proud of.
The celebration of academic achievements through news reports should be stopped, as it only serves to strengthen our preoccupation with academic achievements. We are producing skewed students who know a lot about examination-taking but lack other real-life skills.
What is Belief? What Does it Mean to Say "I Believe" Something is True?
Beliefs Matter Because Beliefs Compel Action, Attitudes, and Behavior
By Austin Cline
Atheists are frequently challenged to explain why they are so critical of
religious and theistic beliefs. Why do we care what others believe? Why don't
we just leave people alone to believe what they want? Why do we try to
"impose" our beliefs on theirs? Such questions frequently
misunderstand the nature of beliefs or are even just disingenuous. If beliefs
weren't important, believers wouldn't get so defensive when their beliefs are
challenged. We need more challenges to beliefs, not fewer.
What is Belief?
A belief is the mental attitude that some proposition is true. For every given
proposition, every person either has or lacks the mental attitude that it is
true — there is no middle ground between the presence or absence of a belief.
In the case of gods, everyone either has a belief that at least one god of some
sort exists or they lack any such belief.
Belief is distinct from judgment, which is a conscious mental act that involves
arriving at a conclusion about a proposition (and thus usually creating a
belief). Whereas belief is the mental attitude that some proposition is true
rather than false, judgement is the evaluation of a proposition as reasonable,
fair, misleading, etc.
Because it is a type of disposition, it isn't necessary for a belief to be
constantly and consciously manifested. We all have many beliefs which we are
not consciously aware of. There may even be beliefs which some people never
consciously think about — but, to be a belief, there should at least be
the possibility that it can manifest. A belief that a god exists often depends
on numerous other beliefs which a person hasn't consciously considered.
Belief vs. Knowledge
Although some people treat them as almost synonymous, belief and knowledge are
very distinct. The most widely accepted definition of knowledge is that
something is "known" only when it is a "justified, true
belief." This means that if Joe "knows" some proposition X, then
all of the following must be the case:
1. Joe believes X
2. X is true
3. Joe has good reasons to believe X
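The three conditions above must all hold at once; a toy sketch in Python makes the conjunction explicit. The function name and boolean flags are illustrative inventions, not part of any standard vocabulary:

```python
# A toy model of the "justified, true belief" definition of knowledge.
# Each flag mirrors one of the three numbered conditions above.

def counts_as_knowledge(believes: bool, is_true: bool, justified: bool) -> bool:
    """Joe 'knows' X only when all three conditions hold together."""
    return believes and is_true and justified

print(counts_as_knowledge(True, True, True))    # all three hold: knowledge
print(counts_as_knowledge(False, True, True))   # Joe mistakenly disbelieves
print(counts_as_knowledge(True, False, True))   # erroneous belief
print(counts_as_knowledge(True, True, False))   # lucky guess, not knowledge
```

Dropping any one flag reproduces the three failure cases discussed next: mistaken disbelief, erroneous belief, and the lucky guess.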
If the first is absent, then Joe should believe it because it is true and there
are good reasons for believing it, but Joe has made a mistake by believing
something else. If the second is absent, then Joe has an erroneous belief. If
the third is absent, then Joe has made a lucky guess rather than knowing
something. This distinction between belief and knowledge is why atheism and agnosticism
are not mutually exclusive.
While atheists can't typically deny that a person believes in some god, they
can deny that believers have sufficient justification for their belief.
Atheists may go further and deny that it is true that any gods exist, but even
if it is true that something warranting the label "god" is out there,
none of the reasons offered by theists justifies accepting their claims as
true.
Beliefs About the World
Brought together, beliefs and knowledge form a mental representation of the
world around you — a belief about the world is the mental attitude that the world
is structured in some way rather than another. This means that beliefs are
necessarily the foundation for action: whatever actions you take in the world
around you, they are based on your mental representation of the world. In the
case of theistic religions, this representation includes supernatural realms
and entities.
As a consequence, if you believe something is true, you must be willing to act
as if it were true. If you are unwilling to act as though it were true, you
can't really claim to believe it. This is why actions can matter much more than
words. We can't know the contents of a person's mind, but we can know if their
actions are consistent with what they say they believe. A religious believer
might claim that they love neighbors and sinners, for example, but does their
behavior actually reflect such love?
Why are Beliefs Important?
Beliefs are important because behavior is important and your behavior depends
on your beliefs. Everything you do can be traced back to beliefs you hold about
the world — everything from brushing your teeth to your career. Beliefs also
help determine your reactions to others' behavior — for example their refusal
to brush their teeth or their own career choices. All this means that beliefs
are not an entirely private matter. Even beliefs you try to keep to yourself
may influence your actions enough to become a matter of legitimate concern of
others.
Believers certainly can't argue that their religions have no impact on their
behavior; on the contrary, believers are frequently seen arguing that their
religion is critical for the development of correct behavior. The more
important the behavior in question is, the more important the underlying
beliefs must be. The more important those beliefs are, the more important it is
that they be open to examination, questioning, and challenges.
Tolerance & Intolerance of Beliefs
Given the link between belief and behavior, to what extent must beliefs be
tolerated and to what extent is intolerance justified? It would be legally
difficult (not to mention impossible on a practical level) to suppress beliefs,
but we can be tolerant or intolerant of ideas in a wide variety of ways. Racism
is not legally suppressed, but most moral, sensible adults refuse to tolerate
racism in their presence. We are intolerant: we don't stay silent while racists
talk about their ideology, we don't stay in their presence, and we don't vote
for racist politicians. The reason is clear: racist beliefs form the foundation
for racist behavior and this is harmful.
I don't think anyone but a racist would object to such intolerance of racism,
but if it's legitimate to be intolerant of racism then we should be willing to
consider intolerance of other beliefs as well. The real question is how much
harm the beliefs might ultimately cause, either directly or indirectly. Beliefs
can cause harm directly by promoting or justifying harm towards others. Beliefs
can cause harm indirectly by promoting false representations of the world as
knowledge while preventing believers from subjecting those representations to
critical, skeptical scrutiny.
By Benjamin Radford, Contributing Writer | July 25, 2014 02:21pm ET
In the new action thriller "Lucy" from writer and director Luc Besson, Scarlett Johansson plays a drug mule whose body is implanted with a substance that begins to seep into her bloodstream and affect her body — most importantly her brain.
Lucy develops the ability to use the "untapped" majority of her brain, which lies fallow in most people, the movie says. The authoritative, gravitas-laden voice of Morgan Freeman (as Professor Norman, a research psychologist) states in the film, "It is estimated most human beings use only 10 percent of their brain's capacity. Just imagine if we could access 100 percent. Interesting things begin to happen."
As the film goes on, and Lucy accesses more and more of her cerebral capacity, she gains superhuman abilities, such as speed reading, a photographic memory, encyclopedic knowledge, the capacity to learn a foreign language in an hour and psychic abilities such as telekinesis (moving objects with her mind). She sets out for revenge using her powers, and in the trailer when Professor Norman is asked, "What happens when she reaches 100 percent?" he replies, "I have no idea."
Actually, scientists have a pretty good idea of what happens when people use all of their brains — because most of us do: The 10 percent figure is a myth.
"Lucy" isn't a documentary, of course, and it's hardly the first sci-fi thriller to get science wrong. But it may be the most recent high-profile example of the decades-old scientific myth, or urban legend. It's not just a throwaway scientific fact stated by a character who happens to be wrong (as in "Terminator 2," when Sarah Connor says, "There are 215 bones in the human body," when in fact there are 206). In "Lucy," the myth is the entire premise of the film.
The fact is, people use all of their brains. Brain imaging research techniques such as PET (positron emission tomography) scans and fMRI (functional magnetic resonance imaging) clearly show that the vast majority of the brain does not lie unused. Although certain activities may use only a small part of the brain at a time (for example, watching reality TV shows), any sufficiently complex set of activities will use many parts of the brain.
In the book "50 Great Myths of Popular Psychology" (2010, Wiley), Dr. Scott Lilienfeld explains, "The last century has witnessed the advent of increasingly sophisticated technologies for snooping in the brain's traffic. ... Despite this detailed mapping, no quiet areas awaiting new assignments have emerged. In fact, even simple tasks generally require contributions of processing areas spread throughout virtually the whole brain."
An incredibly powerful and flexible organ, the brain can learn new languages and complex skills well into adulthood. It's tricky to say what the brain's capacity actually is, though, and the answer depends on what particular ability you're talking about. Most people can memorize only a handful of random digits using their short-term memories, though practice (and techniques such as a "memory palace," which aids recall using visualization) can significantly increase their recall.
It's not that most people have a well-defined physical or psychological limit on memory, or that people with superior memory abilities use more of their brain capacity, though. Instead, most people just don't find memorizing long strings of random numbers that important or interesting. It's all about where you put your time and (mental) resources.
So where did this 10 percent myth come from? Psychologist Barry Beyerstein of Simon Fraser University researched the urban legend for a chapter in the book "Mind Myths: Exploring Everyday Mysteries of the Mind and Brain" (Wiley, 1999), and traced the tall tale back to at least the early part of the 20th century.
In some cases people misunderstood or misinterpreted legitimate scientific findings, but the myth was really popularized by the self-help movement. Self-improvement writers such as Dale Carnegie, author of the classic book "How to Win Friends and Influence People" (first published in 1936, by Simon & Schuster) and groups such as those promoting transcendental meditation and neurolinguistic programming referenced the myth. They promised to teach people methods of getting ahead in life by tapping latent brainpower.
As cool as it would be to have superpowers like Lucy, you're not going to get them by using more of your brain. You're already using all you've got — for better or worse.
Benjamin Radford, M.Ed, is deputy editor of the Skeptical Inquirer science magazine and author of seven books, including "Hoaxes, Myths, and Manias: Why We Need Critical Thinking" (Prometheus Books, 2003). His website is www.BenjaminRadford.com.
Gain a mental edge in the shower, on the dance floor, and more
BY DEIDRE WENGEN, JANUARY 28, 2014
It
happened again: You spaced during an important meeting at work. You forgot to
feed the neighbor’s cat. You tossed your cell in the freezer. We all get bogged
down by the occasional brain fog, but by practicing a few surprising memory
tricks, you can fight back and build your brain up to be stronger than ever—and
avoid another icy iPhone.
Shower with Your
Eyes Closed
Strip down, hop
in, and shut your peepers. Searching for the handle, shampoo, and soap while
making mental notes of textures gives your brain a workout. In fact, doing
anything with your eyes closed is an easy way to refine your focus and memory,
says Ron White, a two-time winner of the USA Memory Championship. So if you’re
feeling a little skittish about a blind shower, try it in your kitchen instead.
Close your eyes and poke around for a specific item in the cupboards or on the
shelves. “It will break your routine and engage your senses.”
Dance to
"Blurred Lines" at Your Buddy's Wedding
Busting a move not only activates the cerebellum—a
part of the brain that helps with things such as forethought and judgment—but
also produces brain-derived neurotrophic factor (BDNF), a protein that helps
neurons communicate more effectively. “Dancing is social, and social
interactions will help neural circuits,” says Gary Small, M.D., director at
UCLA’s Longevity Center and co-author of The Alzheimer's Prevention Program. “It’s also
physical, which gets your heart to pump oxygen and nutrients to your brain
cells. And it helps you learn coordination.”
Act Like a Lefty
Or if you’re already a southpaw, take matters into
your right hand. Performing simple tasks such as eating or brushing your teeth
with your non-dominant hand forces your brain to relearn a common activity in a
new way. “Over time, when you use one hand to do certain tasks, it becomes
hardwired and there is almost a reflex component to using that dominant hand,” says
Allen Sills, M.D., associate professor of neurological surgery at Vanderbilt
Medical Center. “When you use your non-dominant hand and you have to activate
and engage many different brain regions, it lays down new memories and new
wiring.”
Become a Ping
Pong Champ
Take a cue from Forrest Gump and work on your table
tennis game. Ping pong improves your hand-eye coordination and gives you a dose
of brain-boosting social interaction, says Daniel Amen, M.D., a brain-imaging
researcher and founder of Amen Clinics. “When it comes to mental exercise, the
really important thing is doing things that your brain doesn’t know how to do,”
says Dr. Amen. “If I just keep doing something that I already know how to do,
it’s not that helpful. But learning different things is what really exercises
your brain.”
Hit the Mall
Although shopping can do a number on your wallet,
it provides surprising benefits for your brain, Dr. Small says. “You do a lot
of mental and physical activities when you go shopping. You’re walking, you’re
engaging with people, you’re making calculations,” he says. “Each of these
exercises will stimulate different parts of your brain and provide a
cross-training effect.” But stay on budget—overspending can cause stress and
actually shrink your brain, says Dr. Small.
Go to Clown
College
According to a 2013 study in the journal Nature, learning how to juggle can actually make areas
of your brain grow. After non-jugglers practiced the tricky activity for 3
months, they showed an increase in gray matter in the mid-temporal area and the
posterior intraparietal sulcus—portions of the brain responsible for visual and
motor activity. “Taking on a new task that involves some motor activity,
pattern recognition, and spatial orientation will activate multiple regions and
reawaken dormant areas of the brain,” Dr. Sills says.
Eyes are the window to the . . . brain? A breakthrough study in Psychological Science finds that the small vessels behind your eyes could reveal how healthy your noggin is.
The scientists found that people with wider veins scored worse on IQ tests in middle age. Other factors like smoking, diabetes, or socioeconomic status couldn’t be to blame for the scores, says Idan Shalev, Ph.D., the study’s lead author.
What gives? Your eyes’ vessels may reflect the condition of your brain’s vessels because they're similar in size, structure, and function, says Shalev. “Eye vessels are developed from the same cells that brain vessels are developed from,” he adds.
Previous studies have linked the size of blood vessels in your eyes to risks for other diseases like dementia, cardiovascular disease, or stroke—but those studies were done in older people, says Shalev. This study found that the health of your eyes could indicate brain health at a much earlier age. The results were seen even in children.
So what does it mean for you? Pencil in the eye doctor. Even if you’re blessed with 20/20 vision, retinal imaging (a fancy term for the photo eye docs take of your eyes) does far more than test vision: It could be the easiest way yet to check in on your brain. It’s also a good way to keep track of changes if you’re at high risk for a disease like cardiovascular disease, Shalev says. Being able to compare images over time could help ID changes in midlife that hint towards problems. Otherwise, these changes could go unnoticed as they may not show symptoms until much later, he says. Source: http://www.menshealth.com
Editor's Note: This is one of the most-read leadership articles of 2013.
Get ready to have your mind blown.
I was seriously shocked at some of these mistakes in thinking that I subconsciously make all the time. Obviously, none of them are huge, life-threatening mistakes, but they are really surprising and avoiding them could help us make more rational, sensible decisions.
Duck Or Rabbit?
Especially since we strive for self-improvement at Buffer, if we look at our values, being aware of the mistakes we naturally have in our thinking can make a big difference in avoiding them. Unfortunately, most of these occur subconsciously, so it will also take time and effort to avoid them--if you want to.
Regardless, I think it’s fascinating to learn more about how we think and make decisions every day, so let’s take a look at some of these habits of thinking that we didn’t know we had.
We tend to like people who think like us. If we agree with someone’s beliefs, we’re more likely to be friends with them. While this makes sense, it means that we subconsciously begin to ignore or dismiss anything that threatens our world views, since we surround ourselves with people and information that confirm what we already think.
This is called confirmation bias. If you’ve ever heard of the frequency illusion, this is very similar. The frequency illusion occurs when you buy a new car, and suddenly you see the same car everywhere. Or when a pregnant woman suddenly notices other pregnant women all over the place. It’s a passive experience, where our brains seek out information that’s related to us, but we believe there’s been an actual increase in the frequency of those occurrences.
Confirmation bias is a more active form of the same experience. It happens when we proactively seek out information that confirms our existing beliefs.
Not only do we do this with the information we take in, but we approach our memories this way, as well. In an experiment in 1979 at the University of Minnesota, participants read a story about a woman called Jane who acted extroverted in some situations and introverted in others. When the participants returned a few days later, they were divided into two groups. One group was asked if Jane would be suited to a job as a librarian, the other group was asked about her having a job as a real-estate agent. The librarian group remembered Jane as being introverted and later said that she would not be suited to a real-estate job. The real-estate group did exactly the opposite: They remembered Jane as extroverted, said she would be suited to a real-estate job, and when they were later asked if she would make a good librarian, they said no.
In 2009, a study at Ohio State University showed that we will spend 36% more time reading an essay if it aligns with our opinions.
Whenever your opinions or beliefs are so intertwined with your self-image that you couldn’t pull them away without damaging your core concepts of self, you avoid situations that may cause harm to those beliefs. --David McRaney
This video teaser for David McRaney’s book, You are Now Less Dumb, explains this concept really well with a story about how people used to think geese grew on trees (seriously), and how challenging our beliefs on a regular basis is the only way to avoid getting caught up in the confirmation bias:
Professional swimmers don’t have perfect bodies because they train extensively. Rather, they are good swimmers because of their physiques. How their bodies are designed is a factor for selection and not the result of their activities.
The “swimmer’s body illusion” occurs when we confuse selection factors with results. Another good example is top-performing universities: Are they actually the best schools, or do they choose the best students, who do well regardless of the school’s influence? Our mind often plays tricks on us, and that is one of the key ones to be aware of.
What really jumped out at me when researching this section was this particular line from Dobelli’s book:
It makes perfect sense, when you think about it. If we believed that we were predisposed to be good at certain things (or not), we wouldn’t buy into ad campaigns that promised to improve our skills in areas where it’s unlikely we’ll ever excel.
No matter how much I pay attention to the sunk-cost fallacy, I still naturally gravitate towards it.
The term sunk cost refers to any cost (not just monetary, but also time and effort) that has been paid already and cannot be recovered. So it's a payment of time or money that’s gone forever, basically.
The reason we can’t ignore the cost, even though it’s already been paid, is that we are wired to feel loss far more strongly than gain. Psychologist Daniel Kahneman explains this in his book, Thinking, Fast and Slow:
Organisms that placed more urgency on avoiding threats than they did on maximizing opportunities were more likely to pass on their genes. So over time, the prospect of losses has become a more powerful motivator on your behavior than the promise of gains.
The sunk-cost fallacy plays on our tendency to emphasize loss over gain. This research study is a great example of how it works:
Hal Arkes and Catherine Blumer created an experiment in 1985 that demonstrated your tendency to go fuzzy when sunk costs come along. They asked subjects to assume they had spent $100 on a ticket for a ski trip in Michigan, but soon after found a better ski trip in Wisconsin for $50 and bought a ticket for this trip, too. They then asked the people in the study to imagine they learned the two trips overlapped and the tickets couldn’t be refunded or resold. Which one do you think they chose, the $100 good vacation, or the $50 great one?
More than half of the people in the study went with the more expensive trip. It may not have promised to be as fun, but the loss seemed greater.
So like the other mistakes I’ve explained in this post, the sunk-cost fallacy leads us to miss or ignore the logical facts presented to us and instead make irrational decisions based on our emotions--without even realizing we’re doing so:
The fallacy prevents you from realizing the best choice is to do whatever promises the better experience in the future, not which one negates the feeling of loss in the past.
Being such a subconscious reaction, it’s hard to avoid this one. Our best bet is to try to separate the current facts we have from anything that happened in the past. For instance, if you buy a movie ticket only to realize the movie is terrible, you could either:
A) stay and watch the movie, to “get your money’s worth” since you’ve already paid for the ticket (sunk-cost fallacy)
or
B) leave the cinema and use that time to do something you’ll actually enjoy.
The thing to remember is this: You can’t get that investment back. It’s gone. Don’t let it cloud your judgment in whatever decision you’re making in this moment--let it remain in the past.
Imagine you’re playing Heads or Tails with a friend. You flip a coin, over and over, each time guessing whether it will turn up heads or tails. You have a 50-50 chance of being right each time.
Now, suppose you’ve flipped the coin five times already and it’s turned up heads every time. Surely, surely, the next one will be tails, right? The chances of it being tails must be higher now, right?
Well, no. The chances of tails turning up are 50-50. Every time. Even if you turned up heads the last 20 times. The odds don’t change.
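The claim above is easy to check empirically. The short simulation below (a quick sketch, with an arbitrary trial count and seed chosen for reproducibility) keeps only the sequences that start with five heads and then looks at how often the sixth flip is tails:

```python
import random

# Simulate a fair coin: among sequences that begin with five heads,
# the sixth flip should still come up tails about half the time.
random.seed(42)
tails_after_streak = 0
streaks_seen = 0

while streaks_seen < 10_000:
    flips = [random.choice("HT") for _ in range(6)]
    if flips[:5] == ["H"] * 5:          # keep only runs of five heads
        streaks_seen += 1
        if flips[5] == "T":
            tails_after_streak += 1

print(tails_after_streak / streaks_seen)  # hovers around 0.5, not higher
```

The ratio stays near 0.5 no matter how long the preceding streak is, which is exactly the point: the coin has no memory.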
The gambler’s fallacy is a glitch in our thinking--once again, we’re proven to be illogical creatures. The problem occurs when we place too much weight on past events, confusing our memory with how the world actually works, and believe that those events will have an effect on future outcomes (when, in the case of Heads or Tails, they make absolutely no difference to the odds).
Unfortunately, gambling addictions in particular are also affected by a similar mistake in thinking--the positive expectation bias. This is when we mistakenly think that eventually our luck has to change for the better. Somehow, we find it impossible to accept bad results and give up--we often insist on keeping at it until we get positive results, regardless of what the odds of that actually happening are.
I’m as guilty of this as anyone. How many times have you gotten home after a shopping trip only to be less than satisfied with your purchase decisions and started rationalizing them to yourself? Maybe you didn’t really want it after all, or in hindsight you thought it was too expensive. Or maybe it didn’t do what you hoped and was actually useless to you.
Regardless, we’re pretty good at convincing ourselves that those flashy, useless, badly thought-out purchases are necessary after all. This is known as post-purchase rationalization or Buyer’s Stockholm Syndrome.
Social psychologists say it stems from the principle of commitment, our psychological desire to stay consistent and avoid a state of cognitive dissonance.
Cognitive dissonance is the discomfort we get when we’re trying to hold onto two competing ideas or theories. For instance, if we think of ourselves as being nice to strangers, but then we see someone fall over and don’t stop to help them, we would then have conflicting views about ourselves: We are nice to strangers, but we weren’t nice to the stranger who fell over. This creates so much discomfort that we have to change our thinking to match our actions--in other words, we start thinking of ourselves as someone who is not nice to strangers, since that’s what our actions proved.
So in the case of our impulse shopping trip, we would need to rationalize the purchases until we truly believe we needed to buy those things so that our thoughts about ourselves line up with our actions (making the purchases).
The tricky thing in avoiding this mistake is that we generally act before we think (which can be one of the most important traits of successful people!), leaving us to rationalize our actions afterwards.
Being aware of this mistake can help us avoid it by predicting it before taking action--for instance, as we’re considering a purchase, we often know that we will have to rationalize it to ourselves later. If we can recognize this, perhaps we can avoid it. It’s not an easy one to tackle though!
Dan Ariely illustrates this particular mistake in our thinking superbly, with multiple examples. The anchoring effect essentially works like this: rather than making a decision based on pure value for investment (time, money, and the like), we factor in comparative value--that is, how much value an option offers when compared to another option.
Let’s look at some examples from Dan, to illustrate this effect in practice:
One example is an experiment that Dan conducted using two kinds of chocolates for sale in a booth: Hershey’s Kisses and Lindt Truffles. The Kisses were one penny each, while the Truffles were 15 cents each. Considering the quality differences between the two kinds of chocolates and the normal prices of both items, the Truffles were a great deal, and the majority of visitors to the booth chose the Truffles.
For the next stage of his experiment, Dan offered the same two choices, but lowered the prices by one cent each. So now the Kisses were free, and the Truffles cost 14 cents each. Of course, the Truffles were even more of a bargain now, but since the Kisses were free, most people chose those, instead.
Your loss-aversion system is always vigilant, waiting on standby to keep you from giving up more than you can afford to spare, so you calculate the balance between cost and reward whenever possible. -You Are Not So Smart
Another example Dan offers in his TED talk is when consumers are given holiday options to choose between. When given a choice of a trip to Rome, all expenses paid, or a similar trip to Paris, the decision is quite hard. Each city comes with its own food, culture, and travel experiences that the consumer must choose between.
When a third option is added, however, such as the same Rome trip but without coffee included in the morning, things change. When consumers see that they would have to pay 2.50 euros for coffee on the third trip, the original Rome trip suddenly seems superior not only to that option, but to the Paris trip as well--even though they probably hadn't considered whether coffee was included at all before the third option was added.
Here’s an even better example from another of Dan’s experiments:
Dan found this real ad for subscriptions to The Economist and used it to see how a seemingly useless choice (like Rome without coffee) affects our decisions.
To begin with, there were three choices: subscribe to The Economist web version for $59, the print version for $125, or subscribe to both the print and web versions for $125. It's pretty clear what the useless option is here. When Dan gave this form to 100 MIT students and asked them which option they would choose, 84% chose the combo deal for $125, 16% chose the cheaper web-only option, and nobody chose the print-only option for $125.
Next, Dan removed the 'useless' print-only option that nobody wanted and tried the experiment with another group of 100 MIT students. This time, the majority chose the cheaper, web-only version, and the minority chose the combo deal. So even though nobody wanted the bad-value $125 print-only option, it wasn't actually useless--in fact, it informed the decisions people made between the two other options by making the combo deal seem more valuable by comparison.
This mistake is called the anchoring effect, because we tend to focus on a particular value and compare it to our other options, seeing the difference between values rather than the value of each option itself.
Eliminating the "useless" options ourselves as we make decisions can help us choose more wisely. On the other hand, Dan says, a big part of the problem comes from simply not knowing our own preferences very well, so perhaps that’s the area we should focus on more, instead.
Our memories are highly fallible and plastic. And yet, we tend to subconsciously favor them over objective facts. The availability heuristic is a good example of this. It works like this:
Suppose you read a page of text and then you’re asked whether the page includes more words that end in “ing” or more words with “n” as the second-last letter. Obviously, it would be impossible for there to be more “ing” words than words with “n” as their penultimate letter (it took me a while to get that--read over the sentence again, carefully, if you’re not sure why that is). However, words ending in “ing” are easier to recall than words like hand, end, or and, which have “n” as their second-last letter, so we would naturally answer that there are more “ing” words.
What’s happening here is that we are basing our answer of probability (that is, whether it’s probable that there are more “ing” words on the page) on how available relevant examples are (for instance, how easily we can recall them). Our troubles in recalling words with “n” as the second last letter make us think those words don’t occur very often, and we subconsciously ignore the obvious facts in front of us.
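The subset relation behind this puzzle can even be checked mechanically. Here's a minimal sketch (the word list is just a made-up sample for illustration): every word ending in "ing" also has "n" as its second-to-last letter, so the "n" count can never be smaller than the "ing" count.

```python
# Words ending in "ing" are a subset of words whose second-to-last
# letter is "n" -- so the "n" count is always >= the "ing" count.
words = ["running", "hand", "end", "thing", "and", "sing", "table", "wind"]

ing_words = [w for w in words if w.endswith("ing")]
n_penultimate = [w for w in words if len(w) >= 2 and w[-2] == "n"]

# Every "ing" word also shows up in the penultimate-"n" list.
assert set(ing_words) <= set(n_penultimate)
assert len(n_penultimate) >= len(ing_words)
```

Even though the facts guarantee this inequality, "ing" words spring to mind far more easily, which is exactly what the availability heuristic exploits.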
Although the availability heuristic is a natural process of our thinking, two Chicago scholars have explained how wrong it can be:
Yet reliable statistical evidence will outperform the availability heuristic every time.
The lesson here? Whenever possible, look at the facts. Examine the data. Don’t base a factual decision on your gut instinct without at least exploring the data objectively first. If we look at the psychology of language in general, we’ll find even more evidence that looking at facts first is necessary.
The funny thing about lots of these thinking mistakes, especially those related to memory, is that they’re so ingrained. I had to think long and hard about why they’re mistakes at all! This one is a good example--it took me a while to understand how illogical this pattern of thinking is.
The human mind is so wedded to stereotypes and so distracted by vivid descriptions that it will seize upon them, even when they defy logic, rather than upon truly relevant facts.
Here’s an example to illustrate the mistake, from researchers Daniel Kahneman and Amos Tversky:
In 1983, Kahneman and Tversky tested how illogical human thinking is by describing the following imaginary person:
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations.
The researchers asked people to read this description, and then asked them to answer this question:
Which alternative is more probable?
1. Linda is a bank teller. 2. Linda is a bank teller and is active in the feminist movement.
Here's where it can get a bit tricky to understand (at least, it did for me!)--if answer #2 is true, #1 must also be true. This means that #2 can never be more probable than #1, so it cannot be the answer to the question.
Unfortunately, few of us realize this, because we’re so overcome by the more detailed description of #2. Plus, as the earlier quote pointed out, stereotypes are so deeply ingrained in our minds that we subconsciously apply them to others.
Roughly 85% of people chose option #2 as the answer. A simple choice of words can change everything.
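The conjunction rule behind the Linda problem can be made concrete with some made-up numbers (the probabilities below are purely illustrative, not from the study): the probability of two things both being true can never exceed the probability of either one alone.

```python
# Conjunction rule: P(A and B) <= P(A), no matter what the numbers are.
# These probabilities are invented purely for illustration.
p_bank_teller = 0.05            # P(Linda is a bank teller)
p_feminist_given_teller = 0.30  # P(active feminist, given she's a teller)

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_both = p_bank_teller * p_feminist_given_teller

# The conjunction is necessarily the less probable option.
assert p_both <= p_bank_teller
```

Adding the vivid "feminist" detail makes option #2 feel more plausible, but mathematically it can only shrink the probability.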
Again, we see here how irrational and illogical we can be, even when the facts are seemingly obvious.
I love this quote from researcher Daniel Kahneman on the differences between economics and psychology:
I was astonished. My economic colleagues worked in the building next door, but I had not appreciated the profound difference between our intellectual worlds. To a psychologist, it is self-evident that people are neither fully rational nor completely selfish, and that their tastes are anything but stable.
Clearly, it’s normal for us to be irrational and to think illogically, especially when language acts as a limitation to how we think, even though we rarely realize we’re doing it. Still, being aware of the pitfalls we often fall into when making decisions can help us to at least recognize them, if not avoid them.
Have you come across any other interesting mistakes we make in the way we think? Let us know in the comments.
--Belle Beth Cooper is a content crafter at Buffer, a smarter way to share on Twitter and Facebook. Follow her on Twitter at @BelleBethCooper
Reprinted with permission from Buffer [Image: Flickr user Florian Rathcke]