<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title type="text">Articles by Ian Stevens</title>
  <id>urn:uuid:d7012b07-ce1c-3355-b28f-5d1a2062ef36</id>
  <updated>2020-12-16T00:00:00Z</updated>
  <link href="https://ianstevens.ca/articles/" />
  <link href="https://ianstevens.ca/articles/feed.xml" rel="self" />
  <author>
    <name>Ian Stevens</name>
  </author>
  <generator uri="https://github.com/ajdavis/lektor-atom" version="0.3">Lektor Atom Plugin</generator>
  <entry xml:base="https://ianstevens.ca/articles/child-product-inclusive-redesign/">
    <title type="text">Are you repelling users with your product design?</title>
    <id>urn:uuid:22c32703-1f63-31be-a491-ac5f74f318c6</id>
    <updated>2020-12-16T00:00:00Z</updated>
    <link href="https://ianstevens.ca/articles/child-product-inclusive-redesign/" />
    <author>
      <name>Ian Stevens</name>
    </author>
    <content type="html">&lt;p&gt;Good product design attracts, bad product design
repels. Yet not everyone experiences bad design in the same
way. Too often, &lt;a href=&quot;/articles/lurking-joanne-mcneil-review/&quot;&gt;marginalized and underrepresented users
experience unintended product outcomes like anguish, anxiety, and a feeling of
alienation&lt;/a&gt;. This can happen
when a product's entire userbase isn't fully considered in its
design. How do you avoid repelling and alienating users
and build a product that's inclusive and welcoming, one with
a truly universal appeal?&lt;/p&gt;
&lt;aside&gt;
This is one example of short-sighted
design in an electronic product for children.
&lt;/aside&gt;&lt;p&gt;I'm regularly reminded of the importance of
inclusive design, particularly in children's products.
Both of my children are Black, and our family is often
disappointed when characters in and on
clothes, toys, and books don't match our children's skin
colour and hair texture. As an example, the only way I was able
to get a Black Lego minifigure was to buy a &lt;a href=&quot;https://www.lego.com/en-ca/product/women-of-nasa-21312&quot;&gt;Women of NASA kit&lt;/a&gt;
which included Mae Jemison. &lt;a href=&quot;https://www.lego.com/en-sg/service/help/fun-for-fans/behind-the-scenes/brick-facts/why-are-minifigures-yellow-bltf9f5e2d63aefa532&quot;&gt;Lego doesn't think it's a problem&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Here's one example of short-sighted
design in an electronic product for children.
I'll tell you what went wrong, why it might have happened, how it could have been avoided, and walk through how the design could be improved. Frankly, I was disappointed when I learned how the team responsible for
the design's shortcomings addressed them.&lt;/p&gt;
&lt;p&gt;Both my children have Nabi Jr. tablets, full-featured Android devices loaded
with educational and fun apps for kids aged 3-5. They're locked down,
but parents and caregivers can enter &quot;Mommy Mode&quot; with a
password for software settings, app store access, and other utilities.&lt;/p&gt;
&lt;p&gt;You read that right: Mommy Mode. Icons for apps requiring internet access are
greyed out and overlaid with &quot;Account Required. Ask Mom.&quot; If that wasn't enough, the top-right corners of these icons have what looks like a red ID card with
a bob-haired silhouette and a tiny gold lock badge. Tapping the icon brings up a
modal dialog titled &quot;Ask Mom!&quot;. The silhouette is
revealed to be a smiling white-presenting woman with a pink
hairband. That same &quot;mom&quot; also appears in a menu which slides
out from the top of the screen. At least the messaging is
consistent.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;mommymodemontage.jpg&quot;/&gt;
&lt;figcaption&gt;(L-R) Mommy Mode app icon overlay, slide-down top menu, and dialog. Hands up if this looks like you or your mother. These devices are geared towards children aged 3-5. How many that age can read and understand these text labels?&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;To be fair, it does say &quot;your mom or dad&quot; in the text of the dialog, but even this messaging is inappropriate
for many use cases. Consider how it might affect a grandmother or a step-parent as primary caregiver.
How would a child in foster care or an orphan feel when reading it? These
aren't &quot;edge cases&quot;, they're &lt;em&gt;people&lt;/em&gt;. &lt;a href=&quot;https://www.census.gov/newsroom/press-releases/2016/cb16-192.html&quot;&gt;Over 2 million children in the States live without either
of their parents&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;That microcopy and design also do a disservice to many millions of other children
and parents. Would a child of gay parents — or the gay parents themselves — see
themselves in this Mommy Mode? What about people of colour and their children? My
wife is Black. This digital mom doesn't represent her.
The design doesn't fit,
as it wouldn't for millions of other parents and
children in North America. This is a huge oversight.&lt;/p&gt;
&lt;aside&gt;My
wife is Black. This digital mom doesn't represent her.
The design doesn't fit,
as it wouldn't for millions of other parents and
children in North America.&lt;/aside&gt;&lt;p&gt;Why weren't these many parents and children considered in the design?
To say the Nabi product team didn't care is probably far from the truth. It's more likely
they weren't attuned to gender and racial bias. &lt;a href=&quot;/articles/confirmation-bias-part-1/&quot;&gt;Bias
affects us all and can lead to irrational
decisions&lt;/a&gt;. Many people are biased towards mothers as caregivers and wouldn't
give a second thought to this design. It wouldn't be surprising, then, if the team in
charge of design and microcopy held this and other biases.&lt;/p&gt;
&lt;p&gt;Similarly, a biased
view of Nabi's users as white likely stopped its product team from considering
alternatives to the fair-skinned bob-haired mother. Had the team been made up of
active fathers, ex-foster or adopted children, people who grew up in
mixed-family homes, as well as people of colour, someone would have
noticed the limits of the Mommy Mode design and seized the opportunity for
inclusivity.&lt;/p&gt;
&lt;p&gt;Another possibility is that the team didn't fully consider their
userbase. There are the children who &lt;em&gt;play&lt;/em&gt; with the Nabi, sure, and there's also
a parent or caregiver who &lt;em&gt;supports&lt;/em&gt; it. Broadening that support userbase
beyond mothers — and fathers — opens us up to many different considerations. Let's briefly
consider a few user personas and pair a supporting user with each child. They're
contrived, light on detail, based on assumptions, and would later be validated through
interviews:&lt;/p&gt;
&lt;figure&gt;&lt;ul&gt;
&lt;li&gt;Ofei - 5 years old, two parents; Mother Esi and Father Kofi as supporting users&lt;/li&gt;
&lt;li&gt;Lee - 3 years old, single Dad with a nanny; Father Tim and nanny Laura as supporting users&lt;/li&gt;
&lt;li&gt;Han - 4 years old, lives with grandparents; Grandfather Chen as supporting user&lt;/li&gt;
&lt;/ul&gt;&lt;/figure&gt;&lt;p&gt;With these personas, a stereotype of a
mother as primary caregiver — and a white one at that — just won't do. We need something else, something
which conveys &quot;give this to a grownup&quot; to a child who probably can't even read. Too much detail
could alienate the child and her supporting users. What we need is an abstract
representation of stereotypical grownup-child interaction.&lt;/p&gt;
&lt;p&gt;One such abstraction could be of a child handing a tablet to a grownup.
We could use pictograms,
as in the image below. More details could be added, like eyes and
facial expressions, and the figures could be styled to fit existing aesthetics.
Finer detail — clothes, hair, skin colour, etc. — could take away from the
inclusive design we're trying to achieve. Cartoon animals and other
non-human characters could also be used here, but tread carefully.
Our biases and stereotypes are readily applied to cartoons if not kept in check.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;nabipicto.svg&quot; width=&quot;50%&quot;/&gt;
&lt;figcaption&gt;Pictogram of a child handing a tablet to an adult.
Placed in the context of the Mommy Mode dialog, this is a signifier for the children and should be
prominent.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;That takes care of the dialog. What about the icon overlay and
menu that bring us there? Those two locations have less space for our pictogram, so we need
something even more abstract. In this case, we can zoom in on the hands
reaching out to each other and add a bit of detail.
Again, this image could be given a splash of colour and
stylized to match the Nabi's overall design.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;nabihandover.svg&quot; width=&quot;50%&quot;/&gt;
&lt;figcaption&gt;Pictogram of a child handing a tablet to an adult. This would be used where there isn't enough space for the two figures above.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;With the relevant design elements out of the way, we can turn to the
microcopy. Good microcopy enhances a design but shouldn't exist to make up for a bad one.
Our target users are 3-5 year-olds who likely can't read well, if at all, so any text is
meant for an adult. Starting with the icon overlay, there are
several things we can do to make it easier for a child to
understand its meaning.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;askmomicon.jpg&quot; width=&quot;50%&quot;/&gt;
&lt;figcaption&gt;The Mommy Mode app icon overlay. Note the ID card and lock badge in the top right corner.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;To a child who can't read, the existing icon overlay offers few hints
as to its meaning. Children understand locks, but the one in the top right corner is too small to appear important.
We should draw attention to it by making the lock larger.
We'll also use our hands
and tablet design element in place of the Mommy Mode badge in the
upper-right corner, as well as in the top menu. In the dialog
we'll replace the &quot;mom&quot; with our child and adult
pictogram figures.&lt;/p&gt;
&lt;p&gt;With these design changes, we can likely get away with less microcopy in the icon overlay. For the
top menu and dialog copy, a rename of &quot;Mommy Mode&quot; is required.
&quot;Grownup Mode&quot;, &quot;Big Person Mode&quot;, &quot;Adult Mode&quot; … any of these would
fit, although that last one might raise a few eyebrows.
When I consulted my 7-year-old, she suggested &quot;Guardian Mode&quot; as it could also apply to older siblings and adolescent babysitters. I thought that was a great idea, so we'll use it.
We'll stick with the same instructions, though. Here, then, are alternatives to the existing
microcopy and design:&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;guardianmodemontage.png&quot;/&gt;
&lt;figcaption&gt;These are mockups of the (L-R) new Guardian Mode app icon overlay, slide-down top menu, and dialog. Test them with your children! Note that I've also repositioned the links on the dialog, though their functions aren't obvious.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Naturally, we'd test whether children and adults can make sense of these mockups. Still, we can already see how a small change
in microcopy and design can lead to a more inclusive and usable product.&lt;/p&gt;
&lt;p&gt;You're probably curious how the Nabi team solved the Mommy Mode challenge,
making its design more representative of its users. Easy: a toggle under Mommy
Mode switches it to Daddy Mode. Would you say this is
an appropriate solution?&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;daddymode.jpg&quot; width=&quot;50%&quot;/&gt;
&lt;figcaption&gt;The Nabi team solved the Mommy Mode problem with … Daddy Mode. Meet your &quot;Daddy&quot;. (He does sort of look like me, though.)&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;It's worth noting that, had the Nabi team started with a more
inclusive design, they wouldn't have needed to develop
and support both the Mommy and Daddy modes. Developers
had to add this option to the Nabi settings and make the labels
and graphics change depending on the mode.
Software development is expensive, and these costs needn't have been incurred.&lt;/p&gt;
&lt;p&gt;How can you avoid the oversight the Nabi team made with their
product? If you want to ensure that your products are inclusive and welcoming,
these three objectives will help:&lt;/p&gt;
&lt;figure&gt;&lt;ol&gt;
&lt;li&gt;Consider your users, language, and tone during the ideation stage of your product, with
personas and team members representative of your customers. Aim for diversity of experience, background, and perspective as well.
Maintain this focus on underrepresented users throughout the product development cycle.&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;../confirmation-bias-part-1/&quot;&gt;Keep in mind any biases you and your team may need to overcome&lt;/a&gt; while designing and developing your product. Diversity in your team and your personas will help overcome bias, but &lt;a href=&quot;../confirmation-bias-part-3/&quot;&gt;more should be done&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Workshop your design with people from other departments in your company, and test
how well your users will understand and act on it.&lt;/li&gt;
&lt;/ol&gt;&lt;/figure&gt;&lt;p&gt;These goals aren't easy, but they &lt;em&gt;are&lt;/em&gt;
worthwhile. Start slow and build momentum
within your team. Approach building inclusive and welcoming products as a challenge
and an opportunity to learn more about your
users.
You'll find new ways to provide value to your customers, and make your product easier to use — for everyone.
The result will be
something welcoming to all instead of repellent to some.&lt;/p&gt;
&lt;div class=&quot;admonition admonition-tip&quot;&gt;&lt;p&gt;If you want to learn more
about how appropriate design and microcopy can lead to a better experience, be sure to read &lt;a href=&quot;https://open.nytimes.com/to-design-better-products-consider-the-language-f17b923f8bae&quot;&gt;To
Design Better Products, Write Better UX
Copy&lt;/a&gt;
by Nina K. Feinberg of the New York Times. For a more in-depth read with a detailed guide to creating inclusive products, try the book &lt;a href=&quot;https://g.co/kgs/FwqBC2&quot;&gt;Building for Everyone by Annie
Jean-Baptiste&lt;/a&gt;, Google's Head of Product Inclusion.&lt;/p&gt;&lt;/div&gt;</content>
  </entry>
  <entry xml:base="https://ianstevens.ca/articles/lurking-joanne-mcneil-review/">
    <title type="text">Joanne McNeil's Lurking serves as a warning for makers of digital products</title>
    <id>urn:uuid:6e2246ba-0a1d-3e40-9bfd-38bc41886f01</id>
    <updated>2020-06-19T00:00:00Z</updated>
    <link href="https://ianstevens.ca/articles/lurking-joanne-mcneil-review/" />
    <author>
      <name>Ian Stevens</name>
    </author>
    <content type="html">&lt;p&gt;&quot;Induce anguish&quot; and &quot;increase anxiety&quot; are
usually not on a list of outcomes for any digital product. Even so,
many social media products of the past and present have done just that.
Joanne McNeil's &lt;em&gt;Lurking&lt;/em&gt; follows the digitally connected user of the
internet and pre-internet. It's a great book for anyone
designing or building social apps and services. Those who forget the
social internet's past errors are doomed to repeat them.&lt;/p&gt;
&lt;p&gt;I knew I was in for a wild ride when I saw Warren Ellis' — of
&lt;a href=&quot;https://en.wikipedia.org/wiki/Transmetropolitan&quot;&gt;Transmetropolitan&lt;/a&gt; and &lt;a href=&quot;https://en.wikipedia.org/wiki/Castlevania_(TV_series)&quot;&gt;Castlevania&lt;/a&gt; fame —
quote-review on the inside front sleeve: &lt;em&gt;&quot;The first history of the social
internet I've seen that has its authentic life and breadth.&quot;&lt;/em&gt; It's a
valid claim.&lt;/p&gt;
&lt;p&gt;If you're a product manager, designer, or developer,
that history is worth your while. McNeil includes a number of examples of product and feature fails which caused
anxiety and anguish among users — especially marginalized ones.
Many companies strive to create a &quot;disruptive&quot;
product, one with the potential for a huge impact. In doing so, they
often unknowingly adopt or amplify toxic elements of our society — racism, sexism, etc.
Lives can &lt;em&gt;also&lt;/em&gt; be disrupted.&lt;/p&gt;
&lt;aside&gt;We keep making the same
hurtful mistakes in digital products.&lt;/aside&gt;&lt;p&gt;McNeil frames this disruption and the history of the internet user through the
lenses of searching, safety, privacy, identity, community, anonymity, and
visibility. The internet's capacity to make good on those
functionalities ebbs and flows through time and across various identities.
&lt;em&gt;Lurking&lt;/em&gt; includes examples of how marginalized communities have been both
uplifted and failed by the social internet and its products and services.&lt;/p&gt;
&lt;p&gt;This book grabbed me, not just on an empathetic or techno-anthropological level but
also because of how it mirrors my history as an internet denizen, first connecting
with friends and like-minded people through
&lt;a href=&quot;https://en.wikipedia.org/wiki/Bulletin_board_system&quot;&gt;BBSes&lt;/a&gt;, then facilitating
those connections as a BBS &quot;sysop&quot;, joining the ranks of the internet on
&lt;a href=&quot;https://en.wikipedia.org/wiki/Usenet&quot;&gt;Usenet&lt;/a&gt;, creating a homepage shortly after the advent of HTTP and graphical
browsers, then coming full circle and connecting with friends and strangers on
evolving social media platforms, starting with blogs. It's been quite a journey.&lt;/p&gt;
&lt;p&gt;In &lt;em&gt;Lurking&lt;/em&gt;, McNeil describes how
online society has shifted over time. If you've been online since birth,
the book offers a comprehensive look at online life before your time. If you've journeyed
online since there's been an &quot;online&quot;, this book pieces it all together
and offers glimpses you otherwise might have missed.&lt;/p&gt;
&lt;p&gt;Take McNeil's
inclusion of Echo, something like a for-fee Reddit for the New York arts
community. You've probably never heard of it — even if you lived in NYC in the
early '90s. Likewise, if you haven't been harassed on social media,
as many women have — especially racialized ones — you might not know about tech companies' history of largely
ignoring such threats. &lt;em&gt;Lurking&lt;/em&gt; tells these stories and others from the
30+ year period of the social internet.&lt;/p&gt;
&lt;p&gt;Many of these stories aren't the sorts of experiences you want for your users.
As a product manager or designer, you don't want your product
to amplify racism, sexism, or anxiety — right? &lt;em&gt;Lurking&lt;/em&gt;
is a backgrounder in how previous social products have failed certain
segments of their users. We keep making the same
hurtful mistakes in digital products.&lt;/p&gt;
&lt;p&gt;One attitude which hasn't changed, unfortunately, is the pass white
supremacy often gets online. Like &lt;a href=&quot;https://about.fb.com/news/2019/10/mark-zuckerberg-stands-for-voice-and-free-expression/&quot;&gt;Facebook decades later&lt;/a&gt;, AOL — that walled
garden on the early internet — defended its inaction as supporting free speech, despite limiting speech elsewhere. In McNeil's words:&lt;/p&gt;
&lt;blockquote&gt;&lt;p&gt;In the nineties, AOL even hosted a page for the Texas branch of the
Ku Klux Klan. The online provider prohibited racial slurs in search
and user profiles and yet this was a First Amendment issue, AOL
insisted.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Time and again, companies miss the same opportunities. Often, in the name
of &quot;authenticity&quot;, they reject genuinely authentic behaviour.
Take Friendster, an early social media site. People could
connect individually, but not as a group around a shared
interest. To compensate, users started creating fake accounts to engage
with like-minded users on an authentic level. These
accounts acted as stand-ins for celebrities, movies, or even
concepts like &lt;em&gt;war&lt;/em&gt;. Rather than embrace it, Friendster
stamped it out:&lt;/p&gt;
&lt;blockquote&gt;&lt;p&gt;But Friendster developers were unbudging about its purpose. Rather
than capitalizing on emerging user behavior, they banked on their
product as a sorta-kinda dating space that mapped how various people
were connected to one another. Fakesters were an innocuous
presence, but the company believed they contaminated the data the
platform collected and provided as a hook.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Facebook, in its drive for authenticity with real names, &lt;em&gt;also&lt;/em&gt;
kicked off accounts that employees deemed fake. Unsurprisingly, some people
&lt;a href=&quot;https://en.wikipedia.org/wiki/Facebook_real-name_policy_controversy&quot;&gt;had their accounts suspended despite using their real
names&lt;/a&gt;.
In her book, McNeil recounts &lt;a href=&quot;https://www.bbc.com/news/blogs-trending-31699618&quot;&gt;the story of Lance
Browneyes&lt;/a&gt;, an
Oglala Lakota artist kicked off Facebook for using a &quot;fake&quot;
name. To get his account reinstated he had to submit proof
of ID, only to have a Facebook admin inexplicably change his name to
&quot;Lance Brown&quot;.&lt;/p&gt;
&lt;p&gt;That real name policy often alienated trans users, Indigenous
people, Black users, and anyone with a name that wasn't recognizably white.
The sad irony is that anyone
could — and did — create a fake account with an &quot;accepted&quot; name, only
to harass LGBTQ+ and racialized people, infiltrate their groups, or
flag them for deletion. Meanwhile, some members of targeted groups couldn't even use
their real names without jumping through special hoops, like
submitting passport photos or other official documents. Facebook later
changed its name policy — in 2015, 11 years after its founding.&lt;/p&gt;
&lt;aside&gt;It's the stories in &lt;em&gt;Lurking&lt;/em&gt; which truly illustrate the emotional damage
poorly-designed features can cause.&lt;/aside&gt;&lt;p&gt;It's the stories in &lt;em&gt;Lurking&lt;/em&gt; which truly illustrate the emotional damage
poorly-designed features can cause. Take the sidebar of supposedly
most-frequently contacted friends that Facebook added in 2010. There are
stories of girls who started seeing profiles of boys — past boyfriends,
hookups, crushes, and acquaintances — in that space, despite not
being connected to and not viewing those profiles. These, the girls
deduced through experiments, were their lurkers. As you can imagine,
the presence of this feature induced both anxiety and curiosity.&lt;/p&gt;
&lt;p&gt;Another example of unintended consequences is Facebook's &quot;People You
May Know&quot; feature. One woman noticed her father, estranged for
twenty-seven years, in that box. Facebook had inadvertently forced this man back
into her life, along with any feelings it dredged up.&lt;/p&gt;
&lt;p&gt;As you read &lt;em&gt;Lurking&lt;/em&gt;, consider your offering
through the lenses of searching, safety, privacy, identity, community,
anonymity, and visibility, as McNeil lays out. Where does your product
fit in? Are you supporting your users' outcomes in each category? In
which ways might people from marginalized groups disagree with your
assessment? How might your products erode outcomes or lead to anguish
or anxiety in those categories?&lt;/p&gt;
&lt;p&gt;These errors or oversights can be limited by building teams which better
represent a product's users. Drawing people in from a wider community
increases diversity of experience and, in turn, diversity of
thought. To reduce groupthink, teams should include members of
differing genders, ethnicities, nationalities, physical abilities, racial identities, and
socioeconomic classes. Recognizing possibly damaging unintended
outcomes won't be easy, but it will be worthwhile knowing your product
stands a far smaller chance of ruining someone's day — or worse.&lt;/p&gt;
</content>
  </entry>
  <entry xml:base="https://ianstevens.ca/articles/confirmation-bias-part-3/">
    <title type="text">Only you can stop confirmation bias</title>
    <id>urn:uuid:3cf3752a-c988-317f-881d-65f095f82005</id>
    <updated>2020-05-11T00:00:00Z</updated>
    <link href="https://ianstevens.ca/articles/confirmation-bias-part-3/" />
    <author>
      <name>Ian Stevens</name>
    </author>
    <content type="html">&lt;div class=&quot;admonition admonition-info&quot;&gt;&lt;p&gt;A while back I volunteered to contribute to a book on the behaviours and
history of political, legal, and socio-economic systems. It was to be a primer
for people creating products with the potential to disrupt those systems. My contribution was a chapter on
confirmation bias, detailing its effects, its workings, and how it can be
overcome. Though the book was never published, my research had me reconsidering my
behaviour. Always careful with my words, I started speaking even more purposefully,
not wanting to pass bias on to others. The experience had such an impact that I
couldn't let my chapter sit unread, and split it into three
articles. &lt;a href=&quot;/articles/confirmation-bias-part-1/&quot;&gt;The first speaks to the pernicious influence of
confirmation bias&lt;/a&gt;, while &lt;a href=&quot;/articles/confirmation-bias-part-2/&quot;&gt;the second describes how
it grows and spreads&lt;/a&gt;. This is the last in the series, explaining what we can do to fight confirmation bias.&lt;/p&gt;&lt;/div&gt;&lt;p&gt;If you've been following this series on confirmation bias or already know its
mechanisms, you may be feeling a little wary of your internal model of
the world. I know I was during my research. It's alarming to know that we can
gather false truths, nurture them through selective testing and
interpretation, and become certain they are true, all while thinking we're
being perfectly reasonable. All is not lost, however. There are ways we can
fight our bias and lessen its impact — you may already be using some of them.
Groups are also prone to biased decision-making, and there are techniques for
lessening this error as well. As for debiasing others, that can be a touch more
complicated.&lt;/p&gt;
&lt;aside&gt;Though we may never be free of our biases, we can try to make
sure our decisions and actions remain untainted.&lt;/aside&gt;&lt;p&gt;Entrenched in our brains, confirmation bias seems difficult to combat.
Though we may never be free of our biases, we can try to make sure our
decisions and actions remain untainted. As you probably realize, this isn't
easy. Working against stereotypes — a product of confirmation bias — for instance, takes more time and uses
different parts of our brains than our natural thought processes. &lt;!--[NELSON2015]--&gt;
It's a battle worth fighting. Bias affects
decisions both big and small — like hiring a new employee,
crossing the street to avoid someone, or even
the words we use to describe others. Actions based on bias can
have long-term consequences, such as an educator forming opinions
of their students based on where they live, as mentioned in &lt;a href=&quot;/articles/confirmation-bias-part-1/&quot;&gt;part
one of this series&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The good news is that at times we unknowingly reduce bias' impact on our decision-making.
If we feel we might suffer a loss of status
for a biased decision, our desire for approval can help lessen bias. &lt;!--[KLAYMAN1995]--&gt;
On the other hand, there's little evidence that incentives
— such as a reward for considering every course of action — improve
the reliability of our decision-making. We can't simply ask and expect
ourselves and others to &quot;try harder.&quot;
Doing so assumes we already know effective strategies
and somehow aren't using them properly. &lt;!--[LARRICK2004]--&gt; &lt;!--[RABIN1999]--&gt;
In many cases, incentives can produce worse outcomes. A financial advisor
with fees tied to an increase in portfolio value, for
example, might be biased towards riskier trades.&lt;/p&gt;
&lt;p&gt;We also unknowingly lessen our bias in
decisions where our accountability is at stake,
provided we have the appropriate decision-making strategies. Even when our
experience is limited to a related field, accountability can counter bias better
than punishments and incentives.
We have a strong social need for consistency.
When making decisions for which we'll be held accountable,
we're willing to put in the effort and more effectively use information.
Generally, we want to avoid embarrassment
and maintain pride. This means we're more likely to preemptively self-criticise and
foresee flaws. That sensitivity to accountability can go too far, however.
We sometimes feel a need to &quot;give people what they want&quot;, particularly if we're
undecided — like fudging a report to match expectations. &lt;!--[LARRICK2004]--&gt;&lt;/p&gt;
&lt;figure&gt;
&lt;table class=&quot;table table-sm table-borderless table-centerall table-insideborders&quot;&gt;
&lt;colgroup span=&quot;2&quot;&gt;
&lt;colgroup span=&quot;2&quot; width=&quot;30%&quot;&gt;
&lt;thead&gt;
&lt;tr&gt;
    &lt;td colspan=&quot;2&quot; rowspan=&quot;2&quot;&gt;&lt;/td&gt;
    &lt;th colspan=&quot;2&quot; scope=&quot;colgroup&quot;&gt;Is that a predator's face I think I see?&lt;/th&gt;
&lt;/tr&gt;
&lt;tr&gt;
    &lt;th scope=&quot;col&quot;&gt;Yes&lt;/th&gt;
    &lt;th scope=&quot;col&quot;&gt;No&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
    &lt;th rowspan=&quot;2&quot; scope=&quot;rowgroup&quot;&gt;That's actually a predator's face&lt;/th&gt;
    &lt;th scope=&quot;row&quot;&gt;Yes&lt;/th&gt;
    &lt;td&gt;True positive&lt;/td&gt;
    &lt;td&gt;False negative may result in death&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
    &lt;th scope=&quot;row&quot;&gt;No&lt;/th&gt;
    &lt;td&gt;False positive results in fleeing unnecessarily&lt;/td&gt;
    &lt;td&gt;True negative&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 1&lt;/strong&gt;
We're great at recognising faces, to the point where we see them
where there are none — a false positive. In this way, our brain errs on the
side of caution. If we weren't good at facial recognition, we
might miss faces when they're present — a false negative. If that
face belonged to a predator, that mistake could come with a high cost: death.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Along with accountability, context is also key to unknowingly making less-biased decisions.
With some decisions, such as those related to survival,
false negative errors have a higher cost (see Fig. 1). Others may have
expensive false positive errors. When we encounter these high-cost
conditions, we usually err on the side of caution. It also helps to have
experience in the area under study, especially if we encounter a problem we've solved before.
Yet confirmation bias often reappears if we try to map
that experience to a different domain. &lt;!--[KLAYMAN1995]--&gt; One example might be a
successful day trader confidently wading into socio-economic
theory with only selective knowledge of that field.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;https://d2f99xq7vri1nk.cloudfront.net/DinoSequentialSmaller.gif&quot;/&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 2&lt;/strong&gt; Interpreting data through summary statistics alone can be misleading. This animation shows a dozen different datasets with identical summary statistics. Source: &lt;a href=&quot;https://www.autodeskresearch.com/publications/samestat&quot;&gt;Autodesk Research&lt;/a&gt;&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;It's reassuring that we unknowingly reduce our bias when making
certain decisions. What can be done to improve that intuition?
In &lt;a href=&quot;/articles/confirmation-bias-part-2/&quot;&gt;part two of this series&lt;/a&gt;
we learned that confirmation bias can often develop if we fail to properly apply formal
reasoning. We might have some basic logic, economics, or statistics knowledge
— such as sampling — but may not know when or even how to use it.
For example, people often misinterpret or misuse summary statistics like mean, median, or
standard deviation (see Fig. 2), and could benefit from refresher training.
A lesson on how causation and correlation are frequently conflated
could also help.
There &lt;em&gt;is&lt;/em&gt; evidence that
short training sessions in a domain with which we're comfortable —
sports statistics, for instance — can help reduce bias in other areas. That assist, however,
often diminishes after only two weeks and suffers when learning complex rules,
like &lt;a href=&quot;https://en.wikipedia.org/wiki/Bayes%27_theorem&quot;&gt;Bayes' theorem&lt;/a&gt;. &lt;!--[LARRICK2004]--&gt;&lt;/p&gt;
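The point about summary statistics can be made concrete with a few lines of Python (the datasets here are invented for illustration): two very differently shaped datasets can share the same mean and standard deviation.

```python
import math
import statistics

# Two invented datasets engineered to share the same mean and
# population standard deviation, despite very different shapes.
clusters = [12, 12, 28, 28]                      # two tight clusters
outliers = [20, 20, 20 - 8 * math.sqrt(2), 20 + 8 * math.sqrt(2)]

for data in (clusters, outliers):
    print(statistics.mean(data), statistics.pstdev(data))
# Both datasets report a mean of 20 and a standard deviation of 8
# (up to floating-point rounding), just as in the Datasaurus Dozen.
```

Anyone leaning on the mean and standard deviation alone would judge these datasets identical; only looking at the shape of the data reveals the difference.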
&lt;p&gt;In &lt;a href=&quot;/articles/confirmation-bias-part-1/&quot;&gt;part one of this series&lt;/a&gt; we
learned of a 2013 &quot;study of studies&quot; on gender and risk which
showed that even scholars and
experts can be victims of bias. &lt;!--[NELSON2015]--&gt; There seems to be no guarantee
that intuition can be improved with more education. &lt;!--[KLAYMAN1995]--&gt; Outside
motivation — punishment, accountability, etc. — isn't always helpful,
and can sometimes have the opposite effect, as when accountability is
taken too far and people simply deliver what's expected.
We can't debias
ourselves by ourselves, as we're likely biased against even the existence of our biases.
How can we hope to lessen their impact? Formal approaches exist but are
more geared towards reducing bias in group decisions. As it turns out,
simply &lt;em&gt;knowing&lt;/em&gt; that confirmation bias exists goes a long way.
A basic understanding of how unreliable
human reasoning can be, with no instructions other than &quot;beware&quot;,
can help counter biases. &lt;!--[LARRICK2004]--&gt; The best strategy to exceed this bare minimum, however,
may be to consider the opposite.&lt;/p&gt;
&lt;aside&gt;There seems to be no guarantee
that intuition can be improved with more education.&lt;/aside&gt;&lt;p&gt;If you've ever argued a position in school — in English or a debate class, perhaps
— you may have prepared by researching opposing arguments. Considering the
opposite is also a decent strategy for fighting bias in our beliefs. This
might be as simple as asking ourselves how we could be wrong on a position, why,
and for what reasons. In doing so, we widen our search and direct our attention
to contrary evidence. This approach can help reduce overconfidence, a symptom
of confirmation bias. It's also been shown to lessen bias when seeking and
interpreting new information. &lt;!--[LARRICK2004]--&gt; We also reason better with two theories
than when evaluating a single hypothesis.
What's important is that we seriously examine a &lt;em&gt;specific&lt;/em&gt;
opposing belief. &lt;!--[KLAYMAN1995]--&gt;&lt;/p&gt;
&lt;p&gt;Naturally, &lt;em&gt;seriously&lt;/em&gt; examining an alternate belief is the key. We might not give an
opposing belief its due, especially if we feel ours is already viable.
&lt;!--[KLAYMAN1995]--&gt; Although paying attention to contrary evidence can help
counter bias, requiring too many opposing views can backfire. Failing to
come up with a required number of alternate theories might make us
consider weaker ones, making us more
confident in our own viewpoint. &lt;!--[LARRICK2004]--&gt; Considering more than one theory at
once can also divide our attention. We're then less likely to give other theories their due. Instead, think about alternates
separately and independently. &lt;!--[KLAYMAN1995]--&gt;&lt;/p&gt;
&lt;p&gt;We might be able to hold our confirmation bias at bay so long as we're aware
of it, and give serious thought to viewpoints opposed to our own. What about
people we work with, or our friends and family?&lt;/p&gt;
&lt;p&gt;Unfortunately, when it comes to other individuals, we may just have to grin and
bear it. In the absence of bias, a rational person could correct their belief with more
information. With bias, more information is not better. Trying to convince someone affected by confirmation bias to change their belief may
have the opposite effect and &lt;em&gt;increase&lt;/em&gt; their leanings. This is known as the &lt;em&gt;backfire effect&lt;/em&gt; or &lt;em&gt;belief perseverance&lt;/em&gt;.
Giving the same
ambiguous information to people with differing beliefs may move their beliefs
further apart. &lt;!--[RABIN1999]--&gt; Depending on their viewpoints, two people may see the same evidence and
interpret it differently, each judging it as more consistent with their own belief.
&lt;!--[NICKERSON1998]--&gt;&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;../confirmation-bias-part-2/signals.png&quot;/&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 3&lt;/strong&gt; Every signal we receive influences our belief. This chart is a probable timeline of the skew of someone's specific belief.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Considering belief formation as a series of signals, as in &lt;a href=&quot;/articles/confirmation-bias-part-2/&quot;&gt;part two of this series&lt;/a&gt;, can also show how difficult
it may be to debias someone else. The effect of each signal depends on those
which came before it, including any prior beliefs (see Fig. 3). To debias someone, we may
need to know their initial belief on a topic as well as the order of signals
which followed. &lt;!--[RABIN1999]--&gt;
Unraveling how they nurtured this false knowledge takes care, understanding, and respect.
It also depends on open and reliable narrators with good memories and an interest in reaching the truth.
With severe bias, our efforts to reason away another's false belief could be futile. This is perfectly illustrated by The Doobie Brothers in &lt;a href=&quot;https://youtu.be/Zjqcf5F0YRg&quot;&gt;&quot;What a Fool Believes&quot;&lt;/a&gt;:&lt;/p&gt;
&lt;blockquote&gt;&lt;p&gt;But what a fool believes he sees&lt;br/&gt;
No wise man has the power to reason away&lt;br/&gt;
What seems to be&lt;br/&gt;
Is always better than nothing&lt;br/&gt;
Than nothing at all&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://youtu.be/Zjqcf5F0YRg&quot;&gt;The Doobie Brothers, &quot;What a Fool Believes&quot;&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Our friends and family with severe bias may be lost to it, but our workplace
can still be saved. Thankfully, many decisions which matter are made by groups,
which are more readily debiased than individuals. Many
tried and tested strategies for lessening bias in groups exist, usually involving a framework or
tool to help make sound decisions. Groups can make use of decision aids,
information displays, statistical models, and other formal decision analysis
techniques. Complex problems can be split into smaller, simpler ones — such as listing the pros and cons of a position — and
assigned to smaller groups. These technical strategies are simply out of reach
for most individuals. Whereas individuals can introduce bias at every step of
the decision-making process, groups can track their progress and use the
results as feedback.&lt;/p&gt;
&lt;p&gt;When using strategies or tools to make unbiased decisions at work,
adoption can be difficult. Processes like these are usually imposed company-wide from the top down, and as such are often rejected or begrudgingly implemented, leading to failure.
A bottom-up approach tends to have better results. When those making the decisions choose
a strategy appropriate to their group, their sense of ownership helps them
stick with it and approach it more honestly. If it works for them, the group can evangelise the strategy and inspire adoption. Beware, however. As with ourselves,
groups can also underestimate their bias and be overconfident in their
decision-making. They, like us, may fail to recognize a need for help. &lt;!--[LARRICK2004]--&gt;&lt;/p&gt;
&lt;aside&gt;Fighting confirmation bias in ourselves and in groups requires careful
and consistent attention to how we make decisions.&lt;/aside&gt;&lt;p&gt;Groups are also prone to &lt;em&gt;group-think&lt;/em&gt;. Their members may be influenced by
others with either more seniority, or who are more aggressively persuasive.
Because of this, groups may anchor on the judgments of a few people. Having group
members think about their preferences and estimates before a meeting can help
lessen this risk. Strategies and tools such as
&lt;a href=&quot;https://en.wikipedia.org/wiki/Multiple-criteria_decision_analysis&quot;&gt;multi-attribute analysis&lt;/a&gt;, or
&lt;a href=&quot;https://en.wikipedia.org/wiki/Decision_support_system&quot;&gt;decision support systems&lt;/a&gt;
prompt groups to think more deeply than they otherwise would, and
can also check for errors in the
decision-making process. It's also a good idea to maintain complementary
expertise within the group, and be aware of blind spots due to shared errors. &lt;!--[LARRICK2004]--&gt;
A supportive environment in which everyone feels free to correct
beliefs or adjust decisions is also key. &lt;!--[KLAYMAN1995]--&gt;&lt;/p&gt;
&lt;p&gt;Group-think due to blind spots can be lessened through diversity of experience
within the group. While training can help preserve that diversity of
perspectives, groups can do better by increasing the sample size of experience.
&lt;!--[LARRICK2004]--&gt; Drawing people in from a wider community increases diversity
of experience and, in turn, increases diversity of thought. To reduce the
risk of locally-held beliefs, groups should include members of differing
genders, ethnicities, nationalities, racial identities, and socioeconomic classes.&lt;/p&gt;
&lt;p&gt;Fighting confirmation bias in ourselves and in groups requires careful
and consistent attention to how we make decisions. The
solutions are there:&lt;/p&gt;
&lt;figure&gt;
&lt;ul&gt;
&lt;li&gt;At a bare minimum, know that bias exists and is widespread.&lt;/li&gt;
&lt;li&gt;Consider opposing arguments or alternate theories to those which drive your actions.&lt;/li&gt;
&lt;li&gt;Use tools and decision-making frameworks when you collaborate at work.&lt;/li&gt;
&lt;li&gt;Involve people with different backgrounds and experiences to rein in group-think.&lt;/li&gt;
&lt;/ul&gt;
&lt;/figure&gt;&lt;p&gt;My research into confirmation bias had a profound impact on me. I was aware of
it, but I had no idea how often our brains fail to be rational. It made me
wonder which biases I had adopted, from where, and how much they had affected
my actions. What bothered me the most was the possibility of spreading false belief to others.
I vowed that bias would stop with me. Anything I was unsure of
remained unsaid or came with a disclaimer. I tried to limit my use
of words like &quot;every&quot; or &quot;none&quot; when I really meant &quot;most&quot; or &quot;few&quot;. More
importantly, I better understood how others could become so firmly attached to
false belief or prejudice in spite of themselves, and how the slightest action
borne of that bias could negatively affect others in a big way. I
started to imagine a world free from the effects of bias, and it was
glorious. What will you do to help make that world a reality?&lt;/p&gt;
&lt;h3&gt;References&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a id=&quot;KLAYMAN1995&quot;&gt;[KLAYMAN1995]&lt;/a&gt; Klayman, J. (1995). Varieties of confirmation bias. In J. Busemeyer, R. Hastie, &amp;amp; D. L. Medin (Eds.), Decision making from a cognitive perspective. New York: Academic Press (Psychology of Learning and Motivation, vol. 32), pp. 365-418.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;LARRICK2004&quot;&gt;[LARRICK2004]&lt;/a&gt; Larrick, R. P. (2004). Debiasing, in Blackwell Handbook of Judgment and Decision Making (eds D. J. Koehler and N. Harvey), Blackwell Publishing Ltd, Malden, MA, USA.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;NELSON2015&quot;&gt;[NELSON2015]&lt;/a&gt; Nelson, J. A. (2015). Are women really more risk-averse than men? A re-analysis of the literature using expanded methods. Journal of Economic Surveys, 29: 566-585.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;NICKERSON1998&quot;&gt;[NICKERSON1998]&lt;/a&gt; Nickerson, R. S. (1998). Confirmation bias: a ubiquitous phenomenon in many guises. Review of General Psychology, Vol. 2, No. 2, pp. 175-220.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;RABIN1999&quot;&gt;[RABIN1999]&lt;/a&gt; Rabin, Matthew and Schrag, Joel L. (1999). First Impressions Matter: A Model of Confirmatory Bias, The Quarterly Journal of Economics, 114, issue 1, p. 37-82.&lt;/li&gt;
&lt;/ul&gt;
</content>
  </entry>
  <entry xml:base="https://ianstevens.ca/articles/confirmation-bias-part-2/">
    <title type="text">How we're steered towards false belief</title>
    <id>urn:uuid:80dcf6f1-4636-3cfe-94de-7f5fdce4329a</id>
    <updated>2019-10-28T00:00:00Z</updated>
    <link href="https://ianstevens.ca/articles/confirmation-bias-part-2/" />
    <author>
      <name></name>
    </author>
    <content type="html">&lt;div class=&quot;admonition admonition-info&quot;&gt;&lt;p&gt;A while back I volunteered to contribute to a book on the behaviours and
history of political, legal, and socio-economic systems. It was to be a primer
for people creating products with the potential to disrupt those systems. My contribution was a chapter on
confirmation bias, detailing its effects, its workings, and how it can be
overcome. Though the book was never published, my research had me reconsidering my
behaviour. Always careful with my words, I started speaking even more purposefully,
not wanting to pass bias on to others. The experience had such an impact that I
couldn't let my chapter sit unread, and split it into three
articles. &lt;a href=&quot;/articles/confirmation-bias-part-1/&quot;&gt;The first speaks to the pernicious influence of
confirmation bias&lt;/a&gt;. This is the second of the three, describing how
it grows and spreads. Finally, &lt;a href=&quot;/articles/confirmation-bias-part-3/&quot;&gt;the last article explains what we can do to fight confirmation bias&lt;/a&gt;.&lt;/p&gt;&lt;/div&gt;&lt;p&gt;Confirmation bias is one of a number of cognitive biases which
affect how we reason. It tricks us into accepting untruths and
nurtures them until we're certain they're true. As a result, we're
led to hold false beliefs with greater confidence than evidence
can justify. Confirmation bias doesn't happen by
itself. It needs agreeable conditions to grow, flourish, and persist.
Our tendency to selectively gather, interpret, and recall information provides
fertile ground for confirmation bias to take hold (See Fig. 1).&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;../confirmation-bias-part-1/cbpillars.png&quot;/&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 1&lt;/strong&gt; Our tendencies support the formation of confirmation bias.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;If we don't actively choose our information or how we interpret it, how does bias start?
These tendencies operate while leaving our learning and reasoning process
intact. We don't see our bias. We feel we're being rational, and we often are,
but with skewed information. Every stage of belief development is affected,
from initial hypothesis generation, to searching for, testing, interpreting,
and recalling evidence. &lt;!--[KLAYMAN1995]--&gt;&lt;/p&gt;
&lt;aside&gt;Confirmation bias starts to take hold with initial hypothesis generation.&lt;/aside&gt;&lt;p&gt;Our selective gathering of evidence often starts when we form an initial belief from weak evidence. Have you ever firmly believed something to be true only to find out – years later
– that it had no basis in reality? Maybe you forgot how you had come to
believe in something which, under scrutiny, you later realised was completely false.
How could you have been so wrong? Chances are you took something you heard or
read at face value and carried it for years. This initial hypothesis generation is where confirmation
bias starts to take hold.&lt;/p&gt;
&lt;p&gt;Governed by something known as &lt;em&gt;anchoring&lt;/em&gt;, that
initial belief is powerful and can take root in our brains. Information
acquired early carries more weight and is more readily recalled. First impressions matter. Belief
starts to collect around those first pieces of information. With belief
backed by initial weak evidence, we may have problems correctly interpreting
better — possibly contradictory — information received later on. &lt;!--[RABIN1999]--&gt;&lt;/p&gt;
&lt;p&gt;With that kernel of belief in mind, we'll gather evidence to support
it. This isn't to say we actively seek out that evidence, although we
sometimes do. Rather, our brains tend to more readily take in information which
supports our belief. Have you ever bought or considered a car, an item of
clothing, or other object, and then suddenly started noticing the same make,
model, or style much more often? This tendency is called the &lt;em&gt;frequency
illusion&lt;/em&gt;, and it plays a part when you're subconsciously gathering evidence to
support a belief. We see what we seek. &lt;!--[NICKERSON1998]--&gt;&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;https://chainsawsuit.com/wp-content/uploads/2014/09/20140916-research.png&quot;/&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 2&lt;/strong&gt; As in Wason's experiment, we tend to pick evidence which confirms our bias. Source: &lt;a href=&quot;http://chainsawsuit.com/comic/2014/09/16/on-research/&quot;&gt;Chainsawsuit&lt;/a&gt; by &lt;a href=&quot;https://twitter.com/KrisStraub&quot;&gt;@KrisStraub&lt;/a&gt;&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Frequency illusion is only one of many tendencies which focus our attention on confirming evidence.
Many of them are holdovers from when our evolutionary ancestors had to flee from danger. It didn't matter much if that danger wasn't a danger at all.
Humans evolved with great pattern recognition skills — skills which also make us see patterns
where there are none (i.e. false positives). Thanks to something called &lt;em&gt;illusory correlation&lt;/em&gt;, we
tend to see connections between unrelated events. People often
stereotype certain groups — cultural, political, or otherwise — as being bad people, a drain on society, or
simply &quot;idiots&quot;. They probably started with an unfavourable belief about that group
and likely later noticed members of that same group exhibit bad behaviour.
They saw what they wanted to see, and failed to
register the times they saw members of that
group not demonstrating similar behaviour.&lt;/p&gt;
&lt;p&gt;Gathering evidence isn't always enough. We usually test our beliefs to see if
they hold. These tests are often flawed, however. For one, we're more likely to
ask questions whose answer is
&quot;yes&quot; should our hypothesis be true. In one study on test selection,
participants were given a profile of someone described as either an extrovert
or an introvert. Subjects were then asked to interview that person and determine if
they fit the labeled personality type. Participants usually picked questions which, if answered
with &quot;yes&quot;, would strongly confirm their profile, and strongly
disconfirm it if answered with &quot;no&quot;. &lt;!--[NICKERSON1998]--&gt; For instance,
someone given a profile flagged as belonging to an extrovert might have asked &quot;Do you enjoy large
parties?&quot; but not &quot;Do you enjoy time alone?&quot;. This reinforcement of our initial belief through positive tests leads
us to be more confident in our belief, even if the information we collect has
no value. &lt;!--[KLAYMAN1995] [JONES2000]--&gt;&lt;/p&gt;
&lt;p&gt;This tendency to seek and test largely positive evidence can suggest
patterns which may not exist, limiting discovery. These tests can confirm belief but will not uncover false
negatives. &lt;!--[KLAYMAN1995]--&gt; Using Wason's 2-4-6
task from &lt;a href=&quot;/articles/confirmation-bias-part-1&quot;&gt;part 1 of this series&lt;/a&gt; as an example, subjects tested their theory by picking three
numbers which fit it, not three numbers which fit a different but also
valid theory, nor numbers which didn't fit their theory at all. For example, someone who
believed that a 2-4-6 sequence represented even numbers increasing by
two might have tested it with 8-10-12, but not 3-5-7, nor even 6-4-2.
When we rely on largely positive evidence, we fool ourselves into a false
belief (see Fig. 3).&lt;/p&gt;
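The positive-test strategy from Wason's task can be sketched in a few lines of Python (the rules are hard-coded here for illustration): every triplet chosen to fit the hypothesis also passes the hidden rule, so only a triplet that violates the hypothesis can expose the broader truth.

```python
def hidden_rule(t):
    """The experimenter's actual rule: any strictly ascending triplet."""
    a, b, c = t
    return a < b < c

def hypothesis(t):
    """The subject's guess: even numbers increasing by two."""
    a, b, c = t
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Positive tests, triplets chosen because they fit the hypothesis,
# all pass the hidden rule, so they only ever "confirm".
for t in [(2, 4, 6), (8, 10, 12), (20, 22, 24)]:
    assert hypothesis(t) and hidden_rule(t)

# A triplet that violates the hypothesis but still fits the hidden
# rule is what actually disconfirms the narrow theory.
print(hypothesis((3, 5, 7)), hidden_rule((3, 5, 7)))  # False True
```

A subject who only runs the loop above never learns their rule is wrong; the single disconfirming test at the end is the one that carries information.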
&lt;figure&gt;
&lt;img src=&quot;full-moon.png&quot;/&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 3&lt;/strong&gt; We tend to see patterns where there are none, and use tests which confirm our theories.
Here, two people in neonatal care see the same evidence, with one difference.
The one on the left has accepted a belief that full moons cause more births.
That person's tests involve checking the moon when there are more babies in the
nursery. Because of this, they miss seeing a full moon when there are fewer
babies. They also underweigh the instance with many newborns and no full moon.
The person on the right hasn't been influenced by an initial belief on lunar
cycles and births, and remains unbiased.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Test results need to be interpreted to be useful.
Confirmation bias kicks in here as well.
Evidence and tests which confirm our belief are prone to be seen as reliable and relevant, and are often
accepted at face value. By contrast, evidence which disagrees with our belief is often seen as
unreliable, unimportant, and open to scrutiny — often hypercritically,
especially if the source is believed to be subject to error. &lt;!--[RABIN1999]--&gt;
When evidence is ambiguous, vague, or open to interpretation, we unfortunately tend to give our beliefs
the benefit of the doubt. &lt;!--[KLAYMAN1995]--&gt; As an example, a teacher might
interpret a student's non-standard answer to a question as either stupid or
creative, depending on how the teacher feels about the student beforehand.
(Recall the study in &lt;a href=&quot;/articles/confirmation-bias-part-1&quot;&gt;part 1 of this series&lt;/a&gt; of a girl playing in urban and suburban neighbourhoods.)&lt;/p&gt;
&lt;aside&gt;In many cases, it may be more important for us to maintain our belief
than to be accurate.&lt;/aside&gt;&lt;p&gt;When we overweigh or underweigh evidence in this way,
we usually require less evidence
to uphold a belief than to reject one.
Other factors are at play, such as our degree of confidence in our belief, the value of
making a correct conclusion, or the cost of making a bad decision. Our motivation for truth,
however, can still be outweighed by our need for self-esteem, approval from others, need for control,
and the internal consistency that confirming evidence provides. &lt;!--[NICKERSON1998]--&gt;
In many cases, it may be more important for us to maintain our belief
than to be accurate. Being wrong is painful and often seen as undesirable,
as reflected in the expectation that we &quot;have the courage of our convictions.&quot; &lt;!--[KLAYMAN1995]--&gt;&lt;/p&gt;
&lt;p&gt;Interpreting tests and evidence can be quite challenging. We feel we're
impartial and open, that we adjust our belief accordingly, but the opposite is
often true. For one, most of us have trouble with statistics and probability.
Even the most analytical minds fail to properly weigh the odds that a particular belief is
true given confirming evidence, which has its own probability of occurring (see
&lt;a href=&quot;https://en.wikipedia.org/wiki/Bayes%27_theorem&quot;&gt;Bayes' theorem&lt;/a&gt;).
Similarly, we rarely consider the odds that disconfirming evidence or an alternative belief is correct.
Most people also fall prey to a number of &lt;a href=&quot;https://en.wikipedia.org/wiki/List_of_fallacies&quot;&gt;logical
fallacies&lt;/a&gt; and generalizations,
which hinder the way we interpret tests and evidence.&lt;/p&gt;
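A worked example helps here (the numbers are invented for illustration). Bayes' theorem says the weight of confirming evidence depends on how often that evidence occurs even when the belief is false, which is exactly the term we tend to drop:

```python
def posterior(prior, p_e_if_true, p_e_if_false):
    """P(belief | evidence) by Bayes' theorem."""
    p_evidence = prior * p_e_if_true + (1 - prior) * p_e_if_false
    return prior * p_e_if_true / p_evidence

# Evidence we'd expect if the belief were true (80%), but which is
# also common when it's false (60%): the rational update is modest.
print(round(posterior(0.5, 0.8, 0.6), 3))  # 0.571

# The same evidence, rare when the belief is false (10%), is far
# more informative.
print(round(posterior(0.5, 0.8, 0.1), 3))  # 0.889
```

Treating both cases as equally strong confirmation, as we tend to do, overstates the first and so inflates confidence in the belief.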
&lt;aside&gt;How we test and interpret our evidence governs how we see new evidence.&lt;/aside&gt;&lt;p&gt;The feedback loop between evidence gathering, testing, and interpretation is ongoing. How we test and interpret our evidence
governs how we see new evidence. We can quickly
grow more confident in our belief, interpreting even ambiguous evidence as
supporting it.
&lt;!--[JONES2000]--&gt; This confidence can make it painful to give up our beliefs. We become more likely to
question information which goes against them than information which agrees with
them. &lt;!--[NICKERSON1998]--&gt; Searching for and interpreting evidence, then, can be an internal fight between
what is right and what feels good. Confirmation bias is not a simple error, but
an internally coherent pattern of reasoning. &lt;!--[JONES2000]--&gt;&lt;/p&gt;
&lt;p&gt;Confirmation bias grows and persists by way of a number of
tendencies. To combat our bias, we need to understand how our
beliefs can be so easily skewed by it. Another way to do so is to think
about our belief formation as influenced by a series of signals.
We're constantly receiving signals of the true state of the world, through our
senses and our interactions with it. Learning of something online, watching a video clip,
speaking with someone outside our circle — signals like these influence our
belief. A rational observer who perfectly rates each signal and applies it to
her beliefs would, after an infinite number of signals, always attain
near-certainty in the correct belief. &lt;!--[RABIN1999]--&gt;&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;signals.png&quot;/&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 4&lt;/strong&gt; Every signal we receive influences our belief. This chart is a probable timeline of the skew of someone's specific belief.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Few of us are perfectly rational, however. We may start our decision-making
process believing that two sides to an issue are equally valid. This
may change as soon as we receive our first signal. &lt;!--[RABIN1999]--&gt;
&lt;!--[NICKERSON1998]--&gt; As with the primacy effect,
that first signal may completely determine our final belief. Once we
begin leaning towards a belief, we may misinterpret further signals which
conflict with it. We may ignore or underweigh a conflicting signal, or
overweigh one which agrees with our belief. &lt;!--[RABIN1999]--&gt; Under bias, our belief formation may
quickly become a feedback loop. Every signal we receive may be used to defend
or justify our position. &lt;!--[NICKERSON1998]--&gt; Pretty soon, we're like a boat that's drifted off-course.&lt;/p&gt;
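This feedback loop can be sketched as a toy simulation in the spirit of Rabin and Schrag's model (all parameter values here are invented): an observer tallies noisy signals about a true state, but misreads some fraction of the signals that conflict with their current leaning.

```python
import random

def final_tally(n_signals, misread_prob, seed):
    """Tally noisy signals about a true state; a positive tally favours the truth."""
    rng = random.Random(seed)
    accuracy = 0.6          # each signal matches the true state 60% of the time
    tally = 0
    for _ in range(n_signals):
        s = 1 if rng.random() < accuracy else -1
        # Confirmatory bias: a signal against the current leaning is
        # misread as supporting it with probability misread_prob.
        if tally != 0 and (s > 0) != (tally > 0) and rng.random() < misread_prob:
            s = -s
        tally += s
    return tally

# An unbiased observer's tally drifts towards the truth on average;
# a biased observer can lock in whatever their first impressions were.
print(final_tally(1000, 0.0, seed=1), final_tally(1000, 0.5, seed=1))
```

With `misread_prob` at zero the tally behaves like the rational observer described above; raising it lets an unlucky early run of signals steer every later signal, the drifting boat in miniature.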
&lt;p&gt;The existence of our bias inhibits our ability to overturn false belief. If
severe enough, &lt;em&gt;further disconfirming signals may worsen our bias&lt;/em&gt;. &lt;!--[RABIN1999]--&gt; Even after an infinite number of signals, our bias
may compel us to believe with near-certainty in an incorrect belief. Chances are,
though, that we'll become convinced of our own belief and stop paying attention
to further signals. After processing a number of signals, our belief may go
from feeling natural, to feeling incontestable. &lt;!--[NELSON2015]--&gt;&lt;/p&gt;
&lt;p&gt;Given that tendencies supporting bias largely happen subconsciously and are
entrenched in our brains, can we do something about it? Can we debias
ourselves? Or are we doomed to keep making bad decisions based on
selective processes? Awareness is the first step, but what about debiasing
other individuals or groups we're a part of? &lt;a href=&quot;/articles/confirmation-bias-part-3/&quot;&gt;Read on for part three of this
series&lt;/a&gt;.&lt;/p&gt;
&lt;h3&gt;References&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a id=&quot;JONES2000&quot;&gt;[JONES2000]&lt;/a&gt; Jones, M., and Sugden, R. (2000). Positive confirmation bias in the acquisition of information. (Dundee Discussion Papers in Economics; No.  115). University of Dundee.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;NELSON2015&quot;&gt;[NELSON2015]&lt;/a&gt; Nelson, J. A. (2015). Are women really more risk-averse than men? A re-analysis of the literature using expanded methods. Journal of Economic Surveys, 29: 566-585.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;NICKERSON1998&quot;&gt;[NICKERSON1998]&lt;/a&gt; Nickerson, R. S. (1998). Confirmation bias: a ubiquitous phenomenon in many guises. Review of General Psychology, Vol. 2, No. 2, pp. 175-220.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;KLAYMAN1995&quot;&gt;[KLAYMAN1995]&lt;/a&gt; Klayman, J. (1995). Varieties of confirmation bias. In J. Busemeyer, R. Hastie, &amp;amp; D. L. Medin (Eds.), Decision making from a cognitive perspective. New York: Academic Press (Psychology of Learning and Motivation, vol. 32), pp. 365-418.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;RABIN1999&quot;&gt;[RABIN1999]&lt;/a&gt; Rabin, Matthew and Schrag, Joel L. (1999). First Impressions Matter: A Model of Confirmatory Bias, The Quarterly Journal of Economics, 114, issue 1, p. 37-82.&lt;/li&gt;
&lt;/ul&gt;
</content>
  </entry>
  <entry xml:base="https://ianstevens.ca/articles/confirmation-bias-part-1/">
    <title type="text">Three reasons why we make bad decisions, and why it matters</title>
    <id>urn:uuid:bca72bb4-547b-3d99-9806-4545b7506af2</id>
    <updated>2019-10-13T00:00:00Z</updated>
    <link href="https://ianstevens.ca/articles/confirmation-bias-part-1/" />
    <author>
      <name></name>
    </author>
    <content type="html">&lt;div class=&quot;admonition admonition-info&quot;&gt;&lt;p&gt;A while back I volunteered to contribute to a book on the behaviours and
history of political, legal, and socio-economic systems. It was to be a primer
for people creating products with the potential to disrupt those systems. My contribution was a chapter on
confirmation bias, detailing its effects, its workings, and how it can be
overcome. Though the book was never published, my research had me reconsidering my
behaviour. Always careful with my words, I started speaking even more purposefully,
not wanting to pass bias on to others. The experience had such an impact that I
couldn't let my chapter sit unread, and split it into three articles. This is the first
of the three, describing the pernicious influence of confirmation bias. The
other two explain &lt;a href=&quot;/articles/confirmation-bias-part-2/&quot;&gt;how it grows and spreads&lt;/a&gt;, and &lt;a href=&quot;/articles/confirmation-bias-part-3/&quot;&gt;what we can do to fight it&lt;/a&gt;.&lt;/p&gt;&lt;/div&gt;&lt;p&gt;&quot;THIS JUST IN: We're getting reports — around the world — of a disease rapidly spreading out
of control. The infected are in the millions, and have started behaving
irrationally. Some have been seen making poor judgments of their and others'
ability. Girls as young as six with the disease are viewing women as less
smart than men. Scientists are working on a cure, but a viable
solution could be weeks away. In the meantime, everyone is urged to use
caution.&quot;&lt;/p&gt;
&lt;p&gt;Thankfully, this is not a real news story. The disease isn't real, but the symptoms are, and almost
everyone on Earth is affected. If it were a disease, there might be a cure, or
at least a way to slow the contagion. There isn't. It's us, or more
specifically our brains.&lt;/p&gt;
&lt;p&gt;You may know this affliction as &lt;em&gt;confirmation bias&lt;/em&gt;, one of many cognitive or
unconscious biases which affect how we reason. Unlike a leaning or a slant,
like left- or right-wing bias, cognitive biases are the result of involuntary
mental &quot;short-cuts&quot;, leftovers from when we had to quickly tell friend from
foe, or avoid potentially dangerous situations. These short-cuts may
have kept our ancestors alive, but they impede the logic and accuracy
our modern world demands. &lt;!-- [NELSON2015](#NELSON2015) --&gt;&lt;/p&gt;
&lt;p&gt;Before confirmation bias had a name, people were thought to be largely
rational. An error in judgment was simply a matter of poor
reasoning. &lt;!-- [LARRICK2004](#LARRICK2004) --&gt; In his account of the Peloponnesian War almost 2500 years ago, historian Thucydides
called it a &quot;habit&quot;:
&lt;blockquote&gt;&lt;p&gt;&quot;… and their judgment was based more upon blind wishing than upon any sound
prediction; for it is a habit of mankind to entrust to careless hope what
they long for, and to use sovereign reason to thrust aside what they do not
desire.&quot;&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://en.wikiquote.org/wiki/Thucydides#Book_IV&quot;&gt;Thucydides, 460BC&amp;#x2011;395BC&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Over 400 years ago, Sir Francis Bacon was more inclined to consider this &quot;habit&quot; as a
trick of the mind:&lt;/p&gt;
&lt;blockquote&gt;&lt;p&gt;&quot;The human understanding when it has once adopted an opinion […]
draws all things else to support and agree with it. And though there be a greater
number and weight of instances to be found on the other side, yet these it
either neglects and despises, or else by some distinction sets aside and
rejects, in order that […] its former conclusions may remain inviolate.&quot;&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://en.wikiquote.org/wiki/Francis_Bacon#Book_I&quot;&gt;Sir Francis Bacon, 1561&amp;#x2011;1626&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;It wasn't until 1960, when psychologist Peter Wason performed his first
selection experiment, that confirmation bias finally had a name.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;wason.png&quot;/&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 1&lt;/strong&gt; Wason's selection experiment. Most people tested triplets which confirmed their rule, often based on the initial triplet (left). Only a few people rationally chose test triplets which didn't confirm their rule (right). The actual rule was always &quot;any ascending sequence.&quot;&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Wason's experiment was simple: present a person with three numbers (e.g.
2-4-6), and ask them to guess the rule for those numbers. That
person was then directed to test their theory with their own triplets,
and was told when each matched the &lt;em&gt;actual&lt;/em&gt; rule. Participants mostly
came up with rules specific to the initial triplet (e.g. &quot;numbers
increasing by two&quot; in the case of 2-4-6), and only tested them with triplets
which fit their theory (e.g. 9-11-13). They rarely chose triplets
which didn't agree with their theory (e.g. 9-10-11). Most, then, never
guessed the actual rule, which was always &quot;any ascending sequence.&quot;
Fig. 1 shows an example of the irrational approach most people took,
as well as a rational one. This improper selection of evidence is one of the
tendencies which supports confirmation bias.&lt;/p&gt;
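&lt;p&gt;The logic of Wason's task is easy to make concrete. The sketch below is a hypothetical re-creation for illustration (not code from Wason's study); it encodes the actual rule and a typical participant's narrower theory, and shows why confirming triplets can never tell the two apart:&lt;/p&gt;

```python
# Hypothetical re-creation of Wason's 2-4-6 task, for illustration only.

def actual_rule(triplet):
    """The experimenter's rule: any ascending sequence."""
    a, b, c = triplet
    return a < b < c

def participant_theory(triplet):
    """A typical participant's theory: numbers increasing by two."""
    a, b, c = triplet
    return b - a == 2 and c - b == 2

# Confirming tests (e.g. 9-11-13) fit BOTH rules, so the feedback
# "yes, that matches the actual rule" never distinguishes the
# participant's theory from the real rule.
for t in [(9, 11, 13), (20, 22, 24), (1, 3, 5)]:
    assert actual_rule(t) and participant_theory(t)

# Only a disconfirming test (e.g. 9-10-11) can falsify the theory:
# it breaks the participant's rule yet still matches the actual one.
assert actual_rule((9, 10, 11)) and not participant_theory((9, 10, 11))
```

&lt;p&gt;A participant who only ever proposes triplets passing their own theory will have every guess confirmed, and so never discovers the broader rule.&lt;/p&gt;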
&lt;aside&gt;Confirmation bias leads us to hold false beliefs with a confidence greater than evidence can justify.&lt;/aside&gt;&lt;p&gt;Of the many cognitive biases, confirmation bias likely does us the
most harm. It tricks us into accepting untruths and nurtures them
until we're certain they're true. It leads us to hold false
beliefs with a confidence greater than evidence can justify.
&lt;!-- [NICKERSON1998](#NICKERSON1998) --&gt; Those affected will often misinterpret new
information as supporting a previously-held but false belief.
It may not seem dangerous, but confirmation bias can change the way we view reality. &lt;!-- [RABIN1999](#RABIN1999) --&gt;&lt;/p&gt;
&lt;p&gt;The Wason experiment demonstrated one tendency contributing to confirmation
bias: selective evidence gathering. We may only see what our beliefs lead us
to expect. We also have a tendency to selectively interpret evidence for or
against a belief. We may add weight to information or events which support
our theories, and discount information which does not. When it comes to
remembering evidence, we tend to filter out or forget opposing views and their
supporting facts, and selectively recall evidence which supports our views.
These tendencies support confirmation bias (see Fig. 2) and lead to over-confidence in our
beliefs. Over-confidence, in turn, can lead to bad choices, sometimes
resulting in risky and extreme behaviour. &lt;!-- [RABIN1999](#RABIN1999) --&gt;&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;cbpillars.png&quot;/&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 2&lt;/strong&gt; Our tendencies support the formation of confirmation bias.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Horoscopes are a mild example of how confirmation bias can affect behaviour. Perhaps
you know someone who, in some small way, has acted on the advice of a
horoscope or a psychic. Such mediums remain popular largely
because we want to believe them. To reinforce our belief, we tend
to focus on or remember what we expect or want from them. We also more
readily recall the times they were right, and forget or discount
when they were wrong. Many &quot;readings&quot; also label the viewer as having
positive traits, such as kindness and generosity. People often
fail to consider how universal these traits are, and instead selectively treat
those readings as evidence of credibility.
This behaviour leaves us open to placing faith in astrology, fortune-tellers, and con
artists, who — knowingly or not — appeal to those traits so that we recognize
ourselves in their &quot;predictions&quot;. &lt;!-- [NICKERSON1998](#NICKERSON1998) --&gt;&lt;/p&gt;
&lt;p&gt;This same tendency to see or remember what we expect or desire can also feed
more serious conditions such as hypochondria and paranoia. Depressed people
may also focus on information which strengthens their depression, and ignore
more positive information which may help them. &lt;!-- [NICKERSON1998](#NICKERSON1998) --&gt; Over time,
this selective memory can affect and reinforce their &quot;core beliefs&quot; — absolute truths we hold about
ourselves — and lead to a negative self-image.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;https://www.pewresearch.org/wp-content/uploads/2018/01/FT_16.11.16_crime_trend.png&quot;/&gt;
&lt;figcaption&gt;&lt;strong&gt;Fig. 3&lt;/strong&gt; People's beliefs about crime rates are often at odds with reality. Source: &lt;a href=&quot;https://www.pewresearch.org/fact-tank/2016/11/16/voters-perceptions-of-crime-continue-to-conflict-with-reality/ft_16-11-16_crime_trend-2/&quot;&gt;Pew Research Center: Public perception of crime rate at odds with reality&lt;/a&gt;.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Tendencies contributing to confirmation bias can lead us to confidently make bad
decisions which affect ourselves and others. Many people wrongly believe that crime is rising, and vote for candidates who are tougher on crime, despite the reality (see Fig. 3). Maybe you know
someone — a friend or relative, perhaps — who seems to have nothing good to
say about a particular group. Have they gone out of their way to
avoid a member of that group? Or treated someone in that group
differently than they would others? Such stereotypes and prejudices
are largely fed by confirmation bias, and can influence how we treat
or view &quot;others&quot; — people different from us.&lt;/p&gt;
&lt;aside&gt;The tendency to make biased decisions based on confidence
in a stereotype isn't born of years of prejudicial thinking.&lt;/aside&gt;&lt;p&gt;Thanks to confirmation bias, we may more readily observe and recall
unusual behaviours in people of other ethnicities. This
selective evidence gathering and recall, left unchecked, can contribute to racist stereotypes. &lt;!-- [NICKERSON1998](#NICKERSON1998) --&gt;
Beliefs built on this tainted memory increase
our confidence in stereotypes, making us more likely to act on
them. &lt;!-- [NELSON2015](#NELSON2015) --&gt; Several studies have shown how bias can
change our reactions to people about whom we hold stereotypes — even
if we are only &lt;em&gt;told&lt;/em&gt; those people belong to a specific group.
&lt;!-- [NICKERSON1998](#NICKERSON1998) --&gt;&lt;/p&gt;
&lt;p&gt;In one such study, participants were shown a video of a girl playing. Half
were told the girl's parents were college-educated with white-collar jobs.
They were shown the girl playing in a well-to-do suburban
neighbourhood. The other half were told the girl's parents were
high-school graduates with blue-collar jobs, and were shown her playing
in a disadvantaged urban neighbourhood. People in each of those groups were then shown
the same video of the girl answering a series of questions, and were asked to
evaluate her reading level. The group which was told the girl was from
a well-to-do suburban family rated the girl's reading ability &lt;em&gt;significantly higher&lt;/em&gt;
than the group which was told she was from a poorer neighbourhood. Both
groups saw the same Q&amp;amp;A video, were given no other information, yet reached
different conclusions because of how they felt about where the girl lived. Consider what happens when caregivers and teachers approach their students with a similar bias. &lt;!-- [RABIN1999](#RABIN1999) --&gt;&lt;/p&gt;
&lt;p&gt;This tendency to make biased decisions based on confidence
in a stereotype isn't born of years of prejudicial thinking.
In a similar study, 5-7 year-olds were told
of a person who was &quot;really, really smart.&quot; The children were then
shown a picture of four adults — two women and two men — and were asked
to pick the &quot;really, really smart&quot; one. At age 5, boys and girls
chose their own gender roughly equally. Girls aged 6 or 7, however,
were significantly less likely than boys the same age to view their
own gender positively. In another study with different children,
boys and girls aged 6 or 7 were introduced to a game
&quot;only for kids who are really, really smart&quot; or one &quot;only for kids who try really,
really hard.&quot; The girls were less interested than the
boys in the game for &quot;really, really smart&quot; children, but not the game for &quot;kids who try really, really hard.&quot; &lt;!-- [BIAN2017](#BIAN2017) --&gt;&lt;/p&gt;
&lt;aside&gt;We don't see our bias. We feel we're being
reasonable, and we often are, but with skewed information.&lt;/aside&gt;&lt;p&gt;Like Thucydides, we may feel our reasoning is
stronger than others', that we never fall into the &quot;habit&quot; of
misjudging people, especially a child in a video, so easily. Yet people
who have studied reasoning and statistics can still have a problem
with confirmation bias and stereotypes. As an example, numerous
peer-reviewed studies claim to show that women are more risk-averse than
men. A 2013 &quot;study of studies&quot;, however, claims that
those studies and their authors were likely affected by stereotypes
induced by bias. &lt;!-- [NELSON2015](#NELSON2015) --&gt; The studies' authors reached
inaccurate conclusions by falling prey to tendencies
behind confirmation bias. Many inaccurately cited conclusions of earlier literature, or emphasized results
agreeing with stereotypes, while downplaying or omitting results which
did not. These confirming results were, in turn, more likely to be published.
Researchers overlooked situations where women
naturally take on a great deal of risk, such as childbirth or the risk of
domestic violence. Instead, areas of risk such as
finance were studied and findings extrapolated to a broader context.
&lt;!-- [NELSON2015](#NELSON2015) --&gt;&lt;/p&gt;
&lt;p&gt;Since Wason's experiment, many studies have shown that not only do we hold
cognitive biases, they can be difficult to correct. &lt;!-- [LARRICK2004](#LARRICK2004) --&gt; We're
usually unaware of our own confirmation bias. Worse, although our
interpretation of information may be biased, we still apply that data rationally to our
own model of the world. We don't see our bias. We feel we're being
reasonable, and we often are, but with skewed information. This bias-influenced reasoning may
make sense to us, but it results in bad decisions. Ignorant of our bias,
we may become over-confident in our beliefs and risk tainting future reasoning,
thereby reinforcing our bias. &lt;!-- [JONES2000](#JONES2000) [RABIN1999](#RABIN1999) --&gt;&lt;/p&gt;
&lt;p&gt;Confirmation bias exists and negatively influences human behaviour — so now
what? We can correct our own bias, although it can be difficult. Awareness helps a great deal, as it turns
out. There are also habits we can pick up to better keep our bias in check.
That's in &lt;a href=&quot;/articles/confirmation-bias-part-3/&quot;&gt;part three of this series&lt;/a&gt;. Our next step is to gain a better
understanding of &lt;a href=&quot;/articles/confirmation-bias-part-2/&quot;&gt;how confirmation bias grows and spreads&lt;/a&gt;, which brings us to
&lt;a href=&quot;/articles/confirmation-bias-part-2/&quot;&gt;part two&lt;/a&gt;.&lt;/p&gt;
&lt;h3&gt;References&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a id=&quot;BIAN2017&quot;&gt;[BIAN2017]&lt;/a&gt; Bian, L., Leslie, S., and Cimpian, A. (2017). Gender stereotypes about intellectual ability emerge early and influence children’s interests.  Science, 27 Jan 2017, Vol. 355, Issue 6323, pp. 389-391.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;JONES2000&quot;&gt;[JONES2000]&lt;/a&gt; Jones, M., and Sugden, R. (2000). Positive confirmation bias in the acquisition of information. (Dundee Discussion Papers in Economics; No.  115). University of Dundee.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;LARRICK2004&quot;&gt;[LARRICK2004]&lt;/a&gt; Larrick, R. P. (2004). Debiasing, in Blackwell Handbook of Judgment and Decision Making (eds D. J. Koehler and N. Harvey), Blackwell Publishing Ltd, Malden, MA, USA.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;NELSON2015&quot;&gt;[NELSON2015]&lt;/a&gt; Nelson, J. A. (2015). Are women really more risk-averse than men? A re-analysis of the literature using expanded methods. Journal of Economic Surveys, 29: 566-585.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;NICKERSON1998&quot;&gt;[NICKERSON1998]&lt;/a&gt; Nickerson, R. S. (1998). Confirmation bias: a ubiquitous phenomenon in many guises. Review of General Psychology, Vol. 2, No. 2, pp. 175-220.&lt;/li&gt;
&lt;li&gt;&lt;a id=&quot;RABIN1999&quot;&gt;[RABIN1999]&lt;/a&gt; Rabin, Matthew and Schrag, Joel L. (1999). First Impressions Matter: A Model of Confirmatory Bias, The Quarterly Journal of Economics, 114, issue 1, p. 37-82.&lt;/li&gt;
&lt;/ul&gt;
</content>
  </entry>
  <entry xml:base="https://ianstevens.ca/articles/everything-is-practice/">
    <title type="text">Everything is practice</title>
    <id>urn:uuid:4255c091-a804-38a4-b212-e1ef64a67e36</id>
    <updated>2019-09-30T00:00:00Z</updated>
    <link href="https://ianstevens.ca/articles/everything-is-practice/" />
    <author>
      <name></name>
    </author>
    <content type="html">&lt;p&gt;&quot;Does Practice &lt;em&gt;Really&lt;/em&gt; Make Perfect?&quot;. That was the title
of one of my middle school science projects. I forced my family to play
Perfection over, and over, and over again to see if they
got better with time. Spoiler alert: they did.&lt;/p&gt;
&lt;p&gt;If you don't know the game, the goal of Perfection is to place twenty-five shapes into
matching holes on a recessed tray before it pops up after one minute and ejects its contents — all
while a ridiculously loud timer ticks away every tenth of a second. It's
stressful. The pop at the end still surprises me.&lt;/p&gt;
&lt;figure&gt;
    &lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/BjLzzLq765Q&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture&quot; allowfullscreen&gt;&lt;/iframe&gt;
    &lt;figcaption&gt;Perfection — it'll give you nightmares.&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;Fast-forward twenty-five years. I chose not to renew a
contract so I could spend the summer with my infant son and my wife, then on maternity leave.
What better time to buy a new camera, what with a baby who can't
refuse photos and all. I had a DSLR gathering dust; its heft stopped
it from being a &quot;casual&quot; camera. It was time to buy something I could
carry without feeling like I was lugging a delicate brick. I
settled on a mirrorless camera — an Olympus E-M10, which I highly
recommend — for its portability and build. To get myself acquainted with the camera, I
decided to take a picture with it every day.&lt;/p&gt;
&lt;aside&gt;There's only one rule when taking that picture — a rule I'll repeat to myself when I feel uninspired: just try. &lt;/aside&gt;&lt;p&gt;Take a picture every day I did — for three years and counting. My camera has become
part of my wardrobe. Out for groceries? Take the camera. Dropping the
daughter off at daycare? Take the camera. Off to coffee with the wife?
Take. The. Camera. Some days I take one quick shot, other days I might take a
dozen thoughtful ones. On rainy days stuck inside I sometimes get anxious about satisfying my
daily photo fix. Yet I'm always able to take a picture.&lt;/p&gt;
&lt;p&gt;There's only one rule when taking that picture, a rule I'll
repeat to myself when I feel uninspired: just try. Anytime
I take that daily picture, I just need to try to do the subject
justice. It doesn't have to be a particularly good shot or even inspire pride. Framing,
composition, lighting, timing — they don't have to
meet with approval, but they should be considered.&lt;/p&gt;
&lt;p&gt;Looking back at three years of photos, it's clear I've improved
with everyday practice. Many once-favourite shots
look cluttered to me now. I still love some of my old photos, but
I'd treat many others differently now – or avoid taking them at all.
My &quot;eye&quot; has improved over the years. I now confidently experiment with new
techniques and new themes.&lt;/p&gt;
&lt;p&gt;After seeing how this daily practice bettered my photography, I looked for something else to try.
I settled on writing, aiming for a 250-word article each day. That same rule applied:
just try. Any topic would do. It didn't have to be thought-provoking, just
cohesive, with only a short time spent on edits. I kept that up for two weeks before stumbling,
and found that my writing improved even after such a short time.&lt;/p&gt;
&lt;p&gt;I didn't intend to practice crosswords every day, but I became addicted to them
some three years ago. It started as a way to pass the time on my commute — easy
ones in the commuter tabloids as well as more challenging ones from the
bigger papers. It wasn't long before I was regularly
finishing a crossword in a fraction of the time I used to, and with far more ease.&lt;/p&gt;
&lt;aside&gt;It's just practice, after all — it doesn't have to be perfect.&lt;/aside&gt;&lt;p&gt;These improvements through daily or almost daily practice convinced me to try
advancing other skills. My writing practice fell through
partly because it demanded a large commitment. Writing 250 words often took
more than an hour. Comparatively, photography and crosswords
take up less than 30 minutes — combined. I started to recognise
that even 5-10 minutes of practice a day was enough to improve small
skills over the long term. To that end,
I took up activities which could be worked into my schedule. For
instance, I switched from printing to cursive after buying a fountain pen and
found cursive to be more fluid for writing on my iPad.&lt;/p&gt;
&lt;p&gt;Practicing something new, even for a few minutes, can involve a great deal
of patience. Some days I don't feel up to taking shots with my camera.
When I was regularly practicing my writing, I often felt that starting was
the most difficult part. That's where my mantra of &quot;just try&quot; came from.
It's freed me from my expectations and allowed me to move on. It's
just practice after all — it doesn't have to be perfect. If I feel stuck
taking one thoughtful photo, I'll pick something — anything — to shoot.
I'll try to pick a decent perspective, try to frame it nicely, and try to get
something interesting out of it. If it doesn't work out, at least I've tried
and that's my practice for the day. I usually find, however,
that that first shot frees me from a desire for perfection and propels me on to
more creative photos.&lt;/p&gt;
&lt;blockquote&gt;&lt;p&gt;&quot;The common conception is that motivation leads to action, but the reverse is
true – action precedes motivation. You have to prime the pump and get the
juice flowing, which motivates you to work on your goals.&quot;&lt;/p&gt;
&lt;p&gt;Robert J. McKain, author of &quot;Realize Your Potential&quot;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;That drive to continue taking pictures after encouraging myself to &quot;just try&quot;
is a great example of action preceding motivation. I've put off
many tasks, even ones I enjoy, because I don't have the motivation. My
6-year-old daughter does something similar with her homework. I advise her to
&quot;just start&quot; and help her find a small and easy first step. Ten minutes later
she's finished her homework.&lt;/p&gt;
&lt;p&gt;This brings me back to my middle school science project with Perfection. If you've
practiced something for a long time you know that practice &lt;em&gt;never&lt;/em&gt; makes
perfect. There's always room for improvement. Instead, it
should be &quot;practice makes progress&quot;. It's progress we're after, no matter how
small, not perfection. Practicing something every day means freeing yourself from perfection.
Expect stumbling blocks,
resistance, or flawed output &lt;em&gt;and work past them&lt;/em&gt;. You
won't see progress day-by-day, but over time it will become more obvious.
Practice makes progress.&lt;/p&gt;
&lt;figure&gt;
&lt;img src=&quot;https://66.media.tumblr.com/421b8c8664c807d255a99edcb7143927/tumblr_p18qvuwAmZ1qiuiebo1_540.jpg&quot;/&gt;
&lt;figcaption&gt;People really underestimate the value of practice. Source: &lt;a href=&quot;https://sarahcandersen.com/post/168749352296&quot;&gt;Herding Cats by Sarah Andersen&lt;/a&gt;&lt;/figcaption&gt;
&lt;/figure&gt;&lt;p&gt;That progress — that daily exercise in something you want to do — can make you
happier. If you've ever practiced cognitive behavioral therapy (CBT),
you may recognise everyday practice as a form of the &lt;a href=&quot;https://en.wikipedia.org/wiki/Mastery_and_pleasure_technique&quot;&gt;mastery
and pleasure technique&lt;/a&gt;.
Under CBT, this technique works on the idea that people who have lost joy need
regular reinforcement to feel good about themselves. Routine activities where
people feel a sense of mastery using their skills and talents — no matter how
small — can boost self-esteem and confidence. Likewise, pleasurable activities — those
we find enjoyable and relaxing — can help us with stress. Together, these two
types of activities can improve self-esteem and add excitement and enjoyment to
your schedule. Regular practice of something you like fits both mastery and
pleasure. Practice makes you powerful.&lt;/p&gt;
&lt;p&gt;That regular practice can be anything, even something you might not consider
practice-able. An activity you can fit into your routine is best. Maybe
you want to get better at photography like I did — you're probably already holding something
which takes pictures! Commute using public transit? Practice your conversation skills by talking to
strangers without self-judgment. Practice can
also fill those moments you're idle: keep a sketchbook
on-hand for drawing, exercise your mindfulness, repeat card tricks, etc. There's no end to
things you can practice, even for only a few minutes at a time. Think of all the days you
have available — days in which you could slowly improve a skill you find
interesting. You'll notice yourself getting steadily better. You'll also feel better.
You just need to try.&lt;/p&gt;
</content>
  </entry>
  <entry xml:base="https://ianstevens.ca/articles/start-with-why/">
    <title type="text">For Great Products, Start With Why</title>
    <id>urn:uuid:c40464c5-94ca-3a88-85ac-5d65e9571465</id>
    <updated>2018-11-12T00:00:00Z</updated>
    <link href="https://ianstevens.ca/articles/start-with-why/" />
    <author>
      <name></name>
    </author>
    <content type="html">&lt;p&gt;I love finding the intent behind a rule, a concept, or even a mysterious part on a physical product. With that all-important question — Why? — answered, I’m free to dream up simpler ways that intent could have been satisfied. Getting to that kernel of truth can be very satisfying. Conversely, an unanswered Why can be quite frustrating, almost painful.&lt;/p&gt;
&lt;aside&gt;The graveyard of companies and products is full of those whose focus was What, not Why.&lt;/aside&gt;&lt;p&gt;When I started reading Simon Sinek’s &lt;cite&gt;Start With Why&lt;/cite&gt;, I was drawn to its simplicity. Inspiration, values, and purpose stem more readily from a Why than from How or What. To win the hearts and minds of your employees and customers alike, you need to start with Why. Sounds easy, yet the graveyard of companies and products is full of those whose focus was What, not Why.&lt;/p&gt;
&lt;p&gt;This reasoning behind &lt;cite&gt;Start With Why&lt;/cite&gt; forms the basis for a number of product management books I’ve been reading:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Marty Cagan’s &lt;cite&gt;Inspired&lt;/cite&gt; details the practice of product management, beginning with personas and product discovery — literally starting the product process with a Why;&lt;/li&gt;
&lt;li&gt;Dan Olsen’s &lt;cite&gt;Lean Product Playbook&lt;/cite&gt; shows how to isolate and focus on a problem (a Why) and iterate on a solution (a What);&lt;/li&gt;
&lt;li&gt;&lt;cite&gt;Crossing the Chasm&lt;/cite&gt; by Geoffrey Moore shows that making that leap to a mainstream customer base is harder if the Why behind your offering isn’t very clear.&lt;/li&gt;
&lt;/ul&gt;
&lt;aside&gt;Once you know your Why, you’re free to make any product which satisfies it.&lt;/aside&gt;&lt;p&gt;These three books have a common teaching: Success can be difficult if you’re tied to a particular What. What if it isn’t the right What? What if it’s disrupted by another company’s offering? Even worse, what if your What doesn’t inspire your employees or your customers? Clayton Christensen’s &lt;cite&gt;The Innovator’s Dilemma&lt;/cite&gt; lists many companies which were too focussed on their What, and couldn’t innovate their way to other products by adopting a clear Why.&lt;/p&gt;
&lt;p&gt;Once you know your Why, you’re free to make any product which satisfies it. You can adopt different techniques or constraints and come up with several solutions to test. You might even discover an entirely new product line, like other successful companies which kept their raison d’être in focus. Start by communicating a clear Why and bake it into your processes — not just because the possibilities are endless, but also because it will readily inspire trust from your employees and customers.&lt;/p&gt;
</content>
  </entry>
</feed>
