From YouTube: OCB: Digital Nudge: A conversation with Fabio Pereira and Andrew Clay Shafer (Red Hat)
Description
Fabio Pereira is the head of Open Innovation Labs Latin America at Red Hat. Fabio has over 18 years of experience, 10 of those at ThoughtWorks Australia where he acted as digital transformation advisor for several clients including the Australian Government Digital Transformation Agency. He's also the author of the book Digital Nudge. Fabio is passionate about human behavior and believes that understanding the hidden forces that drive the 35 thousand decisions we make every day can drastically change our lives. Join us for a conversation with Fabio and Andrew Clay Shafer as part of this #TransformationFriday on OpenShift Commons
A
All right, everybody, it's finally Friday; it's been a long week. Welcome back to OpenShift Commons. Today, as we do on Fridays, we have talks about transformation, so today is the #TransformationFriday conversation, and we have with us Fabio Pereira, who is the author of Digital Nudge and one of our colleagues at Red Hat.

This is really quite an interesting concept. I'm going to let Fabio introduce himself and introduce the topic, and then we're going to have a conversation with myself, Andrew Clay Shafer, otherwise known as @littleidea on Twitter, and anyone else who's joining in from the stratosphere can join us as well and ask questions. So we can get an introduction to this topic and get nudged. Fabio, take it away.
B
Thank you so much, Diane, and thanks, Andrew, for the invitation to speak to this group. I'm really flattered to be here. I would really like to understand what your take on the digital nudge concept is by the end of this, so let's definitely have a conversation about it at the end. I'm actually a Red Hatter; I work in Latin America.
B
I lead the Open Innovation Labs area of Latin America, and we are now rebranding ourselves to what we are calling Transformation Services, which is the area inside the services organization that deals with transformation. So we'll talk a lot about transformation here, and about what we mean by transformation.
B
I'm going to ask everyone who joined to respond very quickly in chat to a question. This is an experiment called the bat-and-ball experiment. If you have already done this experiment before, please do not answer, because there is something about it that you will already know. We really want you to think quickly and respond with the first thing that comes to your mind when I ask this question.
B
So here is the question: a bat and a ball together cost $1.10. The bat is one dollar more expensive than the ball. How much is the ball? Please type in chat how much you think the ball is. Because I'm sharing my full screen, I can't see the chat, so Diane could probably read out some of the responses that come in.
A
So the quick response is 10 cents.
B
Exactly, yeah. Usually the first thing that comes to our minds is 10 cents. But if you think a bit more carefully about this, you realize that if the bat is one dollar more expensive, the bat would be $1.10 and the total would actually be $1.20. So how come we make this mistake? We are rational people, we are smart people, and don't worry if you made that mistake: MIT students make the same mistake, and Harvard students make that mistake. The right response is five cents.
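The arithmetic he is describing can be written out as a quick sanity check (a small illustrative script; the variable names are mine):

```python
# Bat-and-ball puzzle: bat + ball = 1.10 and bat = ball + 1.00.
# Substituting the second equation into the first:
#   (ball + 1.00) + ball = 1.10  ->  2 * ball = 0.10  ->  ball = 0.05
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
assert abs((bat + ball) - 1.10) < 1e-9  # the correct pair sums to $1.10
# The intuitive "10 cents" answer fails the same check:
# $1.10 (bat) + $0.10 (ball) = $1.20, not $1.10.
assert abs((1.10 + 0.10) - 1.10) > 0.09
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")
```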
B
When I ask this question, I usually get maybe 10 to 20 percent of people saying five cents, and I like to talk to the people who actually responded five cents, because that's either someone who does that calculation very quickly in their brain or someone who has seen it before. So the bat is $1.05 and the total is $1.10. Now, we make 35,000 decisions every single day. That's an average number, of course.
B
It's a bit more for some people and a bit less for others, but it's an average, and it turns out that over 90 percent of those decisions we don't make consciously. We make them using a separate part of our brain, the fast brain. There is a Nobel Prize winner called Daniel Kahneman, who came up with the concept of thinking fast and slow. It is the most advanced and most recent way of explaining our thinking, and neuroscience actually uses it a lot.
B
It's almost as if the brain were hardware with two applications running on it. It has an application that runs very fast but makes more mistakes: the problem with the fast brain is that it creates shortcuts, and those shortcuts lead to a lot of mistakes. The slow brain spends more cognitive energy, so we spend more brain energy with it, but it's more accurate; it makes better decisions in the sense of being more accurate. That's what the 10-cents experiment is about.
B
To think about another shortcut: I lived in Australia for eight years. I'm originally Brazilian, but I wanted you to type in chat the first thing that comes to your mind when you think about Australia. Let's give it 30 seconds or so to see what comes up, and if you could also help me there, Diane, with what people are typing in.
B
That is definitely the shortcut we reach for when we think about Australia. I get a lot of koalas and kangaroos, and when I moved there in 2008, that's the first thing that came to my mind as well. But beyond koalas and kangaroos, there's a lot more to Australia. If we have any Australians here (probably not, because of the time zone, but I don't even know whether this community is global or just local)...
B
We oversimplify: we reduce a complex system to something very simple and say, Australia, a whole country, koalas. That's very common, but being conscious of the fact that we do that allows us to do it less. When you are doing this, be aware that you are oversimplifying a complex system. It's the same with a company: if you think "Red Hat," you are sometimes oversimplifying it down to one thing, and Red Hat is a complex system.
B
A group of 14,000 people is definitely a complex system. The same thing happened when I talked about Brazil while I was in Australia. When I said I was Brazilian, the first thing that came to people's minds was soccer and beaches, and people assumed I played soccer. I'm actually very bad at soccer; people would invite me to go and play with them, and I suck at soccer.
B
Another shortcut is a name. If you think about the name Pelé, the first thing that comes to mind is the soccer player, but it's not necessarily that; someone else could be named Pelé. In fact, my name is Fabio, and in Australia I had a lot of challenges with that name, actually fun challenges, because in Australia nobody names their kids Fabio. There is one guy who is very famous, and every time people think about the name Fabio...
B
...he is what comes to their minds, because this guy is on the covers of magazines and books and everything, and his name is Fabio. So I had a lot of trouble, because when I said my name was Fabio, people thought I was joking. I tried to buy a wig, and that's the closest I got to being that Fabio. But that's also a shortcut.
B
Dan Ariely is a scientist, and he proved scientifically that we are irrational, and that we make irrational decisions in a predictable way; his book describes all the experiments he went through to prove that. I started studying this about 10 years ago, in 2010, and I got really excited about the concept of being predictably irrational, because I had thought that I was rational, that my decisions were always the best decisions, and I started realizing that I wasn't. So I started studying something called cognitive psychology.
B
Cognitive psychology is a subarea of psychology that uses behavioral economics to understand public choice and decisions in general. Those are the areas that I really study, and one way that Dan Ariely explains our irrationality, our cognitive biases, is by comparing them with the optical illusions that we have.
B
When you look at the blue circle on the left, it looks slightly bigger than the blue circle on the right, because what's around it actually influences the way we see things. We all know that we have optical illusions, that we don't really see things clearly, and we understand that the image that gets to our brain has some problems. Square A and square B here are actually the same shade of gray. When I looked at that one, I thought:
B
Oh no, I am definitely broken; my brain is definitely broken. I actually cut out parts of A and B to compare them, and it really is the same shade of gray. It turns out that what's around A is light, so A looks darker, and what's around B is shadow, so B looks lighter. That is one of the most famous optical illusions we have, and the way Dan Ariely explains our cognitive biases is by showing some decision making.
B
I asked about this chart in a talk once. I said, "What do you guys think this is?" and people said it's the number of Austrians, which could actually be very clever. But this is the number of organ donors. It turns out that the default in Austria is that you are an organ donor and you have to opt out, while in Denmark it's an opt-in system: you have to go and say, "I want to be an organ donor."
B
I was asking myself: what is the status quo bias, what is scarcity, what is loss aversion, and what are all these things that I have in my brain? I kind of became that little boy from the movie The Sixth Sense who says he can see dead people: I started seeing biases everywhere and visualizing them everywhere.
B
Where do I click? It turns out that 91.5 percent of people click on a result from the first page of Google. There's actually a joke online that says that if you kill someone and want to hide the body, you should put it on page two of the Google results, because no one goes there. So I was thinking: if I click on a result from the first page, did I make that decision, or did Google make that decision for me? And I was thinking, this is the default at work.
B
Putting a result on the first page of Google is using the status quo bias; it's nudging me, influencing me, to make that call and actually make that decision. There are a few other biases I want to highlight here. One is called the Dunning-Kruger effect, which is that our level of confidence about some knowledge is much higher when we don't know anything about it. It's an effect showing that when someone knows almost nothing about a topic, they actually think they know a lot.
B
And when we get to know something about the topic, we realize: now I know that I don't know anything about this topic, so the level of confidence drops a lot. It's very interesting, because knowing this lets us deal with people in a better way. If we're dealing with someone who thinks they are an expert on something, it could actually be that they are at the top left of this graph and know nothing, or almost nothing, about it.
B
The IKEA effect is another bias that I like, and I use it a lot. The IKEA effect says that we exaggerate the value of, and get attached to, something that we help to build. If you build something, you get attached to it and you feel like it's yours, just like when you go to IKEA, buy some furniture, and assemble it yourself.
B
So if you want someone to get engaged with an idea or with something you're doing, get them to build it with you, because if you come with the thing already pre-built, they will not feel as much of that IKEA effect. The IKEA effect was published by the Harvard Business Review, and Dan Ariely is one of the people who officially documented it.
B
It's actually a mid price: $699 is not the most expensive option, because there's an iPhone 11 Pro Max for $1,099. That is also a way to influence people to look at a price and not think it's the most expensive one. This bias is called framing, where you frame a decision in a certain way so that people see it differently. There's an amazing show on Netflix that I watched, and I was surprised by it.
B
I helped the Australian government with what was called the Digital Transformation Agency, an agency set up by the prime minister, and we had a whole innovation hub to help build simple, clear, and fast public services using digital. We used a lot of these concepts at the Digital Transformation Agency; that was from 2015 to 2018.
B
I joined Red Hat to launch the Open Innovation Labs area of Latin America, but one thing I wanted to bring back from the DTA work was something called the Digital Service Standard. It was a set of 13 principles that everyone in government used to build digital services, and standard number one said we have to understand user needs. For us here at Red Hat, I would say we have to understand our clients' needs, because we need to build something.
B
We need to bring clients things that are related to their needs; I believe that anything we build should be solving someone's needs. During that time I also wrote an article called "Five Cognitive Biases to Avoid During Discovery." Instead of going through all five, I'll explain just one: the confirmation bias. The confirmation bias means that during research, when we're trying to understand what someone needs, we might ignore their pain points because they don't fit some existing assumptions that we have.
B
I'll give you an example. If we have OpenShift and we go to a client, the only needs we'll think about are needs for OpenShift. We'll think, "Oh yeah, this client needs OpenShift," so we might be confirming a bias, a pre-existing assumption, as opposed to going to a client with a blank mindset and actually thinking: let me understand this client's needs, regardless of what I'm trying to bring to them.
B
How many of those decisions are things that you delegate to an application, or to Google, or to Netflix, or to anything digital, and how many of them are augmented decisions that you actually make using the digital world? I was signing up to Netflix a while ago, and there was a checkbox that said, "Please do not email me Netflix special offers," and I was thinking: hang on. If I don't do anything, I will receive the offers. So this is pretty much the default bias, right?
B
It is the do-nothing decision applied to a form. By not checking the box, I am actually deciding to receive the offers; to opt out, I have to check the box. And I was watching Netflix and thinking about the autoplay functionality when we watch a series. Say you're watching episode one of Black Mirror, which is a TV show...
B
...that I love. You actually don't have to do anything to watch episode two, because the autoplay feature has a decision embedded in the do-nothing: if you do nothing, you will keep watching. So think about all the decisions that you make without doing anything. We make a lot of decisions because others make those decisions for us.
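The opt-out checkbox he describes can be sketched in a few lines (a hypothetical form handler; the names `signup` and `opt_out_of_offers` are mine, not Netflix's):

```python
# Default bias: the "do nothing" path makes a decision for the user.
# Because the checkbox is phrased as "do NOT email me offers", leaving it
# unchecked (the default) silently opts the user IN to marketing email.
def signup(email: str, opt_out_of_offers: bool = False) -> dict:
    """Return the account settings produced by submitting the form."""
    return {"email": email, "receives_offers": not opt_out_of_offers}

# Doing nothing means receiving offers:
print(signup("user@example.com"))  # receives_offers is True
# Opting out requires an explicit action (checking the box):
print(signup("user@example.com", opt_out_of_offers=True))  # receives_offers is False
```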
B
Amazon recently patented something called anticipatory shipping, which is an algorithm, a whole business model, around shipping things to us before we buy them. They understand our behavior online, and the algorithms (machine learning, artificial intelligence) will decide that we want to buy a certain product, and that product will arrive. That is designing a digital customer experience to send people something without them even clicking the button to buy.
B
UX people are very powerful, because they actually create the decision environments where we make decisions online, and then I realized that we are all digital decision architects. We are all creating environments for people to make decisions. If you send an email, sometimes you want someone to make a decision. So at the end of the day, we are all digital decision architects. To bring some examples of companies that use this concept: Lemonade is an insurance startup in San Francisco.
B
The latest I heard from Lemonade is that they've been valued at $300 million, so they went from $13 million to $300 million, and one of the things they did, if you want to know, is set the record for the shortest time in which someone could get a claim approved. There was a claim that was approved in three sec... sorry, did anyone say something?
B
Three-second claims. But one interesting thing I want to highlight is that Lemonade has inside it what's called a behavioral lab, which is a way to understand people's behaviors. That's why I want to go back to the word behavior: remember Dan Ariely, the guy I mentioned who had impacted me with his book? Dan was hired by Lemonade as the CBO.
B
He was the chief behavioral officer of Lemonade, and when I saw that, I started looking around, and it turns out this is a bit of a theme in the market: companies have been hiring people with the title of chief behavioral officer. I became a huge fan of Dan Ariely. I actually approached him, and we've been communicating since 2015.
B
I met him at a conference, and he's been giving me tips every now and then, nudging me in one direction or another; in fact, the book was one of the tips he gave me. I was going in one direction, and he nudged me in another direction that turned out to be the book. I also want to bring up another very inspirational name: Richard Thaler. Thaler and Cass Sunstein wrote a book called Nudge, and Thaler won a Nobel Prize in 2017.
B
Three years ago he got the Nobel because of his contributions bridging economics and psychology; he merged these two worlds. He created a way for us to better understand economics and decision making through psychology, and that is what earned him the Nobel Prize, along with the nudge, which is his concept.
B
What is a nudge? Nudges are small but powerful interventions in the environments where we make decisions, and a digital nudge is making that intervention using the digital world: using a digital device, an application, anything digital. That's when I came up with the concept. The digital nudge concept came up a few years ago and became the book, and it has two audiences. People ask me: what is the audience?
B
The audience is the digital decision architects, anyone who uses digital environments to nudge people into making better decisions, and also the digital citizens, which is all of us. We all live in the digital world; we all have access to the internet. Here we are on BlueJeans having this community call, and this is pretty much half the world: roughly half of the world's population has access to the digital world. They are digital citizens.
B
Then the book came out: Digital Nudge. I've already had a few events around the book. I officially launched it last year at a conference in Denmark, the GOTO conference in Copenhagen.
B
It was amazing. I actually managed to put the book in the hands of Wozniak, Apple's co-founder, and I was so amazed when I gave it to him. He looked at the book and said, "Wow, I should read this," and I hope he read it. If you want to think about which other books have anything to do with this one, this is a list of some of them.
B
If you've read any of those other books, like Homo Deus, The Fourth Industrial Revolution, Hooked, or The Power of Habit, all of these books are correlated, and the Digital Nudge book has a lot to do with them as well. And some people ask me: Fabio...
B
"If I know about these things, can I manipulate people?" And I say yes, because nudging is a tool, and it's a very powerful tool. It's like a knife: if you have a very sharp knife, you can use it to harm someone, or you can use it to cut, say, a sashimi in a very nice way. So it is a tool, a very powerful one, and it can go bad.
B
Nudges can definitely go bad if you use them in the wrong way, because you can influence people the wrong way, and I like to say that there is a Spider-Man effect with nudges: with great power comes great responsibility. You do have the responsibility to nudge people in the right direction. But how do I know what the right direction is?
B
I like Nir Eyal's definition of the two types of influence. One of them is persuasion: influencing people to do what they want and what they need. Coercion is the ability to influence people to do what they don't want and don't need. There is a good website called Dark Patterns that has mapped a lot of these dark nudges, things people do that we don't really like. One of them, for instance, is called "sneak into basket."
B
It works like this: let's say you bought a tablet, and automatically, by default, you have a case for your tablet in your basket. Why is it that we accept that online? If you were shopping at a supermarket with your trolley, and someone stuck something in your trolley, would we accept that?
B
No, we wouldn't, unless it's a bottle of vodka, because I like vodka, and that's actually a good vodka. But you wouldn't want that; you would definitely realize that someone had put it in your basket, and you would not accept it. So I decided to start a movement that I call the Digital Nudge for Good movement, which is pretty much a mapping of all the ways that we digitally influence people for good.
B
I started asking questions during conferences. I asked people: can you tweet, can you post on social media with the hashtag #DigitalNudge, anything that you feel is good? So I started keeping track of those things. One of them is a device that a friend of mine actually had. He had diabetes, and he had a device that helped him decide when to take his insulin, automatically, via an application on his phone connected to the device.
B
It was constantly measuring his glucose levels, and he was nudged to make a decision; he was actually offloading his brain. And diabetes is a huge issue right now: at the moment, one in 12 people has diabetes. There's actually another company, called Iuda Heuristics, which has recently been renamed Quintec.
B
It has artificial intelligence to help people make better decisions around diabetes. Another digital nudge for good is this cap, developed by Ford. It is called the SafeCap, and it has an accelerometer and the intelligence to recognize when the head movement of a truck driver indicates he is about to fall asleep. They worked out what that head-movement pattern looks like in order to predict sleep, which is very interesting. So they nudge the driver, and the outcome is fewer accidents.
B
So I've been thinking, at Red Hat, about how our technology helps people. If we were to complete that map, we should definitely think about it in those terms: what is it that we help people with, with our technology? Because we are definitely nudging people to do things in a certain way. And to finalize, I want to talk about a framework published by the University of Liechtenstein, by Markus, Christoph, and Jan. They published an article called "Digital Nudging," which came out after the book did. I got in touch with them, we are really connected, and it's a framework of five steps to create a nudge to influence someone. I'm not going to go through all of the steps now, but if you want, I can share it with you.
B
It's a very well-defined framework: five steps to influence people to make decisions using a digital nudge.
B
Now, "Fabio, what do we expect from this?" I think we can see ourselves in two of these boxes, if you will, or roles in this world. We are decision architects, helping people make decisions, so we should be nudging for good. And we are also digital citizens, so we should be raising our digital consciousness in order to understand how we get influenced, so that we don't get influenced as much. And for reflection:
B
One question is: what are we doing to influence the irrational brain, both internally and at our clients, at Red Hat? How might we find ways to influence our clients, or anyone we interact with, to drive success through behavioral change? And I want to leave you with one call to action: think about one action, one thing that you will be doing differently from now on, because learning is acting.
A
Well, thank you, because I think that's a really nice synopsis of the book and of the point of view that you have on all of this.
A
One of the things that you talked about and brought up was persuasion versus coercion, and I was thinking about this when you showed the example of the opt-in and opt-out, like having to opt out in order to actually opt in, or whatever that little thing was. And I know right now, with everybody signing up for virtual events and things of that nature, we put up registration pages, and there's a lot of government regulation around it, like GDPR, and you actually have to physically...
A
You have to opt in; there's some regulation in there. So besides deciding to do good, there's also a lot of regulation now being imposed, maybe not on everybody, obviously, but on a lot of people, about that opt-in and opt-out stuff. I wonder if you could talk a little bit about that, and maybe, because you worked with the Australian government, whether that came into play at all. I'm very curious, because it's affecting how we do things in this new virtual world.
B
Yes, definitely. So, GDPR and all the data-regulation rules; in fact, in Brazil there's one now called LGPD, which is pretty much GDPR for Brazil. It has something called explicit consent. Explicit consent means that you cannot assume that someone is giving you consent to either share their data or receive things. So there is a lot of regulation around that consent, and the opt-out version is pretty much an implicit-consent version of it.
B
What came out of GDPR and these other rules is that the default bias cannot be exploited anymore for certain things. It is very much regulated right now, especially around data sharing: who can have access to your data now needs explicit consent. And I see that some governments, like the UK government and the American government, all work with behavioral units; they all give the units different names.
B
The UK one is called the Nudge Unit, and governments are usually very much influenced by this concept of behavioral economics and nudging. The Nudge Unit in the UK is very powerful in helping the government create legislation and understand what is and isn't possible from a behavioral-economics perspective. Obama, in the U.S., had a behavioral economics unit that helped him a lot; there are a lot of articles published by that unit, which provided a great deal of advice to Obama at that time.
B
The Australian government also has a nudge unit. It's usually referred to as the nudge unit because the UK one is a bit of a reference point. There are definitely units of government that inform and advise the government and change policy in ways that nudge people in a direction, and that's where morals and ethics come into play. It has to be based on the morals and ethics of where governments think the population should be going, and that's very controversial, because ethics and morals are not a black-and-white thing.
A
The other thing you talked about a little bit that I thought was quite interesting was the confidence level, the Dunning-Kruger example, and I loved the chart with the hundreds of different biases. I've got to get that, frame it, and put it on the wall in huge poster size, because for me that was really an eye-opener. We usually only think about the top five that you were talking about.
A
But I can remember, a long time ago, and I apologize, I don't know who to attribute it to, there was a quote that went around and still gets heard: it's not what you don't know; it's what you know you don't know. That really is the scary thing, and I think that's relevant when we're making decisions and things of that nature.
A
I was totally confident that I could fix the leaking water main in my backyard, until I dug up the pipe, looked at it, and went, "holy crap." That was this past weekend. So then I called an expert. It was not my area of expertise, but I thought, yeah, how hard can this be?
A
You know, duct tape. But I think with a lot of those different biases, when we're doing things like usability and user design and interfaces, especially in tech, we have to take those biases into account and make sure that we're not perverting them for our own ends and things like that.
A
I was thinking about that in terms of talks I've been giving about end users getting more involved in contributing to open source projects, and in some ways we are encouraging that. If you look at the CNCF and all the projects, and people donating code, like Lyft with Envoy, Jaeger coming from Uber, Spotify giving Backstage, all these wonderful projects, plus people from enterprises like Amadeus contributing back into them.
A
So even if maybe it's not perfect anymore, like some of those pieces of furniture that I've put together with a drill instead of the holes IKEA provided, that gave me real pause, because I have to give a talk on this in a couple of days, and I want to see how I could incorporate it, because I think that is actually what's happening. It's not that the software is getting worse; it's that we're seeing more end users collaborating with vendors like Red Hat and IBM and VMware and Dell, and everybody in the upstream, which means they're getting more confidence in using and understanding the product, but also that they like it better.
B
Kind of an interesting... I love that insight, yeah. I had never had that insight on open source: like, if I wrote one line of code, and when I submitted it and my pull request got accepted, I feel like that product is a bit mine, like there's a piece of me inside of it, so I feel like I own it, I love it. Yeah.
C
I'm still stuck on the last analogy you used in your call to action, and I think that you could do a lot better than that, but we'll come back to that; I won't dwell on it right now. I think that this is super common, and in, you know, the history of DevOps there have been a number of interesting talks.
C
People gave talks that kind of catalog cognitive biases, particularly when you start thinking about incident management and some of these other things; that's where it gets interesting. But this IKEA effect is also... the way I always say this is that we attach our ego to our artwork.
C
Like, if I made a thing, then it's like you said. But another thing that I keep thinking about, watching the actual evolution of this industry... and you mentioned a bunch of these books, and I haven't read all of them, but there's this documentary out on Netflix about, kind of, the social media stuff that's been going on, and we've created huge fortunes focused specifically on exploiting these anti-patterns, or dark nudges, or whatever.
A
Yeah, I think in that movie you're talking about, there's a scene where there are like three clones of the same guy, and every time someone does something on the internet they're like, oh, let's push this to him, and let's push that to them. They're sort of like this, and you kind of have that image of what's going on when you're on Facebook or Amazon or Netflix, like what Netflix suggests to me.
A
You have this thing, and it is very lucrative, these nudges, and how you weigh in with companies to do what's morally right, yeah.
B
What scares us right now is that the digital world has given us access to anyone, any of the seven billion people, or even half of them, the 3.2 or 3.5 billion who have access to the internet right now, and that is what's brought it to an exponential situation. That is the big difference. But you're right: influencing for bad has always been a problem, since we've existed.
A
There's another thing you touch on in the book, Fabio, about VR and augmented reality, and how the technologies... I mean, we go from radio to the pen to all of these technologies that influence what we see and how we interact with the world, and it's only going to get more exponential, I think, until we do things like kill Facebook off, you know, log out, take the time to read a hard book and be influenced in a way that's not a nudge; that's a significant, life-changing thing, reading.
C
I don't see us doing it. We like the dopamine. So, another thing that you said earlier in the talk that I kind of got fixated on, and I've been thinking about a lot in other contexts as well, is this idea that you want to become conscious of this, right? You want to become unbiased, and that's sort of a paradox, because some of the things that make these biases, this irrationality, or the shortcuts, or whatever...
C
However you want to frame it... is because it's deeply subconscious. And there's definitely things we can do, there are mindfulness practices, and we can be thoughtful, but at some point you literally have to exploit, in both senses of that word, those shortcuts to be able to participate in reality.
B
Yeah, yeah, you're right. In fact, Richard Thaler mentions in his book Nudge that there is no way for us to be fully conscious all the time, because we do not have enough cognitive capacity to make all those decisions in a conscious way. So when I talk about expanding our consciousness... in fact, the book in English is called Digital Nudge, and in Portuguese it's called Digital Consciousness, because in Portuguese I focus more on that side of it.
B
What I want to do is raise people's consciousness, and what we can do is minimize the negative impact of the times when we are being manipulated by the nudges we get, by being conscious about it. I'll give you a real example.
B
The other day, I was on Booking.com, booking a room at a hotel, and I saw a message saying that this is the last room available at this hotel, and there was another message that said there are 15 other people looking at this room at this very moment. So in my mind I trigger my consciousness button and I go, hang on, they are trying to use my scarcity bias to influence me to click on the buy button right now.
A
And this is an interesting thing, too, that comes into play. When I see that on Booking.com or anyplace else, I have a trust issue, right? Do I trust that vendor, do I trust that website? And I think the importance here is... I totally agree, because if we didn't raise our consciousness and awareness of these biases and how we're being triggered, the t-shirt would read:
A
Resistance is futile, join the Borg, right? But I think the other piece of it is: the people that you're interacting with online, do you trust them? And I know from Red Hat's side, you know, we always tout ourselves as a trusted vendor for our partners and our end users and our customers, but how does trust work into this dynamic, from your... yeah.
C
So, a really quick comment on this particular example, and I won't name any names, and it's been a while since this happened, but there was this little thing where it's, like, trying to create this scarcity.
B
Yeah, I love that, because the same way that the GDPR came up with explicit consent, I wish we came up with some sort of a stamp, the same way that we have for food. I don't know about other countries, but we have an organic food stamp, right, which is a regulatory thing to say that that food is organic. Like, I wish one day we had some sort of stamp of, like, organic content, or an organic digital platform, or whatever.
B
We are here to say that this thing has been verified, that there is no one trying to do those things, those dark patterns, and they are not trying to use a nudge that is not real, because that fake JavaScript number is a fake nudge. It is definitely a nudge for bad.
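[A minimal sketch of the "fake JavaScript number" Fabio describes, assuming a counter that is invented in client-side code rather than read from real data; the function names are illustrative, not taken from any real site.]

```typescript
// Hypothetical illustration of a fake scarcity nudge: the "demand" shown
// to the user is generated at random on the client, so it is a fabricated
// number, not a measurement.
function fakeViewerCount(): number {
  // Always claims 10-19 people are "looking right now".
  return 10 + Math.floor(Math.random() * 10);
}

// An honest counterpart: the nudge is only shown when backed by real data.
function honestViewerMessage(liveViewers: number | null): string {
  if (liveViewers === null || liveViewers < 2) {
    return ""; // no real signal, no nudge
  }
  return `${liveViewers} people are looking at this room`;
}
```

[The point is not the code but where the number comes from: the same UI message is a legitimate nudge when `liveViewers` is actually measured, and a dark pattern when it is invented.]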
A
Which then brings you to the whole concept of fake news. And I don't know how many of you out there watching along watched the debates last night, but one of the things that Dan Ariely talked about in, you know, some book or somewhere was about when you swear in, when you go to court, and you say, "I swear to tell the truth, the whole truth, and nothing but the truth," at the beginning of a court trial. And now, his example was, they flipped it.
A
He said, when you do your taxes, you sign off at the end, you verify at the end, so you have this propensity, maybe, to make little lies through the whole thing; but if you make the commitment up front that you're not going to lie... maybe in the presidential debates, maybe the next round of them, every presidential candidate has to swear up front before they go on stage. It's not just a COVID test; it's also, you know, you swear on a stack of whatever book that...
A
We trust it means something to you. But, you know, this whole idea of setting up the interaction to start with... and this is where maybe we go back to UX design and stuff like that, and a certification, an organic certification or trust certification: those are really hard things to do. Even for software to get certified, we have all this testing and stuff that we set up. But I think it's an interesting concept to figure out.
B
The same way... the same thing that we have for, like, a penetration test, and things like OWASP, the OWASP Top Ten list of vulnerabilities that we can look for. One of the dreams I have is to have a global organization that thinks about this in that way and actually prevents it from happening: that we not only have a penetration test for security, but we also have a test for these kinds of bad nudges and fake nudges and coercions.
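[As a thought experiment on what such a test could look like, here is a hedged sketch of a "nudge audit": scan page copy for scarcity and urgency phrasing the way a scanner flags known vulnerability patterns. No such checklist exists today; the phrase list below is entirely made up for illustration.]

```typescript
// Hypothetical nudge audit: flag scarcity/urgency phrasing in page copy so
// a reviewer can then check whether each claim is backed by real data or
// is a fake nudge. The patterns are illustrative, not a real standard.
const SCARCITY_PATTERNS: RegExp[] = [
  /last\s+(room|item|seat)\s+available/i,
  /\d+\s+(other\s+)?people\s+(are\s+)?(looking|viewing)/i,
  /only\s+\d+\s+left/i,
];

function auditCopy(text: string): string[] {
  // Returns the source of each pattern that matched the copy.
  return SCARCITY_PATTERNS.filter((p) => p.test(text)).map((p) => p.source);
}
```

[Run against the Booking.com-style messages from earlier in the conversation, `auditCopy("This is the last room available. 15 other people are looking at this room.")` flags two patterns; honest copy with no manufactured urgency flags none.]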
A
Yeah, I think that's the thing that we try and do: we talk companies and communities through digital transformations, and making codes of conduct, and doing all these things, but planting the seeds of awareness, helping people figure out how to raise their consciousness and awareness of all their own biases.
A
We see that reflected in the politics of the day globally. And then, you know, I do go back a lot to building trust across the ecosystems, with vendors and partners and end users, and the importance of trust, you know, and people allowing themselves to be digitally nudged.
A
You know, I personally read through all of the stuff, and a number of the little books... not the little, but the important ones, the Pulitzer and Nobel Prize-winning books that you had up there. And, you know, I think about health and fitness, and, you know, on your watch, getting a nudge to stand up or to breathe. Like, I have a nudge that reminds me to breathe. There are some things where, yeah, really: take a deep breath, Diane, don't get so stressed out, right?
A
I have to do that digitally so that I remember. It's almost... I don't have diabetes, I have, like, stress. So I think that's the key thing: some of these nudges, I trust Apple and the app to do that with me, right?
C
We probably don't have time to go all the way through this, but I have this gamified life insurance policy that pays for me to have an Apple Watch, and then I get points for doing exercise, I get points for meditation, I get all these things, and, I mean, there's some quantitative model somewhere.
A
Well, I kind of feel we're winding down a little bit, which is great, and I love the Christmas list of books that you put up there earlier. I think one of the things we've been teasing about, Andrew and the other folks from the GTO, is going forward to, like, creating this coveted reading list, or holiday reading list, like our wish list for books, and I think you've added at least seven or eight mentions of different things that now people could read and add and append to that.
A
So I'm really grateful that you shared this with us. I love the concept of nudge and digital nudges, and the way that we at Red Hat... we have so many different products and projects that have user interfaces that do these kinds of digital nudges, and how important it is for us to be aware and continue to build trust with our end users, whether it's the OpenShift console... and I'll put in a plug: there's an OpenShift console customization contest going on now.
A
If you follow me on Twitter, there's a couple of tweets about that. But it's like: how do you create those user experiences, whether they're VR or consoles or CLI or augmented realities, where people trust you? And so, as we go about our day-to-day jobs and work with our end users and on our products, making sure we hold these things and raise some consciousness and awareness about all of these biases.
B
Well, let Andrew go last, because there is actually one of our biases at work here: however something ends is the thing that we remember the most. It's called the peak-end rule. That's why, when we see a movie, if the ending of the movie is not good, we feel like the whole movie sucked. So I'll let Andrew finish, so it's not that bad.
C
So, memory tests show you remember things at the end and at the beginning better, and that you get more fuzzy in the middle. Anyway, I just think it's a pleasure to have Fabio come talk to us about this. I really like the thought you put into this, and, you know, the work that you do as part of the team at Red Hat, and so it's just fun to get you on and have you kind of explore some of these topics, and, you know, I...