From YouTube: Problem validation pilot with GV
A
So the goal for this meeting today: Michael sent over a research brief that we could follow and fill in details about the company, the problem that we're trying to solve, and how we see our users and our customers. So for the rest of the meeting, I think Michael will be going through that doc. Unless you had any other agenda items? No.
B
Yeah, so that doc, I think, will be kind of a guide. Thank you for taking the time to fill all that out. It's super helpful to me to see it. I'm curious what it was like for you to fill it out, but I want to talk through it anyway, especially because I'm not a developer, I'm not a sysadmin, the core people you're after, and so some of it I'll just need explained.
B
Great. So part of what you can see I'm trying to do through that brief, and through the comments I've added to the document and the questions I'll hopefully be asking you, is to get you to be as specific as possible about the users you're after and about the questions you're trying to answer, as detailed as we can be.
B
What are the key kinds of information that we want to get out of this? It doesn't mean it's everything that you're going to want ultimately, but I'm kind of looking for: where do we start? What are the most important things for us to get? Because if I can understand that stuff, then we can craft the research exercise to be as targeted as possible and as fast as possible, and then you can kind of move forward. If we were to do something at the extreme where it was too broad...
B
It can just be very difficult to do, quite frankly, because then you're like: oh my gosh, how many people do we talk to? What do we ask about? What do we do? And then you get all this information, and it's a lot to deal with, right. I'm just trying to be as focused as we can, and it could be that it's not super narrow. That's okay; we'd just want to kind of define that. Yeah.
A
B
The reason I do that is: if I can figure out the targeted purpose... and another way that I think about it is the kind of deliverable at the end, like, what would you want to have in hand when we're done? Then I can help you think through: who do we talk to? What do those interviews look like?
B
The way I describe it is: start at the end, and then it's easier to plan backwards, because I would design the interview in a very particular way depending on what we want at the end. An example: if we wanted a detailed process of somebody's workflow, then the interview that I would conduct is very particular to that. What do you do? Show me what you do. Then what do you do? And then what do you do?
A
I think that makes a lot of sense. The research we've done so far: we've been asking users how they build and push and delete images, and Ian's captured a lot of those notes on how people are doing that. We have somewhat of a process now, and in finding that process, we've realized that a lot of what people need in order to move through it, we're not helping them with. They could do it; they could follow our documentation.
A
So I think for me now, with this research brief, it would be to determine... maybe it's not so much about the priming, and you could disagree with me here, maybe it's not so much about the process. It's more about: is this the data that you need? We've identified what the process is, and now we want to see: are we actually helping you to solve your problem more efficiently? Are we helping you move through the application more smoothly? That's something maybe that I would like to focus on.
C
Yeah, no, that sounds exactly right. The only thing I would add to that is: their processes are kind of defined in other parts of our application, but for packages we're kind of a reference source. So figuring out how to connect users to the data that's actually relevant to them is what I'd really like to focus on. That's kind of what Tim said, but that's the mindset I'm walking into this with. Yeah.
A
We
need
to
figure
out
which
information
is
valuable
and
then
how
do
you
present
that
to
the
user
when
they
need
to
see
it
and
then
in
the
future?
Are
there
any
like?
How
can
we
you
know?
The
good
lab
way
is
to
you
know,
make
decisions
for
the
user
based
on
what
we
know
is
like
common
paths,
but
we
I
think
just
first
identifying
that
data
and
then
figuring
out
how
to
present
it
would
be
really
valuable.
Ok,.
A
The pipeline will get kicked off and a new Docker image will be created for that specific branch, so they're creating many images. As opposed to, say, Docker Hub, where you're looking for something specific, and then you're going to iterate on it and maybe push it back, in this case they're creating many images and many tags, and it's all tied to their source code and their pipelines, as well as the images and tags. So there's additional data that we could be surfacing; that's sort of where we came up with it.
A
But
the
data
model
was
looking
at
those
different
at
those
different
pieces.
Like
what
information
would
be
useful
from
their
source
code
well
be
really
helpful
to
know.
The
latest
commit
message,
maybe
or
be
really
helpful
to
know
what
branch
a
specific
image
was
pushed
from
or
it'd
be
really
helpful
to
know
which
pipeline
it
was
generated
from
so
bringing
together
the
the
data
that
get
lab
has
to
offer
and
in
these
different
processes
and
and
and
tying
it
to
a
given
image.
Okay,.
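The per-branch build flow described above can be sketched as a minimal `.gitlab-ci.yml` fragment. This is a hypothetical illustration, not something from the conversation: the job name and tag scheme are assumptions, while `CI_REGISTRY_IMAGE`, `CI_COMMIT_REF_SLUG`, and `CI_COMMIT_SHA` are GitLab CI's predefined variables, which is what ties each registry image back to its branch, commit, and pipeline.

```yaml
# Hypothetical sketch: every push builds one image for that branch,
# tagged so the registry entry stays traceable to branch + commit.
build-image:
  stage: build
  image: docker:latest
  services:
    - docker:dind
  script:
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG-$CI_COMMIT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG-$CI_COMMIT_SHA"
```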
B
A
They actually built their own UI separately: they took the GitLab API and the Docker API and built their own version of a container registry, which showed them the Docker label info, the commit, and which pipeline it came from. And they actually showed us: this would be useful. You could see, here, I could sort things, I could find which branch it was tied to. So in that case it was cool to see our users, and how technical they are, that they work around these problems. Yeah.
C
A
In other cases, people are just doing things manually: they're going in and finding a specific image, they're saying, okay, I think it was tied to this commit, and now I need to copy this little, you know, 26-character commit sha and put that into a file somewhere else. So by forcing users to do that, we're sort of creating these risky manual workarounds, which is not ideal. Yeah, okay.
B
One
of
the
things
that
would
I'm
wondering
if
you've
figured
out
or
that
we
need
we
do
need
to
track
is
this
idea
of
which
of
these
is
most
important.
So
there
are
a
lot
of
times
when
I
do
interviews
to
be
like.
Oh,
oh
yeah,
that
would
be
helpful
or
that
would
be
convenient
or
but
you
want
to
avoid
kind
of
a
free,
cookie
problem
right
like
would
you
like?
We
cook
you
like
I,
would
told
them.
The
other
will
be
done.
B
A
I'm really sensitive to that, because in my previous job, you know, I was at an early-phase startup, and we went out and did our initial research, and we heard a lot of that: oh, this is cool, this is interesting. And I'm like: yeah, that's great, we're building the right thing. And then it wasn't until we launched version two or three, and we were in a meeting doing some research, that one of the engineers said: hold on a second.
A
Well, I have 10 projects and 20 people, and on every commit to each of those projects new images are being built. So he's like: hundreds per week, maybe as much as thousands, depending on how busy we were. And we're not giving those users any ability to understand how those images are tied to specific pipelines or branches.
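A quick back-of-envelope check on that "hundreds per week" figure. The commits-per-developer-per-day number below is an assumption for illustration; the 20 developers come from the conversation:

```python
# Rough sketch of the image-volume estimate from the conversation.
# commits_per_dev_per_day is an assumed figure, not from the call.
developers = 20
commits_per_dev_per_day = 3   # assumption
workdays_per_week = 5

# One image is built per commit, per the workflow described above.
images_per_week = developers * commits_per_dev_per_day * workdays_per_week
print(images_per_week)  # 300, comfortably "hundreds per week"
```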
A
So
I
think
if
we
focused
on
the
CI
integration
that
would
probably
be
the
most
valuable
and
from
agate
lab
business
perspective,
it
might
make
the
most
sense
to
because
if
we
can
get
our
users
to
use,
get
lab
CI,
that's
a
very
sticky
future
if
they're,
not
using
CI
and
they're.
Just
using
the
container
registry
well
at
any
point,
they
could
just
switch
to
dock
or
Quay,
but
once
they
have,
you
know
this
process
where
it's
like.
Well
now.
This
is
inherently
tied
to
how
we
build
our
code
and
deploy
it.
A
Things
I'm
trying
I'm
trying
to
yeah,
really
see
that
as
one
that's
our
distinct
advantage
and
and
twos,
knowing
that
that's
where
I
want
to
get
is
that
I
want
to
have
a
really
tight
integration
for
a
product
with
with
CI,
so
that
that
is
how
I'm
seeing
the
the
priority
and
then
I
think
the
specific
sort
of
source
control
things
like
what
branch
was
this
or
what?
What's
the
commit
message?
Those
are
the
things
that
are
important
but
maybe
are
less
important
than
the
CI
integration.
C
B
A
...of how you're filtering stuff. And then the other thing I would say is the core artifacts. So when we think about what is involved in building an image: there's a Dockerfile, there's probably a requirements.txt file, and right now we don't help the user get to those at all. So if, for example, I'm looking at the container registry UI, I see a list of images, but I actually don't see which Dockerfile created those images, or which requirements file.
A
It's
using
that
information
is
also
important
because
that's
what's
actually
they're
using
to
generate
that
information,
so
I
see
it
as
the
metadata
around
CI
pipelines,
because
that's
sticky
and
that's
because
the
how
the
majority
of
images
are
getting
built
and
then
giving
them
easy
access
to
the
core
artifacts
that
are
responsible
for
building
those
images
and
not
forcing
the
user
to
go
through
back.
Okay.
A
Well
now,
I'm,
looking
at
this
like
Ian
just
mentioned
I'm
looking
at
the
this
page,
I
want
to
verify
something,
and
now,
if
I
change
it
I
have
to
go
back
to
my
main
project.
I
have
to
find
the
doctor.
If
I
have
to
make
a
change,
I
have
to
submit
a
merge
request
and
then
you
know
start
the
whole
process
again.
So
I
think
having
some
concept
of
being
able
to
surface
that
information,
all
in
one
place
would
be
really
would
be
really
useful.
Okay,.
B
A
Yeah, exactly. That's a good question. I think we can hope to answer that better by talking to users, but so far my feeling has been: because they're going to that page to see data about a given image, it's important that we actually show them the actual data that's related to that image. So CI is first, because that's how the majority of images are being built, okay, and then I think about the user experience of how they're moving through to actually update those things. Yeah.
B
So
some
signals
that
I
would
look
for
when
I
do
the
research
on
this
kind
of
stuff.
So
imagine
that
we
map
out
the
whole
workflow,
which
is
kind
of
the
way
that
I
would
usually
do.
It
is
to
have
a
way
to
to
create
that
map
of
all
the
steps
and
who's
doing
what
and
what
are
they
trying
to
do
at
every
step
through
this?
What
information
they
need
at
every
stage?
Where
do
they
go
to
get
it,
then?
What
I
would
do
is
look
at
that
and
see
like
okay.
B
A
B
The way that I do it... let me pull up the template for this. If you were to search online for "UX map" or "journey maps" or stuff like that, you can see images online of a bazillion different ways of doing it, and usually they have slightly different goals for the deliverables. This is the way that I do it, partly because it's been very useful to me and partly because if I had to do it more visually, it would be a big problem.
B
Okay,
can
you
see
that
sheep,
oh
yeah,
this
looks
great
okay,
so
this
is
a
standard
way
that
I
do
it.
This
is
like
the
most
simplified
version,
so
you
can
see
on
the
left.
You
know
kind
of
what
the
the
key
things
are
then,
looking
at
like
who's
who's
doing
this,
because
sometimes
it's
multiple
people
involved,
you
know,
sharing
things
you're
in
you
know
depends
on
what
the
process
is
at
this
very
specific
step.
What's
the
person
trying
to
do,
how
are
they
doing
it
to
the
actions?
B
What
are
the
questions
that
they're
trying
to
answer,
and
so
these
things
can
be
somewhat
micro?
If,
but,
for
example,
your
what
you
described
is
somebody
going
to
this
page
to
look
at
they're
trying
to
validate
something
there's
there
are
very
specific
questions
in
their
head
that
they're
checking
right.
Does
it
have
as
it
was
it
this
artifact
that
it
go
through?
Did
it
whatever
so
I
think
in
terms
of
questions,
because
the
data
that
you're
trying
to
figure
out
is
the
answers
to
the
bat
right?
B
A
B
A
If we filled something like this out, I think that would help. Ian and I could map out more concretely what the process is, because right now it's mostly in my head and we haven't really mapped it out thoroughly. And maybe there are two versions of this, depending on who the user is. But doing that, then we can validate it with our user research, right? Like, we could say, on step one...
A
You
know,
and
we
could
ask
them
okay
step
one
you're
building
the
image,
and
then
we
can
ask
the
specific
questions
like
how
do
you?
When
do
you
update
the
dockerfile?
Where
are
you
expecting
that
soccer
file
to
be
pushed
to,
or
things
like
that
once
once
would,
is
that
the
best
way
to
follow
through
a.
B
A
B
A
Well, I think that we could take a stab at it and validate it with further research. My concern is that almost all of the users that we spoke to, except for one, are people that we uncovered from the GitLab issues, just reaching out. One of the advantages of GitLab is that our users are very active, and it's nice, they're very open to talking to us and helping. But we may not have gotten all of the use cases, because it's only people that are willing to say yeah...
C
A
...following along in the issues and willing to do user research. It'd be great to reach out to customers specifically, and I think it would be helpful to think through the customer profiles and user segments a bit more, at least for our stage. So I think we can expand on that. But if we went in with a plan like this and then validated it: do you think that's risky, or is it better to have sort of what we think the process is? So I...
B
Guess
there
are
a
couple
things
there,
so
one
is
in
terms
of
whom
you're
recruiting
and
talking
to
it.
All
all
of
this
hinges
on
making
sure
that
you're
talking
to
the
right
people,
it's
super
important
that
you're
doing
that
so
and
the
right
people
means
you
have
to
decide
and
we
can
talk
through
it.
B
I'm
like
who's
the
bullseye
for
you
of
the
the
people
that
are
most
important
to
you
and
I
know
like
everyone's
our
favorites
yeah,
but
as
a
business
right,
there's
some
group
that
is
most
important
or
maybe
most
important
to
help.
First,
we
could
say
it's
not
the
we're
not
going
to
help
other
people
and
but
presumably
right
now,
I'm
an
investor,
so
there's
some
core
for
your
business
that
you
guys
could
describe
like.
Oh
it's
it's!
A
Is
the
developer
no
offense
to
any
of
our
users
that
are
watching
this
they're
a
little
bit
selfish
they're,
not
thinking
about
how
to
clean
up
the
registry
they're,
not
thinking
about
how
to
like
really
scale
this
to
to
support
like
we
mentioned
thousands
of
images,
whereas,
like
the
system
administrator,
they
do
have
the
responsibility
of
thinking
about
how
the
container
registry
works
across
projects
or
across
multiple
groups,
and
so
they're
gonna
have
the
most
complicated
use
cases
amongst
those
system.
Administrators
and
I
think
this
is
really
you're.
A
Pushing
me
in
this
way,
and
it's
good
to
think
about
is
probably
teams
that
are
doing
something
with
micro
services,
so
they're,
building
a
bunch
of
different
images
for
different
branches
and
and
different
repositories
and
they're
building
many
images.
So
maybe
we
could
say,
organizations
that
are
using
docker
containers
to
support
micro
services
that
are
using
gitlab,
CI
and
CD
to
build
those
images
and
deploy
them,
and
maybe
a
good
criterias.
Are
they
building
more
than
15
images
per
week
and
I
could
be
flexible
on
that
number?
A
It
could
be
10
or
it
could
be
20,
but
saying
15,
because
they're
going
to
encounter
these
problems
of
well,
what
I
don't
understand
where
this
image
came
from?
What
what
pipeline
was
it
tied
to
where
who
built
it?
What's
the
base
docker
image,
they're
gonna
have
the
most
questions,
whereas
one
of
my
friends
is
a
developer
and
I
was
asking
about
this
he's
like
well,
we
just
have
one
image
for
a
model,
a
monolith
repository
like.
Why
is
this
a
problem?
B
I'd
be
very
specific
about
the
numbers
that
you're
after,
like
what
and
like
you
said
like.
Maybe
it's
15,
maybe
it's
20,
but
you
should
write
down
the
recruiting
criterion
very
specifically
because
when
we
screen
them
you'll
need
to
know
like
is
this
person
to
in
or
out
right,
not
the
end
like
once
you
decide
not
B,
not
let
the
other
ones
in
because
it'll
just
make
it
confuse
it,
and
when
you
just
describe
the
group
you
describe
its
I
just
want
to
make
sure.
B
A
I don't know... yes, because I think they're most likely to encounter the problem, and I think if we can solve their problem, it will only help everyone else. If we could solve the problem for those users... the problem, I guess, thinking it through, is that, you know, I'm being selfish and thinking about my stage, thinking: well, for my stage those are the most important users. And I suppose they are the moat, in that sense, for my stage and how it rolls up to GitLab.
A
"Let's just use Amazon's container registry, and we'll forget about using GitLab CI; we'll switch to Jenkins." So I think the risk is: for the small case, for my friend's use case, where he has one image, he's fine to use the container registry, no problem. But if we fail these bigger, more complicated cases, they're going to take all of those images being built, and all of that storage spend, and all those CI pipelines, to one of our competitors. So I do think that could be the most important. Okay.
B
...research, right, is to be very clear about who the targets are and what's most important to our business, so that we're solving for them. With all due respect to all the other GitLab users, who are important to us, understanding that solving these problems will kind of help resolve those as well, right. But for the research, we want to make sure we're talking to the people who are really the key people you want to help, because it could be...
B
C
For sure. So I've been doing some exploration on my own, just with some engineering friends, and I've kind of run into that pattern pretty consistently: the system admins, the ones that have these really complex repositories, are dealing with these complicated issues, and that permeates out. So even if there are engineers that would end up utilizing these features, it's because the system admin already set it up for them and they're checking it, or it doesn't exist.
B
C
B
...you might have to go talk to her about that. Because you could imagine... I mean, again, what you're describing makes sense, that these are the people to go after, to focus on first. But if somebody says, for the business it's this other group, then we should figure out what their pain points are and go solve them. Or if this is some weird fringe thing, like: yeah, it's true, but there aren't that many people who have this problem, or who fit into this group in the Venn diagram.
A
I think I saw the note in the research brief to expand on that, and I think, you know, we've dialed in a little bit more. So I think when we're updating it, we could question those assumptions a little bit more, because I do want to make sure that we are thinking about GitLab as a whole and not just our stage. And, as I said, my instinct is to just say yeah.
A
B
Usually what I do is create a screener, by which I mean a very simple questionnaire, where I'm asking people in an open-ended way. So I'm not saying: oh, do you do at least 15 images a week? I'm saying: so, in a week, how many images do you do? Right? And then we can sift through and say: oh, they're doing four, or they're doing 100, and then we can pick. Right?
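The sifting step B describes, filtering the open-ended screener answers against the recruiting criterion, can be sketched like this. The more-than-15-images-per-week threshold comes from the conversation; the respondent IDs and answers are made up for illustration:

```python
# Hypothetical screener responses: open-ended answers to
# "In a week, how many images do you build?"
responses = {
    "resp_01": 4,    # out: below the threshold
    "resp_02": 100,  # in
    "resp_03": 15,   # out: criterion is *more than* 15
    "resp_04": 40,   # in
}

THRESHOLD = 15  # "building more than 15 images per week", per the discussion

# Keep only respondents who meet the recruiting criterion.
qualified = [rid for rid, images in responses.items() if images > THRESHOLD]
print(qualified)  # ['resp_02', 'resp_04']
```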
B
A
Those users... how do we find users? Yeah, I've used Respondent before; I had a lot of success with that.
A
At my last company we were building a data science platform, and we were using that to filter: show me data scientists that work at companies with more than, you know, a thousand people, that are familiar with these tools. And then we sent them a brief. I probably wasn't so careful about it being open-ended and not giving them the right answers, but we screened them, and there were some cases where we said: well, this person doesn't exactly meet the criteria, but they're really interesting because...
B
A
We have our network of engineers here, and some of them are heavy users of the product, so we could definitely talk to them and then connect to their networks. And there might be... I'd have to talk to the research team... there might be ways that they do flash surveys via email and things like that as well, that we might be able to leverage. Yeah.
B
So
I
would
definitely
coordinate
with
the
research
team
because
I'm
guessing
that
they
have
systems
for
doing
this
and
a
process
and
yeah,
because
for
a
few
reasons
so
one
is
they've,
probably
figured
some
stuff
out.
They
have
some
good
techniques
on
the
other.
Is
you
want
to
coordinate
with
them?
Because
you
want
to
make
sure
across
the
organization
you're
not
all
talking
to
the
same
people
by
accident?
B
...longer, maybe an hour. Yeah, I mean, what you're describing to me, maybe it's just because it's unfamiliar to me, sounds like a complicated process. So to dig through at the level of detail that I would want, this is easily an hour. Okay. To recruit the kind of people that you're talking about, to get them to spend the time, I would probably offer... you could experiment a little, but I would guess it would take a hundred fifty bucks for them to spend an hour with you.
B
A
No, that's what we did it for, for the data scientists as well: 150 bucks. Yeah, I mean, if you just need to talk to somebody who's more of a consumer, like off the street, not a specialist, you can try getting away with less. But the incentives are a place where I've learned to not be penny-wise and pound-foolish.
B
So
what
I
mean
by
that
is,
if
you
you
can
imagine
if
you
offer
people
like
a
$25
Starbucks
card
like
a
sysadmin,
is
not
gonna
like
to
blow
you
off
for
anything
else.
That
happens
in
their
day,
and
so
you
want
to
welcome
to
show
up,
because
it's
you're
all
organized
you're
ready
to
go
it's
a
waste
of
its
time
and
you
want
them
to
take
it
seriously
and
and
and
just
some
respect
to
them,
like
you're,
an
expert
on
your
time
right.
A
That makes sense. Do you think that we should phase this? Let's say that we start, and we're going through, validating that the process that we've come up with, and the questions that people are asking at each step of that process, are right. Is the next phase of that being able to then say: okay, well, now we've created this, you know, skeleton UI, like a clickable prototype; show us how you would do this. Is this...
A
...what you're looking for? Is that the next step of the research: to start with validating that, okay, what we thought is right, and then next to say, is this how you would actually use it? Yeah, so...
B
We
kind
of
go
through
a
process
of
narrowing
anyway
so
validating
what
I
would
this
first
step
of
like
validating
the
questions?
It
sounds
like
you
guys
should
map
out
what
you
think
you
know
so
far
and
do
it
in
like
try
to
do
it
in
excruciating
detail
you're
better
off
having
more
detail
than
less,
because
that's
where
the
that's,
where
the
value
is
in,
like
the
nitty-gritty
yeah.
Exactly
it's
a
great
point,
because.
B
This
and
like
oh,
it's
a
six
step
thing.
We
do
this
and
they
do
that.
You're
like
okay.
This
is
totally
not
useful
for
me
to
design
anything,
and
so
it's
about
getting
to
that
nitty-gritty
of,
like
oh
there's,
this
one
piece
of
information
that
they
need
that
they
then
go.
Look
they
dig
up
and
they
find
this
far.
B
They
have
to
ask
a
friend
like
that's
the
stuff,
that's
really
valuable
to
know
so
anyway,
write
out
as
much
as
you
can
for
what
you
know
and
then,
when
you,
when
you
say
validate
it's
not
that
I
would
present
that
to
them
right.
So
what
I
would
do
is
then
I
would
conduct
a
fresh
interview
with
each
person
in
like.
A
Yeah, that makes sense. Rather than just showing them, and leading, we could lead the questions into each of those steps. Like, one step could be: they log on to GitLab. We don't necessarily need to ask them, did you just log on to GitLab? We're just saying: okay, we saw they logged on, they navigated to a project. Okay, right.
B
Because
you'll
map
this
hell
out-
and
it
might
be
that
you
realize,
like
oh
they're-
like
these
three
parts-
that's
actually
where
we're
gonna.
We
think
that
that's
pretty
problematic.
We
want
to
go
through
everything
and
make
sure
that
there's
not
something
else,
but
then
we're
gonna
dig
into
those,
and
we
know
already
that.
That's
our
focus.
We
might.
We
wouldn't
say
that
necessarily
in
the
interview,
because
I
don't
want
to
give
it
away
again,
agree:
oh
yeah,
okay
and
then,
and
then
mapping
this
thing
out
and
those
Keith's.
B
So
as
I
shared
you
so
the
bottom
row
of
that
map
that
I
showed
you
was
like
pain
points.
So
at
each
of
these
steps
that
I'm
documenting
like
what
they
have
trouble
with
something
that
took
too
much
time,
I'm,
something
that
was
annoying
things
that
was
a
her
source
or
whatever
and
that's
the
stuff
I'm
looking
for
there.
That
is
like,
presumably,
if
I
can
fix
that
I
could
eliminate
those
things
that
are
problems
can.
A
C
For sure, no, this is wonderful. I was going to jump on Slack afterwards and ping you, but those baselines that I've been talking about, the things that we need to put together, a third of it is basically this experience map that Michael's talking about: laying out every step. You know, they clicked on this link to get to this page; when they were on this page, they looked at this piece of data; while they were looking at it...
A
C
B
A
So it sounds like, as a follow-up, what we need to do is go through the research brief again. There are some questions in there that you commented on that I think we can add more depth and more specific details to, especially when we think about who are the users that we want to talk to and recruit for the interviews. And then, once we have that, I think we can run it by the GitLab head researcher to say: okay, how do we recruit these people?
A
These
are
the
specific
people
that
we're
looking
for
here's
the
timeframe
that
we'd
liked
it
to
interview
them
and
separately.
We
can
create
the
go
through
and
create
the
experience
map,
what's
the
exact
process
in
a
very
detailed
manner,
and
then
we
can
from
that
once
we
have.
That
is
the
idea,
then,
to
set
up
our
questions
and
our
actual
like
how
are
we
gonna,
add
conduct
the
interview
next
after
that
right.
B
And
so
the
interview
you
can
imagine
if
in
mine,
so
this
is
what
I
was
talking
about
before
the
start
at
the
end,
if,
if
in
my
head
is
the
researcher
I
know
when
I'm
done
I
want
to
be
able
to
fill
out
this
map,
then
that
interview
guide
is
very
deliberately
going
through,
like
let's
take
me
through
a
project
right
like
so.
Where
does
it
start
and
then
what's
in?
B
How
would
you
know
somebody's
built
their
own
UI,
like
Wow,
okay,
show
me
what
you
guys
did,
because
that
shows
that
this
was
a
big
pain
in
the
butt.
If
you
took
that
effort
and
did
that
yeah
exactly
fix
it
or
whatever,
or
we
have
these
post-its
on
my
my
screens
that
are
showing
me
this
or
that
and
whatever
so
those
things,
the
macros
that
they've
built
whatever
the
things
are
like
that
they're
their
crutches,
you
want
to
dig
into
those
as
much
as
possible
to
find
out
like
what
is
that?
What
you?
A
So
I
think
my
and
in
can
chime
in
too,
but
I
think
my
goal
would
be
to
be
able
to
take
a
first
shot
at
that
process
map
and
then
to
validate
that
we're
seeing
that
right
that
we
identified
the
problems
and
then
from
there
being
able
to
generate
a
clickable
prototype
that
we
could
do
another
round
and
and
now
say
like
okay,
are
we
solving
the
problem
like
we've
validated?
What
the
problems
are?
B
And,
depending
on
what
the
the
remaining
questions
are
at
the
end
of
this,
that
will
affect
what
that
prototype
looks
like
and
what
I
mean
is
not
just
how
you
design
it,
but
how
clickable
it
is
or
what?
What
are
we
doing?
Are
we
showing
a
scenario
where
it's
a
story,
or
is
it
actually
something
for
them
to
use
and
click-through
like
right
depend
on
what
what
it
is
or
is
it
just
a
plan,
a
new
landing
page
describing
new
features
that
people
like?
Oh
yeah?
B
Yes,
I,
don't
like
this,
because
there
are
times
that
I
do
things
like
that
where
we
present
essentially
a
prototype
of
the
new
product.
It's
that
you're
kind
of
showing
up
for
packaging.
If
you
write
the
landing
page
that
thing
and
then
you
can
compare
it
to
competitors
like
how
does
this
sound
compared
to
these
other
ones?
What's
one
of
the
pros
and
cons,
you
know,
like
oh
I,
like
how
these
people
are
going
to
do
this
thing,
I
like
how
you
guys
are
offering
this
and
whatever
and
that's
another
way.
A
B
So, you can tell, the way I think about this is: what's the question you have to answer right now? The question is: what's their process, and what do they need these days, perhaps? And then we'll figure out what our next question is, and then we'll figure out the right way to answer that. Okay. But we'll kind of be narrowing and building up our confidence along the way.
A
What do you think... when do you think we could reasonably go through this? I think that identifying the respondents and talking to the researcher we could do probably this week or early next week, and we'll just wait for a response from them. But going through the process may take a little bit of time. Do you think we could do it by this time next Thursday, Eli? Can we just reschedule this meeting?
C
I would lean towards two weeks. When you lay out the process, it gets really granular and really fine-grained, and with an immature... not immature, a young product, there's going to be a lot of "oh, I noticed this" and "oh, I noticed that," so the process ends up being a little slower than you think it will be. So I would say...
B
A
...their whole day is focused on that. Because I've been doing it as, like, one interview a day, and it's painful that way: you spend so much time getting ready for each one, and you have to re-get in the groove. That's exactly right. So you have...
B
To
get
really
get
in
the
groove
and
then
to
go,
do
the
analysis,
at
least
for
me,
it
takes
a
lot
more
work
because
I'm,
like
oh
crap,
I,
talked
to
this
person
three
weeks
ago
or
the
week
of
the
day
ago,
from
my
brain
like
what
what
was
it.
But
if
you
do
it
all
in
a
clump
in
a
day
or
two
days,
then
a
lot
of
the
patterns
and
the
things
you
just
will
have
heard
stuff
over
and
over,
and
so
the
big
things
become.
Pretty
obvious.
I
mean
it's.
B
This
is
maybe
it's
less
true
for
this
kind
of
study,
where
it's
a
work
flow.
You
have
to
it's
a
lot
of
detail.
You'll
have
to
go
back
and
map
it,
but
there
will
be
a
lot
of
things
that
will
just
come
through.
You'll
have
seen
like.
Oh
I,
see
it
it's
about
this
part
yeah
like
you'll.
Just
you
just
get
these
big
things
pop
out.
You
know
after
one
two
three
you're,
not
quite
sure
four
or
five
like
oh,
my
god.
Let's
just
go
for
exactly
like.
Don't
do
any
more
interviews,
please
exactly.
B
A
B
A
B
C
B
A
C
A
9:00 or 9:30, whatever works for you works for me. Okay, cool. And then I will work on the experience map. I'm still working on the data model and adding in definitions, which I think will help inform some of that too, and then we'll work on the specific interviewees that we want to recruit and the researcher, and then we'll share that information with you as well. Yeah.
B
A
Sadly, we do not have a lot of metrics; we really are limited there. It's a problem at GitLab right now, actually, but we are working on adding in tracking for a lot of those things. So we can at least think about what metrics we would want to use, and then when we do start tracking them we can use them, yeah.
A
B
C
B
Surveys are another way that we can validate some of these things. This again depends on the questions we have. So another question for your research team is: what are the mechanisms for surveying GitLab users? Okay, and a related question to that is: do you have a way that you're tracking customer satisfaction?
B
Because for some of these things, again, it's a way to just see, like, did this move, right? Are we really affecting true satisfaction, or whatever the key metrics are? And hopefully not NPS. We don't want NPS, that's not satisfaction; okay, different problems. I can explain to you why that is someday, if you want. Yeah, I am wondering why that is, you know, like, NPS. It's basically been debunked, is the short answer.
B
So the standard way that it's used as a measurement is just not very useful for UX or design, in a lot of ways. The easiest thing is I can send you, there's a very good article that Jared Spool wrote that I seem like I'm sharing all the time, and it mirrors stuff that I've heard from our own survey scientists.
B
So an example of this: basically, what you're asking is how likely someone is to recommend something, and your likelihood to recommend something is dependent on a bunch of indirect variables. So, for example, whatever it was, eight years ago your likelihood to recommend Facebook or Gmail to somebody might have been pretty high, right? But now, you might love those products, you might even love them more, but when was the last time you recommended Gmail? The penetration is such that it just doesn't make sense for that, or for certain kinds of products.
B
Or, I don't know, say I'm an oncologist and I use a certain product: how likely am I to recommend it? Well, maybe I don't know somebody else who needs this thing, right? Look, I love it, but I don't know how many other oncologists have this particular need, or something, I don't know. So it's just very different from asking, like, how much do you love it? "Oh, I totally love it." Okay.
A
B
Healthcare and life sciences stuff, so I've worked with oncologists. I did a bunch of things with stroke teams not long ago: how they process a patient who comes in with a stroke, and what does that look like. So I do everything from that kind of stuff to GitLab. I've done a lot of cybersecurity stuff, retail, e-commerce, hardware, transportation. You know, we did a lot of work with companies like Lime.
A
B
That first answer they give you is like a shorthand, and maybe you are then assuming what that means, and maybe it is, maybe it's the same, or maybe it's not, but I wouldn't know, and I would just ask. So it's very common that I'm doing the interview, a team is watching, and they're like, "oh, I didn't realize that's what they meant." Yeah.
A
I could totally see that. So maybe that's a good way of phrasing it: go into the interview and pretend you don't know anything. Because, for me at least, there's such a strong impulse to be like, "I understand what you're talking about, I got what you just said." I think that's a good callout for me, something to try: pretend I don't understand anything, even when I do understand it, right?
B
So I guess, the mindset that I have when I go into it: part of what I just described is a mindset, and part was just my reality, which is that I don't know anything. And I'm always very clear up front, like, I'm not an expert in this; I do a lot of research, but I don't know about oncology, I don't know about, you know, whatever the domain is. And I say that whether I've talked to nobody or I've talked to over 220 people, right, so that it's still fresh.
B
Part of the mindset is that what I'm trying to do is see it through their eyes. So even if I do know something, what I'm trying to do is get them to explain it: how are they experiencing it, or seeing it, or doing it? And that's usually new, right? It's always a little different, because the context is different, their experience is different, the tools they're using are a little different, and so I'm pressing on that.
B
And part of it, this will sound a little weird, but I actually play a different role when I'm doing research, when I'm doing an interview. Before I actually pick up the phone or walk into a room with somebody, I take a moment: I stop, I take a breath, and I smile, and I'm in a completely different frame of mind, which is, I'm fascinated by this thing.
B
That we're going to talk about, and it's just really interesting to me to try to understand how and why you do this, and I'm just, like, super fascinated. It usually takes some amount of caffeine for me to just be really focused on that and super open, and that changes my body language, it changes everything, to the point where there have been times my kids have seen some video of me doing this or whatever, and it's like, "who is that guy?" Yeah.
A
B
And just be super open, like, "oh my gosh, that is the most interesting thing, tell me more about that," as opposed to my normal, like, skeptical, cynical self; my kids don't even recognize it, really. But it changes it even if it's over the phone, right? If I smile, it just changes my tone. So it is a shift of how I'm presenting and how I'm asking questions. That's.