From YouTube: Release PM/PD/UXR Monthly Sync - June 2022
Description
This meeting is a monthly touchpoint between Chris Balane (Product Manager), Rayana Verissimo (Product Design Manager), and Will Leidheiser (UX Researcher) to discuss research within the Release stage group.
A
So I guess we can get started at the top. One of the first things that I put in the agenda was that the interviews have wrapped up. I've heard you, Rayana, talk about the solution validation work for the environments page redesign, so I was just kind of curious when the results and insights would be shared out, and, you know, what are the next steps that are coming out of that work?
B
Yes, good call, thanks for the reminder, because indeed I think this week I started with the prototyping and I totally forgot to share the results. So I'll do that this week; I'll write down the summary and also share it with the team, yes, in Slack and the team meeting. And last time we also talked about actionable insights, right? So there's a couple of issues there already on my plate that relate to that.
B
So I'll add the labels, because I haven't done that yet, and then a follow-up question to you: do you propose anything additional that I can do to share out the results with the team?
A
Yeah, one of the things that I had worked on with Adam, I think sometime last year, was we updated the problem validation and solution validation templates to add a little table at the top, so when the study is complete, it's very easy for people to find.
A
You know, what people actually learned from the study, because we had heard complaints like, you know, "if I go to a research issue, I don't really know what they learned; I have to dig into the comments or something." So we just added a little table at the top that you can easily edit, and we're only looking for just a couple of sentences. I know that you've already included a link to the Dovetail project and other summaries that are linked within that issue.
A
So that's good, but in addition to that, if you want to make a very short video summary just talking through that, and, you know, showing some of the things in real time, you're welcome to do that. But I don't want to put more on your plate; I think the stuff that you've already proposed is fine.
B
Okay, awesome, I'll look into that this week. I'm writing summaries of the main insights from each interview, but yeah, I'll think about the video. I think it's actually a fun but also a nice way to share out the results with more people, so I'll put that in my backlog. But thanks for the heads up, that will be helpful.
C
Thanks, Will. For transparency, I'm just very behind, and so I think I do have a lot of free... I have, not free time, but I have some time today to focus on this, and I will try to get a few of these scheduled in the next week. Okay, yeah, thank you for the offer and thanks for checking in; this is sort of new to me.
A
Yeah, I'll have some time later in the afternoon, because I think you're also, like, Eastern Standard Time, so I think I'm pretty much available after... oh, what's my schedule like... like two. So if you need a quick, short Zoom call or just want to ping me, anything like that is totally fine.
B
Yeah, I got an email, I think today, from a participant: "I haven't received your thank-you gift after participating in the interviews for the environments page." And I actually don't remember whether I submitted the request with Caitlyn or not, because it was one of the periods where I was either filling in the spreadsheet that you and Erica use for the common screener, or Caitlyn's sheet, right, for that particular recruiting request for that issue.
B
So how do I follow up with that, right, so that the person can receive their gift? But also, should I keep using the common screener file, or, yeah, is it okay if I just do it directly with Caitlyn?
A
I think if it involves using participants that we get through the common screener, I would put it in that sheet, just so we can track people across those different studies, because they're all essentially coming from one pool of people. But I would also, in terms of your original question, at-mention Caitlyn and just tell her, like, "hey, these are the names of the people, or their email addresses, and I'm not sure if they were paid."
A
"Could you just look into it?" We're using a relatively new system, so she should be able to find that and confirm whether those people were actually paid in the first place, and whether that payment went through successfully.
B
Will, thank you so much. And then a follow-up question to that: there was one participant, I think the first person I talked to, who wasn't really helpful. Not that they were not GitLab users, but for, you know, the particular topic, they did not have experience with GitLab deployments and releases, and they were not involved with releasing in GitLab. How do I give that feedback about the particular person, so that when they are picked next time, yeah, they are not, you know, selected for this particular type of study or type of questions?
A
I think within the recruiting issue, which should be confidential, you could just mention the name of that individual and tag Caitlyn and Erica, so that either they can drop that person altogether from the Qualtrics data or they can flag them in some way. I think there may be some way to do that, so that other people don't recruit them and run into the same issue.
B
Okay, sounds good, I'll do that. And after I do that, I'll share that feedback with my team, because there were some instances in the past where they also felt like a participant wasn't really helpful, or that they were all over the place. So I'll remind them to do that as well after the interview, and flag that for the team.
A
So for the next thing, what we're currently working on: I appreciate you all going through the Mural activity. The next part, or one of the next parts that I'm working on, is creating that UX sandbox that we talked a little bit about in the Release team meeting yesterday, and one of the other researchers created a handbook for that.
A
So my plan is to follow the same set of steps: create a sample project, and then I'll need to generate fake data. I have looked into what type of fake data will be generated, and it's a lot of just kind of basic stuff, like issues and merge requests; it's not stuff that seems to be applicable to our situation, like environments.
A
You know, I'd love help adding fake deployment-related data, so we could have users go through various tasks with that information.
C
Yeah, definitely feel free to reach out. I think what would be helpful, and I'm sure you'll have this list already, is, like, what are the pieces, the things that you need in place, like a list of deployments or releases or whatever it is, and then, yeah, I'm sure, yeah, between...
A
Yeah, you kind of said the same things; just looking at the doc, basically what Chris was just saying. So to answer your point, Chris: it's really just going to depend upon what tasks we end up using in this study.
A
I was looking at the Mural, and there are 31 different tasks that have been proposed across you and Rayana, so we won't be able to go through every task in the study, I would imagine, just because, as I added a little note for context here, the other researcher who did the same type of study in Create went through 20 tasks, and it took two hours per participant, and that was with 20 participants.
A
So it's going to be a pretty big time commitment and everything, so I'll just have to whittle down what tasks we go through. I may set up a separate conversation once I've had a chance to look at all those different tasks, and then we'll figure out, you know, the roughly 20 or so we'll want to go with, and then I can give you and the team a better idea of, you know, "here's the set of information I'll need for those tasks."
B
Because there are a couple of scenarios and tasks that can only be set up via API today, because we don't have the UI, we can highlight those in the Mural board and just say, like, "this is a backend-only task, right, or scenario." I also have an idea, but to Chris's point, yeah, like, if the scenario or the task is managing deployments or setting up a deploy pipeline, etc.
B
I think it's good to have those first, so that the team can set up the particular UI and a particular test project, with not only relevant data but meeting the criteria, right, that we want to validate. Yeah, and I can also help out if you need, Will. Okay, please reach out to the team, but also let me know; there are a couple of things I think I might be able to help you with. Okay.
C
I think this is just also great, because this is a good learning exercise for all of us too. So I think we'll learn a lot by trying to generate this, like, all the pieces that we need to be able to showcase the different features and the tasks that are done in the product. So I think this is awesome, and, yeah, same, I'm happy to help as well, Will, when you get to that point.
A
Yeah, and to piggyback on what you were saying just a moment ago: I think it would be good to know which ones are API-based, and we may want to have that discussion in our next chat. Like, are those things out of scope, or do we want to focus on the ones that are just within the GitLab UI? That decision doesn't have to be made right now, but it's maybe something to think about before we have that chat.
B
Okay, I'll look at the file after our call, and then I'll just add something on the sticky notes to indicate that this is an API-only test, for the ones I know, you know? Cool.
A
And did you talk about this point as well, Rayana? I can't remember.
B
You know, we also have some people that might have more or less availability, so Nicole might be able to tell you who is the best person to support you with this, with the setup.
A
You know, review the tasks, the recruiting criteria, really nail those down, and then I'll want to check in with you after we've figured out those things, just from time to time, with more specific questions about the tasks that we've decided upon. And I've included kind of a list of example questions that, you know, I'll be asking about, like, "what's the happy path to do this task from start to finish?", "what are the different ways that they can complete the task?", because one of the main aspects of the usability benchmarking study is to determine whether they've completed it or not. So we'll have to know, like, okay, when they do this, that's a success, but if they do it this other way, that's maybe a failure or something. So we'll want to determine that.
B
To voice my comment: you have a new product designer joining soon, right? And I'd like to include them in the conversation as well, for the benchmarking for Release. So, yeah, that can be part of their onboarding process, and then, yeah, not only experiencing the benchmarking process itself, but it gives them the opportunity to learn more about Release with us. So I'll let you know when I know more, and when you can include them. Okay.
A
Yeah, and I'll add a little comment about this one in the doc, but I definitely want to keep you, Rayana, involved in this process, even though we will be bringing on someone else, just because, you know, you've got a lot of great knowledge in this space. But it's good to know that we're bringing on another designer.
B
Awesome, please keep me in the loop. It's also fun, it's a fun type of activity, so I'll be happy to continue collaborating with y'all, yeah.
A
I guess just for context, Chris: Rayana and I have a separate UX-related meeting, and I had asked a question about how, right now, in that planning issue, we've got a Jobs to be Done and a CM scorecard, both for environment management, and so I wasn't sure if one of those had to be completed before the other one, just in terms of how to stack them, you know, priority-wise. So Rayana had advised me that the Jobs to be Done should be completed before the CM scorecard.
A
So I reordered that. So my question was: when do you think the Jobs to be Done work will get started?
B
So Matej, Senior Product Designer for Growth, reached out to me today asking if he can help with anything else for the Learning team, and I said yes, we have this Jobs to be Done work that needs to be done. And as for when he's available to help, right, I told him, well, I'm not sure what his schedule looks like, but if he can do it and complete it in 15.2... I'm waiting to hear back from him, but I also told him, if you can do it sooner, the better, right?
B
So if he can just pick it up next week, or start looking at it, so that we deliver before the end of the next milestone, I think that would be awesome. But I'm still not sure what his answer will look like; let's see if he replied.
A
I'll keep writing this in the doc, but basically, when he picks it up, you know, whether that's 15.2 or something else, he can just ping me for feedback and consulting. I've done a couple of Jobs to be Done studies, and I'm currently helping the Configure team, where their designer is leading one for Kubernetes.
B
We do, thank you for that. Anything for this one in particular? Because we have so many insights, that's just my initial gut feeling, that we might not necessarily need to do the interviews, but I'll leave that up to him and to you to decide on the approach, because he also needs to become familiar with the area. But because we have not only talked about but investigated environment management in the past, I think you can leverage the insights that we have in our backlog.
B
I want to add another item here to the agenda. It's about the opportunity canvas for feature flags, the Slack thread that we had last week; to follow up on that, let's talk a little bit.
A
Yeah, so when I was looking at just that one-or-two-sentence description, you know, it was "feature flags solve a well-known problem; we want to understand how our current offering is working for existing customers and what gaps there are for more usage and adoption."
A
I was essentially saying that that kind of reads more like problem validation than solution validation, just because solution validation is when you have a new proposal that you want to get evaluated, and since this is just based on the current experience at this time, and, like, assessing gaps, that pretty much sounds like problem validation.
C
That
makes
sense
well
yeah,
thanks
for
clarifying
linking
linking
also
to
their
handbook
pages.
Yeah
sounds
good.
I
I
think
yeah
and
thanks
hannah
for
raising
this.
I'm
just
thinking
about
I
know.
Are
you
asking
also
like
what
what
should?
How
should
we
move
this
forward,
and
what
are
the
next
steps?
Is
that.
B
Yeah,
just
so
ready
to
see
like
is
there
anything
that
we
need
to
discuss
about
the
methodology
if
it's
a
problem,
a
solution
and
then
yeah,
if
you,
if
you
already
have
some
thoughts
for
some
about
next
steps,
that
will
be,
will
be
interesting
to
hear.
C
Yeah, I think it sounds like, then, to Will's point, maybe it's problem validation. I think what the research could look like is interviewing, or discussing even with internal teams. Like, one, maybe just more color about this: we have the Digital Experience team...
C
I think that's what they're called, in marketing, and they decided to use LaunchDarkly, which is one of our competitors, for their needs. I would love to know why, you know, why they made that decision and whether they considered using GitLab for dogfooding rather than that. So that would be one data point I'd love to know, and then maybe even some internal engineering teams, but then, yeah.
C
Definitely,
you
know,
we
know
if
a
small
set
of
customers
do
use
the
features,
future
flag
features
and
yeah
tables.
Maybe
well
like
what
you're
saying.
Maybe
it's
like
problem
validation
of
what
what
else
is
kind
of
uncovering
what
other
unmet
needs
there
are
for
the
existing.
A
Yeah, that sounds good to me. I think that's a good strategy, to lean on internal customers or internal team members, but if you can also, you know, get access to a couple of customers that don't work inside GitLab, that'd be great too. Cool.
A
Yeah, and then you can also create, like, a recruiting issue; there are templates for both of those, and, yeah, we can just go from there.
B
I'm
very
happy
for
future
flags.
I
already
mentioned
that
last
time,
but
what
surface
that's
excited
also
for
the
for
the
validation
for
to
hear
the
results
right?
Why
or
maybe
that's
decidable?
Why
are
we
not
using
future
flags
today?
So,
let's
see.
B
I don't think so. Just a minor follow-up question, Will: when do you want to have the test project with all the scenarios up and running? Do you have a deadline for that?
A
No, that's a good question. I don't have a deadline at the moment; I'd probably say definitely by the end of the month, but I think the more immediate need is just determining, you know, what tasks and recruiting criteria we have, and then I can give you a more accurate answer.
C
Well, actually, I had a question. I know you did, and by the way, those videos for the active exercises are, like, awesome and super helpful, so thank you for making those. About the tasks, I had this question when I was doing it: are the tasks meant to be, like...
A
It's more of the latter, so we're just assessing, based on all the features and things that are currently baked into GitLab, how good or not certain tasks and aspects of the product are. So it gives a more broad usability assessment, because, right, I think the handbook describes it, like, you know, every quarter.
A
They talk about SUS, and it's like, yeah, you know, SUS is up in the clouds and it tells you GitLab is, like, a little bit better or a little bit worse overall, but besides that it doesn't really tell you much. So this gets very granular and says, like, okay, when they're trying to do this very specific task within, you know, environments, can they do it?
A
Most can, or most can't, and, you know, it takes them this long on average to do it, this many minutes or this many seconds. So that's what this study will really help us understand a little bit more about.
B
And
just
and
just
a
follow-up
thought
on
that
will
jackie
and
I
did
what
was
that
was
a
solution
but
solution,
validation
for
secret
management
with
vault
like
a
year
and
a
half
two
years
ago,
and
it
was
api,
heavy
right
and
it
took
so
long
because
we
had
to
wait
for
the
the
app
to
run
the
tasks
and
for
the
pipelines
to
you
know
to
complete,
etc,
to
run.
B
So, just thinking of that, maybe that's also something for us to keep in mind: if the scenarios that we want to test are focused on those developer personas, you know, it's gonna take a while, and do we want to break it down?
B
Do
you
want
to
have
do
we
want
to
have
like
we
are
validating
the
usability
right
through
the
for
the
interface
versus
the
api
right
and
how
the
those
rate,
because
the
personas
are
also
different
and
the
tasks
are
going
to
be
different
and
back
then,
when
we
did
that
exercise,
no
one
could
complete
they.
If
we,
we
had
to
close
the
the
meeting,
because
it
was
just
taking
so
long
and
people
had
to
configure
xyz
before
being
able
to
complete
the
task
and
it
didn't
have
permissions,
etc.
A
Yeah, and that's why we want to narrow down the tasks, not just because we have a broad list, but, you know, we're only gonna have so long to talk to a participant. I mean, they'll gladly take more of our money for more of their time, but we don't want to overload them with participating in, you know, a three-hour study or something with us. So, yeah, it'll be finding that right balance between, you know, what tasks, like...
B
Yep, I understand, and I agree with that. So I'm going to look into that right after our call, because then it will still be fresh in my mind. Because, when I was doing this exercise in the Mural board, I was like, "this is a persona that would definitely do this through the UI", or "this is the person that would set it up, you know, via the API". And I think, also based on all the limitations that we have in the interface today...
B
A
lot
of
the
tasks
can
be
via
api
anyways,
so
we
got
to
have
an
idea
of
what
parts
of
the
flow
will
kind
of
stop
in
the
from
the
ui
right.
We're
gonna
have
to
we're,
not
gonna,
be
able
to
complete
something
particular
and
then
what
the
api
task
would
look
like.
B
I'm just still thinking about this: we can also ask the engineers, or if our team is able to help, we can ask them to, I don't know, maybe just record their desktop, or ask them to, you know, track how much time it took them to run something specific, so that we kind of have an idea of how complex or how lengthy those tasks can be.
A
Yeah, so if you want, I could, within that epic, write out all the different tasks that we have currently and see if there are people within the team that would be able to specify, like, an estimation of time for each of those, going into our meeting. That might be helpful.
B
Well, because I remember, with Jackie, it was really frustrating, because we couldn't complete any of the interviews and the setup, because we really underestimated how much setup people have to do before coming to the session with us. So I think, in particular because we are looking at Release as a whole, it'd be really interesting to, yeah, try to frame that and see if there's anything we can do to avoid that scenario.
C
Well, FYI, actually, the two of you might have discussed it, but have you discussed the UX roadmap, sort of, I guess, initiative or experiment that some of the UX design team is working on?
C
Yeah, I think, just FYI, maybe just for awareness. Well, actually, do you want to speak to it? Because I think you might be more... you're closer to it.
C
I just wanted to let Will know about the UX roadmap-like projects, yeah. And then, so, I linked an issue, Will, where, I think, a subset of, like, the UX design team want to try this new UX roadmap, like kind of creating one to define, sort of, what are the pieces of work, in terms of design and research, that...
C
You
know
that
the
team
is
thinking
about
and
focusing
on,
and
so
it's
it's
definitely
like
related
to
what
the
product
roadmap
is
and
but
the
idea,
but
I
think
in
q3
it
sounds
like
it's
like
an
okr
to
try
to
to
try
out
this
process
and
see
if
it's,
you
know
how
it
works
and
if
it
makes
sense
for
us
to
adopt
so
anyway,
really
is
more
of
an
fyi
because
we're
all
talking
together,
but
that's
something
high
undeniable
and
the
new
designer
will
will
be
we'll
be
working
on.
A
Yeah, thanks for including the link. I'll check it out later today; I'll just kind of skim through it.
C
Just wanted you to be aware.
B
And
we
want
to
like
one
of
the
goals
for
this.
That's
the
the
secure
team
ran
a
pilot,
and
then
we
are
trying
this
out
with
the
other
design
teams
so
that
well
we
can
test
their
this
framework.
The
process,
but
also
and
then
give
feedback
but
also
see
like,
is
this
helpful
right?
Does
it
bring
more
clarity
to
I
don't
wanna,
I
don't.
I
don't
wanna
call
it
roadmap,
because
we
already
have
the
product
roadmap
and
then
that
doesn't
really
replace
anything
right.
B
I
think
for
us
in
particular,
it
can
be
interesting
for
the
milestone
planning
right
so
that
if
you
can
look
ahead
and
know
that,
for
with
the
themes
and
the
jobs
to
be
done,
that
we
have
a
high
level
of
confidence.
B
The
important
problems
to
solve
are
x,
y
and
z,
and
what's
next
right
or
the
things
that
we
don't
have
validated
yet
and
how
do
they
connect
with
our
business
objectives
and
also,
what's
lacking
for
your
work,
I'm
more
interested
in
this
process
from
that
sense,
rather
than
here
it's
a
roadmap
for
ux
right,
and
I
think
that's
now
that
we're
going
to
have
a
new
product
designer
joining
soon.
B
It's
going
to
be
helpful
to
to
talk
about
that
as
a
as
a
team,
so
chris
and
myself,
and
also
see
if
we
can
in
the
gaps
or
the
the
ux
explaining
based
on
the
problems
that
say
that
we
have
a
medium
high
level
of
confidence
right
and
then
maybe
later
on,
populate
our
handbook
with
that
information
right
what's
next,
and
why?
B
For
for
ux,
not
just
in
terms
of
here's,
an
issue
that
we
are
solving
or
here's
the
problem
that
we
are
closing
or
validating,
but
in
the
longer
term
this
is
the
chunk
of
work
that
ux
is
going
to
be
working
on,
that
contributes
to
the
product
roadmap
or
the
you
know
our
stage
vision,
but
really
I've
never
done
this
before
I'm
curious
and
I
think
it'll
be
an
interesting
exercise
for
us
to
do
so.
Let's
see
yeah
it's
a
it's
a
part
of
a
okay.
I
will,
by
the
way,.
D
Yes, awesome. We still have, like, six months. Is there anything else that we didn't discuss today?
B
Kind of related, sorry, last question: there's the fireside chat tomorrow, right? Are you planning on joining that?
C
And that particular customer is the one I talked to also last week, or, yeah, last week or the week before, so I think it'll be great. I'm going to add some questions about deployment and Release stage features into the doc tomorrow.
C
The only thing is, yeah, I mean, only if there are particular questions you have for them; it's more of, like, a Q&A with them. So, apart from that, I think there's no need to prep.
A
Okay, yeah, I've got it blocked off on my calendar, so I'll sit in.
A
Oh, cool. In that case, we can wrap up, and I'll post the recording to the Release Slack channel a little bit later.