From YouTube: Testing UX / PM Sync 2021-10-21
Description
Gina and James cover how to wrap up current research around JTBD and Opportunity validation and how to hand those items off at the end of the quarter.
A
This is the Testing PM/UX sync for October 21st, 2021. Some follow-ups from last week: pings for the performance testing jobs to be done, and the interview recruiting. How'd it go, what's next, have you put anything out there? How can I help?
B
So I have started a draft of the discussion guide, but I haven't gotten into any of the questions that we actually want to ask yet. So I wanted to see if I could... I was going to ping you and Hayana today, and probably will, to take a look at it, to make sure that this top part at least looks good, and then I'll move on to the questions. And as long as we have the participant profile in there, then I can start recruiting too.
A
Yeah, I think your participant profile is probably going to be quality engineering and/or our SET friends; teammates and front-end developers are probably going to be where I would start, because they generate review apps, and because we can use review apps for that testing and to see how components are changing. That'd be where I would start, and yeah, I'll be happy to take a look at this a little bit later.
A
Cool. And then I wanted to make sure, on our usability testing research, that before we get too far away from it we have solid recommendation issues. Let me back up: a good summary of what we learned, and issues with recommendations for what we need to do next, as far as either validating the jobs to be done through research issues, or here are the bits of functionality we need to implement so that a user can do their job with this category, or the combination of categories between Review Apps and usability testing, and that all of that is lined up and ready to go.
A
I just want to make sure that that's all buttoned up and ready to go, and that we don't leave this kind of lingering "well, we did some research, but nothing really came of it, and we didn't do anything with it." We've invested this time; let's make sure we have a return on that investment one way or the other. And that could be: we didn't learn enough.
A
We need to go out and do broader research on what this category means, and that'd be fine. But let me know how I can help with the creation of those issues, or with walking through and creating a summary. It'd be great to post a summary, even if it's private, of "here's what we learned" to the Unfiltered channel, just kind of recapping what we did. You did that with a UX scorecard.
A
I thought it was really great, so doing something similar, just a recap of what we learned about the category and the jobs that users are doing when they're doing that acceptance testing, would be, I think, a fantastic outcome of this research issue.
B
Okay. I did record a walkthrough, and I just added it to the research issue; if you go to that, it has all of that information at the top. I did a summary of insights in that issue too, so that we're not having to go to Dovetail. And then, yeah, I did a video, but what I have not done yet is create recommendation issues for how to move forward with what we've learned, because all I did was update the jobs to be done.
A
That was already out there, yeah. What we can do from that is create an epic around moving usability testing to Viable and put the issues into it: these are the things we need to get done before we can move the maturity. And we could even attach UX issues, research issues, as part of that. If we wanted to go back and do the scorecard, that could be part of that epic. Okay.
B
Okay, that would be great, and I'll put the dogfooding issue in there too. Okay, that sounds like a good way to move forward. I also haven't heard back from Hayanna about doing the scorecard around this, but I'll make sure to follow up with her. I think it would be fine.
B
But that's my top priority, I would say. Okay, what is this you had pinged me on? Yes, the metrics report. I had answered your question there, but I just wanted to make sure that we were on the same page. I gathered this screenshot from the new merge request widgets, which, it doesn't look like this is how it shows up today with these categories, but I could be wrong.
B
But it will cover our case. It still doesn't enable the "hide changes depending on a set threshold" that they were talking about. I was thinking that maybe we could just add an option for that configuration when you're setting the widget up, instead of showing it in the UI, because we have such a small area to play with. They could just set that as a threshold, and then whatever things will show up depending on it.
A
We implemented a change like this with browser performance testing, so I'm going to assign this to myself and stick it into refinement.
A
I'm going to rewrite the issue a little bit to say: now that we have better usability around this, showing the changes in a logical order instead of just "here's everything and you have to go hunting for it," we can introduce the threshold like we did for browser performance, here's kind of the floor below which I don't care whether I see a change or not. We'll probably do that based on... no, we don't do this on percentage.
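[Editor's note: for context on the browser performance threshold A refers to, here is a minimal, hypothetical sketch of a GitLab CI configuration. The template path and the DEGRADATION_THRESHOLD variable are recalled from GitLab's Browser Performance Testing docs and should be verified against current documentation; the URL is a placeholder.]

```yaml
# Hypothetical sketch: include GitLab's Browser Performance Testing template
# and set a degradation floor so the MR widget ignores small score changes.
include:
  - template: Verify/Browser-Performance.gitlab-ci.yml

browser_performance:
  variables:
    URL: https://staging.example.com   # page to test (placeholder)
    DEGRADATION_THRESHOLD: 5           # ignore score drops smaller than 5 points
```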
A
I feel like I'm not doing a good job verbalizing what I'm trying to say. What I want to do for the comparison is say: here's what the value is right now, the latest; here's the last time we measured it, and that could be yesterday, that could be 90 days ago. It doesn't matter; we're just going to say here's how it's changed.
A
I will, yeah. What I was trying to say is: at the designer's discretion. I can add a comment into the issue itself of "let's compare the latest data point to the last data point and just show that comparison in our design," if that would be helpful.
B
All right, cool, thank you. And then my other question was just asking how your transition is going and if there's anything that I can help you with, because I know you're splitting time right now.
A
No, I think everything is going okay for now. Keeping up on issues, and pinging me if anything is falling through the cracks or you're not getting an answer as fast as you would have expected, even async, would be great. That's my biggest concern: that I'm letting stuff fall through the cracks.
A
Good, all right. On my side, let's see... yeah, I'll share my screen, see if I can pick the right one. I recorded a walkthrough yesterday of how to use Chorus and Slack as a sensing mechanism.
A
My first attempt at that was the wrong Chrome window, where it was just me blabbering over the top of an empty Gmail inbox, because I picked the wrong one when I clicked share screen. So you've got to make sure you get the right screen share. This week I finished up the sales enablement session on Tuesday; that was a good one. And then the testing team transition, and the PE team transition as well, is what I'm working on right now.
A
My next focus, though: I have one more customer journey interview late this morning, and then I'm going to be working on tagging all of those videos' insights, and then probably canceling the interviews that I have scheduled for next week if I feel like we have the insights we want out of this around the pain points in setting up artifacts reports, JUnit or Cobertura, which is specifically what we're digging into. So that's my next priority.
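[Editor's note: for reference, the "artifacts reports" being researched look roughly like the sketch below in a GitLab CI job; the script command and file paths are placeholders for illustration.]

```yaml
# Minimal sketch of a GitLab CI job publishing test and coverage reports.
test:
  script:
    - bundle exec rspec                 # run the test suite (example command)
  artifacts:
    reports:
      junit: rspec-report.xml           # JUnit results surfaced in the MR widget
      cobertura: coverage/cobertura.xml # Cobertura coverage visualization
```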
A
Towards that, I want to make sure that Scott and I hand off any research that we're wrapping up this quarter to you and the new product manager coming in. He's found some really great insights around when there's a pain point in pipeline execution time, which has been really interesting. We went into this thinking...
A
It was all around testing and flaky tests: you have flaky tests and you just have to rerun the thing. And we talked to some folks who have that problem, but seeing their demographics and circumstances has been really interesting, and there being pain points not associated with testing but associated with having to re-run a pipeline, and understanding where it gets painful for a user versus expected time, because everybody we've talked to has a much shorter pipeline run time than we do. They're talking about...
A
"Oh, my pipeline's running four minutes," and we're just like, "what, our pipeline's running 45 minutes. That's amazing, you must be so happy." And they're like, "it could be better." Oh my goodness, you don't have a monolith, that's what's going on here! So that's been really interesting. It'll be good to hand that off, and we're focused, not on saying that we have the wrong persona from our screener, but on understanding the opportunity space.
A
So we might need to go back and do even another round of problem validation, or split it up and then deep dive into more of who we think is going to have that problem, to understand how big the space is before we start getting into solutions. But it's been really interesting. We'll try to get that wrapped up this quarter, or, if it bleeds into next quarter, we'll get it wrapped up and handed off then.
A
I want to make sure that we have solid next steps and issues, and that there's a summary published somewhere, either on Unfiltered or at least in the issue, kind of like what you just did with the jobs to be done stuff. Make sure we didn't do research and then have nothing come from it; that we have solid next steps that someone else could pick up and run with to move it forward through the validation cycle.
A
All right. So those are the things that I will put into the prioritization of UX research projects for Verify and Package; I think they're both already in there. Is there anything else that we need to add in here from what you're planning on working on next quarter?
B
Okay. But other than that, everything else I can think of would really be considered more like solution validation. Okay, cool.
A
All right, awesome. I'm going to keep this meeting on the calendar for next week. Just thinking about the timing of this: next week for sure, the week after that... and Sam, sorry, Sam and I just booked our recurring one-on-one, but he's out for about the next three weeks, so we're not going to have it.
A
What is that? Oh, I booked over the top of...
A
No, it gets even weirder than it would be for the normal split, because Europe and the US change times at different times of the year. Now you throw in Arizona, which doesn't change times, and it all just gets messy. Back to it: I'm going to keep this on the calendar for next week and the week after that, and I think I will add our new product manager to that one-on-one.
A
I believe that they're East Coast based, so it'll be a friendly time for them as far as time zones go. And then I'm going to cancel this call and let you and them set up a recurring call at whatever pace you need it to be. But I think...
A
All right, awesome. So for now I'm going to do nothing, but I'm going to add to the agenda for two weeks from now: let's talk about how many more of these James needs to be a part of, and when we could just cancel this recurring set.