From YouTube: Testing UX / PM Sync 2021-10-28
Description
Gina and James talk through bringing the Project Quality Summary MVC issues up to date, what's really in the MVC, and how we got to where we are with the data being presented.
A
This is the Testing UX/PM sync for October 27, 2021, so Gina and I are catching up on things. Gina, do you want to talk through your follow-ups from last week and what we got accomplished?
B
Yes,
so
last
week
we
talked
about
a
lot
of
things
about
usability,
testing
and
being
able
to
follow
up
on
the
research
that
we
carry
it
out
for
a
job
to
be
done.
A
I added more sub-epics to it. Oh, you did? Okay. Well, I thought I'd added, there's some login stuff that's in there as well that I thought I had added. Oh yeah, it's in there, yeah, the authentication improvements. I might break that out so it's not a sub-epic of the screenshots and annotations, but I think it's required for that larger problem set.
B
Okay, yeah, we can definitely move that. I ended up placing all of the ones that were talking about login in that epic.
A
Really it's just: how are we going to let users log in so they can leave comments, or how are we going to allow anonymous comments? That's just the workflow that we still need to get sorted out. Honestly, we didn't do a great job of figuring out that workflow and making it just work for everybody, so we didn't take into account all of the different cases that could have shown up for public projects.
A
I think there'll be the flow chart, but it'll be based on so many diagrams. There's a table of, here's the type: is it self-managed or is it hosted? Is it private, is it public, is it internal? And are they a user within the namespace or an anonymous person? And what do we make them do to leave a comment? Maybe you can leave a comment, maybe you can comment on a thread. I don't know.
B
The highest thing is the jobs-to-be-done research, and then the Project Quality Summary designs is the other thing. I've completed a good amount already, but I just want to get those summary designs in a good spot so that Miranda can take them.
A
One thing I've noticed, and this is totally aside from the project quality stuff, it's more about the workflow that we've been using, and I think it's a reflection on the tool: it's really hard to realize how many pings you have to answer on that design tab, because it's such a small space in that right gutter, that right side.
A
So
I,
if
I,
that
is
to
say,
hey
if
I've
missed
anything,
make
sure
you
pick
me
again
and
b
I'll,
probably
leave
a
comment
about
that
workflow
in
products
and
ping,
kai
or
ping
gosh.
I
don't
even
know
who
owns
the
design
area
right
now,
but
whatever
the
right
pm
is
in
their
designer
and
like
this
is
a
this
was
hard
like
to
do
and
make
sure
that
I
actually
answered
all
of
the
feedback.
A
I
have
to
imagine
that
if
you
were
leaving
a
ton
of
design
feedback
on
something
that
would
be
even
harder
like
right,
we
had
25
of
those
pins
out
there.
How
do
I
iterate
through
that,
like
you,
can
with
unresolved
threads
on
an
mr
so
that
I
can
go
through
all
of
my
tasks
or
whatever
you
want
to
call
them
all
of
my
pings
and
make
sure
that
all
of
those
things
are
answered.
Yeah.
B
No,
I
agree
with
you.
I
was
making
smaller
and
there's
like
15,
so
I
had
made
little
threads
for
myself
that
it
was
easy
for
me
to
go
through,
because
I
have
contacts
into
what
I
was
creating
and
I
could
resolve
them.
There's
no
like
count
of
how
many
times
you've
been
tagged
here
at
all.
You
do
get
like
the
to
lose,
but
it's
not
the
same.
So
I
I
understand
that
yeah.
A
So
that's
just
an
aside
and
yeah
just
if
I
missed
anything
on
the
project.
Quality
summary.
Let
me
know,
and
then
I
asked
jackie
to
review
the
jobs
to
be
done.
Discussion
guide
for
performance
testing
stuff.
I
was
just
running
out
of
time
and
I
didn't
want
to
hold
you
back.
So
I
it's
still
on
my
to-do
list.
A
I'm
just
gonna,
I'm
gonna
mark
that
off.
Is
it
done
perfect,
so
the
project
quality
summary
issue.
If
there's
a
single
source
of
truth,
where
the
original
requirements
are
listed,
I
was
just
as
I
was
going
through
our
agenda
last
night.
I
thought
I
think
that
the
epic
is
probably
the
single
source
of
truth
or
one
of
the
implementation
issues
there
within.
If
not,
you
want
to
just
open
that
up
real,
quick
and
we'll
get
get
it
sorted.
A
Yeah, because there's nothing great on there. So is the UI what we're looking at? "Display the two metrics," yeah. We can update this to show more of what we're going to be looking at, or what we're going to be displaying.
B
I'm actually familiar with this stuff now, though, so what I might do is just copy this over to the design description, iterate on it in there, make sure it's up to date, and then edit it back into this one too. Awesome, okay, that works. And I'll ping you, and if you want to look it over, if you have time, that's great, but yeah.
A
Absolutely
we'll
do
that.
Make
sure
that
we're
good
the
one
thing
I
notice
in
my
proposal
is
that
the
first
line
of
it
displays
the
two
metrics
contradicts
then
adding
the
quality
violations.
A
Yeah, let's treat it the same way we would the load testing, the browser performance, the accessibility: let's just link to the docs page and say, "for more information about code quality scans, click here."
A
We're
not
even
gonna
have
that
data.
I
did
a
bad
job
of
saying
hey.
This
is
gonna,
be
our
npc,
assuming
that
we'd
be
able
to
get
that
data
point,
and
when
it
became
obvious
that
we're
not
gonna
be
able
to
get
that
data
point.
I
I
didn't
get
that
pulled
out
and
then
I
kept
reading
feedback
that
made
it
sound
like
we
would-
and
I
apologize
for
that.
A
So
I
think
that
the
work
we've
done
is
informative
of
the
next
version
of
this,
and
we
should
try
to
save
that
somewhere,
but
for
our
for
our
purposes
of
minimum
viable
we're
going
to
have
just
test
coverage
and
just
tests
executed,
passed
and
failed.
Those
are
going
to
be
our
data
points
to
start
and
then
we'll
continue
to
iterate.
I
think
that
it's
a
great
spot,
though,
to
put
if
you're
looking
for
code
quality
violation,
data
click.
A
Yeah, I'll spin up an issue real quick that covers that and just list out the content to put on the page and the page to link to from there. We have a treatment like that for tests on the pipeline page: if you have no tests uploaded, we give you some data. We'll take a similar approach: you have no test execution history, click here; you have no coverage data, click here; and that gives you information about why you should go create that data.
A
Okay, cool, all right. So I will. Let me make a to-do for this.
A
So
I'll
go.
Do
this
yeah
your
last
point,
then,
if
you
have
future
iterations
a
vision
issue
sounds
great.
Okay,
there's,
definitely
stuff,
that's
out
of
scope.
That
would
be
awesome
when
we
have
the
data
that
we
should
put.
A
Yeah
yeah,
if
you
want
to
spin
up
here's
where
it
could
go
long
term
and
then
I'd
love
to
use
that
get
that
as
a
jpeg
and
stick
it
into
our
code
testing
and
coverage
direction,
page
as
the
new
vision
of
where
we
want
to
go.
A
Okay,
because
we
have
an
old
mock
that
juan
and
I
did
like
a
year
and
a
half
ago
or
a
year
ago
that
it's
it's
outdated.
It
includes
project
qual,
a
bunch
of
project
quality
information,
we're
not
going
to
include
we're
not
going
to
have
on
the
page
potentially
because
it'll
live
within
sas
and
security
dashboards.
B
I'm
like,
where
is
this:
I'm
gonna
just
go
here
first
and
then
look
it
up.
I
think
I
saw
it
yesterday.
A
And I think I'll try to find the YouTube video for that and link to it. What we could do, or what you and the new PM can do, is record something that talks through: here's what the vision was, and now we're starting to realize that vision; here's what we first implemented; here's our next version of where we want to go based on the feedback we've received, et cetera, et cetera. I think that'd be a great update to that direction page in a couple of months as we get closer. Okay, I'll stop sharing.
A
I've not talked to people who are doing pipeline authoring, but I have some ideas, based on the workflow of people who look at test results, of where we can put some prompts. I think we're going to stick those in as experiments to run alongside pipeline authoring, as an OKR for Q4 for the team, which would be an interesting way to try to prompt users: hey, there's a better view of this data outside of the logs.
A
If
you
go,
do
this
and
then
in
the
pipeline
authoring
experience
and
pipeline
editor,
if
we
do
things
like
hey,
you
have
a
test
stage,
but
you
don't
you're,
not
uploading
any
artifacts.
If
you
upload
june
cover
to
our
artifacts,
you
get
all
of
this
functionality
too,
and
it's
all.
In
course.
We
can
always
display
that
kind
of
information.
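(For context, the kind of artifact upload being described might look like the following in a project's .gitlab-ci.yml. This is an illustrative sketch only; the test command and report file paths are assumptions, not something discussed in the meeting.)

```yaml
# Hypothetical test job that uploads JUnit test results and Cobertura
# coverage reports, which is what unlocks the test and coverage views
# being discussed. The script and paths are project-specific assumptions.
test:
  stage: test
  script:
    - bundle exec rspec
  artifacts:
    reports:
      junit: rspec.xml
      cobertura: coverage/cobertura-coverage.xml
```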
A
Those
are
those
are
just
random
ideas.
I
think
there's
other
things
that
we
can
do
that
solve.
For
that
problem
of
I
don't
know
I
wish
I
could
get
a
better
view
of
my
data,
so
we
we
can
attack
it
from
both
sides.
I
think,
or
try
to
solve,
that
from
both
sides
of
I've
already
run
it
and
I'm
looking
at
it
or
I'm
editing
the
pipeline
or
offering
the
pipeline
so
wrapping
that
up
and
then
I
was
finalizing
okr's.
A
I
think
we
have
some
good
stuff
lined
up
for
both
teams
and
appreciate
testing,
taking
on
a
few
of
the
things
from
pipeline
execution,
it's
been
an
interesting
balance
this
week
of
making
sure
I'm
taking
care
of
both
teams
and
not
you
know,
shoveling
things
from
one
to
the
other,
because
I'm
leaving
one,
but
I
think
that
we're
in
a
good
spot
as
far
as
ux,
that
and
all
of
the
other
things
and
appreciate
your
efforts
of
going
through
the
backlog
and
looking
at
all
of
those
old
usability
issues
to
identify
some
of
that
ux.