From YouTube: Verify:Testing Internal Customer Call - September 2021
Description
Today we talked about some current research around Usability JTBD and insights uncovered related to Environment Management. We also did a quick overview of how GitLab is using Metrics Reports (https://docs.gitlab.com/ee/ci/metrics_reports.html), uncovering a possible degradation. Lastly, we talked about the future of test reports and how filtering by pipeline is an important use case for our internal users.
A: This is the Verify:Testing internal customer call for September of 2021. Now that we're on an every-six-weeks cadence, I think our next one will be in November. We have Joanna with us today; thank you for joining. Looking at the agenda, a couple of quick notes from the roadmap deck that is shared in the agenda.
A: There is research happening as we validate the usability testing jobs to be done. Gina wrapped up those interviews this week and has already posted insights, so we're working on rewriting the job to be done around usability testing. It uncovered a lot of interesting things in environment management and code review, as much as it did around how we use Review Apps and Visual Review tools, which was really interesting. So we continue to push forward to move that category from minimal to viable and to drive more internal usage.
A: We're looking at things like the docs site and the handbook for how we use Review Apps and Visual Review tools as part of the review process. Then later this afternoon I'm going to be kicking off research, with the first interviews, around the overall customer journey: from when you start authoring that .gitlab-ci.yml pipeline file, all the way through when you start uploading artifacts and looking at them, to see what that process is like and where the friction points are in uploading artifacts. We're also changing our main KPI from usage of paid features to just overall usage, tying into the stage monthly active user count: how many users are starting a pipeline that uploads one of those two test report files.
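For reference, a test report file like the ones mentioned here is declared under artifacts:reports in the pipeline configuration; a minimal sketch, where the job name and test command are placeholders rather than anything from the call:

    # Minimal sketch: declare a JUnit-format test report artifact.
    # The job name and test command are placeholders.
    rspec:
      script:
        - bundle exec rspec --format RspecJunitFormatter --out rspec.xml
      artifacts:
        reports:
          junit: rspec.xml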
B: To be completely honest, I didn't know that we had this feature, so I'm glad that you brought it up in this call. I was looking at the docs before this, and I think it's definitely really cool. I'd be curious, if you know, how we're using it at GitLab now, like what other teams are using it.
A: Yeah. So we're using it today for things like gem size. I want to say... let me see if I can find a non-docs example.

A: So here's an MR, for instance; you can see it's one of the widgets here. We're using it to track some metrics around gem size and memory usage on boot. I don't know what all these gem size entries are.
A: So these are gems we're either using or gems we're creating as part of the build process, but I'm honestly not sure that's everything. And I'm kind of surprised that those all say "new", because you would think that those are existing gems and this would be the diff between the two. Because this view over here is just the size of a thing now, and without any comparison that's really not that useful, I would think, and that's the feedback that we've gotten from customers as well.
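For reference, the values this widget compares come from a metrics report artifact, an OpenMetrics-style text file; a minimal sketch, where the metric names and values are illustrative rather than the actual gitlab-org configuration:

    # Minimal sketch: a job that emits a metrics report for the MR widget.
    # Metric names and values are illustrative.
    metrics:
      script:
        - echo 'gem_total_size_bytes 52428800' > metrics.txt
        - echo 'memory_on_boot_mb 1124' >> metrics.txt
      artifacts:
        reports:
          metrics: metrics.txt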
C: We'll probably be able to review it once we've migrated over to the new merge request widget extension, so yeah, we can always see if it was a front-end degradation or if it's some kind of issue with the way the report is handled, yeah.
C: Yeah, I think the only thing that was keeping us from moving that over was that polling wasn't put into the core component yet. So as soon as polling is in there, then we're good to go. Yeah.
A: Cool. I'm gonna try to find our example.
A: So this one has changes and then a new item as well. So we have one new one that didn't change, and where there is a change, I don't think we're showing the percentage here yet, because these are strings and not things that we can calculate on easily. But it should show, or it shows, that there isn't a change, or the diff between the two, just through simple comparison.
A: So if something has changed, I would expect that it's not a degradation in the way the report is handled, always showing things as new. But maybe something changed in the widget logic, as we're looking at it, where it's just not picking those up as changes and is showing them as new every time. Or it could be that the file doesn't exist on the target branch, if that job isn't run as part of the normal default branch build.
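If the missing target-branch file is the cause, one way to guarantee a comparison baseline is to run the metrics job on the default branch as well as in merge request pipelines; a sketch building on the illustrative job above:

    # Sketch: run the metrics job on the default branch and in merge
    # request pipelines, so the widget always has a target-branch
    # baseline artifact to diff against. The value is illustrative.
    metrics:
      script:
        - echo 'memory_on_boot_mb 1124' > metrics.txt
      rules:
        - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
        - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      artifacts:
        reports:
          metrics: metrics.txt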
A: Having that there, I'll post back into Engineering Productivity and poke around and see what the intent is for that. I'll also see if I can figure out who the author was on that part of the .gitlab-ci.yml and try to uncover some motivation for having that in there.
A: Well, since you didn't know we had the feature, and we haven't used the feature, and we're finding some gaps exist around it (it's not actually showing us diffs, which might be more useful), I can investigate further. That wraps up my questions around using metrics reports.
B: Yeah, so I think Kyle had added a comment on, like, the test execution reports. This is something that we're trying to build on our own right now, because we're trying to understand what the trends for our pipelines are over time. But if we had something that was a GitLab feature that we could use, that would be awesome. So I guess my question is: what is the current plan for that, scheduling-wise, and what kind of help would you all need from us to move that forward?
A: Excellent question; let's take a look at the roadmap.
A: I had left a comment in that issue asking if there's anything we could contribute, and I think that they're using the endpoint that we set up, the internal one, to populate the tests today; beyond that, there wasn't anything that you all needed initially.
A: So yeah, where we had started: initially we thought we would do just the last 10 executions of a test, but I don't know that that's the right level of what you're looking for. I guess, I think, there's a gap between what this does and what you're trying to understand that I need some help figuring out.
A: So what we had done was put together, and this went through solution validation, a view where at the project level you'd be able to see all of your tests and a history of them over the past however many runs, to see how the test itself is trending over time. But that's all the way down at the test level, not even at the job level or the overall suite level.
B: ...by pipeline, and see the trends that way, rather than at the project level, yeah.
A: This would give you at least the current, the latest, pipeline that ran on the default branch for the project.
A: How many tests are running, failing, or being skipped; and then the click-in from here, on the full report, would be that next level of: here are all of those jobs and tests. Then I think the iteration after that would be...
A: Great, that's all right. Okay. So if we were able to show this but do a filter by the pipelines, then you'd be able to, you would, accomplish more of what you're trying to do with your own report. But if we were able to at least give you an endpoint for this with a filter by pipeline, you'd be able to populate that report, it sounds like. Yeah, that's good to know.
A: I will take away from this at least a to-do to create the issue: create the endpoint with a pipeline filter on it, so that we can get closer to that, if we don't already have that or if the team isn't able to move forward with the endpoint that is there. And then let's create a view of filtering by pipeline within the project level as well; we'll create a dogfooding issue for that, so that we know that that's a use case for our internal customer. (Awesome, awesome, thanks.) Once we have that, I think the test summary view would be a little bit better for you, too.
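For reference, a home-grown trend report like the one described can already pull per-pipeline results from the existing pipelines API (GET /projects/:id/pipelines/:pipeline_id/test_report); the pipeline-filtered project-level endpoint discussed above is hypothetical until that issue exists. A sketch as a scheduled job, where the job name and the API_TOKEN and PIPELINE_ID variables are assumptions:

    # Sketch: feed a home-grown test trend report from the existing
    # per-pipeline test report API. The job name, schedule trigger,
    # and the API_TOKEN / PIPELINE_ID variables are assumptions.
    collect-test-trends:
      image: alpine:latest
      rules:
        - if: $CI_PIPELINE_SOURCE == "schedule"
      script:
        - apk add --no-cache curl jq
        - >
          curl --silent --header "PRIVATE-TOKEN: $API_TOKEN"
          "$CI_API_V4_URL/projects/$CI_PROJECT_ID/pipelines/$PIPELINE_ID/test_report"
          | jq '{total: .total_count, failed: .failed_count, skipped: .skipped_count}'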
A: Anything else that we can help with? Anything you want to poke at among the things that have come out of the team lately?
A: Well, a very fruitful call, if a quick call. I appreciate it. Thank you, James. Thank you, Jordan. I appreciate it; it's good to see you. Yeah, love seeing Kyle as always, thank you, Kyle, but I like seeing you too. All right, thanks everybody, cheers.