From YouTube: GitLab 13.6 Kickoff - Verify:Testing
Description
Kickoff Page: https://about.gitlab.com/direction/kickoff/#verify
Ops Direction: https://about.gitlab.com/direction/ops/#verify
We love feedback and you can reach me at jheimbuck@gitlab.com or mention me on GitLab.com using @jheimbuck_gl
Issues and Epics discussed on the call
* Pipeline tab with screenshot links for failed tests: https://gitlab.com/gitlab-org/gitlab/-/issues/216979
* Unit Test Errors + screenshots: https://gitlab.com/groups/gitlab-org/-/epics/3197
* Show current average coverage for a group: https://gitlab.com/gitlab-org/gitlab/-/issues/263478
* Code Quality Severity: https://gitlab.com/gitlab-org/gitlab/-/issues/2529
A: Hi, I'm James Heimbuck, the product manager for the Testing group in the Verify stage of GitLab, and today I'm joined by our product designer, Juan Ramirez. Today we'll be previewing some of the items the team is working on for the upcoming 13.6 milestone. You'll find a link to the 13.6 kickoff page, with links to these items, and to the Verify stage direction page, with discussion of our future work.
A: In the comments below this video, you can always reach me at jheimbuck@gitlab.com, or by commenting and tagging me on any item with the group testing label in the GitLab project. Today I want to highlight two problems the team will start to solve in the 13.6 milestone. The first problem is for testers and developers who have automated tests that are capturing screenshots alongside errors.
A
You'll
have
a
new
tab
on
the
pipeline
page
that
just
has
failed
tests
and
a
link
directly
to
the
screenshot
that
was
captured
during
the
failed
test.
This
is
our
first
step
towards
a
richer
unit
test
report,
experience
that
will
include
more
metadata
about
the
test
environment
that
you
can
read
about
in
epic
number,
3197,
junit
errors
and
screenshots.
So
I'd
encourage
you
to
go
there
to
look
at
our
forward
direction
for
this
area
and
give
us
feedback
on
this
first
step
towards
that
direction.
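As a rough sketch of how a pipeline job might make those screenshots available alongside the test report (the job name, test command, screenshot directory, and report filename here are illustrative assumptions, not a prescribed setup):

```yaml
# Hypothetical job: publishes a JUnit report plus any screenshots
# the test framework saved, so failures can link back to them.
e2e-tests:
  script:
    - bundle exec rspec          # illustrative test command
  artifacts:
    when: always                 # keep artifacts even when tests fail
    paths:
      - tmp/screenshots/         # assumed screenshot output directory
    reports:
      junit: rspec.xml           # assumed JUnit XML report path
```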
A
So
far,
the
teams
provided
a
downloadable
report
for
all
and
for
some
select
projects,
as
well
as
display
for
the
current
coverage
of
those
selected
projects
and
with
completion
of
number
263478.
The
team
will
also
include
the
current
average
the
number
of
projects
and
how
many
coverage
jobs
go
into
that
average.
So,
instead
of
having
to
calculate
that
from
the
downloaded
file,
that's
now
quick
and
easy
and
accessible
for
that
application,
director
or
manager
to
take
a
look
at
for
all
of
the
projects
across
the
group.
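For comparison, the manual calculation that previously had to be done by hand might look something like this; a minimal sketch, assuming the downloaded file is a CSV with a `coverage` column (the column name is an assumption about the export format, not a documented schema):

```python
import csv
from statistics import mean

def average_coverage(csv_path):
    """Average the coverage values from a downloaded group coverage CSV.

    Returns the average coverage and the number of coverage rows
    (i.e. jobs/projects) that went into it, mirroring what the
    group page will now show directly.
    """
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    values = [float(row["coverage"]) for row in rows]
    return mean(values), len(values)
```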
B: Thanks, James. Let me share my screen. All right, so another thing that we are planning to work on is the Code Quality severity rating. In the past, when we showed Code Quality issues on an MR or in our report, we would just show them and tell the customer, or the user:
B: "Hey, here's the issue," and that's it, right? There was no context on whether it's a major issue, or whether it's critical, or whether it's something you actually need to look at because it's blocking you somewhere else. So what we're going to be working on is adding those severities to the actual issues, or points, being pinpointed on the Code Quality widget and the Code Quality report.
B: The way you're going to see that is that every time an issue shows up, instead of just showing an X saying this is an issue, we're going to tell you whether it's a minor issue, a major issue, a critical issue, just an informative type of issue, or actually a blocker. And as you can see, the affordances match that scale of severity. As I mentioned before, this is also going to be added to the actual Code Quality report in the pipeline view, in the Code Quality tab.
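That ordering can be sketched as follows; a hypothetical helper, assuming Code Climate-style report entries with a `severity` field (the field names and entry shape are assumptions for illustration):

```python
# Severity scale from least to most urgent, matching the levels
# described above: informative, minor, major, critical, blocker.
SEVERITY_ORDER = ["info", "minor", "major", "critical", "blocker"]

def issues_at_or_above(report, threshold):
    """Return report entries whose severity meets the given threshold.

    `report` is assumed to be a list of dicts with a 'severity' key,
    in the spirit of a Code Climate-style code quality report; entries
    without a severity are treated as merely informational.
    """
    floor = SEVERITY_ORDER.index(threshold)
    return [
        issue for issue in report
        if SEVERITY_ORDER.index(issue.get("severity", "info")) >= floor
    ]
```

This is the kind of triage the new severity labels enable in long reports: skim the blockers and criticals first, leave the informational entries for later.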
B: So yeah, that's another exciting thing that's happening in 13.6. Back to you, James.
A: Thanks, Juan. I'm really excited about that, so you can actually tell what you should pay attention to in those sometimes really long Code Quality reports. As always, we welcome feedback through discussion on the issues we talked about today, through comments on this video, or you can always email me at jheimbuck@gitlab.com about these or any other issues in our testing backlog.