From YouTube: Verify:Testing 13.8 Refinement Kickoff
Description
Continuing our async refinement, here's a walkthrough of the 13.8 planning issue as of November 24, 2020.
Issue: https://gitlab.com/gitlab-org/ci-cd/testing-group/-/issues/22
Hey team! I wanted to walk through the 13.8 planning issue as we kick off refinement for that milestone. The dates for 13.8 are December 22nd through January 21st of next year, so it runs over some major U.S. holidays. The first thing I want to call out is that we anticipate reduced capacity, probably down 20%.

As always, we plan ambitiously, so the plan probably isn't 20% lighter in issue count and weight than what we've traditionally been delivering, but we'll be taking a close look at that as we move closer to kicking this off. Here are some of the key, top-priority deliverables that I want to call out within the milestone.
First, the Test History MVC: the full unit test report. That extends what we built within the merge request widget, adding that data into the full unit test report.
Some of the other top things we want to look at: continuing to track paid feature usage with Usage Ping. There are a couple of issues in 13.8 to continue our work on that, and it will finish up in 13.9 as we then build out the aggregated metric. Also, tracking metrics report usage: we see a lot of jobs that create metrics reports, so that will be a key one for us. And then load performance testing, which is an emergent category.
It's still minimal, but that will give us good eyes on how much people are actually using it. Is there an appetite there, are folks trying to use it and engaging with it, such that we can really build out that feature set and differentiate ourselves from competitors? And then finally, Code Quality without Docker-in-Docker, and I'll talk about that a little bit more at the end, when I talk about OKRs.

As I looked today (today is November 24th) at things that require additional refinement, there are only a few left already, so great work getting everything looked at and weights applied. Code Quality without Docker-in-Docker is one of them, so we need to put some focus on that, making sure we are ready to go and have a solid proposal to get it done. The other is creating the partial report file as a pipeline artifact.
That sets us up for later features: adding code quality violations inline in the commit. We've seen great adoption of the test coverage visualization among developers who want to see whether lines are covered by tests, and we anticipate really high adoption of that feature among the Code Quality user base as well, so they can see whether there's a code quality violation as part of the code review.
Design items that are happening, Hayana: the code quality inline notices, the feature that I just talked about. We have a design for that; we just need to give it a second set of eyes as you're coming on board and getting up to speed. So we created a specific design issue for that, and it is linked here.
A
No
bugs
no
security,
no
tech,
debt
issues,
although
we
should
probably
rename
this
section
to
feature
maintenance
issues,
ricky
ux
debt
issues,
fortunately
nothing
there
and
then
the
okr
alignment
test
history
mvc.
This
leads
into
sid's,
okay,
around
popular
next-gen
product,
as
noted
by
a
popular
open
source
project.
The
link
is
to
the
blog
there
when
they
were
changing
their
ci
system.
They
called
out
that
a
ui
to
browse
test
history
should
be
obvious
for
an
eci
system,
but
we
don't
have
one
tracking
the
paid
features.
And then Code Quality without Docker-in-Docker: this one can directly impact growing ACV. Code Quality is a paid feature, and we know there are users who cannot or will not use it because of the Docker-in-Docker requirement. So if we can remove that requirement, we unlock additional market that we can sell that feature into, or that can start to use that feature.