Description
Walkthrough of the 14.4 planning issue, focusing on the theme of the milestone (Build Artifacts), the design issues being worked on, and the bugs to address: https://gitlab.com/gitlab-org/ci-cd/testing-group/-/issues/68
We also go over, at a high level, what the team is working on in the second half of the year.
Hey team: I wanted to walk through the 14.4 planning issue as we start refinement for that milestone, with 14.2 wrapping up and 14.3 refinement wrapping up. I'm also going to touch on a little bit of the Q3 priorities, with the taking on of build artifacts (a lot of the work that we've been doing is build artifacts), kind of talk about the what and the why of that, and then what else is going to be happening in Q3 and the second half of the year. So, starting with 14.4.
This milestone will be kicking off in September, about a month from now, and running through October. For the milestone theme, we're going to continue to prioritize efforts around build artifacts through epics like statistics, broken artifact size, and artifact storage consumption, because our hypothesis is that making artifacts easier to manage is going to drive later adoption of testing features. So we want to make sure that, as we're storing artifacts, the size is reflected correctly and that people can access them later on if they want to download those artifacts.
But ultimately we want them uploading them as reports, so that they can drive some of our test features like the test summary or test coverage visualization, things like that.
For some of the top deliverables: we have a couple of infradev issues, which are both confidential, so we'll be clicking into those; and then implementing the SAX parser for Cobertura, so that we can safely increase the size limit of those Cobertura files to drive test coverage visualization. We're going to do this instead of taking the MR that came from a customer to just change the size limit.
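As a rough illustration of why a SAX-style parser makes a larger size limit safer, here is a minimal sketch in Python (not GitLab's actual implementation): a streaming handler tallies line coverage from a Cobertura report without ever building the full document tree in memory, so peak memory stays flat as the file grows. The element and attribute names follow the Cobertura XML format; the sample report is made up.

```python
import io
import xml.sax


class CoberturaHandler(xml.sax.ContentHandler):
    """Streaming handler: counts covered vs. total lines as elements arrive."""

    def __init__(self):
        super().__init__()
        self.lines_total = 0
        self.lines_covered = 0

    def startElement(self, name, attrs):
        # Cobertura marks each executable line with a <line hits="N"> element.
        if name == "line":
            self.lines_total += 1
            if int(attrs.get("hits", "0")) > 0:
                self.lines_covered += 1


SAMPLE = b"""<?xml version="1.0"?>
<coverage>
  <packages><package><classes><class filename="app.py">
    <lines>
      <line number="1" hits="2"/>
      <line number="2" hits="0"/>
      <line number="3" hits="1"/>
    </lines>
  </class></classes></package></packages>
</coverage>"""

handler = CoberturaHandler()
xml.sax.parse(io.BytesIO(SAMPLE), handler)
print(handler.lines_covered, handler.lines_total)  # prints: 2 3
```

Because the handler only ever holds a couple of counters, the safe input size is bounded by parsing time rather than by memory, which is the property that lets the limit be raised.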
This was something that was identified as a potential performance improvement, so we're going to tackle that first, and then we have a couple of slow endpoint issues that we want to tackle. Thanks, Scott, for digging into the dashboard and sussing those out so that we have some things to work on. So we have a couple of things there, and then, digging into some design issues: Gina is going to start working on some of the iterations that we identified as part of our solution validation for the project quality summary. I'll talk about that a bit more.
We'll also talk about the second half of Q3, and then some top bugs that we want to try to address; some of the priorities that are at the top are also bugs. And then there's "keep latest artifacts keeps the latest artifacts of failed pipelines". This is a weird corner case that I think we should probably talk through.
We can do that asynchronously in the issue, just to talk through the expected results. In the case of a failed pipeline, if this setting is enabled, we'll hang on to that job, or rather hang on to the artifacts from the job, if the job fails. Once a newer job replaces it, that should be fine, but there might be some situations, based on how I'm reading this issue, where that artifact persists indefinitely and expiration doesn't work for it, so we might have some work to validate that expiration is happening as it should, first.
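For context, here is a hypothetical job showing the two mechanisms in play (the job name, script, and paths are made up for illustration): `expire_in` is what normally ages artifacts out, while the project-level "Keep artifacts from most recent successful jobs" setting is what can hold the latest ones past that expiry.

```yaml
test-job:
  script:
    - ./run-tests.sh        # placeholder test command
  artifacts:
    when: always            # upload artifacts even if the job fails
    expire_in: 2 weeks      # normally deleted after this window
    paths:
      - test-results/
```

The corner case in the issue is what happens when a kept artifact from a failed job is never superseded, so that two-week expiry may never actually fire.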
And then the last one is this: Auto DevOps is not uploading artifacts.
This might end up being treated as a future enhancement; I haven't fully dug into it. I'm not sure the template is set to upload these, so it may not be a bug so much as an enhancement, but I really think that this would be a great way to drive adoption.
If we auto-test, we should grab those artifacts if we can and upload them if they're in JUnit format, so that people can start to use the test summary and the test reports, those features, as we've changed our performance indicator from Paid GMAU to just GMAU, so from paid users to all users.
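As a sketch of what that could look like (the job name and test command are illustrative, not the actual Auto DevOps template), uploading JUnit-format results as a report artifact is what feeds the pipeline test summary:

```yaml
test:
  script:
    - bundle exec rspec --format RspecJunitFormatter --out rspec.xml
  artifacts:
    reports:
      junit: rspec.xml    # parsed by GitLab to power the test summary
```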
So that is what's going on in 14.4. And then, as promised, I wanted to talk a little bit about the rest of Q3 and the second half of the year.
So, looking at our near-term roadmap, Now / Next / Later: right now we're really working on improving the management and performance of artifacts, and that goes back to, you know, a couple of things, one being our theme around hosted GitLab first and performance. We have the ongoing infradev and scaling issues that we're working on.
A
We
may
work
with
our
friends
over
in
sas
to
get
an
end
point
to
give
us
project
quality
violations.
That
was,
you
know,
had
great
response
from
customers
about
that.
Pulling
that
into
that
view
as
well
and
then
later
on,
as
we
get
into
the
latter
half
of
the
latter
half
of
the
year
starting
to
dig
into
project
history
reports
and
let's
click
into
that
just
to
take
a
peek.
So this is a project-level test result history on the default branch, looking at the history of a test over time. It expands on the MVC that we introduced way back in 13-something, to show not just how often this test has been failing over the last 14 days, but also the last couple of runs, so that I can better identify whether a test is flaky or not.