From YouTube: Maintainership Working Group 2022-05-09
A
Okay, I'm going to go ahead and get started. That was a very long silent pause, I'm sorry about that. This is the May 9th maintainership working group meeting.
A
There are several incidents and escalations happening right now, and I know that a lot of these bullet points have people involved with those, so I'm just going to verbalize all of these changes since last time. If there are any questions, go ahead and interrupt me or stop me, and we can add them to the agenda for follow-up.
A
The first change since last time is my own. This is exit criteria zero: to make sure we have good representation for the working group. I did recommend eight team members with different maintainer statuses, responsibilities, regions and things like that, and I really hope that the rest become members. So I'm going to update the handbook page with an MR for that, including the members, and we can go from there. I hope to have that done today.
A
Robert volunteered last week to take on exit criteria number three. He's got a few ideas here for handling the bursting of maintainer reviews. This was a concern about how, if you take the red dot off, you are flooded with reviews, and then you have to put the red dot back on. Max has opened an MR to add emoji work-in-progress support. I think this has to do with review limits, which is another talking point on here as well. So this is basically saying, I'll...
D
Three, so: update the expected behaviors and responsibilities for engineers and maintainers. I wonder if we were clear on that one. I definitely think what Robert is working on is accurate and falls in there, but I thought we were going to revisit the actual responsibilities of the job descriptions and expected behaviors in the career framework, and so I wonder if we need to maybe clarify the scope in that issue to say we're looking at multiple things: this, this and this. Is that clear? I'm pulling it up now.
A
I will say that on this one I actually added it to exit criteria number three. It was part of number two for data and metrics, and the reason I did that is because Robert is leading it, and depending on what he finds with that issue, I felt like it might lead into job responsibilities, for example if there's guidance that he needs to come up with, or things like that. Let's go ahead and check exit criteria number three.
C
So a number of people are going to join here. FYI, we had two different Zoom links in the invite, I guess. Michelle, we'll have to figure out what's going on.
A
We had a long pause of silence; we were like, is that all that's joining?
I
So, from the issues that we've linked so far, we've probably got two things. One is around future behavior, how we limit people from getting too many reviews; Max has got an open merge request for allowing people to dictate their own limit. But the other aspect of it is around behavior in terms of how we get people to re-enable themselves for accepting maintainer reviews, trying to, I suppose, long term...
I
We want people to not have to red-circle themselves, because over time, the more people who do it, the worse it gets for everybody else, as it progressively starts giving everyone else more reviews. And so the more people do it, the more people will do it, and it's like an exponential effect of everyone's experience getting worse. So that comes, I think, under the maintainer behavior aspect of it, and some guidelines that we can potentially rewrite there.
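The feedback loop described here, where each opt-out raises everyone else's load and makes the next opt-out more likely, can be sketched with a toy simulation. All the numbers below (20 maintainers, 45 incoming reviews a day, a tolerance of 2 reviews per person per day) are made-up illustrative assumptions, not figures from the actual dashboard:

```python
# Toy model of the opt-out feedback loop: reviews are split evenly among
# active maintainers, and anyone whose daily share exceeds a personal
# threshold "red-circles" themselves (opts out) the next day.
def simulate(maintainers=20, reviews_per_day=45, threshold=2.0, days=8):
    active = maintainers
    history = []
    for _ in range(days):
        if active == 0:
            break
        per_person = reviews_per_day / active  # load on each active maintainer
        history.append((active, round(per_person, 2)))
        if per_person > threshold:             # overloaded -> someone opts out
            active -= 1
    return history

for active, load in simulate():
    print(f"active={active:2d}  load/person={load}")
```

With these toy numbers the per-person load climbs from 2.25 to about 3.46 over eight days, and each opt-out makes the next one more likely, which is the runaway effect being described.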
I
But I think there's more that we could do too around who is expected to be maintainers, and what they're expected to be maintainers for. I guess that comes under that heading as well, I think. And if anyone's got any other ideas, I'd be interested to see what they are, and we can always expand it a bit more.
D
So I'll jump in here. We're going a little out of order, so I don't want to forget Max; we'll go back to Max. But just in response, I think what you're saying is spot on, Robert. I just wanted to clarify that part of the scope also included reviewing the actual expected behaviors in the career framework for the different roles, as well as reviewing the roles and responsibilities for the different levels: senior, staff, etc.
I
Yeah, I think that makes sense, and I think for that aspect of it I might need to talk more, I guess, to some managers who have people of different levels reporting to them, to find people's opinions on that. I'm sure I can find opinions from everybody, all the engineers, otherwise, but that might be quite a lot of responses to go through. So perhaps it'd be good to ask people to vote somewhere on how they feel about it at different levels.
A
To jump in on that one, Robert: yeah, I think that maybe the functional leads would be great for that, because, number one, they represent a lot of different levels and maintainer statuses, but, number two, they can also go out and get that feedback as well and kind of collate it for you. So that might be an option.
J
I think Robert covered it, but we've opened an MR to add support for a numbered emoji, so maintainers can set a ceiling on the number of concurrent reviews they currently have in process. We sort of batted that around a bit. I've not had time to revisit it yet, but it seems like a solid idea that we could definitely try, and it wouldn't replace the existing system of red circles.
J
It would just be another tool to enable people who would otherwise exclude themselves from the review process entirely to, you know, participate, just at a more reduced amount. Yeah. So comment on that MR if there are any additional thoughts on it.
G
I also wanted to add a point. I did not get time to write it down, because I just noticed this now. Now that we have capacity data in roulette, I just noticed that maintainer capacity is mostly at 55 percent, and trainee maintainers are surprisingly lower than that, like 44 percent. I wonder if that is because of the extra pressure we add on them, because there's no way to control reviews coming your way, and if you are picked for X, there's a chance to pick trainee maintainers for X compared to maintainers.
C
I think we're on my item. Out of curiosity, how well do engineers and maintainers keep their reviewer lists up to date? Do people try to get to zero, or, kind of, what's the... does anybody have a feel for it?
E
Yes, so from the point of view of a front-end maintainer, we try to push this to zero. Yes, and also we take this as priority number one, to unblock our peers on reviews. So, basically, my first responsibility during the day is usually taking on reviews and doing all of them that I have right now, or, if I'm not able to, sending them to another maintainer, because we have 48 hours to review.
C
Okay, that makes sense. It's good to hear. That would be the hope, but I was curious how likely that is. Thanks.
A
Okay, Max, now that you're on the call, do you want to go over point C, your exit criteria number two update?
J
So yeah, a couple of big things have changed. I believe Kyle and Lucas have done the work to add capacity data to the sort of internal workload dashboard, which is really helpful and, as a snapshot, gives us information that, for example, backend maintainer capacity seems to hold steady at about 50 percent, pretty much for all the data we have so far. So that's quite useful.
A
Thank you. Steve Abrams has exit criteria number five as of last week. Thank you, Steve. And so it looks like you have an update as well.
D
Actually, Michelle, I think Natalia had a comment.
B
For exit criteria number five: it revolves more around communicating the plan and the outcomes of the working group. So not too much from that aspect yet, but I did open an issue to sort of discuss and identify different success factors from previous implementations and communication plans, so we have an idea of what will work as we start to work on that.
H
So, some weeks back, I took on the action item to kind of give, I guess, an audit of current trainee maintainers. Excuse me. So I've just thrown some numbers up on the agenda for us to take a look at. In summary, we've got 96 open trainee maintainer issues, and 73 of those are older than a year.
H
Sorry, only 23 of those issues are from this year, and so we've got issues dating all the way back to 2019 that we should definitely look to follow up on. I've included a breakdown of the different types of trainee maintainer issues as well, since some are more specific than others, in terms of, like, igloo maintainership versus backend and front end.
H
So in terms of questions here, the main thing I want to do is try to provide this data to the managers, to say: hey, you have reports with trainee maintainer issues that are older than X. I just wanted some help to figure out what that threshold is for us. I realize that this also feeds in, potentially, to the different exit criteria, because, just kind of brainstorming on that, I think we can use this to influence how we can make this more efficient.
C
Yeah, at first blush, definitely older than a year. And I guess the question would be, you know, what about six months? I guess the one question that would be interesting is: should we have people who are just reviewing code? A good example is somebody who's not necessarily actively trying to become a maintainer but is reviewing code, and how do we want to adjust for that?
I
I delayed becoming a maintainer for quite a while, even though I opened the issue within the first few months of joining, because I just didn't feel like I knew enough about the overall application or the effects certain things can have on it, and I still don't. There are certain areas of the application that you just don't come across very often, or things that you literally never even knew we had as a feature, and when you receive a maintainer request for one of those, it can be quite difficult, and it takes more time when you do come across those things, because you do have to spend a lot more time making sure that you know what's going on, or tagging other people.
C
So one idea is: if we need to up the expectations of the development team at large for reviews, that can be a proposal we consider out of this exercise. It's not just about maintainership; it's also about reviewers, and it sounds like there's something in there. So I'd have to go back and look at what we actually currently say, and whether we just need to start socializing that more or if we need to change it.
H
Yeah, that's what I want to dive deeper into. I don't know if it's been consistent across the years of trainee maintainerships in terms of the templating, but I have seen, when I was looking through these issues, that it says there is no definitive timeline. So we should definitely kind of do an inventory of the expectations we set, and I think that's part of criteria four in terms of the efficiency of the process, and see if we need to update that across the board. Yeah.
D
So you're right, this was not originally part of the exit criteria, Dennis. My thought, when I saw 2019, was: should there be exit criteria... I mean, I'm sorry, should we create an exit criteria to define an expiration time, like how long something can sit, so that after a year it's just stale and you need to start over anyway? That's something I'm wondering, if we should add that to the exit criteria.
H
Coincidentally, the oldest trainee maintainer issue we have, from April 2019, just started getting more updates as of last month, and it was updated even as recently as last week. So that's interesting. I think we should at least kind of keep them fresh, so that we don't have these super old issues. Because if it's been open since 2019 and you're now just starting to touch it years later, you should probably just start with a fresh one.
I
When I did it personally, I found the actual process of doing the maintainership issue irritating, and so I spent six months to a year not adding anything to it. Then, when it was like, oh, now I need to become a maintainer, I went back and added everything to it, and it was kind of annoying. It would help if there's any way that we could make it less frustrating to do, because I think it's a lot of manual process.
I
Like, you had the merge request that you reviewed, and then you tagged the maintainer in it and asked them how your review was, and all this sort of stuff. That's quite laborious. So if we could automate that with a bot or something, maybe that would make it a bit easier for people.
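The automation Robert is asking for could look something like the sketch below: a script that builds and posts the boilerplate "how was my review?" comment through the GitLab REST API instead of the trainee writing it by hand. This is a hypothetical illustration, not one of the existing tools mentioned next; the project path, MR iid, usernames, token, and comment wording are all placeholders.

```python
# Hypothetical "feedback request" bot for trainee maintainer issues.
# The notes endpoint used here is GitLab's real REST API
# (POST /projects/:id/merge_requests/:iid/notes); everything else is
# placeholder data.
import json
import urllib.parse
import urllib.request

API = "https://gitlab.com/api/v4"

def feedback_comment(maintainer: str, trainee: str) -> str:
    """Build the boilerplate comment the trainee would otherwise write by hand."""
    return (f"@{maintainer}: @{trainee} reviewed this merge request as part of "
            f"a trainee maintainer issue. Any feedback on the review?")

def post_feedback_request(token: str, project: str, mr_iid: int,
                          maintainer: str, trainee: str) -> None:
    """Post the comment as a note on the given merge request."""
    url = (f"{API}/projects/{urllib.parse.quote_plus(project)}"
           f"/merge_requests/{mr_iid}/notes")
    req = urllib.request.Request(
        url,
        data=json.dumps({"body": feedback_comment(maintainer, trainee)}).encode(),
        headers={"PRIVATE-TOKEN": token, "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)

# Show the comment such a bot would post (no network call here):
print(feedback_comment("maintainer_name", "trainee_name"))
```

A real version would also need to find the reviewed MRs automatically, which is what makes the manual process laborious in the first place.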
H
Yeah, I know we have a handful of actual automation tools that people have come up with specifically to address that concern. I think that's part of the refreshing of expectations: as part of this whole thing, we make people aware that these tools exist to help make that process a little bit easier. I can...
H
I can take a look at that, but I guess just to move on to the rest of the agenda: I haven't come up with very clear action items in terms of how this will feed into the different criteria, but expect some follow-ups in those epics to see how this information can benefit you.
A
I'm sorry, I was trying to finish.
A
Thank you. There is this exit criteria number four that centers around the trainee maintainer process and making it more efficient. Already in this conversation I feel like this is several issues, but maybe, to start, just one around making it more efficient, or around why I have these. That's my next point: why have these issues been open for so long? I know that Ezekiel...
A
This meeting is outside of his time zone, but he is a maintainer mentor and he's got a lot of interesting data points on that, because I think he mentors some people who have had their issues open for quite a while. In the interest of time: oh, Dennis, you've added some action items. What do we want to do? We can read the help-needed items asynchronously, but we do need help on this issue having a DRI, and then on exit criteria number one having a DRI.
A
My two discussion points I only added because the agenda was empty, so I'm happy to punt those to the next meeting. And then, Natalia, do you want to go over yours? Oh, there you are.
E
Yeah, sure, so let me quickly check again. We have average data in terms of MRs assigned per day per maintainer. It's in the same dashboard, Max, for the link: you can check average maintainer backend and average maintainer front-end, and there is data for the average number of MRs per day assigned per maintainer. For front-end, for example, it's 1.2 to 1.4 per day.
B
I can speak from just my experience, because I keep an eye on these dashboards sometimes, and I'm not sure it necessarily correlates to exactly how many MRs I'm getting a day, especially if it is really averaged over seven days and we have a five-day work week. But I know that when I start to get over two is when it starts to feel like a lot. I'm not sure if that's true for a lot of people, but there is generally a threshold where I notice it if it gets enough above a certain number.
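The seven-day versus five-day point matters for reading that dashboard number: an average taken over calendar days understates the per-workday load. A quick back-of-the-envelope, assuming (as speculated above) the averaging really is over seven calendar days while reviews only land on the five working days:

```python
# Convert a per-calendar-day average into a per-workday average, assuming
# all reviews actually arrive on the 5 working days of the week.
for per_calendar_day in (1.2, 1.4):
    per_workday = per_calendar_day * 7 / 5
    print(f"{per_calendar_day} MRs/calendar day -> {per_workday:.2f} MRs/workday")
```

Under that assumption, the quoted 1.2 to 1.4 range corresponds to roughly 1.7 to 2.0 MRs per working day, right around the "over two feels like a lot" threshold mentioned above.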
D
I think that it would be interesting to kind of gauge from team members how many they do a day versus how many they do a week, if they do it weekly instead of daily.
J
I think from my experience it's definitely daily, but it's also a mix. If I get a maintainer review that comes in that's obviously small, then I might just slot it in and get it done. But if I see one that's, you know, 1000 changes, 20 commits, and it's got thousands of comments (I can see Natalie smiling), that's going to take me half a day minimum to really dig into. So I don't think there are any maintainers that set aside a full day a week to do it.
F
I'd agree with that. I don't think our current guidelines allow for a once-a-week cadence anyway, with the time-to-first-touch guideline, which is, what is it, 36 hours, 48 hours, something like that.
E
Sometimes it's a maintainer of the team or of the stage, sometimes it's a maintainer with domain knowledge, sometimes it's something else. But if we check the dashboard, there is a huge imbalance. So should we enforce reviews a lot more? What can we do to make this a lot more balanced across maintainers as well? It's not only about a number, but about a more equal distribution.
A
Okay, I'll take an action item, actually, to, I think, create three issues, and I'll tag you on those and add them to the channel. But I do really feel like balancing is important. I think this is in line with a lot of what Robert has been talking about, but it's also about having some of that data, in terms of, you know, getting that conversation started on what we actually need, what is reasonable, and how we balance.
A
Okay, we've got one minute, so thank you all for joining. I'm going to move some of these items to the next meeting, which would be next Wednesday. Enjoy the rest of your days.