From YouTube: Development Metrics working group 2019-06-20
B
That's kind of my read on the situation. A couple things to note. One, I've got a question out to directors to query their teams around it, so I'll get some feedback around that. It could be that, you know, June's a very popular month for people to take vacations. So that's what we're anticipating to come back, but we'd like to hear about it in more detail.
B
It kind of falls to the other one. The way I would describe it is, I would hope we would hit around 1,900 to 1,950 for this month, probably, just because last month was basically lower because of Contribute. Now, if it turns out that vacation is playing into this, then obviously we won't hit that, so that's kind of a good explanation around it from that perspective. But that's where we'd like to understand a little better how it feels.
B
We do. In regards to that: usually in November and December you can see it, and in the summer months you would expect to see it too. November and December is a more significant decline; summer months are a decline, but maybe more distributed, as you described it. So June, July, August is what I've seen historically, but you know, with the more international geography, we'll evaluate that aspect.
A
You no longer have to update the spreadsheet, which is now automated; I'll monitor this from here. Comments? Yep, okay. I'm gonna stop sharing my screen so we can see each other better. Moving on, we're skipping that one [unintelligible]. Sorry, if I have this correctly: did you manage to take the lead on number eight and nine? Any updates you would like to give us?
B
I did take the lead. They are, or should be, assigned to me, but no, I haven't had any time to spend on it this week. I spent some time learning about Periscope. I have access to it now, so in theory I have access to all the data, or I know how to get more data, to be able to better inform those. But no, I'm still pretty heads-down on the memory work. Okay, I would expect not a ton of progress in the coming couple weeks, but it's certainly on my list of things to do.
A
Good enough for me, and you just recently joined, so you're definitely hitting the ground running at full speed. So thank you for all your contributions. Absolutely, okay! Let's go through the agenda then. I scored our working group last time; I'm making another iteration to the rough scoring, which we will attach at the end. I think we were roughly 50% done from the rough SWAG. Any comments? If you would like to take a look, please click on the link. We do have progress on the metrics, which is great. I think the loophole that I see here is our time to resolve: I don't think we're going to make 120 days on average, and I'm okay with that being a follow-up after this working group has disbanded, because we'll have a clear process and clear metrics already established.
A
We're 50% rough. A big gap here is in Manage, where I don't see a milestone being assigned. The good news is that these are the only customer ones, as well as tools, that are not scheduled; we don't have any in EE, which is great. The bulk of it is in CE, and Jason is not here; I'll take an action item to remind folks in the dev section, product management, to address them. Questions, comments?
A
Okay, moving on: heat maps are coming to group triage packages. Now, this is a mock-up, and these are being generated for every team label for now, but they will transition to groups, using team labels and DevOps group labels together. So this will technically capture the issue population, the bug population, and this is going to help every group and team visualize what is in their backlog. This is a mock-up, so please disregard the 145 issues with no severity and priority.
A
It's
just
an
arbitrary
number
in
every
link
in
every
numbers
that
has
a
corresponding
s4p
label.
We
will
build
a
query
link
to
the
issue
tracker,
so
you
can
click
on
it
and
then
it
will
render
the
kit
lab
issue
tracker
with
with
those
group
labels
and
as
one
people
once
you,
you
can
just
go
straight
to
those
issues
and
take
a
look
and
push
change
milestones
as
necessary
comments,
suggestions.
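The per-cell query links described above could be built along these lines. This is a minimal sketch; the project URL and the label names (`devops::plan`, `severity::2`) are illustrative assumptions, not the working group's actual configuration.

```python
from urllib.parse import urlencode

def issue_query_link(project_url, group_label, sp_label):
    """Build an issue-tracker search URL filtered by a DevOps group
    label plus a severity/priority label (all names are examples)."""
    params = [("state", "opened"),
              ("label_name[]", group_label),
              ("label_name[]", sp_label)]
    return f"{project_url}/-/issues?{urlencode(params)}"

# Clicking the cell for (devops::plan, severity::2) would open:
link = issue_query_link("https://gitlab.com/gitlab-org/gitlab",
                        "devops::plan", "severity::2")
print(link)
```

Each heat-map cell then just carries one such URL, so readers land directly on the filtered issue list.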
A
Okay,
moving
on
this
came
off
of
the
product
team
meeting
on
Tuesday.
The
discussion
on
using
due
dates
to
move
s
loz
closer
I.
Think
there
are
the
two
sides
of
this
I
think
some
some
folks
would
love
to
see
due
dates.
I
think
it
may
generate
more
noise
than
necessary
if
you
were
anticipating
a
fix
for
two
or
three
months
down
the
line.
The
accuracy
of
dates
are
not
there.
So
like
there's
more
likely
that
you
miss
the
due
date,
because
you're
clicking
an
arbitrary
due
date
in
the
future.
B
I'm
sorry
I'm
just
trying
to
get
it
in
here
so
I
was
like
yeah
like
if
we're
just
going
to
picking
dates
like
my
reaction
is,
is
if
we
want
to
try
this
I'm
up
for
experimentation.
I
just
I
have
a
feeling
that
this
is
going
to
end
up
being
that
we're
gonna
have
a
lot
of
stuff.
That's
overdue,
like
I,
think
there's
a
couple
things
that
are
kind
of
in
here.
B
That part of it, I think, is totally reasonable from that perspective. But going from bug creation to having a due date just feels like... The other aspect is that because there's a prioritization element in here, you know, unless we have an agreement from product to say, here's the level of bugs we wish [to fix]...
A
I would agree, because we now have the label for missed SLOs, and you can just query by that label now. And new charts are being added into the native GitLab Insights as we speak. So...
A
I do want to show this new chart. This just took Mark an hour to do, so we went ahead with the experimentation, but we also figured out that we could just configure this in native Insights today. So there's now a flight plan for the quality dashboard that I have outlined, and this is one of the new charts that we'll move over. So, good news: very few P1s. Bad news: a huge tail of P3s, and moderate P2s. Where are the P4s? We did not detect P4s.
A
Cuz,
that's
gonna,
be
even
more
because
you
can
go
into
infinity
that
language
says
that
it's
120
days
plus
unless
we
can
come
up
with
120
to
200
days
and
then
and
then
reevaluate
their
it's.
It's
just
not
worth
looking
into
right
now.
I
would
say:
okay
charts
here
we
migrating
it
over,
and
this
will
go
away
probably
sometime
next
week
and
you
would
see
the
equivalent
of
this
chart
in
the
in
the
product
itself.
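The open-ended "120 days plus" bucket discussed above is what keeps the axis from stretching toward infinity. A minimal sketch of that kind of aging histogram, assuming hypothetical issue records rather than real tracker data:

```python
from collections import Counter

# Hypothetical open bugs as (priority label, days open) pairs.
open_bugs = [("P1", 12), ("P2", 95), ("P3", 150), ("P3", 240), ("P4", 40)]

def age_bucket(days):
    """Everything past 120 days collapses into one open-ended bucket,
    instead of extending the chart indefinitely."""
    if days < 30:
        return "0-29"
    if days < 60:
        return "30-59"
    if days < 120:
        return "60-119"
    return "120+"

counts = Counter((prio, age_bucket(days)) for prio, days in open_bugs)
print(counts[("P3", "120+")])  # → 2
```

Splitting the tail later into 120-200 and 200+ would only mean adding one more branch to `age_bucket`.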
A
One second, let me stop sharing and go back to the agenda. Okay, this is a sensitive one, and I wish Jason was here. It follows from how we want to deprecate the old team labels, and I think there are discussions on: what if a team member from a group contributes to work in another DevOps stage? That may cause it to be misaligned, because, hey, my team member is in this stage group and we just want to help out.
A
It might be mismatched if they changed the group label to their own group. For example, it may be Dependency Scanning, which is in Secure, but originally the work was in the DevOps Plan stage, and there will be a disconnect there, because the work contributes towards the Plan stage but is done by someone in the Secure group, for example Dependency Scanning.
A
Should we revisit this when Jason's back here next week? Yeah, I think so; I think it cross-cuts many areas. Okay, I'll update the issue there, so the team... no, this came up as part of the discussion there. Okay.
A
A cluster of these is outdated: it's as if an issue is labeled, let's say, scheduled to a "Plan team," for example. Such cases include both, and this is already incorrect, because Plan is not a team. And yeah, that speaks to team structure and nomenclature, and I know we're big on that, because we need to get it right. Yeah.
A
[unintelligible] to ground what they really need and what's nice to have. They did give us some headroom here: if we're at 75% of data collected, that's okay, which I think is good. But I don't think it's realistic for us to go back and label every undefined MR for two years. We have a lot of things to do, and that wouldn't give you much value. I think we could guarantee that they will be undefined indefinitely, because we don't go and change them, and the number should be static.
B
To understand it at a broad level: test against the source to get the full list of MRs and make sure we didn't miss putting in MRs. [unintelligible] What's the status, respectively, of the undefined MRs? Yep, that's the one. Okay, yeah, I would just say that.
B
Yeah, so I just wanted to update the team. I've started to put together a spreadsheet around missed deliverables, and this is not too surprising. I did it by the whole teams; I figured that was currently at the listing, so I'm having some data integrity issues. But in the process of doing that, I started to look at both, focusing in particular on Create and Verify, to try to understand better what's going on there.
B
One
thing
I
noticed
is:
is
that
on
both
of
those
teams
they
have
issues
basically,
for
you
know,
say
three
releases
in
the
future
and
I'm.
Trying
to
best
understand
is
this:
is
this
a
issues
keep
piling
up
so
what
happens?
Is
they
just
go
to
the
next
release
and
then
they
pile
up
and
or
is
it
a
some?
Other
phenomenon
is
basically
going
on
here.
B
The
other
thing
I
noticed
is,
as
I
went
back
about
four
or
five
releases
for
each
of
those
teams,
and
you
see
some
high
variance
and
the
number
of
issues
like
when
one
really
still
have
five
or
six
issues
completed
the
next
one
will
have
40
issues
completed
so
there's
a
little
bit
of
work
to
be
done
there
to
better
understand.
What's
going
on
from
that
perspective,
so
so
no
real
summary,
but
just
wanted
to
make
sure
that
folks
were
kind
of
work
in
that
aspect
of
it.
A
Great, because I also want to point out that in every team retro (oh, we should call it group retros) I see engineering managers pull in the issue-tracker query, and they will list the missed deliverables for that release, which is good. But I think, yeah, having a guideline on how many missed deliverables we are taking on per release, and how many more we are expected to incur, I think that's a great way to take it on. Do you have an issue for that?
A
Yes, so I have an issue for that, on cleaning up how we're doing it. It's called the next iteration of closing out milestones. So we're in a limbo state where, before it's closed, we tag MRs with the missed-deliverable label at feature freeze (the 7th), let's say the old way of doing things, and then all generic missed items are tagged after the milestone is closed. The next iteration is we would just do everything on the same day the milestone is closed, whether it's an MR...
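The single-day rule proposed here (tag everything that missed when the milestone closes, instead of splitting MRs at feature freeze from issues after close) can be sketched over in-memory records. The label name and the item fields below are illustrative assumptions, not the team's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class WorkItem:
    title: str
    state: str                 # "merged"/"closed" vs still "opened"
    labels: list = field(default_factory=list)

def close_milestone(items, label="missed:deliverable"):
    """On the day the milestone closes, tag every unfinished item in
    one pass, whether it is an MR or an issue."""
    for item in items:
        if item.state == "opened" and label not in item.labels:
            item.labels.append(label)
    return [i.title for i in items if label in i.labels]

items = [WorkItem("Add heat map", "merged"),
         WorkItem("SLO detection", "opened"),
         WorkItem("Triage report", "opened")]
print(close_milestone(items))  # → ['SLO detection', 'Triage report']
```

Running it once per milestone removes the limbo state: nothing is tagged early at feature freeze and nothing is left to tag afterwards.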
B
Cool, any other questions on that one? All right. And then the next one is, I've run through the exercise of basically getting teams to label; there's only one outstanding team. I think we went from roughly over 300 to under 180 unlabeled items, so that's a solid improvement, proportionately. This is just for MRs. We still have a good number of them that appear to not have a team label, which we'll have to work to address, based on that bit of feedback.
A
Is that correct? Yes. So, we have some enhancements that will help make it easier. There are people, like Stan, that basically do everything, and they don't map to a team per se, and I think there are some heuristics where we could maybe help with auto-labeling: find a pattern, then just go and run the script. So you had to go through 200 MRs? Yeah.
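The auto-labeling idea mentioned above (find a pattern, then run a script) could start from something as simple as an author-to-team mapping. Everything below, the names, the labels, and the mapping itself, is a hypothetical illustration.

```python
# Hypothetical author-to-team mapping used to backfill team labels
# on MRs that are missing one.
AUTHOR_TEAM = {
    "alice": "team::create",
    "bob": "team::verify",
}

def suggest_labels(mrs, fallback=None):
    """Return {mr_id: team label} for MRs without a team label,
    based on who authored them; unknown authors get the fallback."""
    suggestions = {}
    for mr_id, author, labels in mrs:
        if any(l.startswith("team::") for l in labels):
            continue  # already labeled, nothing to do
        team = AUTHOR_TEAM.get(author, fallback)
        if team:
            suggestions[mr_id] = team
    return suggestions

mrs = [(1, "alice", []), (2, "bob", ["team::verify"]), (3, "carol", [])]
print(suggest_labels(mrs))  # → {1: 'team::create'}
```

Authors like "Stan" who work across every team would simply stay out of the mapping (or get an explicit fallback), so the script never guesses where a pattern doesn't exist.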
A
Before we go to the exit criteria, I think cleaning up the MRs and how we label missed deliverables would be key, so I'll jump on that with the rest of the team. Hopefully we get something in by the cutoff for next week. And then, other action items that I think are important: I don't think there are any; we're making good progress towards closing this out. With that, I will again share my screen and we will grade the exit criteria live.
A
Folks
see
my
screen:
yes,
okay,
okay,
I'm,
taking
this
at
55%
21
increase
in
throughput
90,
no
change
here
from
last
time.
Let's
see
how
we
do
by
the
end
of
this
month,
define
KPIs,
that's
really
that's
basically
done
and
ensure
customer
bigger
Nick.
On
a
second
yeah
like.
Let
me
redo
this
again.
It's
a
the
resolution
is
not
that
good
one.
Second,
my.
A
Okay, cool. Ensuring all customer-facing bugs have a severity label: I'm taking that at 70. Severities are applied, S and P are applied, but we would like to close this out in terms of getting milestones attached to them as well. Time to resolve: this is already automated; this is done. And one second, my OCD twin brother is asking for this to be done, so... there.
A
I
like
that
as
well,
going
back
to
time
to
resolve
I
think
we're
still
monitoring
it
so
and
I
have
any
great
for
this
an
effective
into
the
current
charge
package.
So
we
have
trash
package
which
you'll
already
defined
we're
getting
about
we're
getting
binds
and
socializing
with
product
management
I'm
taking
as
a
50%,
all
the
code
and
services.
We
just
have
to
go
it
out.
This
includes
SLO
detection,
heat
map
and
every
new
focus
on
customer
effect,
customer
affecting
bugs
and
and
features
training
for
eons
and
pmc
use
proteins
very
more
effectively.
A
I
picked
this
at
30%.
I
have
a
curriculum
there
for
a
workshop
already
and
I'll
do
this
throughout
like
a
few
YouTube
sessions,
and
if
anybody
here
would
they
would
like
to
participate
in
this
session,
because
I
just
have
to
the
issue
in
an
arm
will
welcome
their
training
for
engineering
managers
and
KPI
and
interventions.
Last
time
we
were
too
much
sure
about.
We
would
like
to
update
anything
here.