From YouTube: CHAOSS.Common.May.28.2020
B: So I'll run through the agenda and then we'll go through each of the items in turn. I'm going to need to recruit someone to run the next meeting, because I'm not going to be available. We'll go through our action items and notes from the previous meeting. I know I haven't done mine, so this should be fun. We'll review any PRs and issues, look at progress on the continuous release for the Time to Close metric, and we'll have a look at the metrics spreadsheet and then figure out.
E: We could review it right now. I'm going to put it out there and then, if there are changes, the continuous review process would allow people to suggest changes. To Georg's point: do you want to share it? I don't have it handy. Can someone share their screen, so we can do that really quick?
D: Those that are now in the spreadsheet — I don't know if this process is kind of an official thing; maybe it's not the right place — are the ones under "time", and I'm just highlighting them under "considering". So here: time with the reviewer, waiting iterations, time to merge. Time to merge in general, and the time to merge to master. So these are the ones that we are missing, at least. The other two that we were working on are time to close and time to first response. Yeah.
G
So
I
was
going
through
this
article
and
if
the
idea
came
across
the
lot
of
open
source
projects
are
doing
a
proper
planning
there.
But
to
do
this,
there
were
cool
items.
Take
them
a
formal
planning
shatter,
so
one
of
the
metric
can
be
like.
Can
we
see
all
the
open
source
projects
are
doing
proper
planning
or
this
handing
a
dog?
This
is
okay,
the
idea
came,
we
did
up
develop
something
or
I
thought
they're
doing
it
in
a
proper
plan.
We
do.
C
You
can
demonstrate
these
downstream
to
other
people,
and
people
in
your
upstream
can
demonstrate
them
to
you.
Sean
and
I
have
talked
to
Kate
Stewart
in
risk
about
about
this
about
exactly
this,
so
actually
taking
a
look
at
whether
or
not
upstream,
vendors
are
open
chain
compliant.
I
think
it's
what
the
term
that
they
use
as
a
measure
of
a
risk
reduction.
B: Sounds like if they're already talking about risk, maybe it makes more sense to have that conversation as part of the Risk working group and see if it's a better fit to roll it into the stuff they're already working on.
B: Okay, so were there any others that we wanted to talk about? We talked about the issues in a lot of detail last week. Were there any others that people wanted to chat about?
B: I can't figure out how to stop sharing... okay, there we go. Okay, progress on the continuous release for the Time to Close metric — I think we kind of talked about that. There's an issue for it, and Kevin sent a reminder to the mailing list. It looks like nobody else has sent us any feedback, but it looks like that's kind of on schedule until it closes.
B
Okay
said:
we
want
to
talk
about
metric
release
schedule.
Just
somebody
want
to
chime.
B: I must have closed it. All right, so if somebody else wants to look at it and merge it, go for it. Okay, there we go.
C: Georg's point is that there's a four-week comment period ahead, but we also needed an additional week just for Kevin to do the work to get it out of the repos and onto the website. So it's basically five weeks out of that. So the working groups' period would end whenever — the 24th or 23rd of June, okay.
B
That
well
just
saying
I
once
I
get
then
there's
another
deadline,
which
is
the
actual
metrics
freeze,
which
is
when
we
have
to
have
all
the
comments
incorporated
is
a
week
before
that
August
1st
deadline,
so
that
Kevin
could
get
everything
on
the
website.
So
this
need
to
make
sure
that
everybody
understands
like
when
they
need
their
draft,
for
you
know
for
the
review
period
to
start
when
they
need
to
have
all
of
their
comments
incorporated,
because
Kevin
needs
to
have
a
frozen
version
to
put
on
the
website.
Yes,
okay,.
C: I was going to say that, in terms of the volume of metrics, even two or three per working group is great, because with a release every six months that ends up being about ten metrics released per cycle — twenty over the year — and that's a pretty good volume. Personally, I kind of like not releasing like 70 metrics in a cycle, so I think it's fairly easy to keep track of.
B: Okay, so we have two that are pending: Contributor Location, which is the one that John was working on — sorry, this doesn't really belong in the review of progress, but if we get a chance, let's talk about that as a metric — and Contributor Employer Location, which was kind of along with that one. I'm not sure if he's also been working on that.
D: I need to merge... I don't think we are considering time to close — time to close is for everything or anything; this is time to merge, and this case is a different time. So we can consider time to merge as either the time for our code review process — like the whole code review — or, even more, adding some other extra times, as for instance the time to merge itself. So you can have the approval, but then there is a certain machinery time until this is merged.
D: So then we can have all of this time, and this might be useful to look at. Once we have, let's say, defined time to merge, then we have the other one on top, which is time to merge to master. It's the same code review process, but then, after this approval, it is merged into master — just kind of another time. And then, kind of related, there are these two, which are in...
D: ...rows number 27 and 28: time waiting for some interaction, or time waiting for your action — sorry. Someone is reviewing this, and then perhaps that person asked for certain minor changes; so then this is time waiting for the submitter. So, first of everything, if I have to choose or go for something, I'd go for time to merge, or even these smaller times. Those are the two I'd go for.
B
Do
see
that
as
separate
from
I
mean
time
to
time
to
close
this
one
way
of
looking
at
it
time
to
merge
is
kind
of
a
specific
case
of
time
to
close
I
think
but
probably
specific
enough
that
it
makes
sense
to
do
it
as
a
separate
metric.
It
might
be
worthwhile
in
the
references
may
be
pointing
to
the
time
to
close
metric.
D: The only thing there is — in this case, time to merge is kind of the model for the rest of the metrics, because if we think about the code review process, we can split that whole time into different slots, for instance. So we can use different perspectives there and try to work out how to apply new filters to that. Just an example here: we can have a time to merge with three iterations.
D: It's a metric that we can split into others. I don't know how we are doing this in the Evolution working group. So that's all — and then, how might this be related to a specific filter that you mentioned might help? Do we need to keep applying our filters to these, or how do you see this relation? How do we play with all of this?
C: So, if I followed the question — and I think I did, and it's a good question — as we have common metrics, whatever they might be, whether it's time to merge, time to close, organizational diversity, or contributors, those may very well show up as filters in other metrics. Is that right?
E: We've also created specialized versions of common metrics in Evolution — like a subclass, for lack of a better term, of common metrics in this case. If we develop these metrics, we'd just use them as a basis, and I think we want to at least reference or subclass the metric Review Duration, because I think it essentially falls under most of the issues related to the burstiness issue that a whole group of... I clicked on the wrong one.
B: Personally, I don't have a strong preference. I tend to be pretty pragmatic about stuff like this — whatever seems to work in the situation that we're in seems reasonable to me. If there's something that we need to define, we can spend some time defining it. So the question is: do we need to?
C: So I like the idea of cross-referencing — whether it's as a filter, which is kind of what Common provides, or as a subclass. I think that's a really good idea. I don't like the process of doing it; it just sounds like an absolute mess on two fronts. One is explaining it to people.
C: It sounds very difficult, and part of the CHAOSS project is that there's a lot of information, and the more we intertwine it, I think the harder it is to approach. And then there's just maintaining cross-references like that — it reminds me of the web in like 1999, maintaining URLs, and we get some MIDI sound on another website. Yeah, it's just going to be...
B: Okay, so it's a quarter till — we should probably try to end in about five minutes. Do people have thoughts on next steps? We're obviously not going to resolve this by the end of this meeting. Is this something that some of you want to take back to your other working groups and have kind of an informal discussion about what the other working groups think, or do we want to take this into the weekly meeting and tee up a discussion there?
C: No, no, this is really good. It is important, because a lot of metrics do kind of butt up against other metrics, and they are related. If one metric is using the term "organizational diversity", they should actually mention that it's already been defined here. So if a different working group is heading in that direction, it's probably important just to kind of say: listen, this has already been defined, so please use this definition.
C: I think, as we start releasing more and more metrics, process-wise I'm with Dawn: if something hasn't been changed — if there's no comment — it just rolls into the next version. But if a filter has been added, or a sentence has been added in the objectives, then we do roll it back, even just to provide a review on that one sentence.