From YouTube: CHAOSS.Evolution.March.25.2020
A: All right, welcome to the CHAOSS Evolution meeting on March 25th. I'm going to make a request to all working groups that whoever is leading the meeting share their screen occasionally, or often, because on the videos it's helpful if the people who do actually watch the videos can see the screen that we're talking about. Yeah, for sure; there are often times it's just a bunch of us talking with no context. Yeah.
B: All right, so we have a lot of metrics right now that we're considering, like code review iterations; pull requests open, closed, and comments, which I think we discussed in a different meeting. It was the Common meeting that I was in where we decided we're calling those reviews; in the other places that was the naming convention for a pull request. Yes, different language in different places, so I...
B: Personally, I think... I hear questions about community growth. Which one? Isn't that... community growth is at the bottom. It starts at row 41, and it looks like there are new contributors and contributors of new issues that are pending, that might warrant being developed, and we're looking at opening.
B: Yep. Let's work on new contributors of commits or issues. Any preference for where we start? Well, you already changed new contributors of issues to in progress, so, yeah. Oh, I did? I thought I did both of them, too. I'll take that as a sign. Yeah, all right. Let's start there, then.
B: So "new contributors of issues" is the name we want? It sounds a bit awkward, but I'll go with it for now. "New issue contributors"? Can it... it is from "new contributors". Yeah, I guess that doesn't make sense, because "new issue contributors" doesn't make it clear that it's the new-contributor part that's important.
E: In the Common working group we have a metric called Contributors, and so if we take that as the foundation for identifying who is in the project, then we just need to filter by contributors who are on issues and look at those who didn't contribute before our certain time frame. So then we get new contributors to issues.
E: Yes, exactly. It's a subclass of Contributors, and then you see, under filters, there is "issue authors". So we apply the filter "issue authors", and then we have another parameter somewhere, where we have another filter saying no activity before whatever our time frame is. So if you want to see new issue...
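The filter chain described here (start from the Contributors metric, filter to issue authors, then exclude anyone with activity before the time frame) could be sketched roughly as below. The event records, field names, and function name are made up for illustration; this is not the CHAOSS reference implementation.

```python
from datetime import date

# Hypothetical activity log: (contributor, activity_type, date).
events = [
    ("alice", "commit", date(2019, 11, 2)),
    ("alice", "issue",  date(2020, 3, 10)),
    ("bob",   "issue",  date(2020, 3, 12)),
    ("carol", "issue",  date(2020, 2, 1)),
]

def new_issue_contributors(events, start, end):
    """Contributors whose first-ever activity falls inside [start, end]
    and who authored an issue inside that window."""
    # Foundation: the Contributors metric (everyone, with first activity date).
    first_seen = {}
    for person, _, day in events:
        if person not in first_seen or day < first_seen[person]:
            first_seen[person] = day
    # Filter 1: issue authors within the window.
    issue_authors = {p for p, kind, day in events
                     if kind == "issue" and start <= day <= end}
    # Filter 2: no activity before the window, i.e. first seen inside it.
    return {p for p in issue_authors if start <= first_seen[p] <= end}

print(new_issue_contributors(events, date(2020, 3, 1), date(2020, 3, 31)))
# → {'bob'}: bob is new in March; alice committed back in November, so she
# passes the issue-author filter but fails the no-prior-activity filter.
```

The same shape works for "new contributors of commits": keep the foundation and the no-prior-activity filter, and swap the activity-type filter.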
H: Is this related to the new contributors metric as well? Yeah, it's the same. The generic... so in that one we just said, at the very last line of the description, "this is a specific implementation of the Contributors metric." So maybe, because this one has an additional filter, it makes more sense to call it a subclass, but...
B: Do we want new contributors of issues and new contributors of commits to simply be filters on our new contributors metric? Oh, oh, I see what you're saying. In other words, do we really need two additional metrics, or can they be filters on this existing metric? That's mostly what you're saying, and I don't have an argument either way. In one respect, adding it as a specific metric makes it more clear to a person who's new to the project that this information is a specific metric.
A: Purely anecdotally, one of the criticisms that I hear about the CHAOSS project as a whole is that it's a lot to take in. The more that we can kind of keep that volume of noise down, the better. Mm-hmm.
H: I think doing it in this way, a filtering system, is also a good way of showing people that we collect data for things besides what they might have originally been looking for. It might give them new ideas. Just seeing this list of everything that we would, you know, consider an indication of engagement might help them think of new metrics themselves.
H: I'm in the middle of adding it to the metric ribbon. We're just going to put it in the same place, under filters, right? Yeah, okay, I'll just get rid of...
A: You know what I mean? They just have, like, some face validity to them, right?
B: Okay, let's see if there are any others: reviews open, closed, comments. Yeah, let's check on reviews and see if those filters are actually even in there. By actors and by groups of actors, and we had open/closed. So we would probably issue a change to this metric.
B: I saw the things that he crossed out: he took a look at reviews open, closed, and comments and suggested that they are filters on reviews. And so then I opened the Reviews metric that we've already released and checked to see if they were filters already, and they're not. And so I think this would... if we do agree that those are really filters on reviews.
B: In this case, it's a little less clear to me, because people ask the contributor question and they want it kind of broadly construed. In many cases, when it comes to pull requests, projects do specifically want to know the number that are open and closed and the comments on them, and I agree that they could be filters. I'm just debating with myself if, in this case, it is helpful or too complex.
A: I have a question on this. Look at this list: down below, under code development efficiency, we have reviews accepted, declined, and duration, mm-hmm, and those are already released. Above, we have reviews open and closed and comments, or open and closed, I should say. At what point is something a metric, like it should be its own metric, and at what point should something be a filter? It's just, yeah, the line in between, which I'm not sure I've ever thought about.
B: Me either. Like, I think for new contributors and contributors it made a lot of sense to make them filters, but we do get a lot of questions, and I guess it's different with the people that we're working with right now. They want to know how many reviews are open at a point in time, how many reviews were closed, the ratio between them, the number of comments. Reviews duration is actually pretty common.
A: So at that point, then, we would just have kind of the highest-level activities as being metrics: reviews, forks, and everything. The activity on those would all be captured via filters. But it appears, listening to you talk, Sean, that there are certain of those highest-level items, like reviews, where putting that granularity in a filter may be too much.
B: It might bloat it. Like, when you talk about reviews accepted and declined, or open and closed, there has to be some definition of what it requires, but it's essentially the state of the review, so in that sense it could be a filter. Accepted, declined, and duration, those are pretty detailed, specific calculations, and I think they make sense. Open and closed, I think they could be filters on reviews, because it's really pretty easy to describe. I'm thinking also about how much it bloats the individual metric.
B: Okay, and so this is getting complex. We do have one under reviews, so we're going to take a mid-period release for that one that's under review, when the review period finishes. So if we edit master directly on reviews, then whatever changes we make to reviews are also going to get pulled when the review period for new contributors ends and we add the tag, right?
B: And so, if you create a tag, it's from the latest commit, and if we're editing reviews and those edits are, you know, before the latest commit... basically, if the edits to reviews on master are before the commit that's current, which they would be, then they would be included in our release, and they wouldn't have gone through the review period.
H: Anything that's up for review goes to the under-review branch, and then, as soon as it's ready, it gets pushed to the release branch, and then we only ever build tags from the release branch, even if it's not master. That way, even if something is under review, we have snapshots of the repository where every metric is always in some form of completion and never in limbo. But if the website can do two different tags, that's probably the easy solution, so just go with that.
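The branch flow described here can be sketched with plain git commands in a toy repository. The branch name `release` and the tag name are assumptions for illustration, not the actual CHAOSS repository layout.

```shell
#!/bin/sh
# Sketch: work lands on master; tags are only ever cut from the release
# branch, so a tagged snapshot never contains unreviewed edits.
set -e
git init -q -b master demo_repo && cd demo_repo
git config user.email "demo@example.com"
git config user.name "demo"

git commit -q --allow-empty -m "initial metrics"
git branch release                       # the only branch tags are built from

git commit -q --allow-empty -m "edit reviews.md on master"

# Review period ends: promote master to release, then cut the tag there.
git checkout -q release
git merge -q --ff-only master
git tag v2020.03

# Later edits on master stay out of the tagged snapshot until promoted,
# so everything in a tag has finished its review period.
git checkout -q master
git commit -q --allow-empty -m "new edit, still under review"
```

An under-review branch would sit between master and release in the same way: merge it into release only once its 30-day period ends.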
E: Yeah, we've thought about doing feature branches for all the metrics in the past and then merging it all together, and it sounded like a lot of merges. For CHAOSS, that was too much complexity, because everything was shifting and moving all the time. Working in master, everything is in some form complete or not complete, but we don't have the overhead of managing multiple branches and forks and whatnot, and so that's why everything is in master, even if it's not adhering to "yes, it's ready for the release."
H: I was just saying, for the new contributors one, even though I copied in some new stuff there, it was what the link was pointing to, so it's not actually any new content. So for that one I don't think we should review, but there were areas where I definitely think that we should; one was currently under review.
B: Yeah, so I guess the working group maybe has the authority to decide if it should go under review, or if it's not a material change, the non-housekeeping style of change. If it's not a material change, then maybe it doesn't, because literally, if I edit reviews here, I can just put, under filters, "opened and closed". Basically, yeah.
B: It seems a bit much to me. Yeah, in this case it really does seem like a 30-day review is a little absurd, because no one's going to disagree. The DCO sign...
H: Also, in this particular case, I think, since adding that filter does essentially eliminate two of the metrics from the list entirely, that probably should go up for a 30-day... When the actual removal of those metrics happens, whatever PR does that should maybe go under a 30-day review. But, I mean, at that point we're fundamentally changing the structure of at least one metric by getting rid of it, right?