From YouTube: CHAOSS.Common.August.20.2020
A: Thank you. So, good to go. So hello, everybody, and welcome to the CHAOSS community call. It is August 20th, 2020. We have a few things that I think we'd like to take a look at today. So if somebody could... I suppose I should get this going. Add yourself to the minutes; I didn't do that. Do you want me to screen share, or do you want to screen share them? I don't care either, so hold on; I'm just trying to get the minutes up.
A: So if you could add yourself: we've got a fairly light agenda today. I think the biggest thing was to take a look at open issues and PRs that we might have in the repository, and then also identifying any metrics that we want to put on the radar moving forward.
A: Well, why don't I go, Sean; I'll just share my screen here. No problem, hold on.
A: All right, so here's our repo. I did actually have a question, kind of, for Kevin. Kevin, these release note metrics, or release note issues: do we close these when we're done?
E: When I closed that, I immediately opened this Release Notes 2021 issue to capture the discussion for the next six months.

A: Okay, gotcha. So I can close these at release time, basically, but they should probably stay open until then?

E: No, that's no problem whatsoever.
E: The individual issues for the candidate metrics, yeah, I think those can be closed by the working group when they are ready to be released, or when it's through the comment period.
A: Right, sounds good. So, basically...
E: Then basically it's closed and just replaced by another one. And then, to Georg's point, when we open up the individual metrics for the comment period, that issue will stay open until the working group signs off on it, right before the release.

A: Okay, so that can be open six months, five months, four months as well? Okay, that sounds good. Thank you.
A: Usually, you know, you don't have it on here; it's on a different issue. I think it's in the Diversity and Inclusion working group.
A: There's a repo for that, isn't there? I don't know if we created it or not; I think Daniel did that.

C: Very well could be.

A: Let me take a look. I'm going to stop sharing for a second.
A: All right, so Sean has taken care of that. Hey, we're gonna close an issue! All right, so, discussion: moving on. Anyway, basically, for those of you that aren't familiar with this discussion (which is great that it's going to close), Sean and Daniel have been looking at a way to (and Sean, correct me if I'm wrong) track org affiliation that could be used by both GrimoireLab and Augur, so that both tools don't need to maintain it.
C: The last discussion, and that was a big discussion in Common about time zones, though it was a long time ago, was basically that we were using the local developer's UTC offset, which is usually in the git log, as a proxy. Some people are able to get IP addresses, and that's a better proxy, but I think GitHub is pretty restrictive about how much of that you can get; you can't really get it for every commit.
B: I think we have both. We have one field with local time and one field with UTC time.
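The UTC-offset proxy discussed here can be sketched in a few lines. This is a minimal illustration, not CHAOSS tooling: the `parse_utc_offset` and `offset_histogram` helpers are hypothetical names, and the offsets are assumed to be in the `+HHMM`/`-HHMM` form that appears in each commit's author date (e.g. the second field of `git log --format=%ad --date=raw`).

```python
from collections import Counter

def parse_utc_offset(offset: str) -> float:
    """Convert a git-style offset string like '+0200' or '-0530' to hours."""
    sign = -1 if offset.startswith("-") else 1
    hours, minutes = int(offset[1:3]), int(offset[3:5])
    return sign * (hours + minutes / 60)

def offset_histogram(offsets):
    """Count commits per UTC offset: a rough proxy for contributor time zones."""
    return Counter(parse_utc_offset(o) for o in offsets)

# Hypothetical offsets as they might appear in a repository's git log
commits = ["+0200", "+0200", "-0500", "+0000", "-0530"]
print(offset_histogram(commits))
```

As the discussion notes, this is only a proxy: the offset reflects the committer's machine configuration, not a verified location.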
C: All right, yeah, I don't think... I don't think this one's closable.
B: I think it is closable, and I'm writing a comment right now. I'm just copy-pasting some... let's see.
A: My concern is the number of metrics we have and their utility. I think we're getting to the point, and this is an example, where we need some kind of index of questions answered, because I think there are, right? There are questions that people ask that are answered by a particular way of implementing a metric, but the name of that metric may not always make it crystal clear that you can answer that question using that metric, and this is a good example of that.
A: So, an index of metrics that's driven by the questions being asked, and this can help point people to particular metrics.
C: Okay, yeah, because to be an index means essentially like a book index: if each metric is a short chapter, then the index lets you look at more granular things. So you may not think that chapter is about what you want it to be about, but if you look in the expanded index you can find what you're looking for. So it's a way of navigating at a lower granularity, or perhaps even a higher granularity, if your question requires several metrics.
E: So one of the things I've been struggling with lately is that, the way we deal with metrics, they kind of belong to certain working groups, and I don't really think individual metrics should. I mean, a working group is obviously working on defining a metric, but that doesn't necessarily mean that working group owns it. We talked about this a little bit the other day in the Risk working group and in the community working group: we don't have a way of handling duplicate metrics across working groups, where a metric is very important for this focus area, but it's also very important for another focus area, and it's possibly also very important for another working group.
C: And so, for example, I've asked for a different category on a metric. I forget what metric it was; I think it might have been commits... it was lines of code. Lines of code came up as a metric that's important for some risk evaluation, but obviously it already exists in Evolution.
C: So, in the spreadsheet, I would like to indicate that we've identified this as a risk-relevant metric, but we're not creating it because it already exists. It's like a pull request that doesn't get merged, effectively, but we still want it indexed with risk, so that if people are looking at risk they could see: these are metrics that pertain to risk, even if they're not risk metrics.
A: Can I make a comment? I think we have two things we're talking about here. The first was the creation of an index that would help people locate metrics quickly, irrespective of working groups. Has anybody ever done that work before? Because I totally agree with you, Sean.
A: I think that's a good idea, particularly as our metric set continues to grow. We do have filters, and right now the questions that we assign to a metric are, to Georg's point, kind of high-level questions; but based on how you might implement a metric, you could answer slightly different questions. So, super interesting, and I 100% agree with you.
C: We could hire a library science student. That's what I've done before: when I've been in iSchools, which I was for a long time, I just hired a library science student to do this kind of work, and they're actually very good at organizing things.
C: I mean, you know, you've probably got students in your program who have that interest or specialization. I've got students across the street with that specialization at the master's level that probably could help us out part-time. Obviously, Kevin and Georg and you and I and John, and probably everybody on this call, is capable of this.
A: I'm not capable of it, because I don't know what it would look like. I don't understand how, given the current structure that we have, we could create an index that would help people answer specific questions and then present them with information that said: here's the metric, with these filters.
C: Part of how we're going to get there is this survey that we're releasing, where people ask for the community report, because we're asking them what questions they want answered. And as we go about answering those questions, we should keep a Google Doc that says: okay, this is a question that came up, and here are the metrics we used to answer it.
C: Here are the tools that provided useful answers. It can become a little bit of an FAQ, and then, if the Linux Foundation lets us, we throw a search engine into that part of our website.
F: I was just going to bring up the limitations that we do have with our WordPress install. I think Kevin should be the one to... sorry, Kevin, I just realized I'm volunteering you for this, but I think that you are the expert, so you would know what our limitations are as far as technology goes, what plug-ins are available, etc. So I think we should really use Kevin's guidance on some ideas.
B: I wanted to also point out that the App Ecosystem working group is working on this issue of how we know what metrics we want to pay attention to: completely leaving behind the structure we currently have, and thinking from "hey, these are the stakeholders, these are the questions and goals they have," and then identifying metrics that are relevant.
B: So work like that can help fill this in, and the survey that Sean mentioned can help us. And then, when it comes to building out the index, if the website does not provide the features we need, I'm okay with thinking blank slate: maybe get someone from library science, if we can find someone good there, to build the system. It doesn't have to be within our website.
C: Yeah, and the ones that I've seen or done that are useful on an ongoing basis do require some curation. It's not something that you do once and then you're done, because, you know, it's just not. So there's an initial effort, but then there's also maintenance on it; it's like software that way.
B: Yeah, if you look at that (oops, sorry, at the top), that's where we've gone from the goals and questions all the way down to the metrics and data.
B: But if we encapsulate goal, question, metric, and data, each one of those as one flash card in our index, and we find a good way to sort it... When we sort, we have one catalog that is sorted by question (think about physical flash cards in the library right now, you know), and another one that is sorted by goals, and we can find these. So...
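The flash-card idea above can be sketched as a tiny catalog of goal/question/metric/data records with several sorted views. Everything here is a hypothetical illustration (the card contents and the `metrics_for` helper are made up for the example, not the actual CHAOSS metric set):

```python
from operator import itemgetter

# Hypothetical flash cards: each encapsulates goal, question, metric, data source.
cards = [
    {"goal": "Sustainability", "question": "How fast are bugs fixed?",
     "metric": "Time to Close", "data": "issue events"},
    {"goal": "Activity", "question": "How much code is changing?",
     "metric": "Lines of Code Changed", "data": "git log"},
    {"goal": "Activity", "question": "How fast are bugs fixed?",
     "metric": "Issue Response Time", "data": "issue events"},
]

# Two "catalogs" over the same cards, like physical card drawers.
by_question = sorted(cards, key=itemgetter("question"))
by_goal = sorted(cards, key=itemgetter("goal"))

def metrics_for(question, catalog):
    """Every metric that answers a given question, across working groups."""
    return [c["metric"] for c in catalog if c["question"] == question]

print(metrics_for("How fast are bugs fixed?", cards))
```

The point of the sketch is that one set of cards supports multiple orderings, so a visitor can navigate by question or by goal without the metrics belonging to any single grouping.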
A: The index doesn't necessarily allow people to ask any question.
A: So it's in part creating the index, kind of like what is here, and in part creating the machine that can map the question that a person who comes to the index might ask, mapping what they want to know to what we have already indexed. Is that correct? Right, yeah.
C: No, no, I think the way we build them is working; I don't want to build them differently. I just think we need to recognize we're creating a lot of information, and the more that we create, the more we'll have to do to help people navigate it in order to consume it. That's all. It's like a natural evolution, I think, from the project, and it's good. It looks like the App Ecosystem group is thinking in somewhat that direction already.
A: Okay, so all of this discussion stemmed from Georg's two lines here, just to remind people as to why we were having that discussion in the first place. So I think this can close. Is that correct?
A: So I think this issue can close. Yeah, this issue can close, but it did unlock perhaps a need for being able to locate people a little bit better in the metrics, particularly as we start thinking about filters, because that provides different views and answers different questions.
G: I have one suggestion on the cross-referencing of the metrics in the working groups: we don't create a separate tag in our Excel sheet. We keep the same tag for what is already released as another metric and just write the name; keep the same green tag, which is "release," and then in the comments we write "refer to this particular working group for the detail of the metric."
C: The state is "released in a different working group," or something like that, and the color is different, so that the working group knows they don't have to work on it and that it exists in the outcomes of another working group. And if we do the within-working-group index, that will lead us naturally that way; it will almost naturally evolve into something we can eventually matrix.
C: Yeah, and I think previously we've been deleting lines when that happens, and where that has become problematic is that the same idea comes up again in the working group. So if it comes up and we identify a pointer to another working group, keeping that pointer in this list, I think, is going to keep us from walking over ground we've already covered.
G: Even if that metric is not released but still in consideration, we can keep it in consideration and point it to the... yeah, like, we don't create a separate tag for it, so the tagging will remain the same and we'll just have a pointer for that.
C: Well, it is, but, like, Risk and Evolution and Value and Common, which are the ones I participate in the most: if something comes up, there are usually more people than me in the discussion who remember that it's in a different working group as well, and I expect that to be the case for the foreseeable future.
G: Or there is one additional thing we can do, to that cross-referencing only: we put it in the particular working group's metrics that this is considered or released under this metric, cross-referenced in another working group, and we can create a single tab only for the cross-referenced metrics.
E: I think it's going to be helpful for... I'm sorry to interrupt you. I think it could be helpful for people who are visiting the site who are interested in code development metrics as well. I'm interested in code development metrics, and then, when I go there and I look, I see, well, there's only four there.
A: What's the value in that? The first one I get, and I also understand, I think, Sean's point of being able to identify that it's done. I think it also addresses the concern that perhaps two working groups are actually working on the same metric, and this way...
E: I think individuals are often interested in metrics kind of based on the focus areas that we've created, right? So I can envision a scenario where a community manager is interested in a certain set of metrics that are associated with code development. Code development is an Evolution focus area, and when you go there, there are maybe four or five metrics that are defined within that focus area, right?
E: Yeah, so if we started cross-referencing, a user visiting the site looking for code development activity type metrics would then see more than just those two metrics that are currently released. They could see, you know, all of the metrics that we've defined that relate to code development activity, sure.
F: But what might also be helpful is, if we see that two or three or the rest of the groups are all pointing to the same metric, then we will know that that's a candidate that should be moved over to Common. And I don't really think we have a process right now for officially saying "this thing belongs in Common," do we? Or is it just kind of something that we feel like everybody could use, so we move it to Common? But this would be...
C: What I'm saying is, we're starting to run up against the volume of information, yeah.
A: All right, these are two huge issues to me. So, Kevin, your concern is that right now this is happening too informally. Is that right?
A: Yeah, so as a possible way forward, Kevin: is what you envision something that needs to be done to this spreadsheet?
E: The reason I keep on mentioning the spreadsheet is because, once again, the spreadsheet is this one area where all of the working groups coordinate together. If there's another tool, then another tool is fine. We just need to...
E: We need to think more about how we are coordinating together as the CHAOSS community between the working groups, and how we're organizing the work that we do. So, in my mind, the working groups kind of own the focus areas, but the metrics kind of belong to everyone.
E: I do not, other than... other than...
C: Yes, and I will stop: I will never again recolor cells, and I will fix the coloring that I've done.
C: And at some point in the future, probably two releases from now, we may decide to release the metrics according to that sort of cross-reference as it emerges. I don't think we want to change anything for a year, but I think right here we might decide to include that cross-referencing in released metrics for working groups.
A: All right, did that close anything for us? No? All right. I think that was still actually related to Georg's two-statement issue... two-statement comment. All right, I'm...
A: And it took 40 minutes. All right.
A: Yeah, this is good; this is super helpful. All right, so we do have a new metric. I do think that's an action item for me that I haven't done, which is this, so sorry. Are there any, maybe in the remaining couple minutes, are there any other metrics on this list that kind of jump out to people, with respect to conversations that have come up in Risk or Value or D&I or Evolution?
B: One of the ones that I know people are interested in is time to merge: similar to time to close, but actually it might be just a filter on time to close, with "was this merge request or pull request closed because it was merged, or not?"
B: So we're back to this idea of creating an index or some form of referencing, saying: hey, if you want time to merge, implement time to close with these parameters, with these filters.
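The "time to merge is time to close plus a merged filter" idea could look something like this sketch. The record shape and the `time_to_close` helper are hypothetical, standing in for whatever a forge API actually returns:

```python
from datetime import datetime

# Hypothetical PR records; in practice these would come from a forge API.
prs = [
    {"opened": datetime(2020, 8, 1), "closed": datetime(2020, 8, 3), "merged": True},
    {"opened": datetime(2020, 8, 2), "closed": datetime(2020, 8, 10), "merged": False},
    {"opened": datetime(2020, 8, 5), "closed": datetime(2020, 8, 6), "merged": True},
]

def time_to_close(records, merged=None):
    """Days from open to close. `merged` acts as the filter discussed:
    None = all closed PRs, True = "time to merge", False = closed unmerged."""
    if merged is not None:
        records = [r for r in records if r["merged"] == merged]
    return [(r["closed"] - r["opened"]).days for r in records]

print(time_to_close(prs))               # time to close, all PRs
print(time_to_close(prs, merged=True))  # time to merge
```

One metric implementation, two metrics, depending on the filter: exactly the indexing problem being discussed.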
C: I mean, I'm just... like, burstiness is visible in all dimensions of software work. You'll see bursty pull request acceptance, bursty commits, and bursty issues; usually they circle around a release or the end of the year, but sometimes around other things. So I guess what I'm saying is, burstiness is a characteristic of how data is shaped when we visualize it, and I'm not saying it's... I don't think it's a bad issue, I don't think it's a bad metric to create; I'm just trying to think about how we would create it.
C: I think we should move it forward; I just wanted to mention that. Okay, I don't... I don't want to, because I think people will understand the burstiness of activity, and I think that may be a composite: it's not an atomic metric, nor is it a program or a project. I think it is a synthetic metric; I think it will draw on existing metrics and say something about them.
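One common way to make such a synthetic metric concrete (an assumption for illustration, not something the group settled on here) is the Goh-Barabasi burstiness coefficient over inter-event times, B = (sigma - mu) / (sigma + mu): it is -1 for perfectly periodic activity, near 0 for Poisson-like activity, and approaches 1 for highly bursty activity.

```python
from statistics import mean, pstdev

def burstiness(event_times):
    """Goh-Barabasi burstiness B = (sigma - mu) / (sigma + mu) over
    inter-event gaps: -1 = periodic, ~0 = Poisson-like, -> 1 = bursty."""
    gaps = [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]
    mu, sigma = mean(gaps), pstdev(gaps)
    return (sigma - mu) / (sigma + mu)

# Hypothetical commit timestamps in days since some start date.
periodic = [0, 7, 14, 21, 28, 35]        # steady weekly activity
bursty = [0, 1, 2, 3, 60, 61, 62, 120]   # activity clustered around releases
print(burstiness(periodic), burstiness(bursty))
```

Because it consumes timestamps that existing activity metrics already collect (commits, PR acceptances, issue events), it fits the "draws on existing metrics and says something about them" framing.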
A: Time, Sean.