From YouTube: CHAOSS Evolution Working Group 5/31/22
Description
Links to minutes from this meeting are on https://chaoss.community/participate.
A
So the two things I thought we had to get to today were the review of the metrics that need to be updated. Elizabeth, you indicated in our chat a minute ago that you have.
C
And I think with a lot of these evolution metrics, just because they're referenced so often and are such building blocks for other metrics and metrics models, even though the changes might end up being minor, I think it would be worth our while to just look at them and make sure that they're tight, you know.
D
Yeah, I was asking, like, you have highlighted the areas that need to be worked on within a particular metric, right? It looks.
C
It makes sense for maybe one or two people to take those three together; they were very similar.
C
Yeah, yeah, and maybe there's even one more, issues time I think, or issues open or something, that I didn't get to yet. That one may be the same as well; it's at the bottom of the issues list.
C
There also may be some philosophical questions to be asked here, in that those metrics had specific instructions for how to get that data out of GrimoireLab. So we have two choices: we can either continue to leave them in and just verify that they work, or we can take those out, because I don't think other metrics go to that level of detail. So that might be a question the group wants to answer as a whole, just philosophically: where we stand on how much detail to provide in the metric itself.
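[Editor's note: for context, the implementation detail being debated is roughly of this shape: a short, tool-specific query embedded in a metric page. A minimal sketch, using a hypothetical `issues` table purely for illustration; the real Augur and GrimoireLab schemas differ.]

```python
import sqlite3

# Hypothetical schema for illustration only; the real Augur and
# GrimoireLab databases use different table and column names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE issues (id INTEGER, state TEXT, closed_at TEXT)")
conn.executemany(
    "INSERT INTO issues VALUES (?, ?, ?)",
    [(1, "closed", "2022-05-01"), (2, "open", None), (3, "closed", "2022-05-20")],
)

# The kind of one-line query a metric page might embed:
# count of issues closed on or after a given date.
closed = conn.execute(
    "SELECT COUNT(*) FROM issues "
    "WHERE state = 'closed' AND closed_at >= '2022-05-01'"
).fetchone()[0]
print(closed)  # 2
```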
D
So I'm saying, anyone who is interested in collecting the data for that particular metric will definitely go to either Augur or GrimoireLab, install it on their machine, and then look around in it. So if they have gone through that much of the process, I think they are capable enough to extract this data out of it. That is my assumption.
A
I see a typo in this at the top line, but the other thing on my mind is that this isn't systematically provided by the other working groups. So does this content need to be moved to some other place and then referenced in this metric definition? And the first other place that came to my mind was, if we created a readthedocs.io site for the working group, it could include the metrics themselves and their definitions as well as these implementation details.
C
Yeah, that was kind of my feeling too. We have to go back and verify that all of this still works, but also knowing that we probably won't look at it again for another two, three, four years. So do we want to keep this in here, knowing that it's not going to get eyes on it for a long time again?
A
That's certainly the part that I have questions about.
C
There were actually two other questions with these metrics where I've noticed a trend. There was also a section that said "parameters", and in that metric I was getting confused as to what the difference between the parameters and the filters was, because I think we just call them all filters now.
A
Yeah, I had a power outage like half an hour before this meeting, but everything I have is on a UPS, so when the power came back I was right there. You know, I've owned this house for six years; I've got everything configured now. So: metrics, aggregators, filters.
B
So filters are part of the metrics template. I believe it's a subheading underneath Implementation.
B
I would say that they're probably inappropriate, but at the same time, a lot of the qualitative metrics have survey questions in them, and I don't know that there's much difference between an SQL query in this case and a survey question.
C
And then, sorry to keep context-switching on you all, but the third trend that I noticed in these metrics is that there's a block of synonyms that's in the wrong spot. You'll see it's under Data Collection Strategies; we just list a bunch of synonyms, especially with regard to the metrics around issues, like what is an issue. I know that we moved that up to the description, but it makes me wonder, because it's repeated in Issues New, Issues Closed, Issues Active, anything issues-related.
A
But this is actually critical for spanning different issue trackers.
D
Maybe it was in the references or somewhere we mentioned it; I have to look at the current template. It needs to be moved in the current template.
A
I was looking at it over here; I was under Issues Closed.
A
Yeah, I mean, it could also be incorporated into a handbook as a general definition kind of thing. It just depends on how granular and detailed we want that to be. There should probably be a section that's separate from terminology and really specific to technical, metrics-related definitions or something.
B
Yeah, I will say that for any glossary work that we do, the goal of that glossary work is probably not to describe the different terms that are used in different platforms.
D
So for the revision as it is assigned to me, I need to create a Google Doc, move the content over there, clean up all these things, and then bring it back here to the meeting so that we can all review it. That is the plan, right? No?
B
So these revisions are being done directly in Markdown; we're not creating Google Docs for them.
D
So if there is a huge change in the structure, like if you are removing SQL queries or doing some restructuring of the metric as per the new template, does that need to be done directly in the Markdown with a pull request created? That is why I was thinking we do all the changes in the doc, show it to everyone, and if everyone in the group agrees, then we can create a pull request and change the Markdown.
A
So this goes back to the first philosophical question, right, Bernard? Yes: should metric software instructions be in the metric or elsewhere? And I do think there is utility in having the metrics explain, or maybe not the metrics, but maybe some kind of linked documentation explain, how you get data for that metric from GrimoireLab and/or Augur.
A
And kind of big-picture, vision-wise, I think it would be useful if the metrics had, trying to think what I am trying to say here, it'd be useful if the metrics had, or if we developed, some kind of documentation that showed you how to get the data. And it would seem that, ultimately, that documentation might be pretty comprehensive and apply to multiple metrics, so that for any metric where you can get it from trace data, maybe we should be providing instructions on how to get it out of a platform.
B
At the very least, maybe a pointer to a tool or something.
B
We have an entire header section in the metric called Implementation, right? Implementations, yeah. So, what would you expect to find in Implementations?
B
I don't think we need to be comprehensive or definitive in any way, as if these are the only ways to do it, but I think giving some samples or pointing to some tools is completely within the realm of that Implementation section.
B
No, I don't believe so. I mean, I think it should be. It's not explicitly part of the template, but I think when we define metrics, tools providing the metric is something that, I mean, I think.
B
Maybe it is, maybe I'm just... where is the template? It's in the community repo, okay.
B
Yeah, I was thinking that same thing; those aggregators we could turn into... I think.
B
So, the way that most working groups use filters... those aggregators are all kinds of types of measurement, right? It's a count, it's a ratio. Whereas filters are usually actually metrics themselves, right? It's this metric in relationship to another metric, and you can use that metric to sort it, or to understand an issue based on the relationship between those two metrics.
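[Editor's note: the distinction described here, aggregators as types of measurement versus filters as metrics applied in relation to other metrics, can be sketched as follows; the issue fields and thresholds are hypothetical, chosen only to illustrate the idea.]

```python
from dataclasses import dataclass

@dataclass
class Issue:
    number: int
    state: str          # "open" or "closed"
    comment_count: int  # a second metric, usable as a filter

issues = [
    Issue(1, "closed", 12),
    Issue(2, "open", 0),
    Issue(3, "closed", 3),
    Issue(4, "open", 7),
]

# Aggregator: a type of measurement, e.g. a count.
closed_count = sum(1 for i in issues if i.state == "closed")

# Filter: one metric used in relationship to another, e.g. restrict the
# closed-issues count to issues that attracted discussion, or sort by it.
active_closed = [i for i in issues if i.state == "closed" and i.comment_count > 5]
most_discussed = sorted(issues, key=lambda i: i.comment_count, reverse=True)

print(closed_count)                       # 2
print([i.number for i in active_closed])  # [1]
print(most_discussed[0].number)           # 1
```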
B
Yeah, reactions is the... that's the emoji, yeah.
A
So there's actually a status on many platforms. On GitHub specifically, an issue can go from open to closed to reopened, so it wouldn't have an open status; it would have a reopened status, right, and then it would be ultimately closed again.
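[Editor's note: the lifecycle described here can be sketched as a tiny state machine. The transition table below illustrates only the open → closed → reopened → closed path being discussed; it is not an exhaustive model of any platform's issue states.]

```python
# Allowed issue-state transitions, as described for GitHub:
# an issue goes open -> closed, and a closed issue comes back
# as "reopened" (not plain "open"), then is ultimately closed again.
TRANSITIONS = {
    "open": {"closed"},
    "closed": {"reopened"},
    "reopened": {"closed"},
}

def advance(state: str, event: str) -> str:
    """Move to the next state, rejecting transitions the lifecycle forbids."""
    if event not in TRANSITIONS.get(state, set()):
        raise ValueError(f"cannot go from {state} to {event}")
    return event

state = "open"
state = advance(state, "closed")
state = advance(state, "reopened")  # never back to a plain "open" status
state = advance(state, "closed")
print(state)  # closed
```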
B
I still think those could be recontextualized as metrics. I mean, we're talking about closed issues versus open issues. A closed issue is kind of a metric; it's something you can count. Well, this is the.
A
Possibly. The criteria for closed, source code, and the issue of reopening, these are things that, as filter matters, I think are confusing.
A
I agree with "criteria for closed" possibly belonging under the definition. I just don't want to lose this information, because it's incredibly useful to anyone trying to implement these metrics.
B
No, I don't want to lose it either. I just want to make sure we have a clear definition of what a filter is, and that's part of what we're looking.
A
At. So in this particular case, I can see where aggregators might not need to be stated at all, and the parameters that are here, I think all four of them could drop under filters with some validity. I think the criteria for closed.
D
Yeah, I can move it to the Google Doc; that is not a big deal. But I was trying to confirm that this is the procedure we have, because I recalled that for a minor revision, some grammar or some typos, we don't need the Google Doc, but for major revisions we need to discuss it, like here.
A
Yeah, I can see some validity to that. I think I would leave it to the person who takes on a particular issue whether they want to start with a pull request or a Google Doc. In this case, I think this discussion probably covers the other two issues that are related to issues (so that's not confusing at all). Once we reach a consensus here, it probably can be applied across the remainder of the other issue-related metric issues.
A
So, okay, I think we got somewhere with that discussion, yep. I'll just leave it, so I'll just say.
A
All right, you can see what I'm typing there; I tried to represent what I think we just said.
A
I guess tagging you isn't working? Who knows; it doesn't work, doesn't matter. The other agenda item that we had is to revisit our discussion about focus areas, metrics, and metrics models, and I recognize that we didn't really... I think that it effectively got thrown back to us in the community meeting. Is that sort of your recollection, Elizabeth?
B
It's been thrown back and forth multiple times, and in the last discussion it landed on me.
B
I'm the squeaky wheel on this one, so I think it was basically a "put up or shut up" comment, and that's helpful. So, really, I can share the document I'm working on if you'd like.
A
Right, then I can... oh, have I already? No, I haven't. You haven't shared the link to the doc.
B
Some of these things are just kind of the way that I view them, and maybe we need a bit of discussion to land on what the definition actually is. The composite metric, I think, is one that could use a little bit of discussion.
A
I think "composite metric" is a synonym for "metric model"; I think they're the same thing. Yep.
B
So I think there is a case where a composite metric can be a proxy for a single point of measurement, where it's not necessarily a model, right? And then I kind of give the example of, if we're measuring...
B
So that would be the proposal for how we organize our work in CHAOSS: a kind of cross-cutting model, with the working groups as the verticals focusing on measurable phenomena, and then, across the working groups, we can connect these metrics and models to one another through focus areas, which are not measurable phenomena.
B
They would be a description of a context area. And then I'm working on a grocery-store metaphor for how we would present the metrics to users, using these verticals and horizontals and the addition of tagging.
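[Editor's note: the tagging idea, presenting metrics dynamically along working-group verticals and focus-area horizontals, could look something like the sketch below. The metric names, group names, and tags are examples only, not the actual CHAOSS tag vocabulary.]

```python
# Example catalog: each metric is owned by one working group (vertical)
# and tagged with any number of cross-cutting context areas (horizontals).
metrics = [
    {"name": "Event Locations", "group": "Common", "tags": {"event", "place"}},
    {"name": "Event Demographics", "group": "DEI", "tags": {"event"}},
    {"name": "Issues Closed", "group": "Evolution", "tags": {"issues"}},
]

def by_tag(tag: str):
    """Dynamically assemble a cross-working-group view for one context area."""
    return [m["name"] for m in metrics if tag in m["tags"]]

print(by_tag("event"))  # ['Event Locations', 'Event Demographics']
```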
D
More than a combination from the different working groups, from different context areas... like, I see a model, I don't know, maybe.
B
So, in that particular example, I pulled a focus area from Common called Place, and I pulled a focus area from DEI called Event. The common connection across those working groups is that context area we're looking at: it's the event or the place. And then different working groups have, actually, Common has a metric that is Event Locations, DEI has multiple event metrics, the metrics models working group is working on an event badging model, and then we also have the event badging initiative. So all of those things across working groups are kind of focusing on a specific context area, and that's what ties them together.
A
Okay, but I don't think a metric is owned by more than one working group; it's just relevant for more than one purpose.
B
So that's exactly the point of this work. The way that we present it is outside of this, and what you're saying is exactly true. This is purely a way of organizing work within CHAOSS, so that we have a shared understanding of what these things are: shared definitions, and the ability to organize the work across the platforms.
B
When we talk about presenting the metrics, we're really talking about tagging and dynamically presenting them to the users, because I don't believe that we have the ability to organize or categorize these metrics or models in a way that's going to be static, right? It's always.
A
Going to fall apart, I agree. So the tagging approach is really smart, and my only nit with it is that metrics models don't... they're not re... I just don't see the way that we're working on them as being long-term sustainable if it's isolated to a working group; I think the way of working has to become more fluid. But that's a discussion for another day, because we are out of time, we have the community meeting coming up in seven minutes, and I need a break. Okay, so Elizabeth, you can stop. I'm sorry, Kevin, to cut you off, but it has.