From YouTube: CHAOSS Metrics Models Working Group 4/26/22-4/27/22
Description
Links to minutes from this meeting are on https://chaoss.community/participate.
A: Welcome, everybody, to the Metrics Models Working Group meeting. Today is April 26th/27th. I think we're going to revisit some things today that we had from the Asia-Pacific call last week; I think there are still some discussion points there as well. So I'm going to go ahead and share my screen. Can somebody put the minutes in the chat really fast?
A: There we go, so thank you. The first really quick thing is largely for Yehui, or folks that are interested in the conversion rate metric. We're done with the Google Summer of Code applicants; all the applications are in, and same with Outreachy. So at this point we have to decide who we want to select for the mentorship program.
C: I don't know if you saw my Zoom message, but the reason I asked you is because a couple of the proposals are actually around this metrics model that you've proposed.
C: And one of them is building it in GrimoireLab. I've asked George if there's a GrimoireLab person who could co-mentor, just because I don't know how well you've mastered GrimoireLab, so I didn't want to automatically throw that all on you. But if you're comfortable with it, I won't worry about it. Usually GrimoireLab folks are super about helping during Google Summer of Code.
C: It's okay, yeah, that's important!
D: Yeah, okay, sure. Just tell me how to attend and join the whole thing, how to handle the process. I'm happy to contribute my time.
A: ...on the call. But what we do is: there are a large number of applicants this year, so usually we just ask the mentors to provide maybe a top four or five that they have an interest in, so that we can get the list down to a manageable group that we can talk about — so we're not talking about maybe 20 or 25 people. And then from there...
A: Okay, well, great. Thanks, Yehui.
A: All right, the next thing is the metrics model step-forward proposal, which we talked about in the Asia-Pacific call. These are just notes from the Asia-Pacific call, so I think there are a few things that we still need to sort out. I think the step-forward proposal was received really well in the Asia-Pacific call last week, and so, June...
F: Okay, could we just directly ask questions and...
F: Okay, I can quickly go through. I just want to show our steps. We have five steps, and the most important one is step two, the metric step two — in the picture it is picture three. So I just want to show that our current position is metric step two and metrics model step one.
C: I think I've seen a more detailed version of that graphic that explains it more completely. Maybe it's in the issue for Google Summer of Code; it's somewhere — I've seen more detail for it somewhere.
D: If you step into the last slides, we can see how to transform from metrics model step two to step three — this slide right here, the left picture. So maybe, June, you want to say something?
F: ...overlaying, organizing, and our original data. And we don't have online visualization; maybe we could use Augur or some notebook. So we added three parts to implement our metrics model step two.
C: So we could — I know, I'm...
D: Sorry, yeah. I'll show you. Like how some visualization using the notebook was introduced into the metrics model — we really love this idea, so we want to do some enhancement. That means we introduced this algorithm definition in the notebook, and the data insight; I mean, not just to show the visualization.
D: We provide some analysis on each single metric result with a real-world community data set, so that we can quickly verify how this metrics model's final result works. The top-right corner is showing the basic structure we set up; the last Asia-Pacific meeting gave us a demo of how to do that.
A: ...header, so yeah, and here's the community activity — this was the proposed structure. Okay. So, Sean, you can comment on this, when it comes to...
C: So the data would be, for example — and Daniel and I have discussed this — the data could be GrimoireLab data, it could be Augur data; it's not relevant where the data comes from, but we could decide on delivering it in a JSON format, regardless of the origins. That seems like a smart idea: we wouldn't have dependencies on a particular tool, but we could actually have shared data.
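As a sketch of what that tool-agnostic JSON delivery could look like — the file name and field names here (`origin`, `metric`, `records`) are illustrative assumptions, not a CHAOSS-defined schema:

```python
import json

# Hypothetical normalized record set: the same shape whether the data
# was pulled from GrimoireLab, Augur, or anything else.
payload = {
    "origin": "augur",            # provenance only; consumers can ignore it
    "metric": "change_requests",
    "records": [
        {"repo": "chaoss/augur", "week": "2022-04-18", "value": 14},
        {"repo": "chaoss/augur", "week": "2022-04-25", "value": 9},
    ],
}

# Persist once; every notebook downstream reads this file instead of
# re-calling a tool-specific API.
with open("change_requests.json", "w") as f:
    json.dump(payload, f, indent=2)

with open("change_requests.json") as f:
    data = json.load(f)

print(sum(r["value"] for r in data["records"]))  # total over the window
```

Anything that can emit a file in the shared shape can feed the same downstream notebook, which is the "no dependency on a particular tool" property described above.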
D: We definitely need the support from Augur or GrimoireLab to set up those data sets, and that's what...
C: For either Augur or GrimoireLab, it's pretty easy to say "we called these APIs and pulled this data, and this is what we did to transform it," exactly. And the data insight — that would be the piece that we would then evaluate. When you think about proving that any algorithm has utility, there's an evaluation like this.
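A hedged sketch of that "call the APIs, pull the data, transform it" step. The endpoint URL and the raw response shape are made up for illustration; a real Augur or GrimoireLab call would use that tool's actual API and field names:

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint, for illustration only.
API_URL = "https://example.org/api/unstable/repos/1/issues-closed"

def fetch(url):
    """Pull raw rows from a tool's HTTP API (response shape assumed)."""
    with urlopen(url) as resp:
        return json.load(resp)

def transform(rows):
    """Reduce tool-specific rows to the shared, tool-agnostic shape."""
    return {
        "metric": "issues_closed",
        "records": [{"week": r["date"], "value": r["count"]} for r in rows],
    }

# A canned sample stands in for a live API response here, so no
# network call is made; fetch() shows where one would happen.
sample = [{"date": "2022-04-18", "count": 5},
          {"date": "2022-04-25", "count": 7}]

with open("issues_closed.json", "w") as f:
    json.dump(transform(sample), f, indent=2)
```

Keeping both the call (`fetch`) and the reshaping (`transform`) explicit and persisting the result is what makes the later evaluation reproducible.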
C: Do you agree? Yup, okay. Actually, from a "how you do this" perspective, this makes it significantly clearer than the approach that we took, which essentially buried all of this in a notebook and a series of API calls — so we never persisted the data, for example, in the work that we did. So I like the structure that you produced.
C: I like the structure much better, and I think it's a model to follow, because I think we can get to that evaluation point much more clearly with the model that you've laid out here. In this other improvement — are you talking about the insights? Yeah — the data insight is a thing we can evaluate; the algorithm that produced it is clear and separate, and the data that's used to generate it is persisted and clear.
C: So, when you're evaluating the insight, you actually know how you got to that insight and could reproduce it. Not that you can't reproduce what Regava and I did, but I think this structure is much more transparent.
D: And we can help them set up this notebook at the beginning, if they don't have such capability in the beginning.
C: Yeah, and this leads to a question — I have an outstanding question of interest that we haven't solved yet — which is: if we use notebooks, it would be really great if there was a way for people to make a comment on a notebook, and I don't know that there's a plugin that does that. But if anyone discovers one, that would be exceptionally useful, and then we could host these metrics models somewhere as well.
D: Yeah, I think Draca has already set up the whole notebook environment in one requirements file, if I remember correctly, in the community awareness model. Based on this file, I think Changi and I and Andrew could work together to set up the whole notebook environment setup guidelines.
D: Set up this notebook, and yeah — actually, we're using VS Code on our laptops to quickly set up the whole thing. It's quite quick, and we...
D: As we discussed in the last Asia-Pacific call, June also mentioned that, because we talked to our communities — openEuler and others — they would like to contribute some online service to be able to install an online GrimoireLab or Augur.
C: Agreed that the structure of the metric that Yehui and June did is a really good structure for showing and evaluating the utility of a metrics model.
C: So this makes it agnostic to whatever tool is used, which I think is good. And the data insights — well, okay, the algorithm first, the algorithm folder.
C: That folder makes the way the metric is calculated transparent. So it separates the presentation of data from the machine learning or data analytics that are used to produce the data that's displayed. And then the data insight essentially becomes the thing that we show people, so they can look at data about their community and give us feedback — basically an evaluation of the utility of a metrics model. I'm sort of intersecting some conversations that Yehui and I had on Slack about this.
E: I love the slides that you created; those are excellent slides. In the last meeting we were talking a little bit about what a metric is, what a model is, and what a focus area is, and to be honest with you, my thought had gone to that data-information-knowledge-wisdom (DIKW) triangle as well when I was thinking about that.
E: So when I saw this, it was just like, oh yeah, I was kind of thinking of that as well — so, very cool. The comment that I would make on this, though, is that we are becoming very, very implementation-heavy, and I question whether we want to go that far in the metrics models, because for me it's really getting into the realm of software implementations. That is definitely something we are working on and need to be working on, and it's important, and I think you've outlined a great way of handling the implementations.
C: The discussion that Yehui and I had in Slack really resonated with me: we need to evaluate these different metrics models, because I think that provides some kind of evidence to people that they're useful, and I think we're going to learn that some metrics models we dream up are not useful. So I think we can approve and publish a metrics model at the definition stage, and I think there's a separate checkbox that is really important, or will become really important.
D: But as Shane mentioned, that's kind of the biggest difference between the metric and the metrics model: we finally have to prove the value of the metrics model for the people who want to use it, like community managers and OSPO people. They would like to know that it really works for their company or community, because, as I mentioned in the last Asia-Pacific meeting, a lot of community managers come to CHAOSS because they found out CHAOSS already has a lot of excellent metrics, but they don't know how to use them.
D: That's why we created this Metrics Models Working Group together. We've already stepped into this world, meaning we have to prove that the metrics model — these metrics with some logic behind them — can provide value for those people. So we have to verify and prove that. That's why we provide some real-world data sets, provided by our group, along with data insights, and those insights could come from community managers with their real experiences working in the community; finally, they would find out...
D: ...whether this metrics model really could work for them. As for your concern, I agree, and I think we can include me and Shane and other people in this meeting. We can provide some implementation support on the metrics models we already have, and if a metrics model — like issue response — needs implementation and the proposers don't have enough time, Trenchi and Julia and I could also provide such support on that.
D: We can help collect the data and handle the data following the model definition, and we would like to provide some data — I mean, the notebook, basically, with the data insight. I hope we can work together to prove such things.
D: So that's why I mentioned we would like to introduce the data set first from our own community, like CHAOSS, so that it is used first to give us comments on whether people like our metrics model or not.
E: Also, I would just say that I'm not disagreeing with either of you on the importance of proving utility; I absolutely agree with you. My worry is around...
E: ...the working group and the creation of the metrics models: we're almost jumping past the model into implementation, and perhaps we need to slow down and spend a little bit of time on the model, creating that recipe for implementation, before we jump directly into implementation. And then there's a little bit of confusion about what that implementation would be for our working group, because we do have software working groups. Is the implementation better handled in Augur?
C: Sean, one of the things I think came out of a discussion that Daniel and I had when we visited in Madrid, with some of his team, is that what we'd like to do inside CHAOSS is make the software part a little bit more agnostic and supportive of efforts like this. So we can produce these JSON files with whatever tool somebody knows, whatever makes the most sense, and then it comes to fruition in a notebook.
C: If that's really successful, of course we're going to roll it into GrimoireLab, and maybe we'll roll it into Augur. But providing tools that let us get to this point where we have data matters, because there's a much broader skill set there: a larger number of people understand how to run a Jupyter notebook against a JSON file than have the time and inclination to learn all of Augur or all of GrimoireLab. So I think this could help us build the software community inside of CHAOSS by taking away barriers.
A: Okay, thank you for this conversation. Part of this is just listening, and part of it is my thoughts too. One: I do think — I've thought about this — in CHAOSS we have to reach towards insight. I think we have a program that does that right now with DEI badging, and I think it's hugely successful.
A: I think a lot of people like that insight, so I think we have to reach towards that. And then, from this conversation, I was wondering about the way that we release metrics: the mindset I have towards metrics has gotten us to something like 80 metrics, and I'm wondering if that's the right mindset to carry from metrics over to models.
A: You know what I mean — that we kind of do one or two at a time, like really slow. This is part of the conversation, yeah: we take our time with it, as opposed to just producing them the way we produce metrics sometimes — I see your thumbs up. But part of that slowness does include this evaluative component.
A
It
does
include
this,
this
data
set
from
communities
and
it
does
provide
that
insight,
and
so
I,
like
that
quite
a
bit
and
then
to
your
last
point,
kevin
like
where
this
conversation
you
know
would
occur
like
like
the
the
the
dei
badging
program,
obviously
does
not
occur
within
the
dei
working
group
just
because
it
is
so
much
work
to
get
done.
A
I
think
my
my
thought
would
be
is
that
we
at
least
start
the
work
here
and
if
it's
clear
that
we
can't
support
it
in
a
two-week,
you
know
an
every
other
week,
cadence
kind
of
thing,
or
it
ends
up
occupying
an
entire.
The
entire
meeting.
Everything
then
then
we'll
cross
that
bridge
when
we
come
to
it,
but
it
until
then.
I
think
I'd
kind
of
like
to
keep
it
here,
because
I
think
dei
badging
even
started
in
the
dei
working
group
until
it
kind
of
became
its
own
thing
and
needed
its
own.
E
Your
music-
I
think
that
was
elizabeth.
That
made
the
comment
in
the
in
the
chat
and
I
think
I
think
that's
kind
of
where
I
was
thinking
about
it
as
well.
E
Maybe
we
could,
if
we
could
think
of
getting
to
metrics
model
step
one
as
oh
sean,
okay,
getting
to
metrics
model
step,
one
is
creating
the
recipe
for
the
model
right,
so
we
get
to
step
one,
and
this
is
the
recipe
for
the
model,
and
then
we
get
into
step
two,
and
this
is
let's
all
get
together
and
and
make
dinner
right
the
implementation,
let's
cook
it
up,
but
I,
but
I
I
think
that's
cut.
That's
for
me.
I
feel
like
there
should
be
kind
of
two
distinct
steps
where
you
are
like
step.
C: ...real-world insight to it. And it's not insignificant that Yehui thinks openEuler may be able to provide us with some infrastructure, where effectively we could stand up a version of GrimoireLab and a version of Augur, and when people threw us their repos, we could generate these JSON files relatively quickly.
D: Maybe next meeting we can prepare some examples for the other metrics models — not just community awareness — but for the exact working metrics models we can provide a quick demo online to show how easy it is to set up the whole thing, and with that we can work together to provide it.
D: I mean, the people working here, we have different backgrounds: we are professors, we are community managers, and we are OSPO people, but definitely not all of us are data scientists, so we don't all understand the data handling. We have to make the whole data handling simple; that's our purpose.
D: ...showing how to use the notebook and the JSON file to provide examples of the value of the other metrics models, to see it very quickly. And also, we already have some instances of GrimoireLab or Augur on our side; actually, it's quite simple and quick to collect those data and handle them.
D: It's not so complicated. We are not going to require all the people to set up those environments; we can help people.
D: Yeah, we can follow the working process as we have done before. We can set up a Google Doc to give us a proposal that says: okay, I have an idea about a working metrics model, which contains several metrics; I give this metrics model several user stories or use cases, surrounded with some metric definitions, just like the metrics model template.
D: Once we have that — that's step one of the metrics model, creating and setting it up — then we can step into the next step. I mean, we can quickly use GrimoireLab or Augur to start collecting the data from some communities, set up those notebooks, and quickly show the final result. I hope the whole workflow is friendly, so it makes us quick to see the result.
D
Unlike
lucky
it
is
mentioned.
It's
it's
could
be
more,
could
be
useful
for
some
people
or
not
useful
for
some
people,
but
we
just
do
want
to
verify
that
it
really
work.
Yeah
yeah.
A
Yeah
I
like
this,
the
the
phrase
of
verification
or
evaluation,
because
that's
really
what's,
I
feel,
like
that's,
what's
happening
here.
The
other
thing
that
I
would
like
to
talk
about
next
time
is
you
kind
of
keep
alluding
to
the
ability
to
get
community
data.
F: Yeah, maybe we could have an owner for each tool.
A: Okay, okay, that's helpful. Okay, we're making progress, folks, we really are; this is a good conversation. I think there's a couple of action items. One might be for Sean and Regava — I did just put one in there for you, for the community welcomingness folder.
C: That would be my goal. I don't know if Regava's with us, but most of that notebook actually depends on JSON that's dumped from the Augur API. So I'm sure you have it all anyway; we would just save the JSON files, and we would also have to extract the algorithms used to generate the JSON files. But okay.