From YouTube: CHAOSS DEI Working Group 4/20/22
Description
Links to minutes from this meeting are on https://chaoss.community/participate.
F
Yes, indeed, so we can probably have a quick look. Let's see what we need to get done here, and if there are things that need more work, I'm happy to continue working on it, even during the week as well.
B
Perfect, so I can stop the recording. Does everybody have access to this?
C
Anybody yet? Like, that's still part of their ask, so I like how you have that separated out; I think that's important.
F
I can try to fill those categories with more information, okay, and I can go back to the document again and maybe arrange the paragraphs in a way that they make sense.
B
Okay, okay, so I added.
C
Going forward, when there are specific things about expectations attached to a specific program, or a step that only Google asks for but Outreachy doesn't, where would that live in this document, do you think?
B
Because I think you're right, and I think each program changes a little bit every year. So, for example, the way that Google Summer of Code does the "tell us how many slots you would like" step. Outreachy doesn't do that, and that kind of affects how we do evaluation, because with Outreachy the evaluation has to be directed toward just a single individual, while with Google Summer of Code the evaluation can kind of cover a range of people.
C
But it makes sense for us to have a person that is maybe not a mentor, but someone that knows, or has looked at that, or knows where to find that information, and can make sure that everybody knows: kind of like a coordinator or something like that. Do you think there's room for that here?
B
I do, and I actually had the same thought this morning: that maybe just one person, you know, for the year of 2022 kind of thing, for every program that we're involved in, takes a look at how Season of Docs runs or how Outreachy runs, because I think we kind of assume they all run approximately the same, and they don't.
B
Okay, let's see. One of the things, and I'll just talk about these three because we have so many applicants: this is, I know it says, no direct messages from applicants.
B
Really, the reason for this is that if there's a question that a person has, it's probably a question that other people have, and it really makes sense to have those questions and answers occur on the Slack channel. If we do it through direct messages, like, Elizabeth, you get it: you would answer the same question that I would, and then it's just not very efficient, and not everybody even sees the answer. There could be a bunch of other people that could benefit from that answer as well.
B
So I really think we should start signaling in the future that we don't do direct messages with applicants. I also think it creates some inequity: sometimes some people aren't comfortable doing direct messages and others are, and I do think, through that process, it can create some inequity.
B
I know it can't be guaranteed, and the reason I said 24 hours is because we do have people that are in different time zones, and sometimes, just by the time you see the message and are able to pick it up, 24 hours may have elapsed before a subsequent answer. And then I think this was also from Amy and Elizabeth: that we use Slack to define expectations, whether it's through pins, whether it's through...
B
...like the description in the channel. We use that, and we communicate that that's what we use. Then the last thing I added in, Elizabeth, you certainly alluded to this: if you choose to be a mentor for a project, you are involved in the project for the life of the program, which is during the development...
B
...of what that project would be; it's during the application and question period, which is, you know, kind of that meet-and-greet period prior to application; it's during the selection period; it's during the project itself; and it's during final evaluation. So the commitment to be a mentor is not just a summer commitment.
B
Yeah, so you're involved from then on; it's not just a one-and-done. I think this is really important: if a mentor is putting together a project, they are really the people who are most appropriate to answer questions regarding that project, because they have taken the time to develop the guidelines around that project, and I do think it's a little tricky sometimes for other people to answer questions about projects that they didn't necessarily develop. So that's my take there, and I'm done.
C
I think the only things that I added were kind of exactly what Matt just said: just making sure that the expectation is that all mentors can jump in and answer questions, not just, you know, the project leads or whatever. It can be whoever is on the mentorship team; they can jump in, and should jump in, and answer questions.
B
Could such a proposal be submitted via the GSoC application portal? You know what I mean? Like, we ask for a document that describes...
D
Yeah, it was just a few comments on the number of goals achieved. I wondered if we should compare it to the goals set, just to give a proper evaluation of whether you completed 100% or just that. And my second comment was about mentee feedback, just, like, for those who maybe didn't complete their program successfully, just to hear why. I think feedback from them would be a good way to improve for the next year or the next time there's a...
F
All the mentees can fill it out.
A
I think a suggestion for the feedback would be asking mentees what they think might have made their contribution easier. I think that helps us know what to add, like a guide or something, or what could be lacking. Sometimes maybe some people found it hard to contribute because they could not find their feet early or did not know how to do certain things. I think maybe that feedback will help.
B
We try to help capture the work that was done so that it could be shared forward for other people, and I'm wondering if we could include a form like this at that point, not even, like, post-graduation, because sometimes people graduate and they're gone.
C
If we had, like, a coordinator of all of the programs, that could be that person's job, since they're not super involved in it, so there won't be a bias. You know, like, if your mentor is the one asking you for feedback on themselves, there might be some hesitancy to, you know, be honest. But if it's almost like a third-party person that wasn't super involved but knows what's going on, they might be more willing to.
C
Matt, did you want to talk about the templates? This isn't on the agenda, but sorry to interrupt everything. Did you want to talk about the issue templates a little bit?
B
Leaning right into it. So, just so everybody knows, across all of the metrics working groups (Value, DEI, Common, Risk, and Evolution), we're starting to create templates just to kind of help with the workflow of things. We really have three templates.
B
Don't worry about miscellaneous right now, but the first template is if you just have a metric idea. So you're talking to somebody at a conference, or you read a paper, and you have an idea for a metric; this first one is really just to capture that. So just tell us a little bit about what the metric is and, you know, maybe where you heard about it.
B
The second template is for metric release candidates. As we actually develop a metric within a working group over the course of six months, part of the process is that you ultimately finalize the metric in terms of language, but then you open an issue that kind of signals to the release team: hey, here's a metric that's going to be part of that release process. The third template is for metric revisions, particularly for this next round of metrics releases over the course of the next six months.
B
So there are formatting issues, but we just need to go through and kind of revise each of the metrics. And so, if you, you know, open up a new issue, you'll be presented with these templates, and as you revise a metric, just choose this template and kind of follow along with it. Really, what the template asks for is the name of the metric that is being revised and the website link, so its current release on the website, and a link to the original metric.
B
And then specifics as to what you, as a reviewer, think the working group should take a look at. I'll show you a couple that I have done for the DEI working group. And then we have a checklist. The checklist is not done as part of the original issue submission; it's really a checklist that, once the issue is created, the working group goes through just to kind of ensure that the metric is ready for a second release or a re-release.
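[Editor's note: a revision issue template along the lines described above might look like the following. This is a minimal illustrative sketch only; the exact file location, field names, and checklist items in the actual CHAOSS repository may differ.]

```markdown
---
name: Metric Revision
about: Propose revisions to an already-released metric
---

<!-- Illustrative fields only; the real CHAOSS template may differ. -->

**Name of the metric being revised:**

**Link to the current release of the metric on the website:**

**Link to the original metric document:**

**Specific things the working group should take a look at:**
-

**Checklist (worked through by the working group after the issue is created):**
- [ ] Issue for revision has been created
- [ ] Revisions finalized and marked as a release candidate
- [ ] Revision announced as under review
```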
B
All right, are there any questions on that? It'll make more sense, maybe, when I show it. So one of the released metrics that we have is Time Inclusion for Virtual Events. So, revising the metric Time Inclusion for Virtual Events: the name of the metric being revised is Time Inclusion for Virtual Events, and here is a link to the metric on the website.
B
Here are the specific things that I had noted that should be updated with the metric. Some of them are naming; some of them are questions, like, I personally didn't understand how low bandwidth was part of a time inclusion metric; changing things on the Likert scale from one-to-five to one-to-x.
B
Just so you get the idea, kind of walking through the list of proposed changes or things to look at. It is not mandatory: the working group could look at this list and be like, those are all terrible suggestions, we don't take any of them, and move forward; I mean, that's totally cool too. And then here's what the checklist looks like.
B
So, as a working group, as you are making the revisions to the metric, you can kind of check things off: the issue for revision has been created; once you're done, you'll check that it's been marked as a, you know, candidate release; once we announce that this is now under review...
B
Yeah, I mean, if they see something that is not on this list, totally cool, yeah, okay, yeah. These are just the things that I saw, and certainly, you know, if you're looking at Time Inclusion for Virtual Events and you're like, the third point is...
B
And then just one quick question before we go, Elizabeth: do we... where is the event code of conduct? Do you know where that's located?
C
Not on our event page; let me see here.
C
It is... oh, here it is. Oh, interesting. Okay, that links to the Linux event code of conduct. I will have to look for this.
B
Okay, maybe we don't have one. Nonetheless, it's been suggested that we update... how about this: that we at least update our project code of conduct to include events as well, so we just have a single code of conduct that cuts across all things.