From YouTube: CHAOSS Value Working Group Meeting
A
B
Okay, welcome everyone. Welcome back to our Value Working Group meeting on April 22nd. So please write your name and tell us how you are feeling, or anything you have on the desktop, anything you want to share; just write it down in the meeting minutes.
C
B
C
Okay, now I'll figure it out, but today, for some reason, it's not opening the link. Oh.
B
Okay, so the first topic on the agenda is the academic focus group, as Stephen told us that he's attending the CSV conference. That was the very last thing we discussed in the last meeting. So the goal is: what are we hoping to gather from the CSV conference, or thoughts or suggestions or ideas?
F
Well, by the way, I did have a mango this morning, just because food is always the most important thing to talk about, but moving along. I pitched a birds of a feather session, which is primarily a discussion in terms of, you know, what the academics want in terms of the ability to receive credit for open work within the academic system of traditionally peer-reviewed stuff, which doesn't really work anymore for a lot of people, right, or is certainly not the.
F
I am collecting a bunch of links and resources to, you know, articles and guidelines and stuff that I'm going to throw out there, and try to collect, while we're discussing some of the pain points and some of the goals and desires. I recently had a conversation with our associate provost for faculty progress, or something like that.
F
We are looking, at our university, and I assume lots of other universities, at using these kinds of master database, metric import-and-collect systems to try to manage the horror that is academic annual review and plan of work and promotion and scholarship record and all that stuff. These are proprietary systems, of course, that are supposed to suck various things into them and automatically kind of ignore duplicates, remove duplicates, or stuff like that. The institute had looked at four, had narrowed it down to these two, and was gonna start pursuing them.
F
C
Is including software contributions as an evaluation criterion for academics?
F
Not just software, but, you know, the world of open science, right. You know, grabbing stuff. And so, you know, we've done some very preliminary playing. You know, Mike Nolan and my students have found a way to import Center for Open Science platform data into CHAOSS, into our, sorry, into GrimoireLab, into our local instance. It's just a thing of goofing around to see what could go there, and these two links.
F
You know, some of the reasons they were interested in these two is that their APIs will theoretically allow you to specify other databases to pull stuff in for your institution. So I think in the long term my team will look at getting whatever we do with our mashup, and what we can get out of that, into these systems.
F
Just to see what works; it's early-stage beta testing. And what I'm hoping to get from this birds of a feather is where other people are running into pain points, or other things that are on our wish list, things they wish that their academic overlords considered that they don't. So my output for my university: at some point this summer I'm going to come up with a very sketchy first draft of recommendations and best practices within our institution for being able to cite the work that we all do.
F
C
It's like, I don't even know what they do after reading their whole website. Digital Science is fairly straightforward.
F
No, but my hope is that, you know, we'll start playing with these. I assume once we start talking to these guys we'll get some kind of trial accounts on their systems, in which we can do some playing around. And the other piece of this is, you know, there's an overall research and scholarship committee for my university that looks over stuff like this, and luckily my college's rep was stepping down, so I stepped up, so I'll be sitting on that committee as all this stuff is happening. And is this part of.
F
Is this adjacent to the OSPO? Part of my charter is to try to figure this out. Okay, right, my charter evolved from a 10-page wish list that evolved out of five or six meetings with different sets of faculty, and one of the constant complaints was why I'm here, right. You know, my college.
F
My college believes that unless there's a peer-reviewed journal article, you've done nothing, right. So how do we fix that is essentially the, you know... I can't fix it at a policy level, but I can fix it regarding recommendations from the OSPO, and then hopefully interfacing with these systems moving forward.
G
In the spreadsheet we have kind of these different focus areas. One is called individual value, and so, from an individual value perspective, we talk a lot, at least in value so far, about job opportunities: there's a value in engaging in open source because it might provide you additional job opportunities. So that's a metric, right, and so there's a value in participating in open.
H
G
Right, and then, as you move forward, this metric would probably break down into maybe more granular ways of thinking about it. But it could, and this is the chicken-and-egg part, it could give you some support if we could publish a CHAOSS metric that you could point to, that you helped develop and you helped use to create your argument. So it's kind of, kind of, well.
H
F
Personally, room for RPT is non-existent, but even if there was, like, a really sketchy draft that I could take to the scholarship group at some point, they could say: hey, you know, look, here's one way of thinking about this, right. I mean, the interesting thing for the non-academics in the world, right, is that every.
F
As universities have gone less and less traditional in terms of the sciences or humanities, right, within some disciplines it's fairly easy to get your rankings, right. It's like, oh well, you know, I'm a scientist, and in the last five years I've had three articles published in the journal.
F
There's video game instructors, right, where we can kind of go ahead and, you know, write an article on the psychology of people playing this game or stuff like that, right; there's very little academic journal stuff, and there are no top academic conferences. In fact, most of the conferences, as with open source software, are industry conferences, and a lot of academic places dismiss industry conferences, right. And this whole idea of ranking a conference: there's really only two education conferences in computer science.
F
What is it? Do you remember, Matt? IEEE, whatever the IEEE education conference is. Yeah, "we won't fund you to go there to present, because its acceptance percentages are too low; I'll take 25 or 50 percent of what's accepted instead of only 10," or whatever his personal metric was. But it's one of the only two places you can publish as an academic, right. So it doesn't matter what its acceptance level is at a certain point, if those two journals, you know, those two conferences, are the only two out there.
F
B
Yeah, I have a question on this: are we focused on relating this tenure and promotion thing with the open source thing?
H
I
F
G
I'll add in my thoughts. Yeah, I mean, I think we could sort that out in this metric here. So I think both could be critical. I mean, developing a piece of open source software that is used widely downstream by other people, certainly that's a big thing; but also, making a critical contribution to the kernel as an academic could be quite influential.
F
It can also be, you know... because the idea of peer review is: I publish my equation or my formula, and other people replicate it, and then they start sharing it through academic journals as part of their research process, right. Which is basically, you know, "I forked it", or "I maintained it", or "I added a new feature", right; those kinds of things in software. But it's also the open science piece, the open data piece, the open hardware piece: who's using my stuff, right? The academic stuff, the academic stuff.
F
Is only used by these people, right, in my field; that's who there is, right. It's that kind of impact and translation. It's all about that stuff, and, in a way, generally academia measures impact and translation through peer-reviewed journals, and we're trying to find ways in which impact and translation can be measured in other ways that a given professor's department chair and dean would understand, right.
F
Professor Jacobs got the institution a license to publish games with Nintendo, and he got the first university game on a Nintendo ecosystem. All right: academically, a yawn; game-industry-wise, fairly reasonable street cred, right. You know, did 300 million people play the game? No; you know, a couple hundred thousand, which, in terms of money, means nothing, but in terms of a game about game history, working with the Strong National Museum of Play, on a commercial console.
F
It's not like proving Einstein's gravity waves, but it's a thing in my corner of the world, right. That's the kind of stuff that we're trying to do with these metrics, let's say, in our corner of the world. And, you know, the guys in chemistry, most of them don't know what the Linux kernel is, right, or why it would be important, or why the entire internet might benefit from your work, right. You've got to be able to go ahead and communicate that and make it clear.
G
F
G
C
Number of downloads, I mean, it's easier to game numbers of forks and clones. Contributors are a hard metric to game. Use is a hard metric to build confidently; there's no infrastructure like what surrounds academic publishing for knowing how many people are using that.
F
And something should be clear to everybody: just looking at the h-index tells you nothing, right. Everything's got to be in context; everything has to be human-evaluated. And so, if you're just looking at the h-index and the top three conferences and top five journals, you're not really digging deep into what the work is.
F
For a journal, yeah. And this was the problem with, or this has been the discussion around, the altmetrics stuff, right: people who are arguing, well, in the humanities, the fact that I have a hundred thousand Twitter followers should mean something in my field; and other people saying you can get a hundred thousand Twitter followers by offering a McDonald's hamburger certificate to anyone who signs up, so how is that really impact, right? So is that.
H
F
Yes and no. It's been alluded to a couple of times, but almost all of those folks are just... we're trying to figure out how to talk their universities into letting them make an OSPO, and what an OSPO would be. So this is like on the checklist of something an OSPO might do, yeah, but nobody's really thinking about it yet, because it's only me, and Syed at Hopkins, who have a functioning OSPO at this point. So, so.
G
F
It was an order-of-operations thing, people; it was, yeah.
F
That covers hardware and all the other things, right, so we drafted the one that's starting to make its rounds across. It's a year-long process of approval.
F
We
got
part
of
the
strong
foundation
money
that
we
got
was
about
building
a
student
and
staff
team
to
to
help
make
faculty
projects
more
open-able
or
the
ones
that
are
already
open
better
in
terms
of
generating
community
and
stuff,
and
a
lot
of
my
focus
has
been
on
selecting
those
first
21
projects
and
getting
my
team
to
start
to
work
with
them.
This
metric
and
the
larger
picture
around
this
metric
is
kind
of
like
one
of
my
summer
projects,
and
this
will
never
be
policy,
because
policy
only
happens
at
the
university
level.
F
The university's policy on RPT is just that each college should have one that makes sense within their context. So I can't mandate at the university level, right. So if I don't get the open source policy accepted, or this policy, this stuff is going to exist as our recommendations and best practices for helping people do this right. So I hope to have that published on my website by mid-summer or fall: but here's what the discussions in the world are in terms of journal articles, look at these, here's what we're doing!
F
We can get everybody to either put their stuff in there, or, if we get one of these giant master software systems where the formal academic publications and metrics are being sucked in, can we pull that back out, or do we just input our stuff into the larger overall system? Knowing that there's going to be a much larger overall system both helps and hampers what I'm doing right now there.
F
C
B
So, in the interest of time, should we move to the other agenda items, or do we have anything to add to this, or anything?
B
F
B
Okay, so the next agenda topic is participation in OSPOCon, so I have written a proposed document. The topic is: what is the story, what should we tell them, or what is the theme for that? So if we take a look at this, this is just a sketch, and the goal I was trying to get at is what comes.
B
To
my
mind,
is
I
trying
to
understand
their
perspective
that
what
what
are
the
pain
points
for
the
hospital
or
how
do
they
want
to
assess
the
impact
of
that
hospital
or
their
open
source
participation?
G
B
H
Yeah, that's fine. I think we have... I just looked this morning; we had like 45 responses, and there's one more day, so I was going to retweet it again. The results are inconclusive, however, so.
G
B
That's a good idea. So then, what is our story? That in the survey we found these things that people are struggling with, especially in collecting or defining the data and how to go about it? I'm trying to make a story that can be presented.
G
Yeah, so I think the story is, you know, we in the CHAOSS project have been working on this pipeline, I kind of hate that word, but working on this pipeline for a while; you know, in terms of the process by which people can make data-driven decisions, I also don't like that phrase, but make data-driven decisions to improve their engagement with open source. And just through our own reflection, and I think this was really Elizabeth's kind of motivation here, through our own reflection.
G
On the measurement process, yeah. So the... we're off of OSPOs at this point, so this is... no, I think, no, I'm sorry, you're right, it's still on OSPOs. But, like, Elizabeth put together the Twitter survey, which I don't think was necessarily aimed at OSPOs, but it was just kind of like, where does everybody live? But this might be useful for an OSPO, to just know that this process is problematic on three of the four.
H
Yeah, I dropped that screenshot of the results so far in the minutes, if you all want to see it.
C
G
People have problems, like, even when 14 percent are saying so. Then maybe the birds of a feather, I'm sorry, the lightning talk could say: what we realized in the CHAOSS project was that we have our work cut out for us along each one of these horizontals at the moment, each one of these bars, and here's what we could... In 10 minutes it's getting pretty tight, but maybe, like... oh, for what we would think would be... no, not 10 minutes to end the meeting, 10 minutes for the lightning talk.
G
B
G
C
G
You're just presenting the results, and maybe talking about what this tells us in the community, in the CHAOSS project, and then maybe just a few first steps forward, yeah. And we can talk about what we think those first steps forward might be; it's not just to push it to you, like, "you figure out what you figure."
A
G
C
Yeah, I think, I think a lot of it comes down to changes to the software that we have, and more user-centered design. I don't think GrimoireLab or Augur follows a path that matches what people are trying to accomplish. So, like, initially getting some data is great, but oftentimes there's a desire to continuously get data, easily create new repos, segment them.
C
I
think
I
think
the
process
of
getting
the
data
and
then
knowing
what
you
have
and
what
questions
you
can
ask
is,
is
a
is
a
barrier.
So
there
are
great
tools.
There
are
a
couple
great
tools
inside
chaos
for
getting
data,
but
I
think
the
process
of
helping
people
know
what
to
do
with
that
afterwards
and
to
sustain
to
sustain
that
work,
because
neither
tool,
continuous,
I
mean
augur-
does
continuously
collect
but
either
tool
like,
for
example,
if
you
own
an
org
and
there's
new
repos,
I
mean
things
can
get
missed.
C
So I guess I'm just saying there's still a lot of programming and analysis work that falls to the organization. And, to the extent that, you know, we're starting to understand the stories and the metrics that are needed, those can be made easy and are highly valuable, but I think there's more work to do there, and in connecting what the CHAOSS metrics are to every view of what you're showing people, so there's a clear relationship between what's in the tool and the metrics that are represented in that particular display.
G
That is my opinion, I think. So maybe I'd suggest, on that first one, knowing what data to get: a couple of things that we are doing in the CHAOSS project. I would actually say that the DEI badging program is helping people know what data to get; we've framed it, we've said these are the things at this point that matter, we've worked out what those things are. So I think you could mention that. I even think our short stories that we're putting together are helping, yeah, help people know what to.
F
So I had a thought. One of the services that Open@RIT provides to faculty through this fellows program is we do a bunch of UX/UI work in terms of, like, personas, helping people figure out who their users are and what their use cases are. Is that something the CHAOSS community has done or would want to do at some point? We've.
C
We've done... we've identified key personas early on in the design. What I don't think we've done is gone back and done it regularly.
C
And the different tools have different constraints on what they can do. I mean, the personas are OSPOs, community managers, foundations, corporations; those are the primary personas that our software targets right now.
C
F
Valuable. Let's talk to Mike about that too; I'll add that in the email I'm sending between you and him and Sean. Okay, we have to balance it in terms of when we can do that versus when we can do.
F
We can serve the faculty where we're lining up as fellows, but especially if that's revisiting what you folks have already done, it might not be a huge lift. Yeah, and what we're doing a lot of is trying to balance.
C
B
So does this help? Yes, yeah. So I'll, yeah, I'll wait for one day, then I'll contact Elizabeth for the final result, and then maybe I'll write an abstract for this and share it with the community for feedback.
B
Okay, so we have four minutes left. The next agenda item was to finalize this organizational influence metric we have developed; it's almost.
G
I
B
Is this the same format for all the working groups? Yes, okay.
G
So we kind of made a few suggestions here, just in terms of things around the contributing file: basically, anything that appears to be consistent across all working group READMEs, we move that consistent text to the handbook, okay, and in the README we just point to the handbook.
H
G
Okay, uniquely in every repository.
B
K
C
J
Yeah, I think you're right, but that is a document we can just create and we can throw in there.
C
J
That a link would be better, just because the more places we have it duplicated, the harder it is to get everything aligned well.
C
If we have a link, then, if there are idiosyncratic behaviors within a working group, they can put those after the link.