From YouTube: CHAOSS Value Working Group 2/10/22
Description
Links to minutes from this meeting are on https://chaoss.community/participate.
A: Who has the record? Okay, I got it, thank you so much. Welcome, welcome to the CHAOSS Value Working Group meeting of February 10, 2022. Today we have Anna with us from DORA; she'll be telling us about DORA and their journey with metrics. So welcome, and thank you, Anna, for joining. The floor is up to you.
B: Wonderful, thank you so much. I thought I'd give a little bit of background on what DORA is and then talk about this new project that we've started. I'm hoping this can be really casual, so if you have any questions, just feel free to unmute yourself and jump right in; that's totally fine.
B: The idea for the declaration came way back, or it feels like way back now, in 2012, at the annual meeting of the American Society for Cell Biology that was taking place in San Francisco, which is why we're called the San Francisco Declaration on Research Assessment. It came about when a group of researchers, academic editors, and professional editors were having a conversation, I believe on the poster floor, about how research was being evaluated: what the journal impact factor was doing and the effects it was having, not only on hiring and promotion, but what it meant for workforce composition, what it meant for research integrity, and how it was shifting the rewards and recognition system. That discussion ultimately led to the development of the declaration that was released.
B: The following May. And for five years it did an excellent job of raising awareness and collecting signatures; right now, more than 20,000 organizations and individuals in, I believe, 143 countries have signed on to the declaration, which is amazing. But in 2017, some of the original writers of the declaration, as well as some individuals who are very passionate about research assessment, started having these conversations about how awareness isn't enough.
B
Awareness
is
an
action,
it's
not
change,
and
while
dora
did
an
excellent
job
of
raising
awareness
about
misuse
of
the
gif
and
bring
us,
you
know
bring
in
the
community
to
have
conversations
about
research
assessment
reform.
It
wasn't
you
weren't,
seeing
change
on
the
ground
happen
and
that's
when
the
decision
was
made
to
turn
dora
from
sort
of
a
declaration
into
an
initiative
that
can
help
support
and
facilitate
the
type
of
culture
change
that
we'd
like
to
see.
B
Also
just
briefly
point
out,
I
do
live
near
a
fire
station,
so
if
you
hear
some
trucks
in
the
background,
it's
probably
it's
gonna
be
okay.
I.
B
Become
my
routine
disclaimer
on
calls
lately
that,
and
I
apologize
for
my
doc
snoring.
B
So
that's
that's
kind
of
the
trajectory
is
started
off
as
a
declaration
to
raise
awareness
was
very
successful.
I
don't
think
the
authors
had
anticipated
how
much
the
declaration
would
resonate
outside
of
cell
biology
and
how
much
it
would.
The
sort
of
concept
of
responsible
research
assessment,
in
addition
to
jeff
would
resonate
sort
of
around
the
world
as
well.
B: So in the past three years we've developed a number of resources in partnership with Professor Ruth Schmidt at the Illinois Institute of Design in Chicago, really thinking about: how do we reduce the administrative costs of culture change?
B
So
how
can
we
reduce
some
of
those
transaction
costs,
but
also,
how
can
we
think
about
culture
change
at
a
systems
level,
so
we
actually
have
a
webinar
on
systems,
thinking
and
culture
change
tomorrow
that
I
can
sort
of
drop
the
link
in
the
chat
a
little
later
on,
but
the
idea
here
is:
how
can
we
make
sure
that
organizations
have
infrastructure
in
place
that
can
support
interventions
for
responsible
research
assessment,
because
if
you
don't
have
the
infrastructure
or
the
foundation,
it's
really
it's
going
to
be
challenging
for
an
intervention
to
be
successful
off
the
bat
so
starting
to
look
a
little
bit
about
like
well.
B
What
type?
What
do
you
need?
What
are
the
building
blocks
that
you
need?
So
that's
kind
of
the
resource
development
side
that
we've
been
doing
and
then
in
terms
of
community
engagement
and
building
communities
of
practice.
B
We
piloted
a
funder
discussion
group
starting
our
first
meeting
was
in
march
of
2020
and
so
far
that
has
really
grown
so
that
the
workshop
we
hosted
in
september,
we
had
over
120
individuals
from
more
than
40
different
funding
organizations
in
about
22
countries
and
right
now,
all
of
these
funding
organizations
are
having
conversations
about
narrative
cv
formats.
B
So
what
types
of
qualitative
assessment
can
you
do
for
grant
evaluation
so
thinking
you
know
not
only
about
what?
What
is
the
format
of
the
narrative
cv,
but
how
do
you
evaluate
that?
How
do
you
make
that
evaluation
consistent,
also
thinking
about
sort
of
what
are
other
qualitative
indicators
of
success?
B
So
that's
sort
of
our
community
engagement
part.
We
also
host
a
lot
of
community
webinars,
like
I
mentioned,
and
then
sort
of
knowledge
sharing,
and
this
is
where
I
think
sort
of
our
website
is
the
biggest
resource
where
we
have
case
studies
of
academic
institutions
that
are
going
through
the
process
of
change
and
for
us
this
was
really
important
because
we
were
seeing
good
practice.
But
what
felt
like
the
more
interesting
information
was
okay.
How
did
that
good
practice
get
into
place?
B
What
did
how
were
organizations
sort
of
going
through?
I
don't
think
going
through
the
motions,
but
you
know
who
was
involved.
What
was
the
timeline?
You
know
what
is
a
realistic
timeline
to
develop
a
new
policy
for
faculty
sort
of
hiring
a
promotion?
Is
it
happening
at
a
department
level?
Yeah?
No,
and
that's
exactly
what
they
showed
and
how
does
you
know
change
even
get
started
on
campus?
Is
it
you
know
you
know
primarily
top
down?
B
Does
it
happen
from
bottom
up
and
what
we're
seeing,
through
these
case,
studies
that
really
it's
both
but
there's
an
interplay,
that's
happening
between
them.
There
has
to
be
some
sort
of
connection
between
top-down
and
bottom-up
change,
whether
it
is
one
of
the
case
studies,
for
example,
the
university
of
catalonia,
someone
in
their
international
relations
office
had
come
across
dora
and
had
presented
it
to
their.
B: Why don't we create a group of people to figure out what this would mean for the university? So that was the bottom-up push, but it had support from the top. And, alternatively, we've seen examples where it's very much supported by university leadership, and then they work on garnering that bottom-up support: how can they seek feedback, input, and co-creation from faculty, from post-docs, from the wider university community?
B: So TARA is a project really designed to facilitate the development of new policies and practices for academic career assessment. There are three main pieces to Project TARA, but I think one will be of particular interest to this group, so I'll focus most on that. The first thing we want to do is conduct a survey of U.S. academic institutions to get a broad understanding of institutional attitudes and approaches to research assessment reform, and then we're also working on a toolkit of resources.
D
Anna
by
tool,
kit,
I
mean
this
is,
I
think
I've
started
learning
this
through
nasam
is,
since
most
of
us
are
like
software
people.
When
we
hear
the
word
tool
we
think
about
software
and
in
the
context
of
a
lot
of
these
academic
organizations,
a
toolkit
is
like
collections
of
boilerplate
and
articles
and
stuff
like
that
to
help
you
use
in
your
discussions,
arguments
with
other
people
on
the
same
topic.
Is
that
correct?
When
you
say
toolkit,
that's
what
you
mean.
B: ...which is designed to help organizations figure out where they are. I've talked about infrastructure before, like what type of infrastructure do you need to support culture change; this breaks it down into five different categories. And then, recognizing that each institution is at a different stage of readiness for reform, across the x-axis you would see: what do you need at the foundation stage versus expansion and scaling up? And we have a worksheet with this that organizations can fill out.
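A minimal sketch of how that rubric-and-worksheet structure could be represented in code, under stated assumptions: the two category names below are placeholders invented for illustration (the actual rubric defines its own five categories), and only the foundation/expansion/scaling stages come from the description above.

from enum import Enum

class Stage(Enum):
    FOUNDATION = "foundation"
    EXPANSION = "expansion"
    SCALING = "scaling"

# One worksheet entry per rubric category: where the institution is
# today, plus the evidence behind that placement. Category names here
# are hypothetical, not DORA's actual five categories.
worksheet = {
    "infrastructure": {"stage": Stage.FOUNDATION, "evidence": "no central policy yet"},
    "leadership_support": {"stage": Stage.EXPANSION, "evidence": "provost endorsed reform"},
    # ...three more placeholder categories would complete the five.
}

def readiness_summary(sheet):
    """Count how many categories sit at each readiness stage."""
    counts = {}
    for entry in sheet.values():
        counts[entry["stage"]] = counts.get(entry["stage"], 0) + 1
    return ", ".join(f"{s.value}: {n}" for s, n in counts.items())

print(readiness_summary(worksheet))  # "foundation: 1, expansion: 1"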
B: I'm happy to put it into the chat right now. There's the link to all of our resources, and here's the link specifically to SPACE. Two things you can do: you can look at the rubric and fill in where you are with your institution.
B: You can also use it retrospectively, after you've tested an intervention, and in that way analyze the outcomes: why was it successful, or why was it not successful? Are there things that we could do differently that would improve it in the future?
B: We're envisioning this as an interactive online dashboard that can track responsible research assessment reform, specifically for hiring, review, and promotion. We started this process by really scoping the intent of the dashboard through a couple of community calls that we held this fall. Right now, based on the feedback that we've gotten from those calls, where our thinking is at is: the goal of the dashboard is, like I said, to monitor and track progress towards responsible research assessment at academic institutions.
B
The
first
version
of
the
dashboard
is
going
to
be
focused
on
faculty
career
advancement,
specifically
hiring
promotion,
tenure
and
retention,
but
for
future
versions.
We
really
want
to
look
at.
B
Other
types
of
assessment,
too
so
graduate
student
and
postdoc
evaluations,
as
well
so
in
terms
of
the
type
of
source
material
that
we're
planning
on
sort
of
supplying
the
dashboard
with
we
want
to
find
stuff
that
sort
of
collectively
and
individually
indicates
progress
towards
research
assessment
reform.
B
So
again,
sort
of
based
on
the
community
calls
sort
of
our
work
in
gathering
materials
about
responsible
assessment.
We've
defined
these
as
announcements.
So
a
lot
of
times
when
organizations
will
sign
dora,
they'll,
put
out
a
press
release
and
in
those
press
release
they
give
hints
that
you
know
this
is
how
we're
going
to
be
implementing
responsible
assessment.
B
So
looking
at
announcements,
sort
of
dora
signatures
or
commitments
to
other
responsible
research
assessment
documents
such
as
the
leiden
manifesto
or
the
hong
kong
principles,
also
including
action
plans.
So
I
mentioned
the
university
or
the
open
university
of
catalonia.
They
published
an
action
plan
once
their
dora
working
group
was
done
with
their
work.
B
A
lot
of
organizations
are
now
publishing
principles
about
responsible
research
assessment
or
about
responsible
use
of
the
blue
metrics,
and
then,
of
course,
you
know
we
get
into
sort
of
the
policies.
B
The
practices
and
some
outcomes-
so
that's
this-
is
the
sort
of
breadth
of
the
material
we're
thinking
that
can
feed
the
dashboard
and
also
can
use
to
get
a
handle
of
where
institutions
are
at
asses
with
assessment
and
be
able
to
sort
of
track
that
trajectory
of
progress.
F
And
you
know
sorry,
can
I
ask
you
a
quick
question
about
that
is
so
when
someone
signs
the
the
dora
document
is
that
part
of
kind
of
the
agreement
is
that
they
will
publish
these
things
or
are
people
just
doing
that
just
because
they
want
to
kind
of
show
the
community
and
that's
kind
of
the
culture
that
you've
built?
Is
that's
what
we
do?
We
share
what
we're
doing
exactly.
B
So
that's
the
second
part
of
the
answer,
so
yeah
signing
door
is
a
voluntary,
voluntary
commitment
that
organizations
make
and
really
in
signing
dora.
I
think
the
big
power
is
that
you
know
your
community
can
hold
you
accountable
to
that.
So
it
really
is
fostering
that
environment
of
you
know,
sharing
about
what
you're
doing
with
responsible
research
assessment
and
also
knowing
that
there's
more
work
to
be
done.
It's
a
process,
that's
never
finished.
B
Yeah
and
we've
had
about
the
breakdown,
I
think
putting
me
on
the
spot.
A
little
bit
is
18,
it's
about
18,
000
individuals
and
then
2
000
organizations
and
those
organizations
include
academic
institutions,
publishers,
research,
funders,
scholarly
societies,
other
non-profit
organizations
that
are
within
the
academic
ecosystem.
C
So
it
sounds
like
the
point
of
connection
is
around
these
dashboards,
as
you
mentioned,
or
the
potential
point
of
connection.
So
what?
Where
are
you
at
with
respect
to
dashboards?
You
know
what
are
you
hoping
to
see?
I
mean
I
know
that
the
process
of
building
the
dashboards
will
change.
What
is
actually
seen.
B
Yeah,
so
we
have
no
we've
identified
kind
of
the
source
material
that
we
that
we
want
to
use
to
feed
the
dashboard
we're
working
with
a
web
development
firm
to
build
the
dashboard
out
they're
working
on
the
content
plan,
we're
going
back
and
forth
with
them
right
now,
and
I
think
you
know
what
what
we
want
to
be
able
to
sort
of
see
and
do
with.
B: I'm trying to think here; sorry, apologies, I might have lost my train of thought, because you asked about the work in progress and where we're at. I think where we're at is: we've scoped out the intent and desired uses, and we are now working on collecting that source material, which we've done in a targeted fashion, but we also held a community call to surface some materials that we may not have been aware of.
C
So
was
the
dashboard
about
the
work
that
dora
is
doing
like
it
represents
the
good
work
within
the
dora
project,
or
is
it
a
dashboard
that's
made
available
to
people
who
would
like
to
participate
in
improving
the
ways
that
they
think
about
contributions
in
the
academic
process?
You
know
what
I
mean
like
you
would
have
a
dashboard
for
university
one
and
a
dashboard
for
university,
too,
and
so
on
and
so
forth,
or
is
it
a
dashboard
for
dora?
That
kind
of
shows
the
work
that
you're
doing?
B: I'm tempted to say a little bit of both, but I should also be completely honest with your teams: my background is in biochemistry, not in tech building. So, you know, I see the dashboard as... we're not specifically tracking what the universities that have signed DORA are doing; that's included in the dashboard, but it's really meant to be a wider resource for the academic community. A lot of the good practices that we're finding come from institutions that, I think, maybe haven't even heard of DORA.
B
So
in
that
sense
I
think
that
it
really
is
sort
of
beyond
just
dora,
and
I
say
that
because
in
there
are
different
policy
environments,
you
know
in
different
parts
of
the
world
so,
for
example,
in
europe
we've
had
you're
recording
me.
So
I
don't
want
to
get
the
numbers
wrong,
but
I'm
going
to
estimate
here
and
give
the
display
this.
B
About
like
100
150
to
100
180
academic
institutions
have
signed
dora
in
europe,
a
majority
of
those
come
from
the
united
kingdom
and
when
you
look
at
the
policy
environment
there
plan
s
which
is
driven
by
a
coalition
of
research
funders,
is
requiring
or
really
sort
of,
you
know,
championing,
publishing,
open
access
and
part
of
plan.
S
is
also
asking
academic
institutions
to
rethink
how
they
evaluate
their
researchers
for
hiring
promotion
in
tenure
that
are
based
off
of
the
principles
that
are
in
the
declaration.
B: So if you get any money from Wellcome, I believe you have to show that you are improving your research assessment, again based on the principles of DORA, and I'm pretty sure there's some type of adherence mechanism built into that. So in Europe, particularly in the UK, there really is a strong momentum for change. In the United States...
B
We
don't
necessarily
have
that
as
strong
of
pressure
from
funding
organizations
so
we've
there
are
six
academic
and
research
institutions
that
have
signed
dora
in
the
us,
so
I
think
you
know
limiting
to
that
is
kind
of
silly
when
there
are
so
many
other
really
good
practices
going
on
and
our
goal.
The
goal
for
dora
is
not
to
collect
signatures.
It's
to
facilitate
culture
change.
D: One is that we do see a lot of funders now, you know, federal and foundational, maybe state, in the US, saying your stuff has to be open, right? Either give us where you're going to put it, or, if we have one, put it in our place, where we're letting you do that. But your dashboard is working at a bigger-picture level, rather than, specifically, me as faculty saying my work had value for tenure and promotion.
D: In terms of, you know, how agile and responsive is this community, or how well are they meeting the needs of the community, or what does their DEI footprint look like, or stuff like that. So it might be of use to groups like this to be able to say: okay, in terms of scholarly work, in terms of these metrics, you can hit the DORA TARA API and pull this stuff in, and then kind of munge it to the ways in which we want to apply it to individual projects. So, probably longer term:
D
Some
conversations
between
this
group
and
your
your
web
dev
in
terms
of
making
that
kind
of
data
open
and
accessible,
so
other
people
can
use
it
would
increase
the
value
of
the
work
dora
is
doing
geometrically.
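As a minimal sketch of that "hit the API and munge it" idea, assuming a hypothetical endpoint: no DORA/TARA API exists at this point, so the URL, query parameter, and response fields below are invented for illustration only.

import requests

BASE_URL = "https://example.org/tara/api/v1"  # hypothetical endpoint

def fetch_source_material(institution):
    """Pull assessment-reform records for one institution."""
    resp = requests.get(
        f"{BASE_URL}/source-material",
        params={"institution": institution},  # hypothetical parameter
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["items"]  # hypothetical response shape

def munge_for_project(records):
    """Reshape records into per-type counts for a local report."""
    counts = {}
    for record in records:
        kind = record.get("type", "unknown")
        counts[kind] = counts.get(kind, 0) + 1
    return counts

# Usage, against the hypothetical endpoint:
# print(munge_for_project(fetch_source_material("Example University")))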
B: Openness is definitely something that is very central to our thinking, and to our philosophy as we're going through this process: as open as possible, as closed as necessary. That's our mantra for the team.
G: But we also have to figure out, I think, a way to get tenure and promotion committees to add a document, basically, or give us a place to put that document in the development of a packet that gets reviewed, so that it's officially given some merit and it's not a decorative aside that the faculty member applies, in a general way, to some miscellaneous document.
D: Anna, the thing that we sent you a proposal about, that you didn't get: what RIT is doing is goofing around with the system from the bottom-up layer. You're coming very top down, and we're going bottom up, where we're saying: all right, faculty member, put these URLs into this window for where your research projects are, and we'll try to generate a dashboard.
D
That
proves
what
the
translation
and
impact
of
your
work
is,
even
though
it's
not
in
a
journal
article
or
a
conference
or
in
addition
to
right,
we'll
we'll
list
the
journal
articles
in
the
conferences
too.
But,
oh
you
know,
you've
been
a
maintainer
of
scientific
software
project
x
for
five
years
and
that's
worth
a
lot,
even
though
you
haven't
been
writing
journal
articles
about
it
right.
That's
the
kind
of
place
where
I
came
to
talk
to
chaos
in
the
first
place
and
how
these
kind
of
discussions
are
going.
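A minimal sketch of that bottom-up flow, under stated assumptions: the function, field names, and placeholder evidence below are hypothetical, and no actual RIT tool or API is implied.

from urllib.parse import urlparse

def dashboard_rows(project_urls):
    """Build one dashboard row per faculty-supplied project URL."""
    rows = []
    for url in project_urls:
        rows.append({
            "project": url,
            "source": urlparse(url).netloc,
            # Real evidence (maintainer tenure, releases, downstream use)
            # would be fetched from each forge; placeholder here.
            "evidence": "maintainer activity to be fetched",
        })
    return rows

for row in dashboard_rows(["https://github.com/example/scientific-software-x"]):
    print(row["source"], "-", row["evidence"])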
G
So
there's
there's
a
there's
the
case
where
it's
in
an
independent
scientific
contribution,
then
you
have
to
show
use
in
community.
I
think
there's
other
cases
where
some
of
these
artifacts
can
be
traced
directly
back
to
funding
received,
which
is,
of
course,
what
our
provosts
all
care
the
most
about.
Yes-
and
I
think,
helping
us
to
draw
those
lines
between
these,
these
other
kinds
of
contributions
and
getting
funded
it
that
will
go.
That
will
be
easy
to
explain.
B
And
I
should
say
you
know
we
talked
a
little
bit
about
metrics
and
the
what
we'll
be
tracking
in
terms
of
like
progress
for
assessment
reform.
B
I
think
in
some
instances
it
can,
but
I
think
a
good
example
that
I've
come
across
lately
is
from
yale's
molecular
biology
and
biophysics
department,
where
they
they
piloted
a
hiring
process
for
an
assistant
professor,
where
they
all
of
the
applicants
had
to
completely
blend
their
application.
So
no
university
names
on
their
resumes,
no
journal
article
names
their
mentors
were
blinded.
Of
course
their
names
were
blinded
in
the
entire
application
packet,
and
then
they
went
through
and
evaluated
those,
and
I
think
they
made
some
modifications
to
the
evaluation
process.
B
First,
so
of
course
triaging
applications,
one
of
the
primary
ways
that
you
can
do
that
very
quickly
is
looking
at
journal
name.
Well,
if
you
don't
have
that,
then
how
do
you?
How
do
you
do
that
triage?
So
they
had
to
think
carefully
and
develop
a
plan
for
that.
B
Yeah
exactly
excuse
me
and
then
sort
of
they
were
able
to
sort
of
measure.
They
measure
their
outcomes
in
terms
of
okay.
How
diverse
was
the
final
applicant
pool
that
we
got
to
and
invited
into
for
interviews?
B
So
I
think
that's
a
really
good
example
of
some
of
an
innovative
sort
of
research,
assessment
practice
and
I'll
drop.
They
actually
have
it
very
well
documented
on
their
website
and
it's
been
written
up
in
a
couple
different
places
and
they
also
offer
download
of
all
of
their
search
files
that
they
use.
So
every
firm
thing
from
the
slides
that
the
search
committee
shared
with
the
department
about
the
process
as
well
as
I
think
it
includes
like
the
language
in
their
advertisements
as
well.
So
this
is
the
type
of
information
that
it's
like.
B
It's
not
it's,
not
a
metric,
but
it
is
an
innovative
practice
that
we
think
is
worth
sort
of
tracking
and
sharing.
A
So
anna,
my
question
is
like:
are
you
grouping
those
practices
in
a
certain
way,
or
it's
just
a
list
of
all
these
things?
You
are
like
doing
it
out
or
like
as
a
new
person
how
to
access?
Okay,
which
area
fits
in
my
category,
for
example
as
an
individual?
Should
I
look
at
the
software
side?
Should
I
look
at
the
just
other
things
or
community
services
like
how
to
assess
those
things.
B
I
am
laughing
nervously
now
because
this
is,
I
think,
where
my
lack
of
software
and
tech
development
special
to
you
or
expertise
is
gonna
like
shine
through
real
clear,
is
what
I'm
calling
a
system
of
meta
tags.
B
So
how
can
we
create
sort
of
clusters
or
practices
to
help
sort
of
compare
contrast,
and
we
have
so
far
we're
doing
this
through
sort
of
the
type
like?
What
is
the
type
of
source
material?
Is
it
an
announcement?
Is
it
an
action
plan?
Is
it
a
practice?
Is
it
an
outcome
and
then
for
each
of
these
tags
we're
working
on
concrete
definitions
to
make
them
clear
and
then
another
category
we
have
are
the
types
of
decisions
that
impact
research
careers.
B
So
then
again,
you
can
kind
of
look
in
that
bucket
this
first
iteration
of
the
dashboard
to
make
it
manageable,
it'll,
be
faculty,
hiring,
promotion,
10-year
retention,
and
then
I
think,
sort
of
we're
getting
into
where
you
get
our
other
specific
sort
of
topical
areas
within
assessment
and
for
those
we've
come
up
with.
You
know:
open
research,
research,
integrity,
impact,
responsible,
metrics,
equity
and
inclusion,
and
this
concept
of
portfolio
assessment.
B: Then facets like academic discipline; population (like I said, right now we're starting with faculty, but we're hoping to broaden that out eventually to consider graduate students and post-docs); geographic location; time, so the year in which the action described in the source material was documented; what type of institution; and then, what is the scope of the policy: is it a department policy?
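A minimal sketch of how one tagged source-material record could be structured, assuming invented field names: the tag vocabularies come from the discussion above, while the dataclass itself and the example values are illustrative.

from dataclasses import dataclass, field

# Tag vocabularies taken from the discussion; both sets are open-ended.
TYPES = {"announcement", "action plan", "principles", "policy", "practice", "outcome"}
TOPICS = {"open research", "research integrity", "impact",
          "responsible metrics", "equity and inclusion", "portfolio assessment"}

@dataclass
class SourceMaterial:
    title: str
    material_type: str            # one of TYPES
    decision: str                 # e.g. "hiring", "promotion", "tenure", "retention"
    topics: set = field(default_factory=set)  # subset of TOPICS
    population: str = "faculty"   # later: "graduate students", "post-docs"
    year: int = 0                 # when the described action was documented
    institution_type: str = ""    # e.g. "research university"
    policy_scope: str = ""        # "department" or "institution"

# Illustrative record based on the Yale example mentioned earlier;
# year and institution type are left at their placeholder defaults.
record = SourceMaterial(
    title="Blinded assistant-professor search, Yale MB&B",
    material_type="practice",
    decision="hiring",
    topics={"equity and inclusion"},
    policy_scope="department",
)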
C: With DORA and CHAOSS, is it your hope that we can help in the development of, say, metrics, and in thinking through how metrics are developed, irrespective of how they're deployed? Is it about just documenting those metrics? Is it about developing a dashboard based on metrics that you have described in the DORA project?
C: So we try to listen and document what we hear, through a fairly rigorous process that we use to develop metrics. And those metrics are based on, if it's around DEI, what we're hearing from individuals and organizations that care about DEI. If it's around risk, it's because we're listening to and working with companies and organizations that care about risk.
C
So
I
mean
one
of
the
things
that
we
can
do
is
kind
of
help
ensure
a
process
by
which
dora
develops
the
metrics
that
you're
interested
in
like
documentation
just
documenting
those
metrics
and
elizabeth
put
a
list
in
there
of
metrics
that
are
already
developed
from
dora,
actually,
which
is
slightly
different
than
the
way
we
do.
It.
B
Oh
yeah,
and
that
is
that's
from
the
metrics
toolkit,
so
that's
actually
one
of
the
resources,
that's
in
our
library,
but
it's
a
different
initiative.
So
that's
a
specific
entity.
C: But how you might assemble those metrics in collections to be meaningful to them, because sometimes an individual metric by itself doesn't really matter, but as a collection it does start to matter; that's what we're learning in the CHAOSS project. And then, as Sean had pointed out, there's the potential to put the metrics into practice, which is kind of another step.
C
So
if
we
write
the
definition
for
the
metric
actually
deploying,
it
is
step
number
whatever
three
in
this
process
or
two,
and
so
I'm
I'm
trying
to
figure
out
where
you
need
a
hand
in
this
process,
because
it
sounds
like
you
are
working
with
some
web
development
teams
that
the
list
is
even
if
it's
a
little
out
of
band
is
a
set
of
metrics
like.
Is
it
where's
the
the
weak
spot?
Here.
C
Think
about
them
as
goals
first
and
so
what's
the
goal
you're
trying
to
achieve
and
then
what
is
the
data
you
need
to
kind
of
measure
against
that
goal,
and
so
a
metric
to
us
is
a
very
it's
a
kind
of
atomic
end
of
the
line
thing.
So
my
goal
is
to
improve
dei
with
an
event
within
an
event.
I
have
that's.
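A minimal sketch of that goal-first framing, with every name below invented for illustration rather than taken from a CHAOSS definition: a goal points at the data needed, and the metric is the atomic computation at the end of the line.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Metric:
    name: str
    data_needed: str    # what must be collected
    compute: Callable   # turns raw data into a number

@dataclass
class Goal:
    statement: str
    metrics: list

# The speaker's example: improve DEI within an event.
dei_goal = Goal(
    statement="Improve DEI within an event",
    metrics=[
        Metric(
            name="speaker_diversity_ratio",  # hypothetical metric
            data_needed="speaker demographic survey responses",
            compute=lambda d: d["underrepresented_speakers"] / d["total_speakers"],
        ),
    ],
)

sample = {"underrepresented_speakers": 4, "total_speakers": 10}
print(dei_goal.metrics[0].compute(sample))  # 0.4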
B
Yeah,
okay,
yep,
so
yeah,
okay,
I
think
yeah,
I'm
following
you
now
what
the
gap
that
I
kind
of
use,
because
we're
we're
following
a
lot
of
these
policies
and
practices
that
are
sort
of
innovative
but
then
breaking
down
sort
of
within
that
specific,
topical
area.
B
So
you
would
mention
dei,
but
also
open
research
and
research
integrity,
but
identifying
what
those
indicators
are.
I
think
I
might.
I
tend
to
prefer
sort
of
indicator
to
metric
for
a
variety
of
reasons,
but.
G: But I think there are a number that do, and so I think the mutually beneficial, or copacetic, way we could generate energy together is to have, you know, the help of DORA to develop metrics that serve the interest of the dashboard, and then have CHAOSS sort of implement and help make available those metrics.
D
I
think
I
think
something
some
additional
things
for
consideration
on
this.
First
of
all
for
both
groups.
I
know
I
sent
this
to
the
chaos
group
before,
but
this
this
concept
of
software
being
scholarship
in
and
of
itself
from
mit
law
right
and
then
we
have
the
fact
that
you
know
I'm
working
with
biologists
who
are
writing
software,
for
you
know
tracking
genetic
lines
in
vineyards.
Right
I
mean
that's,
not
what
you,
the
old
school
biologist
does
right.
You
know
yes,
okay,
it's
genomics,
but
beyond
that
right.
B: And I'm so glad you brought that up, because I think this maybe hits on another point of synergy, which is that the dashboard, in its own way, is looking at use cases of the metrics that you're developing, right? And thinking about how you want to... it might be that at the end, once they're deployed and in practice, and this is how they're being used, then that becomes a policy or practice or source material.
C: I just put in the chat that we are at the end of our time. Oh gosh, I'm sorry, yeah. I mean, would it be possible for you, Anna, or somebody else from DORA, it doesn't have to be you, to attend the CHAOSS value meetings, even if it's just once a month? We could kind of have these discussions about metrics development and think through that. And we're going to move this back an hour, so it's going to start at 10 a.m. U.S. Central.
B: Short answer is yes. I am going on maternity leave in a couple of weeks, so it may not be me per se.
C
Be
really
great
because
those
were
kind
of
and
they're
kind
of,
like
today
we
talked
about
this,
but
we
oftentimes
use
those
meetings
as
working
sessions
where
we
just
work
through
and
develop
metrics
right
there
in
on
the
spot
yeah,
so
I
mean
we
can.
We
can
move
these
forward
right
right
here
in
these
meetings.