From YouTube: Torus Community Meeting May 2023
Description
Monthly meeting to discuss Open Learning Initiative's next generation platform. This month, we presented designs for analytics and dashboard visualizations.
E: Thank you, yeah, I've just added it to my calendar in January. Cool.
F: Erin, while we're waiting for people: I realize I've been neglecting to take user stories down since the beginning of the year. So if these recordings are available, do we need to go back and do that so that we can then do product design with them? I can take care of that.
D: Yeah, let's talk about following up on that next week. Maybe for some of them, but for some of them you wouldn't need to do that. Okay.
D: So welcome, everyone. We usually get started at about 12:35 my time, which is in about three minutes, so just hang on and we'll get started.
D: For those who just joined, and who might be new to this meeting, we usually get started in about three more minutes, so just hang tight.
D: Okay, well, it is 12:35 my time, so we'll get started. Welcome, everybody, to the Torus community meeting for May. I'm Erin Swinski, product manager and learning engineering manager at the Open Learning Initiative and the Simon Initiative at Carnegie Mellon University. Today I'm joined by a few folks who can help in the demo we're going to be giving of some new analytics sketches and designs, and I'll just turn this over to Tanvi. I'll let you introduce yourself, and I assume Christine will also be helping out. So thanks, Tanvi and Christine, for coming and for sharing this information. I'll turn it over to you now.
B: Thanks, Erin. Hi everyone, I'm Tanvi, and I'm a learning engineer here at OLI. I've been working on some of the design and development of the analytics prototypes for Torus, together with Erin, Christine, and the rest of the Gates team. We'll be showcasing some of the initial prototypes that we've been working on for the version 24 release. Many of you might already have used our old platform, and have been using the analytics within it for tracking student mastery and student performance. We have been working on adding a few more features and metrics within the platform, and we want your feedback on that.
B: Again, these are all initial prototypes, and we will really benefit from any feedback you all have about these designs. With that, I'll start sharing my screen. Please feel free to interrupt me at any point of the demo if you have questions or comments, but I'll also be pausing in between to ask questions. And I'll be stopping my video, because my screen share gets wonky if I'm sharing my screen and my video at the same time.
B: This first view is what the instructor sees on logging into the course: an overview page. Here they are shown some overview information about the section, and they're also given some reminders, or recommended actions, that the instructor might need to take. You can see this top-priority-actions view, with a number of action cards showing the instructor what kinds of actions they might want to take, for example grading the assignments, or reviewing some of the materials that might be upcoming for their classes.
B: Yeah, so over here you can see the top priority actions that instructors might need to take for the class. There is also an option to hide this view, if instructors prefer that. And then over here we see a scatter plot, a matrix kind of view, where we are tracking student mastery versus student engagement. Each of these dots represents a student within the section and how they are performing.
B: So over here we have four groups. Basically, for students who are low on mastery and low on engagement, the instructor might want to take a couple of actions, based on their class and what they think might be appropriate, like sending reminder emails, or approaching these students in class and asking them to complete some of the materials in class, because they are low in engagement, so getting them to first go through the materials at all.
B: Then there is this other group, where students are high in engagement and low in mastery. Even though these students are going through the materials, they are still low in mastery, and so the instructor might want to take alternate actions, like assigning additional practice materials or setting up study groups.
B: And then there are students who are low in engagement but still high in mastery. The instructor might just want to track this group and see what kind of prior knowledge they have, or what other materials the students are referring to, such that they are doing well even without going through all the materials within the course. And then there's the group with high engagement and high mastery: students who have gone through all the materials and are also doing well within the course.
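The four quadrants described here can be sketched as a simple grouping rule. This is a hypothetical illustration only: it assumes each student already carries a "low"/"high" label on both axes, and the action strings paraphrase the suggestions from the demo; it is not the Torus implementation.

```python
# Hypothetical sketch of the mastery-vs-engagement quadrants from the demo.
# Assumes students already have "low"/"high" labels on both axes; the actual
# Torus data model and thresholds were not shown in the meeting.

SUGGESTED_ACTIONS = {
    ("low", "low"): "send reminder emails or work through materials in class",
    ("high", "low"): "assign additional practice or set up study groups",
    ("low", "high"): "track group; check prior knowledge or outside materials",
    ("high", "high"): "doing well; no action needed",
}

def quadrant_action(engagement: str, mastery: str) -> str:
    """Return the suggested instructor action for one student."""
    return SUGGESTED_ACTIONS[(engagement, mastery)]

def group_students(students: dict) -> dict:
    """Bucket students by (engagement, mastery) quadrant, as in the scatter plot.

    students maps a student name to an (engagement, mastery) label pair.
    """
    groups = {key: [] for key in SUGGESTED_ACTIONS}
    for name, key in students.items():
        groups[key].append(name)
    return groups
```

A usage sketch: `group_students({"ada": ("low", "low")})` would place "ada" in the low-engagement, low-mastery quadrant, the group the demo suggests reminder emails for.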
B: Then the instructor is also able to see a list of topics that the students are struggling with for that particular week. This is the list of learning objectives, and the masteries shown are the overall student mastery for the learning objectives given for that particular week.
B: Apart from this, we also plan to have a table on the right-hand side with topics that the students are doing well in for that week, so again a similar list of learning objectives, this time ones the students are doing well in for that particular week. And then over here we plan to have a distribution of activities that students are struggling with for that week. These are bar charts, visualizations of the activities that the students struggle with for that week and what percentage of students got those activities correct.
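The bar-chart data described here, percent of students correct per activity, can be sketched as a small aggregation. The record shape ("activity", "correct") is an assumption for illustration; the meeting did not show the real Torus data model.

```python
# Hypothetical sketch of the "struggling activities" bar-chart data: for each
# activity, the percentage of students who answered it correctly. The field
# names here are invented for illustration, not taken from Torus.

def percent_correct_by_activity(responses: list) -> dict:
    """responses: one record per (student, activity) with a boolean 'correct'.

    Returns {activity_id: percent_correct} suitable for a bar chart.
    """
    totals = {}
    correct = {}
    for r in responses:
        act = r["activity"]
        totals[act] = totals.get(act, 0) + 1
        correct[act] = correct.get(act, 0) + (1 if r["correct"] else 0)
    return {act: 100.0 * correct[act] / totals[act] for act in totals}
```

Activities with the lowest percentages would be the ones surfaced as "struggling" for the week.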
B: So this is the overall view of the overview screen that we have been working on.
B: Yeah, so basically we are tagging the activities with either the learning objectives or the sub learning objectives, and based on what we are tagging them with, it's an average of how the students are performing across all those activities. Right now it's a very straightforward calculation of whether they got it correct or incorrect on their first attempt.
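The mastery calculation described here, an average of first-attempt correctness over the activities tagged with an objective, can be sketched as follows. This is a hedged illustration of that description, not the Torus code; the record fields are invented.

```python
# Hypothetical sketch of the mastery metric as described in the meeting:
# activities are tagged with (sub) learning objectives, and mastery is the
# average of whether the student's FIRST attempt on each tagged activity was
# correct. Field names are illustrative only.

def first_attempt_mastery(attempts: list, objective: str):
    """attempts: records with 'objective', 'attempt' (1-based), and 'correct'.

    Returns the fraction of tagged activities answered correctly on attempt 1,
    or None when the student has no first attempts for this objective.
    """
    firsts = [a for a in attempts
              if a["objective"] == objective and a["attempt"] == 1]
    if not firsts:
        return None  # maps to the "not enough data" label in the UI
    return sum(1 for a in firsts if a["correct"]) / len(firsts)
```

Later attempts are deliberately ignored here, matching the "correct or incorrect in their first attempt" description.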
B: Moving on to the reports tab: over here, instructors are able to view many more sub tabs, and they are able to download these sub tabs in different formats. This first sub tab is the content sub tab, which gives an overview of all the modules: the overall student completion rate for each module, the overall student mastery for each module, and the overall student engagement for each module.
B: Yeah, and so over here in the modules tab we are able to see the overall student completion for each module, the overall student mastery for each module, and the overall student engagement for each module. For each of these metrics, I'll talk a little bit about how they are calculated right now, and we would love to get your input on this as well.
B: The student completion right now is nothing but this: if there are activities within a page, it's the percentage of activities that each student has completed within that page. Student mastery is again what I spoke about earlier: whether the student got the activity correct or incorrect on their first attempt, for each of those questions. And student engagement is a calculation that we are still working on right now.
B: And we have categorized student mastery and student engagement into low, medium, high, and not enough data, but the completion is a percentage, which is what we are currently showing within the interfaces.
B: On clicking on each of these modules, the instructor is taken to another sub tab, a table where they are shown a list of the students who have participated in that module, what their completion rate is, and what their mastery rate is for each of those modules.
B: And then there is the next sub tab, the students sub tab, which shows the list of students, the time they last interacted with the system, their overall course progress (how much they have completed), their overall course mastery, and their overall course engagement.
On clicking on each of these student names, the instructor is guided to an individual student view, where they are shown an overall view of the student. I'll just try to zoom out and show you this overall view first. So over here you can see that this first table shows the basic information about the student: their major and their background information.
B: And then we have a set of tables over here, and I'll dive a little deeper into these tables in a minute. Then we have this notes section, where the instructor can add any notes about that particular student.
B: There's the learning objectives tab, which shows the learning objective, the sub learning objective, and the student's performance for that particular learning objective and sub learning objective, indicated with the student mastery and the student engagement for those learning objectives.
B: And then we also have the quiz scores tab, which shows, for each individual quiz, the scores for the student and the number of attempts the student might have taken. On clicking on these scores, we are able to view the attempt history and exactly what answers the student clicked on for each of the questions within that quiz.
B: Going back to the overall section view, where we are able to see the whole list of students: there is the next sub tab, the learning objectives sub tab. This one again has a similar format to the learning objectives you saw in the individual student view tab, with the learning objective, the sub learning objective, the mastery of the objective, the mastery of the sub learning objective, and the overall engagement for that objective.
B: And in this view we are also able to filter these learning objectives by their modules. So if we want to view the learning objectives for a particular module, we can sort it through this filter.
B: And then we have the quiz scores tab, which shows the overall quiz scores for that particular class. I see that Gautam has a hand up.
H: Yeah, I think this is feedback I also gave on the authoring side. It's that it just says "multiple choice", which might be a placeholder here, but in Torus right now, when we collect any data, we only see "multiple choice" written, and that is not clickable. It does not have any unique ID unless we go and rename it in Torus, so as an instructional designer it's kind of a burden for me to rename every formative assessment or quiz question.
H: If, instead of "multiple choice", I had the question stem, or some kind of unique ID that is already attached to that question, then yes, I could map which question on the authoring side is being shown here. Or even if it's clickable and redirects to somewhere where I can see the question. Just feedback on that.
B: Yeah, that's a great point, and we are thinking of making this clickable as well. So we'll definitely try to have the data you're talking about included somewhere in here too.
B: And then this quiz scores view is basically like the grade book, where the instructor is able to see all the quizzes and all the students' performance on those quizzes. The ones with low scores are highlighted in red; apart from that, the rest of them are in the same color.
B: This view is again downloadable as a CSV if the instructor needs the grades outside of the platform.
B: So this was one view. I'm going to go back, and in the next view I'll show you what goes in these manage and discussion activity tabs as well.
B: This next flow, or this view, shows the reports tab as the main tab the instructor enters into. Again, this is something we were planning to have for version 24, that is, the July release.
B: We might not be able to do all of the things that are in the overview page over here for the July release, but that is planned for, I think, the October release. So for the July release we were planning to have the reports view as the main page, and then have these recommended actions for instructors over here as a card, but also something that they can hide if they need to.
B: What we also have here is the manage tab, where all the other functionalities are added for the instructor: previewing the course as an instructor, entering the course as a student, customizing the curriculum, gating and scheduling, and all the other tools that are actually in the manage course page right now.
B: Then we also have the discussion activity tab for the instructor. Over here the instructor is able to see all the posts, or the comments that the students are making on posts. They also have the ability to select them and accept or reject them based on the comments, and they have the ability to filter this view by either the most recent comments or posts that have been made, or the posts that have not yet been accepted.
B: Okay, thanks, Erin. We are looking for feedback on all these metrics and on how we are calculating them right now, for your classes.
B: And then this third flow, or this third view, shows the overview tab, in which we have the insights and the recommended actions in different sub tabs within the overview tab. So we have this scatter plot and the metrics view over here, of student engagement and student mastery. We also have recommended actions in a tabular format in this view, and then we might have a couple more sub tabs within the overview tab.
B: The reports tab is very similar to what you have been seeing in the past two flows, with the content, students, learning objectives, and quiz scores sub tabs. The manage and the discussion activity tabs are also very similar to what you were seeing in flow two, with the discussion activity having an accept and reject for all the discussion posts and comments, and the manage tab having all the other links to things that the instructor can customize.
B: So the only thing that is different in this third flow is the way we are structuring the overview, with these different sub tabs of insights and this tabular view of recommended actions, as opposed to the view where we have the action cards. We are looking for feedback on which you would probably prefer: this tabular view, or this view which can be expanded and hidden.
B: So I think with that the demo is complete, but I'll also turn it over to Erin and Christine, if they have any other questions they might want to ask about the prototypes.
D: Yeah, thanks, Tanvi, that was excellent. I can facilitate, you know, asking pointed questions about some of these things, but I imagine that Christine has some direct questions for folks, to help collect some feedback. So, Christine, feel free to jump in.
A: Yeah, hi everyone. So since Tanvi was able to show the flows for you guys, I guess I just wanted to ask for some preferences. She already asked one of the ones that I was going to ask, which was about the different variations. So let's see, one that I would like to know as well is: do we prefer this reports page, where it opens up with the cards, or the overview version, where it just has the tabs here? So I guess that was one of the questions, and feel free to type in the chat, you guys don't have to say it aloud. And then some other ones I wanted to ask were about the first prototype, the first flow: I've been playing around with essentially how these cards could look.
A: So we were getting some feedback, essentially, that these cards might be a little bit too big, maybe, or that there's too much text. So that's why we were playing around with that table formatting, and so I also decided... oh, hold on one second.
A: So there was this: I made the cards a bit smaller, to try to make them look a little less intrusive, but still, I don't know if this is a fan favorite necessarily. So that was why we went ahead and played around with this version, and we were getting mixed feedback about this as well.
A: I also made a version with the subtext in there as well, so that way they could just switch over to see... oops, wrong box... yeah, so they could switch over to that. Oops, okay, that's not right. Okay, sorry, I must have accidentally connected the wrong thing. But yeah, sorry, go ahead, Erin.
D: Oh well, I was just going to ask: so is it a matter of, do they like the card version, where it's kind of right there and ready at the time, or would they want to just go to that tab when they think they need some nudges or some recommended actions, as the system sees it? So I guess that's the first question: what are people's preferences in terms of those recommended actions?
D: Those recommended actions, I assume, are, you know, taking into account the metrics that have been collected and saying: okay, maybe this group of students needs attention, or this particular student needs attention, or it's time to grade this assignment. I think those are some of the examples of recommended actions. So do folks feel that that's helpful?
A: Yeah, so those are the two versions that I made. Feel free to weigh in.
D: I would just keep that up as we start to dig in and have questions. So, folks, what are initial reactions? Anybody have strong reactions to the designs?
H: Yeah, if you click on it, you can see in Canvas that on the right side there are 28 students who need to be graded for this assignment, 17 students who need to be graded for that assignment. So there is a list of five to six items that, as an instructor, you can see, and I think I really prefer that, because it kind of helps me prioritize which task I want to tackle next. The cards on the entire page seem to be too much.
D: Okay, so if you look at the other version of the cards, which is just a table, I think that's more in alignment with what you're thinking, but you would want it on the left side, is what you're saying?
H: No, I was saying that I would not want to go to a separate page every time just to see what the recommended actions are. If, instead of cards in the overview page, there was a... I can show you how the to-do list looks, if it is helpful, yeah.
D: Yeah, I was just going to switch to Caitlyn's comment. Caitlyn, yeah, we did meet with Ken last week, and he says he tends to look at this data as he's on his way to class, on his mobile phone. How many folks here think that's a practice they perform?
I: That's not really what my comment was about. It was more about: he said there was a deal breaker for him, that if we don't include it, he doesn't want to use Torus. What was that? I forgot, yeah.
H: I have it; I can share it with everyone. It dates right back to the screen which he said was a deal breaker for whether he is going to use it. It was about: where did the students make the most mistakes, so he can discuss it in class as he walks in, and, for a particular question, what were the student responses? So if a lot of students answered something incorrectly, he can discuss that kind of thing, because they might have that misconception.
D: Yeah, and by the way, we are working on these types of deep-dive analytics; we just don't have the mock-ups for them yet. You know, we're starting kind of broad and then digging in. It's like an onion: there are lots of layers to peel, and we wanted to give folks the summary information first and then dig in, and that will be one of the views we eventually have.
G: So, Erin and Tanvi, this was very helpful. I think that I similarly would want to be able to leverage the information to know which quiz questions were the most problematic, so I liked the suggestion that someone gave earlier in the presentation: as opposed to having a question labeled as "multiple choice", to have it labeled as, you know, quiz 57, item five, to know which questions were the most problematic. The other thing that I think would be a very innovative feature that might be nice: it's really helpful, when people are actually making exams from existing assessments and quizzes, to be able to choose items for transfer based upon the set of learning objectives that they want to track and follow. So I'm wondering if that could be something where, if I was making an exam, I would be able to use that learning objective screen and say: okay, of these learning objectives, I want to make sure I make transfer questions for the following quiz questions.
D: I think it does; it's just that we might want to ask some clarifying questions to tease that apart. I think what I heard is a use case.
D: Something you'd want to be able to do is to dive in: you know, at a glance look at the learning objectives, see where folks are maybe still struggling with learning objectives, then be able to dig into the questions associated with that learning objective, and then actually select questions to create, say, a new quiz for students, with the questions that they're struggling with. Is that kind of what I was hearing?
G: Yeah, absolutely. So that would be one thing to do: make a new quiz for people to review, with the questions from each learning objective that they're struggling with. The other use case that I think is really helpful: I've seen people who basically make exams using existing questions that they then make transfer questions for, but they like to make sure they have an equal representation of their learning objectives, right?
G: That's how they make those exams, so that's one of the things that's really awesome about the way this Torus system is set up: it actually makes it easier for someone to make those decisions. So that's what I was curious about, looking at this learning objective screen. I think I could probably do that, given the way the screen is, because I can choose which questions to make transfer questions for.
D: That's super cool, because we really want to encourage, you know, evidence-based practices by instructors, and you just gave a really interesting use case where we would be encouraging instructors, in a way, to review the data and to use that data in a constructive way to help students toward more mastery.
D: That's exactly the kind of behavior we want to be able to encourage from our instructors, so that's an interesting suggestion, thanks, Lauren. And by the way, we are talking all the time about this idea of a homework system, meaning being able to use the practice and the quizzes and the questions that are in a course in different ways, meaning, you know, building something like a homework.
G: Tanvi, what I mean by transfer questions is this: in science-of-teaching-and-learning research, you know, I do that all the time. Basically, what we'll do is we'll have homework activity questions, and we'll make sure that the exam questions ask the same type of question that is in the activity, but in a different way. If it's a question where basically all you need is the information that you retrieved or elaborated on in the original activity,
that's considered near transfer, whereas if you use that information in a new or novel way, it's considered a far transfer question. So I guess I'm also sort of wondering, based upon what you just said, Erin: it would be really helpful if, let's say I was building that new homework activity as you were describing, there was a way for me to click and select several of these learning objectives, like those low ones, pull those, and then export them into a new...
D: Yeah, like I said, I'm always looking for ideas for how we can help instructors generate more questions against the learning objectives, either, you know, to assign for homework, especially for folks who have already worked through the practice that exists and still need more, right? But then also being able to tie that data back to the same set of objectives and skills is really cool too. Interesting.
D: So, so far, does it look like we have the basic analytics that would be needed to manage a course today? Knowing that we're still going to do drill-downs, you know, we're still drilling in, but say this is all the analytics we can get out for fall: is that helpful for folks? Is that good enough? Thanks, Natalie, that's helpful.
D: I know that Ken wants the question drill-down, to be able to see kind of the item-by-item responses and the distractor responses, all of that, and again, we are working on that, because that's a very powerful way to look at the data, for sure.
D: Okay, thanks, Lauren. Daniel, that's a good caveat, right: it's helpful as long as you know what you're looking at. I feel like I could have a whole session just on naming conventions and automatic naming ideas.
D: We have some already that we're kicking around. Of course, Gautam shared some of those, like being able to see the beginning of the question stem, and actually being able to click on that and have it take you directly to that question.
E: This is more for written responses and wouldn't necessarily be related to that overview page. It's not something you can do in Canvas, which I always struggle with, so I require my students to submit via Google Docs, so I can give them comments inline, because otherwise you're just writing in the comment box and you can't even bold or italicize or anything. With D2L Brightspace, which was an LMS I previously worked with,
you could give inline comments. Also with, I think, Supersite by Vista Higher Learning, MindTap, and maybe a couple of the big textbook publishers for language learning: on their websites you could give inline feedback, which is really helpful if a student just missed an accent, or you just wanted to say, like, hey,
this is a conjugation thing. But I anticipate it would be useful for other subjects too, whenever you're asking the students to write something and submit it. So this would be more related to when you're actually grading and giving feedback. That's something I would like.
D: Okay, that's super helpful; nobody's mentioned anything like that before. We've been working on, and will be working on, what we're calling an annotation system. The first part of that implementation was to even just get discussion boards into the course; the next is to really look at how we could use it. But we were looking at it from the student perspective, right: letting students highlight things, ask questions, and use that discussion and annotation together. I never thought about it, and I don't think we've ever heard anyone ask, on the instructor's side of things, to be able to go in and highlight, especially in open responses, really quickly.
D: Interesting, okay. We'll probably follow up on some more ideas about that with you for sure, because that's the first I've heard of it. That's cool. While we're thinking of pie-in-the-sky ideas, anybody have any others?
B: I also wanted to ask a question to the group about Torus engagement.
G: I can tell you that there's a presentation each semester by our sister group that describes how you can view the analytics in Canvas, and because a lot of the analytics have to do with page views, as opposed to completion (you know, there's activity completion, but if you look at page views), there's a lot of pressure not to use that as a basis for grading. So I think, at least at our university, people are basically cautioned not to do that.
B: Yeah, right now, as you saw in the design, we have these two metrics, student completion and student engagement. But what we're really thinking is that if engagement is something that instructors are really not counting toward anything, then it might be redundant, and we might just use student completion as a metric and not have engagement.
D: I was just going to say, it might help to know what we're talking about and what the differences are. So, engagement: how we're calculating that, and I'm not going to give you the exact calculation, but we're taking into account page views, activity use (so the amount of practice the students touch), and how many discussion posts they either submit or answer. So we're trying to calculate engagement as the total engagement with the system and all the different pieces and parts of it.
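The engagement composite just described, page views, activity use, and discussion posts combined, can be sketched as a weighted score. Since the exact calculation was deliberately not given, everything below is an assumption: the normalization, the default weights, and leaving the weights adjustable are all invented for illustration.

```python
# Hypothetical sketch of the engagement composite described in the meeting:
# page views, activity use, and discussion posts, combined with weights.
# The normalization caps and the default weights are invented; the real
# Torus calculation was explicitly not shared.

def engagement_score(page_views: int,
                     activities_touched: int,
                     total_activities: int,
                     discussion_posts: int,
                     weights: tuple = (0.3, 0.5, 0.2),
                     max_page_views: int = 100,
                     max_posts: int = 20) -> float:
    """Weighted 0-1 composite; each component is clamped to [0, 1] first."""
    w_views, w_acts, w_posts = weights
    views = min(page_views / max_page_views, 1.0)
    acts = activities_touched / total_activities if total_activities else 0.0
    posts = min(discussion_posts / max_posts, 1.0)
    return w_views * views + w_acts * acts + w_posts * posts
```

Keeping `weights` as a parameter reflects the open question of how much each component should count; an instructor-adjustable weighting would just pass a different tuple.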
D: However, when we start to get into the details of discussing that, you know, different people have different ideas of how much those things should be weighted, whether they even need to be included, and whether it really gives you anything different from straight completion. So how about starting there: is that something that is interesting for you all to know?
H: Yeah. As long as it looks like they made an effort to engage in the discussion, even if they did not do all the required discussion, I just give them a full grade, and for formative work I just take a glance; I never check whether they are actually doing every single one of them. But I think having some exact statistics and that kind of thing might be helpful, okay.
D: Yeah, but I wonder, if we're calculating the engagement for you, whether that would be a more consistent measure, you know, even across students, and it'd be more fair. I don't know, I'm just throwing it out there, because then there's something you could point to, instead of being like: well, I kind of fuzzily did this, and I can't really justify how I did it differently from student to student.
H: Yeah, they've never challenged any participation score, and it's mostly because I only cut any grade when I see that they are not engaging and have just stopped. But I think having something to refer to might be helpful. I can see an overall statistic being useful, but I think it will depend on whether it's exactly how I want it to be. If it's not how I want it to be, I'll just default to taking a glance and making up my own mind on how to grade students.
D: Yeah, I wonder if we even release it along with how we calculate it, collect really detailed feedback on how folks are or aren't using that, and then maybe even downstream put in some functionality where instructors could adjust those calculations.
H: I would definitely want to adjust, because sometimes it happens that students do not post anything on the discussion forum, but they exchange a lot of emails with me, so I count that as the participation grade instead, and Ken and Michael, like everyone at CMU, are fine with it. So it obviously depends on the instructor, since it's going outside the platform. So an ability to adjust would definitely be appreciated.
J: So I just wanted to point out that we do have a separate kind of thread pushing forward on exploring how to incorporate participation specifically into the system as a metric, right, one that we can track and even end up rolling back and appearing in the grade book. So this engagement metric isn't necessarily the same thing. I think what originally motivated us with this idea of engagement was a metric
that instructors could look at to see which students are actually doing things yet not getting good results, right, so that the instructors could then reach out and check on that group of students. So I think there's some overlap there with what I think we're considering participation to be, but I think we have to be careful to keep maybe that original goal.
H: On that, I completely agree about defining engagement that way. Maybe part of my concern was that when I saw that graph, there was an option to send a reminder email, and it looked like it would go to all the students in the entire quadrant. But sometimes, if I'm contacting students who are not keeping up with the course, I mostly send reminder emails to someone who I have not heard anything from.
D: Yeah, oh great, thank you for commenting on that. That's really helpful, because I don't think anybody's given us feedback like that; that was new to me, anyway, seeing that button there. I know that we wanted to do suggested actions, but I can imagine, based on your comment, Gautam, being able to click on that but then still be able to exclude some students in that quadrant from the email.
J: Yeah, there were actually some early designs produced for that kind of bulk email that we didn't show here, but it actually worked the way that I think you were hoping it would: once the instructor clicked the button to send that email, they would actually first see a view that had, right, the list of all students that the email would be sent to.
J: So that is being tracked at the moment and is visible already as a feature for instructors. For graded pages, where there is a distinct end to the attempt, where the student submits that entire page attempt, the system will show the instructor, you know, how many minutes the student did or did not spend working on that quiz.
D: So we're over time by a minute. Really, thank you all. Thank you, Tanvi and Christine, for sharing those designs, and thanks for all the feedback that we've collected from you all today. Don't hesitate to keep submitting feedback if you think of something afterward: send me an email, or visit the Torus pages; there are places there you can click to get right to submitting a ticket or an issue.
D: There's also a form where you can send suggestions for features, so definitely make use of those. We're hoping at the next meeting in June to maybe showcase some of the advanced authoring capabilities, so stay tuned for that. I hope everybody has a great Cinco de Mayo and a great weekend. Thanks, everyone.