From YouTube: Torus Community Meeting September 2022
Description
Monthly meeting to discuss Open Learning Initiative's next generation platform. In this meeting, we discussed functionality and features to support metacognitive activities.
Find the Slide Deck here: https://docs.google.com/presentation/d/1PXRSIVLNaibUTSyEOOg54jzyCjL_CUAP/edit?usp=sharing&ouid=105645051721469794447&rtpof=true&sd=true
B: Good, good. Probably a small — I assume it'll be a small group today, it being the Friday before a holiday weekend. But as long as some key people are here — and I see Lawrence here, that's great — I think we'll be able to work through a lot of cool stuff. So, as usual, I'm going to wait about four more minutes and then we'll get started. Hang tight till then.
B: While we're waiting — what is the...
B: That is a hard question to answer, but I will tell you this: they've already started putting things in place, getting prepared for language courses, so they're working on that functionality, because we really want to get those in. Right now we're targeting chemistry, mainly because it has one of the biggest feature sets of any of our courses, and we need to work on it for a grant that we have. But at the same time that they're building out the functionality for chemistry, it also happens to build out a lot of the same functionality the language courses use. I'm even impressed that they're adding support for language courses along the way — they have that in the back of their minds as they build out chemistry. So what we're targeting is chemistry, and then we'll probably want to meet with you all very frequently to find out how to get things into Torus and what you need, besides what we already know you need based on the DTDs and the XML structure. So I don't have a good timeline. If I had to give an estimate based on what I've seen and what I know, I would probably say we will start talking seriously about migrating language courses at the beginning of the new year, because we've —
C: No, because I haven't tested it in about three years. But the consensus response I got back was —
B: I would call that pretty soon. We know that you all are actively developing and improving, and now hooking up your skills maps and all of that, so it'll be so cool to see those courses over in Torus. It's a priority for us, and they're easier to support there as well, just in general — with the rapid ability to publish changes immediately and that sort of thing.

B: By the way, just recently — just this past week — they landed the functionality to do pronunciations and things like that. So there's a whole other rich set of features around audio that you have been asking for, that we haven't been able to do in the past, that we will be able to do in Torus. We're constantly thinking about what we need for the language courses, because they're so important.
B: The way you're describing it is nothing I've heard of yet. — Oh really? I think it's just been in casual conversation. What we do is evaluate the courses themselves — the XML and what kind of DTD elements they're using — and then, based on conversations I've had with different folks, we're not developing specific functionality just for languages, but stuff that we've heard you would also take advantage of, that kind of thing.
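As a rough illustration of the evaluation B describes — inventorying which elements a legacy course's XML actually uses so you can scope the Torus functionality a migration needs — here is a minimal sketch. The directory layout is an assumption for illustration, not the real OLI course structure:

```python
# Hypothetical sketch: count which XML elements a legacy course uses.
# "legacy_course/content" is an illustrative path, not a real layout.
from collections import Counter
from pathlib import Path
import xml.etree.ElementTree as ET

def element_usage(course_dir: str) -> Counter:
    """Count occurrences of each element tag across all .xml files in a course."""
    counts = Counter()
    for xml_file in Path(course_dir).rglob("*.xml"):
        try:
            tree = ET.parse(xml_file)
        except ET.ParseError:
            continue  # skip files that don't parse cleanly
        for node in tree.iter():
            counts[node.tag] += 1
    return counts

if __name__ == "__main__":
    usage = element_usage("legacy_course/content")
    for tag, n in usage.most_common(20):
        print(f"{tag}: {n}")  # the most-used elements drive migration priorities
```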
B: There are things we can do now in Torus. I don't know if you'd ever want something like this, but there's a whole system in place where students could upload things like audio files, or any kind of file.
C: We'd want to have that, yeah — I had to actually write my own recording app.
B: So there are a lot of other things. As we build out, we're constantly thinking: how would the languages use this? Right now we're basically trying to replicate what's available in the XML, but the engineers are enhancing some of those features as they go. And then once we have chemistry in, like I said, we'll probably circle back around and say: okay, what don't we have now that you would like? And get that in place. Great, thanks. Sure.
B: So that was a good discussion. Where can I get started, then? Today's topic is about metacognition: the different ways metacognition can happen in a course, and how we can promote good metacognitive strategies for our students. That will be a way of thinking about, okay, what are all the possibilities of things we want to be able to do? And then I'll show a demo of what is currently our survey capability, which is a big component of doing metacognitive activities — having activities that don't necessarily have a right or wrong answer, but that offer some feedback.

B: In our past surveying capabilities you could only get a Likert scale; there was really only one kind of activity you could use that didn't give right-or-wrong answers. Then, for the Student Cognition Toolbox, they've done some creative things with the current functionality to enable their surveying capabilities, but we also built a special thing for them to be able to give the students a report back on their surveys. That part, I can say, hasn't yet been built, but we're in the process. So let me start with — let's see — okay, today's topic: metacognitive skill building and practice. Everything about these activities is on the table.
B: Question types; strategies for how you incorporate more metacognition in your courses, because that could lead to ideas about the kind of functionality we need to build to support those different strategies; anything about data collection and the way you want to collect that data, or see it reported back. Then I'll also show some existing lightweight functionality that exists right now in Torus. We've developed the ability to provide surveys, but we haven't developed the full reporting structure yet, so I always want to hear all of your ideas and brainstorming on these things. I'm just trying to move my windows around a little bit — getting stuck — I'd like to see everybody.
B: Okay, so I have to give a big shout-out to Caitlin, who's on the call. A while ago, she and I were discussing different metacognition strategies and trying to find all the different ways it's been done in various courses, and she compiled this fantastic document: a PowerPoint of different strategies that are used across different courses at OLI. It's just really cool, so I hope it'll be able to spark some discussion around the kinds of things we would like to be doing, the kinds of things we are doing, and maybe even thinking beyond that.

B: I'll go through this really quickly, again just to spark some creative brainstorming and ideas. So this is one way — I think this is in, let's see, a psychology course.
B: Many of you are probably familiar with these My Response activities that we developed. We tend to put them at the end of every module, and we designed them so that there's a Likert scale where students can describe what level of comfort, or performance, they think they would have against the objectives in that module, and then some open-ended questions about whether there's anything else that's not clear to them. This gives a lot of good information in real time, not only to the instructors, about where the students feel least or most confident, or where they might still be struggling with certain topics. We also have a reporting mechanism where you can show the results to a class by stripping out names, so you can use it as a discussion prompt: hey, this is where the class is. That helps folks feel like, okay, you're not alone — everybody's struggling with these same concepts. Or: everybody seems really confident in these concepts, so we're not going to spend more time on them; we'll spend more time on the things you feel you're still struggling with.

B: The way this currently exists in our legacy system is as a kind of automated activity type that is placed wherever you want it, but you can't really change it very much. So that's one of its limitations. But just as a strategy, having students reflect on their learning is a fantastic way of getting them to practice metacognition: having them really think about how confident they are with these concepts, and whether they could perform these learning objectives with some support, or on their own at all. I remember when this was first developed, the wording —
B: "How comfortable are you in performing these learning objectives?" — from "not at all" up to (sorry, I've got to move my window to see it) "on my own," and everything in between — was really rich data to collect on the students. So, explaining metacognition: just telling students that this is a thing, that they should practice it and be aware of it, is a way to seed that ability in students.
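For concreteness, a My Response activity as described above could be modeled roughly like this. The field names and scale wording are illustrative, not the actual OLI schema:

```python
# A minimal sketch (not the real OLI format) of a "My Response" activity:
# one Likert item per module objective plus an open-ended prompt,
# with no right/wrong answer anywhere.
MY_RESPONSE = {
    "module": "Module 4: Memory",
    "likert_scale": ["Not at all", "With a lot of support",
                     "With some support", "On my own"],
    "objectives": [
        "Describe the stages of memory formation",
        "Explain common causes of forgetting",
    ],
    "open_ended": "Is there anything else that's not clear to you?",
}

def render(activity: dict) -> None:
    """Print the activity roughly the way a student would encounter it."""
    print(activity["module"])
    for obj in activity["objectives"]:
        print(f"  How comfortable are you with: {obj}?")
        print(f"    Options: {', '.join(activity['likert_scale'])}")
    print(activity["open_ended"])

render(MY_RESPONSE)
```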
B: This is something they need to practice, and just letting them know that it exists — that it's a metacognitive strategy — and even going through and explaining active learning, maybe even explaining growth mindset to some degree, is always good for helping students feel more comfortable and confident. That's especially true when there's a lot of desirable difficulty built into a course and they need to persevere and have some struggle before they really learn the material, retain it, and can transfer that knowledge. So this is really a strategy of letting them know that it's okay, that they're going to struggle, and how to really take advantage of the practice we build into our courses. Next, addressing misconceptions head-on. I'm not sure what course this is in — "evidence-based practice" — this is probably from our EBM course, the evidence-based management course.
B: They do a really good job — and you'll probably see a couple of different examples of this in here — of saying, head-on: here's a misconception that we have, and just know that it's common among all students. That's one way of addressing misconceptions and building up practice and metacognition. Another example is probably in these slides — actually, I know it is, so I'm going to save that. But in A&P — the anatomy and physiology course that I developed years and years ago — I tried to do something where, at the beginning of every module or unit, I believe it was... say it was the skeletal unit: I would say, here are common myths about the skeletal system that you may have heard. Because, again, people usually come to that course thinking —
B: — they know a lot about their bodies: "I know how things are." So it was fun to think about and embed those questions in that course, to address the misconceptions they might have right before they even dive into any of the materials, so they can start to weigh their prior knowledge against what they learn.
B: Pacing expectations are another one: letting students know that this is going to take some time, maybe even giving them some idea of how much time to expect it to take, so they can start to learn how to gauge for themselves how much time to spend on something before they need to ask for more help. Or the ability to recognize: oh, I kind of know this, so I can go a little faster here, because I know the next topic is going to be more challenging for me and that's where I want to spend most of my time. So: giving them the opportunity, and even the prompts, to begin to think about their own pacing and to learn how to adjust it for themselves.
B: Here's the one I was going to bring up. The evidence-based management course actually embeds these little prompts throughout the course to say: hey, you've probably been consuming a whole bunch of material here — why don't you take a break and let it absorb? You know, take a nap.

B: Hints — let me see what this one's about, but I can imagine: "take a break." Let's see what this really means. Okay: by prompting learners to take a break, the other course is encouraging self-monitoring. So again, the point is: don't just put all the metacognitive guidance at the beginning of your course and say, okay, do all this metacognitive stuff — really embed it as you go through the course.
B: You know, reminding students: okay, take a break, think about this process — it's just always a good strategy. Flow charts and diagrams: yes, actually, they really help solidify concepts that are interconnected. They also help students understand where they are in the content and how much more they have to learn about a thing. And laying things out — providing really clear infographics and different representations of the same information — the more you do that and build it into your courses, the more the students experience it, and the more they'll be able to do it for themselves. When they encounter something new, they'll be able to visualize their own little mind map or flowchart of the information. The more of that scaffolding they get, the more they'll be able to do it for themselves.
B: There are only a few more here. This is the same idea — a graphic organizer. We do this in our courses with a "big picture": for every course we've developed, we try to build a big picture of the whole course, so students understand where they are in the course, everything they'll be learning, and maybe even how it connects to the bigger outside world. We do that with our statistics big picture — I think that's a good one that shows: okay, you're learning this in the course, but it really fits into a bigger picture than that, into the domain, and this is how what you learn now will connect to and help you later, as you learn more things in the same domain.
B: I'm just trying to get more and more examples. And what's this one? "Make and explain a prediction" — oh, this is a new thing we're trying to do a lot of in our courses. This is based on Paulo — I would butcher his last name, and I apologize — he's a professor and researcher in the METALS program, and he did some really cool experiments. Thank you, Caitlin, for putting that in the text chat. He generated some new activities where students had to make a prediction, then observe something happening, and then refine that prediction. Essentially, it's a way of embedding —
B: Well, in psychology it's a way of bringing the subject back to a scientific method: learning psychology and learning about the science of psychology. But it's also just a great tactic to help students gauge themselves and see down the road — make a prediction, then see how reality changes it. It's kind of like the misconceptions, or the myths I talked about in A&P: confronting students right away about what they think is going on, having them actually do something that might change that, and then having them add that new information to their prior knowledge in a way that helps them going forward. So I just wanted to quickly go through some of these strategies we've found throughout our courses. I'm curious whether there's anything different that any of you do to support metacognition in your courses.
B: That's really what I'm trying to get at: are there ways you've thought of — like, oh, it'd be really cool if we could do this, and it would really help our students learn how to learn better — but you didn't have the functionality or ability to do it?
C: Aaron? Yes, hi. I will say I don't have any good answers for this, but we've talked about how I'm attempting to use a more abstract representation for causal pathways, encouraging students in biology to see the commonality that exists across topics. What you might want to do is encourage students to engage in the metacognitive activity of shifting from the minutiae to the more abstract representation shared between topics. When are the best times to ask them to reflect on the abstractions behind the minutiae, and to see how those commonalities exist between topics?
B: I would say you want to do it as early and often as you can, and I will be interested in hearing your ideas about that.
C: Well, it's kind of like the way people say you're supposed to give talks: you tell people what you're going to tell them, then you tell them, and then you tell them what you told them. So perhaps the answer is just to do it as often as possible.
C: I think it's also an extremely interesting question that, as far as I'm aware, nobody has good answers to. Martina Rau has done some interesting work on this with multiple representations. But if anything I've said reminds people of any particular research — on seeing the commonality that exists across topics, how to get students to see it, and using multiple representations to encourage that — I would be interested in pointers to it.
C: R-A-U. Her thesis was on multiple representations in chemistry, so whoever's working on the grant for the chemistry stuff... Okay, well, I'm sure Dave is very familiar with Martina's work; he's probably had several meetings with her. She was a grad student at CMU — she probably got her PhD about a decade ago. Okay.
A: I've actually just read a bunch of research on this — so, yay. Sorry, this is Lauren; I should introduce myself in case you don't know me. I'm at the University of New Hampshire, and I work in the learning sciences, basically. I'm leading a special interest group this fall on critical thinking and encouraging it in your courses with a variety of active learning activities: case studies, problem-solving, difficult STEM approaches, things like that. And Linda B. Nilson's book, which I would highly recommend — I'll put the link in the chat — is also a freely available resource through my university. Nice.

A: As for when: at the beginning of the lesson. So I agree — basically, as early and as often as you can — but there seemed to be compelling evidence that you could do it every single lesson: before you jump into the content, explicitly say where it fits into the big picture. There's support for much better retention if you do that. I don't know if that helps or not, but I'll put the link in the chat — that's an explanation of why I'm posting that book.
B: Yeah, cool — thanks for sharing that. I'll also make a plug for How Learning Works, which was developed by Marsha Lovett and colleagues at CMU, in the Eberly Center at the time, I believe. Right now there's a pre-order on a second edition, which is now eight principles of how learning works, and they have a whole section on metacognition as well, with some ideas about how to support it. So that's another good resource. And then, Lauren, when you give me yours, I'm going to put it on the slide too — or you might have already done that. By the way, Alex has started helping me link all of the slide decks from these meetings in the descriptions of the YouTube videos, so if you ever want to see the notes taken in a particular meeting, anything we kept will be out there — or it might already be there; actually, I think she finished that yesterday. So thanks, Lauren, for that, and thanks, Alex!
B: Does anybody else have any other ideas for how they support metacognition, or want to support metacognition — any other research you want to share? In fact, let me look at my bibliography. I'm going to stop sharing real quick just so I can switch to a screen where I can do a search, because I think I've collected some articles there as well. Let me see if I have metacognition... I don't, but you know what, I'm going to create a section now with these resources you're sharing. Just so you know what I'm talking about — even though there isn't anything on metacognition in it yet — I call it a learning engineering bibliography. It's a collection of studies that my team at OLI tends to cite frequently. So there's that, and now I'll be building out a section on metacognition.
A: Our Student Cognition Toolbox deals with different study strategies and also asks students to reflect, every single module, on what they're learning — so we're constantly asking students to think about what they're learning in the context of what they already know. I'll show you an example of that in a minute, but the survey tool that was created by CMU has become a really important cornerstone of our toolbox. We use it in the first module and the final module, and it looks like this: it looks like a My Response, but then you click on it and it's a survey. We use it in a couple of different ways, but this particular survey — you can see I've already filled it out, for ease — asks students to reflect on how they space out their study, whether they explain concepts to classmates or friends, or whether they just copy their lecture notes or course material — basically memorization. So I ask them to reflect on how they're currently studying, and that kind of reflection gets them going.
A: So you can have students answer simple questions, as learn-by-doings, to get at this. And then you can also leverage the power of the survey tool not only to ask them really helpful survey questions displayed in an elegant fashion, but also to generate these types of graphics, which I think are very helpful for students to see at a glance. I'll stop sharing — that was a lot of information; I don't know if anyone has questions.
B: And just to note for everyone here: the Student Cognition Toolbox is available. It's what we call a sidecar — a short course that can be used in conjunction with other courses. As Lauren said, it's a whole course that's really trying to get students to understand how they learn and the best approaches to learning for the type of material they're studying, so it's complementary to any course where students need to learn, retain, and transfer information. Ask us about it if you want to use it. Thanks, Lauren, for sharing that. If anybody has questions, I'll pause here for them.
B: Okay. Since Lauren was showing off some functionality that already exists in Legacy, what I'm going to do now is show what's been done so far in Torus to start to support surveying capabilities. The reason I'm showing off surveys, as you can imagine, is that they're one of the best tools to help students be aware of, and learn, how they learn — something that doesn't necessarily have a right or wrong answer associated with it. If you've used any of the activity types in Legacy, you know they always have to have a right-or-wrong association, which is why we developed that My Response type and the surveying capabilities, and further developed them for the Student Cognition Toolbox as well. So let me show you. This isn't all built out yet, but I think, Lauren, you'll see some cool things that you probably couldn't do before.
B: As always, my disclaimer: forgive me if I stumble around Torus a little bit. I'm always learning; there's new functionality added weekly. The engineering team is amazing — it's just full speed ahead, pumping out features — so sometimes I'm surprised by what I see, or I don't know exactly how it's intended to be used. So do keep that disclaimer in mind.

B: I did create a survey already — yeah, "survey example." A couple of new things you'll see as part of the survey aren't unique to the surveying capabilities: we've had this ability to do grouping on a page.
B: If you remember, I tend to use — and most of us at OLI tend to use — the terms "activity" and "question" interchangeably. But if I really had to define them, the right way to use them is that activities are sometimes made up of one or more questions, all organized around a particular topic, learning objective, or skill. In Legacy, you could put multiple different types of questions in one visual box, so to speak, and that showed the questions were grouped into an activity that usually targeted one or two skills, or at least was all in service of the learning objective on the page. In Torus, the way we've mimicked that grouping behavior — the ability to group questions — is this grouping capability. And what's cool about it is that you can have groups of things and then move the whole group around using drag and drop, and really lay out your page design using this grouping capability, or not.
B: What I did here is create a group that is a survey. The way I did that, by the way: anywhere in Torus you see a page, this little bar gives you a toolbar, and you can hit Group to group things and Survey to get survey-type questions. In this survey, because I selected Survey, I can edit the title, and then I added one, two, three questions to it. The cool thing is that all the different question types are available inside a survey, so you're not forced to use just an open-ended, a multiple choice, or a Likert scale.
B: We actually have all of those available, and you can use any of them: here's a multiple choice, here's a single response, and here's another single response that I was playing around with to mimic a survey. Right now we also have all these other things available, like answer keys and hints, mainly because these are exactly the same activities as are available anywhere outside of a survey too.
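A survey group like the one demoed might be represented along these lines — a sketch only; the field names are guesses for illustration, not the actual Torus schema:

```python
# Illustrative only: a grouped set of mixed question types marked as a
# survey, submitted as one unit, with no right/wrong scoring.
survey_group = {
    "type": "group",
    "purpose": "survey",      # selecting "survey" relaxes right/wrong scoring
    "title": "Survey example",
    "children": [
        {"type": "multiple_choice",
         "stem": "Which study strategy do you use most?",
         "choices": ["Rereading", "Self-testing", "Spacing"],
         "scored": False},
        {"type": "single_response",
         "stem": "Describe how you prepared this week.",
         "scored": False},
        {"type": "likert",
         "stem": "I feel confident about this unit.",
         "scale": 5,
         "scored": False},
    ],
    "submit": "whole_group",  # one submit button for the entire survey
}
```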
B: Some functionality I've thought of while playing with this: being able to offer feedback, either per question or for the whole survey, is something I think I would want as an enhancement — that doesn't exist yet. And, as I said, the reporting functionality doesn't exist yet either. But let me show you what this looks like in a preview and how it functions.
B: The other piece that hasn't been built out yet is the data collection capability — I'll show you that in a second; let me come back to it. Right now, if I fill out the survey: it puts everything in one box and has a single submit button for the whole survey. I can go through and say, okay, everything is ready — yes or no, other questions — and then when I submit, I'm submitting all of it as one survey, and it tells you it's been submitted.

B: So that's what we have right now. What's different from Legacy, again, is the ability to use any question type without a clear right or wrong answer: when a question is in the survey box, it doesn't have to have one. Right now there's no feedback either — even if I wrote feedback in the authoring, it wouldn't show up anywhere. And then the reporting — and then the data. Let me show the data for a second.
B: Right now, as you all know, we don't have a dashboard yet, and we definitely don't have the capabilities of the Student Cognition Toolbox to generate that dynamic report from what's collected as a course is delivered. So far — and naming conventions are really important, by the way, when you're building these courses in Torus — the data does feed this view here. Although I can't see it at the moment, it will tell you what the answers are... Actually, I apologize: I don't know exactly how the data is surfaced and where you can see it. I asked this question yesterday, but I guess I didn't pay attention to the answer.
B: There's no place right now to see the open-ended responses unless you download the raw data. By the way, you can always download the raw data, and you'll get everything that was generated. I thought I heard that at least the number of attempts shows up, but I guess it doesn't really matter here, because you wouldn't expect any of those survey questions to be "correct," or eventually correct — so it doesn't really make sense to use this same interface for them, although I thought they'd totally just feed into it. Anyway, that's what we have so far. So: between knowing what we're trying to accomplish, how we definitely want to replicate or improve on what we've done in Legacy, and the different metacognitive strategies you can take in your courses —
B: — give me some feedback on any of these areas: reporting — and when I say reporting, I mean question types, how those questions should behave, what the data collection could look like. Anything is up for grabs at the moment. That's one of the reasons I'm having this conversation now, before it's fully built out, so that we can build these things in.

B: So, Lauren, let me pick on you for a second. The reporting capability you have right now — think back to when it was developed. Was it developed exactly the way you wanted it, or were there things where you had to make compromises because of the way Legacy worked?
A: That sort of format works very well for us. One of the capabilities we dreamed about that definitely was not possible — and that we kind of abandoned — was that we were really interested in almost crowdsourcing the current responses, so students could see how they responded, but also the current responses of everybody else who had gone through the course, to compare theirs to other people's. We were going to use different questions for that — not the same ones I just showed you; there were different types of things we were going to ask. But it would be really awesome if, basically, it could also generate a graph that was representative, in real time, of what other people have answered before. I don't know if that answers your question, but that was something we dreamed of that didn't exist.
B: That's excellent — and in fact that's a really good idea I hadn't thought of. We've been thinking about building something like that for crowdsourcing places where students need help: being able to have them annotate and say, here's someplace I don't quite understand, but then also making that feed a general crowdsourced view, so you could see: okay, there are other people in my class having the same issue, the same struggle. Allowing just that visibility could help students connect up and form study groups and things like that. But I love this idea. Crowdsourcing responses, in your situation, is a tool with a lot of good things about it, and at the same time we could use that same functionality in other ways — to do a little bit of gamification, you know, leaderboards, where it makes sense, of course — as a way of helping students understand where they are in a bigger picture of their learning. And in this case, with your type of material, seeing that they're not alone, and maybe even hooking up with the right people to help them get through it and learn new things in new ways. So I'm glad you mentioned that; that's awesome.
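The crowdsourced view Lauren describes — everyone's picks tallied per option, with identities stripped, to drive a real-time class-wide graph — could be sketched like this. The question IDs, choices, and data are all made up:

```python
# Minimal sketch: aggregate anonymized survey responses into per-option
# counts suitable for a class-wide bar chart.
from collections import Counter

def aggregate_responses(responses: list, question_id: str) -> Counter:
    """Tally how many students chose each option; no identities are kept."""
    return Counter(
        r["choice"] for r in responses if r["question_id"] == question_id
    )

responses = [
    {"student": "s1", "question_id": "q_study_strategy", "choice": "Rereading"},
    {"student": "s2", "question_id": "q_study_strategy", "choice": "Self-testing"},
    {"student": "s3", "question_id": "q_study_strategy", "choice": "Rereading"},
]

tally = aggregate_responses(responses, "q_study_strategy")
for choice, n in tally.most_common():
    print(f"{choice}: {n}")  # e.g. Rereading: 2, Self-testing: 1
```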
B: Just adding on to that idea of crowdsourcing: when I took a Coursera course, they had, I guess, a message board at the end of — well... You know, it's funny, because Eric Barons, who does the evidence-based management course, I think would love a feature like this as well. He likes to be able to gate students in a way where you're kind of forcing them to do a bit of metacognition before they're allowed to move on — which is different from "you have to pass this assessment before you move on." It almost makes the metacognition seem just as important, which is cool.
A: Sometimes we want the survey to be on a different page, but sometimes we want it to be on the same page. I think that's something we addressed in a previous Torus meeting, with learn-by-doings — I don't know, I feel like we've talked about this before. But we also have very brief survey questions that are metacognitive in nature, where a single question is treated as a survey and shows up as a different page, and we'd like it to be on the existing page.
B: Yeah — I don't think I pointed this out in the demo, but you can place those surveys anywhere. You can do it as a separate page, you can do it as a graded assignment, or you can write it into a regular page. So that already exists for you.
A: Great. The other thing we had that has been a little less than perfect — and that you guys have accommodated, helping us get all the data — is that currently the survey tool data is somehow housed in a different place than the rest of the data, so we need a different report generated. I think that's something that could be different in Torus, especially if you were going to have lots of those elements. Right now we have two major ones, so getting those is fine; but imagine if I were to use a survey in every single module — that could be a bit of a pain.
B: Would you still want — and I'm just making this up — maybe a separate place to go just to review survey data, but have it all in the same data download? That's what Torus does now: it all feeds the same data dump, so to speak. But in terms of administering a course from the instructor side: I can imagine you'd want to be able to go someplace and show the survey results, maybe with the same functionality where you can remove names and share the data with the class — or even just a way to view it quickly, in real time, without having to download. I imagine, yeah.
A: Also kind of the way it currently exists in an LMS: when you have a quiz, you can basically, at a glance, easily toggle through and see the response rates. That's —
B: — the survey data in a larger report, a larger data dump, I'm going to call it. But what I was referring to — wait, I want to capture what you said too, about being able to toggle through. When you said that, I started imagining something like — you know, a lot of people are probably used to Google surveys.
A: I don't think we would require that level of flexibility. I think it would be nice to easily see that crowd-sourced response rate — that would be the most helpful thing to be able to visualize quickly. But I love the idea of the survey data being housed with everything else, though I agree survey data is inherently different, because it's not as if there's a correct answer.
B: Yeah. I will tell you — I think Caitlin and I are going through this now, where we're trying to dig into the data generated over years of statistics courses, trying to get to that My Response data and use it to iterate on the course. It'd be so great if that were already crowdsourced, right — where I could just look at a report and say: okay, students are still struggling with this one concept. That would be awesome. Then I could go right to that spot in the course and add more scaffolding, or clarify it, or maybe even take a better look at those questions and see whether they're in alignment, or what their difficulty is. So yeah — actually, now that I'm talking, I want the ability to go directly to those responses, in context.
B: I don't know how to articulate that well, but let me finish this note: individual responses — even though I know you didn't quite ask for that, I like it as an idea. And by the way, I'm saying all this knowing you can always get the data dump; I'm trying to think of ways to make it easier to consume this data in a real-time kind of way. So my next idea was — and I already lost it. Same question — oh yeah: across a single question, across a course. Is that what I said? Were you answering my question about what my idea was? Yeah. I thought there was something else to that I wanted, actually.
B: I like this idea, and I want to make sure I've said this: being able to differentiate the survey data in a larger data dump, because you would actually treat that data differently. That's not data you're going to put into DataShop and mine — although maybe you would, to find where students answered the most, or what the range of responses is. And then there are the open-ended ones — I mean, we really need, I think, a clean interface to review open-ended responses.
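Differentiating survey rows inside a larger data dump, as suggested here, could look roughly like this — assuming a combined CSV export with an activity-type column; the column and file names are hypothetical:

```python
# Sketch: split a combined raw-data export into survey rows and
# scored-activity rows, so each can be reviewed with the right lens.
import csv

def split_dump(path: str):
    surveys, scored = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # "activity_type" and "survey" are assumed column/value names
            (surveys if row.get("activity_type") == "survey" else scored).append(row)
    return surveys, scored

surveys, scored = split_dump("raw_data_dump.csv")
print(f"{len(surveys)} survey rows, {len(scored)} scored rows")
```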
B: I know that's a really general comment, but I just feel that the way it's done in Legacy is not ideal, and we really need to think it through — both for open-ended questions in a course, related to the course material and the outcomes, and for survey open-ended ones.
A: True. The way we deal with that now is that Kathy pulls those responses — because we ask for feedback at the end of not only the toolbox but also each unit, asking what was confusing and what needs some work — and she basically pastes them into a giant Excel file so that they're segmented. That works pretty well as a workaround, but it is clunky.
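Done programmatically, that segmenting workaround might look like this — a sketch assuming pandas plus an Excel writer, and a CSV export with unit/question/response columns (all names hypothetical):

```python
# Sketch: group open-ended feedback by unit and write one worksheet
# per unit, replacing the manual paste-into-Excel step.
import pandas as pd

df = pd.read_csv("open_ended_responses.csv")  # columns: unit, question, response

with pd.ExcelWriter("segmented_feedback.xlsx") as writer:
    for unit, rows in df.groupby("unit"):
        # Excel caps sheet names at 31 characters
        rows.to_excel(writer, sheet_name=str(unit)[:31], index=False)
```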
B: Okay, let me switch a little bit to feedback — the kind of responses you would want to generate back to students upon submitting a survey. Right now, feedback is disabled in those questions. Norm and I were going back and forth on this a little bit too, and I think we both agreed that we want the ability to at least give feedback if we want to. Then: how does that look? Is it feedback just for the whole survey, or would you want feedback for individual questions? I can actually see use cases for both.
A: Yeah, I could see that being useful — especially if, eventually, we were to remove some of the typed-in student responses about changes over time. If you were removing some of those typed-in responses, you'd want students to be able to select something, and then the feedback could be more automated. I could see the utility in that, Aaron — it would serve a little bit more the way a quiz serves.
B: Yeah. I can't think of a use case right this second, but I can imagine — if I wanted to present a survey question to a student, like "how do you feel about this topic?" — or actually, no, that's not even the use case. I'm thinking more in terms of courses we're starting to build that are more around student success, a lot like the Student Cognition Toolbox: trying to help students grapple with hard social-emotional topics — harm reduction, alcohol consumption. I can imagine, especially in those types of course materials, that you'd want to give some sort of encouraging statement after they've answered a hard question that's personal and vulnerable. You might want to say something like, you know...
B: So I think at the question level — or even at the survey level. Again, going back to Google Docs: you have the ability to modify the final note once someone submits a survey. You can say "thank you for completing the survey," or you can give them more information about the topic. I think people have almost come to expect that a little bit — at least, "okay, I submitted it, and now I get some message back," because it assures me that I submitted it. I don't know: how important do you think that is?
B: That's what's fun about this meeting.
A: It's kind of a big ask for students who are a little bit lazier, but it would be really nice if there were a feature where, if you did a post-survey, the instructor — I mean, basically the course designer — could check a box that said "also show original survey," above or below, or really graph them side by side. Because pre- and post-surveys with the same questions are so common, I don't think we would be the only people who would like two bars on the graph.
B: I'm so glad you brought that up. Yes — that's something I didn't think about, and we could probably talk at length about the capabilities needed just to support pre- and post-testing in general. In fact, I'm going to mark that as a topic for a future meeting. Yeah — I'm going to save it for a whole other community meeting, because I think we could talk at length, and it's a really important topic to a lot of people.
B: But the other cool reason I'm glad you brought this up: we built this type of functionality specifically, and I think it's only used in, I believe, a prose-style course. I know when I was at Acrobatiq, I believe we tried to build an English composition course, and the challenge there was having open-ended responses that were large, based on the instruction the students were given. It's a very similar type of functionality to what you just described: we want to be able to feed back previous information — data, survey results — right in a place where students are going to use it again, or need to compare and contrast. And for a survey — in particular, your surveys — it's really important for students to be able to see that. That's ultimately what you're trying to show them through the whole Student Cognition Toolbox: here are these study strategies; now they know more about them, and they plan to do more of them. It would be nice for them to see that change in their own behavior, which is itself supporting metacognition.
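The "two bars" pre/post view proposed a moment ago could be as simple as a grouped bar chart over the same question. The counts below are made-up placeholder data:

```python
# Sketch: same survey question, pre- and post-course counts side by side.
import matplotlib.pyplot as plt

options = ["Rereading", "Self-testing", "Spacing"]
pre = [14, 5, 3]   # hypothetical pre-survey counts per option
post = [6, 12, 9]  # hypothetical post-survey counts per option

x = range(len(options))
width = 0.4
plt.bar([i - width / 2 for i in x], pre, width, label="Pre")
plt.bar([i + width / 2 for i in x], post, width, label="Post")
plt.xticks(list(x), options)
plt.ylabel("Students")
plt.legend()
plt.title("Which study strategy do you use most?")
plt.show()
```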
C: Aaron, can I ask a question? (Absolutely.) Some of these interventions are arguably secondary to the learning — you know, sometimes you learn from pre/mid/post tests, whatever — but when you're talking about the surveys, it seemed to me... I always wrestle with: are students going to start to get annoyed with all these tests and all these quizzes we're giving them? And I'm wondering if, for the surveys, it might not —

C: That might encourage students to just blow it off and resubmit their previous answers, so maybe that's not a good thing, but it might be an option. And again — especially the good students... The trouble we had with the genetics tutor was that the professors were always more focused — often more focused — on the good students. We were more concerned about not wanting to annoy or burden the good students with activities they didn't really need than we were with making sure the weakest students got all the help they needed. Now, obviously, you can try to address that with knowledge tracing, etc. But of course, I'm sure everybody here is —
B: Yeah. I will say — I just learned this recently; I think it was through some user testing we were doing for a grant. We were going to give students a place to do some goal writing for the course — say, for a chemistry course: goal setting, write the goals you have — and then give them an opportunity to revise those goals throughout the course. And I was skeptical: nobody's going to... I just couldn't imagine it actually being used if we made it optional. And through the user testing, many of the students, I believe, said that they would use it if there were points associated with it — that it would be a really cool feature, and they would probably benefit from it, but they probably wouldn't do it unless there were points attached.
B: So yeah, there's probably a lot of research to consume even on this — the right timing and the right amount — and I could probably think of some creative ways to solve that problem in a given course: by letting the good students know, in some way, that although this might be really annoying, it's really helpful. It actually puts a little bit of responsibility on them, not only for their own learning, but also for the instructor's ability to help others — to learn from those good students, too, about what to do for the struggling students. So it's important to get that data from them as well, and you'd probably have to do that either with points or by really explaining it and making it clear that it's an important part of the course, even though it doesn't seem that way to them.
B: But yeah, it's a struggle. I think about that too, and that's why we place these surveys — the My Responses — at the end of every module in our courses. I think we found that if you do it at the unit level, there are too many objectives for students to rate themselves against; and if you do it more frequently, like every couple of pages, they're certainly going to just start ignoring them, right? So it's a little bit of trial and error to find the sweet spot: the best placement, so it doesn't seem overwhelming or outside of the learning for the course, and so it actually enhances the learning if students take advantage of these surveys and reflection tools.
C: My takeaway was: if you don't give students points for certain things, that's how you learn who the really conscientious students are; and if you do give points, they tend to do those activities right before the final, when they're realizing they might be close to a grade boundary. And of course, from a learning perspective — a data collection perspective — that's not what you want. You want students to all be using a tutor lesson at the same point during the course.
B: You're making me write things down. We're in the middle of trying to figure out professional development resources and tools and courses, or whatever, for both faculty and instructional designers, and something I hadn't thought of before as a topic is grading strategies for online learning — motivation strategies for online learning. That isn't in my list of topics to build out, but it's definitely something I think would be useful and needed by a lot of instructors who do online teaching.
A: That was awesome. I was also thinking about how you might only want to ask so many discrete, content-based questions — you don't want to keep asking them in different ways, because you have students at different levels and you don't want to bore them. But you can ask questions that are metacognitive in nature and that still call for content retrieval, synthesis, and integration. Those are our kind of great-equalizer questions that everybody can benefit from. So I would say that's a nice way to strike the balance: alongside a bunch of discrete content-based questions, ask students questions that have to do with connecting the new content to their prior knowledge. Those short-answer questions — even if you only grade them on completion — benefit both novices and more experienced learners. That's a way around this issue of learners at different levels.
B: Yeah — let me repeat back what you said, to phrase it another way and make sure I understand. It's basically asking a question about their understanding. It's kind of like — I was a student who hated tests. I hated tests; I always liked to write. And I had a teacher once who would give an assignment that was: write everything you know about this topic. To me it was so fun, because you could literally just write everything you knew. So: something along those lines, where you're asking them a question about the content, but in a way where they have to actually write a coherent, open-ended response.
B: If you do that, it helps them solidify the information even more, and it helps you as an instructor really understand where their misconceptions are — which would help you help other students pinpoint their misconceptions, the ones who are maybe not as keen on their metacognitive strategies and abilities. Sorry, I'm babbling on about that, but I like that idea a lot.
A: It benefits different stages of learning as well.
A: That's why I'm saying it does all of the things you want it to do for different levels, from novice through to expertise. If students are expert, they'll get practice fluently and eloquently describing how the new knowledge is incorporated into their previous understanding, or how it fills in gaps or misconceptions; and if they're complete novices, it'll still help them practice placing that information into some sort of coherent context.
B: I love that strategy. It's kind of tricky, you know — what I wrote here is that students are more likely to answer those types of questions than a survey question they feel isn't relevant to the material. If a question just asks how they feel about the material, they'll kind of skip it; but if you ask them a question that makes them synthesize what they just learned, both you and the student are going to benefit from that, in ways that go beyond what even regular survey questions would give you.
A: Elaborative interrogation — yeah. Elaborative interrogation questions can require some prior knowledge, so self-explanation is more inclusive of all the different learning levels, but elaborative interrogation also falls under the realm of reflective questions.
B: I'm just going to take a minute to think about it, and I encourage you all to do the same — say what comes to mind if you think of anything else that would be important to this topic.
A: ...for them to do it themselves, and then also to easily post into existing concept maps, where they'll be readable by a screen reader — so that you don't then have to do some sort of insane text box.
B: So that could be a really interesting question type, and I'll probably want to pick your brain more about how it might work and what the UI might look like.
A: ...to have students come up with their own terms; for novices, you can provide a list of terms. So there are different ways you can do this.
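One way to make a concept-map question screen-reader friendly, per the discussion above, is to treat the map as relation triples and render plain nested text instead of a drag-and-drop canvas. The biology content below is purely illustrative:

```python
# Sketch: a concept map stored as (source, relation, target) triples,
# rendered as an indented text outline a screen reader can traverse.
EDGES = [
    ("stimulus", "triggers", "receptor"),
    ("receptor", "signals", "control center"),
    ("control center", "activates", "effector"),
]

def render_outline(edges: list) -> str:
    """One indented line per relation, grouped under each source concept."""
    lines = []
    for source in dict.fromkeys(e[0] for e in edges):  # preserve first-seen order
        lines.append(source)
        for s, rel, t in edges:
            if s == source:
                lines.append(f"  - {rel} -> {t}")
    return "\n".join(lines)

print(render_outline(EDGES))
```

The same triple list could back either mode mentioned above: experts type their own terms into the triples, while novices assemble triples from a provided term list.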
B: Right. I will tell you this — if you haven't read it yet, or if I haven't mentioned it before (I'm sure my teams have heard me mention it a couple of times): there was research done by Rachel Van Campenhout at VitalSource — you know, the company that took over Acrobatiq. They were able to generate —
B: The study itself is about computer-generated questions. They essentially pitted instructor-written questions against AI-written questions, to see whether the AI ones are just as effective — just as good — as what instructors or subject matter experts would come up with. Through that study they were evaluating question types based on engagement, difficulty, and — I have it written down here — persistence, which I've never seen in another study. This is why it makes me so excited and why I talk about it all the time: there's not a whole lot of research out there on how different question types work, either for the students or for the material itself. And they found, interestingly enough, that the AI-generated questions were better, which was insane to me. But there's another little sub-part of that study that I love —
B: — so much: breaking down the different question types — multiple choice, drag and drop, fill in the blank; I forget the full range of types they looked at — by, as I said, persistence, performance, difficulty, and engagement. They found that drag-and-drops are the most engaging, the ones students persist with the most. I can't remember what the difficulty was around them, because I was more interested in the behavior — the idea of engagement and persistence — and drag-and-drops were the best. Now, unfortunately — and the reason I'm bringing this up is that I'm thinking of that mind-mapping activity type, because that's kind of how it would have to work — the problem with those types of activities is —
B: — there's not a really good way to do them accessibly, or on a mobile device, very easily. So in the back of my mind I'm always trying to think about how we build more of those: is there a way to make them more accessible that doesn't exist yet? Again, I'm just babbling on now — sorry. I just got thinking about this concept-mapping question type and how I think it would be a really excellent type to build.
C: I'm completely unqualified to make this claim, because I'm really not familiar enough with how the technology works, but it always struck me — and this is not consistent with what you just said — that the AI types of question generation would be much more superficial, and much easier for students to get right just from skimming — you know, the people who weren't really reading the material very deeply — and that the really insightful kinds of questions, the kind that would force deeper thinking, would require a level of processing that the quote-unquote AI question-extraction algorithms would be incapable of. Is that —
B: It's interesting that you bring that up — it's taking us into a whole different topic at the moment, but I like this topic; it's fun to talk about. I agree, and the reason I agree — I'd have to dig into the study a little more; they gave some examples — is that I assume a lot of instructors who don't have a background in learning design, and even novice learning designers I've seen, do this a lot: they generate what I would call reading-comprehension questions. Those aren't really getting at what the students know or don't know, but at whether they were able to skim, get the information real quick, and then answer the question. And there's value in doing those types of questions, for sure — that can help with scaffolding.
B: You know, low-level scaffolding — you can ask them those questions. But it really disheartens me when I see a whole course that's just reading comprehension: not asking students to apply, not asking them to analyze, not doing anything you could do easily with scenario-based questions and the like. Instead, it's easier — and the feedback is much easier to write — for questions that are just, "did you get this piece of knowledge out of the reading above?" And I can imagine, to your point, that the AI-generated questions are really, really good at that. Now, I don't know — I don't think there's been enough research to know, and I don't think AI, in my opinion, is that far along. I was actually really surprised by their findings. I'm not saying I don't trust them, but there's more digging to do there, because I'm still skeptical: even a lot of instructors still just write reading-comprehension questions, because they're trying to hit their target numbers of questions, and they're trying to make sure they cover every part of what the student knows or doesn't know — and those are easy questions to write, with easy feedback to write. So yes, I kind of agree; you're probably right on that.
C: And there are so many levels at which you could be skeptical — you know, whether the questions were getting at the kinds of knowledge that would transfer. There are so many different kinds of expertise that go into the right kinds of questions to ask. So it would be unfair to judge the people doing that kind of research, because I think it's great, and being able to do that with that little investment is a great tool to have in the toolbox.
B: Like, for example, it'd be great to be able to get a whole bunch of AI-generated questions, with feedback and everything written, and then at least have a starting point that someone can go through and say: okay, I'm going to add to and enhance this with some deeper-level questioning, getting at higher cognitive levels. It's a good starting point, at least, because question writing is so expensive. I tell people, when I'm training them in how to do question writing for our platform, that it requires a lot of thought and a lot of work: expect half an hour per question. As you get more practiced, you can bring that down to about 15 minutes a question — but if you're spending less than that, then you're probably writing bad questions, in a way.
B: I've never said that before, but I kind of feel it deep down: if you're just throwing questions together, they're most likely going to end up being reading-comprehension questions, unless you're being really thoughtful. So yeah, it's a whole interesting field; there's a lot to unpack there. But thanks — that digression took us to time, and I think we got a lot out of this meeting in terms of ideas and ways we can support —
B: — metacognitive questioning and strategies in our courses. Know that the next Torus meeting will be October 2nd, and it's going to be around research support. We have a lot of researchers who use our platform to perform different kinds of research and fulfill grant obligations, and I don't think I've ever really gotten a whole bunch of those folks together to find out what their needs are and what they would like to see as part of the platform to support and enable better research. And I'm not talking just about A/B testing — although we could really talk about what it means to have an A/B testing capability — I want to think about more than that: the data processing, every aspect of it.
B: So if you're interested in that and have a lot to say about it, please come. And if you know people who do research with these types of platforms — and are even adept with different ones — I would love for them to come to this meeting as well, so spread the word. It'll be about research. Thank you all for coming on the Friday before a holiday weekend. I hope you have a nice weekend — enjoy the last little bits of summer, and I'll see you all soon.
C: A quick question, off topic: I have been —