Description
In the 2014-2015 school year, many states will implement new common assessments that are aligned to their new common college and career-ready standards. Hear from experts from the two assessment consortia, PARCC and Smarter Balanced, to get an update on this work and what it means for your state. Don't miss this opportunity to get your questions answered directly by the experts!
A
Good day, ladies and gentlemen, and welcome to the Common Student Assessments: What Legislators Need to Know webinar. All lines have been placed on listen-only mode, and the floor will be open for your questions and comments following the presentation. If you would like to submit a question at any time during the webinar, you may use the chat box located on the lower right-hand side of your screen: simply type your message into the box and click on the send button.
B
Thank you, and good afternoon, everyone. Welcome to today's webinar on common student assessments. We hope that this webinar will give you an opportunity to have all of your questions answered about the new assessments by members of the consortia themselves. My name is Michelle Exstrom, and I'm an education program director at the National Conference of State Legislatures. I will be your moderator for today's webinar. Today we have joining us Jackie King from the Smarter Balanced Assessment Consortium and Jeff Nellhaus from the PARCC consortium.
B
First, we'll begin with Jackie King. Jackie joined the Smarter Balanced Assessment Consortium in October 2011 as director of higher education collaboration. She has primary responsibility for ensuring that higher education institutions are engaged in the process of designing the new assessment system to measure students' college and career readiness, and to use the assessments as part of their course placement policies. In fall of 2013, her responsibilities were expanded to include services for state K-12 representatives and strategic communications. She holds a PhD in higher education from the University of Maryland at College Park. I am pleased to have her with us. We know that you all have lots of questions and are anxious to hear a little bit more about where the assessment consortia are as we near the 2014-2015 school year, so I am going to go ahead and turn it over to Jackie so that she can fill you in on what is going on at Smarter Balanced.
C
Great. Let me walk through quickly what I hope we can cover in the next few minutes: a quick refresher on Smarter Balanced, an update on where we are in terms of the development of the assessment system, and then we want to home in on four key issues that we hear about a lot: technology, student privacy, cost and sustainability. I'm also looking forward to taking your questions.
C
The genesis of the Smarter Balanced Assessment Consortium was really state assessment directors, people who know each other, talk to each other all the time, see each other at conferences, and would of course trade notes and learn from each other. They wanted to do things differently and to improve the kinds of assessments that they were offering, but oftentimes in their individual states were constrained in terms of the resources that they had available to them. So when the Race to the Top assessment program came along and the federal government was putting dollars on the table to support the development of assessments, two groups of assessment directors, one primarily in the Pacific Northwest that called itself Smarter, and one in New England that called itself Balanced, were each putting together proposals. They got together, realized that their proposals had a lot in common and were complementary, and would actually be stronger together.
C
And
so
the
smarter
group
got
together
with
the
balanced
group
and
and
dairy
from
that.
You
get
there
our
funny
name,
but
wanted
to
tell
that
story.
So
you
could
understand
the
genesis.
We
get
a
lot
of
folks
assuming
we're
some
kind
of
a
testing
company,
and
really
it
is
a
state-led
consortium
really
built
from
folks
in
state
education
agencies
who
have
been
tasked
for
all
these
years
with
developing
and
implementing
state
assessment
programs.
C
What happens so often now is that kids do everything they've been told they're supposed to do, they get decent grades, and it's not until they get to college and have to take a placement test that they find out that they didn't learn everything they needed to learn in high school, and that they're going to have to pay, and use their financial aid eligibility, to take developmental courses that won't count toward their degree.
C
This
link
to
college
readiness
and
career
readiness
are
really
really
important
elements
of
the
assessment
system.
In
order
to
to
do
that
to
demonstrate
to
higher
education
faculty
that
that
these
students
have
the
necessary
skills,
we
really
have
to
move
away
from
what
what
standardized
tests
have
traditionally
measured,
which
are
the
skills
on
this
chart
at
levels.
One
and
two
estimating
matching
recalling
defining
and
really
moving
towards
strategic
and
extended
thinking
where
students
have
to
analyze
and
synthesize.
They
have
to
develop
my
logical
arguments.
They
have
to
critique
and
explain.
C
Here's just a quick visual on where we are in terms of building the assessment. We started with cognitive labs and small-scale trials in 2012; these are our early tryouts. The cognitive labs are actually one-on-one: a researcher sits with a student as they work through assessment items, has them think out loud while they're working, and identifies where they have difficulties and where perhaps the interface is confusing. Those were very, very helpful exercises.
C
We also did a pilot test last spring with approximately six hundred and fifty thousand students in about five thousand schools around the country. We completed development of our field test item bank in December of last year; that's about twenty thousand items. Smarter Balanced is a computer adaptive assessment.
C
This is a pretty commonly used model in testing that adjusts the assessment to the performance of the student. My simple-minded way of thinking about it is the children's game of "you're getting colder, you're getting warmer." A student starts the test with questions at about the middle range of difficulty. If they get those questions right, the questions get a little bit harder; if they get them wrong, the questions get a little bit easier. You might say it's a dialogue between the student and the test, until we're able to zero in on the student's performance level. This has a couple of benefits. One is that you can typically reduce the testing time a little bit, because it's a bit more efficient. It also produces a more accurate measurement for students at the top and bottom of the achievement range, since students aren't confronted with a lot of questions that they can't answer or that they find really easy.
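The "warmer/colder" dialogue described above can be sketched in a few lines. This is only an illustration of the adaptive idea, not the operational Smarter Balanced engine (real adaptive tests select items using item response theory); the 0-to-1 difficulty scale and the step sizes are invented for the example.

```python
# Toy adaptive-test loop: difficulty rises after a correct answer and falls
# after a wrong one, with shrinking steps, zeroing in on the student's level.
# Illustrative only; not the operational Smarter Balanced algorithm.

def run_adaptive_test(answer_item, num_items=10, start=0.5, step=0.25):
    """`answer_item(difficulty)` returns True if the student answers an item
    of that difficulty correctly. Difficulty runs from 0.0 (easiest) to 1.0
    (hardest); the return value is the final difficulty estimate."""
    difficulty = start
    for _ in range(num_items):
        if answer_item(difficulty):
            difficulty = min(1.0, difficulty + step)  # "getting warmer"
        else:
            difficulty = max(0.0, difficulty - step)  # "getting colder"
        step *= 0.7  # smaller adjustments as the estimate settles
    return difficulty

# A hypothetical student who reliably answers items easier than 0.6:
estimate = run_adaptive_test(lambda d: d < 0.6)
print(round(estimate, 2))
```

The estimate settles near the hardest difficulty the student can still handle, which is why an adaptive test can be both shorter and more precise at the extremes of the achievement range, as Jackie notes.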
C
We are gearing up for a field test starting in a little more than two weeks, on March 18th, that will run through June 6th. The field test will include over three million students in about twenty thousand schools around the country, so this is a practice run of the assessment. It will be our final quality check on the items, a really important level of quality assurance.
C
We will do something called standard setting, which in plain English means setting the performance cut scores for the assessment, this September. And then we will roll out the full operational assessment in the 2014-15 academic year: the end-of-year summative in 2015, and also the interim assessments and the formative assessment digital library.
C
Folks always ask me about testing time, so I threw in this chart. You can see that it ranges from four to four and a half hours in total for English language arts. A half hour of that is a classroom activity that is used to introduce the performance task portion of the assessment. Performance tasks are a really important element of getting at those level 3 and level 4 kinds of skills that I showed earlier.
C
A performance task is a set of activities organized around a common theme that will ask students to apply a range of skills and knowledge. In ELA, this is where they would write an extended essay and demonstrate research skills; in mathematics, they're going to be applying a lot of different skills and knowledge in order to solve a fairly complex problem.
C
So we do a short activity in the classroom to help them get ready to do that and to orient them to the theme that unites all the questions in the performance task. And you can see math there: the total testing time with the in-class activity ranges from three to four hours, depending on the student's grade.
C
We've got a lot of resources available to help schools prepare. We've had a practice test available since May of last year. We now have a training test as well, and a lot of additional online resources that schools are taking advantage of to get themselves ready to administer our field test starting on the 18th of March.
C
I want to move on to address some of these key issues. We know technology is a big concern. We've had an online readiness tool, developed with PARCC, that allows schools and districts to evaluate their readiness; that's been available now for well over a year, probably going on two years. We've had our standards out for the kinds of new and existing hardware and operating systems that are necessary. We've really tried to build these assessments so they will work on the computers and the software that exist in schools today.
C
We will run the assessment system on Windows XP, which is a pretty old operating system, and on any computer that will run Windows XP. At the other end of the spectrum, we've also designed these assessments so they will work on tablets and some of the newer kinds of devices that are appearing in schools. It's really important to understand that schools do not need one-to-one computers to do these assessments. One of the benefits of the computer adaptive assessment is that you can test over a longer period of time, since every kid's test is different.
C
The test security concerns are less than they otherwise would be, which is a benefit of any computer-based test: you don't have test papers sitting around. So you can spread out the testing over a longer period of time, and in so doing you can take full advantage of the computer resources that you do have without needing a lot of additional machines. We estimate that a school with 600 students can test in a single 30-computer lab within our testing window.
C
The biggest issue really is not hardware and software, it's bandwidth for schools. So we have a readiness calculator that allows schools to calculate how many students they could test simultaneously, given their bandwidth and the other normal demands on their bandwidth during a school day. That's been a really useful tool, and of course the field test will give schools an opportunity to try that out in real time.
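The readiness calculation described above is essentially a division problem. A minimal sketch, assuming a hypothetical per-student bandwidth requirement (the 50 kbps figure below is an assumption for illustration, not a published Smarter Balanced number):

```python
# Rough version of a bandwidth readiness check. All figures are in kilobits
# per second; the per-student requirement is an assumed, illustrative value.

def max_simultaneous_testers(total_kbps, everyday_use_kbps, per_student_kbps=50):
    """How many students can test at once after everyday traffic is set aside."""
    available = max(0, total_kbps - everyday_use_kbps)
    return available // per_student_kbps

# A school with a 10 Mbps connection and 3 Mbps of normal daytime traffic:
print(max_simultaneous_testers(10_000, 3_000))  # 7,000 // 50 = 140 students
```

Spreading testing across a multi-week window, as described earlier, is what lets a school with a number like this still get several hundred students through the assessment.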
C
Another concern that comes up a lot is student privacy, and we know that this is something that, as a parent, I can tell you I worry about a lot in this internet age. We are taking a number of measures to safeguard student privacy and data security. Under Smarter Balanced, states retain complete control of their student data. We are not going to share student-level information with the federal government, and we're not going to sell student data.
C
We will only share information with a third party, such as a researcher, if the state directs us to do so. States completely retain control of their data, and the reporting requirements for No Child Left Behind are completely unaffected by participation in the consortium; the Smarter Balanced chiefs recently affirmed this in a letter to Secretary Duncan. Nothing changes about what states have to report to the federal government.

C
For the field test, and then for operational testing, we do need certain pieces of information to do the research and analysis that's necessary to make sure that the assessment system is performing up to snuff, and essentially to administer the assessment. So we will need a few pieces of data. We will not require states to report student name or date of birth. They can do that if they choose to, if they want us to handle producing student-level score reports; but if the state would prefer to handle that itself, then we do not need that information and would not require it.
C
Another big issue is cost. Obviously, every dollar in state budgets is precious and has about 20 different uses, all valid and important, that it could be put to. But nonetheless, it's important to remember that what states are spending now, which is an average of about $35 a student, is equivalent to 0.3% of per-pupil expenditures.
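The 0.3% figure implies average per-pupil spending of roughly $11,700; that dollar amount is back-calculated from the ratio, not stated in the webinar. As a quick arithmetic check:

```python
# Sanity-check: ~$35 per student against inferred per-pupil spending.
# The $11,700 figure is inferred from the quoted 0.3%, not from the talk.
assessment_cost_per_student = 35.0
per_pupil_expenditure = 11_700.0
share = assessment_cost_per_student / per_pupil_expenditure
print(f"{share:.1%}")  # formats as 0.3%
```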
C
So assessment costs are a small fraction of total spending on K-12 education. There are two components of the cost for Smarter Balanced. There's a membership fee for the consortium to maintain and improve the assessment system, under the governance of the member states and with member states continuing to make all major design, policy and financial decisions. And then our cost estimate also includes an estimate for the costs that states will incur to administer the assessments; Smarter Balanced is not planning to become the test administrator.
C
That was a decision that we made to try to maintain a very small footprint for Smarter Balanced as an organization and to provide states with as much flexibility as possible to manage the administration of the assessments in the way that best suits each individual state. So, just to show you those dollars really quickly, this is how the dollars add up. The membership fee is a hard number; that's what the number will be. The member-managed activities figure is an estimate.
C
And then the last topic is what happens after September 30th of this year. The federal grant that funds the consortium ends on September 30th, and with states rolling out the first operational assessment that academic year, and with colleges looking to evaluate whether or not to recognize students' scores, it doesn't seem like a good time for us to just hang up the closed sign.
C
We really need to maintain a way for states to continue to work together to govern the assessment system moving forward. So we are working on an arrangement with the UCLA Graduate School of Education to serve as a host and a partner for sustaining Smarter Balanced; we would become a program within the Graduate School of Education at UCLA.
C
This would provide us with a lot of the back-office kinds of support that we need, and also give us access to a lot of good minds and able folks, in terms of graduate students and faculty at UCLA, who can work with us on the continued development of the assessments. Among the key principles that we are working into the arrangements with UCLA: we will retain the state-led governance structure of the consortium.
C
So I have a few key takeaways to keep in mind. We are on track to deliver a complete assessment system in 2014-15, on time and on budget. We're taking steps to address some key state concerns regarding technology and student privacy. And we have a sustainability plan in place that will keep states in the driver's seat on assessment design, policy and cost. With that, I'll close with just a few resources, if you want to stay apprised of what's going on with the consortium.
C
Of course, we have a pretty comprehensive website and a regular newsletter that we send out, and who doesn't tweet these days? I welcome questions from the group now, but also questions from folks via email or phone, and I'm happy to provide my contact information; maybe we can do that right at the end of the webinar.
B
Thanks, Jackie, that was great. I see that we do have a question about the similarities between Smarter Balanced and PARCC. I think we will hold that question to the end, because I think some of that will become apparent as Jeff presents regarding PARCC, and then if there is any additional clarification that's needed, we can handle that at the very end. Just a reminder that you can definitely type in your questions as the speakers are speaking; we will visit those at the end of the webinar and be sure that we get all of your questions answered. So let's go ahead and continue on and hear from PARCC. Jeff Nellhaus is with us from PARCC. He is with the project management partner for the PARCC consortium, which is one of the two state consortia that received Race to the Top grants, the other being Smarter Balanced, of course.
B
He holds a Bachelor of Science in chemistry from the University of Massachusetts, a Master of Science in science teaching from the Antioch graduate school of education, and a master's in education in administration, policy and planning from the Harvard Graduate School of Education. Jeff now serves as the director of policy, research and design at PARCC. So we will have Jeff tell us a little bit more about the PARCC assessment consortium.
D
Great. Michelle, thank you very much, and good afternoon, everyone. Thank you to NCSL for giving us, both Jackie and I, an opportunity to present about our consortia this afternoon. You'll see there are some similarities between our two consortia and some differences, and perhaps from my presentation you can begin to discern some of those similarities and differences. I hope to provide you some background about the PARCC consortium and the design of our assessment system.
D
PARCC, like Smarter Balanced, is a state-led initiative: the chief state school officers from the states serve on the consortium's governing board. I formerly worked with the Massachusetts Department of Education, and my old boss, Commissioner Mitchell Chester, is the chair of the PARCC governing board. That governing board meets in person on a quarterly basis and monthly on a conference call, and has an executive committee made up of the chiefs from the states of Massachusetts, Rhode Island, Colorado, New Mexico, Tennessee and Maryland, who have a weekly conference call.
D
This is all to say that the education leaders in the various PARCC states have a huge stake in the consortium and are very committed to working, on a very frequent basis, on the various policy issues that arise in implementing this new assessment program. Currently, the fiscal agent for the consortium is Maryland, as Washington is for Smarter Balanced; the Race to the Top funds, the federal funds being used to procure the various services for the consortium and to bring the states together, are channeled through the state of Maryland.
D
Parcc Inc. is the organization I currently work for. We're the project management partner for the consortium, and our mission is to work on behalf of the member states: to facilitate state involvement, to help with procurement, to manage contracts, and to coordinate and integrate vendors' day-to-day operations. So that gives you a picture. If you look at the map, the state of Pennsylvania is in a lighter color there.
D
To understand the design of our assessment system, I think it's important to first understand the priorities and purposes that the assessment system is attempting to address. There are lots of different purposes that assessment systems address; I've highlighted five here, which I think are the most important. First of all, our number one mission is to develop a very high quality student assessment system that's designed to measure college and career-ready standards. More specifically, the system, like the Smarter Balanced system, is to provide information and tools to support effective instruction. Third, and this is where I think both of our systems are going to address issues around accountability, we're generating data that is reliable and timely for a range of accountability uses, including school accountability, district-level accountability, educator effectiveness systems and student accountability systems.
D
A fourth priority for the consortium is to report comparable data across schools, districts and states, and this was a very high priority for the governors who signed on to participating in the consortium. One of the problems we've had in the past is that states were holding different standards and reporting different numbers of students proficient, and it was creating a lot of confusion in the country; by having common assessments, we can have results that are comparable.
D
Finally, and both of the consortia are doing this, we are using technology for a variety of purposes, including, most importantly, engaging students in a 21st century learning environment, and also creating cost efficiencies in all phases of the testing system: test development, the administration of the tests, the scoring of the tests and the reporting of the tests. So it's just important to keep these priorities in mind as I now try to give you a quick overview of the design of the PARCC assessment system.
D
I think this is a similarity between Smarter Balanced and PARCC: we have required components and some optional components. These optional components can be used throughout the school year and are designed to address that first priority I spoke about, which is to support effective instruction with tools that can be used throughout the school year in the instructional process.
D
Formative assessments are typically not used for accountability purposes, so they're not designed to report comparable data across schools, districts and students. The two components you see on the right of the screen are what we've referred to as our summative assessment components. There are two summative components in PARCC: one is called the performance-based component, and the other the end-of-year component.
D
I would say that the summative components address all of the priorities previously mentioned, and they are the assessments upon which student performance results will be generated; those student results will then be aggregated to report school, district and state results. There's one component there, the speaking and listening assessment, that's under development; it will eventually be phased in and become part of the summative assessment system.
D
Okay, I'm going to give you a little bit more information about each of those, the summative assessments and the formative assessments, which are the optional parts of the program. In terms of the summative assessments, you can see in this slide that the English language arts/literacy performance-based component, in the top left-hand box there, is focused on assessing students' ability to write and, in particular, their ability to incorporate evidence from sources in their writing.
D
If you just move to the right a little bit, you'll see that in the end-of-year component the English language arts test is focused on reading: reading comprehension of both literary and informational texts, where in grades 6 through 11 the informational texts will include historical texts and scientific and technical texts. Dropping down to the math PBA and the math EOY.
D
The math performance-based component will focus on those extended multi-step problems that require mathematical reasoning and solving real-world problems, while the end-of-year component will focus on concepts, fluency and short applications. It's the results from both of these components that are combined to create the student's overall score, so we have a lot of information being put together to provide an overall score for students. Just a little bit more information on the formative components.
D
The diagnostic assessments on the left will be created for students in grades two through eight, and they'll focus on reading, writing and mathematics. The purpose of the diagnostic tools is to pinpoint student strengths and weaknesses, and as such they will be computer-based and computer adaptive; here our diagnostic assessments use a model that's similar to the Smarter Balanced summative assessments, in that the tests will be both computer-based and computer adaptive. The mid-year assessments on the right will take the form of modules, or testlets, that span grades 3 through 11.
D
They will be both paper-based and computer-based, and they'll serve various purposes. Schools will have the opportunity to use these for classroom assignments and for benchmark assessments, and, since the design of the mid-year assessments will be very similar to the design of our performance-based assessments, the assumption is that they can be used to help students prepare for the summative assessments. So that's just a little bit more information about the design of the system itself.
D
Moving over here, I just want to quickly summarize some of the accomplishments of the PARCC consortium to date. The point of this slide is to show that the work of both consortia has been very wide-ranging, including state engagement of both our K-12 and higher education partners, and test development: both consortia have done a tremendous amount of work in developing test items for their new assessments. On policy development, Jackie talked about the work that's been done to support technology readiness, and research.
D
Research is also a large part of each of our assessment programs. With respect to policy, as Jackie described, PARCC has also adopted a data privacy policy very similar to Smarter Balanced's policy, and I think we actually worked together and drew from each other's thinking on data privacy, knowing that's a very important issue. Like the Smarter Balanced consortium, there will be very strict requirements around maintaining the security of personally identifiable information.
D
So this is just to illustrate that both consortia are doing much more than just building a test. We're doing research, and we're bringing multiple states together to develop policy where there's been very disparate policy before. Both consortia have a college and career-ready determination policy, we have data privacy policies, we're working together to have similar test accommodation policies, and so on and so forth. So while we are different in some ways, we are similar in others.
D
And this is a big concern right now. As you can see, both consortia will be phasing in the full use of computers over time, so there will be a period over the next couple of years where some schools will be taking the test on paper and others on computer, which will raise questions about the comparability of those results; we're looking into that matter. Also, as part of our field test, there is a questionnaire for the test administrators, and there are questionnaires for the students themselves, to help us test the clarity of the directions and improve the test administration procedures for next year. And then I just want to emphasize, and this is true for both consortia, that the field tests are not about getting results for students or schools or districts; they're about getting this other information, from which we can build a strong test fit for the first year of operation in 2014-15.
D
In terms of the field tests, we have 14 states plus the District of Columbia participating, and those states will be involving over 1 million students in nearly 16,000 schools. The way we've sampled for the field test was to minimize the testing burden on schools this spring that are also implementing their existing state testing programs.
D
In any school that was selected to participate in the PARCC field test, we're only including one, two or three classrooms in that school. Additionally, students participating in the field test will only be taking a field test in one content area, and in one of the components of the content area. So the amount of testing time for the field test will not exceed three hours, and that will be spread over anywhere from one to three days.
D
The relatively long windows are designed to provide schools with as much time as possible in scheduling the field tests, so they can schedule them around other planned events in the school; understanding that this year is a year in which states are also administering their existing state tests, the relatively long windows allow them to schedule around those tests. This slide also indicates the kinds of supports we have been providing schools and districts over the past few months to prepare for the field tests.
D
I think Jackie showed a similar slide: there have been regional workshops, on-site visits, manuals and narrated PowerPoints presented to local educators. We have a call center, and we have various kinds of tools that allow schools to check their computer systems and actually conduct a dress rehearsal for the field test, in terms of knowing whether or not the technology in their schools is adequate to handle the computer-based testing.
D
This slide also shows that we'll be releasing practice tests this spring, and the findings of the research studies that we're conducting later this summer. Just moving on, I want to say a few more words about the kinds of supports we're providing both educators and students to get ready for the field tests, and really for the operational test next year. We have released sample items; they're available now, and you can see the link that you could go to if you wanted to look at some of the sample items.
D
Try them for yourselves. There are also practice tests coming up this spring. The purpose of the practice tests is to give schools that weren't selected for the field test a similar opportunity to participate in a PARCC-like assessment, so they can also feel prepared for implementing the full test next year. Both the sample items and the practice tests are designed to help familiarize students with the item types and with the technology platform before taking the full test in the spring of 2015.
D
This is just another calendar to show the major events that will be taking place as we move towards full implementation of the tests in the spring of 2015. You can see that this year students will be taking the field tests, we're releasing practice tests, and research findings will be reported this summer. PARCC will be constructing its operational assessments this summer and fall. The mid-year assessments will be available next winter, at which time students will begin taking the operational tests in late winter and early spring.
D
We also will be setting performance-level cut scores next summer and reporting out the results of the first operational assessments late in the summer of 2015. The diagnostic assessments will be available in the fall of 2015. And for those students who earn a college and career-ready determination in the spring of 2015: those students will be entering college in the fall of 2016, and if they've earned that determination, they'll be able to use it to matriculate into college credit-bearing courses.
D
I just wanted to share this with you, Michelle. I hope everyone is getting this particular presentation, because it has links to all the materials that have been developed by PARCC to date. This is a list of instructional resources that we've developed and that are available to the public and to local educators; more is on the next slide.
D
These are the resources that we have developed to help prepare educators and students for the field test this spring. All of these materials are available on our website, or linked from our website to what's called the PARCC Pearson website, where at least the online materials, the computer-based testing materials, and sample items are available.
D
A
D
B
First, one question that we get a lot is whether or not states are able to customize their assessments. Given the fact that they are able to customize their standards, they're wondering if they might be able to add questions that are specific to their states, or even cut back the total number of questions to cut back a little bit on the number of hours that the test will require. Can both of you address that with regard to your work?
C
So one of the big benefits of the consortium is that the results of these assessments will be comparable across states. You will be able to look from your state to your neighboring states and see how your students are performing relative to other states on the summative end-of-year assessment. If a state were to, say, add on a significant additional portion, or wanted to cut back a significant portion, you would lose that comparability in the assessment system.
C
The test administration platform that Smarter Balanced has developed is an open-source platform that will allow states to use that platform, if they choose, for other assessments that they give. Say, for example, for a social science assessment: it would have the same look and feel as a Smarter Balanced assessment. If a state wanted to add on an additional portion in English and math, they could use that same assessment platform to do that.
C
But I think they'd want to do it as an addendum to the existing assessment, because if you were to integrate it in some way, or wanted the answers to those questions merged into the score on the Smarter Balanced assessment, you would lose that comparability across states. So that would be a pretty significant trade-off.
D
I just agree with Jackie there. The whole idea behind the common assessments is to get comparable results, and you lose that once you start cutting back the number of questions or adding on questions. Certainly, I don't think states want to add on questions; they are most concerned about testing time. So the agreement of the states that are members of the consortium is that they're going to administer common tests with common policies, so results can be comparable.
B
Another question that comes up quite a bit: there's an option to participate either in the summative piece alone, or to buy the entire assessment package, which includes the formative and interim assessments. Can you address whether it's a state role to determine that? For example, if a state decided to only use the summative assessment portion, could a district come to you and provide extra funding to be able to use the full package for its assessment system?
C
We get that question all the time. We will not be set up to bill the thousands of districts out there around the country. So if a state wanted to work out an arrangement with its districts, whereby the districts essentially paid the state and then the state paid us, we could work that out, but it would be a big increase in our administrative overhead, and therefore in the cost that we would have to charge to other states.
C
D
For PARCC, this is still a policy under discussion. We are looking, as a goal for our diagnostic assessments and mid-year assessments, at whether, if states are not purchasing those for all the districts in their state, there's an opportunity for individual districts to get access to those materials. But I agree with Jackie: it all comes down to some administrative and operational challenges in doing that. We're hoping, though, that we can make these tools for classroom assessment pretty widely available.
B
C
In general, yes. If there were, for some reason, some kind of interruption, say a power outage, the student might lose the immediate question they were working on; they would lose it if they hadn't submitted it, if they hadn't said, "I'm done with this question." But the rest of what they had done would have been saved, and so they could log back in and resume their testing experience.
D
C
So this has a small, slim footprint. I should explain that it will not look like a kid's video game; because of that, the software doesn't look real fancy. But that was done on purpose, to make sure that the assessment meets its purpose but doesn't create an unnecessary load on what we know are limited bandwidth resources in a lot of schools.
C
D
C
Only if a state authorizes it, and in that case the state would enter into a data privacy agreement. So, for example, if a state is contracting with a vendor to maintain its data warehouse or to do its reporting, and that's going to include reporting on individual students, then that vendor would have access to that personally identifiable information, and the state would enter into a data privacy agreement to govern the use of that data. But states remain in the driver's seat for all questions with regard to their data.
C
D
They need that data to do that, but it's not to be shared in any other way, not to be given to any other third party without the explicit permission of the state, and that's rarely given out. So I don't think that you're going to find, just because there are consortia now, that there are any greater issues around confidentiality and privacy. All of those rules are going to be kept in place.
B
D
And we got a similar amount of grant funding, and some private funding as well. That money has been used to help each of the consortia develop their assessments: all the item development for the assessments, now doing the field testing, and in some cases building the technology platforms to administer, score, and report the assessments. So it's really gone to a whole range of uses, engaging the states, and so on and so forth.
B
Actually, I have one more question that's come in that I'd like to get to you real quickly. Is it the intent that if a high school student meets a certain test score, that student must be placed in a freshman-level course for credit, and cannot be further evaluated to determine whether remedial placement might be in the student's best interest? If so, will this apply to, or be utilized by, public or private institutions of education?
A
C
We think that this will start primarily with public institutions, because that's who we've really been working with most closely, and then it would likely spread from there to private institutions. Whether this would be a state-level decision or a system-level decision will really depend, in each state, on the governance system for higher education. In some states this might be a statewide decision, and in other states each individual campus would make that call. It really just depends on the governance structure in any given state.
B
Thank you both so much. I know that this has been really helpful, for me and for the legislators and legislative staff who were able to join us. I did want to remind everyone that this webinar is archived; we will send out the link to the archive within a few days, and you can share it with your colleagues who were not able to join us, or use it to refer back to any information that was included in the webinar. As you can see, my contact information is listed there.
B
If you have any questions about this topic, we are well positioned to help you, and we're obviously always happy to help you get your questions answered and to connect you with folks at the consortia, or other experts who might be able to answer your questions in more detail. This is the conclusion of the webinar, and we appreciate your participation.