From YouTube: Instructional Design 101
Description
The Learning & Development, Field Enablement, and Professional Services teams held an Instructional Design 101 session led by Sr. Technical Instructional Designer Kendra Marquart.
A: Sweet. Well, thanks, thanks for joining us everyone. I wanted to put together an Instructional Design 101 session with Kendra, one of our instructional designers here at GitLab. I'm Josh Zimmerman; I'm an L&D partner, and I build learning content just like all of you. So I was curious to learn more about, you know, instructional design best practices and principles, and applying adult learning theory to what we do at GitLab, and I put together an agenda with some high-level questions.
B: Yeah, we can go ahead and go over the questions first. So, just to introduce myself: my name is Kendra Marquart, and I'm the senior instructional designer for the Professional Services team. I, along with Christine, build all of the content that we deliver to our clients, all of our external customers who purchase our training on how to use the product.
B: I started out as a trainer almost 12 years ago, as a technical trainer, and I kept sending in suggestions for the content over and over again until the manager of the department got kind of irritated and just told me to apply. So it's not like I went to college for it; I was never in education. It was never something that I thought I would end up in, it just kind of happened.
B: So I've been in the industry for about 10 years as far as training and design go, but I've been a designer for eight years now. I started when it was just, you know, grayscale Word documents (I'm sure Christine is familiar with the old-school style) and then moved more into e-learning. So my specialty is around e-learning.
B: As far as the second question, on adult learning theory: there are several different learning methodologies throughout instructional design, so depending on where you entered instructional design and where you came from, you will see various different methodologies used. The basic core model of all instructional design is analysis, design and development, and then evaluation. There are several different methodologies out there. I started on ADDIE, which is the old-school model, and I think Christine is pretty aligned with that model as well.
B: Almost all of them have these three sections. There's some form of a needs analysis, which includes understanding the needs of the training and the audience that you're delivering to. The design and development is around the actual design of the content and the instruction materials, and then determining what method of delivery you're going to use, whether it's instructor-led, self-paced, or a desk drop, which is just a PDF or an article that you email out. And then there's the evaluation section, which is: how are you going to determine what the learners actually took away?
B: Some of the methodologies that you'll see or hear about are ADDIE; Bloom's taxonomy, which is another really popular one; SAM; the Kirkpatrick model; Agile and rapid development; and action mapping. So you'll see tons of different ones, and I think some of you that were at DevLearn this last week have probably heard a couple of these if you were in any of the ID sessions. The two most common ones that I've focused on in my experience are ADDIE, which is, in my opinion, the most thorough of the instructional design models.
B: So it can be kind of hard to do in a really fast-paced environment like what we have here at GitLab, just because it is very time-consuming: the stages are very precise and very laid out, and the needs analysis portion usually takes up a lot of that time, because you go to each individual audience that you're delivering to, ensuring that you understand what the training needs are and what skills they already have versus what skills they're going to have to learn. As far as here at GitLab goes, the most useful one for our business, I think, would be SAM, which is the successive approximation model.
B: So this is split into three different sections, and these sections are meant to repeat over and over again for any type of iteration or change that you make to the training. The first one is just a basic prepare stage. This is where you gather information and you collaborate with your SMEs and your stakeholders.
B
You
have
a
quick
little
savvy
start.
Your
kickoff
calls
that
type
of
content,
and
then
you
go
into
the
design
phase,
which
is
you
design
the
content?
You
have
a
prototype
or
a
draft,
you
have
that
reviewed
and
then
rinse
and
repeat
you
keep
going
until
you
have
a
product,
that's
viable
enough
to
send
to
your
audience
and
then
you
develop
where
you
go
into
the
development
stage.
You
implement
it
out
to
your
audiences.
B: You do pilots, any kind of rapid training that you want to do to test it and make sure it's working effectively, and then you build your evaluations: your quiz questions, knowledge checks, exams, certifications. I know on our team, in our process, we do certifications to ensure that you actually have measurable data showing that your learners are retaining the knowledge that you're teaching them. And then, same thing, rinse and repeat: this cycle is meant to consistently go over and over again as you iterate throughout the process.
C: Yes, both ADDIE and SAM, and we were actually just talking about how we need to move from an ADDIE perspective to SAM to start releasing more rapid content, because we're really getting hung up in the process that we have now with ADDIE. For the field certification team, they haven't been able to release any field cert courses for the enterprise team as of yet, because we've been so lost in the analysis piece, whereas on my team I've been delivering content left and right, but I'm doing it in Rise, right?
A: Yeah, I would say the same for us too: not probably, definitely, applying SAM to the content we developed. But I think we get really hung up sometimes in the collaborating with stakeholders, just because, you know, at GitLab our culture is very open and transparent. So as you're making your way to developing and designing content, you'll get more input and kind of have to pivot a lot. So it can be difficult, but I think that's the benefit of the model too.
B: You know, unless it's like the basics of customer service or something like that, soft-skill-type stuff. But as far as the methodologies go, what I like about them, whether you're using ADDIE or SAM (I really like Agile or rapid too; I have a process improvement background, so Agile kind of fits with my background that way), is that they give you a step-by-step process, whether it's the three-step model for SAM or the six-step model for ADDIE.
B: You go create it, and then you get to your stakeholders and it's not what they wanted, or it's not what they needed. Using these methodologies helps break that, so that you don't take a finished product, get done completely with your course, hand them a package, and they're like, okay, this isn't what we intended, this isn't what we wanted.
B: So these types of methodologies help you get away from that, so you're not wasting precious development time, especially in this kind of business, and they also really fit with the collaboration model that we have here at GitLab.
B: In particular, I tend to use the TAMs and the engagement team, or the engineering team, and they're really good about getting feedback back to us.
C: And Kendra, do you suggest using this approach the majority of the time? Because a lot of the content isn't necessarily evergreen. Like, if there's evergreen content, then you can be a little bit more ADDIE about it. Is that a true statement? Do you agree with that?
B: For me, it depends on the type of content. For certifications that we're going to be delivering, where we're actually, you know, putting this out there as a GitLab certification, I think that's something that definitely needs to have a very good needs analysis, so that would need to be put through a more strenuous approach like ADDIE. I lucked out, because Christine had done everything, like an awesome job task analysis and needs analysis, before I came on board.
B: So you don't need to keep going back and redoing it every time there's an update.
B: Awesome, yeah. So I originally did media in college, so a lot of this stuff is built using infographic templates that I have saved, that I've just had forever. Otherwise it's using a combination of the icon library that I shared in the GitLab ed content channel, plus looking at templates that we have available.
B: Yeah, these are just things that I like to add in. (I'm sorry, I didn't put my phone on mute, oh my god.) So as far as the methodologies go, I think ADDIE would be good for anything that's like a program: something that's going to have a lot of courses in it, that's going to have a straight-up exam or an evaluation, and where they're going to be getting a badge for it.
B: And then I put in here just a couple of needs analysis templates, and I'll PDF this and send it out to everybody. These are just links to different needs analysis templates that are out there; the CDC has a really good one. And this is something that Christine actually linked in our Education Services channel: it's the component display theory. This goes through the type of content and the performance that you're trying to achieve.
B: ...the audience you're delivering to. So part of instructional design is ensuring that you know your audience; you want to know who you're delivering to. For example, say we have an issue with GitLab CI/CD. For our course on that, we can get a lot of people in the audience who know what they're doing and think the course is too basic, but then we get a lot of other people who think the course is too advanced.
B: So by being able to find that and understanding our audience, now we know that we're going to be creating a GitLab CI/CD advanced course, just because the people in the room have such varying skills. You want to create a baseline, so you need to understand, you know, what's the baseline of what they need to do and how it reflects the learning objective that you're trying to get. Like, for GitLab basics: have they ever been in GitLab before? Have they created an issue before?
B: What is required for their daily job? Do they use GitLab on a day-to-day basis for their job?
B: Do they use the coding features, or stuff like that? Creating your personas and understanding your audience will help you with your objectives as well. Especially, you know, I don't know if we really have test-out features yet for L&D or field enablement, but for things where it's something that they've been doing as their daily job, do they really need to waste the time going through the training? So maybe a test-out feature would really help.
D: So, you know, job task analysis is not well understood by a lot of our team members, or even across companies or departments. No one really knows what that is, and it sometimes can get mixed up with competencies and skills.
D: A lot of people don't understand what these things mean, and they're relying on us instructional designers to kind of guide them through: what is that, and why do we have to do it? And the one that I did, I did really fast. So the other thing is speed, right?
D: Do we have three months to do this JTA, or do we have a month? So it can vary in the amount of detail, but I feel like what I put together is kind of the bare minimum of what you would need in a job task analysis. I think that's especially true for roles where we're talking about job tasks that use a lot of soft skills, where, you know, that's evergreen: those soft skills don't change, they're transferable skills, like communication skills or leadership skills.
D: That's where I feel like, in a job task analysis, it's really good to flesh those out, and there's probably a lot of off-the-shelf content that you could map. But the point is, you want to have a map; you want to have a map so you know what to plug in where, so that people are getting the relevant content for them. And I can't stress enough how important it is to have that.
B: I should look to see if we have this, because this is a common topic, so that saves you time as well. And for me, as an instructional designer, when I came into the company fresh, looking at that job task analysis and the personas that Christine had created really helped me understand the different roles at a department level, as well as understand who I needed to go to for stakeholders: who would know the stuff for this job. Like, do I go to the TAMs?
B: So one thing that I was asked about by another content curator here is the colors that I use. I try to always stick to design branding, but there are always images and stuff that we kind of have to use because we're trying to display something. But I wanted to talk a little bit about color psychology, especially because almost all of our training is online training, since we're a remote company, so be very aware of the colors that you're using and how they impact a learner's retention.
B: I haven't had it at GitLab, but I have at others. That's why the warm colors come into use, and that's why they're flagged as the best: the warm colors translate to colorblind and visually impaired people a lot better, and warm colors are also the easiest to filter blue light through. So if someone has a blue light filter on their screen, those are the colors where you won't see a difference, like if I turned on the blue light filter that I have on my monitor.
B: I won't see much of a difference between warm colors, whereas I would see a huge difference between, like, these bright greens and the yellow. Got it. Our branding colors are actually pretty well staged, especially in that area. The only one that's a little flagging is the bright red for the Tanuki, but all of the others follow the warm palette, so as long as you stick to our branding colors, you can be pretty safe.
B: The biggest thing that I'll talk about, though, is ensuring that you're layering the right colors on top of each other. Like, if you have a purple background, make sure you use white text, and vice versa: ensure that the text is easily readable. If you look at it and it kind of throws you off a little bit, then you need to switch the color. The marketing page actually talks about which colors to use on top of each other, so that will help you as well.
B: Try to put links or things wherever possible. The other topic that I wanted to go over (I put this in the GitLab curator section as well) is creating good objectives. This is something that I am notoriously horrible about.
B
I've
gotten
better,
but
I
tend
to
just
write
fast
ones,
but
the
basis
of
it
is
making
sure
that
you're
writing
the
right
objective
for
the
right
action,
so,
whether
it's
a
skill,
an
attitude
or
a
knowledge
that
you're
trying
to
reinforce
or
trying
to
get
them
to
learn
ensure
that
you
always
use
an
action
verb.
So
what
will
the
learner
be
able
to
do
at
the
end
of
this
lesson?
I'm
a
really
big
fan
of.
B: ..."What's in it for me?", using that type of methodology: how is this going to help you do your job? You know, "At the end of this lesson, you're going to know how to create an issue in GitLab," or "The associate is going to be able to create a valid GitLab YAML file." And then I've always followed the ABCD model. I believe the D is labeled differently depending on where you learned it from. Christine, did you have a different D from your training?
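(For reference: the "valid GitLab YAML file" objective above refers to a `.gitlab-ci.yml` pipeline definition. As a minimal sketch, with stage and job names that are illustrative rather than taken from the session, a valid file might look like this:)

```yaml
# Minimal .gitlab-ci.yml sketch; the stage and job names here are
# illustrative examples, not from the training itself.
stages:
  - build
  - test

build-job:
  stage: build
  script:
    - echo "Compiling the project..."

test-job:
  stage: test
  script:
    - echo "Running tests..."
```

A learner meeting that objective would write a file like this that GitLab's pipeline editor accepts without syntax errors, which ties into the "error feature" mentioned later in the session.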
D: The D... sorry, I'm not even remembering from the Clark training what the D stands for. Degree? Degree of...
D: Yeah, yeah, yeah, I mean...
D: That's a really good point. When we write assessment questions or do lab activities and assessments, that is what that can help measure: actually being able to judge or measure how well they did something, or to what extent they did something, is important.
B: Like having a valid configuration file, or having a multi-level epic instead of a regular epic; something that would actually show that they were able to do it correctly. Thankfully, GitLab pretty much has an error feature that will error everything out if they didn't do it correctly, so that saves us from a lot of that. But what does condition mean?
B: This would be anything that it's relying on. So the audience is the person that's doing it, like the associate or the trainee or the client customer. The behavior is what you expect them to do. A condition is something that is expected... I'm trying to think of an example. Well...
B: So if there's a certain variable that you want to test, maybe that's a skill set, but there are various variables to it, so the skill set can change depending on them; then you would want to put that into the condition. Like with us: "With a Starter GitLab package, create a fleshed-out issue using the available pieces," or something like that.
B: A condition is basically something that is going to set that behavior; something that would change that behavior if it was changed.
D: It's really not helpful at all to any of us who have to create the content or measure it. I did want to say that these verbs map really nicely, I feel, with the component display theory that Kendra was mentioning, and there's actually a little recipe that I've used over the years to create different pieces of content. With the verb list that Kendra has, there's a categorization you can make of those to the component display theory framework, and that's like a whole other thing that I could do a whole hour on.
D: It has to do with reusable learning objects, but Ruth Clark, from what I'm remembering in my training with her, incorporated that as well, and it makes it all very systematic. If we can use these verbs, we can then start to categorize them to figure out how to put together, in a templated way, different types of learning modules.
B: I think so. And one thing that it does for me, too, is that it's kind of like a checks-and-balances system for myself, because then I look at my objectives, they help me create the course, and then they also help me create the exam, because anything that you test them on, you want to tie back to your objectives. So for me, not only does it help me flesh out that training and ensure, you know, that this is going to benefit them.
B: How is this training going to either help them do their job, or help them use GitLab? It also ties into helping me ensure that my training is aligned with the objectives that I gave the customer, or, in your case, internal employees; whoever your audience is.
C: And Kendra, I know, and we might be getting to this, but I think one of the biggest pieces that I struggle with is the delivery mechanism, right? I'm working exclusively within Rise, so, you know, I don't think Rise really has any true gamification pieces like Articulate 360 does; even the scenario-based stuff is pretty rudimentary. So I really struggle with, you know, what modality within Rise do I use to relay which concepts.
B: That's something that has been gone over and switched about 80 million times since I've been an instructional designer, but for me it really depends on the interaction that I'm looking for. So if I'm looking for a straightforward course with kind of minimal interaction, then I would use something like Rise.
B: If it's something where I need to do a lot of breaks or a lot of knowledge checks, to ensure that the customer is still understanding what we're trying to train, with those breaks in between, then I would use something like Storyline, because it has a lot more variable features. I also use a lot of gamification, so all of my games are built in Storyline, because it has that variable availability that you don't have in Rise. Also, anything that you want scenario-based.
B: So say you have a scenario, and you say: okay, this is the customer, here's X, Y, and Z, here's the scenario. You have three answers (bad, better, great), and say they pick the middle one. So then the customer responds, "Hey, that's not what I wanted," and the scenario changes based on what they're choosing. So that would be branching: that's something where they select an answer and it takes them off to one path, but if they selected...
B: ...another answer, it would take them to a different one. Gotcha. Like, I do branching for our system admin training, depending on what install method they chose. So they click it, and it goes down a different training path; if they did Omnibus, they click it, and it goes down a different training path.
B: Well, use the tools that we have. I mean, one thing that I will say is that LinkedIn Learning has some really good tutorials on Storyline and on Rise. Using Storyline snippets, you can actually port Storyline interactions into Rise. So LinkedIn Learning has a really good course on Rise, and it has another really good one on Storyline as well. Oh, okay!
B
As
far
as
creating
good
evaluation
or
quiz
questions,
the
main
things
that
I
talk
about
or
quiz
question
context,
so
you
can't
quiz
a
topic
that
you
didn't
cover
in
the
training
so
making
sure
that
they
always
tie
back
to
what
you
trained
in
class,
keeping
it
short
well,
you
can
have
a
really
great
scenario
based
question
itself.
That
could
be
a
little
bit
longer.
You
always
want
to
make
sure
that
your
answers
are
short
and
concise,
so
they're
not
reading
18
paragraphs
as
they
go
through
a
question.
B: Apparently I didn't put this one in here, my bad. Well, you get some information with CI in there.
B: Avoid negative questions or answers. The problem with these is that it's not what people's eyes pay attention to first, so they'll read right over it, because if they're a speed reader like I am, you skip through some words as you're reading a topic. So say, "Which of the following is not, you know, a feature of GitLab?" Try to avoid those whenever possible, because they're easy, especially for speed readers, to misconstrue, and then they'll answer incorrectly. And always make sure that you tie back to the learning objectives of the course.
D: I think it'd be nice for each of us to share links that we have to different best practices, because for your question, like, "Are there any tips for writing good assessment questions?", there's like a two-day training I could do on that too. But some of our test delivery partners in the certification industry have tips and tricks like this on their websites, so you can just go read those. But I guess my most important one, because it's a huge pet peeve of mine when it's not done right, is to make sure that whatever assessment items you have, whether that's a question someone has to answer or some sort of work product they need to create to pass their assessment...
D: ...make sure that task relates very clearly to the learning objectives. And remember, with the learning objectives, you want to be very thorough and thoughtful about writing those, including the verbs and all the details in the objective, right? So if the purpose of the course is for someone to be able to do that task, that objective, your question should be making sure they can do that. It shouldn't be, you know, a trick question where you're asking them to remember a fact, and that's it, right?
D: Instead, it should be something where they have to know the fact to answer the question, but they're doing something; the skill they're applying is a lot higher-level than memorizing a fact. And that's my biggest pet peeve with a lot of assessments that I see: they're asking the wrong questions, so we don't know if that person can really do what we wanted them to do.
E: That's a good point, Christine. That's some of the feedback that I received from the field: that some of the content and the quizzes that they had taken part in seemed more like a trivia exercise, as opposed to testing actual knowledge.
B: At DevLearn this week they presented something called the CITA model, which I thought was really helpful for me, because it works on creating problem-solving, decision-based scenarios for certification questions. I put the link last week in the ed content channel, so check that out when you have a minute; it might align with what you're looking for. Just to end out, some other design tips and tricks.
B: So one big thing that I've noticed, that my eyes always go to, is if there's not enough white space; it's like the first thing I notice when I look at training. If it's crowded, there's too much text, there's not enough on the margins, you can tell that they expanded the text out as far as it can go. Just be mindful that you have good margins on all of your slides and your presentations, to ensure that you're not crowding out the screen. Like this: this, for me, is too much text.
B: The use of images: only use images when they add value to the training, so things like infographics, images, and screenshots of the system, that type of content. If you're adding an image just to add an image, or just to get rid of space, don't do it. Be mindful of color themes: stick to branding colors whenever possible, and just be aware of how bright colors are, and of the combinations that can affect the learner's vision, especially if you're putting a color on top of colored text.
B: I'm pretty sure it would have sent me into a seizure, so just be aware of that. And just go through the training as a learner and see, you know, how would this work for me? If I was in this class, would it be easy for me to read? Is it easy to see? Always make sure that you break up the content, especially because all of our training is online.
B: You don't want to just have them be reading for 45 minutes, or have them be listening to a lecture or presentation for 45 minutes, because it gets boring and they're gonna start multitasking within about five minutes. So put in Q&A questions, polls, games, and other activities to break up the monotony of the training. There are a couple of different games that you can do in the classroom: you can do Jeopardy, you can do Who Wants to Be a Millionaire, you can do Family Feud.
B: You can do polls, you can do, you know, random Q&A sessions, you can do discussion questions; there are lots of different ways that you can break up, you know, kind of the boringness of it. I hate lecture-based training, so I'm pretty biased when it comes to that, but just try to make it as interactive as possible, especially in an online environment.
B: A blended learning approach is proven to be the most effective. This means using more than one type of delivery method, so a combination of a quiz, a PowerPoint, a PDF, online courses: using a mixed approach of all of those usually gets the best results, because different mediums hit the different types of learners that you have in your audience.
B: You know, if it's just one straightforward course, or one objective that you're teaching, obviously you're only going to have one delivery method and maybe a knowledge check. But if it's a full-blown hour course or a curriculum or a certification, then you would want to use a blended learning approach, to ensure that you have a couple of different delivery methods.
B: Keep your training free of distractions. I know it's not as big as it used to be, but there used to be a lot of Flash GIFs that people would put in their presentations, or little Flash images; now that Adobe's not going to be supporting Flash anymore, those are thankfully going away. But things like that, overly bright colors, loud sounds, loud transitions, really flashy transitions...
B: All those do is distract from what you're trying to teach them, so try to take as much of that out as you can. Having a nice smooth transition is perfectly fine, or animations that actually add value to the training; but if you're adding a ton of animations just to have animations, it's not adding any value to what you're delivering. The biggest thing that I will say is: don't recreate the wheel when you don't have to. So use templates when they're available, and reach out to your peers for examples and for help.
B: If there's a course that's similar, make sure that you're reaching out to the team or the original creator who created it, and checking before you create new content, because there might already be one out there; that way we reduce a lot of the duplication effort. Our implementation with EdCast is going to help with that, because we're going to be able to see very quickly what's in the system and what training we have available. But in the meantime, use Slack and our education channels to check before we go create a brand-new course.
A: Makes a lot of sense. I also think, you know, one thing we've seen with the live learnings that we've hosted internally is using those as a networking forum as well, for managers or participants.
B: Have a feedback loop with whoever is actually consuming your content. You want to have a course survey for everything that you deliver, because you want to know if it was effective and if you reached the objectives that you were trying to reach, because that gives you the return on investment that you're trying to correlate with.
B: And as far as your last question, about what's difficult in creating content in an async environment: there have been times where, just due to the fact that everybody is busy, like extremely busy periods, usually right before QBR, getting feedback can depend on availability. Sometimes, if the TAMs are too busy or pro services is too busy, it can be a little difficult getting feedback, but it really depends on the urgency that you put behind it.
B: "Can you review this when you have time?" And a best practice for me is always checking with whoever owns that team first, to ensure that they actually have the bandwidth to give the feedback that I need, because it's not always possible, just due to schedules, you know, changes in team structures, losses on the team, stuff like that.
B: So as long as you continue to follow that path, and you identify those stakeholders and your SMEs early on, they're already aware that you're developing something and that they're going to be contributing to the review process. So it's not going to be a big surprise; you know, you're not just going to be dropping a link on them one day and saying, "Hey, give me feedback by next week." So I think it's about ensuring that you follow that process of identifying your SMEs and your stakeholders early.
B: Well, because you have to be valuable, you have to be, you know, very conscientious of their time as well. So give them the most advance notice that you can that something's coming down the pipe that they're going to have to review or provide content for; the earlier you can do it, the better. Yeah, we haven't had any issues with time zones. As far as I know, I've had TAMs from, you know, Sweden, and a couple of people on our team...
B: ...are, you know, in Spain and in Amsterdam and in Bangalore, and we've never had any issues with the time zones or differences there. They just put it in a document, or they put it in a review file. So if you have Articulate, really lean on Review 360, because not only will all your feedback be captured and easily accessible in that file, it also makes it a lot easier for async work, because they have a link and a comment field where they can put their comments.
B: And then you can reply back to them immediately when something's completed. If you're not using async work or Review 360 (say you're building a deck in Google Slides or PowerPoint), use the comment feature, because those comments will stay historically in the versioning, and you can also use the reply feature.
A: Yeah, this has been super helpful. I really appreciate you putting the time together to put together the deck. If you would like us to feature it, you know, on the L&D page or something, or the recording, definitely; I'll post that on that page. I think this could be a really valuable resource for a lot of team members.
B: Because there's a lot of stuff that we have, and there are a lot of great illustrations that GitLab has, that we use from the marketing team. So just make sure that you leverage what we have: instead of using a stock image, if you can find a GitLab illustration, using that instead is always best practice.
A: Yeah, we should do it more often, but keep it ad hoc. You know, nobody wants another standing meeting, I'm sure. Thanks!