From YouTube: UX Maturity Scorecard for Release Orchestration
C
All right... well, all right, perfect! Well, thank you, Christy, so much for joining us to talk about this. I wanted to preface this by saying that I am really excited that we have this process in place, because I do feel like we need some sort of evaluation criteria for maturity from the product standpoint, as I do feel like, when I talk to customers, there isn't really a great understanding of how we move projects forward.
C
So that's really exciting to hear, because the more evaluation-and-results folks that we have, the stronger we're going to get. Yeah, I will say, though: when I moved the Release Orchestration category from minimal to viable (we made that change back at the end of February, like we've discussed), the new maturity process wasn't merged yet in the handbook, so we were using the same definitions that were in the handbook and the re-baselining process that Haiyan and I did, using the user insights from there to validate the move in maturity.
C
So, if I could accomplish two things out of the meeting today with you all: the first one would be, one, just to kind of share what we did to declare that maturity move, so that we can hopefully declare that it's an acceptable move to make; or, number two, to understand the overall goals for the new category scorecard process, and whether we should declare a new job to be done to run through the process so that we can qualify the Release Orchestration stage.
B
Sid was the one who basically looked at our maturity model and said, I don't like that this is so subjective. And realistically I was like, hey, we don't like that it's so subjective either. He said, I really want to get to a place where we are doing this objectively, and our customers are telling us what our maturity is, not just what we think it is. And what was nice about it...
B
He said, basically, Christy, look, I'm giving you a lever here, because we know that (and by the way, I don't mean this directed at this particular team, because I don't think this is true; Jackie, I know UX is very important to you) broadly, we don't give UX enough attention. It can get sidelined for other concerns. So this is a way for him to raise it in the priority list, because it's important for everyone. So my response was, like, philosophically: yay! Oh my god.
B

This is how companies do this, we'll just go do that? No! No! This is all us. So we spent a considerable amount of time, Scott and me in particular, trying to figure out: where are we gonna start? Some of the ideas we tossed around were picking just kind of at random (random is not the right way to say it), but looking at, like, lovable, and complete, and viable, just to start, and saying, hey, we need to start validating somewhere. Like, where do we start? How do we learn?
C

I understand that. I would say that, at the core, I am fundamentally worried that putting layers between the customer and the value that we deliver for a customer is going to slow down our velocity, and that's ultimately the metric that we're all getting held accountable to at the end of the day: how much value we're driving for the end user. And I would say, when I joined Release as a stage in November, we were delivering 20 MRs a month, and this past milestone...
C
We delivered 37 MRs in release management alone. So, at that scale, we're seeing an exponential improvement based on how Haiyan and I are working together with engineering. The reduction in cycle times, the throughput of engineering, the value that we're driving for customers: it definitely validates that customer iteration is super important, because we've had well over, I think, 42...
C
Individual customer interviews between opportunity canvas and solution validation. But also, when Haiyan and I went through and structurally interviewed customers on how to create a release, that job to be done has over five months of validation behind it, and all the UX research insights. So what we can do is maybe map all that data and information to the maturity scorecard template, rather than having to conduct five new user interviews, because we already have a huge knowledge base of user insights, which isn't a universal truth across the entire product management organization.
B
The challenge we're gonna run into... so, first of all, yeah: every time we add something to our process, it could have a potential impact on velocity, and that's one of the things I think we should look at coming out of this pilot, which is why I'm running it as a pilot: what was the impact? But until we go and do it, you don't know what the impact is. So then we could go back, for example, to Sid and say: okay, we did 37 MRs in this timeframe.
B

We have to have a consistent and rigorous process to be able to say we're doing this the same way: we've asked the same question, and we've asked it the same way. So now, when we say something is this level of maturity, we know that it's this level of mature. If we start kind of playing with that before we even start it, before we even know anything about it, I think we start diverging off of the path already. And there are some specific questions that are part of the category maturity scorecard.
A

Yeah, no, I agree. Well, you know, the frame that we're using is a benchmark, right? And if you're doing a benchmark, you need to be asking the same questions in the same way each time, so you can compare the results backwards. What this team has done (and I talked to them the other day about this) is why we're here.
A
It's almost as if they've jumped past this whole thing that we're trying to prove. They've leveled up; they're, like, at a level 11. They have already infused UX into their process, and they've done it for three milestones already. So they've got a really high level of confidence in the decisions they've made.
A
So what we talked about is looking at that scorecard and the process, then looking at the data that they've gathered and seeing if they can map that data to the scorecard process. And if they see any holes in that process, let's talk about that, because, as a team, we weren't really sure if they had asked all those questions that we have outlined in that benchmark process. And if they hadn't, we would come back and talk about that, and how to gather that data. Do we need those five people?
A
Do we need another research effort? I don't know yet. You know, let's look at that data and see how it matches. So that's kind of where we were, but we also didn't want to go too far without having a conversation with you about that, to see, as part of this experiment... because, like, I view what they are doing as part of this experiment. This is ultimately where we want people to be, right? We want them to be using UX all the way through, to gain confidence at every stage with their decisions.
C
I mean, sorry, we also discussed picking a different job to be done in Release Orchestration, yeah, and doing the process for that. Just because we've spent the past five months validating and revalidating the "create a release" job to be done, it feels like we have way too much data. We already have our own internal biases about that job to be done, so it just doesn't seem like it would be a natural fit for this scorecard. Okay.
B
That's very helpful. And, look, counting clicks, I'll be honest with you: that is the part of the process that I am personally the least confident adds value to the scorecards, okay? So we debated this a lot, and where I landed was to defer to the researcher who was coming up with the process, because we didn't know; okay, you're the expert. But we also said we would revisit that decision. So I think what I would ask...
B

What you had was around asking the questions about confidence. So if Laurie is partnered with Hana, and you come out of that and you feel very confident in it together, then I'm okay with that. That leaves us, though, with one of the big points of this OKR, which was to vet the process. Because we know, it's just like we just said: maybe click counting isn't particularly useful; there are parts of the process that are wrong, and we have to go through it, though, to know what those things are.
B

Where we landed, and again this was after much discussion: we were trying to make the scope of this project smaller, and where we landed was, well, we're doing so much, or should be doing so much, up-front research between problem and solution validation to get something to what we're gonna call minimal, that we decided it wasn't as valuable to do it on a move to minimal, and that we would allow the PMs to make the call on minimal.
C
If we wanted to declare another job to be done in order to count Release Orchestration as viable, I would be really worried about the bandwidth, mainly because, going back to the velocity comment that I was mentioning: if we look at the key results that we're trying to accomplish, it's driving and influencing IACV and revenue, and I have three deals next quarter that I'm trying to influence with both secrets management and the CI/CD dashboard.
B

Look, I get the situation that you're in, so I want to express empathy for why you're like, "oh god." And I think the problem we're going to keep running into is that every PM is gonna feel this way. When is the milestone where you don't have something super valuable and customer-impacting to deliver? But that's...
C
Exactly what I was telling Nadia and Lori in our call was that we could do this for the Release Orchestration move to Complete next quarter, because we're looking to move Release Orchestration next quarter, and that would be something that we could definitely do. The job to be done would be "I can track progress of a release in GitLab," and that would be the feature set that we're executing against in 12.10, 13.0, and 13.1. So, to me...
B
You know, we've gotta be able to... we are gonna be held accountable for making progress on this initiative, so I want Scott to be aware of and bought into this. And then what I'll end up doing is, when we score the OKR, I will put... we always do a good/bad/try retrospective for each OKR, and I'll put the details in. So give me these details, and I'll put those details in the OKR, and that way it's gonna be linked from the handbook, so Sid has one place to look at it.
C

Then we will... like, I'm so on board with testing this for the move to Complete, thoroughly and confidently. I'm just, again, anxious for us to declare this OKR valid, given that we did all this other work for it with the previous job to be done. I just can't think of another job to be done for Release Orchestration that would capture the completeness as much as "create a release" would. Okay.
B

This is so new that I'm having trouble even talking about it, yes, which is why... so I will tell you what was important to me. What was important to me is: we put the task in front of the user, we see how they use it, and we get the information about where it's not going well. That was one of the things we really debated, too. There was a point at which the recommendation was no thinking out loud, and I was like, I cannot come out of these research sessions and say to a PM...
B

Yeah, so to me this is task-based. I want to come out of it with meaningful information that we can take action on. And then the most important part, to me, is for the user to verbally express their level of confidence in the maturity level, asking the question the way that we have asked them to ask the question.
C
It's the scale, getting that scale number; that's the number one thing. Okay, so that nails it for me, because we're doing a lot of the interviewing and questioning, but if it really is to get that quantifiable scale of confidence and comfort in our product... my sense is that it's very workable. I don't think it necessarily is as counter to the values as I thought it was when I first read this process, of, like, this is so inefficient, it's gonna take such a long time.
B

I believe that we will change this process. I will rely on the people who are trying to use it to come back and say: I tried this; here are the things that worked; here are the things that did not work; take these things out; change this thing a little bit. That's the only way we're gonna know how to do this. So that's why I keep saying your feedback is critical, and I agree. My goal is for this to be as useful as possible.