A
Cool, so hi everyone, my name is Emily. I'm a product designer on the Growth team, and I'm here today to share the UX scorecard for the first-time experience of authoring a pipeline. The task at hand was to grade the onboarding experience for authoring a pipeline for the first time, but at the same time to pilot this new onboarding scorecard and adjust it as I worked through the UX scorecard. Doing the scorecard involved a few steps.
A
First, we realized that the persona here would probably be Sasha or Devin coming in and setting up the CI pipeline, and the job to be done would be: "When I have a stable development operations organization, I want to author a CI pipeline so that others on my team can leverage CI to increase the efficiency of their tasks."
A
The next step was to select the entry points. As you know, there are multiple entry points into authoring a pipeline for the first time, and I worked with Nadia to choose four key flows.
A
These are the flows users typically enter through: the "create a pipeline" widget on the merge request page, the entry point on the empty-state CI/CD page, finding out about CI/CD through a Google search, and the CI/CD button on the repo page. Those are the four experiences I wanted to go through, grade, and compare against each other.
A
This was really helpful for seeing and comparing the flows side by side. I then put them all into a Mural board built around user journeys and worked the UX scorecard through there. The fourth step was to evaluate the experiences with the new UX heuristics: working with Jackie's new heuristic scorecard, scoring each flow against the heuristics, and arriving at a final grade that way.
A
Now I'll walk through the four experiences I covered for this scorecard, the first being the widget on the merge request page. In this experience, a user would be creating a merge request without a CI pipeline, and they would see this "Show me how to add a pipeline" button.
A
This is the first entry point. If they click it, they're brought into the editor, where they're prompted to choose a template. One thing I noticed here is that the templates aren't really explained: you can choose them, but there's no real explanation of what each one does. The flow does show you where to go, how to choose a template, and what to do. Once you choose a template, it applies, and you're given the template with some stubs on how to edit it, as well as a prompt to commit your changes.
A
And then, finally, you get this celebratory high moment: you're done, your pipeline is ready to go, and you can see your pipeline in action, which lands you on the pipeline page, where you can watch it run. This flow was fairly quick, and there's that celebratory moment after you finish your first pipeline, which is nice.
A
The second one is the entry point through the CI/CD empty state. This is where you can click to get started with CI/CD, and it sends you straight into the docs. This flow is a lot more manual: you have to read through the CI/CD process and how to get started, then navigate through the docs to find the getting-started template page, where you can copy and paste a template into a new file.
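For reference, the file being created in this manual flow is the CI configuration at the repository root, which in GitLab is `.gitlab-ci.yml`. A minimal sketch of what a copied-and-edited template might look like; the stage names, job names, and commands here are illustrative, not taken from any actual GitLab template:

```yaml
# Minimal .gitlab-ci.yml sketch: job names and commands are illustrative.
stages:
  - build
  - test

build-job:
  stage: build
  script:
    - echo "Building the project..."

test-job:
  stage: test
  script:
    - echo "Running the test suite..."
```

Committing a file like this is what triggers the first pipeline run in the flow described here.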
A
So you go here, paste in your code, make the edits you need, and then you're sent to a merge request page where you enable CI/CD, and you can see your pipeline running. This one was a lot more manual, a different onboarding experience than the pipeline widget. Oops, one sec, the Zoom window got in the way. There we go. The third one is one Nadia actually said is pretty common.
A
People just google around, searching for "GitLab CI/CD" or CI/CD in general, and then land in the docs. From here, the experience is very similar to the empty-state one: they land in the docs, have to navigate through them on their own, and essentially onboard themselves through the docs. So I won't go through the whole flow, but again, this one doesn't have many onboarding prompts.
A
We don't give them much context beyond what they read through, but it is similar to the previous flow. The final one was users clicking the "Set up CI/CD" button on the repo page. This is the fourth experience: they get into this flow by clicking CI/CD, which brings you to a page that gives you a little prompt on what you're about to do and tells you to create the new CI/CD pipeline.
A
From there, you're brought to the pipeline editor page with the side panel expanded, where you can read a bit more about GitLab CI/CD, check on some templates you could use, and learn how to run your first pipeline. There's a lot of information here, but we do give them more contextual information than in the other two experiences, where we send them to the docs.
A
From there, they could go in to see CI/CD examples, look through the templates, find one that works for them, and then get "your changes have successfully been committed." That was the fourth experience.
A
Now for the results. One of the interesting things I found is that we have multiple different ways to introduce users to CI/CD, and they don't seem to be aligned: depending on the entry point, users get different onboarding experiences. But I did want to say that the first experience, the quick flow through the onboarding widget, was probably the best.
A
It had a lot of contextual information on where to click, it gave users context on how to do it, and it hit them at the right time.
A
One improvement here: the copy could be tweaked to further explain the benefits of pipelines. And while exploring this experience, I actually found a bug that led users to a 404 page instead of the pipelines page, so it was a useful exercise to go through.
A
The entry point through the empty-state CI/CD page rated a little lower. This is the manual one where we send users to the docs. We do provide all the information, but it's very manual: there isn't much onboarding help at all when they create the .gitlab-ci.yml file, and there really isn't a celebratory high moment for setting up your first pipeline. So there are a lot of improvements we could make to this flow.
A
Finding out about GitLab CI/CD through a Google search scored the lowest, obviously, because we don't really onboard them much at all. You can get through the experience by reading the docs, but we aren't really prompting them, so again there are a lot of opportunities to encourage onboarding and add happy moments. The final one, the CI/CD button on the repo page, scored second highest. It was easy to go through.
A
There was a lot of information in that side panel, but we did give them everything they needed. The entry point itself was a little hidden among all those gray boxes on the repo page, so it wasn't obvious at first glance, and there's still a lot of reliance on docs and templates users have to find and copy over. So there are opportunities there too.
A
Another bonus while doing this: I was working with Mate, and he came up with this great onboarding template, which we turned into a Mural template for anyone on the team to use. I've published it to Mural, so if anyone is doing scorecards and wants to use the template, it's now available to everyone. It will be linked in this slide, as well as in the issue I've put in the agenda.
A
This work also piloted the new UX heuristic scorecard, which I personally found extremely easy to use. It was really easy to put your scores into the heuristics and have it automatically calculate a final score for you. So I wanted to thank Jackie for all the work on that, because I think it really simplified the scoring aspect of the UX scorecard. And as a newer designer here, this exercise gave me great insights into the pipeline authoring flows, which was really great too.
A
I have a lot more context now, and a funny key learning for me was about recording the UX scorecard itself. I think I stressed myself out a little too much, recording it a few times before I thought it was perfect. Learning to let go of that fear, let yourself record, not re-watch it, and not judge yourself too hard was a big learning for me, too.
A
And I just wanted to give a small thank-you to Nadia for her amazing collaboration on this. I don't think I'd have been able to do this scorecard without her, so I really wanted to call her out, along with the collaboration between Growth and Verify. And that's it. Thank you.
B
Thanks, Emily. I have a few questions, but I'm going to let other people ask theirs first. It looks like Pedro has some.
C
Yeah, sure. I initially had two questions, but then it turned into three. Thank you so much, Emily, this was a nice walkthrough.
C
I think this is interesting, because now we can ask more specific questions that maybe wouldn't be so easy to ask outside of the UX showcase. Anyway, it was very interesting to see the many ways we introduce pipeline authoring. I wasn't aware there was such a big difference between them; at the very least it looks inconsistent, and I don't know if that inconsistency is intentional.
A
Yeah, that's a great question, and it's actually one I worked through with Nadia early on in the scorecard to narrow things down to these top four. I can get that information for you, but there was a lot of data showing people coming in through the docs from Google; that was a very common path users took. So yes, the information exists, and I can get it and link it to you if you're interested.
A
Yeah, I think it would be a great idea to look at which path we should be focusing on first. If we know people are coming in through the docs, how can we make that experience better? If we know people are coming in more through the widget, is that the experience to focus on? I think that would be very valuable information.
C
Yeah, cool, thanks, that makes sense. So I take it the plan is to narrow down to one of those paths, more or less. For example, there was the one that had the sidebar with some of the steps and links to documentation.
C
If we were to say that's the preferred way of guiding users when authoring pipelines, is the plan to use that approach rather than the others? Or do you think every onboarding approach makes sense on its own?
A
Yeah, that's a great question, and that's the focus we want to get into now. There are some experiments planned to work with that pipeline editor with the side drawer open, test different onboarding steps through it, and really make it a consistent experience across all the entry points. The initial goal is to make every entry point into GitLab CI/CD similar, so you're not going through different steps each time you enter from a different area.
C
That's exactly what I wanted to hear, that's perfect. And one idea that came to mind: you were evaluating all these paths from an expert point of view, saying this path ticks all of the boxes and this one doesn't tick so many, and you also mentioned you may experiment to see which one performs best.
C
Would it be reasonable to ask, and I know we could be here forever talking about solutions, but just a quick reaction from you on an idea: would it be reasonable to ask users to rate their satisfaction when they finish one of the flows? We have that celebratory thing saying "hey, you completed a pipeline," but we're not sure whether people are happy with the process. Maybe they completed it but felt it was a bit confusing, you know what I'm saying?
A
I think that would be great, and it's something I highlighted at the end of the presentation: it would be great to get user insights on some of these paths and see what's working and what's not. I know we have a few A/B test experiments planned around onboarding in CI/CD, so I think that would be very useful. I agree.
B
And to add real quick: I think some teams are also looking at putting little mini-surveys in the app, so after you finish the task we could ask "how was this?", maybe just a thumbs up or thumbs down or something like that. We don't have that yet, and it could become too much, it's the kind of thing that could be overused, but that would be interesting too.
D
I had the next question. Emily, thank you so much, I really enjoyed learning from you; this was really cool. Is there a boring first solution where we just improve the docs? I love that we're experimenting, that's cool, but that takes time and thought. Docs are faster to update, it sounds like we already know which of those paths is the smoothest right now, and we know so much of the access is coming through the docs anyway.
A
Yeah. When I was going through it, there's the option to send users out of the docs completely and always do onboarding in-product, but there are also options to improve the docs themselves. Some small ones: I saw that templates had to be copied and pasted manually, and the entry points into the "get started" pages were kind of hidden. So I do think there are quick changes we could make to the docs if we wanted to.
D
I think that would be amazing, and man, our docs team is all over it. So I highly encourage you to reach out to the technical writer for that area. I think it might be Marcel Amaro, but things have shuffled around, so you might have to dig a little to find out. Awesome, sounds good.
B
I wanted to ask a quick question: the heuristics we have listed are in different categories. Some of them are really specific to an onboarding or first-time experience, and some are more generic. So I wanted to know how relevant the non-onboarding heuristics were, and whether there were any where you thought "these just don't apply" and didn't use them.
A
Yeah, there were a few where I was trying to see how they would apply. The first was the "is my information secure?" one, but then I realized that on a user's first time through, that might be even more relevant than on their second or third. So I personally did use all the heuristics.
A
I found ways to link them back to onboarding, but I did like the flexibility to zero out the ones I wouldn't use. That ability in the scorecard is really useful for the future, if I'm going through an experience and don't think a particular heuristic should be graded or should affect the grade itself.
B
Cool, thanks. And for those of you who don't know what we're talking about: we updated the onboarding scorecard, or really the UX scorecard template, to have weights in it, so that if something in there is less important, you can reduce its weight, all the way to zero if needed, so it doesn't factor into your score.
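One plausible way the weighted roll-up being described could work is a weighted average that simply drops zero-weighted heuristics. This is an illustrative sketch, not the actual scorecard template's formula; the heuristic names, grades, and weights are made up:

```python
def weighted_score(scores):
    """Combine per-heuristic grades into a single final score.

    `scores` maps heuristic name -> (grade, weight). A weight of 0
    removes that heuristic from the calculation entirely, mirroring
    the "reduce its weight all the way to zero" option.
    """
    kept = [(grade, weight) for grade, weight in scores.values() if weight > 0]
    total_weight = sum(weight for _, weight in kept)
    if total_weight == 0:
        raise ValueError("every heuristic was zero-weighted")
    return sum(grade * weight for grade, weight in kept) / total_weight

# Illustrative grades on a 1-5 scale; "security" is zero-weighted,
# so it does not factor into the result at all.
example = {
    "clear entry point": (3, 1.0),
    "contextual guidance": (4, 1.0),
    "celebratory moment": (5, 0.5),
    "security": (1, 0.0),
}
print(weighted_score(example))  # (3 + 4 + 2.5) / 2.5 = 3.8
```

Lowering a weight shrinks how much that heuristic pulls on the final grade, and zeroing it excludes it, which is the behavior described above.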
B
So that's new, and we're open to feedback on it. Okay, cool. Does anybody else have questions?