From YouTube: UX Showcase: Rethinking split dropdown buttons
A: How's everyone doing? My name is Jarek Ostrowski. I'm on the UX Foundations team, and today I'm going to be talking about split dropdown buttons and some recent user testing we went through, which was enlightening and entertaining all at the same time. The main issue is linked in the slides, as well as in the doc, if you're interested in taking a look as we work through this. So first, thanks, Jeff Crowe, for bringing this up in a previous UX showcase.
A: It was just reaffirmation, or confirmation, that this needed to be rethought. The example is the split dropdown where you open the dropdown, make a selection, and it actually changes the value of the button, and then you click the button to do that action. So in this case, when you hit "New subgroup," it actually just changes the button. We're all pretty familiar with this, but those watching might not be.
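The two behaviors contrasted here can be sketched in plain JavaScript. This is a hypothetical illustration, not GitLab's actual implementation; the function and option names are invented for the sake of the example.

```javascript
// Hypothetical sketch of the two split dropdown behaviors discussed in the
// talk; names are invented for illustration, this is not GitLab's code.

// Behavior under test: selecting a dropdown option only swaps the button's
// default label; the action runs when the user clicks the button itself.
function makeDeferredSplitButton(defaultOption, runAction) {
  let current = defaultOption;
  return {
    label: () => current,
    selectOption(option) { current = option; }, // no action yet, just re-label
    clickButton() { runAction(current); },      // the action happens here
  };
}

// The other behavior: selecting a dropdown option performs that action
// immediately, and the button keeps its default label.
function makeImmediateSplitButton(defaultOption, runAction) {
  return {
    label: () => defaultOption,
    selectOption(option) { runAction(option); }, // acts right away
    clickButton() { runAction(defaultOption); },
  };
}
```

With the deferred variant, choosing "Start a thread" only re-labels the button, and nothing visibly runs until a second click; with the immediate variant, the same selection starts the thread right away.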
So this was an opportunity for us on the Foundations team to do some user testing and gather some research around a specific element or component, because it's actually more rare to come across these opportunities than it is to test features out in a stage group, which is really cool.
A: So, our split dropdown buttons: we actually have two different behaviors with the same visual component, so obviously, right off the bat, this isn't a good thing. We want consistency: when you see something and you click on it, it should be consistent throughout the entire experience, not "here's a button, click it, something pops open or just doesn't meet expectations, and then another visually similar thing does something different." So here are some examples that we gathered, just preliminarily.
A: This was started a while back, and a couple of them, I think, are being worked on or refactored right now, which is really cool to see. So the "Mark as draft" one performs the action, whereas the other ones change the default button, and that's the behavior that we tested. And of course this would cause confusion, right?
A: And it's just kind of a fun little thing, because with two things next to each other it's "spot the differences": there's no difference, but then, when you click on them, something different happens. So, all right, let the testing begin. Let's do this, you know, grab the flamethrower, get in the monster truck, jump 20 cars; sweet; no! No! No! No! Okay!
A: I jumped the gun a little bit and put ten tests out there. Yeah, everything looked good; I collaborated on it with a couple of people to see if it was good, sent it out, and eight of the ten people didn't follow the directions properly. Now, a lot of you have already gone through this learning curve with user testing; I in particular hadn't, at least in a while.
A: I did some user testing a while back, and, you know, I just forgot that you need to refine the test before you send it out. Okay, cool, let's fix it, let's do more testing, let's go! You know, jump in the monster truck again, we're going to jump some more, but it's still failing. Okay, so even after putting "DO NOT CLICK" in the instructions, in all caps, people still clicked.
A: So this was a problem, but we kept refining, kept iterating. So here's the problem, and a visual example of what the actual prototype was. Basically, people were asked, in this scenario, because I felt the "Comment" and "Start a thread" split was as basic of an example, a real-world example, as we could get to test. They were asked to click the "Start a thread" option in the dropdown, and then, once they clicked, they were asked whether this met their expectations or not. When they did, the button changed to the new default, like a lot of these split dropdowns do, they change the default of the button, and then nothing happened. So there was this sort of awkward pause, and people didn't see anything happening; they thought something was going to start, or, like, "what?" you know.
A: So that was just a really interesting sort of nuance with the testing: seeing nothing happening. Or at least, they saw the button default change, but it wasn't what they expected to happen when they clicked it. But then, of course, when they clicked the button, they said, "okay, yes, this is what I expected when I clicked." So it was just those little nuances during the test, but we did gather enough data to come to a conclusion.
A: I ran like three or four tests, and for the ones who clicked the option in the dropdown, the button changed the default, and they did talk about their expectations after that action: it did not meet their expectations. This was my hypothesis all along, especially since we already had that feedback from Jeff's UX showcase and research that this was something users were confused by, that there was more work to get to what they wanted to do, etc., etc. And so this behavior will eventually be deprecated, and I'll go over the next few steps of what's going to happen for that. What's interesting is that this is a GitLab.com-specific element behavior; this is sort of a legacy element, if you will. Our current GlDropdown component, when using a split, does not have this behavior, so we didn't even need to go into the component to adjust this.
A: So, next steps. This part is done: we went and updated the Pajamas docs to remove the, quote-unquote, "changing default button text" behavior. So now we've updated that documentation so that that behavior is not going to happen, or shouldn't happen, anymore, and there's a link there, as well as in the UX showcase doc.
A: The next step, which I'm currently working on, is to identify the locations of all these split buttons with this behavior, and then identify which of you are associated with each particular area. So if a split button is in Plan, then I'll reach out to the person in Plan to talk about how we can deprecate it, and I'll work with you to update and do these things. Because what was interesting about this is that we had the opportunity to test this component, and it's not a component, technically, in GitLab UI.
A: But it is an element, so we're identifying problems with components that might be in all these different stages. If it's a problem, then it's a problem in all of the different stages that we're all working in, so it's super important that there's consistency and continuity here. And so, in that issue, I'll be identifying where we have the rest of those instances.
A: And really, that's it. I kind of went over two takeaways today. One was the user testing piece, which I just thought was interesting, and I know a lot of you have gone through this, but it was a unique case for me to go through it on the Foundations side, because, again, we don't have the opportunity to do that a lot with specific elements or components.
And so, anytime I can identify something in a UX showcase, or somebody posts something in the Foundations channel, like, I know Becca's mentioned tabs, so we're going to be looking at our tabs to see whether there's enough affordance, because tabs live everywhere, in a lot of stages. And if people are confused, or the tabs aren't discoverable, then that's something we need to test and validate and update on the Foundations side. So, yeah.
C: Oh, I'll ask a question; this is Alexa. Sorry, how are you going to stop them from clicking next time, Jarek?
C: So when you were testing, you learned they were clicking when you didn't intend them to. So did you redesign the test in some way, or how are you going to change it for the next time you do a test like this?
A: Yeah, no, okay, thank you, yeah! So, each iteration of the test: the first one was like, "oh yeah, everyone will get it, and it'll be perfectly clear," and everything, because we collaborated on it, and we reviewed it and refined it. But really, the real refining happens after you send the test out.
A: So after the first iteration, it was like, add a sentence: "do not click again once you've clicked the dropdown option." That sort of reduced the percentage of people who clicked, but people still clicked. And then it was like, you know, refine the prototype, because there was something about how they kept clicking but nothing happened; in particular, when they clicked the "Start a thread" button.
A: Nothing really happened; the prototype was just changing the button and nothing else afterwards, and so we changed that, and that opened a new dynamic of understanding the people and the test. And yeah, I just kept trying to iterate, like: once the dropdown is open, verbalize whether that met your expectations. And you just sort of keep running into the brick wall until you get enough data and research to say, okay, for these people who did talk about their expectations and followed the instructions to a T, there's enough data. Because I feel like it's never going to be perfect, and even if you have to refine it, as long as you gather enough data to give you a consensus conclusion, I think that's good.
C: Was your prototype... well, I'm assuming it was clickable from the start, right?
C: I mean, I don't want to get into ideation too much, but maybe, somehow, not having a clickable prototype at first, and then you have to click a "next" button to move on, or something like that. Yeah.
A: I don't know, but yeah, we can play around with it later. Listen, I'm open to any and all ideas when it comes to this stuff, so, anything to make it better. That's awesome.
D: Hey, Jarek, could you list or link the videos, or, excuse me, some of the videos? I'm kind of curious how the users were failing the tests.
A: Yeah, absolutely. I'll add the... I created a Dovetail project, so I'll link the Dovetail project. It has all the transcriptions, and I've tagged a bunch of the feedback and organized it a little bit, so I'll add that.
B: Awesome, great. If there are no other questions or comments, then thanks so much, Jarek, for sharing. Thank you.