Description
Annabel Dunstone Gray walks through recent research regarding the merge request user flow and how it relates to onboarding to a new team.
A
I created a merge request navigation and user flow synthesis that I completed last month, and then I'm going to talk about how this relates to onboarding onto Code Review, and the pros and cons of going through something like this as an onboarding exercise, but also as an actual work item to be completed. So, walking through the presentation.
What we did here was: I looked at several years, actually, of older research that's been conducted regarding merge requests and how users interact with them, and it encompassed different things. It was videos and issues, and it was internal users and external users. Some of it was papers that were written about merge requests in general. It was a lot of things that I looked through to compile this, and the top takeaways that I came out with centered on the two hypotheses that I was actually focusing on.
A
The first is that the merge request UI is overwhelming and the page lacks focus. That was definitely the case: it can be difficult to parse and it feels crowded. There is too much information presented, but despite all of that information it can still be difficult to get a quick overview of the merge request. Information and actions are scattered, and it can be unclear what users should actually be looking for. The second hypothesis is that the user flow changes depending on the role, whether that's the author or the reviewer, and on where that user is within the entire review cycle, from merge request creation all the way through to the actual merge. That one was also validated, but there wasn't enough to really flesh it out, so I'm actually in the middle of conducting more targeted research about the entire merge request journey.
A
This links to a slide deck that I believe Anna put together, which encompasses issues, epics, and merge request metadata, with everything ranked by what users value the most and the least, so that will hopefully help us as well. The third takeaway is to move the merge request widgets to a consistent location to increase their findability and usage.
A
This one's definitely important too, but merge request widgets in general feel like a bigger issue. So yes, they need to be more consistent, but I've also seen mixed usage, and some are definitely used more than others. The ones that are used the most are usually the merge widget, the pipelines, and the approvals, and then the rest is a little bit more up in the air about who's using what. And again, the research I'm doing now covers only internal users, so just because our users aren't looking at certain widgets doesn't mean that nobody is. To help understand more about that, we're adding telemetry to all those widgets so that we can find out who's using what and how much; we're adding tracking to every piece of the widgets.
A
Fourth is to redesign comments and activity to highlight relevant items and use chronological ordering. I'm going to go into that one later, because it's a long one. And then fifth is to research the currently fixed UI elements to remove this behavior. This one is very near and dear to my heart, because I'm always advocating for fewer fixed elements. I'm trying to remove my internal bias, but I'm really not a fan of them.
A
It didn't come up that much, but users consistently said that the UI was cluttered and hard to read, and that there's not much space to do the actual review portion of it. So we're going to look into whether we can remove some fixed elements and hopefully clear the page a bit and make it a little easier to parse. So, going into some of the details.
A
This is an example of an open merge request I found, I think, last month. When you look at it, you don't know right away how many lines have been changed. You don't know if it's been reviewed yet, you don't know if the pipeline passed unless you look at the tab, you don't know if it's been approved yet, and you're not sure if all this metadata is really useful. It probably isn't.
A
And then, when it comes to the discussions, threads can break the chronology of events, making it difficult to see what has occurred since previously viewing the merge request. This is an example of a really long merge request I found, and, I mean, that's not a good example actually, but I'll look at some of the details. You can't see the resolved discussions without expanding them: if they've been resolved, then they're collapsed, and you can't see them without opening every single one. Somewhat important as well is that they're all styled the same. So in this example, you can see threads were started on old versions of the diff, and then at the top, this one is on the current version. If you're a reviewer, it's really important to know if the code is up to date or not, so you might not want to expand the ones that are on old versions versus the current code; but as a reviewer coming into this, they're all collapsed and you don't really know what's happened.
A
Because we allow threading of comments, the chronology can be all over the place. In this example, this is the first comment on a merge request, and you can see that there's a comment that appeared after the merge, but it's actually at the top of the page, so it's really hard to tell where everything is. The equal weight and styling of the activity makes it difficult to see the more important actions. For example, just looking at this: when was it merged?
A
A couple of considerations to take into account: users frequently mention information overload and feel they're seeing way too much, but at the same time those same users would frequently say they can't find what they're looking for quickly enough.
A
And then I've got further reading here; you can look at whichever links you feel like looking at. So that's the synthesis that I did for my onboarding issue, for my first milestone doing Code Review, and I wanted to talk a little bit about the good parts and the bad parts of that. The good part about doing something like this for onboarding was that I was able to get such a great overview of all the research, well, not all of it, but a good amount of the research that's been done throughout the years on the merge request page. It just gave me a really good overview, because even though I've been at GitLab for almost six years, I didn't know that we had done all this research. It also helped me figure out where we are and what we're going to work on next; part of the reason we did this, too, was to help inform our next steps.
A
I think it's already at the lovable stage, so what we're doing now is trying to make it even better, and that's kind of where the negative part, I guess, comes in, because when I was looking at all this research and making this presentation, I was just thinking that these are things that everyone already knows.
A
I felt weird putting together a presentation and being tasked with deciding what we're going to do next, when it kind of felt like everyone already knew what we were going to do next. In my first few weeks coming into Code Review, I had all these ideas and I was like, let's open issues, and I think for almost every single one, Kai or Pedro said, yeah, we already have an issue for this, or it's already being worked on. So I felt like I wasn't making as much of an impact as I was hoping to, and I think a lot of that stemmed from, I'm getting really self-reflective here, but I think I put maybe a little too much pressure on myself to know everything about Code Review, just because I used to be on the frontend team and I've been here for so long. Like, I should know everything and I should be able to make an impact right away, and that was not the case. I mean, I hope it's going to be the case soon, but that's a long process. I don't know where I'm going with this; I just thought it would be interesting to talk about.
So, to summarize, I think doing something like this is such a great idea for onboarding. Learning everything that we've done, and helping to shape where we're going, is really helpful; not having a specific deliverable, and maybe putting too much pressure on yourself, is not good. So yeah, those are the pros and cons, but that's all I've got right now.
D
All right, I'll make a comment. I'll say this is such an incredibly important part of our experience, and it's so important to our business. I'm so pleased to see you focused on this. Thank you, Annabel. This was great; it was informative for me. I'll actually probably go back and watch the recording again, because I think it's that important. So, thanks.
F
Yeah, that was awesome. I just wanted to say that I took the same approach when I moved over from static analysis to threat insights: going over what we already know, going over all of the research findings, and just making sure that I'm caught up on the known pain points. Do we have issues open for all of those?
F
I too found myself wanting to contribute right away, to show value or something, and I just had a chat with my PM yesterday about how I don't love how that manifested in some ways. I feel like I jumped the gun a little bit on some things, because I wanted to prove myself, rather than stepping back and asking: okay, wait, is this a feature that we really need?
F
It's kind of like when you join a new company and you just want to show everybody they made a good hire. I thought it was interesting; I was being self-reflective too, and just thinking, what am I doing? That's not how I usually work, or how I should work. So I had a chat with him yesterday about that. But anyway, I appreciate your vulnerability there, and I think it's great to start with reviewing research. I think that's a really great place to start off.
F
I had a thought and a conversation yesterday, actually, about category maturity and SUS scores, and whether there should be a relationship between the two. I don't think we have time to get into that now, but maybe that's something to discuss async; I don't know if those things should be tied at all. But good job taking this on, because this is obviously a hugely important part of the product, with a lot of considerations to make and different workflows and personas.