From YouTube: Create:Code Review Prod/UX Sync - 2020-12-16
A
The two items we didn't talk about yesterday were related to... okay, this meeting is being recorded now. Yeah, so the two items we had left to talk about yesterday in the UX sync were related to the focus of the group, and also to de-risking and increasing the confidence about what we're focusing on. So, basically, let's start with the focus of the group, just in general.
A
B
For the first one, I linked a merge request. I would say largely this is: one, the direction page hasn't been updated, and two, one of the things I got pushed on for the Web IDE direction was rewriting it in a way that the first section described a series of problems and then how we thought about those problems. So what I'm trying to do, and how I'm trying to frame thinking about code review (which leads into what we're going to go do), is describe the workflow and the problem that exists, and then focus that down to one or two things that are in service of it. And I think you asked, now that I've seen what's going on in code review...
B
I think my biggest concern with code review is that there are lots of things going on, and they seem related, but it's not easy to draw a line back to how they're all related, what they're in service of, or why we're doing the five or so different feature areas we're doing at one time. So what I'd like to do is give us a lens through which to look at every feature, improvement, or anything else we're going to do.
B
Maybe we still need to do that anyway. Or, well, then maybe it's not as important as we think it is, because what we ultimately care about is getting to this end state. So I'm trying to reframe the way we're going to think about everything with that lens, and that is what this is supposed to be the foundation of. This overview section will probably take a lot longer to refine and get right, but once it is, everything else should flow naturally from there. I think that answers your question. The other thing I'm thinking about is code review philosophy, which...
B
...is related to the above. One of the things that's sort of disappointing, I guess, in looking at GitLab as a company, as an engineering company building engineering tools, is how we talk about code review. If you go read our development docs, it's talked about in a very tactical, "do this, don't do that" sort of way, and not... there is no engineering philosophy that I have seen behind the way GitLab, as a company, does code review, nothing that would drive the reason our code review product does what it does. Contrast that with the linked issue: if you go look at the document that's there, Google's docs about code review, there are tactical things in there, but it's told in a much more narrative way. It begins with the philosophy and the overriding thing that they care about, which is, you know: does this make things better?
B
Yes
or
no
they're,
very
binary
at
part
of
that?
And
if
the
answer
is
yes,
then
like
you,
should
you
should
default
to
merging
versus
we
don't
as
a
company?
We
don't
have
that
sort
of
like
philosophy,
I
think
it's
there,
but
it's
not
articulated
as
well
and
so
coupled
with,
like
writing
the
code
review
direction.
B
I have to sort of skirt around and say, well, we've got this, or we've got this, and we sort of don't really agree with that, but we can keep talking down this path, or whatever the case may be. So on top of trying to make sure the group's product direction is what I want to make it, I also want, as a longer-tail piece, to start figuring out how we codify the way GitLab works during a code review and how we push that forward. Those are the things I'm thinking about, and if both of those are the same thing, then together, what we go and build becomes much easier to decide on, because we build things that are in service of that. We worry less about the other things; we take them into account, but we try to figure out how we could reframe or redo them in support of this, versus, you know, what someone might say they need or want.
A
So let's imagine... you gave a great example of someone wanting to block a merge request, because they want to signal a negative, like Gerrit does with the minus one: "in terms of code quality or code style, this is a minus one." They say they need it, but our philosophy is not aligned with that. So what do we tell them? Do we just tell them that GitLab is not the right product for that, or that they need to change?
B
Well, part of delivering faster means there's a workflow, right? GitLab is fundamentally a workflow tool that allows you to deliver faster. If you just bring all of your old habits and things into a new tool, you didn't get better; you just moved it all to the new thing. And then you're going to dislike the new thing, because you're going to say the new thing doesn't make you faster. This is the same thing.
B
This
is
like
the
thing
that
the
jira
to
get
lab
issues
thing
people
are
like,
oh
well,
look
what
all
I
can
do
with
jira
and
we're
like
yeah,
but
you
just
told
us
that
you
hate
jira,
because
it's
got
so
much
stuff
in
it
and
you
have
to
do
all
these
things
to
like
manage
it.
But,
like
you
want
all
those
things
so
you're
just
going
to
redo.
All
of
that
here
right,
you
don't
get
any
of
those
operational
benefits,
and
so
I
think,
like
part
of
that
is
like
yeah.
B
...you've got to have that harder conversation. And part of it is that we need to do a better job of explaining how our tools support that. So in the case of a negative approval, if somebody wants that, I think the flip side is this: GitLab has affirmative approvals, we have required approvals, and so the absence of a required approval is in effect a negative approval. If I do not approve your merge request...
B
I think it's important that we get stronger and more explicit about that, and we can do that; we just need a way to point back to it. And pointing back to it looks like: GitLab's engineering team works like this, and here are the principles they use for code review. GitLab is arguably a very successful engineering company, and the product direction is in support of our engineering direction. Therefore, we also have a product direction that supports that workflow. Then it becomes much easier to provide that example, versus today, where someone would go, "Well, we're different, we want that," and I don't have a way to debate who's right or who's wrong, necessarily. Yeah.
B
A
So I'm thinking about the example of that feature you also stumbled upon recently, the one that allows you to... I don't know if you were able to understand how it worked, but it allows adding already-merged commits to a merge request.
A
I talked with James about it, I think almost a year ago, and basically we didn't want to build it, because we had so many priorities, and commit-by-commit, or in this case it's not commit-by-commit, it's post-merge review, wasn't a thing we were interested in pursuing. So in this case a customer decided: okay, you're not going to do it, we'll do it, we're going to contribute it. And so we've been having those MRs to add that feature.
A
How would we balance that with our philosophy? Would we just say no? In that case we would give a negative approval to the merge request, right?
B
The perks of being the maintainer! I don't know. I think this is the challenge with our open-source, "everyone can contribute" piece that we have to balance, because, honestly, I want to look more into this thing that this customer's adding, but I haven't been able to fully spend the time to grok it. Yeah.
B
It's being broadly introduced, right? When this rolls out, anyone will be able to see it, and there's now this new, complicated piece in our UI. Random people will stumble upon it and not know what it's doing, because it was sort of designed for one user, by one user. I think you're dealing with some of that. But...
B
I think, you know, that is...
A
Yeah, you also have that other issue, to choose a default branch that is different for merge requests, and that also has a lot to do with how teams work, how they're structured, how they ship things, and how they integrate their code. We haven't built it because it hasn't been something for us, but it could be for others. I think, at the end of the day, it's a balance, and you're right.
A
I think clearly you're right, and I agree with two things. One is the philosophy: aligning that with our current philosophy and guidelines for code review at GitLab the company, and as those evolve, the product will likely evolve in that direction as well. And also closing things that we are not going to do. I mean, we keep issues open and we just say: okay, we're not going to close it, this is interesting, and we are waiting for further demand.
A
I even think we have either a label or a milestone that says "awaiting further demand" or something like that, and if something is waiting for further demand, I think that's fine. It's fine if we keep things because we haven't been able to validate the problem yet: okay, maybe this is something interesting that could fit into our code review philosophy, but we haven't had time to do it yet.
A
So it's still in our validation backlog: we're waiting for demand, but at a certain point we're also going to go look to see if there's demand. But other things, as you're saying, can be clear contradictions to our philosophy, and in those cases we just close them, and we give a clear message: we're not going to do this, and here's why, and maybe here's a way that you could continue, or here's a way for you to get that outcome using GitLab.
A
B
A
I skimmed through it as you were talking, and I was immediately getting what you were saying, and the Google example as well, where they have just the narrative. As you were saying, it's something that is not tied to features, not tied to a tool; it's something that can be applied more or less regardless of which tool you're using, basically.
B
Yeah, I would go read it. The link in the issue goes to their engineering practices repository, which has the code reviewer's guide, the author's guide, and their code review guidelines across the whole thing. It's a fairly interesting read, just in terms of thinking about it differently. And I would contrast that with... it's not linked in here, but I think it's linked in my Google Doc.
B
A
Yeah, I think very closely related to this philosophy is the jobs to be done, and I added, I think last month or so, our proto jobs to be done for code review. There are two of them, and those are just the job statements that we used when we were doing the category maturity scorecard in Q2. And I think the links are not working well, the ones in that table. I don't know, I think it's...
A
The
generator
is
not
it's
it's
kind
of
broken,
but
basically
they're
not
validated
and
there's
a
whole
process
about
how
we,
internally,
how
we
validate
javascript
and
how
we
create
them.
A
These
ones
were
created
just
out
of
brainstorming
with
me,
daniel
and
catherine,
and
like
looking
at
what
we
already
had
in
terms
of
research,
but
we
need
to
do
like
proper
interviews
with
both
internal
and
external
folks
and
to
get
to
these
job
statements,
and
I
think
this
is
very
related
to
the
philosophy,
because
it
could
be
a
way
not
the
only
way,
but
one
way
to
first
have
something
that
we
can
use
every
day
in
our
work.
That
translates
the
philosophy
into
something
more
actionable
and
that
we
can
link
to
and
say
hi.
A
We
have
these
job
statements
and
and
again,
like
the
philosophy.
I
think
these
are
not
tied
to
a
tool
they're
tied
to
an
outcome
to
a
purpose,
to.
B
Yeah. Because you were one of the guinea pigs of category maturity, we got to look at a lot of your stuff when we did the snippets one.
B
And I don't know if it's because it was a new category, or because Ops is newer, but it seemed like more people on the Ops side adopted jobs to be done than on the Dev side, broadly, at the section level. You don't hear PMs on the Dev side talk about jobs to be done that often; you hear it much, much more often over there. But I think it would be good to really...
B
...ask: are they correct, and are they in service of our grander vision? It would be good to get some of them solidified and correct. And, I mean, one of the hypotheses, one of the things that's in that merge request as sort of the philosophy, is that we want you to deliver better code, faster, and it's a question of how that's phrased, with "better" being vague in not a good way.
B
See
if
they're
drawing
that
correlation
or
if
maybe
we're
like
dead
wrong
and
velocity,
is
like
not
a
thing
that
people
care
about
right
like
or
speed
to
delivery.
You
know
which
is,
it
could
be,
and
then
we
we
would
sort
of
have
to
like
go
back
and
figure
out
how
we
deal
with
that.
But.
B
A
If you had to stack-rank them or create a pyramid, it would be at the top, but it's supported by many other things. It's like Maslow's pyramid, where of course the ideal is for you to fulfill your purpose and feel your life is meaningful, but in order to do that, you have to eat and sleep and be happy and have friends and all that. So I think it's the same thing here.
A
It's
it's
a
pyramid,
but
I
think
the
ultimate
purpose
is
to
do
that
and
again
I
think
jobs
to
be
done
would
allow
us
at
different
levels
of
granularity
like
we
can
define
a
couple
of
jobs
to
be
done
that
cover
the
whole
category
and
a
macro
level.
Sorry
the
whole
group
of
code
review,
but
then
we
can
have
smaller
jobs
that
talk
about
more
specific
aspects
of
code
review
like,
for
example,
delivering
well-intentioned
feedback
or
having,
because
how
yeah
having
constructive
discussions
over
the
proposal
so
that
the
the
speed
itself
is
faster.
A
So
we
can
even
then
like
talk
more
and
be
more
granular
about
those
but
yeah.
I
think
I
think
we're
relying
on
that.
I
I
I
have
to
look
at.
I
think.
Basically,
what
we
need
to
do
now
is
just
align
on
on
what
the
focus
is
and
what
to
be
able
to
plan
ahead.
A
So
that,
like
this,
is
something
I've
been
wanting
to
do
for
a
long
time
is
to
have
these
jobs
to
be
done,
because
I
think
they
will
help
us
tremendously
in
that
focus,
but
at
the
same
time
and
designing
the
solutions
and
all
that,
but
at
the
same
time
I'm
not
going
to
like
do
that
now
without
knowing
exactly
what
we're
going
to
be
working
on
in
the
next
one.
Two
three
milestones.
A
B
You know, we need to get some of these weirder, longer-running things off our plate so that we can actually focus, but we also need to be doing that at the same time; we need to be getting those one to three milestones ahead. You know, we talked about what one of those might be, and I think we probably need to talk more about tracking. That's probably the other area, which I know you've mentioned before, that we should...
B
We
need
to
spend
some
time,
and
I
think
those
are
the
areas
I
think
reviewers,
yeah,
reviewers
and
handoff
and
tracking
are
probably
the
areas.
The
question
is
now:
how
do
we
scope
those
down
into
meaningful
things
and
like?
Where
do
we
go
do
with
those?
And
how
do
we
frame
that
back
in
terms
of
like
what
those
bring
to
the
table
in
terms
of
efficiency
and
better
code
and
speed
to
deliver
right
like
how
do
we?
B
How
do
we
frame
those
up
for
those
things
and
then
use
that
as
like
the
evaluation
of
like
well,
here's
15
things
we
want
to
do
for
reviewers,
here's,
the
five
that
are
in
the
support
of
that
and
the
other
five
are
sort
of,
and
then
five
more
that
aren't
and
and
then
use
that
to
like
help
help
get
those
there,
and
I
think
it
would
be
good
to
like
also
have
the
jobs
to
be
done.
For
that.
A
I mean, if what we're doing is in service of the higher goal, which is velocity, and if, at the same time, we are deciding...
B
A
...what the most frustrating thing in code review today in GitLab is that's slowing us down. And it has to be something that we, the code review group, can control. They could say, "Oh, the pipelines are slow," and we could raise that flag, but it's not something we can action on. So, other than that, inside of our scope:
A
What
is
the
most
frustrating
thing
that
our
developers
are
because,
for
example,
reviewers?
It's
it's
very
interesting,
but
it's
not
something.
We've
we're
struggling
with
today,
necessarily
right,
we
have
the
danger
bot.
People
were
already
assigning
themselves.
I
don't.
I
don't
know
if
it
was
like
the
top
thing
that
was
making
us
slow
and
hindering
our
velocity
in
terms
of
code
review
yeah
from
my
perspective,
as
also
the
kinds
of
reviews
that
I
do
what
I
and
what
I
told
you
and
I
think
affects
everyone.
A
Is
that
tracking
and
knowing
what's
changed,
but
maybe
there's
something
else
that
I'm
missing
and
in
in
that
regard.
If,
if
we
agree
that
that
is
the
higher
goal
velocity
and
that
we
will
align
with
our
internal
ambitions
and
workflow
yeah,
we
just
have
to
look
inward
and
survey
the
developer
population
about
what
it,
what
they're
frustrated
about
in
end
code
reviews
and
that
could
at
least
give
us
a
north
star
to
align
towards.
B
A
B
I agree, correct, given that we have engineers that support a way to do all of those things. And it sounds like Lucas is going to change Danger to use the reviewers field; that's why I was trying to filter it out, to get that data back in. And James asked: could we do something with Danger and reviewers to sort of force it? Could we, when...
B
...when Danger runs the first time and suggests who should be the reviewers for the frontend and backend, could we just tell Danger to automatically assign the reviewers? It's a way to see what the feedback and response is to auto-assigning a reviewer, because, in theory, Danger has at an end state a lot of the intelligence that we would want if we were going to automatically assign reviewers. So what if we just told Danger to automatically assign reviewers?
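The idea being floated here, taking the reviewer suggestions the bot already computes and writing them into the merge request's reviewers field instead of only commenting them, could be sketched roughly like this. This is a hypothetical illustration only: the area names, reviewer pool, and load counts are invented, and this is not GitLab's actual Danger logic.

```python
# Hypothetical reviewer roulette: pick one reviewer per changed area,
# preferring whoever has the fewest open reviews. Illustration only;
# names and counts are made up.

REVIEWERS = {
    "frontend": {"alice": 2, "bob": 0},   # reviewer -> open review count
    "backend": {"carol": 1, "dave": 3},
}

def suggest_reviewers(changed_areas, reviewers=REVIEWERS):
    """Return {area: reviewer}, choosing the least-loaded person per area."""
    picks = {}
    for area in changed_areas:
        pool = reviewers.get(area)
        if not pool:
            continue
        # Least-loaded first; break ties deterministically by name.
        picks[area] = min(pool, key=lambda name: (pool[name], name))
    return picks

def auto_assign(merge_request, changed_areas):
    """Set suggested reviewers on the MR instead of only suggesting them."""
    picks = suggest_reviewers(changed_areas)
    merge_request["reviewers"] = sorted(picks.values())
    return merge_request

mr = auto_assign({"title": "Example MR", "reviewers": []},
                 ["frontend", "backend"])
print(mr["reviewers"])  # ['bob', 'carol']
```

The experiment discussed would amount to flipping from posting the suggestion as a comment to writing it into the reviewers field, as `auto_assign` does in this sketch.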
A
I mean, I think that's a very interesting internal experiment, because we already have all of that, and probably what people are going to do is request reviews from those people anyway. But I would say that in that case, developers would probably need to be more diligent about their use of draft mode, so that the merge request stays in draft and the review request only gets sent when it's marked as ready, or else people will sometimes review empty merge requests.
A
But,
but
I
like
that,
I
think
that's
that's
an
interesting
thing
for
us
to
do,
even
if,
as
an
experiment
for
a
couple
of
days
to
see
what
developers
think
I
don't
think
it
will
break
anything.
To
be
honest,
I
don't
think
people
will
be
necessarily
like
super
annoyed.
If
that
was
done,.
B
A
So what do you think the actionable steps are here, going back to the original question that I had? How can I help you? Basically, that's my question. Yeah.
B
A
B
I think it would be interesting to get that to a mergeable state with the changes that are in it, something that we potentially agree is not in final form, but that is the basis of a way to start thinking about this. So any help you have in review, or thoughts there, would be valuable to me.
B
Then the jobs to be done and the other stuff: I'm appreciative that you brought that up and are trying to figure out how to fit that in, so let's start doing those. I'll say you'll probably have to push me more on those things, because they're not fun things for me, but I'm happy to see how we can do them. And, you know, maybe by the end of January we've got a really strong way...
B
...such that on February 1 we have, like, a monthly code review meeting or something, and we sort of reset: we try to start that conversation with other people, to get them to rethink code review in this way. So I would say the immediate thing is the merge request that's open.
B
Let's start talking about jobs to be done, and then, I think, help in terms of...
B
...because that'll help get us a few milestones ahead, in the sense that we will be more well thought out about it. We can go do what research we want to do, we can go do surveys, we can do whatever, and we can try to get some work built out of that, on the fly, but also for milestones to come.
A
Yeah, I think all of that sounds good. In terms of the priorities, with the information I have today, it would be great if we could survey the developer population internally to see what they think, what they're frustrated about, and what their pain points are. But I think just the time that people spend understanding where they left off on a merge request... I think that is so...
A
The catching up is more of a priority than the handoff, although I agree, as I agreed with you yesterday, that it would be great if we could provide some short-term relief for people who started using reviewers and now have no way to ask for a re-review. Other than that, I wouldn't do the more advanced things with reviewers. And as you've probably already seen, we have such a huge backlog of performance and polish things to do that we don't have a shortage of work there. And on performance:
A
If anything, it's the thing that most clearly impacts velocity. So if anything, we could put our whole team to work on the things that we know need to be fixed for the SUS: small things that we know are low effort but high impact, fixing bugs, performance. And then we focus on just one direction thing, as I think you were also suggesting before. For now, as I said, what I see as a very likely direction is the catching up, but I might be mistaken. And if we really want to align with how GitLab the company does code review, we also need to align with the frustrations and needs of the internal developer population.
B
Should we try to put a survey together now? Maybe we could simplify it to a handful of questions, like three or something. "On a scale of one to ten, how would you rate the code review experience?" And then maybe it's just two questions: "What's your biggest pain point?" and "What would you change if you could change anything?"
B
A
A survey is more helpful than an issue or a Slack thread or all of that, so just a survey. And, to be honest, I think if we were to do something, it would be either this week or only in January.
A
So,
and
I
would
strive
for
just
one
question-
I
think,
asking
developers
if
they're,
satisfied
or
not
like
their
level
of
satisfaction,
I
don't
think
gives
is,
is
very
useful
for
us
because
if
they
say
they're
not
very
satisfied,
I
mean
yeah,
you
have
to
put
up
with
gitlab,
that's
that's
the
tool
we
use
to
build
and-
and
it
doesn't
tell
us
exactly
what
we
should
focus
on.
I
think
just
one
question,
maybe
with
already
a
couple
of
options
or
maybe
a
way
for
people
to
somehow
like
stack
rank
problems.
A
I
think
there's
there's.
There
are
some
survey
question
types
that
you
can
like.
This
is
the
first
one.
This
is
the
second
one.
Third,
one
and
order
like
what's
we've
already
identified
as
the
big
problems
of
velocity
with
code
review,
git
lab
and
also
have,
of
course,
an
option
for
them.
To
tell
tell
us
if
something
is
missing
and
if
we
were
blindsided
by
something.
A
Yeah,
I
don't
know
if
you
still
have
some
some
minutes
or
if
you
have
another
call
to
attend
to.
Maybe
not
I've
got
time.
Okay,.
A
A couple of minutes, yeah, or else we will be number one forever. Yeah, number two. So, again, I think this is related to the first point, but... what developers think. I think that's also considered problem validation, not of one problem, but looking at all the problems that we have. And, to be honest, for us to be able to plan more ahead and have more predictability, we need more confidence, technically and design-wise, that we're solving the right problems and that we're solving them right.
A
I don't know if you use that often or not, but just thinking about the reach, the impact, the confidence, and an estimated effort... and I know, I'm saying all of this and it might sound like I'm saying that you don't reach out to customers, or that you don't...
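The reach / impact / confidence / effort framing mentioned here matches the RICE prioritization formula, score = (reach × impact × confidence) / effort. A minimal sketch with invented numbers (the items and values below are illustrative, loosely echoing areas discussed in this meeting, not real roadmap data):

```python
from dataclasses import dataclass

@dataclass
class Opportunity:
    name: str
    reach: float       # e.g. users affected per quarter
    impact: float      # 0.25 = minimal, 1 = medium, 2 = high, 3 = massive
    confidence: float  # 0.0 .. 1.0
    effort: float      # person-months

    def rice(self) -> float:
        # RICE score: higher means a better expected return on effort.
        return self.reach * self.impact * self.confidence / self.effort

# Invented examples for illustration only.
items = [
    Opportunity("catching up on MR changes", 800, 2, 0.8, 3),
    Opportunity("re-review request relief", 500, 1, 0.9, 1),
    Opportunity("negative approvals", 100, 0.5, 0.5, 2),
]

for item in sorted(items, key=Opportunity.rice, reverse=True):
    print(f"{item.name}: {item.rice():.1f}")
```

With these made-up numbers, "re-review request relief" ranks first (450.0) despite its smaller reach, because its effort is so low, which is the kind of stack-ranking the speakers are describing.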
A
B
For better or worse, I try to examine things in a vacuum to some extent, knowing that there are larger things going on. One of the things (we can get into this one too) is, say, creating merge requests with auto-assignment of a user. In a vacuum...
B
I
think
you'd
be
hard-pressed
to
say
that
like
measurably,
there's,
probably
like
not
an
improvement
to
that,
and
so
to
me,
I
look
at.
I
look
at
things
like
that,
and
they
say
like
will
this
improve
the
user
experience
for,
like
the
bulk
of
the
use
case,
that
you
would
use
this
button
based
on
sort
of
what
we
know
today
and
I
go
yeah.
That's
probably
true,
and
I
can
I
can
sort
of
disassociate
from
that
and
move
on
and
like
not
need
to
think
about
the
rest.
B
Like
personally,
I
think
where
that's
that's
not
helpful.
Is
that,
like
thinking
in
a
vacuum,
sort
of
takes
away
from
sort
of
the
impact
across
what
that
might
do
in
other
areas
of
the
application?
And
I
think
that's
what
I
think
you
and
I
differ
in
that
and
I
think
that's
fine.
I
think
that's
been
good
and
something
we'll
continue
to
work
on
but
yeah.
So
I
would
say
like
for
me:
I
try
and
be
much
faster.
B
A
B
Yeah, so what I did is I refocused those. And part of this is because of the way that I will often think about things, and you and I are going to have to adjust, I think. I was sort of going mad over the UX improvement epics, and we're going to need to...
B
Because
they're,
like
they're,
almost
useless
right,
they're,
not
in
the
sense
that,
like
those
things,
aren't
valuable,
but
what's
not
clear
right
in
the
way
that,
like
I'm
mental
model
and
trying
to
get
back,
is
like
what
are
the
things
that
we're
gonna
do
and
like
what
is
the
like:
larger
space
or
problem,
they're
they're
in
support
of,
and
so
like
you
there's
like
this
there's
one
of
these
epics,
it's
called
like
reviewer
ux
improvements
or
something
right,
and
actually,
when
you
look
through
those
there's
some
things
that
are
like
in
support
of
like
how
do
we
deal
with
the
re-review
workflow
right?
B
That's
not
a
ux
improvement.
That's
like
that's
a
problem
like
is
the
re-review
thing
and
so
like.
When
we
to
me
being
able
to
logically
group
those
into
like
bigger
problem
statements,
then
I
can
sort
of
look
and
go
like
okay.
Well,
here's
all
these
things
that
are
problem,
and
then
I
I
don't
have
to
debate
like
are
ux
improvements.
Important,
like
I
sort
of
naturally
get
to
like,
say,
they're
part
of
this
problem
and
I
can
mentally
move
on.
B
So,
for
me,
like
that's,
how
I
think
about
sort
of
all
of
those
problems,
the
bigger
ones
and
that's
like
for
small
sort
of
tactical
things
when
they
are
much
larger,
I'm
not
opposed
to
doing
opportunity.
Canvases,
I'm
not
opposed
to
doing
more
research,
I'm
not
opposed
to
like
having
focus
user
conversations,
but
what
I
try
and
avoid
is
trying
to
get.
B
That
is
a
highly
leveraged
activity.
Right
like
it
requires
a
lot
of
time
and
other
things,
and
so
I
try
and
avoid
over
rotating
in
a
direction
that,
like
requires
me
to
to
make
simple
things,
become
like
highly
leveraged
and
one
of
the
things
that
like
gitlab
sucks
at
is
data,
and
so
like
oftentimes.
We
don't
have
data
and
so
like.
I
make
a
lot
of
assumptions
based
on
like
what
we're
seeing
feedback.
How
like
we
use
things?
B
If
we
don't
have
data
like
I'm,
not
gonna,
go
try
and
instrument
something
or
spend
hours
going
through,
whatever
they've
changed
in
size
sense.
That
now
like
requires
me
to
relearn
where
everything
is
located
in
the
tables
now
to
do
things,
and
so
I
think
I
think
for
me
it's
a
balance
of
like
what
is
a
big
like.
I
don't
understand
this
and
like
what
do
we
have
a
reasonable
amount
of
understanding
of,
and
how
do
we
do
that?
B
The
other
thing
that
I
do
is
part
of
like
trying
to
validate,
and
this
is
like
a
bad
habit.
I
subscribe
to
our
labels
so,
like
I
subscribe
to
the
code
review
label
and
the
merch
request
label
and
all
those
god.
Oh
I
read
every
single
issue
and
every
single
merge
request.
I
will
always
read
every
single
one.
B
It
also
makes
it
much
faster
for
me
to
like
close
issues
and
logically
group
them
into
like
epics
of
things
that
we're
working
on
and
so
like
all
over
the
course
of.
Like
I
don't
know
a
few
months,
I
should
have
a
fairly
good
like
mental
rolodex,
of
every
issue
that
exists
for
for
code
review,
but
it'll
take
a
little
while.
So
then,
like
stuff
just
gets
closed,
and
I
know
I
know
where
everything
is
at
that
point.
B
That,
as
like
an
additional
validation
step,
because
I
think
that's
one
of
those
things
that,
like
sometimes
you
see
an
issue
and
in
a
and
like
you
look
at
that
and
you're
like
oh,
but
only
one
person
is
complaining
about
that
where
the
reality
is.
Is
there's
probably
like
four
issues
that
are
really
similar
to
that.
Given
git
labs
like
issue
tracker
and
no
one
can.
A
I find it crazy that you're subscribed to the labels, but if I was a product manager I would probably do the same, because I'd want to be aware of that; it would be one of the sensing mechanisms. Yeah, I was just trying to fit that into my workload, and it would be impossible; I wouldn't.
A
But
yeah
no,
it
makes
it
makes
perfect
sense
and
I
completely
agree
with
the
level
of
validation
like
when
it's
tactical
things
when
it's
more
strategic.
A
How
do
you
balance
that
and
which
methods
you
use
and,
for
example-
and
I
agree
we
suck
at
data
collection,
but
it
will
especially
for
code
review
in
our
group
that
it
has
so
much
usage
that
we're
not
leveraging
and
many
many
many
things
we
could
validate
ourselves.
Just
by
looking
at
the
data,
and
or
I
mean
we
wouldn't
validate
it
100,
but
we
would
have
a
very
a
much
better
sense
of
what
the
problem
is,
but
just
by
just
doing
that.
So
I
completely
agree
with
not
overdoing
opportunity
commences
interviews
and
all
of
that.
A
So
I
think,
I
think
we're
allians
and
in
a
way
I
think,
a
lot
of
like
we
know
what
the
problems
are
and,
as
we
saw
yesterday
like
the
more
concrete
example,
is
the
the
re-review
or
or
the
catching
up
like.
We
know
that
is
a
problem,
so
I'm
not
entirely
sure
we
need
to
validate
that.
It
is
a
problem,
but
perhaps
we
need
to
learn
more
about
the
problem
right.
A
So
in
that
regard,
I
think
I
can
help
with
that
and
maybe
in
the
future
I
could
even
be
much
more
a
part
of
those
validations,
but
I
I
wanted
to
know
how
open
you
would
be
to
leading
more
of
those
problem.
Space
explorations.
B
It
just
means
both
of
us
end
up
having
to
like
deeply
understand
something
to
versus
being
able
to
like
deal
with
more
of
the
tactical
and
some
of
the
smaller
things,
and
I
think
yeah,
and
I
think
you
know
we
just
need
to
find
a
nice
balance
between
where
you
and
I
both
get.
B
And
I
I
will
say
I
think
you
and
I
are
on
different
scales
of
where
we
find,
like
validation,
being
fine,
like
I'm
good
at,
like
some
anecdotal
evidence
and
like
a
little
bit
of
data
here
and
there,
and
I
think
you
have
a
stronger
bias
for
for
more
data
and
more
concrete
examples,
and
I
think
that's
fine.
We
just,
I
think,
we'll
strike
that
balance
over
time.
But
the
big.
B
Think
those
are
those
are
absolutely
ball
in
my
court.
I
don't
think
I've
ever
like
from
an
opportunity
campus
perspective
or
even
from
like
figures.
I
don't
think
I've
ever
sent
our
designers
down
those
pilots
to
like
go.
Do
that,
I
don't
know
you
can
ask
marcel
and
mike.
I
guess,
if
I've
ever
made
them
good.
A
B
Figure
something
out
and
saying
that
I
didn't
know
about,
but
I
try
and
avoid
that
what
I
what
I
like
to
do
is
get
like.
B
I
like
to
get
an
epic
to
the
point
that,
like
I
feel
pretty
good
about.
What's
going
on
and
like
that,
I
understand
the
problem,
and
then
I
think
this
is
the
thing
we've
got
to
have.
A
conversation
with
engineering
about
is
handing
that
to
you,
so
we
can
sort
of
like
get
through
solutioning
and
where
an
in-state
is
so
like.
B
This
is
what
we
don't
need
to
do
and
then,
like
all
those
issues
are
written
and
you
just
sort
of
like
figure
out
where
your
shipping
points
are
inside
of
that
and
then
like
and
then
I
personally
will
get
much
more
relaxed
about
our
feature,
flag,
usage
and
other
things
that
we're
doing
because,
like
we
sort
of
agreed
upon
scope
at
a
certain
point
right
and
like
that's
a
very
different
thing
than
like
right
now,
we're
like
we
don't
have
that
and
so
yeah
like
I
want.
I
want
huge.
B
I
want
like
big
epics
that
are
well
thought
out,
and
then
you
and
I
go
and
like
make
sure
we
clearly
understand
the
rest
of
that
problem,
make
sure
we
figured
out
what
like
we
want
to
go,
do
and
build
and
then
get
engineering
involved
in
that.
So
that
way
we
we
get
through
that.
But
but
I
don't
want
you
to
have
to
like
draft
that
big
initial,
like
problem,
epic,
that
will
have
no
proposal
or
no
solution
in
it
yeah.
B
A
Yeah, I agree 100% with everything you said; it makes perfect sense. And yeah, I think a lot of the things that we're thinking about right now...
A
I
think
that
would
alleviate
a
lot
of
the
risks,
a
lot
of
the
uncertainty
and
we
would
then
to
the
future
flags.
We
would,
I
think,
we're
still
intentional
with
the
future
flags,
but
we
today,
but
if
we
did
all
of
that
well,
the
feature
flags
would
not
only
be
intentional,
but
they
would
make
sense
and
we
would
use
them
when
they
need
to
be
used
right.
So
I
think
that's
the
big
difference.
A
I
agree
I
actually
have
to
go,
but
but
yeah.
If
there's
anything
that
comes
up
you,
you
can
slack
me
and
I'll
look
at
it
tomorrow,
but
I
think
I
got
all
of
the
answers
that
I
wanted.
B
Awesome
well,
I'm
excited,
let's
get
a
chat,
enjoy
the
rest
of
your
day
and
holidays.
I
will
probably
not
see
you
again
until
january,
so.