From YouTube: Code Review Blueprint - GitLab Contribute 2021
Description
Speakers: David O'Regan (Engineering Manager, Create:Editor), Jannik Lehmann (Frontend Engineer, Secure)
Slides: https://docs.google.com/presentation/d/1AuPIyDkpozb0nx2TZHIcEfIRo_0o1uoAd4RVr2RHUvU
In this session, we will discuss a Blueprint to a good Code Review. You'll be learning the most efficient code review patterns, how to best communicate with authors, how to use the GitLab interface and more.
00:00 Intro
02:11 How can I be a good author?
04:35 Reviewing a Merge Request
11:09 Receiving Feedback
12:23 Code Review Sins
16:55 Real World Examples
30:15 Q&A
A: So we're excited to talk about code reviews. Look, a lot of people have already been through the chops on code reviews, so you might be asking yourself: what have I got to learn here? Well, we're going to tell you what we hope you'll come away with at the end of this talk.
B: Absolutely, and that is probably the question going through a lot of minds at this workshop right now. Here's what we have for you: if you're already a code review veteran, you're going to learn about technical empathy and how to collaborate like a pro. If this topic is rather new to you and you're a new kid on the block, we'll get you up to speed on the most efficient code review patterns, how to communicate with authors, and how to use the GitLab interface.
A: When it comes to code reviewing, the thing at the forefront of everybody's mind should always be merge profitability. What is merge profitability? It is essentially a percentage representation of whether or not the merge request you are looking at is going to add value to the code base, and that value can either be for developers.
B: Great stuff. Let me take a second and tell you how we're going to do that: we'll be covering all the details about what makes code good or bad. We've actually prepared a real-world example to give weight to those thoughts, and we're happy to answer all your questions on the topic in our Q&A after the session, or async in the issue which you're going to find in the chat.
A: If you're going to review code, somebody's got to write some code; if you're going to review changes, somebody's got to write some changes. So what are some things we can do to be good authors? Well, the first thing you can do whenever you're authoring a merge request is to put yourself in the reviewer's seat ahead of time and self-review your changes in the exact same way you would review any other merge request.
B: Absolutely, great stuff. Before you hand the MR over, please read your own description and sanity-check it. You want to focus all the concentration your reviewer has, so make it easy for them to understand the problem you're trying to solve. This also goes along with providing all the information in a digestible way; your reviewers will surely appreciate that. Do we have anything else to consider?
A: The last thing to consider is a pattern we would like to encourage within your code reviews. At GitLab we're not opposed to squashing commits and rewriting history; if it needs to be done, it needs to be done. But a valuable pattern to get into the habit of, if you're able, is to commit small and commit often, and to use those small commits to build out a linear change history in a merge request. This gives the code reviewer a set of contextual steps, commit by commit.
B: Absolutely, yeah. So let's switch sides and speak about some best practices for when you're actually reviewing a merge request from somebody at GitLab or someone from our community. Assume positive intent: nobody will assign you a less-than-perfect MR just to annoy you. There might be reasons for it that you don't know about, so lean back, relax, and provide feedback in a timely manner. If a merge request is out of your comfort zone or you currently don't have time to review it, there's zero shame in saying so and assigning it to another person.
B
If
you're
doing
the
review,
please
make
sure
to
provide
some
feedback
within
two
days
when
it
comes
to
giving
feedback,
please
be
sure
to
be
as
precise
as
possible.
Don't
let
the
alpha
assume
what
you
might
have
meant
to
say.
Clearly
the
most
important
thing
to
me:
don't
let
each
other
fail
if
you
as
a
regular
notice
that
the
author
is
having
a
hard
time
solving
the
problem,
help
them
out,
try
to
point
them
in
the
right
direction.
A: Exactly, we pull each other up, we don't tear each other down. So what else can we do when we're reviewing a merge request? Well, we can make sure we understand the intention of the MR, and that can be done by looking at associated issues and associated epics, maybe even getting in touch with the product manager who is driving the feature that spawned the MR. We need to really understand the intention of the MR and what problem is being solved to really be able to deliver a good review.
A: You do need to make sure that the author was solving the problem that the issue or epic actually described, so you need to make sure they're in alignment with what their PM or UX is driving from the change. If it looks like it might be in some way out of sync, feel free to reach out to the PM or UX directly on the merge request; this creates good collaboration and open transparency.
B: Absolutely, but there are even more things to consider. Giving praise is free of charge, so if you see something you like, just say it; but be careful with praise given just for the sake of praising. Encourage errors and analyze them; make the MR a psychologically safe place. A merge request is a place of human interaction, and humans tend to make mistakes; if that wasn't the case, we could skip our review process completely. Celebrate errors, they are fine, because it's a great thing when a bug is caught before it even hits our main branch.
B: Another thing to keep in mind is to define goals and make clear what's needed for approval. This is crucial, especially on larger merge requests. As an author, it's sometimes tricky to tell good advice from a reviewer apart from an actual issue, so provide additional information about whether your comment is blocking or not.
A: This feeds into the next point: if for some reason you find your context or concentration being broken by a lack of understanding or awareness, feel free to ping people. Pull people in often, pull people in early. We don't believe that this is bothering people; having short toes here at GitLab is a value. So ideally you get experts in as soon as possible if there's something you don't understand.
A: To continue with the code review sins: next, be precise in your language. This is a sin that a lot of people suffer from when they're reviewing code. We need to be very deliberate and very nitty-gritty in our language. Take the time to genuinely explain what might be the issue, what might be the problem, and what you're looking for from the author. Authors are not mind readers, and we need to make sure we express ourselves succinctly. The last recommendation, which is going to be the basis of how we demonstrate the code review in the live example, is to use the conventional comment system developed by our very own Paul Slaughter. The conventional comment system is a wonderful contextual comment system designed to inject your comments on a merge request overview with stateful context that an author can parse. It also makes them greppable for later on.
B: Awesome, yeah. So let's go full circle and speak about best practices for when you're receiving feedback as an author. Be fearless about making mistakes. We link to a great example here, and we will share our slides in the shared Contribute Slack later on, so go check it out; it's great. Keep in mind that assuming positive intent is a two-way street: it also applies when you receive feedback. Lose the ego: if your ideas and your changes are being questioned, try to stay open-minded and as objective as possible.
A: That speaks for itself. If, for whatever reason, you haven't taken the time to express yourself clearly, you use ambiguous language, you don't lean into collaboration, and you don't look to suggest solutions or to work with the author to find a better way of offering up some code, it's not a good way to review code. You're not going to feel good about it as a code reviewer, and the author's not going to feel good reading it.
A: Nitpicks. Nitpicks are a double-edged sword and they can be both good and bad, so we need to be careful when we're using them in code reviews. Where nitpicks can be useful is when you're suggesting something that might be succinctly beneficial to the code itself, so that you're actually going to increase merge profitability by offering it. But if that is the case, we would actually encourage you to use something else, maybe a suggestion, which we'll demonstrate later with the conventional comments syntax. Nitpicks have this terrible habit of becoming high cognitive load, and they generally don't actually increase merge profitability. So if you find yourself leaving a nitpick just because it's your personal style, maybe consider leaving no comment at all. Next: not testing locally. This does come with a caveat: if it's a one-line change to some copy, you're probably okay to skip the local testing. However, as soon as a merge request reaches some sufficient complexity, you're really not going to be able to understand the full scope of the changes without actually testing it.
A: If that's okay, you can leave me assigned; if not, feel free to assign it to a different reviewer. Also, feel free to reassign it yourself if you're just not going to get to it. It's much better to do that and respect the author's time than it is to let the merge request sit, because the longer the merge request sits, the quicker the code rots and the more context is lost. So if there's one thing you can avoid out of this list, please avoid radio silence. Last: not understanding the code you're reviewing. This is a problem.
A: If you find yourself reviewing code that you don't understand contextually, reach out to domain experts. The domain expert in this case could be the author, or it could be somebody who is an expert in the technology: maybe you need a GraphQL expert, maybe you need a Rails expert. It doesn't necessarily matter. If you find yourself dealing with things you don't understand and you can't dedicate the time, avoid the urge to hit approve just because it's easier; look to reach out to people for help. We work at a company with lots of developers, and that is exactly why: get some help, that's okay. Lean into collaboration and work with the author. So, moving on from our code review sins, we're going to actually go through a live demo of what we hope makes a good code review and give you some skills to take into your next set of code reviews.
B: Absolutely. So let's jump right into the MR that Dave has asked me to review. Let's look at his initial description first. It says: a tiny MVC for an issue where we migrate markup that's currently generated on the backend into the frontend, to avoid drops and redundant code. I don't know about you, but I didn't really know what was going on. So in my review, that was actually my first point: I mentioned, as a non-blocking suggestion, that this description was a little hard to get my head around, and I helped him out.
B
I
provided
a
screenshot
that
is
yeah
to
update
his
description,
to
make
it
easier
for
a
maintainer
or
everybody
else,
checking
on
this
to
actually
see.
What's
going
on,
I
provided
a
list
on
how
to
reproduce
the
behavior
on
a
local
machine
and
even
a
little
git
patch,
to
make
it
easier
for
everybody
involved
to
figure
out.
What's
going
on.
A: There are the conventional comments docs, and we encourage you to have a read through. But the meat and potatoes of the conventional comment system is made up of two parts, and the first one is issues. Issues highlight specific problems, and often they are blocking: if you find yourself reaching for an issue, it is more than likely a showstopper for this merge request. What we want to do is get into the habit of writing good contextual issues and really laser-focus on what the problem is, using precise language. What we don't want is ambiguous language; we don't want to just leave a comment on an arbitrary line of code that says "hey, this doesn't work". What we want to do is say: this is a problem, this is why, and then look to collaboratively find a solution for it. Jannik, thankfully, has found a blocking issue, and he has found a way to work with me on it collaboratively, so I'm going to hand off to him now.
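As a sketch of what that looks like in practice (the code details referenced below are invented for illustration), a conventional comment leads with a label, an optional blocking or non-blocking decoration, and then a precise subject:

```
issue (blocking): The dropdown re-fetches on every keystroke, which will
hammer the API under load. Could we debounce the input, or cache the
first response? Happy to pair on this if useful.

question (non-blocking): Is this helper still used anywhere after this
change? I couldn't find another call site.
```

The label up front is what makes these comments greppable and easy for the author to triage at a glance.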
B: Sweet, thanks, yeah. There was actually one issue we found, and you can see all of it in the demo video I provided. That is one of those takeaways, or a good example, I'd like to recommend to everybody: just pop one of those videos up. They are super fast to create and really make a difference when it comes to understanding the issue that you're actually speaking of. So please feel free, after this talk, to give this a spin.
A: Next we're going to move into questions. Generally you're not going to see too many questions inside merge requests, but we would highly encourage you to use them very often; they are for breaking context silos. If you find yourself unsure of something that you are highlighting, something that could possibly be a potential problem but you're just not quite sure about, use a question. And when we use questions, again, we want to use good, precise language.
B: Absolutely, and so I did, and I can't stress this enough: as David said, we are speaking of an incredibly complex code base here at GitLab, with very, very hard problems to solve, and reviewers will not know everything. So with this question, I was just not quite sure what was going on: I'd been looking into a thing that was removed, and I was seeing the potential that it might not have been removed fully.
A: We generally have two ways of proposing changes, and the first is using our native GitLab UI tool, the suggestion tool, which is a really awesome tool, and I quite like it; I think it's fantastic for small, succinct changes. The trick with using the suggestion tool, as Jannik mentioned previously, is that if you're going to offer changes with it, make sure that you have tested those changes before you offer them to the author, and make sure that you're not offering changes across complex pieces of code.
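For reference, the suggestion tool is driven by a fenced `suggestion` block inside the review comment; GitLab then offers the author a one-click apply for the replacement line. The replacement shown here is a made-up example:

````markdown
```suggestion
const MAX_POLL_INTERVAL = 5000;
```
````

Because applying a suggestion creates a commit on the author's branch directly, it is exactly the kind of change you want to have verified before offering it.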
B: Absolutely, yeah. On Dave's merge request there is actually a good example of another suggestion: he's been working on some specs, and those are actually running fine; there is nothing really wrong about them. I just found a little thing to do better, to help us maintain this more easily in the future.
A: And the suggestion tool is not to be mixed up with making a whole viable change to the codebase. The next way of making suggestions is specifically targeted at complex suggestions that maybe span a couple of lines or a couple of files, and that's a patch file. A patch file is a generated file that comes from a diff you create locally; you can then attach it to the merge request for the author to review, and if they're happy with it, they can apply it.
A: So why use a patch file over pushing directly to somebody's branch? Not that there's anything wrong with pushing directly to somebody's branch, but patch files really lean into the collaborative spirit of: hey, I've tested your code, I've looked at it, I think this can be done slightly better, and I'm going to give you this patch file that I wrote.
B: Yeah, going back to my very first comment, which was actually about the merge request description, I provided more information there: I attached a git patch to make it easier for reviewers or everybody involved to reproduce the behavior on a local machine. Git patches are such a great tool; they can be used with any kind of issue or comment you're actually going to make.
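As a rough sketch of that workflow (the scratch repo, file name, and patch name below are invented for the example), a reviewer can capture local edits as a patch file, and the author can apply it on their side:

```shell
# Reviewer side: make the improvement locally and capture it as a patch.
# The repo, file, and patch names here are illustrative only.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.name "Example"
git config user.email "example@example.com"
echo "retries = 1" > config.txt
git add config.txt
git commit -q -m "Initial commit"

echo "retries = 3" > config.txt   # the reviewer's proposed change
git diff > bump-retries.patch     # this file gets attached to the MR

# Author side: restore the original state, then apply the patch.
git checkout -q -- config.txt
git apply bump-retries.patch
cat config.txt                    # now contains the suggested change
```

The author stays in control: they review the patch like any other diff and only apply it if they agree.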
A: Praise helps foster a fantastic environment for the code review. Praise is where you're going to highlight something positive, and the trick with praise is to really make sure that what you're praising is actually praiseworthy, and that you're not praising something that can actually be detrimental to the author. As an example, something I genuinely love to see in reviews is when somebody writes their test specs to succinctly describe what the feature is doing. Essentially, I can start by reading the tests and know exactly what this feature is going to do. That is something I love to see.
B: Absolutely, yeah. I found something to be praiseworthy in your MR, and it had to do with tests. After removing some tests, you also cleaned up the mock data that went with them, and this is something which is easily forgotten; otherwise we're going to be left with a heap of cluttered mock data where nobody really knows what's actually going on. So it's important to keep those things clean, and that is absolutely great to see.
A: And it can be very tempting as a reviewer to jump straight to the problems in the code. Try to humanize the process by looking at the good things first. Second, Jannik did include a gif, and everybody loves a gif: it's a humanizing thing, it looks fantastic on merge requests, and it creates a fun, safe environment in which to, you know, make mistakes.
B: I know about some people actually using recorded screen-sharing sessions with themselves, so there are plenty of options; whatever works for you. I go with OBS, and I'm going to put a link in the chat. One more thing, actually, that I tend to do when working with longer demo videos: GitLab has a maximum upload limit of 10 megabytes, which can be quite annoying, so what I tend to do is still record it in whatever resolution makes sense and then just privately send it to whoever needs it.
A
That's
a
fantastic
tip.
I
personally,
I
tend
to
just
use
quicktime
and
then
trim
the
video
down
as
needed
and
reduce
the
quality
if
it's
too
big,
but
I
do
like
obs
quite
a
bit.
It's
quite
useful
technology
and,
as
annabelle
mentioned
people
campaign
for
loom
licenses
in
the
future,
so
I
would
highly
recommend
swapping
over.
A
A
B
A
A: If you submit a piece of code, you go to bed, and you wake up to an email or a notification with 20 comments to work through, that can actually be very demoralizing over time. Since this is our bread and butter and we do this a lot, it can take a really big toll on authors. So breaking comments into genuine context helps: as an author, I know exactly, all right, I'm just being asked a question.
B: Give it a go, I'd say. Actually, all of the above, and definitely when reviewing things from engineers outside the team: they might not be so familiar with our practices and with the techniques we are using, so be super, super cautious, go slow, and communicate in a precise way, adding demos where they help. When adding a git patch, try to drop in a little link on how to work with those, because this can be overwhelming.
A
So
I'm
going
to
give
a
miniature
shout
out
to
both
amelia
and
sarah
from
the
monitor
stage,
with
sarah's
former
monitor
my
p.m.
When
I
worked
there,
they.
A
A
You're
doing
is
actually
the
polished
product
that's
going
to
go
into
the
production
code
base
with
pms,
offering
a
very
succinct
how
to
test
guide,
because
not
everybody
is
super
technical,
as
you've
mentioned.
So
it's
really
on
me
as
an
author
to
offer
up
an
easy
way
to
get
in
and
adjust
the
changes
that
we've
made
and
to
make
the
code
review
process
much
much
smoother.