Description
Jeremy and Nick discuss iteration from a PM's perspective.
Related to: https://www.youtube.com/watch?v=M56CeleqpDI&feature=youtu.be
A
So, hey Jeremy. Thank you very much for joining us. Really looking forward to having our talk about your perspective as a product manager on iteration. So, first off, what does iteration mean, and how does that apply to your role as a product manager?
B
Yeah, thanks a lot. So I'm Jeremy, I'm a product manager here at GitLab, and I'm happy to talk a little bit about iteration. So iteration is one of GitLab's values, but I think it's one of the most important principles that we have as a product department here at GitLab. I think it means not only shipping the smallest thing possible, but also making sure that we get customer feedback, and rapid feedback, on whether or not we did ship the right thing.
B
But I think it's also a principle that helps us move quickly, ship effectively, and work collaboratively across disciplines. So I think it's actually one of the most important principles that we use at GitLab, and it's really one of the foundational pillars that we have as a product team.
B
Definitely. I think that you see collaboration break down a little bit when you fail to iterate, because the scope of what you're trying to accomplish grows, and as that grows, the number of people that you have to pull to the table in order to get something accomplished also grows in tandem.
B
So say you have a very large initiative: let's say changing something key, like doing a complete revamp of your onboarding flow. If you fail to iterate, and you fail to scope that large task down well, you're going to have to involve probably design. You'll have to involve legal, to decide whether or not the things that you're doing are okay, and you'll have to involve...
B
Engineering
you'll
have
to
involve
like
this
massive
kind
of
scope
of
folks
that'll
have
to
have
buy-in
on
this
decision
in
this
direction.
But
if
you
scope
the
you
know
the
task
down
to.
Oh,
we
just
want
to
change
the
copy
on
this,
or
we
want
to
just
change
this
one,
thus
modal,
to
like
the
experience
there.
Then
it
involves
a
much
smaller
group
of
folks.
B
So
I
think
that
you
know
you
what
you
see
is
you
see
like
a
much
higher
level
of
collaboration,
because
the
task
becomes
a
lot
less
daunting
at
each
individual
step
of
the
way.
So,
if
you
iterate
effectively,
I
think
it.
It
increases
the
velocity
and
the
amount
of
collaboration
by
just
like
reducing
the
amount
of
scope
that
you're
trying
to
kind
of
tackle
in
every
single
kind
of
step.
So.
B
That's what I've seen, and, as a result, you tackle more ambitious projects, because, honestly, the people that are involved are not daunted by the task ahead. So the small incremental steps are, I think, really important when it comes to making progress.
A
Cool. And I've heard you say before that product managers are effectively prioritization engines, or prioritization machines. Along that line, how does iteration actually help you as a product manager more effectively prioritize?
B
I think you and I have talked a little bit about local and global optimums in the past, and I think that when you break things down to their component parts, it becomes a lot easier to see whether or not they're contributing to your end vision of where you're trying to get to. Otherwise, I think iteration becomes really hard.
B
But
if
you
as
a
company
decide
you
know,
wow
like
onboarding
is
really
important
to
us,
because
the
top
of
funnel
like
is
something
that
is
so
important
to
like
our
business.
Onboarding
is,
is
a
huge
part
of
that
and
this
particular
piece
of
copy
becomes
like
a
really
important,
like
experiment
or
test.
Then
it
becomes
a
lot
easier
to
kind
of
see
that
incremental
kind
of
like
value
that
you're
that
you're
delivering
but
like.
B
If
you
don't
have
that
that
that
10
000
foot
view
of
what
you're
trying
to
accomplish,
I
think
then
iteration
becomes
really
challenging.
So
I
think
that
that's
the
thing
that
I
think
is
really
valuable
is
like
in
context
with
not
only
like
a
vision
and
also
a
like
a
feedback
loop
with
your
customers.
Whether
or
not
it
is
like
a
strong
like
data
driven
ev
testing
system,
where
you
have
like
strong
quantitative
knowledge
back
that
comes
back
from
feedback
or
you
have
like
passionate
customers
like
we
do
at
gitlab.
B
That
will
give
us
like
feedback
for
our
issue
tracker
and
cut
through,
like
lots
of
conversations
that
we
have.
I
think,
if
you
have
those
two
things,
iteration
becomes
your
best
friend,
but
if
you
don't
yeah,
it
becomes
a
little
bit
challenging
to
make
sure
that
that's
connected
to
something
valuable.
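The strong, data-driven A/B testing loop Jeremy describes comes down to asking whether the difference between two variants is bigger than random noise. A minimal sketch of that check as a two-proportion z-test; the function name and all conversion figures here are invented for illustration, not taken from the conversation:

```python
import math

def ab_test_significant(conv_a, n_a, conv_b, n_b, z_crit=1.96):
    """Two-proportion z-test: is variant B's conversion rate
    different from variant A's at roughly 95% confidence?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that the variants are identical.
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return z, abs(z) > z_crit

# Hypothetical onboarding-copy experiment, 5,000 users per arm:
# old copy converts 400 users, new copy converts 480.
z, significant = ab_test_significant(400, 5000, 480, 5000)
print(f"z = {z:.2f}, significant: {significant}")  # z = 2.82, significant: True
```

With only a handful of extra conversions (say 410 instead of 480), the same test reports no significant difference, which is exactly the quantitative signal that tells you whether a small iteration actually moved the needle.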
A
Right, yes. One of the topics that we discussed on GitLab Design Talks before was "think big, ship small, learn fast," and that sort of frames that outcome-oriented position as tied to the vision, using that as the context for your iteration, and then also using customer feedback as a way to...
B
It's
one
of
those
things
that's
challenging
to
do
in
practice,
and
I
think
that
that's
one
thing
that
I
think
is
is
something
that
you
have
to
develop,
like
muscle
memory,
for
which
is
not
only
like
writing
down
and
thinking
about
success
criteria
when
you
first
build
something,
but
also
making
sure
that
you're
you're
you're,
following
back
up
on
those
signals
to
make
sure
that
you're
on
the
right
track
that
you're
actually
following
up
on
some
of
the
success
criteria
and
that
you're
double
checking
to
make
sure
that
you're
on
the
right
path.
A
Right. I want to tie back to something that you talked about before, around local versus global optimums.
A
I'm
not
sure
what
that
player
offer.
It
is
but
yeah,
I
suppose
within
git
lab
we've
been
around
for
six
years,
something
like
that.
Maybe
a
little
bit
longer
don't
test
me,
but
yeah,
obviously
we're
not
completely
greenfield.
Now
we
we
have
different
areas
of
the
product
that
are
constructed
in
particular
ways
that
customers
may
be
used
to,
or
we
have
may
maybe
particular
data
structures
that
may
not
help
us
when
we're
trying
to
do
new
new
things.
A
So
we
need
to
deal
with
some
more
of
the
the
nuance
and
complexity
that
happens
when
you
have
big
things
going
on
and
and
things
that
customers
are
used
to
as
well.
So
my
question
is:
what
happens
when
you
found
that
as
a
product
manager,
you've
built
your
yourself
into
a
local
optimum
and
you've
got
a
solution
which
is
useful
and
well
adopted,
but
everything
new
that
you
build
doesn't
seem
to
quite
get
you
where
you
need
to
go
in
terms
of
that
broader
vision
that
you're
talking
about.
So
how
do
you?
B
Where
you're
trying
to
go,
you
you
don't
know
if
you're
there
or
not-
and
you
know
this
could
be
something
like
like
performance
to
say
like
well.
We
can't
build
feature
x
at
scale
because
we
just
and
and
that
is
of
critical
importance
to
like
where
we
want
to
get
to.
We
can't
like,
like
a
company
like
tinder,
for
instance
like
when
they
first
like,
had
a
vision
for
their
product.
B
I'm
I'm
pretty
confident
that,
like
their
co-founder,
like
their
founding
team,
had
a
vision
for
like
expanding
that
product
around
the
world
at
the
scale
of
millions
and
millions
of
users
and
by
nature
of
the
user
experience
that
they
wanted
to
create,
it
has
to
be
really
snappy.
It
has
to
be
performant
when
people
are
swiping
left
and
right.
B
This
could
also
be
something
like
you.
The
signal
you
could
find
is
like
a
customer
metric
like
satisfaction
or
retention.
When
you
just
hear
recurrent
themes
from
your
customer
base
around
certain
things,
and
also
you
try
to
improve
something
like
retention,
onboarding
like
conversion
or
something-
and
you
just
simply
can't
seem
to
do
it
given
like
smaller
changes
within
what
you're
already
doing
so,
you
know
those
are
the
things
that
I
kind
of
like
think
about.
B
We know that GitLab prefers Postgres as its database of choice, and that has trade-offs and at this point is really hard to change, but we kind of just know that there is some local optimum here that is preventing us from moving towards a global optimum in analytics.
B
We've
never
actually
written
it
down,
but
it
might
look
like
something
like
well.
You
know
we
need
to
be
able
to
respond
to
a
you
know.
We
need
to
be
able
to
query
50
000
different,
merge
requests
and
have
that
respond
with
under
a
second
or
something
we've,
never
actually
written
that
specifically
down.
B
We
just
kind
of
inherently
know
that
we're
kind
of
up
against
some
kind
of
performance
barrier
but
ultimately
like
we
have
decided
that,
like
our
vision
or
our
direction,
is
that
we
want
to
be
able
to
query
hundreds
of
thousands
of
objects
across
like
an
entire
gitlab
instance
and
have
that
perform
in
real
time.
So
there
is
clearly
like
a
change
that
we
need
to
make
and
some
valley
that
we
need
to
cross
in
order
to
kind
of
like
solve
that
problem.
So
I'll,
stop
there.
B
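The local-versus-global-optimum framing maps onto a classic picture from optimization: a process that only ever takes small improving steps stops at the nearest peak and never crosses a valley to a higher one. A toy sketch; the landscape and all numbers are invented purely for illustration:

```python
def hill_climb(f, x, step=1.0, iters=100):
    """Greedy hill climbing: keep taking the neighboring step that
    improves f, and stop when no neighbor is better."""
    for _ in range(iters):
        best = max((x - step, x, x + step), key=f)
        if best == x:  # no improvement: stuck at a (possibly local) peak
            break
        x = best
    return x

# A landscape with a small peak at x = 1 and a higher peak at x = 6.
def f(x):
    return max(3 - (x - 1) ** 2, 8 - 0.5 * (x - 6) ** 2)

print(hill_climb(f, 0.0))  # 1.0: stuck on the small hill
print(hill_climb(f, 4.0))  # 6.0: past the valley, it reaches the higher peak
```

Iterating harder from x = 0 never helps; crossing the valley takes a deliberate, larger change of the kind described here, like rethinking the data layer rather than tuning individual queries.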
A
Yeah, that's really interesting. I think Tinder is an interesting example as well because, in terms of design, they're the first people who basically made the card-swiping interface.
A
So
I
wonder
whether
there's
anything
you
can
think
of
as
a
product
manager
which
sort
of
allowed
allowed
that
design
created
the
right
environment
for
that
design
to
actually
happen,
whilst
taking
iteration
into
account
as
well.
B
Yeah
and
because
they
they
had
this,
they
they
based.
They
had
this
hypothesis,
probably
that
you
know
that
particular
experience
was
something
that
users
would
respond.
Well
to
that.
The
number
of
interactions
would,
you
know,
be
at
this
significantly
elevated
level,
as
opposed
to
like
the
high
friction
kind
of
like
interaction.
That
was
typical
react,
like
probably
compose
a
message
to
someone,
and
so
the
number
of
of
matches
that
you
would
have
like
with
any
other
product
would
be
this
very
low
level.
B
But
with
this
new
type
of
way
of
interacting,
it
would
be
like
10x
in
terms
of
like
what
people
like.
So
I'm
sure
that,
like
they
didn't
have,
they
didn't
like
their
first
iteration
of
that
product
was
not
prepared
for,
like
millions
and
millions
of
users
at
scale
like
they
first
wanted
to
test
this
idea
to
say
like
did
we
actually
do
we
actually
have
that?
B
This
is
really
working
and
then
underneath
the
hood,
I'm
sure
there
were
many
many
many
different
changes
and
improvements
that
they
made
under
the
hood
to
make
sure
that
they
could
get
to
that
scale.
But
it
all
started
with.
We
have
this
core
idea
we're
going
to
test
it
and
we
see,
like
the
data,
come
back
in
the
res
and
the
response
from
our
users
that
this
is
really
working.
A
Okay,
nice,
so
I
want
to
talk
about
another
practice
that
we
have
within
our
handbook
that
we
talk
about
quite
a
bit
about
and
get
lab
of,
boring
solutions.
And
I
wonder
whether
you
can
give
an
introduction
to
boring
solutions
just
a
little
blurb
of
what
it
is.
B
I think that the best way of summing that up is: do the simple thing. My dad was an electrical engineer, and when we were growing up, whenever we would be faced with a problem, I would reach for the fancy, cool tool that we had in the garage or something, and the phrase that he would always say, over and over, was just: keep it simple, stupid. Keep it as simple as you can, and find the simplest, most obvious solution to the problem. And I think that is a key tenet in product management: define the problem, and find the simplest, most straightforward solution to that problem.
B
When
at
like,
at
a
previous
role
early
in
my
product
career,
like
we
were
faced
with
the
problem
of
user
onboarding
users
failing
to
come
back
after
their
first
session-
and
we
had
this
hypothesis
that
that
changing
the
user
experience
in
this
in
a
certain
way
would
result
in
people
returning
to
the
product
a
little
bit
more
and
and
not
heeding
the
boring
solution,
which
was
to
kind
of
make
a
small
change,
ap
test
it
see.
B
If
that
worked,
I
decided
that
we
were
going
to
reach
for
machine
learning
and
we
were
going
to
make
this
kind
of
customizable
dynamic
experience
that
kind
of
changed.
So
instead
of
validating
that
problem
first
and
then
doing
kind
of
the
very
simple
thing
I
reached
for
like
for
the
complex
solution,
and
it
resulted
in
like
a
ton
of
tech
debt.
It
resulted
in
like
kind
of
like
this
confusing
scenario,
user
scenario,
which
was
really
hard
to
understand
whether
or
not
it
actually
moved
the
needle
in
the
right
direction.
B
So
boring
solutions
to
me
like
to
me.
It's
all
about
figuring
out
the
simplest,
most
straightforward
way
of
solving
your
problem
and
just
doing
that,
instead
of
reaching
for
something
more
needlessly
complex.
A
Yeah
I
I
worked
for
in
a
previous
job.
I
worked
with
with
a
an
elevator
company,
a
lift
company
as
a
as
a
customer,
and
they
always
used
to
tell
this
story
where,
like
some
of
the
first
elevators
customers
would
get
in
or
users
of
the
elevator
would
get
in
they'd,
try
it
out
and
they'd
start
to
think.
Oh,
this
is
taking
too
long
30
floors.
This
is
like
yeah.
A
Maybe
you
could
speed
up
the
elevators
in
some
way
or
another,
so
all
these
engineers
sort
of
got
together
just
like
how
do
we
actually
speed
up
the
elevators?
How
do
we
sort
of
do
with
the
with
the
the
pulleys
and
lifts
and
all
this
sort
of
stuff?
And
someone
had
the
good
idea
of
putting
some
mirrors
in
the
in
the
elevator
and
from
installing
some
mirrors
in
there?
People
had
something
to
do
while
they're
in
the
elevator,
so
the
problem
space
was
sort
of
redefined
from
we
need
a
faster
elevator
to.
B
I think that the best designers that I've seen here at GitLab do a very good job of balancing both the vision, what we want to accomplish in the future, with small iterative changes, and I think that we can lean into both of those more. Number one: whenever I've seen the vision for a particular feature or a particular area of the product develop, it might extend six months into the future, but I don't think I've ever seen at GitLab a really ambitious vision that extends three to five years into the future, of what user onboarding could look like, what analytics could look like, what anything could look like. Part of this is the nature of iteration, and part of this is the nature of time. But, because I've emphasized so much the importance of making sure that we understand where we're going, I think that in an ideal world we'd take the time, maybe once a quarter, to define: what's the three-to-five-year vision for value stream analytics? What do we think it could look like if we waved our magic wand and could completely redefine or rethink the user experience for what people want to see?
B
Well, let's take this small piece and let's fast forward six months, maybe if we had a couple of engineers to work on this for that long. Then, I think, the incremental changes and the iterations that we define become a lot more interesting and meaningful, because we can decide whether or not they're moving in that direction. And that's not something that design does in a vacuum; it's something we obviously do in partnership with product and our customers and engineering. But that's what I would really like to see more of, which is: define what the real future looks like. Because in some of these areas of the product especially, we do find ourselves up against some local optimum that is potentially holding us back from the future.
B
When I think about valley crossing, or what you do when you find yourselves in some kind of local-optimum state where you're really struggling to make forward progress: there's a spectrum of things that you can do. At the one extreme, you're trying to rethink an area of the product, or a problem space, that has so much inertia behind it that it is really, really hard to change.
B
You
know
people
will
create
like
a
beta
or
a
new
experience
and
they'll
just
like
run
with
two
things
in
parallel,
some
people
will
some
organizations
like
well
like
it's
risky,
but
we
don't
need
to
do
that,
so
we're
just
going
to
a
b
test
and
we'll
just
do
a
bunch
of
experimentation,
because
we
have
the
scale
and
the
data
to
make
sure
that's
meaningful
and
then
some
will.
B
You
know
it'll
be
at
the
lower
risk
end
of
the
spectrum
and
they'll
just
simply
like
iterate
on
it
or
make
like
you
know,
more
significant
changes
at
one
end
of
the
spectrum.
You
have
you
know
our
analytics
features
which
some
of
them
have
low
usage.
B
So
I
feel
like
we
can
be
relatively
you
know,
agile
with
what
we
want
to
change
as
long
as
we
know
what
we're
trying
to
accomplish
what
hypothesis
we're
trying
to
prove
out
and
why
we're
doing
it
at
the
other
end
of
the
spectrum,
you
have
someone
like
I'm
trying
to
think
of
like
a
really
large
product,
that's
been
reddit
like
which
has
been,
which
has
had
this
new
and
old
reddit
kind
of
like
running
in
parallel,
for
who
knows
how
long,
because
they
feel
like
it
is
such
an
important
decision
that
they
need
to
get
right.
B
I
don't
I
don't.
I
think
that
that's
super
costly
and
I
wouldn't
I
would
I
think
that
they're
doing
going
about
that
slightly
the
wrong
way.
But
I
think
that,
depending
on
kind
of
like
where
you
are
in
that
spectrum
kind
of-
but
you
don't
know
until
you
have
a
solid
sense
of
where
you're
trying
to
go,
I
think
that
that's
that's
what
I'm
trying
to
say.
So,
I
think
at
gitlab
when
it
turns
to
product
development.