A
And maybe before that call, it might be a good idea for you to give everyone an introduction of yourself and your book, right?
B
Sure, let's see, where does it start? I guess, well, I'm no longer at Zuora, which is the company where I wrote the book and the experience the book was based on. I can go into that later, but I was the head of the data science team at Zuora for about five years, from before they IPO'd until, I guess, two or three years post-IPO now, and one of the things I did while I was there is...
B
Pretty much everyone has a churn problem nowadays, so it's gotten really broad, beyond just subscription, but I guess subscription was where people first recognized it. More about my background: what can I say? I did a PhD in, kind of, neuroscience with AI, which sounds sort of normal to say nowadays, but actually, when I did a PhD in that area, it was not such a common choice; it was before, you know, neural networks had taken off again. After that I actually worked on Wall Street for a while.
B
I had the great foresight to join the finance industry in 2007, shortly before it sort of imploded, but it was a very interesting career experience, so it was all good. But then, after being on Wall Street for a while... Because I had done machine learning in graduate school. Well, when I went to Wall Street, there was no such thing as data science; I mean, there were no jobs for data science.
B
When I finished, though, when I was interviewing, I actually made a decision between machine learning jobs and a Wall Street job, and, maybe unwisely, I chose the Wall Street job. But hindsight is 20/20. Anyway, during the time I was on Wall Street, that was when data science became a thing, and since I had done machine learning back in the day, it was kind of a natural transition to get out of finance and into data science.
B
The first company I worked at when I became a data scientist was trying to do churn, so it was actually even before Zuora: I was at a small startup that was trying to build a product to help with churn, and, well, that startup didn't do too well. But based on that experience and my connections from the company, I moved to Zuora, where they also did a lot of churn work. So that's kind of the whole background. So I was gonna say, I have, well...
B
I have two presentations: one that I gave at a data science conference, which has some actual code in it, and one presentation that I gave at SaaStr, which is more of a general-interest presentation.
A
B
Sure, okay, then I can kind of quickly go through the presentation that I gave at SaaStr. It was actually just about a year ago, which you'll see from some of the data.
B
Let's see, I'll share my screen. There's this first slide with the video; I'm not going to show you the video, you can find it on YouTube if you look. There's a video at the start of this, with rock music, that makes being a data scientist look like... whoops, I'm not trying to share this, I'm trying to present this. I got the wrong button. There.
B
Okay, so this is the presentation I gave at SaaStr Scale last year. I don't know if anyone went to that; of course, it was a virtual event. So I start out with "what is churn?". We can probably just skip this one: anyone who's at SaaStr Scale is gonna know what churn is, but I usually start out with that slide anyway. Does...
C
B
...anyone want to talk about what churn is? Probably not, right? Let's go on, okay. "Fighting Churn with Data": that's the book. I don't have to promote my book to you guys. What is Zuora? Also, I don't think I have to tell you guys about Zuora, really. Do you know what...
B
Yeah, so, a lot, yeah: it's pretty much the who's who of the SaaS world at this point. All right, so this is actually where we get into some of the actual content. This is data on typical monthly churn rates for SaaS, and this is from the Subscription Economy Index. Actually, hold on a second, someone's mowing their lawn right outside my window; I'll be back in just a sec. I'm just gonna...
A
B
Yeah, cool, all right. Well, I have my noise situation under control. So there's a document that they publish, I mean, I used to publish the Subscription Economy Index and they still do, so you can get updated versions of these numbers by going to Zuora.
B
So at the time that we did this, the median churn was two percent per month. The bottom ten percent is five percent, and the top ten percent is, as you can see, 0.6. So you really see a huge range just on the Zuora platform, and the key thing to remember is that it's not that there are good companies and bad companies, because there are so many variables that affect churn rates.
B
One is whether you're B2B or B2C: generally, a consumer product will always have higher churn, although there are exceptions like Netflix, of course, but we'll talk about that in a sec. Company maturity is very important: more mature companies almost always have lower churn. Some very high-growth companies in the early stages also have very high churn, and that can be okay for an early stage. Also, where you operate; I say "market segment" here, which is not really clear, but, for example, from the Zuora data...
B
...we had customers whose business was largely in Latin America, and guess what: they had a high churn rate compared to a comparable business operating in Europe or the United States. So there are so many things that can affect your churn rate that it's really important to make sure you get an appropriate benchmark if you're trying to compare yourself. So for GitLab, you would really want to look at B2B SaaS companies at a similar stage of development, and don't compare yourself to consumer companies or even public companies.
B
This is annual churn rates: exactly the same data, just converted to annual. So the median is around 20 percent, and the top 10 percent is down below 10 percent. Also, public companies that report churn are generally reporting less than 10 percent; that's like your Netflix, and I think Disney Plus is also well under 10, although some of them don't report all that data.
B
This is all from Zuora data, and again, there's just a really big range that you see. Any questions or comments, or should I just keep going?
D
Carl, real quick on the churn: this is churn across all segments within those customers, correct? So if the SaaS company is selling something, it's taking into account any SMB, mid-market, enterprise, and this is cumulative, these are...
B
Well, generally, it's always the case that the bigger your customers, the less they churn, and so SMBs are more like consumers. But this data takes every company and just calculates an overall churn rate. If you haven't seen the Subscription Economy reports that I used to do, it's pretty much taking every Zuora customer as an anonymous sample, aggregating all the data and then calculating the churn rate. So we would only calculate one aggregated churn rate for each tenant, and it was only one calculation.
B
We used an annualized calculation even for consumer companies, where that wasn't really the right churn rate, because usually for a consumer product you look at monthly churn and for B2B SaaS you look at annual churn. But for this study we calculated everything as annual churn and then just converted it to monthly for comparison when we wanted to.
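The monthly-to-annual conversion described here is a compounding (survival-rate) calculation, not a simple multiply-by-12; a minimal sketch of the arithmetic:

```python
def monthly_to_annual(monthly_churn):
    """Annual churn implied by a constant monthly rate: the fraction of
    customers lost after surviving twelve months at (1 - monthly rate)."""
    return 1.0 - (1.0 - monthly_churn) ** 12

def annual_to_monthly(annual_churn):
    """Inverse conversion: the constant monthly rate implied by an annual rate."""
    return 1.0 - (1.0 - annual_churn) ** (1.0 / 12.0)
```

The two percent monthly median quoted earlier works out to roughly 21.5 percent annually, consistent with the "around 20" figure on the annual slide.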
E
Cool, thank you, Carl. A quick follow-up to that: did you see any cool correlations between net revenue retention and gross revenue retention? So maybe startups that had significantly higher gross churn were also the ones that were leading the way on net, because they were expanding their existing base so much.
B
I don't remember exactly. We definitely saw a correlation between high growth and high churn, right, so early-stage companies that were growing very rapidly. This was Zuora: we would see like 50, 80 percent growth, a hundred percent growth; we saw that kind of data routinely, but it often went with a high churn rate. So at one point we said: oh, maybe companies with a low churn rate grow faster, right? That would be your cause-and-effect reasoning: if you lower your churn rate, your growth should go up.
B
Actually, we saw the opposite: the higher the growth rate, usually the higher the churn rate, if you just look at the data, all else equal. Because who has that high growth rate? Early-stage companies. And who has a high churn rate? Well, a lot of companies have high churn rates, but early-stage companies have high churn rates. So we generally saw churn and growth rates positively correlated, which, well, wasn't what people were hoping to find.
B
The peak over the past couple of years was Q3 of last year, but it wasn't that bad, in a way. What we observed at Zuora was that we had a lot of companies whose churn went up, but, you know what, Zoom is a Zuora customer too, and their churn went way down, and a lot of Zuora customers also had churn go way down, because they were part of the solution for working from home.
B
So, okay, this is actually getting into the content of the book. First, just to summarize: what are data-driven churn reduction strategies? The first one I always like to say is having a great product, because if your product is great, people won't churn, right? The data part is figuring out what the most engaging and sticky features are without doing surveys, because your churn data is like a survey that every customer takes. They only answer one question, churn or renew, but every customer takes it.
B
So it's great if you see it like a survey like that. Also, marketing is for reducing churn by giving people information that they really want, trying to target the messages so that you're not just spamming people. Also... whoops, that's two at once. Well, the first one was customer success; obviously, you guys know about that.
B
Sometimes I have to explain what customer success is. Obviously, using data to target people for customer success is important, especially when you start thinking about strategies which have a cost associated with them. If you're just emailing people, the only cost is that you might drive them away; if you're actually doing customer success, there are costs associated with doing training, making onboarding documentation, and all the things that customer success does. On pricing, the message is always not to discount to reduce churn.
B
Lastly, there is targeting, which is figuring out where your best customers tend to come from and getting more from that channel. I always say this is the weakest form of fighting churn, because, for one thing, you're not actually retaining your existing customers and you're not actually making your product better. But also, usually no one can get all the customers they want from one preferred channel, or from the one channel that gives them the best results.
B
So it's always just one tool in your toolkit, that kind of targeting. Let's see, so that's more of the background. So why is churn hard to fight? It's hard to predict, even with AI and modeling, for a number of reasons. In my experience, the reasons people churn are very subjective, and you have limited information about why they're really churning. It's better with B2B...
B
You
might
have
more
in
from
info,
but
still
there's
also
always
a
lot
of
random
timing,
and
if
this
is
I'm
sure,
we've
all
done
this,
where
you
forget
to
cancel
a
subscription
you're
not
using
because
then
what
does
the
data
driven
algorithm
at
that
company
say
it
says:
oh
carl's
a
risk.
He
hasn't
watched
a
video
in
four
weeks.
Oh
carl's,
a
risk.
He
hasn't
watched
the
video
in
eight
weeks.
Why
doesn't
that
guy
churn?
Oh,
he
forgot
you
know
so
a
turn
model.
You
know
it's
less.
B
It's less common with B2B; usually B2B customers are more rational. But still, you'll have some B2B customers where you look at their data and you're like: what is wrong with these people, why don't they churn? They probably just haven't gotten around to finding a replacement, or they have bigger fires to fight, and they're going to churn as soon as they have the IT resources to migrate. So it's also hard to prevent churn, because these are customers who know what you're selling.
B
The joke is that there's a diamond bullet: you just reduce your prices and you'll reduce your churn. I call it a diamond bullet because it works, but it's too expensive; you can't afford it, and really it's just another way of losing money. If you have to reduce your prices to reduce churn, you're losing money that way, so it's not really a churn reduction; it's just, you know, shifting it...
B
I
guess
where
the
source
is
and
lastly
well,
as
we
can
probably
see
from
this
call
churn
is
hard
because
it's
multi-team
and
customer
success
is
often
said.
Oh
they
own
churn,
like
quote
owned,
the
churn
metric,
but
the
truth
is
in
many
companies,
it's
also
the
product
and
the
marketing
and
the
pricing
which
are
all
you
know
going
into
churn.
So
well,
yeah.
I
mean
there's
just
a
lot
of
risks
there
of
people
not
talking
to
each
other
using
different
metrics
but
yep.
B
So,
let's
see
going
through
more
ai
for
churn,
I
say
and
I'm
a
data
scientist
and
I
do
ai
for
a
living,
but
ai
is
only
a
little
bit
helpful
with
churn
and
well
definitely
only
for
organizations
that
are
already
using
advanced
analytics.
So
if
you're
already
doing
stuff
with
ai-
and
you
want
to
do
a
churn
model
using
ai-
then
it's
like
great,
but
if
you're
not
already
doing
ai,
don't
rush
to
use
ai
for
churn.
B
It's
not
just
about
the
accuracy.
Let
me
just
get
all
these
things
out.
It's
that
an
ai
algorithm
can
tell
you
who's
most
at
risk
for
churn,
but
it's
not
going
to
do
a
good
job
targeting
them.
For
you
know
for
a
particular
intervention
I
mean
it
would
be.
Very
I
mean
really
you'd
have
to
be
a
very
advanced
company
to
be
using
your
analytics
to
do
to
target
your
interventions,
but
also
your
interventions
need
to
be
designed.
B
Like
let's
say
your
intervention
is
going
to
be
an
email
campaign
that
will
highlight
good
features.
Well,
ai
is
not
going
to
write
that
email
and
ai
is
not
going
to
tell
you
who
to
send
it
to,
and
you
need
to
send
the
email
about
features
to
people
who
aren't
using
the
features
which
is
so
your
data
on
the
who's.
Using
what
features
is
going
to
be
really
useful,
but
not
the
ai.
B
You
know
that
will
just
flag
people
at
risk
also
for
same
things
like
you
know,
your
pricing
strategy
and
your
product
market
fit.
You
know
ai
is
not
going
to
tell
you
how
to
change
your
product,
which
is
really
the
best
way
to
reduce
churn
is
always
you
know
the
best
product
is
the
best
way
to
reduce
churn.
B
So
what
do
you
do
with
data
I
in
the
book?
I
focus
on
what
I
call
customer
metrics
and
if
you're
in
the
data
science
profession,
you
call
this
features
or
feature
engineering.
I
don't
know
if
anyone's
in
that
line
of
work,
I
avoid
it
in
the
book
saying
you
know,
features
in
feature
engineering
just
because
if
you're
a
software
company
it
can
get
kind
of
confusing.
B
Are
you
talking
about
the
data
features
or
the
product
features,
so
customer
metrics
should
be
easy
to
understand
and
well-designed
customer
metrics
do
predict
churn
and
retention
as
I'll
show
you
very
in
like
the
next
slide.
So
it's
not
that
there's
no
predicting
of
churn
it's
just
not
using
an
ai
model
to
do
it
and
your
customer
metrics
are
also
what
you
can
use
to
do
your
targeting
so
take
like
an
email,
that's
going
to
promote
a
product
feature
that
you
think
is
really
sticky
and
engaging
well.
B
If
you
have
a
metric
of
who's
using
the
feature,
you
can
use
that
metric
to
target
people
for
that
email
campaign
or
if
it's
an
onboarding
training
for
people
a
certain
feature
area.
You
know
you
can,
if
you're
measuring
who's,
using
what
you
can
target
people
who
need
help
with
that
product
feature.
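A sketch of that targeting logic, assuming a simple per-customer usage metric; the names (`usage`, the threshold of two uses per month) are illustrative, not from the talk:

```python
# Hypothetical metric: how often each customer used the promoted feature
# last month. Low users are the ones who should get the feature email.
usage = {"cust_a": 0, "cust_b": 14, "cust_c": 1, "cust_d": 9}

def campaign_targets(usage_by_customer, threshold=2):
    """Customers whose feature usage is below the threshold: the
    audience for a campaign promoting that feature."""
    return sorted(c for c, u in usage_by_customer.items() if u < threshold)
```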
B
Well, Broadly... it's a little confusing when I explain their business. It says: Broadly ensures your business looks great online and is found by potential customers. Broadly manages reviews. So if you're ever at a website and it pops up a little box that says "will you review this place?", it's probably Broadly, or maybe a Broadly competitor, I don't even know. But that's what Broadly does: it gets people to give reviews.
B
But
the
thing
to
remember
is
the
broadly:
customers
are
our
businesses,
so
this
is
b2b
and
it's
consumers
who
are
giving
the
reviews
and
they
have
events
in
their
system
which
are
called
promoter
and
detractor.
This
has
nothing
to
do
with
nps
score.
So
just
this
is
not
about
nps
promoter
and
broadly
means.
Your
customer
gave
you
a
good
review,
so
it's
not
the
broadly
customer.
Reviewing
broad
broadly
is
broadly's
customers
customer
reviewing
broadly's
customer,
because
that's
what
broadly's
customers
want
to
do.
B
So
this
is
what
happens
when
you
look
at
the
number
of
positive
reviews
per
month
that
they
get
on
broadly,
and
this
is
a
simple
example
of
using
a
metric
to
tell
you
something
about
churn
by
looking
at
customers
in
well.
I
call
it
cohort
analysis.
Usually
cohort
analysis
is
based
on
a
time
or
they
all
signed
up
in
the
same
month.
B
This is actually cohort analysis based on all the customers having a metric in a certain range. So the bottom cohort is the group of customers who have basically one or zero promoters per month, and then the next cohort is like the next decile; I don't think there are ten, but they're designed to be equally sized. And, as you can see, the more good reviews per month, the less churn. So this is good, and this is what you should expect to see for really whatever your important product features are. I've seen it at so many different companies that, if I don't see a chart that looks like this, then something's wrong, usually something's wrong with the data. I don't show you the churn rate here, because that's Broadly's confidential information, and that goes for all the other case study charts.
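The metric-based cohort analysis described here (rank customers by a behavioral metric, cut them into equal-sized cohorts, and compare churn rates across cohorts) can be sketched in a few lines; the toy data is illustrative:

```python
def metric_cohorts(customers, n_cohorts=5):
    """Metric-based cohort analysis: sort customers by the metric,
    split into equal-sized cohorts, and return each cohort's churn
    rate. `customers` is a list of (metric_value, churned) pairs."""
    ranked = sorted(customers, key=lambda c: c[0])
    size = len(ranked) // n_cohorts
    cohorts = [ranked[i * size:(i + 1) * size] for i in range(n_cohorts)]
    return [sum(churned for _, churned in c) / len(c) for c in cohorts]

# Toy data: (promoters per month, churned within the observation window).
data = [(0, 1), (0, 1), (1, 1), (1, 0), (2, 1),
        (3, 0), (4, 0), (6, 0), (8, 0), (12, 0)]
```

On a healthy metric the churn rate should fall as the cohorts' metric values rise, like the chart being described.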
B
I
won't
show
you
the
actual
churn
rate,
but
this
this
is
the
right
relative
scale,
so
the
people
who
are
healthy
customers
have
well
about
one
quarter
of
the
churn
rate
of
the
unhealthy
customers
in
the
bot
in
the
bottom
cohort,
and
you
usually
see
an
elbow
like
this,
where,
whatever
the
metric
is,
it
helps
for
a
while
and
then
beyond
a
certain
point.
It
actually
doesn't
continue
to
reduce
to
reduce
churn,
that's
also
really
standard
in
these
okay,
so
that
was
good
reviews.
What
about
bad
reviews
if
a
broadly
customer
gets
bad
reviews?
B
Well?
Let's
look!
Actually,
if
you
look
at
it
this
way,
it
looks
like
bad
reviews
also
reduces
churn.
So
this
is
customer
detractor,
that's
the
number
of
bad
reviews
they
get
per
month
and
it's
a
little
bit
weird.
Actually,
why
do
more
bad
reviews
lead
to
less
churn?
It's
a
trick
question
which
I'll
give
you
the
answer
to
on
the
next
slide.
F
Okay, could it be because, if they're getting a lot of reviews, they're also getting more bad reviews? That is... that was the...
A

B
Here, instead of looking at the number of bad reviews, you look at the percentage of bad reviews, and now you see what you're expecting to see: the more bad reviews, the worse it is. But you have to look at the bad reviews as a percentage of the total, or when you look at this one you'll get fooled by the correlation. And that's really common in all of this work, because at most companies there are a lot of behaviors which are correlated, where good users who use the product a lot do a lot of everything.
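The fix being described, dividing detractors by total reviews rather than counting them raw, is just a ratio metric; a minimal sketch with made-up numbers:

```python
def detractor_ratio(promoters, detractors):
    """Share of a customer's monthly reviews that are bad. Raw detractor
    counts correlate with overall activity (busy customers get more of
    everything), so the ratio is what isolates the bad signal."""
    total = promoters + detractors
    return detractors / total if total else 0.0

# A busy customer with many bad reviews can still be healthier than a
# quiet customer whose few reviews are mostly bad.
busy = detractor_ratio(promoters=40, detractors=10)   # 20% bad
quiet = detractor_ratio(promoters=2, detractors=3)    # 60% bad
```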
B
So
you
actually
have
to
look
at
these
kind
of
comparison
ratios
to
see
the
the
influence
of
well
actually
even
to
figure
out.
What's
bad,
so
there's
other
examples.
Okay,
like
this,
this
is
broadly.
This
is
where
they're
top
turn
fighting
moves.
Okay,
so
they
use
the
customer
success.
Actually
uses
the
customer
metrics
also
to
talk
to
the
customer
so
when
they're
having
a
call
with
a
customer,
they'll
pull
up
a
dashboard
and
be
like.
Oh,
I
see
your
number
of
bad
reviews
like
went
up
last
month.
B
You
know,
and
they
just
use
it
as
like
a
talking
point
to
engage
with
the
customer.
Also
many
companies
do
this.
They
just
ignore
the
people
who
are
most
at
risk
like
going
back
to
this
slide.
You
just
ignore
these
people
here
who
have
like
zero
or
one.
You
know,
they're
they're,
usually
not
savable.
B
They found that integration with a booking system was very sticky, and now they prioritize it in the setup. I think a lot of people find that booking and accounting systems are very sticky, so if you can somehow integrate with their booking and accounting system, that's usually a good play across different SaaS categories.
E

G

B
The problem is, I don't recommend actually automating the discovery of the best metrics either, because a big theme I talk about in the book (maybe I don't really say it in this slide deck) is that usually it's pretty obvious to people who know the business, because it's whatever gives customers value, or utility in economic terms, or they say engagement, or the magic moments. And, you know, I don't...
B
I
don't
work
at
gitlab,
but
I'm
pretty
sure
your
most
important
metrics
are
going
to
be
things
like
how
many
repos
do
they
manage
how
many
pipelines,
how
many,
how
many
check-ins,
how
many
merge
reviews,
because
that's
what
people
are
paying
for
when
they
use
git
labs,
so
those
are
some
form
of
those
and
the
thing
is,
you
can
also
usually
can't
measure
the
value
that
people
get
directly
only
in
some
cases,
because
I
mean
really
people
using
gitlab.
What
do
they
really
want
to
do?
They
want
to
make
money
from
their
software
companies.
B
Usually
that's
actually
their
true
goal
right.
They
want
their
software
company
to
be
a
success,
and
you
probably
can't
measure
that
directly
trying
to
think
if
you
could
zora
was
very
different
because
in
zora
well,
zoro
was
in
the
enviable
position
for
churn.
This
is
kind
of
going
off
the
talk
track
here,
but
zora
our
customers
are
their
customers.
B
Now
they
want
to
make
money
and
in
zoro
we
actually
measured
how
much
money
they
made
because
it
was
running
through
our
system,
so
we
actually
knew
how
much
money
they
made
and
how
much
they
paid
us
to
make
that
money.
So
at
zora
we
had
a
very
great
insight
into
who
was
likely
to
churn,
although
you
still
couldn't
stop
it
always
because
if
they
weren't
making
money
on
zora,
we
didn't
have
a
magic
wand
that
would
have
them
make
more
money
either.
B
So,
but
so
usually
the
reason
I
don't
recommend
ai
for
figuring
out
the
best
metrics
is
it's
usually
well.
First
of
all,
it's
can
train
constrained
by
what
you're
tracking
so
you're
going
to
start
by.
B
...looking at all the events you're tracking, and then it's just common sense: the most valuable events for users of the system. I mean, you might get some surprises when you actually do this kind of analysis; there are always a few surprises. But you usually just start with the things that you know, and this quantifies it and will show you those weird things, like: oh, actually, how disengaging are the disengaging things? All right, anyway, let me keep getting through the stuff in the deck; there are two more case studies. Klipfolio is a BI tool, basically for making dashboards, and so the events in their system are things like viewing and editing dashboards. We also look at metrics for account tenure, or age, and active users per month, and seat utilization is another good one. So this is the time-based cohorts for Klipfolio, which is, I mean, like I said before, the sort of normal cohorts are based on the time that you sign up, but this is portraying it in a way similar to the previous chart.
B
So
this
shows
actually
a
pretty
standard
thing
for
a
sas
company.
I
think
that
was
one
of
the
questions
in
the
document
was
about
the
time
profile
and
it's
actually.
This
is
pretty
common
for
the
first
year
that
well
churn
does
peak
at
six
months,
and
then
it
goes
down
in
the
third
quarter
and
then
comes
back
up
again
at
one
year.
I
don't
know
if
you,
if
you
guys,
have
annual
contracts.
B
You
might
just
see
a
lot
of
churn
at
the
end
of
the
first
year,
but
there's
almost
always
some
kind
of
churn
early
in
the
customer,
a
peak
in
churn
early
in
the
customer
life
cycle,
where
people
who
you
know
it
wasn't
a
good
fit
for
just
kind
of
dropping
out
and
for
clipfolio
they
actually
had
a
variety
of
contracts.
I
think
monthly,
quarterly
and
annual.
B
So
you
can
actually
see
that
there's
kind
of
a
little
dynamic
in
the
first
year,
but
then
after
the
first
year
it
basically
starts
going
down
long
term,
and
that
would
also
be
the
healthy
that
that's
what
you
would
expect
to
see.
Also,
I'm
sure
for
gitlab
that
once
people
make
it
past
their
first
year,
they
must
get
more
and
more
invested
and
churn
should
go
down
over
time.
B
Let's
see
here's
another
one
which
is
active
users
and
churn
and
now
note
the
shape
of
this
curve
is
really
looks
a
lot
like
the
one
for
the
good.
The
good
reviews
right
that
the
curves
all
kind
of
look
like
this,
that
the
lowest
cohort
has
the
highest
churn,
there's
a
steep
drop-off
in
the
first
couple
of
deciles.
B
It
means
you've
got
some
guy
in
a
company
making
dashboards
and
no
one's
looking
at
them
right
that
dashboard
creator
needs
to
get
a
few
users
if
he
wants
the
company
to
be
more
invested
so
also
even
better
than
the
active
users
is
actually
seat
utilization,
because
the
active
users
is
also
correlated
with
the
size
of
the
customer.
So
you
could
have
you
know
a
company
that
signs
up
for
100
seats,
but
only
you
know
20
are
being
used
and
they
would.
B
You
would
think
that
20
seats
would
be
good,
but
if
you
go
over
to
this
chart-
and
you
see-
oh
wow,
only
20
of
seats
are
being
used.
That's
really
bad
so
again,
like
the
ratio
of
the
number
of
active
users
to
the
total
possible
number
based
on
the
number
of
seats
sold,
is
a
really
good
metric,
because
that
again,
that
shows,
if
they're
getting
a
good
value,
are
they
really
taking
advantage.
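Seat utilization as described, active users divided by seats sold, is another ratio that corrects a raw count for customer size; a sketch with the illustrative numbers from the talk:

```python
def seat_utilization(active_users, seats_sold):
    """Fraction of purchased seats actually in use. The raw active-user
    count is confounded by customer size; this ratio shows whether the
    customer is taking advantage of what they bought."""
    return active_users / seats_sold if seats_sold else 0.0

# 20 active users looks fine in isolation, but 20 of 100 seats is a weak
# 20% utilization, while the same 20 users on a 25-seat plan is healthy.
big_account = seat_utilization(20, 100)   # 0.2
small_account = seat_utilization(20, 25)  # 0.8
```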
B
Let's
see
clip
folios
top
churn
fighting
moves,
they
also
said
no
discounting.
They
reconfigured
pricing
and
packaging
clipbolio.
I
know
they
even
like
cut
off
their
lowest
tier,
they
reconfigured
their
pricing
and
packaging
into
tiers,
and
then
they
analyzed
their
tiers
and
found
out
that
they
were
losing
money
on
their
bottom
tier.
So
you
might
even
do
something
like
that,
but
still
also
for
them.
More
active
users
is
key
and
also
they
found
high
churn
risk
early
on,
so
they
gave
three
months
free
support.
B
It's kind of interesting, because telcos were the first companies that really embraced churn and churn fighting as a thing, and so they have events that are calls, basically, local and international, and you can slice and dice them in various ways. So they make metrics out of how many calls people make, of course, although we don't just look at the number of calls; we look at the total duration of calls. MRR: you guys know what that is, right, because you're in the SaaS industry?
B
Sometimes
I
have
to
explain
that
all
right.
So
let's
start
looking
at
these,
so
this
is
again
the
really
common
curve
to
see
that
those
customers
who
are
practically
not
using
the
product
over
here
are
very
much
at
risk
of
churn.
In
fact,
some
of
them
might
have
already
decided
to
churn
and
that's
why
their
usage
is
so
low,
but
then
just
getting
a
little
bit
more
usage
makes
a
big
reduction
in
churn
until
you
get
to
this
the
elbow
of
the
curve
around
2500
and
for
also
for
customer
success.
B
These
types
of
charts
are
really
great
for
just
picking
out
what's
a
healthy
level
of
a
metric,
because
that
question
comes
up
a
lot
of
times
like
what
should
be
the
target
behavior
of
the
user
and
when
you've
got
a
curve
like
this,
with
a
sharp
elbow.
It's
really
easy
to
say:
oh
look,
500
minutes
per
month.
You
know
that's
the
target
and
beyond
that,
look
it's
not
really
doing
you
anything
according
to
this
one
metric,
so
charts
like
this
are
great
for
targeting
healthy
levels.
B
Now
what
about
cost?
This
is
the
the
result
for
mrr.
So
on
the
x-axis,
this
is
the
mrr,
but
I'm
not
showing
you
the
actual
amounts.
This
is
normalized
so
that
zero
is
the
average
and
then
one
is
one
standard
deviation
above
average
and
minus
one
is
minus
one
standard
deviation
below
average,
and
that's
because,
in
addition
to
that,
I
can't
tell
you
about
versatur's
churn
rate.
I
also
can't
tell
you
about
their
pricing.
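The normalization being described (zero is the average, plus or minus one is one standard deviation) is a standard z-score; a minimal sketch using only the standard library, with made-up values:

```python
from statistics import mean, pstdev

def z_scores(values):
    """Rescale a metric so 0 is the average and 1 is one standard
    deviation above it. This shows each customer's relative position
    (as in the MRR chart) without revealing actual dollar amounts."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

mrr = [100, 200, 300, 400, 500]  # illustrative monthly recurring revenue
normalized = z_scores(mrr)
```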
B
If
you
want
to
know
about
versatur's
pricing,
you
know
you
should
call
one
of
their
reps
they'll
be
happy
to
tell
you,
but
so
this
is
yeah,
basically
anonymized
data.
But
what
is
it
saying?
It
actually
says
that
people
who
pay
more
churn
less
so
there's
the
next
trick.
Question
of
the
day,
why
do
people
who
pay
more
turn
less?
Isn't
that
weird.
A

B
Yeah, customers who are on more expensive plans are bigger, so they're less likely to go bankrupt, for one thing; with small companies, you could always lose your customer to business failure, and a big customer is not going to go out of business. Also, they invest more to get set up, and once they're set up, they're just going to use it more, so whatever value they're getting out of it, they're going to get more of it. So that all makes sense, but it's not very satisfying to just say that, because what does that mean?
B
Oh
raise
the
prices
and
people
will
churn
less?
No,
no
don't!
This
is
when
you
look
at
a
better
metric
which
for
versature
is
the
dollars
spent
per
call
that
they
make,
and
here
you
can
actually
see
that
spin.
This
is
also
normalized.
This
does
not
mean
like
point
five
or
one
dollar
per
call.
This
is
the
average
and
one
standard
deviation
above
average.
So
clearly,
if
you're
one
standard
deviation
above
average
in
cost
per
call,
that's
really
bad
for
your
churn.
B
If
you're
below
average,
it
doesn't
matter
so
much
like
there's
all
these
people
here
who
are
well
below
average
in
the
cost
per
call
they're
good
and
it
doesn't
actually
make
much
difference
or
you
know
for
them,
but
so
paying
a
lot
in
its
just
in
absolute
terms,
does
not
cause
churn
they're,
bigger
customers,
but
paying
too
much
for
what
you're
using
definitely
causes
churn
and
the
metrics
that
I
always
tell
people
to
use
is
take.
B
...which, I guess, would just be pennies for most companies. But if the average cost per commit per month for a customer is, like, one cent, I guarantee you you'll find some customer who's got, like, one dollar, because they're paying too much and they're not using the product, and they end up paying a dollar per commit. You just think, like: oh my god, a dollar per commit.
B
That's really crazy, you know, but you'll probably find some people like that, and those are the ones who are really at risk, from paying too much and using too little. We saw that at Zuora: at Zuora we looked at, basically, the price per invoice; that was one of our top churn metrics at Zuora, and I can't tell you, of course, what anyone's prices per invoice are.
B
But of course, when you look at the Zuora data, there's a healthy range of the cost per invoice, and then there are just some crazy outliers: people who got sold a whole bunch of stuff and never got it implemented, and those people are obviously very much at risk. So these are really good metrics for picking out people who aren't getting good value from the product. All right, what else did Versature say? Identify customers early; focus on customer success.
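The unit-cost metric running through these examples (dollars per call at Versature, price per invoice at Zuora, the hypothetical cost per commit for GitLab) is spend divided by usage; a sketch with made-up numbers:

```python
def cost_per_use(monthly_spend, monthly_uses):
    """What the customer pays per unit of value received (per call, per
    invoice, per commit). High absolute spend is fine; a high spend per
    use flags customers paying too much for what they actually use."""
    return monthly_spend / monthly_uses if monthly_uses else float("inf")

# A large customer paying a lot but using heavily is healthy; a modest
# payer who barely uses the product is the outlier at churn risk.
heavy_user = cost_per_use(5000.0, 500000)  # $0.01 per commit
light_user = cost_per_use(100.0, 100)      # $1.00 per commit
```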
B
Again, they use their metrics for conversations. The metrics are not just good for detection; they're good for giving you a representation of where the customer is at. So when you get on the phone with them, you have some good conversation starters, and you have a good sense of what their health is. All right, this was what I said post-covid.
B
Let's see, all right. We also talked about pausing; anyway, that's not so interesting for now. I think we're kind of out of the immediate post-covid period; we're just in the ongoing age of disruption at this point. So these were the top takeaways from the talk. First, that customers are mostly rational, so reducing churn in the long run requires actually delivering something to them, something that's useful. That's back to that point: there are no gimmicks.
B
In the long run, I mean. There are some quick fixes, usually, but they're in the form of bugs, or for very immature companies. A company like GitLab, I'm sure you guys don't have any bad bugs that are going to be quick fixes for churn. It's more like a consumer product: oh, we forgot to link the sign-up, there's a bug in the sign-up button, or something like that.
B
Okay, so it helps if what you're doing is sticky, hard to switch. That's actually probably good news for GitLab, because, oh my god, changing your repo, especially if you're using pipelines. I'm sure pipelines are your stickiest feature, because they're hard as hell to migrate, speaking as an outside observer. Also, a good thing to note for any SaaS company is that your competition is not just your competitors, because there are lots of do-it-yourself solutions out there.
B
I
know
for
zora
yeah,
our
biggest
competitor
was
the
it
department
who
would
be
like
no
we'll
do
billing.
You
know,
don't
worry
and
for
get
lab.
You've
probably
got
a
lot
of
people
who
are
like,
oh
no,
we
can
make
our
own
pipeline,
you
know
what
do
we
need
this
pipeline
tool
for
we
can
we
can
make
a
pipeline
in-house.
You
know
and
that's
your
competition,
you
know
okay
main
church,
reducing
strategies.
B
We went over this: product improvement, engagement, marketing, customer success, pricing for value, and targeting. Customer metrics are really, for me, the focus of using data, and getting the right metrics in front of the right people, and a shared set of metrics, was really the focus. Smart segmentation.
B
You
can
predict
churn.
Really
anyone
in
the
business
can
predict
churn
in
a
way
just
by
having
a
good
set
of
customer
metrics
and
knowing
how
to
read
them,
and
the
pro
tip
is
that
yeah,
a
lot
of
really
good
metrics
are
ratios.
B
Your metrics always start out from whatever events you're tracking. I don't know what system you guys use to track customer events; there's a lot out there, like Segment or Omniture, or just your own in-house systems. But then, surprisingly, taking the raw metrics, which are counts and sums, and dividing them to form ratios is a really good strategy, because they're really intuitive too. Ratios and rates are very intuitive measures.
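A minimal sketch of turning raw counts and sums into ratio metrics; the accounts, numbers, and field names are hypothetical, echoing the cost-per-commit example above:

```python
# Hypothetical raw event rollups per account for one month;
# names like "commits" and "mrr" are placeholders, not GitLab's schema.
raw = {
    "acme":   {"mrr": 500.0, "commits": 20000, "active_users": 40},
    "globex": {"mrr": 400.0, "commits": 400,   "active_users": 10},
}

# Divide raw counts and sums into ratios: these are the intuitive metrics.
ratios = {
    name: {
        "cost_per_commit": m["mrr"] / m["commits"],
        "commits_per_user": m["commits"] / m["active_users"],
    }
    for name, m in raw.items()
}
print(ratios["acme"]["cost_per_commit"])    # 0.025 -- pennies, healthy
print(ratios["globex"]["cost_per_commit"])  # 1.0   -- a dollar per commit
```

The two accounts pay similar absolute amounts; only the ratio separates the healthy one from the at-risk one.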
A
Yeah, it's perfect. You answered a lot of questions, but it inspired many more questions, so I will get right to it. I tweaked the order of the questions as well, so everyone gets to ask. So the first question from me: when you talk about the major cohort analysis, you recommend we start from just the known features that are critical to the customers and do this analysis, and we're assuming at GitLab we already did some initial analysis.
B
Because, you know, having good metrics also isn't going to reduce churn by itself. I mean, it's important. If you have some customer metrics already and you're pretty sure you know what is good and what is bad, then the question is, yeah, it's definitely more important to think about how you can actually move the needle in getting people to adopt the good features that tend to be most valuable.
B
Yeah, it's definitely more important to put basic knowledge into practice, and then come up with more advanced metrics; well, maybe at the same time. But definitely, I mean, there's a most basic level of doing this. I think there was a question about this, like what's a basic level versus an advanced level.
A
So, GitLab is a very complex product; I'm sure a lot of our customer success team members can relate. They are mapping out different use cases, and for each use case there could be 10 features. Maybe we can reduce that, but still we likely will have, I don't know, 10 key features we want customers to use, and some customers use some, some customers use others. So in that situation, do we monitor all of them? Do we segment five of them? Ten?
B
Trying
to
do
50
or
100
would
be
crazy,
and
you
know
there
is
and
the
question
of
how
to
know
which
ones
to
start
on
is
also.
That's
tough
and
you
know
a
lot
of
this.
You
need
to
just
use
your
business
knowledge
and
your
other
sources
of
information.
You
know
just
your
gut
on.
What's
the
most
important
I
mean
from
those
charts,
you
can
sometimes
kind
of
see
which
ones
make
the
most
import
most
impact.
B
I didn't actually show you that here, because remember, I didn't show you any of the churn rates. But for a typical company, you will see which metrics have the biggest impact in reducing churn, or rather, the biggest correlation with reduced churn. But the thing to also think about is cause and effect, because none of the data that I showed you shows cause and effect, and I don't recommend using really advanced statistics to come up with the causes and effects.
B
You
really
just
have
to
think
about
the
business,
and
you
know
what
what's
the
use
case
for
the
customer:
where
do
they
actually
get
value.
H
Yeah, thanks, Carl. You kind of started to touch on this with the answer to the previous question, but I'm just wondering your opinion: what do you see as the crawl-walk-run phases for leveraging data to fight churn? So it sounds like it starts with having the metrics, and then what comes after that?
B
Well,
there's
a
lot
actually
yeah
in
the
last
chapter
of
the
book.
I
actually
go
through
the
whole
kind
of
like
a
crawl
walk
run
framework
without
calling
it
that
exactly,
but
definitely
having
customer
metrics
is
like
the
most
basic
foundation,
and
if
you
don't
have
those
it's
kind
of
hard
to
do
much.
The
next
level
is
putting
those
metrics,
together
with
the
churn
data
to
show
well
like
in
those
charts.
B
We
looked
at
you
know,
what's
the
healthy
level
for
each
metric,
and
maybe
you
can
tell
which
ones
are
most
important
just
by
looking
at
you
know
the
shape
of
those
curves,
because
you
know
sometimes
you
really
get
surprises.
You
might
have
something
where
you're
like.
Oh,
I
was
sure
this
would
reduce
churn,
but
it
only
has
like
a
really
weak
zigzaggy
relationship.
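One simple way to put metrics together with churn data and look at the shape of those curves is to compare churn rates across metric cohorts. This sketch uses invented observations and quartile cohorts; a weak or zigzaggy relationship shows up directly in the per-cohort rates:

```python
from statistics import quantiles  # Python 3.8+

# Hypothetical observations: (metric value, churned within the period?)
obs = [(0.1, True), (0.2, True), (0.5, True), (0.9, False),
       (1.1, False), (1.5, False), (2.0, True), (2.5, False),
       (3.0, False), (4.0, False), (5.0, False), (6.0, False)]

# Split customers into metric cohorts (quartiles) and compare churn rates.
values = sorted(v for v, _ in obs)
cuts = quantiles(values, n=4)  # three cut points -> four cohorts

def cohort(v):
    # Number of cut points the value exceeds: 0 (lowest) .. 3 (highest).
    return sum(v > c for c in cuts)

rates = {}
for q in range(4):
    group = [churned for v, churned in obs if cohort(v) == q]
    rates[q] = sum(group) / len(group)
print(rates)
```

Here the lowest-metric cohort churns heavily, the highest not at all, and one churner in the third cohort produces exactly the kind of zigzag mentioned above.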
B
You
know
so
yeah,
so
basically
getting
the
metrics
together
with
churn
data
is
the
next
step
and
then
the
next
step
after
that
is
doing
something
with
that
information
which
is
hard
and
that's
the
part
where
in
the
book
I
I
give
the
least
concrete
advice,
because
the
data
analysis
this
is
it's
actually
really
remarkable.
The
data
analysis
is
really
standard
across
all
these
different
companies
that
I
worked
with,
but
the
plan
of
action
after
the
data
analysis
is
never
standard
because
it
depends
on
what's
your
product.
What's
the
technology
stack?
B
Unfortunately, I have the least specific advice there. But anyway, taking action is definitely the next level, and then the most advanced level would be doing it all with more advanced analytics. Like, if you've got it down that you can measure your customers, figure out what's related to churn, and take some action, then you might want to say, hey, if we used an AI model, would that improve our targeting?
A
Them with you, Dave; you have the next one.
F
Great, thanks so much, Carl. So you've already kind of partially answered this, but we've noticed that, you know, between stage adoption data, different user roles doing certain things, as well as member invites, these are all deeply connected, and I was just wondering: is there like a crawl...?
B
Actually, well, speaking of multicollinearity, if you mean actions that people take together, there's kind of a whole chapter in the book devoted to it, where I recommend one approach to grouping together related actions. For example, if it were a document-editing website, you have, you know, editing and saving and copying and pasting. Guess what: copying and pasting are highly correlated with editing documents. So I do recommend techniques to group those metrics together and make an overall score for each product area of correlated actions.
A
Also, when someone invited him, right, he will begin to use those features. So for each single one we observe a correlation with retention, but they are actually also connected at a deeper level.
B
Yeah,
it
gets
really
hard.
I
mean
to
get
really
so
much
in
detail.
It
depends
on
how
many
customers
you
have
too
and
as
a
sas
enterprise
sas
company,
you
guys
are
probably
gonna.
Have
you
know
challenges
in
the
size
of
your
data?
I
don't
know
how
many
customers
gitlab
has,
but
you
know
with
zora.
You
know
they
only
have
about
a
thousand
or
so
customers.
So
it's
not
like
a
consumer
business.
Where
you
can
you
have
so
many
more
so
many
more
data
points
to
look
at
complicated
correlations,
so
it's
pretty
hard.
B
It's also hard because churn happens at the account level, but there are interesting things going on at the user level, and usually you end up just having to make an account-level metric for things going on at the user level, like as an aggregate. You can, in theory, make a model of user churn, but again you need a lot of users and a lot of data to look at individual users, and then user churn just means that they stop using it.
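Rolling user-level activity up into account-level metrics, as described above, might look like this; the events and field names are invented for illustration:

```python
from collections import defaultdict

# Hypothetical user-level events: (account, user, event_count_this_month).
user_activity = [
    ("acme", "u1", 50), ("acme", "u2", 0), ("acme", "u3", 10),
    ("globex", "u4", 0), ("globex", "u5", 0),
]

# Roll user-level behavior up to the account level, where churn happens.
by_account = defaultdict(list)
for account, _user, events in user_activity:
    by_account[account].append(events)

account_metrics = {
    acct: {
        "total_events": sum(counts),
        "active_user_pct": sum(c > 0 for c in counts) / len(counts),
    }
    for acct, counts in by_account.items()
}
```

Aggregates like the share of active users capture user-level behavior in a form that can be correlated with account-level churn.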
A
For sure. Let's move on to the next question. Sam?
C
I'll go quick here. Thanks for taking the time, Carl. I'm curious: of the data that Zuora customers give to Zuora, were there any trends that correlated to churn, like the named account owner in Zuora changing?
B
We
talked
about
the
relationship
of
like
account
politics
versus
data,
driven
things
and
people
always
I
mean
we
all
have
observed
yeah
the
account
owner
changes
and
then
they
churn
I,
but
we
never
had
enough
data
to
really
strengthen
that,
and
personally
I
always
took
of
a
utility
driven
review
of
those,
because
the
thing
is
oftentimes.
The
change
in
the
person
is
just
a
trigger.
B
I
mean,
if
you
look
at
it
from
the
point
of
view
of
utility,
if
they
were
getting
a
great
deal,
why
would
they
churn
anyway?
You
know
it
might
just
be
that
you
were.
They
were
only
staying,
because
the
old
manager
had
a
some
vested
interest
or
you
know
they
they
knew
it
well,
and
maybe
the
new
person
was
just
the
first
one
to
look
at
the
data
and
be
like.
Oh
my
god,
you
know
how
much
we're
paying
for
this.
You
know
and
then
it's
like
well.
B
B
A
Cool
rebecca,
you
have
a
next
question.
I
Yes, thank you. This is actually really exciting; thanks for your time. So my first question is: you mentioned that it's also common for people to just ignore the smallest tail, which you deem unsavable or just not worth it; there's one or two licenses, it's very small. But also, our largest clients are the ones where we're seeing the least amount of churn. So where are you going to determine your focus if, in my case, I'm running ls, so I'm finding a one-to-many approach?
I
How can we save the tail end? If it's an easy fix, if it's a new button, if it's a wording, if it's onboarding emails, etc.
B
It
could
be,
I
mean
definitely
focusing
on
onboarding
is
what
would
help
those
new
customers?
I
mean.
That's
the
thing
yeah
yeah,
the
thing
is
yeah.
Definitely
new
customers
is
separate
from
people
who
have
been
on
for
a
while
and
aren't
using
it.
I
mean
definitely
for
new
customers.
You
just
want
to
focus
on
onboarding
and
getting
them
live
and
getting
them
to
value.
B
You
know
everything
that
everyone
says
yeah
and
the
statement
about
lost
causes
is
more
for
people
who,
at
their
first
at
their
renewal,
if
you're
still
seeing
really
low
activity
like
they
haven't
adopted
it
I
mean
I'm
not
saying
you
don't
want
to.
I
mean,
as
I'm
sure,
because
you
guys
are
enterprise
sas.
Also
you
have
account
managers.
So
a
lot
of
this
advice
is
for
people
who,
don't
even
like,
have
an
account
manager,
for
you
know
the
client.
B
Well, I mean, wherever you think you could get some traction. Usually people find it's in the mid-range: people who are kind of using it but not fully healthy, not the people who aren't even using it. But of course, their contract value has to play into this too. I mean, if you see a really big account and they're not using it, for god's sake, get on the phone.