B
So this conversation will be broken out generally into three pieces. One of them will be exploring metrics specifically, then we'll be talking about dashboarding and reporting in general, and the last bit will be specific queries about how the data team can improve in its delivery of value to the organization as a whole.
B
I'm starting with the first piece, exploring metrics, Sid. We've been having this conversation with various stakeholders, so this is both a representation of you personally and of the organization as a whole. Can you walk us through what numbers you look at on a daily basis?
A
Most of them, cadence-wise, I think, are on a monthly basis. Every month we have our metrics call, and in every metrics call we should look at all the KPIs for that department and at how the OKRs are progressing.
A
Many times OKRs can be different things, but every OKR should be measurable, and OKRs are the change. So maybe I'll show you a paragraph from the bottom of the OKR page: "The OKRs are what initiatives we're focusing on in this quarter, our most important work. Things that happen every quarter are measured with key performance indicators." This page needs an update, and I'll write it right now so we have it updated.
A
So I think we're not going to have an OKR that says "close deals" or "ship software," because that's what we do; OKRs are what's different. But frequently an OKR will be to improve something, and a lot of the time either the goal will be to improve a KPI, or it will be a secondary effect of that change. So some OKRs should relate to changes in KPIs.
A
It says it all on the same page, so these links actually made it worse, because now it's not possible for me to tell anymore which KPIs actually have an active definition. Also, the links go to one metrics page, and that's not how GitLab works. At GitLab we have functional areas of people, and, as you know, the data we collect cuts across different departments, so I think the departments should be responsible for defining what a metric means.
A
It's their thing; they should define what they think, for example, IACV is, and marketing should define what the marketing efficiency ratio is. I think the data team should be consulted on that, but it should be something that's front and center for the marketing people, so when they look at their marketing pages, they see those definitions. The page shouldn't say "definitions"; the page should just talk about how to do their roles, but explain the concepts involved.
A
So I think we have a long way to go there, and a great first step would be to remove all the links that just link to a to-do, and then start working with the teams to get that definition (it's not the definition, it's a concept) into their handbook, and then link to those.
B
Okay, excellent. Thank you very much; that's extremely insightful. So, shifting away from metrics very briefly and moving very specifically into reporting and dashboarding: the previous BI tool, how frequently would you say you accessed it?
A
Yeah, a couple of times a week. I thought the graphs looked really nice, but I was frustrated by my inability to quickly access something; it's one of the reasons for making the KPI page. The navigation in the tool was very hard for me. I'd much rather have one overview, like "these are all the metrics for an executive that the team has," so I can quickly find something on that page and then go to the relevant graph.
A
As an aside, the handbook is how we expose all the information at GitLab, so instead of building big composed dashboards or adding a lot of navigation to any BI tool, at least make sure that it's also navigable from the outside. Another problem was that some of the things were out of date without you really knowing; I'm not sure that was true, but a few times we made something where we just added data to an Excel sheet and made a graph out of that.
A
That becomes stale really, really fast. It was the best solution we had at the time, but I think all of those things got out of date, so it was a wasted effort. So either we get the data from the system of record and make sure it's always up to date, or we just say, "okay, this is apparently not the most important thing to get right now; we'll get it on an as-needed basis."
B
Okay. Now that we're in the transition and the previous tool is not currently available, would you say your primary access to reporting and metrics is the metrics calls? And how do you feel about the metrics calls generally, as a structure and with regard to tooling?
A
They're worthwhile. It took some time to get everyone to use the Google Doc as a way to ask questions or make suggestions, and I think every metrics call should start with "hey, what were the to-dos from last time, and how far are we in addressing them?" On a lot of calls the stuff from last time just wasn't addressed, and that makes it frustrating, because then you're like, "okay, well, they didn't even do what we agreed on last time; they don't even refer back to it."
B
But when you need access to a specific metric or a specific report, assuming the data team suddenly blinked out of existence, who would be your first point of contact?
A
That's hard to answer, so I'll answer something different. I've asked a few times to make sure that everything that's in our metrics sheet, the sheet that we share with investors, has a link to a definition. Yesterday I changed multiple terms on that metrics sheet because I think they were not clear; for example, somewhere the report referred to "employees" when it meant to refer to team members. I made a couple of other changes; you can probably find them in the Google history. But still, we don't have links to definitions.
A
For example, we said "GitHub Enterprise" where we meant GitLab, and we say "GitLab CSAT," where CSAT stands for customer satisfaction. At first I thought, oh, it's amazing that we have that, because investors asked for it and it's a stat we were really missing. Then I realized that's not satisfaction with the product; that's satisfaction with support. That's how happy people are with our GitLab support people, not with the product. So we've been sending that to investors for years now.
A
I don't know; I'm not close enough to see that. But it's always a hard question: do you have one data team, or do you have multiple data teams, one for every function? We have one data team, but the pitfall of that is that the natural response of the data team is to cater to the needs of finance. I think that's happening to a large extent: a lot of the data team is focused on financial metrics and financial reporting, and not on the needs of the rest of the business.
A
Now, on one hand that makes sense: the financial metrics have to get to a certain quality level first, because that's stuff we put out to reporters, and if you could choose one department at your company that had accurate metrics, it would be finance. So we might be getting the prioritization exactly right as it is. But it is a common pitfall.
A
So I'm not sure we're doing it wrong, but I want to point out that we have to pay a lot of attention that this data team serves the needs of the company and all the executives, not just the finance department and the finance executives. It's something I'll be watching, and I want everyone to be aware of it. Also, I'm very grateful that finance did such a good job; I know of other startups in YC that had to redo three years of accounting, and that ain't fun.
A
So we're all trying to figure out these things. How many support people do we need? Oh, it's a percentage of revenue. How many technical writers do we need? Is that a percentage of revenue? Is that a percentage of the number of developers we have? Is it mixed? I'm trying to figure this out with the technical writing team at the same time we're making projections.
A
It would be very cool if we could integrate all these things: if we could make sure that our OKRs are changes in KPIs, and that our KPIs have certain gearing ratios that feed a financial model and the projection forward. That makes it a lot easier to start tweaking these things, and it makes for a much easier conversation, like, "hey, sales, we should tweak the number of solution
A
Architects or salespeople": we increased that recently, and I think it's a great thing, but if we tweak that, we can achieve this outcome. Exposing those gearing ratios, I think, makes the whole company much more understandable and also more manageable, because you can point directly to something and then track it: this is what we projected for the KPI, and this is what it's doing; this is what we projected for the gearing ratio, and this is what's happening in reality.
A
You can see what is getting results. Having that whole chain is something I fantasize about, and I think it might be very useful as a way to instrument the company and have great feedback mechanisms in it. Like, suppose technical writing headcount is scaled to revenue, and we get a whole bunch of extra revenue coming in: boom, immediately the headcount for that team goes up, and no one has to do anything or make a yearly plan.
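The gearing-ratio mechanism described above can be sketched as a toy calculation. The team names, driver metrics, and ratios below are invented for illustration; they are not GitLab's actual model.

```python
# Toy sketch of "gearing ratios": headcount targets derived mechanically
# from a driver metric, so a revenue change updates the target with no
# yearly planning step. All ratios here are made-up illustrations.

def headcount_targets(monthly_revenue, developer_count):
    ratios = {
        # team: (driver value, target heads per unit of driver)
        "support": (monthly_revenue, 2.0 / 100_000),       # heads per $100k revenue
        "technical_writing": (developer_count, 1.0 / 20),  # heads per 20 developers
    }
    return {team: round(driver * ratio, 1)
            for team, (driver, ratio) in ratios.items()}

# Revenue jumps and the support target moves immediately.
before = headcount_targets(1_000_000, 100)
after = headcount_targets(1_500_000, 100)
```

With these hypothetical ratios, the support target would move from 20 to 30 heads when monthly revenue grows from $1.0M to $1.5M, with no planning step in between.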
A
Yeah; it's a great question. Luckily I have IACV, and the ideal in the future would be that, but without cheating. I would have liked to have more data on how people are using the product. On our homepage we list the ten stages in which we have an offering, and for every stage I'd want to know: what percentage of our users has used it in the last 30 days?
A
I think, like, 30-day active users is a great metric. We can get more granular, but that would already be amazing, because it would allow us to detect when we make something worse, when something needs attention, or when something is not surfaced enough. I think that'd be amazing. Our product managers would be very happy, and we'd be able to direct our engineers to the most worthwhile efforts and measure which things have an impact.
A
Correct; of course, that would be nice as well. But imagine we have this table, and then, either under the stages or after everything, we have a percentage: of all our monthly active users, this is how many people are using ChatOps. Then we can say, "okay, we're going to invest some extra time in ChatOps for two months"; are we then seeing that rise, or is it just a lost cause? That feedback loop would help.
A
Also, we'd see that we're shipping something, we're saying we have it, but nobody uses it. So I'd like to put that actual number on our homepage, because people see this, and the first question they have is, "Is this real? Are people using this?" So we should show it: these parts are real; these parts aren't. That's great; thank you. Thanks for asking for clarification.
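The per-stage usage table Sid imagines could be computed roughly as follows. This is a hypothetical sketch: the event shape, stage names, and function are invented, not an actual GitLab pipeline.

```python
# Hypothetical sketch: the percentage of monthly active users that used
# each product stage in the last 30 days. The (user_id, stage) event
# model and stage names are invented for this example.

from collections import defaultdict

def stage_usage_percentages(events, total_mau):
    """events: iterable of (user_id, stage) tuples from the last 30 days."""
    users_by_stage = defaultdict(set)
    for user_id, stage in events:
        users_by_stage[stage].add(user_id)  # count each user once per stage
    return {stage: round(100 * len(users) / total_mau, 1)
            for stage, users in users_by_stage.items()}

events = [(1, "verify"), (2, "verify"), (1, "chatops"), (3, "plan")]
usage = stage_usage_percentages(events, total_mau=4)
```

Rendered next to each stage on the homepage, a number like `usage["chatops"]` would answer the visitor's "are people actually using this?" question directly.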
B
Thank you very much; that is all very useful and immediately actionable information. I would like to open the floor for any closing remarks or questions; that was the last one I had. And then, if you have any other topics that you'd like to explore personally, Sid, I'd turn it over to you.
C
So, speaking for myself and on behalf of everybody: I think it's going to be an interesting human challenge. It's a lot of conversation to drive those definitions. Historically, I think, we've leaned a little bit on the data team for managing and driving those definitions, so there might be a little bit of a lift in getting used to that.
C
We want to make sure that we don't end up in a position where we're sort of maintaining all of that across the handbook, and, hearing what you have to say, the goal is very much to drive the opposite, so we won't be in that position. But I want to make sure, to our previous point, that we don't spend all of our time trying to drive all of those definitions; it isn't our only priority, and there are a lot of other things that can be done.
A
I think that, although you should not be responsible for maintaining them (that should be the teams'), we're talking about 50 definitions or something that are missing, and for 80% of them there's an easy fix: we can write something up, even in an hour-long call.
D
To add some additional context to that, though: sometimes we won't know some of the nuances of what goes into calculating a metric until we're in the middle of building out that analysis. A specific example: when we talk about retention by product, what happens if the number of seats goes down, or the number of seats goes up but the product they choose goes down? Is that an account that was retained, or do we just base that on MRR when we break it down?
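The distinction D raises (seat-based versus MRR-based retention) can be made concrete with a toy calculation. The prices, accounts, and the net-MRR-retention formula below are a standard illustration, not the team's actual definition.

```python
# Toy illustration of why the retention definition matters: an account can
# gain seats but move to a cheaper product, so seat retention and MRR
# retention disagree. All figures are made up.

def net_mrr_retention(accounts):
    """Percentage of starting MRR retained at period end, counting only
    accounts that existed at the start of the period."""
    start = sum(a["start_mrr"] for a in accounts)
    end = sum(a["end_mrr"] for a in accounts)
    return round(100 * end / start, 1)

accounts = [
    # 10 seats on a $99 tier -> 15 seats on a $19 tier:
    # seats went up, MRR went down.
    {"start_mrr": 10 * 99, "end_mrr": 15 * 19},
    # unchanged account
    {"start_mrr": 50 * 19, "end_mrr": 50 * 19},
]
retention = net_mrr_retention(accounts)
```

On an MRR basis the first account drags retention well below 100% even though its seat count grew, which is exactly the ambiguity D describes.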
A
That makes sense; those cases make sense, but I think we should iterate and at least get as much done as we can. So, when I look at my CEO metrics, I have one metric that's called "uptime GitLab.com." I'll share my screen for a second. This is like the top 16 metrics in the company, a very important thing; I click that, and I do not have a link to the uptime.
A
So that's not a very complex process to figure out, and I think we can make great strides with relatively little effort. We don't need a hundred percent perfection; even if a definition is off a bit, it's already really good. If we don't confuse investors by telling them our product satisfaction is equal to the satisfaction with support tickets, that's already a really big win. So I'd go much more incremental: let's get the easiest definitions in there first. I appreciate it.