From YouTube: Plan stage weekly meeting - 2020-04-29
A
The first week is primarily going to be onboarding. We were going to do a fast-food-type onboarding, where they were going to be on site. Of course, that's not happening now, but I think most of the first week will still be the same type of approach: they'll be doing a rapid onboarding, primarily with the other interns.

A couple of other notes: he is in South Korea, which I think is UTC+9. His mentor is going to be Florea; we've been working to put together a bit of a plan, and she's pretty close to his time zone, which is great. John asked if there's anything we could do to help with onboarding. I'm still working on putting together a formal onboarding process for him; I'm sure there will be some stuff, and I'll make sure to reach out to you all with anything that you can help out with, yeah.
A
I asked Excel to demo this really last minute; I was just going to do it myself, so we'll see how it goes. This is Excel's big feature that he worked on, and he did an excellent job, especially considering it ended up being a lot more complex than originally thought: it involved him having to do some back-end work and put it through database review and back-end review. Also, a big thanks to Yura and Yan, who helped out with the back-end side of this feature.
C
So we basically have the issues list here that we all know. In this case, we have this epic filter to filter by epic, and what we did here was two things. First, we included all of the child epics inside of the list, and second, we added this icon that basically says that the issue is inside a child epic of the filter's epic.
C
So
basically
we
can
well
in
this
case,
all
of
this
all
of
the
edges
are
our
children
of
this
epic,
so
so
yeah.
Basically,
what
we
also
did
was
to
make
sure
that
this
was
a
performance
that
is
leading
a
slow
down
the
rest
of
the
queries.
To
actually
add
this
information
here
and
and
yeah,
it
is
basically
it
it
is
really
simple,
but
it
has
like,
like
a
big
war
behind.
D
You know, and again, I want to get some additional feedback on it. The main problem here is searching through nested epics: oftentimes nested epics get created and issues get moved, and folks looking for them don't know that. We're trying to drive clarity on actually finding items in a nested epic tree, while being clear that there are possible results in an epic further down the chain. So I think the first iteration would probably be surfacing, or yeah, prioritizing, the results in the actual parent.
E
Do we have an open issue for doing that? Right, like, I don't know; I've gone back and forth. I've talked to some customers lately, and they just want, like, an advanced view, an advanced filtering search, basically a database query. So yeah, we kind of need to go back and, I think, revisit the filter bar a little bit. That's the question.
D
Right, there we have two deviating paths: we continue with advanced filtering in the search bar the way it is now and try to make it better, or we bite the bullet and figure out some sort of JQL-type thing.
C
Yes, yeah, yeah.
A
Yeah, that's something I noticed also: because we're loading so many epics, and I think for autocomplete we load every epic that they're capable of searching in the autocomplete, it is somewhat slow. I don't think that's something that was caused recently; no, ever since we've had it, it's always been that way. So we do have a few issues to clean up the autocomplete on filtered search, because it's not just a pick set.
D
Yeah, no, this is a huge step. This is a question for Holly, I guess: is the yellow warning the one we go to, or is that more of a red? Actually, I don't know; I think it probably comes down to how severely you track blockers or dependencies as a team, I think.
G
I think Naege actually asked that specific question: what is the appropriate coloring, warning symbols, all that, because he was trying to keep it consistent with other areas. I'm not sure there were a lot of other areas that have been like this, but I think we settled on orange. Again, we can iterate if we're not happy with it.
G
My understanding was that red was for when something actually went wrong: like, if you ran a query and couldn't get a result, it would show up as red, as in something went wrong. Orange was more like, "hey, we need your input to continue, but nothing is technically wrong." We're also looking at the option of having toast messages for certain things too.
A
Cool, so awesome job, Flori, and Philippe on the back end of that. Alright, moving on. So, I just wanted us to have a discussion; I don't know if a decision has been made on this or not, but it did come up. Because we're adding the concept of sprints, and they were originally called sprints, but then we had some conversations around whether it's "iterations", and I linked to a little bit of that discussion in Slack there. Do we have a decision? Are we calling them sprints or iterations?
E
So sprints are a subset of an iteration, basically, in lingo speak, but an iteration is more inclusive of broader methodologies. Like, I did extreme programming before coming to GitLab, and we called them iterations, not sprints. And also, just as a practical thing: you can't sprint indefinitely. The definition of sprinting is going as fast as you can, and I don't think that's sustainable, so I'd rather have something more sustainable as a construct in our product, which "iteration" is. So right, yeah, they did not.
D
Didn't we get to the... yeah, I think it's the theme. With our value structure and a lot of the language that we use, I think we need to be cognizant of the common issue that comes up where, you know, we're going to have to say "manage your sprints with GitLab's new iteration feature." Much like a lot of customers hear "issues" and automatically think that means bugs; no, it's tickets, stories, whatever you want. I think it's just...
D
When we get to the documentation and the marketing side of things, it's the same song and dance we do with a lot of our functionality: to clarify that you can do many, many things with this feature, which is why it has a name that doesn't map one-to-one with every framework out there. Yes, yeah, so I think that's what I'd support as we talk about the feature to people internally.
A
Alright, my last point, around workflow: I just wanted to give a heads up that we merged an MR that I submitted around not weighing bugs, which we're going to try out. Just a little bit of context: the primary reason for it was that the majority of the time we spend fixing bugs goes into the investigation, and you can't really accurately weigh a bug until that investigation is done.
A
So it's essentially hard to weigh bugs, and we're going to try just not weighing them for now. Also, in addition to that, we want to start focusing velocity on being the customer value that we produce. Bug fixes don't really qualify, especially regressions, because we should have produced that customer value before the bug, if that makes sense. So we're going to try not weighing bugs; just a heads up that that's in there.
D
What if, at the end of resolving a bug, we get a comment or a short discussion: "the investigation was difficult for these reasons," or maybe key learnings from it? Just something to quantify how difficult it was, and maybe then use it as an opportunity to, I don't know, learn something. Just something to give us: "oh okay, hey, we were a little lower on value delivery, but look, we had these bugs, they were difficult for this reason, and we learned this." Just a thought.
G
Should our team be tracking velocity as the only metric, or should we also be tracking something like earned value, which is work performed? The point is: we need to understand how much throughput the team has, and by taking bugs and not weighing them, we effectively lose the ability to know what the team's throughput was for the last month. Because going forward, I don't want to plan and say: okay...
G
...this is how much velocity we had, so I'm going to plan for that next month and assume we have the same number of bugs. But if the number of bugs goes up or down, now we're planning against an artificial number. We need to either plan against throughput and include a margin for bugs if we're not weighing them, or weigh everything and then plan against throughput.
E
That's where the use of velocity is, basically: if our velocity is going down over time, we're either losing people on the team, or we're not delivering customer value, we're delivering more defects. So it's a signal that says, basically, "here's how much customer value we can deliver," right? The things, the features, the stuff that we want to build. And if that's going down over time, we can basically, like...
G
Velocity has to do with shipping; earned value has to do with the quantity of work performed over the amount of time it took to do the work, so you earn value by working to a specific cadence, right? So if you say, "I'm going to ship 40 units' worth of work in 40 hours" and you ship 20, your earned value is going to show that you only shipped 20 against the hours you worked, which is a problem.
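The 40-units-in-40-hours arithmetic above can be sketched as a small calculation. This is a hypothetical illustration of the earned-value idea being discussed; the function name and the one-unit-per-hour plan are assumptions, not anything the team actually uses:

```python
# Hypothetical sketch: compare work actually shipped against what was
# planned for the hours spent, as in the earned-value discussion above.

def earned_value_ratio(units_shipped: float, hours_spent: float,
                       planned_units_per_hour: float = 1.0) -> float:
    """Fraction of the planned value that was actually earned."""
    planned_units = hours_spent * planned_units_per_hour
    return units_shipped / planned_units

# The example from the meeting: 40 units planned in 40 hours, only 20 shipped.
print(earned_value_ratio(units_shipped=20, hours_spent=40))  # 0.5
```

A ratio below 1.0 is the trigger for the efficiency conversation described next: it doesn't say why only half the planned value shipped, only that the plan and the outcome diverged.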
G
Maybe it's a performance issue, or an estimation issue, or a quality issue; there are all sorts of reasons for it, but it triggers the discussion of: are we being efficient? So it's a measure of efficiency, yeah. And if you measure it as customer value, that totally makes sense. I think we should be tracking velocity, but I would love to track velocity against earned value and understand, for every amount of time we spend on something, how much customer value we achieve.
G
That's the ultimate way of knowing whether we're being efficient as a team, because then we can say: "hey, look, we delivered all this velocity and this is how much time we spent, and it was a really good number," meaning we were very efficient in delivering velocity. As opposed to: "oh, we spent a lot of time and delivered a little bit of value to the customer. What can we do differently in the planning phase? Are we not breaking down issues? Are we prioritizing the wrong things?" That triggers the discussion to ensure that we're operating smoothly.
E
Yeah, I guess I've always had the same discussions with just tracking velocity and not recording bugs, because you'd really get the same byproduct: let's have conversations about what's working and what's not. But I don't have a strong opinion; I'm up for experimenting with whatever. So, let's.
H
Thank you. So tomorrow is the 12.10 celebration. I put this in Slack, and as of this morning, including Cormac (sorry I forgot you, Cormac), everyone should have a calendar invite now. If you don't, I'm forever sorry; I will forward it to you. I thought we could maybe draw something or play another Jackbox game, like we did during the virtual Contribute team day about a month ago. So there are two slots: I'm going to try to wake up very early, my time, for one, or stay up late for the other one.