From YouTube: 2023-06-14 Product Analytics UX/PM sync
A: The product analytics PM/UX weekly sync. So, starting with the sticky items, we're reviewing the latest prototype review results. Kevin, you want to walk through what we've learned so far on the couple of interviews that we've had for the...
B: Program? Yeah, I'm still tagging the sessions in Dovetail, but overall we didn't add so many, like...
B: Red flag on what we have; again, it's internal testing. One of the things that we kind of discovered, I guess yesterday, was potentially the ability to share a dashboard right when you're creating it. So, like, the way we've done our prototype is that we have three steps, where you first select a data source, then you create a visualization, and then you kind of validate all that: you give a name and a description to your dashboard, then you save it.
B: So that's it for the main findings, I would say, so far; that's a bit lightweight. The other thing is we had some feedback on the naming of some of our labels in the visualization designer. One of the confusion points was around "Dimension." So the funny thing is, depending on who you ask, people might see different things in the dropdown.
B: So that's more of, like, copy feedback on whether "Dimensions" is the right word. But again, I wouldn't say it's a trend, because we haven't been able to get a sustainable number of participants that are external. So for now we kind of went with internal participants, so I wouldn't draw too many conclusions from that.
A: Those were my big takeaways as well. Some help text, maybe, describing what things are, and some text, as we display the visualization beforehand, around "this is what it will look like on the dashboard," were the big things to potentially add. But we've only had two so far, so yeah, no trends.
A: Yeah, it's been a good exercise then, testing the discussion guide, like you said, and kind of the overall methodology for us. Cool. I think, Kevin, you had a point in the agenda about us missing the rapid part. If we can't just get external folks, let's look at another recruiting source. I started trawling through stuff in the PNPS respondents yesterday for Q1, and I didn't see anyone mentioning metrics or performance or any other keywords that would suggest they might be interested in our feature set. So I don't think that's a good one.
A: So we might look at the different respondent panel searches that we have to find a couple other external ones. If we haven't found external folks by the end of the week, let's go with that route so that we can get something that's good.
A: Awesome. And I don't think we're ready yet to talk about the next feature prototype; we're still on the dashboard designer, but we're getting close. And then the last sticky item is reviewing the upcoming design items.
B: But yeah, like, we've had a couple of back-and-forths on that. I think we're kind of in a good place; we just need to sort out some of the copy details moving forward.
D: Yeah, I think the design looks really good. I think it's a nice balance between being easily observable by users and not being too flashy or, like, distracting. And yeah, it's something we can iterate on and expand, like, in terms of more detailed error messages linking to specific parts of our application or documentation, to allow users to take action and resolve errors.
A: All right, looking back at last week's action items: obtaining the UX research issue from Nick, that's done, so I don't think there's anything pending. And then I wanted to talk about ongoing user feedback mechanisms as we get this actually launched and into open beta.
A: What are some new and creative, or old and boring, ways that we can get users who are on the page signing up for customer interview sessions, giving that feedback in the feedback issue, or other mechanisms for us collecting feedback on the feature set? So that we have a good roster of folks that we can reach out to, to discuss either prototypes, or rapid validation, or general customer interviews, you know, more for the discovery side of things: what additional problems are you trying to solve that we're not solving for?
B: There are broadcast messages; I've been using one of them in the past to retrieve feedback on the page that we launched with Growth, and it didn't go as expected, to be honest. Like, we didn't get a lot of people engaging, yeah. My hunch is that I don't know if it's the right place, you know, to prompt someone. I know, like, a lot of products...
B: ...do that right away, but when you're trying to get something done, and then you just, you know, go in the space where you do your work, and then there's something, like, popping up? It's like, whatever the message is, most likely you're gonna have this kind of primal reaction of saying "not now" and dismissing it, right? Yeah.
B: I assume, like, that's why we don't have a lot of people engaging with that. So what I'm trying to say, essentially, is that I don't know if UI messaging, whatever the form is, is a good pattern to recruit people. That's...
B: We're developing, like, a painted door type of stuff? Yeah, yeah, we could try it. I don't remember if we ever did that in Growth, actually, because that's definitely some of the things that we're trying to do, but I don't have an example in mind. So I know I didn't design a painted door in GitLab, that's for sure, yeah. All...
B: Gotta be careful with the way we do it, because if it looks like it's a real thing... you know, like, I don't know how we would say a lot, or whatever; we need to dig into it. But it could be quite confusing as well, having a button that you click, and then there's nothing, and you're like, oh. Right, right. So yeah, we just need to think about that if we want to do a painted door type of stuff.
A: As an alternative, too, to our normal procedure of, you know, recruit six people, go talk to them, see if they have the problem, build a prototype for it. It's a safe way for us to continue to go faster and test hypotheses that we have, or test assumptions that we have, that people want these features or could do this work more rapidly.
A: For sure. Justin, we kind of jumped over your point; do you want to vocalize it? Sure.
E: The AI folks are also dealing with feedback mechanisms. It's primarily for the AI results or prompt type of stuff, you know, thumbs up, thumbs down, but they're also thinking about ways this might be a general component that could be put on, like, every page in the footer or something, so you could always provide feedback across GitLab. So it's possible that it could be something that you all might want to look at as well. So I wonder if we can pipe that data into product analytics too.
C: Yeah, I wasn't sure if this was gathering feedback once features have launched, versus... it sounds like it's something that could also be done prior, to, like, be able to get them to sign up for customer interviews or get them into, like, proper prototyping tests. I guess those would still remain the same, but I don't know, like, in terms of actual UI patterns.
C: I know we use a lot of popovers for all that kind of stuff, but mostly for, like, feature highlighting, though, not really for feedback gathering. So yeah, I have a ton of ideas that we could build, but that wouldn't really solve the problem right now. But I know that's a future category for us at some point. But yeah, other than that, I don't really have a whole lot to offer right now. All...
C: But yeah, I don't know, like, maybe doing feature callouts? Like, it doesn't help that we don't have a really good way of, like, finding... I know we have group-based feature flags as well, but we don't really have a way to do, like, cohorts, or, like, find potential customers that you'd want to actually put a popover in front of, that would actually be more likely to sign up. And then it also touches on, like, a little bit of A/B testing, of, like, something...
C: ...we also want to try to experiment with at some point. But yeah, painted doors... it could be, could be, yeah. Gotta be careful with that. Sorry.
C: Yeah, yeah, I think it's based off of Unleash, or that's what's running under the hood. But I remember, actually, Jeremy Jackson was maintaining that gem quite a bit. Yeah, you mentioned that, yeah. I had him do some investigation on what it would take to...
C: We could use it as is, but we would want to take that and kind of modernize or update it a little bit if we wanted to incorporate it ourselves. But that's a different topic; we could potentially look at that, for sure.
C: Yeah, we have an existing way to do A/B testing that we have a gem for, but if I'm not mistaken, it's basically using an open source version of what's called, I think, Unleash, which is an open source thing, but it's severely out of date, so I'm not sure if we're still actively using it across the organization or not.
C: But regardless, that was something that we could potentially then kind of repurpose and build into the product officially as part of, like, an experimentation feature. But I don't think there's anything stopping us from using it as is, to kind of help validate some of the stuff we're trying to do right now. We would just have to have the capacity to take a look at that. But...
C: It'll be a fun time, for sure. Oh man, part...
C: Yeah, and we'll talk about it at length tomorrow, I mean, when we sync up, but that's part of why I really want to get .com data in, because the scale of it will really help to do a lot of cost modeling.
A: Yeah, yeah, cool. All right, great discussion on ongoing feedback. Thanks, everybody. The last point here: Kevin, we're missing the rapid part of rapid iteration testing. I think we should just move forward with starting a recruitment issue and getting that going, so that we can have Caitlyn start recruiting as soon as we want to. I can create the recruitment issue in the research or the UX project today, and I'll tag you on it, if that works.
B: Sure, but wait, didn't we say at the beginning of the call that we were okay giving it until the end of the week, if we didn't...?
B: Got it, but the recruitment request that you would open in this scenario, I think, would be for moderated. If we want to go with unmoderated, we don't... and Justin, correct me if I'm wrong, but I believe we don't necessarily need to open a recruitment request. Like, could we just come up with a screener ourselves, have it reviewed, and then go on usertesting.com, release it, and get people?
A: I'll spin up the issue today, Kevin; we can work on it async, and then, if we want to go the usertesting.com route, we can, and kick that off next week. Cool, yeah, all right, awesome.