From YouTube: CI PM/UX Research Planning on 2020-06-24
A: All right, welcome, everyone, to today's Continuous Integration PM/UX research planning session. First item on today's agenda: what UX research issues are in progress now? I listed the ones that we had called out at our last UX meeting, so the Jenkins wrapper solution validation. We had said last time that Vedika would be the DRI. Any updates on that, Vedika?
A: Yeah, I was going to run with that on the solution validation, but I think the last time we talked we said, well, normally PM does the problem validation, and we want PM to help, but solution validation would belong with the designer, so I'd work with you on that, Vedika. But when you look at the meeting agenda notes from June 10th and you go to the item... you know what, if you scroll down, I'll highlight where I'm looking so that we can all see.

A: It's indented quite a way, like agenda item one, but everything else is indented under it, okay. And if you scroll down to just above B, midway, we had selected Vedika for the DRI, and that was with support from Demetri. I didn't expect that there would be any progress yet on that; I just wanted to make sure that nothing has happened. So for the B item, Jenkins migration to GitLab CI/CD problem validation, I actually made my note here: no progress.
A: Do you want to be the DRI for that too, Vedika? Because that's fine.

B: Yeah, I would do this.

A: Oh, okay. And I think that makes sense, because we are not only collecting feedback on the beta; I see it as problem validation by another name. Does that sound right, Laurie? We're kind of validating our MVC, I think.
C: Demetri and I went through that, so yes, you're right: solution validation for sure. Vedika and I have talked about it. I think she's got a solid approach, and she's already started, so it makes sense to keep her as the DRI for it. Taking that into account, we'll have to balance that piece of her workload with the Jenkins wrapper solution piece of her workload as well.

A: So it sounds like the Jenkins wrapper, yes. And so, Vedika, on that: when we talk about all these things that you're the DRI on, there's no expectation that you make progress on all of them at once. We just want to make sure we have someone as the DRI, and then, between myself and Demetri, and Laurie to some extent, we're here to support you on that. So you and I can talk about little incremental progress.
A: All right, so that first agenda item was just to get a status update on any progress. Agenda item two (these are, by the way, questions that come straight off the template, which is why there isn't a name of a person driving them): what are the upcoming problem validation issues? Well, I don't think Vedika can speak right now, so I'll read hers. She has the first one, saying: do we want to plan the work around coming up with ubiquitous language for the pipeline-to-pipeline relationship?

A: It came up (if you look at the issue, I think Fabio created it) when we realized we needed to stop and involve our designer and our tech writer to really come up with something that is our standard for the naming convention, or how we refer to things. He outlined beautifully the problem that we're trying to solve for in the description of that issue. So what I'd like to do... hmm, I just noticed this issue is not linked to the epic for parent-child pipelines.

A: What I'd like to do, Vedika, is look at this as part of the work we'll be doing together to organize and stack-rank the issues within the parent-child pipeline epic. Right now I'm going to add this issue to the parent-child pipeline epic, with this being one of the priorities that we want to tackle.
C: Sure. I just put a quick note: in terms of scheduling anything, really, when it comes to research it's always best to figure out when you need the results by, and then work back to try to plan it. So if we're not quite sure when we would consume the results, then we may not be ready to talk about that. But to your point, this is just to make sure you don't lose this, yeah.
A: That's a good point, thanks for that. And it drives, in this instance, when we might target even needing it, to allow time in between for the research. Okay, the second note I have there for upcoming problem validation... I think I have the next two, actually. My first one is based on Marcel's question in yesterday's meeting (that was the PM/UX agenda).

A: I referred back to it in case you want to scroll through that item. I do want to consider a future research project to get insights on users' reactions to the new warnings and deprecation notices that we're going to add to the error messaging. That is in the CI linter, and also, I think, pipeline failure messages are going to use the same framework and return the same warnings, I believe; I haven't looked at that issue in a while. So I listed out the questions that Marcel had mentioned at yesterday's meeting under "questions to consider", and thanks, Laurie, for adding your thoughts there; it's a great start. Let me just, before we even go there, talk through some of those questions, so that we all understand what we're trying to get out of it.

A: Before we even have this research on what the users' reaction would be... so we basically are going to collect the reactions once this MVC is in production. What is (and I need your ideas, everyone, at this session) a reasonable time to actually try to run this research? Like, after a month, with the feature already out there? Or immediately, as soon as possible after it's out there? And keep in mind we may not even have the bandwidth, between the folks on the team that would run the research (maybe Vedika, with a little help from Laurie), to even do it right away. But what would be the ideal, I think?
C: I'm not sure if we can get that metric. It would be wonderful if we could; otherwise it's kind of a crapshoot of "have you seen this before". The other way we can do it is to create some mock-ups that show the warning and the error messages and then put them into... I forget the technical term for it, but you just show them a picture in the survey.

C: And then you ask some questions about the picture, so we could do it that way as well. That would mean they wouldn't have to have experienced the interaction in the interface; we're simply getting some feedback about how we've designed the messaging. Like, we can give them a little scenario and say: okay, well, if you saw this, what would you think? Or whatever the questions would be. So we can do it that way as well.

C: That's how we would do it; we've used Qualtrics, and it's just a different... that's why I said I forgot the name of it, but it's a different option in Qualtrics, where you just show a picture in the survey, then you have a couple of questions, and then you can go to another page and show another picture with some other questions. So it's done very asynchronously.

C: You wouldn't have to sit with anybody. The only other idea is, if we have other research that we want to conduct, to also include this in that in-person or remote research: to say, "hey, I've got a couple more things I want to get your feedback on", pull these things up, and talk to them about it. So that's also another option. Okay.
A: Okay, so... okay, I don't know how we would collect the metrics. It may be that we try this simulation of sharing the mock-up of the warning and then collecting the feedback async. In that case, for the questions that Marcel suggested we ask, let's see which ones we could achieve through the async survey and which ones we would need to get to through a live interview.
C: I had some questions around this, so I wasn't sure. The first one kind of jumped ahead: do they even see the difference? Do they even know that there's a difference? I don't know what anything looks like myself, so you'll have to forgive me, but if it's pretty clear, then that's not a problem, I think.

C: My other question is: do they understand the difference between what an error is and what a warning is? I would assume that they would, but again, where are we adding in our assumptions with this test? The other piece was: do we have a hypothesis around this? I suspect they might react differently when they see an error versus a warning, but it also might depend on what they're trying to accomplish.
C: I think the answers are going to vary widely, because it's going to be very dependent on what they're trying to accomplish when they get that messaging, either way, warning or error. So this would be very difficult to ask in a survey, unless you are very, very specific with the scenario that you set up for them: you are trying to set up this pipeline, you are trying to do this task, you click the button, and then this is what you see; what happens? That's totally acceptable.

C: We just really need to figure out which tasks or scenarios we want to present to them to get their feedback, because this is the same throughout the whole UI, right? I think the responses, the actions that people are going to take, will vary depending on the circumstances in which they see the messaging.
D: Would there be any benefit to me brainstorming some possible situations where this warning might occur (not error, but warning) and the type of user that might see it? Just to be clear, I think the errors are clear to people, because the pipeline doesn't run; it just literally says no, the pipeline doesn't run, it's bad.
D: Let me just clarify this, just to be clear, yeah. This was just in relation to the first question, and I was just kind of pinging people, hoping someone had experience, maybe someone had done some research, on warnings versus errors. So I was mainly asking: have you heard of any, or do you have any experience with, research on errors that block you from what you're doing versus warnings that alert you but don't stop you from what you were doing? I meant it more in a general sense.
C: I can't point to a study, but I can tell you that some people will blow past warnings; they won't even read them. And if the error makes you stop and not go forward, then they'll have to address it. They may curse and be angry about having to stop their work to fix it, but if the system won't let them go on, they will stop to fix it, because they have to achieve that task.

C: Right. I suspect newer users will stop and read all the warnings, where your seasoned users will blow past them, because they've either seen it before and know what's going on, or they think they know what's going on, and they'll go ahead and try to do what they need to do. That's 13 years of UX research experience talking, but I don't have a study to point to.
D: That's exactly what I was looking for: not specific research, but the 13 years of experience. Anecdotes are literally exactly what I was looking for. Oh, and I think that basically addresses number three as well. When I said "non-fatal", again, it was just brainstorming ways to explain it: a warning that doesn't block you would be non-fatal. So I think, yeah, that's it.
D: How do they feel if their configuration is technically correct but causes warnings? Which is kind of one of my concerns: if we start giving a warning, "something something something", but their pipeline runs fine and is actually correct and working properly. Deprecations, to me, are clear, because you can give them a date; you can say it is working now, but it might fail later. An error means it's not going to work now. But then you have a warning where it's like, "hey, you're using this keyword..."
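To make the scenario under discussion concrete, here is an invented illustration (the keyword and the lint output are hypothetical, not taken from the meeting or from GitLab's actual CI syntax): a pipeline configuration that is valid and runs fine today, but where the linter would surface a deprecation warning rather than a blocking error.

```yaml
# Hypothetical .gitlab-ci.yml: the job is syntactically valid and runs,
# but imagine `old_keyword` is slated for deprecation.
build-job:
  stage: build
  script:
    - make build
  old_keyword: some-value

# Hypothetical CI lint result:
#   valid: true
#   warnings:
#     - "`old_keyword` is deprecated and will be removed in a future
#        release; use `new_keyword` instead."
```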
C: Fine, but yeah. I tried to reword it to understand what you were getting at with your initial question, so I think I got it right: what happens when your configuration is technically correct, but you get a warning or warnings about what you're trying to do, whatever that is? I also put a comment that I think this is better suited for an interview, because you want to have that discussion with them: have you ever seen anything like this in another system? What do you do?

C: You know, those kinds of questions; and you can't really get that kind of interaction with a survey, unfortunately. If we were just asking some simple questions, like "what do you think this means?" or something like that, we could do that in an asynchronous survey. But this sounds like we really want to understand their way of thinking, and the schema behind what they're projecting onto these warnings and error messages, and I think that's going to be a conversation.

C: It's not a long one, which is why I suggest that maybe, with the Jenkins wrapper research, we pair a small conversation about these error messages with those participants too. Just pair it with the same recruiting effort, to have a conversation around this with people.
A: So we are at about time. I put a time marker in our agenda so that we remember to come back to this stuff at the next session. Sorry, Marcel, we're not getting to your last question, and then my items down there we can pick up at the next one; it's not urgent. Feel free, if anyone has answers to Marcel's question there, to answer it async in the agenda. If not, we'll come back to that discussion.

A: Marcel, if you can't make it to the next session, you can always come back and listen to the recording. Okay, thanks, yep. We'll end on time so I can run to another meeting, with an external user that wants to interview us about merge trains. So, okay, I think that's it. I'll stop the recording. Thank you, everyone, for joining today; have a great day. Thanks, guys, bye.