From YouTube: CICD UXR / Design Leadership Sync 2022-02-03
Description
Meeting to discuss research and design needs for Ops (Release, Verify & Package).
A
Cool, so let's start from the top. In your section, can you talk about some of the items that you've got here?
B
Yeah, so this is the first time we're having this meeting, where we go through the research updates and needs for Package, Release, and Verify. My first agenda item here is designer needs and research support. I'll start with the Package team. Erica is not here right now, but for Package we are still facing some challenges in recruiting the right participants for the dependency proxy research.
B
I discussed this with Katie yesterday or the day before. Although Erica and Caitlin have been super helpful and have been sending participants towards Katie and Tim, they still feel like these are not the right people. I think there have been almost 100 respondents, and Katie asked me about GitLab First Look.
B
First Look, right: how do you get access to that bucket of users? What is the process? What does she need to do? So that's a point I'm going to add here for Erica: to check that with Katie and see if we can have, not a better approach, but a look into those participants again, because I think this research in particular has been going on for a couple of weeks now.
A
So when it comes to First Look, our research coordinator Caitlin manages that for the most part. So whenever I'm recruiting for a study and I know that I want to use First Look participants, or even if I'm unsure about it, I can ping her within the recruiting issue with any questions.
A
I think it's either in our license database or somewhere else, potentially, but she has full knowledge of it, and she can pull based on specific questions that we ask. We put people through a screener when they say that they're interested.
A
That's specific to that study. I know that Erica has put together an idea around a common screener, to kind of cut down on the use of one type of screener per study.
A
But, you know, Caitlin can send the screener out to First Look participants, and then we monitor it in Qualtrics and send out from there.
B
Okay, that's good information, because I haven't used First Look in so long. Before, I remember that we did have some level of access to those participants. What we understand now is that Caitlin is, in a way, the one managing and going into that database and sending the screener to these people, right? So we have to go through that process and ask for her support.
B
Yeah, because I couldn't find information about access to First Look, but that's fine. I know the information is up there; I just wanted to confirm, because I haven't done that in a long time.
A
Yeah, and we have a page on our about.gitlab.com website that has some basic information. It's more steered towards participants, but there is some Q&A towards the bottom to explain some of that. So I can add that to the doc.
B
I couldn't find anything like, okay, this is how we get access to it. It wasn't in the tech stack either, so what I assume from that is that we cannot request access to First Look. And I understand, because there's participant data there, sensitive information that not everyone should be managing.
A
Yeah, and that's a good overall question. I'm going to type this in here. We have a UX research meeting today with Caitlin and all the other researchers and Adam, so I can add that to our agenda to try to figure out whether there is a formal process outside of just contacting Caitlin.
B
Yeah, that would be helpful, especially since, for dependency proxy, I know that Katie has already talked to a number of participants, and she has been facing this challenge for a while now. Yes, you get the participants to fill out a screener, but then, when she got to the interviews, there was even someone who, it turned out, didn't speak English at all, and people that don't actually have experience with GitLab or with this type of tooling.
B
So
it's
yeah
it's
one
of
the
things
that
on
the
side
that
we
also
want
to
have
a
look
into,
but
also
helping
her
find
the
right
people
for
this.
This
study,
I
think
that's
I'd,
say
that's
one
of
the
the
priorities
now,
since
the
study
has
already
been
going
on
for
a
while.
B
So if you can raise that and ask Caitlin to have a look, that would be great. I also asked Katie to reach out in the research channel in Slack and ask the same questions as here: how do I get access to First Look, I had this challenge with a participant, etc. Trying to over-communicate here is always better.
A
Okay, do you want to talk about the deep dive for the Release stage?
B
This is mostly a heads-up, as you've possibly seen in Slack or in conversations. We had a call yesterday, Jackie, Daniel, Chris, and I, and we went through the, let's say, historical context of Release, or some of the categories that Daniel and Chris are working with. It came out of that conversation we had asynchronously, about the team not having full visibility into the existing insights and then reopening questions, or talking about validating things that we already validated in the past, in the last year or so. So we had this call.
B
The recording is here. We talk about the business side of those capabilities, why they came to be the way they did. It's mainly around release orchestration; we talk a little bit about environments, and we talk about deployments.
B
We also talk about some of the design decisions. For example, for Release, a lot of the work that we did in the last two years was around making sure that the release manager persona, a person who can be technical or non-technical depending on the company, someone who is responsible for managing releases in the org, can do what they need through the API, but also through the UI. And part of the conversation was around the following.
B
Why
in
the
insights
and
why,
in
the
research
and
in
the
design,
decisions
and
the
product
decisions
that
we
made,
the
flow
seem
incomplete
in
terms
of
managing
environments,
managing
deployments
managing
releases?
It's
because
well,
this
persona
is
technical,
but
it's
also
not
technical
right.
So
we
had
to
kind
of
manage
expectations
there
and
deliver
on
both
ends.
So
I
think
it
was
a
40
45
minutes
discussion,
there's
also
a
dock,
I'm
gonna
link
here.
B
I
think
it
will
be
useful
for
you
to
kind
of
scan,
maybe
have
a
look,
because
it
really
provides
more
insight
into
yeah.
Why
we
didn't
build
it,
for
example,
why
we
didn't
build
an
environment
dashboard
or
why
we
didn't
redesign
it
or
why?
You
know
there
is
a
a
bunch
of
incomplete
stuff
and
how
that
influences
the
the
direction
for
release
today.
So
messy
summary,
but
it's
about
strategy,
the
research
and
the
design
it
was
a.
It
was
very
productive
conversation.
So
just
thought
you
should
know
let
this
happen.
A
Yeah, I appreciate that. I think I had heard some of that in one of the issues; there was some talk about that meeting coming together. But it's good to know that it happened and that there's a recording of it, so I can go back and watch it and skim through the doc, because it will be pretty interesting for me to have that historical context.
B
And if you have any questions, I'll link the issue here. I think Jackie created this issue with Chris; it's just an async agenda. But if there are other questions, I think it would be worth documenting those in GitLab.
B
So we can find the answers later on. Okay, and my last update, a heads-up, is that a focus area for all product designers, my direct reports, in terms of growth and development, is research. We identified that as part of the skills matrix exercise that we did, I think in October last year, but also through the performance assessments.
B
So this month and next month I'm going to be facilitating these conversations with my team and making recommendations for the growth and development plans: the things that they should be working on, or want to be working on, and opportunities within GitLab. In terms of research, it was really around qualitative versus quantitative research. Generally, the product designers on my teams don't really do a lot of quantitative research.
B
I think there's an opportunity there to maybe understand how to apply or learn those skills within GitLab. I'm still formulating these ideas, but I think I also want to understand from you and Erica, once I have a draft of this plan:
B
How
can
you
help
right
and
based
on
the
the
upcoming
research
or
the
current
efforts
that
you're
working
on
and
how
to,
I
would
say,
intentionally
focus
on
some
of
these
areas
with
specific
product
designers,
so
we
have
people
that
are
seniors
that
have
experience
with
research.
We
have
intermediate
junior
product
designers,
so
it's
I
think
that
we
need
to.
I.
I
think
I
need
to
do
some
leveling
there,
but
I
want
to
discuss
this
with
you
and
erica
to
understand.
B
How
can
you
help
right
and
make
sure
that
you're
being
strategic
in
that
sense?
So
this
is
also
a
heads-up
I'll
formalize
that
and
I
create
issues
tracking
issues
for
growth
and
development
for
my
team
and
yeah.
Maybe
we
can
use
one
of
these
calls
in
the
future.
So
I
I'll
talk
to
you
a
little
bit
about
what
my
plans
are.
A
Cool. I'm not sure why the doc keeps tracking my changes.
A
Okay, cool, yeah, that works now. So we can talk about Erica's stuff, or at least I can just read out what she has; I may not have all the context. She's working to hand off the common screener coordination to Caitlin. It sounds like she built out a kind of rough handbook page, which I had reviewed, and she's just moving that process over to Caitlin so that Erica doesn't have to manage it exclusively.
B
I'm just going to say that, yeah, this ties nicely to my first agenda item. But I think, in particular for Tim and Katie, I don't know if there's a deadline for when the screener coordination handoff would happen, in terms of helping them out with the screener and finding participants.
A
So I missed some of that. Could you elaborate a little bit more on what you were saying there, so I can capture it in the doc?
B
Yeah. The way I understand it, just by reading the update here in the doc, is that they're going to try to feed in, to use the participants that Gina and Jackie are recruiting, to see if they fit for Tim and Katie, so that other teams can also leverage this pool of participants.
B
And my question is: when do we expect that to happen? Because, to my previous point, I think Katie and Tim need to be prioritized. This screener, this pool of participants, is hopefully for everyone, but from the people that we already have in First Look or other places, how can we, you know, assign them to the dependency proxy research as participants?
A
Okay, and then she also has an item about an Ops product direction survey. So we have a draft of the survey; it's mainly about keeping designers in the loop. Okay if they don't want to comment; we're just, you know, looping them in.
A
Nice, cool. So, any other comments or questions about what Erica has under project coordination?
A
Like, upload the recording to the thread, so no worries. We had some questions about some of the stuff; we're kind of at the project coordination step. Okay, so I had just found the MR for the handbook page.
C
I wanted to tell you that the first thing I noticed when I came on is that everyone was like, help me with this recruitment issue, look at my screener. So I was like, okay, the first way I can add value is just to help with this process. It's typical user research: they're telling me this, so they need that.
C
So we have the pilot kind of done and we're trying to wrap it up, and I think what we've proven is that we can be pulling from the common recruit, and we have a couple of other lessons, like, I think we need to have a separate recruit for enterprise; it doesn't work otherwise. So I'm just working to codify that in a way that's not overwhelming, with an update. But we also didn't get the common screener prioritized as its own project in my prioritization issue.
C
So I need to take less of an instrumental role, but I think we have this need, so there we are. In terms of the recap, I need to just start working with Caitlin to figure it out and how she can fit it into her current processes, a little bit like an evolution.
C
So that's why we kind of started small. I think we're in a good position now, though, because we have a lot of the system set up. What it will mean is that we'll keep that participant pool for, like, the quarter, and we need to do that because we have to feed them back into our systems, because we have contact rules. So there are just all these logistical things, but we've really solved for that with this common screener approach. So our team has this set up for us right now.
C
But I know that Gina and Jackie are setting up this project; it's more like a broad project that I'm helping with, but they're wanting to get participants from TAMs and SAs.
C
They
want
to
fill
up
the
participant
pool.
So
we're
going
to
try
to
bring
those
people
into
the
fold
too,
as
a
way
of
like
naturally
feeding
the
pool
and
then
that's
helpful
for
caitlyn,
because
one
supported
it'll
feed
back.
So
I
think
I
think
we're
like
closer
than
we
might
think,
because
we
have
it
kind
of
set
up.
But
I
don't
I
think
it
would
take
like.
I
think
it
will
take
like
six
months
to
like
flatten
out
in
a
way
that,
like
it's
smoother
for
all
types
of
studies,.
B
But yeah, thanks for the update, Erica; we kind of touched base on that, so it's good that we also provide more context there. One of my agenda points, although at the beginning it was also about recruiting, but for Package: I think for the dependency proxy problem validation research they're still having challenges.
B
You
know
interviewing
the
right
people,
so
you
share
with
me
that
recently
there
was
a
one
participant
that
didn't
speak
english
or
someone
that
doesn't
have
experience
with
with
the
tool
or
doesn't
really
meet
the
the
profile.
B
So
to
your
point
that
something
will-
and
I
already
discussed
that
awesome-
we
love
the
common
shared
screener.
But
what
do
we
understood
is
that
we
want
to
get
that
pool
of
people
right
from
from
not
valerie,
jackie
and
angina
and
feed
into
the
common
screener,
and
my
request
is
that,
while
that
happens
in
parallel
that
with
katie,
you
can
also
look
into
finding
the
right
people
for
dependency
proxy.
B
So
maybe
there
are
a
couple
of
people
coming
from
jackie
and
gina's
research
that
will
meet
the
profile,
but
that
yeah,
the
the
dependency
proxy
screening
can
be
prioritized
and
that
we
can
help
them
move
forward,
because
I
know
that
you
and
caitlin
have
already
been
doing
that.
But
from
what
I,
what
I
learned
from
katie
is
that
yeah
it's
it's
still
not
meeting
all
the
criteria.
So
I
think
it's
really
about
yeah
those
participants,
but
you
can
align
with
katie
and
with
timmy
that
too
we
were.
C
We were able to give her some, which I felt good about, so, yeah. I think I'll work with her, but I think the way to do it right now is to just ping me in that README file and let me know, because we want to track why they're not usable if they're coming from that recruitment screener. The benefit of that is we can make little notes, like, language barrier.
C
So
at
least
the
next
person
will
know
that.
That's
but
yes,
that's
sad,
but
yeah,
so
yeah,
so
I'll
loop
back
with
her
I'm
kind
of
waiting
for
the
pool
to
fill
up.
We
did
it
wasn't
sure
if
there
was
like
a
demand
and
then
we
did
a
social
media
push
that
was
branded
to
see
how
that
went
and
it
it
wasn't.
So
we
waited
to
feed
the
pool
to
just
like
know
what
the
numbers
were.
But
anyway,
I
think
we
just
did
another
push.
B
Right, and we have a... oh yes, I think before you joined, Will asked me if I had any questions in terms of projects, and I think the tracking issues are clear to me. But what I want to ask is: is there anything that changed in terms of priority for this next week, or anything that is going to change in the future, that might not be reflected in the trackers?
C
I have a retro comment thread in there, and then a put-your-new-requests-here thread, and I think we'll just keep it set until we do a next release, or until lots of needs clog up. And then I built my goals project to wrap totally around the team's needs, in terms of all these things that they're working on, to try to connect the dots.
C
But
we
want
to
make
sure
because
that
recruit's
going
to
be
really
hard
because
it's
particular
enterprise
and
jeans,
and
I
have
been
working
on
a
study
and
we
just
can't
find
enterprise
participants.
So
this
is
how
also
how
I
know
that
we
need
a
different
strategy
so
for
that
work,
we're
setting
up
this
case
study
where
we'll
have
like.
If
we
can
get
it
going,
we'll
have
recurrent
meetings
and
then
we
can
weave
in
their
topics
so
that
I
think,
is
going
to
help
the
team
like
bolster
us.
C
People
will
see
these
case
studies.
I
prototyped
an
issue
where
we'll
have
those
profiles
and
summaries
and
we
can
use
like
engineering
and
people
can
comment
in
additional
follow-up
questions
for
subsequent
interviews,
but
the
thing
to
make
sure
everyone
understands
and
is
tracking
there
is
that
they
want
to
continue
their
individual
problem,
validation
studies
in
parallel.
C
I
guess
it
could
be
really
like
if
they're
waiting
for
me
to
deliver
all
of
this
stuff
for
them
it
could.
We
we
just
are
not
in
a
position
with
the
recruit
to
do
that
once
we
get
this
like
set
up
and
jackie
and
I
kind
of
toyed
around
and
I
and
lauren-
and
I
are
talking
about
using
this
as
a
research
program
like
so
like
once
we
get
them
going,
then
we
can
have
that
as
like
a
kind
of
touch
point
or
customer.
C
I think we can build in some of their questions, but it's not there yet. So those are my updates. And then I have the Ops product direction survey; we're actually in a really good spot with that. I'm going to do a broadcast update; nobody needs to feel that they need to respond, but I always err on the side of including way too much. We're supposed to be transparent, so I'm still working through how to keep everyone in the loop while not overwhelming them.
C
So let me know, but I think this will be good, right? So now they know they don't need to do anything, but they have the chance to comment on the survey questions. And then, because we're all set up, as we're able to, I'll start doing think-aloud protocols, where people take the survey and say, this is a weird question, I don't know what this means, and then we'll make smaller calibrations, and I'll update those. I'm going to make a separate issue for that.
A
Much of it is not going to apply to your areas, I think, because I just cover so much in Enablement as well. I have like 11 different projects across Ops and Enablement that have been requested.
B
Okay, so that's good, because then the CMs he can tackle with his PM. So, one last thing on your long list of items; it's a very... yeah!
A
I did get another request, like this morning, for personas. So even though this one is going off the list, I'll probably be adding one on there to replace it.
C
So there is a dialogue happening with a lot of PMs, where they're talking about differentiating a couple of the personas, and they're doing it as a discussion, and I would love for us to bring empirical data, so we're in a good position. But I've been waiting, to make sure, because it's a little scary for me to jump in. But I think that at KubeCon, with our survey, we'll be asking, kind of building on the common screener, these jobs-to-be-done questions.
C
We could do a factor analysis on how those tasks hang together and then have an empirical way of helping those teams understand that. Because right now it's like: well, we have a systems admin, but there's also this infrastructure operator and a platform engineer. And we know that whether or not those personas overlap within a particular company depends on the industry vertical and the complexity of what they're solving for. Like, are they doing machine learning?
C
That requires all these different roles and more specialization. Then, how DevOps-mature are they? Are they just starting, so their infrastructure team is old school and less cloud-based? And enterprise versus small or medium business. So there are these three factors, and we will be able to tease this apart in the common screener. I think I've been waiting to talk about that, because it's confusing.
A
I see what you mean. I've linked to the issue that was just created, like, not even half a day ago, so it sounds like they want to potentially combine elements of existing personas.
C
Yes, Daniel Mora is fantastic. I saw that coming, and I'm like, yay. But yes, so this is what I'm tracking to as well. But I think it's large enough in scope that it would be a larger project, and we would want big quant; otherwise we're all going to do independent studies on this, and then the results are going to depend on all those different factors, and it's going to be really confusing when we're getting different findings.
C
But we can talk with Lauren about it, or you could just drive that. But I think we want to think about it as a systematic approach, and I think we do have this KubeCon data that we could leverage. We'll have to wait for it, but it's the perfect thing, because KubeCon is where we'll have those personas gathering.
C
We're setting up these two persona jobs-to-be-done questions with differentiator tasks. We should be able to do a factor analysis on that data and say: yes, this infrastructure operator role and this systems admin, those are the same people saying yes to those questions, or there are different groups saying yes to those questions. That's the empirical information they need to get on the same page.
C
And let me work through it, because I'm just working through it. So yeah, let's work on a coordinated plan, so we look really together. But I think first we need to introduce to them this concept of jobs-to-be-done and differentiator tasks as the way of looking at personas, because I think it's a little bit...
A
Okay, I can just DM you to kind of talk through some different ideas about how to respond to this particular issue.
C
Yeah, and I think we just want to coordinate with Lauren, so we can then say... because what we could do is be able to say we would be, we may be, in a position to deliver on this in, probably, Q3, well, maybe at the end of Q2.
B
No, don't worry. I was telling this to Will: I'm going back to Europe next week, so I'll probably have to adjust this call anyway, because, yeah, I'll be four hours ahead. So, okay, don't worry.
B
We'll look for another time.
C
I have my childcare drop-off times plugged in there; no one can do anything then, but besides that I can be pretty flexible. But thank you for this meeting. I was only here for a section of it, but I feel like this is fantastic; I feel very aligned, and we already...