From YouTube: Verify:Testing Group Think Big #13
Description
Today's Think Big came out of a recent customer conversation. The customer is a GitLab advocate and user, but wishes they were doing more "testing", or that it was just happening for them in their GitLab projects. This aligns with our On by Default product principle, and the team wanted to discuss more ways we can introduce existing GitLab users to testing features sooner.
A
This is the Verify:Testing Think Big for March 2nd, part of our Think Big versus Think Small series. I've renamed today's topic to "Enabled by Default." Drew, I think your original issue was called something like "turn everything on by default." The agenda has all of our normal prompt questions, but Drew, why don't you give us a 30-second overview of the issue before we start?
B
Yeah. This came out of the customer conversation we had on Verify day. A few of us were chatting afterward about how we'd been talking with one of the most successful GitLab use cases. This team loves GitLab, and their feedback was still "we'd love some more in the testing tab, and we'd love it to automatically jump in there and tell us what's wrong." So our takeaway was that we're not being forceful enough.
B
We should be giving answers to questions people haven't asked yet. We should be more opinionated and just tell you, "hey, you're doing this, and here's what we think of it." If the people who like and use us most enthusiastically don't have all of the Verify:Testing features turned on, that's probably a sign that we're not being aggressive enough.
B
So the idea behind the Think Big issue is: how can we be as aggressive as possible? Be way too aggressive. Make people have to turn stuff off. Go super overboard.
A
I would augment your question just a little bit: how aggressive can we be in solving problems customers don't know they have, or answering questions they don't know they have? Sorry, I don't know if it's too much coffee or not enough.
B
I wrote a little bit about this in the Think Big issue. They're not the first person I've talked to who said, "I wish there were some automatic guardrails," which gives me the impression that they're saying: not only do I not know what I want, I don't even want to have to know what I want.
B
Just to jump in before that: the thing they said that was interesting to me is that they were kind of disassociating the problem from GitLab. When I asked them what's next on the roadmap, what they're thinking about over the next year or few years for evolving their DevOps toolchain, they basically just said:
B
"Yeah, I know we need to do some more testing," but they didn't have a clear idea of what their strategy would look like or what they could use to get there. So I think that also points to an opportunity for us to shoehorn ourselves in before people even really have a strategy formed, because that seems like a common trend.
B
That seems like a common trend in the industry as well. Talking to people, it's "oh yeah, I know I need to be doing more with testing," but what that means isn't very clear to a lot of people, particularly if they don't have a ton of historical experience with unit testing and Uncle Bob and all that type of stuff.
B
There's a higher-level need than the testing itself: people are aware that they want all the benefits of testing, but they may not know what that testing looks like. Part of the reason this customer hasn't built it yet is that there isn't a clear enough picture of "what exactly can I do that's going to get me the confidence and reliability I feel like I want?"
B
We know that confidence and reliability factor into many different spots in the software development lifecycle, like code review. The fact that we have maintainers is a big confidence thing: the maintainer process gives us confidence that the changes being merged are going to be okay. Another company with lots of confidence in their unit tests and end-to-end tests might not require that maintainer step, because the tests themselves provide enough confidence that as long as a change has been code reviewed, it's probably fine to merge because the tests passed.
B
So I like framing it that way, Drew, around confidence and all the benefits you get from being confident in your changes moving forward. That's actually part of the vision for the team as well: to provide confidence to software development teams to deliver.
C
We want to give them confidence through insight into what their tests are doing, or show them that their tests aren't quite enough yet and where to do more. I know this is our whole... yes, yes.
A
Yes. So "enabled by default" and where we're going with confidence are a couple of different things. Enabled by default, I would say, means for us that every scan runs if it can: you get load performance testing, you get browser performance testing, you get accessibility testing, you get all of the things that are going to scan the code, code quality I think, or the static scanners. Plus:
A
Looking at how much coverage you have could play an interesting role in that. We can start to point out where there are gaps in your tests, or point out "hey, here are problems," and that would be the next step for them. So even if they've only written one test, we can say, "hey, you only have one percent of your code covered and it's all low quality, so here's where you should start writing tests to get more reliable."
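As a rough sketch of what "every scan runs if it can" could mean in practice, a project's .gitlab-ci.yml could simply include the existing CI templates on the user's behalf. This is a hypothetical default; the template names and their tier availability are assumptions to verify against the GitLab template library.

```yaml
# Hypothetical "on by default" starting configuration for a project.
# Template paths and tiers are assumptions; check the GitLab CI template library.
include:
  - template: Jobs/Code-Quality.gitlab-ci.yml               # code quality / static analysis
  - template: Verify/Accessibility.gitlab-ci.yml            # accessibility (a11y) scan
  - template: Verify/Browser-Performance.gitlab-ci.yml      # browser performance testing
  - template: Verify/Load-Performance-Testing.gitlab-ci.yml # load performance testing
```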
B
Yeah, and like you're saying, there's stuff we can do without them having to write tests as well. When it comes to code quality and static analysis and linting, we can, as Drew said so eloquently in his issue, just throw a bunch of stuff at the wall and see what sticks.
A
Like Drew said, customers want us to be opinionated for them, because they just don't know where to start. The best analogy for me is when I don't know where to start on a direction update and my manager has just opened an MR that updates one line for me. It gets me over the hump.
A
Even if I'm going to delete what they've written and replace it, it gets me started. So if we give them, "you're running Ruby, here's whatever the Ruby linter is, and if you don't like any of these linting rules you can turn them off," at least now you know there are problems in your code.
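For example, the "here's the Ruby linter" suggestion could be an MR that adds a single job to the project's .gitlab-ci.yml. This is a minimal sketch; the job name, image, linter choice, and allow_failure setting are illustrative assumptions rather than a committed design.

```yaml
# Sketch of a bot-suggested lint job for a detected Ruby project.
# rubocop as the linter, the image tag, and allow_failure are illustrative choices.
rubocop:
  image: ruby:3.0
  before_script:
    - gem install rubocop
  script:
    - rubocop --parallel
  allow_failure: true  # opinionated default: surface problems without blocking the pipeline
```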
A
You could choose whichever one is the most popular, or whichever is going to run the fastest. We don't have to worry about the minutiae today. So pulling back out just a little bit: the problem we're trying to solve is that customers want to be more reliable, they know testing can get them there, and they don't know where to start.
A
Okay, cool. So we've used "customers" as our answer to who has this problem, but if we were going to focus in on a single persona, where would we put our focus? Who has this problem? Who isn't mandating that tests are written, or who isn't writing tests because they don't feel like they know where to start? Or do multiple personas have this problem?
C
So with the developer, it's that out of the box you can see exactly what you want or what you need to develop and write more of. With the lead, it's more "what do we need to focus the team on." I think it would also be good for upper management to see.
C
You know, the overall view of projects, like "our code quality is really low, therefore we should prioritize the department to focus on code quality." It gives them more of a reason to act. Sometimes the developer says "hey, our code quality is low," and you have to really persuade upper management to get behind those kinds of refactors.
B
This might also vary across the maturity of the project. I'm thinking about GitLab itself, where I think the higher-level sort of management and progression functions are more useful. But for a brand new project, or somebody doing a 5 Minute Production App type of thing, it's more valuable for the individual developer to just click the "yes, I want testing" button, have it all slammed right in there, and say:
B
We
we
ran
a
billion
tests
and
your
app's
great
deploy
it.
You
have
30
seconds
left,
you
know,
like
that's
the
it's
sort
of
how
good
is
my
testing
so
far,
and
what
testing
can
I?
How
much
generic
testing
can
I
automatically
add
right
now?
I
think
our
different,
I
don't
know
if
we
have
like
the
equivalent
of
personas
for
projects,
but
the
stage
of
the
project,
I
think,
is
a
is
a
factor
here.
If
there's
existing
code
or
if
I'm
starting
from
you
know,
rails
new.
B
Yeah,
I
think
it's
hard
to
get
away
from
us
being
able
to
parse
something
right
like
we.
How
how
do
we
know
that
there's
no
tests
or
not
it's
it's
almost
like
we
would
need
to.
I
I
feel
like
this
is
the
the
ux
go
to,
but
we
need
to.
We
need
to
get
to
clippy
somehow
and
be
like
hey.
It
looks
like
you,
don't
have
any
tests
did
you
know
you
can
upload
a
test
report
in
your
ci
pipeline
and
we'll
integrate
blah
blah
blah?
B
You
know
what
I
mean
like
that
that
prompt
to
get
users
to
engage
with
it.
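That Clippy-style prompt would essentially be nudging the user toward the existing JUnit report integration, roughly the snippet below. The job, image, and report path are placeholders for whatever the user's test framework actually emits.

```yaml
# Sketch: what "upload a test report in your CI pipeline" amounts to today.
# The framework, image, and XML path are placeholder assumptions for the user's project.
test:
  image: maven:3.8-openjdk-11
  script:
    - mvn test   # Surefire writes JUnit XML under target/surefire-reports/
  artifacts:
    reports:
      junit: target/surefire-reports/TEST-*.xml  # lets GitLab show test results in the MR
```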
B
It's hard to know how to get away from that, particularly when it's entirely language dependent and project dependent. There are four different ways to hook up a Cobertura report to a Java project alone, and which one depends on whether they're using Maven or whether it's an Android app. It depends on their build manager and a bunch of other things that determine how they're actually going to be able to get that test data into GitLab.
A
I think CI did something similar with the DAG: the tab was there on the pipeline and they prompted users on how they could visualize it. But that's starting to think small. Let's think big again.
D
The way I see it, so far we have two main types of tests we could run. Some can be performed from the outside in, like code quality, and some require the developer to actually write the test, JUnit style. We could automatically generate jobs for the outside-in style of test reports, but we won't be able to generate JUnit-style reports
D
if the user doesn't generate the XML report themselves. We might eventually be able to figure out whether files have been dropped somewhere, some sort of JUnit XML, and automatically create an artifact. So: "you have some files here; you didn't actually declare artifacts, but here's the one we found."
D
So for the second category, we could probably do better with suggestions. That's probably something we lack in our product, automatic suggestions. We do have a feature with pipelines where we can display different types of messaging than errors.
D
Today we can, for example, show warnings, and the plan was to eventually extend that to also show suggestions or info messages. So you run a pipeline, and at the end we might display a suggestion there, with the same type of messaging that we use for errors and other things.
A
Cool. For the sake of time, I'm going to move us into ideal outcome. What would it look like for someone who starts a project or migrates a project into GitLab for CI? And let's say they turn on Auto DevOps.
B
"We checked, you can deploy it now." I'm thinking about somebody just coming into GitLab for the first time. If we can hand them some stuff for free off the bat and just say, "welcome, you have higher quality assurance now," that would be a good first impression.
B
I think that ties into my third point, which is that once Fulfillment gets going a little bit and we get better billing for runners, and better billing for shared runners, it'd be neat if we could give people some free shared runner minutes and just run that code quality job for them. They wouldn't have to worry about it decreasing their shared runner minutes balance. It's like, "hey, the first one is free."
B
You
know
it's
a
standard
marketing
tactic
right,
yeah
give
them
the
first
couple
of
minutes
for
free,
we'll
run
the
test
for
you,
don't
worry
about
it,
distracting
from
your
test
and
then
we'll
show
you
how
to
set
it
up.
If
you
want
to
do
it
yourself
in
the
long
term,.
A
Ricky,
your
second
one,
the
as
a
developer
when
I
set
up
a
project
with
tests
test
reports,
inline
code
coverage
code
coverage
graphs.
All
that
should
work
automatically.
B
That'd be pretty neat. We have all those examples in the documentation for this. So let's assume for argument's sake that we've consolidated all of our coverage features to only use the Cobertura report. You have a Cobertura report, so automatically you get inline code coverage, and automatically you get code coverage calculation in your pipeline.
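For context, this is roughly what a project has to declare by hand today for those coverage features to light up. It is a sketch that assumes a JavaScript project whose test runner is configured to emit a Cobertura XML file; on newer GitLab versions the keyword is coverage_report rather than cobertura.

```yaml
# Sketch: the manual wiring the "it just works" idea would replace.
# Paths, image, and the coverage regex are assumptions for an example JS project.
test:
  image: node:16
  script:
    - npm ci
    - npm test -- --coverage              # assumed to write coverage/cobertura-coverage.xml
  coverage: '/Lines\s*:\s*(\d+\.?\d*)%/'  # pipeline coverage number parsed from the job log
  artifacts:
    reports:
      cobertura: coverage/cobertura-coverage.xml  # drives inline coverage in the MR diff
```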
B
I don't know that you'd get the test report feature too, which wouldn't really work, but then, like you said, we could put up an automatic MR with a bot that just adds that example from the docs into their .gitlab-ci.yml. That's not the worst idea. That's kind of how Dependabot works on GitHub, right? It just bumps the version for you when you have a security vulnerability in your dependency.
B
And yeah, I talked about the GitHub Dependabot security scanners down there. As a developer, when I set up a new project, it's scanned with those outside-in tests automatically, and then we give them that information. There's even an opportunity where we could just do those scans.
B
Sorry, I'm just wondering: does fuzz testing have a similar sort of automatic execution framework to Dependabot? I don't know a lot about it, but I know the idea is that it just automatically dumps everything in to see what lights it on fire, so it seems like a very outside-in kind of testing, and this Dependabot stuff made me think of it. I think that would be another good kind of free trial, to get them interested in it.
B
Let's
do
the
dast
scan
or
the
sas
scan?
Let's
do
the
fuzz
testing.
Let's
do
the
license
scanning.
Why
not
just
do
like
a
free
scan?
Oh
no,
I'm
just
I'm
I'm
thinking
about
like
how
do
they
figure
out
what
to
do
automatically
like
the
just
the
ideas
around,
so
I'm
thinking
that
they
probably
do
some
kind
of
detection
to
figure
out
what
to
do
automatically
and
we
might
be
able
to
borrow
from
that
to
design
some
other
automatic
non-fuzz
testing.
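As with the Verify scans earlier, the "free scan" idea could boil down to including the existing security templates on the user's behalf. Again a sketch; the template names and the tiers they require are assumptions to confirm against the template library.

```yaml
# Sketch: opting a project into the outside-in security scans via bundled templates.
# Template paths and required tiers are assumptions to verify.
include:
  - template: Security/SAST.gitlab-ci.yml
  - template: Security/DAST.gitlab-ci.yml
  - template: Security/License-Scanning.gitlab-ci.yml
  - template: Security/Coverage-Fuzzing.gitlab-ci.yml
```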
D
I
believe
the
the
goal
for
us
will
be.
I
could
commit
some
code
on
master
and
then,
after
even
after
a
while
the
next
day,
I
could
get
the
pipeline.
That
is
automatically
run
by
a
gitlab
bot
that
has
somehow
permissions
to
some
super
permissions
like
in
the
background
that
looks
you
know,
it
doesn't
have
to
be
triggered
by
me
directly,
but
tells
me
this
is
we
run
some
tests,
and
these
are
some
results
for
you.
If
you
want
to
set
it
up,
how
exactly
you
want
to
do
this?
D
Yeah, kind of. An automatic demonstration of how to do certain things.
A
These half-hour sessions are just not long enough to think big, so we are at time. Unfortunately, there's one more prompt question in here. If you have a chance and have some thoughts, I'd love to get people to contribute asynchronously: how do we differentiate from the existing solutions? We talked a lot about the Dependabot style. Are there other things that are inherent to GitLab that aren't like that?
A
Or
we
can
add
additional
value
because
of
our
all-in-one
nature
or
just
other
ideas
that
are
out
there,
where
we
can
differentiate
from
the
rest
of
the
market.
A
And
then
I'll
sum,
this
up
I'll
post
the
video
sum
this
up
in
next
week's
agenda
and
we'll
think
small
same
time
same
day
next
week
about
what
is
something
we
can
put
into
an
upcoming
milestone
that
moves
us
towards
the
broader
vision.