Description
GitLab introduced the Compliance Dashboard in 12.8. Now that we have wrapped up 13.8, we have some key insights into how we will improve it next. If you want to see where we are currently, you can follow this epic:
https://gitlab.com/groups/gitlab-org/-/epics/5237
What I want to start with is where we're coming from: we released a feature called the Compliance Dashboard back in 12.8.
So then, what were we trying to achieve? We had this idea: we wanted to continue to supplement the Compliance Dashboard and introduce more insights into it. Today there are a few things you can learn, identified by a green checkbox: who made the change, who did the peer review, and which tests passed. The way that looks in the Compliance Dashboard today is that you can see a merge request.
A
You
can
see
who
created
it?
What
project?
It
was
a
part
of
the
merch
number,
the
people
that
approved
it,
its
branch,
its
path
when
it
was
merged-
and
this
was
a
really
great
mvc
for
getting
it
out
there
and
getting
some
initial
feedback.
It
kind
of
highlights
some
of
the
most
important
aspects
of
the
merge
request,
but
from
a
compliance
perspective.
So
there
are
certain
aspects
that
you'd
want
to
look
into
depending
on
the
regulations
you
might
need
to
abide
by
or
just
for
general
best
practices.
If we look at this job, we want to enable Cameron to manage the compliance controls of their applications and confirm they meet their policies, so they can avoid problems when they come to an audit. There are more tasks associated with that; it's not just getting ready for audits, there's a lot more that goes into it. But generally speaking, that's what they're trying to be prepared for.

Over time, we evolved this idea to the point where we felt we had a pretty good picture of what we wanted to build. What I love about this is that it's a great example of not necessarily landing on the right solution first, or at least I didn't as a designer, and we definitely learned that through our iteration process. So now I'm going to walk you through a little bit of the designs we mocked up and where we got to before we started testing.
A
So
again,
this
is
what
the
existing
dashboard
looks
like.
Some
of
our
initial
ideas
were
like
exposing
the
different
environments
in
which
merge
requests
are
released
to
so
maybe
you
have
a
production
environment,
a
staging,
a
canary
different
places
that
you
progress
through
a
system.
So
this
is
the
way
you
could
tab
out
that
data.
It's only showing you the most recent merge request, so that's problem number one that needs to be resolved. With that, we were trying to expose more merge request data, show richer information around it, and create an insights drawer where you could dig in and find all the different details.
Everybody on our side was really excited about that, so we continued to run with it and refine it some more. You'll notice that I'm working with high-fidelity data, as opposed to low-fidelity mockups. That's partly because what we were most concerned about was how we were going to present this data in a way that makes sense. Breaking it down into low-fidelity mocks, where we assumed information would land, was a little bit harder to envision, because we rely pretty heavily on a very synthesized view of components, like an icon and a badge, to relay a lot of information. High fidelity helped us quickly validate or invalidate some of those ideas.
Again, we kept adding to this drawer and it became quite large, something that I heard from a few users. They didn't scroll, and they didn't know they needed to, so that was pretty insightful. I think sometimes users forget that they can scroll, or they don't know they can, and so they wouldn't necessarily know to scroll down and find the data that was below the fold.
We'll talk more about the insights we learned in a moment, but this was essentially the prototype we brought into some moderated research sessions. We asked the different participants to perform a few tasks. Essentially we were saying: hey, you're interested in this merge request called "Fix Gitaly Ruby not starting a test." Tell me about it. What compliance-related problems do you see with it? And they did what I would have expected them to: they clicked on it.
We had about six people; one we didn't include in the findings because they really weren't a good fit for the study overall. Based on this research, we got confirmation that we were moving in the right direction, but that we needed to make some changes to stay on it. Some pretty interesting insights I found: everybody said we have to change the fact that we're only showing one merge request.
They need to see more than that, and they really want to see exactly what the problems are with those merge requests. Today, if you're looking at the Compliance Dashboard, it will tell you that you have one rule that's not adhering to separation of duties, and you're left thinking, okay, what does that mean? You'd then have to go dig into the docs, where we break down what this means.
Now, jumping back over to the report, the other thing we saw was that yes, the participants were going to find information in the merge request, but they didn't understand what was most important. Everybody was emphasizing: I want to know what is most important next. They wanted to see a priority of what they needed to take action on. So what we did was document the different changes we wanted to introduce into the UI, and also document a few things we thought would be really great future considerations.
Like I mentioned, initially we had all these ideas for the insights we wanted to introduce. What we really discovered was: yes, those are some really great insights we want to add, like the ability to understand how a change affected the code quality.
What's really great about this epic is that I broke down all the different work that went into it, so you can see how the idea progressed over time. Back in October, right after we finished the jobs-to-be-done research, we started redefining how we wanted to shape the experience for the Compliance Dashboard.
The biggest thing I wanted to emphasize with the Compliance Dashboard is: what are the violations, and how can we make them better? So let's jump back into the designs and walk through a few more ideas we had. Initially I thought, all right, let's summarize them. Say each of these has a badge next to it showing a counter; that tells you how many violations each one has, and then you can open the drawer for the details.
So then I thought, all right, what if we specified a level of severity? I was really trying to keep the current data that we had in the existing dashboard and just reshuffle some things around, still tucking those violations into the drawer, which I think made it a little bit clearer. It definitely synthesized the table down a bit, but it still wasn't obvious, just from looking at the report, what the individual problems were.
And so then I had this other idea: what if we did the exact opposite and brought all the violations into the main report, broken down by merge request? A merge request shows up multiple times, but with individualized violations. So there's a violation for a merge request being approved by its author, and there's also a violation for it being approved by its committers. That's great, because now we're individualizing some of the results.
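To make the per-merge-request violations idea concrete, here is a minimal sketch of how rules like these could be evaluated so that one merge request yields one entry per violation. The field names and rule set are hypothetical illustrations, not GitLab's actual implementation.

```python
# Hypothetical separation-of-duties checks over a simple merge-request
# record; each broken rule produces its own violation entry.

def violations(mr):
    """Return one violation name per rule the merge request breaks."""
    found = []
    if mr["author"] in mr["approvers"]:
        found.append("approved_by_author")
    if set(mr["committers"]) & set(mr["approvers"]):
        found.append("approved_by_committer")
    if len(mr["approvers"]) < 2:
        found.append("not_enough_approvers")
    return found

mr = {
    "author": "alice",
    "committers": ["alice", "bob"],
    "approvers": ["bob"],
}
print(violations(mr))  # ['approved_by_committer', 'not_enough_approvers']
```

Listing each violation separately, rather than a single "rule not adhered to" flag, is what lets the report show exactly which problem to chase down.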
This is the idea we ultimately ran with, and what I did then was pull it back into the team. After we refined some of the designs, we got feedback from fellow designers and input from frontend development, and from their fresh take they were able to give some great opinions on what made sense and what didn't. A few things stood out, like the details button.
They weren't really clear on whether it related to the violation or to the merge request, and they also weren't super clear on what a little alert icon meant, which I thought was totally valid; it's not exactly obvious. So we took that out and changed the button, and now we have a much more reasonably scoped vision for what we want to do, and we have validated with our users that it's going to be an improvement.
Something we documented in our report (I'd have to find exactly where I referenced it) was that everybody said the change we were proposing was much more useful than what we have in the current Compliance Dashboard. I feel like it's now a little bit easier to meet that goal: we were only showing the most recent merge request, and that was probably the biggest pain point. So I took that design I was showing you here.
This is more of the interactive prototype of how it would show up. I took this report and started to scope it into an epic, which I broke down into a number of issues.
I believe usage is quite low right now, and so my hope is that we'll see more users using it, and that we can advocate for its usage more. Now that we're actually giving some sort of opinion on what we feel the severity of different violations is, and putting them into a granular view, we'll be able to better help Cameron (going back to that original job) understand whether they're meeting the requirements of their policies. Now, our definition of what a high-severity item is might not matter to Cameron.
We'll need to figure that out, but this will at least give them a starting point to go chase down things that might become a problem later on. We feel like things such as code coverage dropping below one percent are probably not a big deal, and pipelines that pass with a warning are just informational, but we see things like not having enough approvers, or a poor separation of duties, as very high severity: something you definitely want to go resolve.
I'm interested in finding out how Cameron wants to interact with these components of data. Do they want to mark them as resolved? Do they want to pin them? Do they want to be able to create a specific report that they can export to other people? Currently they have the ability to get some of the merge commits, but we know we'll probably need to evolve that in the future.
Overall, I just wanted to show the progression of where we came from and how we evolved. This kind of started back in October, and you can see how every week or two we had some little milestone along the way. We spent a good bit of time shaping up the research, and then ultimately brought together this idea that we're going to scope down and plan to prioritize in a future milestone. So be on the lookout for some good changes coming to the compliance area.