From YouTube: Brown Bag on Split build/scan PoC
A: Okay, welcome to this brown bag session about splitting the build and scan job. The context is that, right now, the scanning jobs (the SAST jobs and the dependency scanning jobs) do the build prior to the scan, and that's what this session is about: extracting the build out of the scan.

Okay, I'm going to share my slide deck and also some projects, Git projects.

A: First of all, what are we trying to solve? Every now and then, SAST jobs and even dependency scanning jobs fail because a dependency required for the build of the project, prior to the scan, is not installed: it's missing. That can be a particular system library, a specific version of such-and-such, like a particular version of the interpreter (Ruby, Maven, Java), or particular language-specific libraries. I'm not going to go through the list; the list is pretty long.

A: Just a reminder: right now we ship the SAST and dependency scanning analyzers, actually all the Secure analyzers and scanners, as Docker images. That means we've got to choose one particular Linux distribution and one particular version; in the case of Debian, we have to choose one particular version, one flavor of Debian, and that may not match what users need to build the project prior to the scan. We've got this before_script in the job definition.

A: That's something users can customize, but that's not enough. I also wanted to say that we keep introducing CI variables to make the scanning environment more flexible. For instance, what comes to mind is dependency scanning for Python: we've introduced variables to choose the version of Python and the version of pip, which is the package manager for Python. The list goes on, and it's never enough.

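For example, a user-side configuration along these lines is what is being referred to for the Python analyzer; this is a sketch, so check the current documentation for the exact variable names and accepted values:

```yaml
include:
  - template: Dependency-Scanning.gitlab-ci.yml

variables:
  DS_PYTHON_VERSION: "3"     # interpreter version used for the scan
  DS_PIP_VERSION: "19.1"     # pip version installed before scanning
```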
A: Whatever solution we come up with, it has to be Auto DevOps compatible; it has to be air-gap friendly, meaning that the scanning jobs should work behind a firewall, in the so-called offline environment; and of course it has to be, as much as possible, backward compatible with the existing job definitions of SAST and dependency scanning. So far, for the proof of concept I've worked on, I've chosen bundler-audit, or rather the bundler-audit analyzer, which does dependency scanning for Bundler projects.

A: Why is that? Because it has a limited number of dependencies compared to, let's say, other analyzers, and it's already air-gap compatible, meaning we already have CI variables to turn off the auto-update of the vulnerability database. Anyway, it was a good analyzer to play with. I've also chosen fpm as the tool to build the Debian packages, just because it's easy to use, really easy to use. And I think, oh yeah, one last slide, I guess, before the demonstration.

A: How does it work? In the scanning job, what changes is the script of the job. Before the change, the script only ran the analyzer: it was just one line running the analyzer, and that's it. In this proof of concept, though, we've got two lines: of course we run the analyzer, but before that we install the analyzer if it's missing.

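As a sketch, assuming a hypothetical analyzer binary name and package URL (the actual PoC code may differ), the job script goes from one line to two:

```yaml
dependency_scanning:
  script:
    # New line: fetch and install the analyzer's Debian package,
    # but only when the analyzer is not already present in the image.
    - command -v analyzer || (wget -O /tmp/analyzer.deb "$ANALYZER_DEB_URL" && dpkg -i /tmp/analyzer.deb)
    # Unchanged line: run the analyzer itself.
    - analyzer run
```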
A: So this is what changes in the job definition on the user side. In the analyzer project, in our case bundler-audit, we've got new jobs to build the Debian packages, so that's a new job, actually a new stage, being added to the pipeline. And the job that builds the image changes too: instead of having a long Dockerfile that creates the Docker image from scratch, we reuse the Debian package to build the Docker image. But we'll see all that later on anyway.

A: And what about the dependencies? A dependency scanning analyzer like the bundler-audit analyzer comes with bundler-audit itself, which is a Ruby gem, and its CLI. bundler-audit comes with its own dependencies, Ruby gems like Bundler, but that's not the only thing: the C extensions of Ruby gems also have to be compiled for the target environment, and it comes with a vulnerability database. So, just for dependency scanning, we need all these dependencies for the analyzer to run; we'll go back to that.

A: The way it works in this proof of concept is that the Debian package installs what's missing in its before-install script; we'll see that run. That's essentially because it's more flexible, and in a way it's air-gap compatible even though it possibly fetches missing dependencies, because you can always pre-install what the analyzer needs prior to the scan. And I think that's the last slide before the demo. Oh yes, it is. Okay, so.

A: So that's a simple pipeline in the context of a Bundler test project; that's one of the test projects we use for QA, built with Bundler. And that's almost the pipeline we have for the current setup, the setup where we don't have the Docker-in-Docker orchestrator. The pipeline using this proof of concept is the same, it's identical, except that the implementation is different. So we've got the scans and we've got the QA jobs that check the generated reports.

A: You can see that the script changes; I've redefined the script. Again, there are two lines. The first line, which is pretty long actually, tries to detect the analyzer; if it's missing, then it fetches the Debian package and installs it. And then, of course, the next line runs the analyzer.

A: So that's what changes there. If we look at the code, at the merge request for this PoC, we've got to redefine the stages, because there's this new one, the package stage for the Debian packages. And of course we've got new jobs, including one that builds the Debian package here, and actually also an Alpine Linux package. But here's the line that builds the Debian package, and then the package is available.

A: Here the script does nothing, because it tries to detect the analyzer and it's there: no package to fetch, no package to install. Okay. And here are other scenarios where we pre-installed the dependencies of the analyzer that can be pre-installed, Bundler and bundler-audit, all that to prove that if you provide all the dependencies, it doesn't touch anything, and ultimately it's possible to run this behind a firewall, as long as you proxy the package, the Debian package.

A
Okay,
oh
yeah,
Auto,
devups,
Old,
Olive,
apps,
again,
I'll
still
there
really
with
a
better
test
project,
and
it's
that
ones,
feigning
that
that's
just
for
you
to
the
last
job
is
failing.
That's
just
because
I
have
to
update
down
the
expected
report.
It
says
it's
out
of
date,
but
that's
some
a
simplified
version
of
Auto
reverbs,
where
I've
got
the
the
auto
built
and
auto
test
and
in
auto
test.
I
only
had
this
proof
of
concept,
but
the
build
job
is
really
the
one
of
auto
reverbs
and
it
works.
A: It's not something I've explored that much, and we've also got to work on build detection, so that, for instance, in the case of SAST for Java in the context of Auto DevOps, it doesn't try to build the project, the Java Maven project, when it detects that the project has been built already and we already have the class files or the JAR files. And I think that's it; it's just the beginning. So, your questions now.

A: The package is built here. This is a Debian package; it's built from a directory, and that's the directory, a directory containing the binary built by the Go build job. Suffice to say that it's a package that only contains a binary. And that's the name of the package and the version of the package, and it defines the before-install script. The before-install script is executed when installing the package; I repeat, when installing the package, that is, very late in the process.

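With fpm, building such a package is a single command; the names and paths below are illustrative, not the actual PoC values:

```yaml
package-debian:
  stage: package
  script:
    # Build a .deb from a directory that only contains the Go binary,
    # attaching the before-install hook that resolves runtime dependencies.
    - fpm -s dir -t deb -n analyzer -v "$CI_COMMIT_TAG" --before-install before-install.sh analyzer=/usr/bin/analyzer
  artifacts:
    paths:
      - "*.deb"
```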
A: In that case, right before installing the binary of the analyzer, it will run this script. Let's have a look at the script. The script tries to detect Bundler; if Bundler is not installed, it installs Bundler using the gem command, the gem CLI. Here we assume that gem is installed: this is an environment where Ruby is available, and with it the gem CLI. So: if Bundler is not available, install Bundler; if bundler-audit is not available, install bundler-audit.

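The detection logic can be sketched as a small POSIX shell helper; the function name and the commented-out calls are assumptions, not the real script:

```shell
#!/bin/sh
# install_if_missing probes for a command and runs an install command
# only when the probe fails, mirroring the before-install script's logic.
install_if_missing() {
  cmd=$1       # command to probe, e.g. "bundle"
  install=$2   # shell command that installs it, e.g. "gem install bundler"
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd already installed, skipping"
  else
    echo "installing $cmd"
    eval "$install"
  fi
}

# In the spirit of the PoC (hypothetical invocations):
# install_if_missing bundle       "gem install bundler"
# install_if_missing bundle-audit "gem install bundler-audit"
```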
A: But yeah, there's a known location for the Ruby advisory database, I mean the vulnerability database, because it's hard-coded in bundler-audit. By probing that directory, it makes things very flexible: you can clone the vulnerability database using git clone, you can install it as a Debian package, or get it by other means; in the end you get the directory.

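Pre-seeding can be as simple as cloning the database to the default path bundler-audit probes (`~/.local/share/ruby-advisory-db`); this snippet is a sketch of that idea:

```yaml
before_script:
  # With the database already in place, the before-install script
  # detects it and skips the network fetch.
  - git clone https://github.com/rubysec/ruby-advisory-db "$HOME/.local/share/ruby-advisory-db"
```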
A: This one is based on the ruby:2.5 image, on Debian Stretch, and here it runs the typical dpkg command to install the package. While installing the package, it starts with running the before-install script. Here it looks like it can find Bundler, so it doesn't try to install Bundler, but it couldn't find bundler-audit, and this is where it installs bundler-audit.

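The fetch-and-install step can be sketched as a guard function; the URL handling is hypothetical, and the real PoC script may differ:

```shell
#!/bin/sh
# install_deb_if_missing fetches and installs a .deb only when the
# given command is not already available on the PATH.
install_deb_if_missing() {
  cmd=$1   # command the package provides
  url=$2   # where to fetch the .deb from (hypothetical)
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd present, nothing to fetch"
    return 0
  fi
  wget -q -O "/tmp/$cmd.deb" "$url"
  # dpkg runs the package's before-install script at this point.
  dpkg -i "/tmp/$cmd.deb"
}
```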
A: Let me see, all these lines, sorry, are generated by the before-install script; that's the output of the before-install script. In that case the before-install script installed bundler-audit, and this is here. Then it cannot find the Ruby advisory database, so it clones the repo corresponding to the advisory database, and then it runs the scan.

A
So,
in
that
case,
at
runtime,
the
TPN
package,
as
part
of
the
install
a
fetch,
is
more
appearances,
which
is
not
so
convenient
when
running
behind
a
firewall,
but
which
is
very
flexible,
because
if,
for
instance,
the
the
analyzer
depends
on
a
ruby
gem
that
depends
on
C
extensions.
This
is
going
to
work
because
this
extensions,
the
Ruby,
is
essential,
we'll
be
built
from
this
particular
version
of
Ruby
for
this
bachelor
environment.
This
particular
image-
that's
very
flexible.
Now
what?
If?
What?
A: What if we don't want the before-install script to install anything, and we don't want it to fetch anything? In that case (maybe I should close that tab, okay, yeah, there it is) you can pre-install all the dependencies. So that's it, okay. Here it is. It's similar: it's also based on the ruby:2.5 image, which is a Debian image.

A: The auto-update of the vulnerability database is disabled; that's a CI variable that has been introduced as part of the air-gapped epic. And before running the analyzer, before running its script, the before_script installs bundler-audit and updates the database. Because the database has been updated, it's not something we have to do anymore; it's fresh. Okay, and that's the output of the job: that's the before_script, and that is the first line of the script. So this part is user-defined, and that part is predefined.

A: The analyzer CLI here cannot be found, so it fetches the Debian package and installs it, and, as you can see, it doesn't install bundler-audit because it's already there, and it doesn't fetch the vulnerability database because it's already there. Same goes with that one: here, in the before_script, we install the vulnerability database as a Debian package. That's part of the before_script, and thanks to that, the vulnerability database is not fetched and installed when running the analyzer.

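An offline-friendly variant of the job, then, pre-installs everything in the before_script; the package name for the database is hypothetical:

```yaml
dependency_scanning:
  before_script:
    - gem install bundler-audit       # pre-install the scanner CLI
    - dpkg -i ruby-advisory-db.deb    # advisory database from a local or proxied package
```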
C: Questions, yeah, I added some to the doc. The first one is: as we are building a Debian package, it makes us only compatible with Debian-based build environments, which is similar to the current situation. So I'm assuming that what we could do with that architecture is build other executables for other environments, and they would have the same flexibility when it comes to installing the dependencies.

A: It would be like the bundler-audit analyzer being compatible with all versions of bundler-audit. That would be ideal because it would lower the requirements, the expectations. But if we do that, that means that the analyzer project, which is a wrapper really, has to be compatible with all the versions of bundler-audit.

C: If we are providing this Debian package, we are building it with a specific version of the tool and its dependencies, so we're not even sure that it will work in the project environment. Today, with the environment variables, we have the option to make this compatible: you can set up a different version of pip for the Python tool, for example. Given a Debian package, how would you achieve that?

C: Yeah, because what needs to be clarified is that this base image is actually the image from the build job. The idea here is to include the Debian package in the build job of the user's project, instead of having a separate job prior to the scan. So yeah, I understand that, actually: if they are building their project with Python 2 and pip or whatever, it will already be part of this base image. So it's just a matter of ensuring that our tool is compatible with those versions. Yeah.

A: We are turning things upside down. Right now, users have to make their project fit the scanning job, in a way, and I've tried to change that. I mean, if we decide to go that way, the scanning job, the scanning tool, well, the analyzer, would have to fit the build environment of the project.

C: Today, yeah, today we say, you know, you can use Python 3, we're using this particular version, and you can use Python 2, and we have a test for Python 2 to make sure it still works. So in a way we are testing the different environments we are providing, even if it's a limited number of environments. Purely on the product side, what are you expecting of this approach? We are not stating clearly what we support, and we risk facing the same problem.

A: We'd have to rely on the tests to ensure that it's compatible with such-and-such environment, like with such a version of Ruby. I may be wrong, but it's only a matter of creating jobs and documentation; it's not a matter of implementation anymore. The implementation remains the same. It's...

B: A matrix problem, yeah. And we will have the same issue, for example, with the dependencies of the analyzers themselves. When we say it works with Ruby, maybe it's not exactly that; maybe it's Ruby 2.2 to 2.7 or something like that, and for prior versions we'd need a different package, because the way Ruby was packaged or something is different. So yeah, we'd need to maintain a matrix anyway. That's also a problem we have today: we can't ensure that it's going to work with any version of a given tool.

C: If I get the question right, I think this is true at the level of the OS, the operating system: obviously the runners have to be compatible with, for example, Windows or Linux. But after that, when it comes to Ruby, this version of Ruby or this version of Python, this depends on the image itself.

C: I'm still... I mean, I haven't had time to look carefully at the merge request, but I have a hard time seeing the value of this approach: the heavy maintenance cost for us versus the gain in certain environments. Maybe I'm just not seeing it right now because I'm lacking context, but when it comes to this aspect of testing multiple environments and how we support them, I hardly see the value added compared with the current situation. But again, we would cover more scenarios, yeah.

A: We would cover most scenarios, more environments; that's one thing. We would be compatible with Auto DevOps, which I believe is not really the case right now, because the image that is built by auto build cannot be used as a base image for the scan. Oh, it can? That's not what I believed.

B
Currently,
we
only
cross
fingers
that
the
user
using
the
right
version,
but
in
the
future
we
will
be
able
to
detect
the
white
version
and
download
the
right
package
for
that,
and
that's
not
the
case
in
the
skills.
That's
widely
on
the
scope
of
this
POC,
but
I'm
dreaming
that
in
the
future
we
will
be
able
to
detect
and
update,
maybe
the
pipeline
accordingly,
pretty
much
that
we
were
doing
in
the
first
place
with
a
human
could
good
backs
that
they
were
not.
A: Well, technically, a job can upload multiple reports of the same type, and that's already working for JUnit, but we'd have to switch from the legacy report artifacts to the reports syntax. We'd have to change that here, and it's supposed to be released with the new YAML syntax, because right now we can only have one job of one particular type.

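The reports syntax being referred to looks like this for dependency scanning:

```yaml
dependency_scanning:
  artifacts:
    reports:
      dependency_scanning: gl-dependency-scanning-report.json
```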
D: If there's time left, I was just wondering, regarding dependency conflicts, whether it would make sense to install the actual analyzers inside a chrooted environment, which is much more performant than Docker but would still provide some isolation, at least for the analyzer's dependencies. So you could basically chroot into an environment and run the analyzer with the dependencies that we have, without affecting what's already there. Yeah.
