From YouTube: License Compliance pip Live Demo
B: My name is Moe. I'm a software engineer on the compliance analysis team — let's get started. The first step is to set up and prove the air gap. Before I jump into the hosts, I'll just give you a quick overview, as a refresher for everyone.
In this particular scenario we've got three hosts. One is the self-managed GitLab instance, which is this one here. We've also got a Python package registry — this isn't a full mirror of PyPI, but a partial mirror. I'll just quickly jump over to that so you can see.
B: We've got a subset of a dozen or so packages hosted on this PyPI instance, hosted within our offline network. And the third host is the bastion. This isn't as important for us, but it's important to know that this is the instance we used for side-loading the Docker images that are hosted on GitLab.com into the self-managed instance — so it's more like a proxy to get those images to the right location.
B: So those are the three hosts we're going to work with today. The next thing I just want to quickly show you is the firewall rules. All this information you can find right on the Google Compute instance dashboard; I just screen-capped the important pieces. These are the inbound rules for the GitLab test instance: we can see that we've denied everything by default, and then we've allowed access to SSH, RDP, and HTTPS, as well as Python. So those are the inbound firewall rules, and then the outbound:
B: This is a new rule that we've added, which is to allow outbound to anything within the local private network — so that will be everything in the 10.*.*.* IP range — and after that, everything else is blocked. That covers all outbound calls to the public internet. So this is the firewall setup.
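The egress policy described here — allow anything in the private 10.*.*.* range, deny everything else — can be sketched with Python's standard ipaddress module (treating the range as 10.0.0.0/8 is an assumption based on the "10.*.*.*" wording):

```python
import ipaddress

# Private range allowed by the outbound firewall rule (assumed /8).
ALLOWED_EGRESS = ipaddress.ip_network("10.0.0.0/8")

def egress_allowed(ip: str) -> bool:
    """Return True if the firewall would permit an outbound call to `ip`."""
    return ipaddress.ip_address(ip) in ALLOWED_EGRESS

print(egress_allowed("10.128.0.7"))     # an internal-network address
print(egress_allowed("172.65.251.78"))  # a public address: blocked
```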
So now what we'll just quickly do is jump into a terminal, to prove that we are in a somewhat offline environment.
B: This is the actual test instance that I described above. I'm just going to open up two panels: on the right-hand side is my host machine, from where I'm sitting at home, and on the left-hand side is that test instance. I'm just going to do a quick curl from my home machine to gitlab.com, and you should be able to see what the output looks like in a scenario where we're actually able to make outbound requests.
B: In this particular case, you can see that it was able to resolve the IP address for gitlab.com. We sent an HTTP GET request to the IP address that we resolved, and in this case it's telling us to redirect to HTTPS, because we want to do things over the TLS layer. Alright — I'm going to run the same thing, this time on the self-managed instance, and we'll just go to http again…
B: …gitlab.com — and you can see that we were able to resolve the IP address, but we're not actually able to make the outbound request to gitlab.com. To do a comprehensive proof, I think we'd have to brute-force all the IP ranges in the IPv4 space and all the ports, but for the most part this is a very high-level proof that we've disabled outbound access from the test instance. Let me just jump back to the scorecard to make sure I'm not missing anything.
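The two curl checks above — DNS resolves on both hosts, but the TCP connection only succeeds from the connected one — can be reproduced with a small probe; the hostnames and ports used here are illustrative:

```python
import socket

def probe(host: str, port: int, timeout: float = 3.0) -> str:
    """Classify outbound connectivity the way the curl demo does:
    DNS failure, connect failure/timeout, or success."""
    try:
        addr = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)[0][4]
    except socket.gaierror:
        return "dns-failed"
    try:
        with socket.create_connection(addr[:2], timeout=timeout):
            return "connected"
    except OSError:
        return "connect-failed"

# On the air-gapped instance, gitlab.com resolves but the connection is
# dropped by the firewall, so probe("gitlab.com", 443) would report
# "connect-failed" there.
print(probe("localhost", 9))             # discard port: nothing listening
print(probe("no-such-host.invalid", 80))  # reserved TLD: DNS fails
```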
B: So step two is the setup and config: follow user documentation in order to download resources, and follow user documentation in order to set up and configure License Compliance to run for default pipelines. I don't think we're going to get a good score on this one, because we don't really have any documentation for this directly. The one thing that we've added support for is the index URL. Now, this documentation is currently on the Dependency Scanning page; we haven't been able to get it added to the License Compliance page as well.
B: So that is something I think we still need to finish. If we look at the License Compliance page, the available-variables section hasn't been updated yet. PIP_INDEX_URL is the variable that we're taking advantage of — we're mirroring the same variable that's being used in Dependency Scanning, just to keep consistency between the two — but I don't have any documentation to walk through.
B: So what I'll do is show you the test project. This demo is broken up into two sections — for Python there's pip and Pipenv — and I'm going to go through the pip example first, with pip as the package manager. I've broken this down into two merge requests. The first merge request is an example pet project. So in this example — oh, I should just backtrack one sec; I didn't show you the configuration.
B: We don't have a different DNS entry for it, which is why it's hooked directly into port 4567. We could add some rules to the nginx configuration to do some sort of reverse proxy, but for the purposes of this demo I think that's just overkill. So we're piggybacking on a container registry repo that is hosted on this self-managed GitLab instance, and the prep work was mirroring the GitLab.com version of that image into the self-managed environment.
B: Alright, so that's the default setup — let's jump into the pip project example. In this project we've added a single requirements.txt file, which is how you describe your dependencies in typical pip projects, and we've added one dependency: pipdeptree. We're going to try to source this dependency from the index URL that we're specifying as a variable in the configuration. This is the key variable that specifies where to go to pull these packages, so that we can analyze the metadata to figure out what the software licenses are.
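A minimal .gitlab-ci.yml for the setup being described might look like the following sketch; the template include matches GitLab's License Scanning template, but the mirror URL is a placeholder for the internal PyPI host:

```yaml
include:
  - template: Security/License-Scanning.gitlab-ci.yml

variables:
  # Point the analyzer's pip at the internal partial PyPI mirror
  # (hostname and port are illustrative).
  PIP_INDEX_URL: "https://pypi.internal.example:8080/simple"
```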
B: Now, the one workaround that we have here — which ideally we'd like to get rid of — is that we've added a custom configuration file just to add this host as a trusted host. Without this, at the moment, the package pull will fail, because we haven't provided an interface for specifying a root certificate to trust. So as soon as it connects to the HTTPS endpoint, it'll receive the x.509 certificate and say: this is self-signed; it's not signed by anyone that I trust.
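The trusted-host workaround being described is typically a pip configuration file along these lines (a sketch — the host name is an assumption):

```ini
# pip.conf — tell pip to skip TLS verification for the internal mirror
[global]
index-url = https://pypi.internal.example:8080/simple
trusted-host = pypi.internal.example
```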
B: That's great news. Alright — so this is the setup, and here's what I'm going to do. I've already run a pipeline, so I'm going to show you the results of that pipeline; then I'll add a new dependency to this requirements.txt, push that up, and we'll see the changes in the pipeline after that. So let's just review the latest pipeline. We can ignore this top one — that was me; I made a mistake in the configuration, a syntax error in the YAML.
B: That happened an hour ago, so the most recent pipeline is actually from 16 minutes ago. Let's take a quick peek at that pipeline, and we can see the license scanning job was successful. We can get down to the gory details — we can see the job output to see where it sources data from — but the most interesting part, I think, is that we were able to source the pipdeptree dependency, as well as identify its license based on the metadata that we sourced from downloading it.
B: Let's just take a quick peek back at what dependencies are available to me. If I look at the PyPI index, you can see these are the packages available to me. So if I choose a package that's not on this list, it should actually fail, because it won't be able to resolve it. So let's just make up a package, submit it, and watch it fail. I'm going to make this up — what's not on there? mocha is not on there.
B: We're going to push this back up to origin and trigger a pipeline. Now, going back to the merge request — let's take a peek at the pip project. We should have triggered a new pipeline from that commit. There are a couple of things that I'd like to improve on; I'm just going to call them out right now — I don't think they're blockers — but the scan is a little bit slow.
B: One of the steps at the beginning of the scan is that it attempts to determine the toolchain associated with the project. For example, if it's a Ruby project that depends on a specific version of Ruby — let's say an older version, 2.4, etc. — the image will then attempt to install the toolchain associated with that project, and you'll see that at the very beginning of this run here.
B: This is where it's actually trying to reach out and ask what the latest versions of those tools are. This is something that we don't need in an air-gapped environment, I'm assuming — unless customers want to control where they can pull those tools from and install them. So this is something I'd like to improve on; you'll just see it as a slow run.
C: Right now your setup config is using the local setup — like, it's local-only, yeah. So step two is that you set up using the YAML and everything. I think we might be in a state where we can assess step two, which is just that you have finished setting up the license scanner to be able to run in the air gap — and right now you're using the local Docker image for the analyzer, I think, because that's the current one on your main branch, right? Correct.
C: We can do a little job on that. And if you actually wanted to talk a little bit about it — you were saying that some of the length in this is that it's looking to update its toolchain? (Yep.) So if the customer is running their own local registries, could that be something we could let them point at, and other resources, or not really?
B: Let's just take a quick peek. So I added a package called mocha, and it says "could not find a version that satisfies mocha", because it doesn't exist on our internal registry. So we did the best that we could: because we weren't able to pull it from the registry that was provided — which is here — we continued to scan as much as we could. Now, the gory details are here because I increased the verbosity of the log level, but you can see we're still able to complete the scan.
B
It's
just
it's
not
a
complete
scan
because
we
couldn't
source
these
packages.
So
this
is
where
I'm
I'm
not
sure
if
this
should
be
a
failure
for
the
job.
If
we
need
to
surface
that
because
it
is
an
incomplete
scan
in
the
sense
that
we
were
able
to
scan
as
much
as
we
could,
but
this
project
has
a
package
that
isn't
actually
available
from
the
index
that
was
provided
to
us.
So
if
I
go
back
so.
C: Right now, if you look at connected environments, I believe we don't fail — we pass, with the note that we were unable to gather additional data points. And due to the fact that licenses tend to be a compliance and policy item, not a security item, we have defaulted to not alarming the user, per se. So that is consistent with our connected and offline environments.
B: So this is consistent. You can see we identified zero licenses, because we weren't able to complete the package installation step — the package installer said it couldn't finish because it couldn't find one of the packages — and therefore it was a failure, but it surfaces as a check mark, which is consistent with our current way of doing things.
B
Okay,
so
now,
let's
go
get
something
happier:
let's
go
actually
install
a
package
that
is
available
in
our
registry,
so
I'm
going
to
pick
on
Bato
three,
specifically
because
it's
got
a
few
other
dependencies,
we'll
be
able
to
see
that
we
detected
the
licenses
for
that
package
as
well
as
its
dependencies.
So
let's
go
update.
This
we're
gonna
go,
add
bottle
three
apologize
for
the
speed.
B
That's
my
agent
and
let's
kick
off
another
another
run
now
just
to
respect
your
time.
This
is
gonna,
take
a
little
while
for
that
pipeline
to
kick
off
so
I'd
like
to
come
back
to
that
pipeline
and
maybe
just
take
a
quick
peek
at
pip
and
while
that's
running
does
that
work
for
everyone?
Alright,
let's
look
at
our
pip
end.
Project
I
just
want
to
make
sure
the
pipeline
did
trigger
and
that's
it
will
running
while
we're
talking
about
PIP
ends,
and
it
is
running
perfect.
Okay.
B
So
in
this
particular
example,
we're
still
piggybacking
on
the
docker
image
that
is
hosted
on
the
self-managed
instance
container
registry,
so
that
hasn't
changed
that
we're
also
adding
pip
index
URLs.
So
this
is
a
location
where
we're
going
to
source
the
actual
package
is
defined
in
the
file
dot
lock
in
the
pip
file
itself.
It's
important
to
describe
the
source
as
well
like
the
default
source
is
org,
so
you
know
as
a
consumer
as
a
sort
of
project
owner,
it's
important
to
include
the
right
sources
in
there.
B
Otherwise
this
is
where
issues
could
arise,
and
the
one
last
thing
to
note
is
that
I'm
also
disabling
the
verify
SSL
step,
and
this
is
the
circle
back
on
the
issue
that
when
we
get
the
x.509
search
for
this
particular
DNS
entry,
it's
going
to
be
signed
by
its
own
by
itself,
and
so
we
don't
have
that
certificate
in
our
trusted
root
store
at
this
point.
So
this
is
just
again
that
workaround,
but
I
think
we
said
this
is
okay
for
MVC
and
then
we
generate
the
lock
file.
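A Pipfile along the lines being described — an internal source plus the verify_ssl workaround — would look roughly like this sketch (the host and package pins are illustrative):

```toml
[[source]]
name = "internal-mirror"
url = "https://pypi.internal.example:8080/simple"
# Workaround for the self-signed certificate; to be removed once
# trusted-CA-bundle support lands.
verify_ssl = false

[packages]
pipdeptree = "*"
boto3 = "*"
```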
B
So
if
we
look
at
a
recent
scan
this
and
the
job,
we
can
see
that
we
were
able
to
identify
the
PIP
Deb
tree
dependency
as
well
as
that
it's
associated
with
a
MIT
software
license.
If
we
jump
into
the
job
output
will
see
the
gory
details
of
where
it's
actually
pulling
it
from.
So
it's
just
specifying
the
mirror
as
the
internal
registry
and
everything
else
is
happening
within
the
context
of
the
network.
This
particular
case
you
can
see
it's
I'm,
not
ready
to
show
you
that.
B: And I don't know if it helps to level-set: pip is the default package manager for the Python ecosystem, and Pipenv seems to be the direction that the Python community is moving towards. There is the Pipfile standard — the RFC for it is intended to be moved back into pip, but right now the reference implementation is being done in Pipenv.
B
So
we
will
start
to
see
I
think
I,
don't
know
what
the
timeline
will
be,
that
these
pip
files
will
actually
merge
back
into
pip,
and
then
this
will
be
sort
of
the
same
thing.
The
difference
between
Pip
and
VIN
pip
is
that
pip
end
is
the
mixture
of
PIP
plus
a
virtual
M,
which
creates
like
a
virtual
environment
and
you're
too
much
of
them?
Oh
ok!
So,
let's
go
back
to
so
here.
What
I've
done
is
I'm,
adding
a
new
dependency
to
this
pin
file
project.
So
I
had
this
as
a
stash.
B
You
didn't
have
to
see
this
live
I
just
popped,
the
stash
and
I'm
adding
a
Botto
3
dependency.
So
we
already
know
that
Bato
3
cut
is
available
on
our
internal
Python
registry
and
then
from
that
I
had
to
update
the
lock
file,
which
pulls
down
the
actual
signatures
for
this
packages
to
make
sure
that
nobody's
tampering
with
them
and
that
they
match
so
I'll
commit
this
and
will
trigger
a
pipeline.
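The lock-file check mentioned here is essentially a content-hash comparison: Pipfile.lock records a sha256 digest per artifact, and the installer re-hashes what it downloads. A minimal sketch of that verification (the recorded digest is computed inline rather than taken from a real lock file):

```python
import hashlib

def sha256_digest(artifact: bytes) -> str:
    # Pipfile.lock stores hashes in the form "sha256:<hex digest>".
    return "sha256:" + hashlib.sha256(artifact).hexdigest()

def verify(artifact: bytes, locked_hash: str) -> bool:
    """Reject a downloaded artifact if it doesn't match the locked digest."""
    return sha256_digest(artifact) == locked_hash

package = b"fake wheel contents"
locked = sha256_digest(package)        # what the lock file would record
print(verify(package, locked))         # untampered artifact
print(verify(package + b"!", locked))  # tampered artifact
```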
B
And
I'm,
sorry
for
the
context.
Switching
this
pipeline
is
going
to
take
a
little
bit
of
time,
so
I'm
going
to
circle
back
to
the
pit
pipeline.
Unless
there's
any
questions,
ok,
perfect!
So
let's
just
double
check
that
this
pipeline
is
fired
and
running
and
we
can
see
the
file
pipeline
is
going
ok.
So
let's
go
back
and
check
on
the
previous
pip
pipeline,
so
we're
looking
at
the
pit
project
again
and
just
as
a
refresher.
B
What
we
did
last
time
was
we
added
a
new
dependency,
Bato
3
and
Bato
3
has
other
dependencies
and
those
dependencies
are
all
available
from
the
internal
Python
registry.
So
it's
look
at
the
pipeline.
Oh
well!
Well,
it
passed
and
it's
saying
it's
passed
and
the
job
and
we've
discovered
a
few
more
licenses.
So
in
this
particular
case
the
previous
of
pipeline,
we
saw
that
we
were
only.
We
only
had
one
package
and
that
was
the
DEP
tree.
B
This
time
we've
got
a
few
more
licenses
detected,
we've
still
got
pip
DEP
tree
in
there
now,
we've
also
got
bought
it
three
and
it's
dependencies
and
it's
dependencies
of
its
dependencies.
So
all
these
packages
were
sourced
from
the
internal
Python
package
mirror
and
we'll
just
double
check
the
job
output
to
make
sure
that
we're
not
calling
out
to
anywhere
else.
So
we
were
able
to
where's
my
okay,
so
our
pip
install
went
to
that
registry
installed.
Install
install
detected
the
new
licenses
from
local
metadata
and.
B
I
think
what
am
I
missing
here,
I
think
that's,
okay,
so
for
the
most
part
just
a
full
circle
back
we
had
a
requirements.
Txt
file,
we
added
one
dependency.
We
were
able
to
download
that
dependency
from
the
internal
mirror.
We
were
able
to
detect
the
software
licenses
from
the
local
metadata
of
that
mirror.
There's
still
some
room
for
improvement,
because
we're
still
we
haven't
addressed
things
like
this,
so
these
are,
like
things,
I
think
in
the
backlog
for
being
able
to
parse
the
actual
expressions
better.
C: …with it successfully running, we can probably grade step three, and then, when it finishes, we can show the results of it for step four. So, looking back: we were supposed to make the code changes — we saw all the code changes, had them in an MR, and had it run — then we saw the jobs running. I don't have any concerns about step three; does anyone else have any? Yeah.
D: And this is why I'm asking for the details, because it's hard for me to help here — there's nothing for me to compare against: what we're doing and what the expected results are. I mean, more granular detail, I think, is for if we're taking it a step further, but we need to have it written down so that everybody in the demo can chime in and then grade together. So yeah — I think we made progress.
B
Sure
all
right
so
back
to
the
end
of
the
story.
This
was
the
end
pipeline
and
we
could
see
that
we
also
identified
the
new
package
dependencies
and
the
licenses
associated
with
those
package
dependencies.
So
this
is
actually
functioning
for
both
Pip
and
commend
okay.
So
the
last
step,
I
guess,
is
to
actually
merge
one
of
the
pipelines.
So
now
there's
going
to
be
a
conflict
if
I
merge
one,
because
I
won't
be
able
to
merge
the
other,
so
I
have
to
pick
one
and
I'll,
let
the
audience
decide.
They
prefer
pip
or
bit
Bend.
B: …endpoints can be respected, so we're currently working on trying to make that work in license scanning as well. I'm going to just run another change, and let's see what happens. If we go to the CI/CD configuration, we've added a variable, ADDITIONAL_CA_CERT_BUNDLE, and the actual value itself is the x.509 certificate for the — sorry — the container registry, which is this guy here. So if we take that endpoint, we can actually see the certificate associated with it.
B: There we go — okay. So we can see the certificate details: the subject and the issuer are the same; it's a self-signed cert. This is for the actual PyPI instance that we're running, and this is the actual x.509 certificate, encoded as base64. So I took this — literally just copied it — and pasted it into that environment variable there. This would be the interface for customers to install their trusted certificates. At this point it's only adding; it's not replacing.
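Producing the value pasted into the variable is just base64-encoding the PEM certificate, per the encoding described in the demo; a sketch, where the certificate bytes stand in for a real PEM file:

```python
import base64

# Stand-in for the contents of the registry's self-signed PEM certificate.
pem = b"-----BEGIN CERTIFICATE-----\nMIIB...\n-----END CERTIFICATE-----\n"

# Value that would be pasted into the ADDITIONAL_CA_CERT_BUNDLE
# CI/CD variable.
encoded = base64.b64encode(pem).decode("ascii")
print(encoded[:40])

# The scan job can recover the original PEM before appending it to the
# trusted root store.
assert base64.b64decode(encoded) == pem
```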
B
So
speaking
with
the
customer
this
morning
they
said
they
were
fine
with
using
the
trusted
certificates
that
were
available
with
the
system
as
well
as
adding
their
own.
In
some
cases,
I
think
like
in
more
hardcore
organizations,
they're
going
to
say
no,
we
don't
care
about
the
trusted.
Certs
I
come
with
the
OS.
We
have
our
PEM
file
that
we
trust
and
we
don't
care
about
what
else
the
OS
trusts
and
so
I'm
not
handling
that
scenario.
For
now,
it's
just
adding
it.
B
So
if
you
have
this
environment
variable,
the
pipeline
will
attempt
to
install
that
certificate
into
the
routes
the
trusted
store
and
let's
and
I
haven't
actually
tested
this
before
this.
So
we're
gonna,
we're
gonna,
have
some
fun
here
and
open
the
IDE
and
I'm
sorry
if
this
isn't
fun
to
you,
but
it's
so
fun
to
me.
Let's
go
down
to
pip
file
and
let's
just
get
rid
of
this
I'm
going
to
make
that
true,
so
no
more
hacking
or
overwriting.
B: I need to move the webcam stuff out of the way. We're going to say true — so now we're telling the scanner to please respect the TLS verification step: you're going to connect to this endpoint, you're going to get an x.509 certificate from that endpoint, and I want you to check whether you trust it, by first verifying the signature on it to see whether that signature is from someone you trust in our root certificate store. We're going to commit this directly to the Pipenv branch.
C
Believe
it's
gonna
be
a
complicated
answer
of
the
same
thing
as
the
vulnerability
interactions,
which
is
if
the
computer
you're
accessing
the
limited
connectivity
environment
with
which,
in
this
case,
most
computer,
my
host
has
has
connectivity
than
the
links
will
work.
Otherwise
they
are
expected
to
not
work.
Okay,.
B: So the piece that I was showing you last — installing the root certificate authority — that part is not in production yet; it's still in development, and it's likely going to have issues. Everything else before that is in production. I haven't been able to identify any defects, and we're waiting on open issues — I'm happy to be told there are bugs, but in my own testing, and in our team's testing, I haven't identified any issues yet.
B: We detected licenses — sorry, I'm happy — so we're on our way to being able to install a trusted certificate from our customers into the cert store. And since I've got you, here's what's actually happening: we detected that environment variable I showed earlier — the ADDITIONAL_CA_CERT_BUNDLE — we identified that it was provided in the environment, so during scan time we took that file…
B
We
dropped
it
into
a
location
where
it
can
then
be
appended
to
the
default
certificate
chains
that
ships
with
the
debian
10
instance
and
we're
just
calling
update
CA
certificates
and
here's
the
gory
detail.
So
we
can
actually
see
every
certificate
every
root
certificate,
that's
being
trusted
on
this
instance
at
the
very
bottom.
We
should
see
our
custom
one
in
our,
maybe
not
the
bottom.
Where
is
it?
Okay?
So
we
added
one
new
one
and
zero
were
removed
and
if
we
scroll
a
little
bit
further
where's
the
details.
B: …been working on this, because my understanding of how root certificate authorities work was what was described this morning, so that's the direction I had been going, as sort of a side process. What I was hearing was conflicting information, so hearing from the customer today was fantastic, because it verified my understanding of how they wanted to deliver it: being able to give us the full chain allows us to actually do this. Otherwise I'm left asking — if I don't have the full chain, how do I get the intermediate certificates?
B
It's
there's
no
URL
in
the
x.509.
That
tells
me
where
the
intermediate
certificates
are
I've
got
the
CRL.
I've
got
the
OCSP
URL,
there's
nothing
else
so
and
like
for
when
they
said.
No,
we
were
going
to
have
the
full
bundles
like
fantastic,
and
the
only
concern
left
was.
Are
we
going
to
add
that
too
store?
C: Yeah — when you watch the video, the two things we wanted you to pay critical attention to were, first, the fact that we were doing our own Python mirror that was not hosted publicly — so, last time, the repo that we used was the GitLab-internal repo on the instance, not the public GitLab repo — so if you could just check that out; and second, obviously, we don't have public user documentation yet, and that's a failing, but we knew that, and we…