Description
Meeting notes: https://docs.google.com/document/d/1KQalBRzfRBvsqh73JUYfp1KG-AJdXcv2Z8LTIFoQP8c
A
Good morning, good morning.
B
Good morning.
A
Good... good afternoon; but yes, good morning.
A
We added one or two more minutes. We had another couple of guys saying they would join.
A
And again, this architecture picture.
A
It happened over the weekend; I think a couple of guys were off on vacation, and then, yeah, there were just too many malicious packages being published in the week before, so they decided to temporarily, you know, forbid new accounts being created and packages being published. I think it actually only affected entirely new packages, not new package versions of existing packages. Right? Correct.
A
All right. And the third link is to this Risk Explorer, which we will use again when we go over the different threats, or, let's say, the different assets and systems of the architecture.
A
So maybe I can summarize a bit what we have done last week, which I found was a nice and productive meeting. We basically shared the screen before, and so I can show a little bit with the document on the screen. This one here.
A
Okay, all right. So last time we were basically discussing different security threats materializing on the developer machine, on the very left-hand side of this architecture diagram, and we came up with a good number of threats that I have documented here. I just changed a few phrasings here and there, but not very much. And so what we have come up with is basically, it was all around...
A
You know, whether there is a defined tool set to be used by developers of a software organization; whether there is an approval process that is maybe incomplete; whether it covers all the versions; whether it also describes the distribution channels that can be used by developers to obtain their dependencies.
A
One is one that we had even before last week's session, which is about misconfigured package managers that go and get stuff directly from the internet rather than going through private registries. And then the last two are basically different attack vectors that we picked from the right-hand side, from this attack tree: the topmost node, which is basically malicious packages developed from scratch that get malicious code at later points in time, in later versions; and the second one, AV-200, refers to that one here in the middle, which summarizes all the different name confusion attacks, such as typosquatting and so forth.
A
So these are all threats on the developer machine, and then my thought was to maybe go and discuss whether there is anything else missing on the developer machine.
A
On top of this architecture diagram here, we started basically defining the scope: the assets, as well as a good number of data flows that are related to basically open source packages and proprietary code flowing through the architecture and system outlined above.
A
Right. Any comments or input so far?
E
Yeah, just additional possibilities. I think laptops nowadays, at least at the employers I worked for, have a backup in place, backing up to a remote location. So that backup itself is at risk, right? Whatever is on the laptop is on that backup.
A
So what you mean is, because we look at threats related to the consumption of basically malicious open source components, right, do I get it right that if a developer is backing something up, that backup is tampered with, and then he or she restores it on the developer machine, this would be a vector?
E
Yeah, I guess once it's backed up, there are multiple ways to use that backup. One way, as you just said: malicious code or a virus can be added to the backup, and when a developer restores it to the developer machine, the problem comes back to the developer machine. It is also possible that the password files or key files, whatever secrets are stored on the developer machine, could be accessible in the backup.
E
So then that can be used to... I guess, actually, that's still considered part of the supply chain attack.
A
Yeah, this is the question for me, because, I mean, the assumption, or the scope, we said is, for the time being, to exclude insider attacks, which I guess would be the ones related to what you just described. Because, I mean, all those potentially malicious, or those malicious, open source components come in through those reddish data flows.
A
And so on the developer machine, the developer backup would not, you know, be stored outside of that organization, right, and would be unrelated to those data flows. So even though in general I agree that this could be a problem, as long as we exclude insider threats, I'm not sure how this can be a vector for consuming malicious open source.
D
I know what I mean. I think it walks a very debatable line, because I have argued many points of view when it comes to backup, and there are many points of view, because there's also the point of view that the best, the most secure backup that you can have is no backups. I mean, I don't necessarily agree with it, but, I mean, I guess so.
D
I think I agree with Henrik. I think we should focus more on supply chain specifically, because I think backups are an entirely separate topic: integrity of backups, where you keep your backups, the setup of backups, and, you know, there's a bunch of different programs that you could use, even though on Linux we all try to use kind of the same stuff. But I think it distracts from the topic, if you want my personal opinion.
E
I agree. It's probably not directly related to the red line that you have so far. Yeah.
D
Now, let me say this: on the topic of backups, I guess, theoretically, some package managers do allow you to save something like a set, because all... oh, you get where I'm going with that, Henrik.
D
No, no, please go ahead. Because all builds are supposed to be replicable on everything except Gentoo, basically, right? Like, every build is reproducible: every single time you should get this, you should be installing the same thing. So, theoretically, I mean, there could be a way of managing package...
D
Integrity, I guess, is what we would be talking about, and, like, backing up a certain setup. I mean, but it would be more from, like, how do I save this type of situation: I have a setup that I have audited and feel secure about; how would I be able to reproduce this setup on another machine?
D
I guess it depends, you know. It really, like... it really depends. But, I mean, yeah, it was just a thought that I had, as far as: what would you do if you wanted to, like, maintain a certain package set, and you wanted to maintain the integrity of said package set on the developer machine? If that makes sense.
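The idea raised here, auditing a package set once and then reproducing or re-checking it on another machine, can be sketched roughly as follows. This is a minimal illustration using Python's standard `importlib.metadata`; it is not the mechanism of any package manager mentioned in the call, and the function names are made up for the example.

```python
from importlib import metadata


def freeze_package_set() -> dict[str, str]:
    """Capture a name -> version snapshot of every installed distribution."""
    return {dist.metadata["Name"]: dist.version for dist in metadata.distributions()}


def diff_package_sets(audited: dict[str, str], current: dict[str, str]) -> dict[str, tuple]:
    """Report packages added, removed, or changed relative to the audited snapshot.

    Values are (audited_version, current_version); None means the package
    is absent on that side.
    """
    changes = {}
    for name in set(audited) | set(current):
        before, after = audited.get(name), current.get(name)
        if before != after:
            changes[name] = (before, after)
    return changes
```

An audited snapshot could then be stored alongside the project and re-checked on any machine that claims to reproduce the setup.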
A
Yeah, this rings a bell, and I think of an attack. I'm not sure whether this was just a proof of concept or so, but here, basically, malicious packages were able to modify cached packages that were used in other development projects. So basically, as soon as you start caching your downloaded stuff, such...
A
Okay, I just wonder where to put this, then. And also, I wonder, I mean, the fact that such a local cache could be tampered with by another malicious open source component: does this need to be documented as a separate threat or not? Because you...
D
That would be a separate threat, because in order for you to modify the caches of the package manager, if I remember correctly, you would have to physically, like, go in. For example, I don't remember where Gentoo keeps it, but there's a folder, the distfiles is what we call it, distfiles, and basically all the installer files and whatnot that get downloaded by Portage are put in that folder.
D
So if you were to go into somebody's machine and tamper with that file, then the next time somebody were to install, or try to execute the same install for whatever reason, it would not download anything, because...
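The cache-tampering scenario described here could be caught by recording digests of cached files when they are first fetched and re-checking them before reuse. A minimal sketch, assuming a flat directory of downloaded archives and a JSON manifest; the layout is illustrative and is not how Portage actually tracks its distfiles:

```python
import hashlib
import json
from pathlib import Path


def record_cache_digests(cache_dir: str, manifest_path: str) -> None:
    """Write a manifest mapping each cached file name to its SHA-256 digest."""
    digests = {
        f.name: hashlib.sha256(f.read_bytes()).hexdigest()
        for f in sorted(Path(cache_dir).iterdir())
        if f.is_file()
    }
    Path(manifest_path).write_text(json.dumps(digests, indent=2))


def find_tampered_files(cache_dir: str, manifest_path: str) -> list[str]:
    """Return cached files that are missing or no longer match the manifest."""
    digests = json.loads(Path(manifest_path).read_text())
    bad = []
    for name, expected in digests.items():
        f = Path(cache_dir) / name
        if not f.is_file() or hashlib.sha256(f.read_bytes()).hexdigest() != expected:
            bad.append(name)
    return bad
```

The manifest itself would of course need to live somewhere the attacker cannot also rewrite, which is the same trust problem one level up.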
E
They don't come from other, regular... you can say these are kind of classic attacks, such as phishing or physical theft. If someone takes the laptop, and it is secured and encrypted, those are not in scope, right? These are just classic threats rather than supply chain threats. Okay.
A
Yeah, I think we also excluded all kinds of physical-security-related topics: you know, laptops being stolen, not encrypted, and then tampered with. So...
A
Okay, and I'll note down the same thing: that, you know, unencrypted...
D
I would also like to point this out, Henrik, because it's not really a threat, but, I mean, it's a very dangerous ideology. Because we've always said in open source: trust but verify. And I think if you don't do that, that in itself is a threat.
A
Does this relate more to the general mindset that you would expect from a developer?
D
So I think that's where, like, if you're going to be a safe user, no matter what you're doing, you have to know what you're installing. So I think that's where the trust-but-verify ideology comes into play. Because at the end of the day, it doesn't really matter how good of a supply chain system you design if you're never actually going to care to, like, understand what you're installing and how it got to your computer; then it's all kind of really pointless, isn't it?
A
Yeah, I think we started having this discussion last time as well, and I reflected it in this bullet point here that I'll just highlight. I wonder whether to phrase this as a threat, or maybe rather to educate developers and teach them this principle of trust but verify.
D
And just to point out, Henrik, the reason that I really say this is because this is kind of an ongoing issue between packagers and users that we have at all times. Because, as I said, a lot of people do think that packagers have the responsibility to keep you safe, and it's kind of like: we do, but we don't. You know what I mean? Yeah.
D
Right, and I'm not going to lie to you: sometimes it's a lot in the attitude of the packaging teams as well, because a lot of packaging teams will have this attitude of, like, and this is why I say this, they'll say it like "trust but verify", but they'll say it in a much meaner way and kind of make it like: but it's your fault, and how could you not do this? So you get what I'm saying, like, yeah.
A
Yeah, it's just... I don't know. This whole verification, as much sense as it makes, is just so unrealistic at some point. I had a look earlier today at the Transformers library, or package, that comes from Hugging Face, and it has like 2,000 dependencies. Yeah, good luck verifying this. I mean, but yeah, I mean, we can have those...
D
And let me also say this, Henrik. I think, also in terms of this getting mass adoption and whatnot, I think one of the big pieces of feedback I get at the lab sometimes is that OpenSSF likes to champion a lot of things, but they don't realize that security has been an issue since Linux was open-sourced in 1991.
D
Like, I won't name specifics, but I know of a package manager that just uses a SHA-256 on a package, and that's what they call security. It's just SHA-ing things left and right, and at the end of the day, if the downloaded package's SHA matches the one that we're delivering, then it passed security. And you see, that's what I'm saying: it's difficult, because there are different generations, and I'm sure that may have worked at the time when that was implemented.
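The hash-only "security" described here boils down to a check like the following sketch; it is illustrative, not the code of any particular package manager. It proves only that the downloaded bytes match the advertised digest: if an attacker controls both the artifact and the digest, for instance via the same compromised mirror, the check passes anyway.

```python
import hashlib


def sha256_matches(payload: bytes, expected_hex: str) -> bool:
    """Integrity check only: the downloaded bytes hash to the published digest.

    This says nothing about who produced that digest, i.e. it provides
    no authenticity on its own.
    """
    return hashlib.sha256(payload).hexdigest() == expected_hex.lower()


# An attacker who serves both the package and its digest defeats the check:
tampered = b"malicious payload"
assert sha256_matches(tampered, hashlib.sha256(tampered).hexdigest())
```

This is why hash pinning only helps when the digest is obtained over a channel the attacker does not control, such as a lockfile committed to the consumer's own repository.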
A
There was an article just a few days back, or maybe that is when I came across it, about package signatures on PyPI and how ineffective they are. In fact, a guy basically tried to find the public keys of the GPG signatures that were created on PyPI packages, and he was only able to get them for a certain number of packages. He was also looking into the crypto algorithms being used, many of them being weak, and basically concluded this.
A
Okay, then, what are other problems on developer machines?
D
I have to think about that one, because that one's not bad; that's probably a good one. I don't know.
E
For example, there's another organization, an independent foundation; it's called the Open Compute Foundation. One of the threats they have identified is firmware viruses, or the risk that, even if your laptop is authenticated, using SPIFFE or SPIRE, for example, it's possible for this firmware to trigger a reboot, and right after the reboot it can be remotely controlled by the actor, I guess the hacker. I can bring up that post, that link.
A
Okay, so let me just write: third-party malicious... and this is third-party firmware.
D
Now let me say this, let me say this: the firmware that's in the Linux kernel gets meticulously, like, audited. So I would say, I mean, if we're talking about Linux specifically, most firmware should be relatively safe, but it kind of goes under that trust-but-verify, I would say, because you've got to figure out, like: do I have the right firmware?
A
Okay, so here we speak no more of firmware now; we speak of third-party hardware drivers, right?
D
Those are the DKMS, dynamic kernel modules, which are really the ones that you need to be careful of, because those are the kernel modules that can get loaded on the fly. That could be a problem, but this is where the kernel has, like, signing and stuff like that that you can implement, depending on whether you're building your own kernel or not.
E
I just listed, added, the Open Compute Foundation's common security threats, version one. The first one in the table, if you just go see it, is called "execution of unauthorized firmware".
E
Let's scroll down to the table I just added; the first one is firmware.
D
You don't really have a whole lot of options on what firmware you have installed; like, the distro comes with it. So it depends, I think, on the distro, and it depends on how you have things set up.
E
But basically they take the responsibility away from the end user; but underneath, the cloud itself still has the same concerns, right? So similar to here with firmware: in most cases, the user or developer shouldn't need to worry about the firmware. Now, is it in scope to consider it, and whose responsibility is it to consider it, then?
D
That's a fair point, but here's what I'm saying: if we're talking about, for example, a version of firmware that has a specific problem, how would a user go about controlling that? Because in most distros that I know, you either install the firmware per vendor, or you just install this one huge package that is just firmware.
D
Let me say this, Victor: I think they're referring to, like, if you're using the Yocto Project to basically create, like, an embedded operating system; then, I think, that would be when you would do this, because you would have control over everything in, like, the Yocto Project, which is similar to Gentoo. But it would only be in those scenarios that you would actually have control over this.
D
But, like, how useful is that, legitimately? And you would be one out of, like, two people that do it. So, I mean, unless of course you're using, again, Gentoo or the Yocto Project, because, for example, in Gentoo you can select the firmware that you want to have or carry on your system, and in the Yocto Project you can do that as well, because basically you're doing essentially Linux From Scratch and building an immutable image with everything for your embedded device.
D
That should be somewhat tamper-proof, so, you know... so in those scenarios, absolutely. But, like, here, so I guess the question, Henrik, that I'm proposing is: is this a developer thing? Because we're talking about developer machines. So if we're going to talk about developer machines, I would almost argue that that wouldn't be in scope. If we were talking about embedded programming and embedded devices, then that would be in scope.
E
I don't know. Yeah, what I mean is, I think it's possible to just treat it like an insider threat, because the mitigation for this firmware risk basically means the corporate policy and the network policy will not allow it: like, if you build your own laptop using any firmware, that hardware will not be allowed into the corporate network.
E
So as long as the corporate network policy requires really strictly secured laptop devices, then this will not be an issue.
A
It goes a little bit in the same direction as the thought that I had: interestingly, in such bigger development organizations, I suppose these are completely different teams and responsibilities, right? So you have one group of people that tell developers how to securely develop whatever software application, and then you have an entirely different team...
A
...that is, you know, providing the machines and, you know, sorting out operating system updates and all this. And so I found it interesting from that perspective.
D
Let me just say this real quick, Henrik: I have had that point of view for many years, and in my almost 10 years of being a packager, I have yet to find a company that actually has, like, a team that oversees their packages, or that's like, "hey, let's look at all of the patches that we're using and make sure that there's no junk in them or anything". I've never... I haven't met anyone yet. So I would say, in theory...
A
No, yeah, I didn't want to say that there are teams in such organizations that vet each and every version; I completely agree this is happening not very often, not to say very seldom. What I meant is that those central security teams care more for the security in the application development life cycle, right?
A
They organize and prescribe the static application scanners to be used, or maybe some dynamic scanners to be used, and they teach them the secure development practices and all this on the level of the application; and this group is completely different from the ones managing the low-level systems, basically. Correct?
A
Yeah, no, but I find it interesting that those people that supposedly are more aware of supply chain attacks, or maybe I'm also biased, but those application security teams that are by now very well aware of malicious open source components and the risks coming with their consumption...
A
They are not in control, and they cannot do much about the lower operating system level, through which some malicious stuff can also come in. And I'm not sure whether the IT teams are, you know, on the same page in regard to the consumption of third-party stuff. So that's why I find it an interesting point here.
A
Okay, so let me just write a quick note for this.
A
I don't know, there were different experiments in Germany. I remember one from the city of Munich, I think many years back: they went away from Windows in order to deploy Linux systems all across the city's systems, but after, I think, a number of years, they eventually went back to Windows. But it took them a couple of years.
A
So I wonder whether we can actually turn this into... maybe make this... so, rather: this was very particular to malicious third-party firmware and hardware. I wonder whether it's worthwhile transforming those thoughts we had into another threat, or a more general threat, such as this split between organizations caring for different parts of a stack, one for the development, one for the operating system, and that this results in less oversight, let's say.
D
Henrik, for next time, I'm going to see if a colleague of mine is available to come to this call. He knows a lot of people in the kernel community, knows Linus and everyone, and he does a lot of the training for security, kernel-wise, for the LF. His main job is to maintain that stuff.
A
That sounds interesting. I would also love to have his feedback, maybe, on this research that was done by some U.S. university, I think one or two years back, where they tried to basically suggest malicious pull requests and checked whether they were caught by the reviewers, and there was a large debate about this going on. I'm not sure whether you're aware of that.
A
All right. I think, again, I mean, time flies; we've reached the top of the hour already. Anything else that you want to mention today before we close the meeting?
E
We could even reach out to the teams who usually manage developer machines, to see what kind of threats they can think of; for example, Microsoft desktop Windows, or a Linux desktop, whichever, right; VMware desktop.
D
Linux desktop and macOS... and I think we've got a handle on that between me and... me and Microsoft, though I rarely use it, only for video games.
A
No, but I think what Victor meant is that, so, I guess technically you can handle this, I don't have any doubt about this, but more from an organizational point of view: when you happen to work in an organization that manages those operating systems for the development workforce, what are the typical challenges and problems that they face? I think that is...
A
Okay, I will write him a message on Slack to see whether he knows anybody working for such IT or system departments or so. And...
A
All right. One thing that, I'm not sure, that I find is not sufficiently reflected: I mean, here we speak of operating system, productivity software, IDEs, plugins and so forth, and here in the diagram, on the right-hand side, we may very well say this is kind of across the whole software stack. But if we look on the left-hand side, we stay very generic and high-level. It would be nice to have examples of all the different cases in order to give it some meat.
A
This reminds me very much of this XcodeGhost attack from 2016 or so, where some kind of malicious... I think it was this Xcode Apple development environment that was deployed and made available for download on some Chinese mirror, and it exploited the fact that all the Chinese macOS or iOS developers wanted to basically download it much quicker than from the authoritative sources. And so there was a vector to distribute a malicious Xcode environment.
A
Okay, guys, unless there's anything else, I would close the session, and thank you again for this very nice discussion. I'm looking forward to speaking with you soon. Thanks, everyone.