From YouTube: SES-mtg: Layer modules on SES, loader config, SESify
B: Basically, one of the nice things that approach had over the original Caja approach was that it did not require an actual pass through the code; that was strictly runtime semantics. This seems to move us back into the world where we have to have a parsing, code-checking intermediary, and part of the big win of not having that was about load times; this seems to put us back in the world of an extensive load tax. So.
A: I'm still thinking about how to say this. It's because we want the initial, the top-level, state of a module to be... let's just take the pure case for it. So, for the pure loader, which operates at the SES root realm level so that it can be implicitly shared by everything in the root realm: in order to be implicitly shareable, it must not provide any privilege, must not provide any mutable state. So, in that sense, the top-level module state is an extension of the primordial state.
A: Then, in the old SES, before modules, anything else entering that environment, any other code entering that environment, did it by evaluation. The result of evaluation was not assumed to be pure, but then two evaluations of the same source code would be completely separate evaluations, giving you stateful things that are isolated, that are confined: they don't have any access to anything outside themselves that wasn't granted, but they are internally stateful, and they're isolated from each other by virtue of separate evaluation.
B: It is important to note, or maybe valuable anyway, that monotonic state whose change you cannot observe is an interesting intermediary between mutable state and immutable state, and the loader likely fits into that slot, where the only way you can find out whether something is there is by loading it, at which point it is loaded, and so you can't tell whether you loaded it or someone else did. With mechanisms like that you can have effectively mutable state with no visible side effects.
A: Safe by construction, with an important caveat that needs to be explained somewhere, well and carefully, which is: a loader has to actually fetch the contents, the source code of the module, from somewhere, given the name of the module; it turns names of modules into module source code, and it typically does so by going somewhere external, like a file system, or across a network, or something external.
A: And yeah, so that's externally visible I/O, and if the content of the file can change over time, then the content can be different depending on the time at which it's first read. So here's the stance that needs to be carefully explained, but which I believe is a consistent stance: when you're running just SES inside a language implementation, just considered in the conventional sense, there's still a TCB, which is the language implementation.
A: That internally is running things on mutable memory, but as for the mutation of the memory that the language implementation is using, it's up to the TCB to make choices there such that those mutations are not visible at the level of abstraction where you talk about immutability in the language. Now, in this case, the TCB distinction is not language versus not-language; it's the code that sets up a new SES root realm versus the SES root realm itself.
A: When you use the realm API, the SES API, to create a new SES root realm, you're basically in a position that has equivalent power to setting up a new meta-interpreter. Rather than use the actual primitive realm API, you could instead have emulated the whole thing with an interpreter of SES written in SES, and then you can do anything, from that perspective, that you can do by virtue of being in control of the meta-interpretation.
A: So then, if you want to write a correct meta-interpreter, it's important for you not to expose to the interpreted code mutability that is below the level of abstraction of the definition of the language being interpreted. In an equivalent manner, I would say, when you set up an SES root realm with a pure loader, you have to give it the I/O channel, the fetch, basically the fetch logic that it will use to somehow dereference a module name into module source code.
A: So that's where the responsibility lies to make the configuration decisions such that, within the language created inside that realm, there is no resulting visible mutable state. And likewise, it's the same locus of responsibility that deals with the previous issue we just discussed, which is: how do you arrange it so that the pure loader only loads modules that have been statically verified to be pure? If we adopt, let's say, Chip's story, where we're using a cryptographic hash...
A: Well, there's still: what file, what record of cryptographic hashes of things that are already approved, or what signature verification key or whatever, do I trust to only indicate that something has passed a static checker, where that static checker is elsewhere and elsewhen? And that's exactly the same kind of TCB responsibility. So I think that seeing the setting up of a root realm, versus being inside of a root realm, as an inside-the-TCB versus on-top-of-the-TCB kind of distinction, is a consistent story.
A: It enumerates all of the own properties of the object. It does it with getOwnPropertyDescriptors rather than get, so if you have an own property which is an accessor property, it does not invoke the getter of the accessor property. If you have an own property which is a data property, then it reads the value of the data property. When the own property is an accessor property, it recurs on the getter function and the setter function.
A: It works transitively through such reflective property traversals of own properties, and in addition it accumulates the prototype links. Now, JavaScript has two different meanings for the word prototype, which we always need to be careful to distinguish: it enumerates the objects that this object inherits from. Oh, and it does both of these enumerations after it freezes the object itself. So it first freezes the object itself, guaranteeing that neither of these things can then change. Once it freezes the object, it says:
A: Okay, what objects does this object inherit from? It accumulates that into a set. It then does a getOwnPropertyDescriptors to do the recursive property walk. So it does the full recursive property walk of own properties, accumulating the objects that are inherited from into a set, and then the own property traversal finishes with all of those objects themselves being successfully frozen.
A: It then asks the question: are all of the objects in the, I'm going to call it the superset... the set, not... not superset, oh God, terrible choice... the inherits-from set. It then asks the question: are all of the objects in the inherits-from set either things that I already remember have been successfully hardened, or things that I have apparently successfully hardened in this hardening itself? And if the answer to all of that is yes, then it takes all of the objects that it hardened this hardening pass and it adds them, it commits them.
A: Basically, it adds them to the set of objects we know to be hardened. And that also reminds me of something that I skipped in the story so far, which is: even in the own property traversal, for each object that I look up in that own property traversal, I also check it against the objects that are known to already be hardened, and if so, I stop.
A: So the objects that are already known to be hardened are kind of the fringe of the traversal, and every time you successfully complete an entire harden, including the inherits-from check, then, once that entire thing is completed, all of the objects in that pass in turn get committed to the already-hardened set, to that fringe. Okay.
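The traversal described over the last few turns can be sketched roughly as follows. This is a simplified illustration, not the reviewed SES implementation: freeze first, walk own property descriptors (values, getters, setters), accumulate prototype links, treat already-hardened objects as the fringe, check the inherits-from set, and only then commit the whole pass.

```javascript
const hardened = new WeakSet(); // objects known to be hardened

function harden(root) {
  const toFreeze = new Set();
  const inheritsFrom = new Set();

  function enqueue(val) {
    if (Object(val) !== val) return;                     // skip primitives
    if (hardened.has(val) || toFreeze.has(val)) return;  // fringe / seen
    toFreeze.add(val);
  }

  enqueue(root);
  // Set iteration visits elements added during iteration, so this
  // loop performs the full transitive walk.
  for (const obj of toFreeze) {
    Object.freeze(obj); // freeze BEFORE enumerating, so nothing changes
    const proto = Object.getPrototypeOf(obj);
    if (proto !== null) inheritsFrom.add(proto);
    const descs = Object.getOwnPropertyDescriptors(obj);
    for (const desc of Object.values(descs)) {
      if ('value' in desc) enqueue(desc.value); // data property: its value
      if (desc.get) enqueue(desc.get);          // accessor: recur on getter
      if (desc.set) enqueue(desc.set);          // ... and on setter
    }
  }
  // Everything inherited from must be already hardened or part of this pass.
  for (const proto of inheritsFrom) {
    if (!hardened.has(proto) && !toFreeze.has(proto)) {
      throw new Error('prototype not already hardened');
    }
  }
  for (const obj of toFreeze) hardened.add(obj); // commit the whole pass
  return root;
}
```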
B: Coming back to what I was getting at, I see a couple of possibilities for side effects in this example here. First off, if properties of x or y change, then the property that is part of what makePoint returns might change. And secondly, your toString is not operating on the hardened variables, this.x and this.y, but on these external values, so toString might be disconnected in what it reports from the actual x and y.
A: So let me do one, then the other. For harden, we statically know, when we see an object literal, that harden will proceed into all of the named properties that appear literally in the object literal. So in this case we know statically that harden will dynamically enumerate x, y, and the toString function itself, and therefore, if harden succeeds, then those will in turn be hardened, and then any own properties of x and y must also, in turn, therefore be hardened.
A: Right, it's hardening the argument to harden, which is the object literal. The object literal has an x property whose value is the value of the x parameter, and whose y property is the value of the y parameter, and therefore the recursive hardening of harden will proceed to harden both of those parameter values.
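The example under discussion appears to be along these lines; the body is a reconstruction from the conversation, with Object.freeze standing in for a full transitive harden.

```javascript
// makePoint returns a frozen object literal. Note the point Michael
// raised: toString closes over the x and y parameters, not over
// this.x and this.y, so it reports the captured parameter values.
const makePoint = (x, y) =>
  Object.freeze({
    x,
    y,
    toString: () => `<${x},${y}>`,
  });
```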
A: Good, good. So there are two side effects that we can talk about in this very code. One is the one that Michael mentioned, which is that harden itself is causing the side effect of hardening things. Well, that should be considered a side effect, and in particular one unfortunate but unavoidable case is that if an object being hardened is itself a proxy, the proxy can sense the steps that harden engages in, and can run user code during those steps.
A: So when we did the security review of the original harden, we were very careful to think about the order in which harden does the operations on the objects that it is hardening in this recursive walk, so that the invariant that a successful harden is guaranteeing is an invariant we can be confident of, even if there are malicious proxies inside the graph being hardened. So that's a necessary property.
B: Yeah, we had a helper function called makeRequire that helps you build a require endowment that can be passed into the realm, and the reason that we built it, the reason that we have the SES realm itself providing that helper function, is that you can configure that object to support a require of @agoric/harden by providing the very same harden function that was created to harden the internals of the SES realm.
B: So it seemed like a thing where it's... it's not a slam-dunk to implement some of this functionality by simply importing new strings into the environment; that functionality kind of wants to interact with its parent environment, and so that's why we decided to have that helper function be a part of the SES realm itself. Yeah.
A: So, Brian, please stop me if the following statement about it is unfair, but the require that Brian is describing was really kind of hard-coded in as a special case, so that we could unbundle and separate parts of the SES implementation itself, such as require, and then put it back together.
B: I think that's fair. The immediate goal for adding makeRequire was to allow code to be unit tested outside of SES that will also run in the same way inside SES. So when you're outside SES, you're going to say require('harden') or something like that, and we wanted it to be possible to have that same source code, with that same require statement, work both inside and outside. So this is a way of creating the require endowment for it.
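A minimal stand-alone sketch of the makeRequire idea, assuming a simple table of named modules; the real helper lives in the SES shim and closes over the realm's own harden function.

```javascript
// Build a require endowment from a table mapping specifiers to the
// values they should resolve to, so that code saying
// require('@agoric/harden') runs the same inside and outside SES.
function makeRequire(moduleMap) {
  function require(specifier) {
    if (!(specifier in moduleMap)) {
      throw new Error(`Cannot find module '${specifier}'`);
    }
    return moduleMap[specifier];
  }
  return Object.freeze(require);
}
```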
B: I think, in the longer run... I think that makeRequire, and the configuration that we pass in to makeRequire, will be an educational tool for us to figure out how we want an API for a safe module loader to look. I think in the long run we'll be building some sort of module loader object and loader hook out of some similar configuration, maybe out of a manifest of some sort, something that people can analyze.
B: People can audit it, people can discuss whether this manifest accurately describes the kind of authority we want to be passing around in the module graph we want to link. But I agree that, in the long run, this needs to be something other than an endowment called require, because we really want SES to be able to support ES6 module syntax, and that's not something you can do with an endowment. That's something you have to do to the realm itself, so it can be active at the time you're pulling in the code, the module that has those imports.
A: Another goal I would like to state, though I don't know if it's going to turn out to be possible in practice: right now the shim itself, at least the SES shim combined with the underlying realms shim, both of those are currently very small, and they're heavily reviewed for security.
A: We intend them to remain reviewable for security, so that we can have a lot of confidence that they have the security properties we need, and in order for them to be small it is essential that they do not parse JavaScript. We're using, what do we call it, the magic eight lines of JavaScript code. The key thing there was that we can use direct eval on the string being safely evaluated, as eval code, as script code being fed to eval, and we can do that without parsing.
A: But therefore, by layering, the danger of getting the JavaScript parser wrong is limited to corrupting the meaning of modules written using it, but not endangering any other code from the misparsed module code, since the misparsed module code must still turn into script code that gets evaluated by the eval at the SES realm level.
B: But this is something that we need to explore. I'm reasonably convinced that there will need to be, you know, some direct support for having a module system that is able to bring in an endowment, or anything impure, or what-have-you, and that's something that, as the secure module system gets nailed down, it'll be a lot more obvious what that would be. Mm-hmm.
B: So, in your... I agree, those layering goals are really good ones. Can you currently imagine what that's going to look like? Are we sort of somehow using the parser, which we have less confidence in, to transform the source code that has import statements into something that does not, and then have it, you know, transform an import into a require statement somehow? Yeah.
A: Well, let me mention, by the way, another agenda item, Aaron, that I would really appreciate if we did during this meeting, if you're ready to do it. Aaron, together with Dan Finlay at MetaMask, they are currently taking another JavaScript module packaging system called Browserify, which MetaMask is currently using, and building a variant of it which they're calling SESify. SESify is currently duplicating module evaluation, and they need to move to something that does not, which is what I was referring to.
A: Okay, so in the fantasy, let's say we just translated ECMAScript modules into CommonJS modules naively. I'll just mention why this is an unrealistic fantasy: the semantics aren't the same, so the fantasy only applies to modules that would be insensitive to the difference in semantics.
B: I mean... well, I'm not sure it's great. There's a lot of tooling that lets you use the ES module syntax and translates it into CommonJS calls, and then people write CommonJS modules that... or rather, they write things which look like ESM modules, but they do all of the tricks, the dynamic code tricks, that you can do with CommonJS. I see.
A
An
actual
it's
not
actually
that
they're
insensitive
to
the
change
in
semantics,
it's
that
they
actually
depend
on
commonjs
semantics,
while
looking
like
any
SM
module,
correct.
Okay,
that's
pretty
terrible
yeah,
so,
but
real,
okay,
so
I.
So
in
that
case,
I'll
go
back
to
my
unrealistic
fantasy
that
we're
dealing
only
with
modules
that
are
insensitive
to
that
difference.
Well,
we
run
each
one
through
a
naive
translator.
The
naive
translator
does
not
package
them
together
into
a
big
text
file.
It
keeps
each
module
a
separate
text
file.
A
Has
you
know
the
the
SES
mechanism
provides
an
ability
to
customize
two
levels
of
global
scope
for
the
string
being
evaluated,
the
the
equivalent
of
the
global
lexical
scope,
which
is
what
we're
the
endowments
get
mapped
to
and
then
the
equivalent
of
the
global
scope,
which
typically
corresponds
one
to
one
with
the
global.
This
object
for
that
compartment,
since
we
would
want
multiple
modules
in
a
directory
tree
effectively
at
the
layer
that
understands
about
modules.
A
We
would
want
all
of
those
multiple
of
those
to
turn
into
evaluations
inside
the
same
compartment
at
the
SES
level
and
therefore
into
evaluations,
evaluations
sharing
the
same
global.
This
we
would
need
to
give
each
evaluation
a
different
require
endowment
and
a
different
exports
endowment
because
of
the
way
commonjs
modules
are
defined.
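The wiring just described, one shared compartment global but per-evaluation require and exports endowments, can be sketched like this, with plain Function standing in for the SES compartment evaluator.

```javascript
// Evaluate a CommonJS-style module body with its own require and
// exports endowments, while sharing one global object (here passed
// explicitly as sharedGlobal) with other evaluations in the same
// compartment.
function evaluateModule(source, requireEndowment, sharedGlobal) {
  const moduleObj = { exports: {} };
  // The endowments become the outer scope of the module body.
  const fn = new Function(
    'require', 'module', 'exports', 'sharedGlobal', source);
  fn(requireEndowment, moduleObj, moduleObj.exports, sharedGlobal);
  return moduleObj.exports;
}
```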
A: So let's just hypothesize a node-prime, a build of Node that has your extension, and now let's say that we're trying to build a SES with a safe module system that is implemented in a more direct manner when running on node-prime. I think that's a wonderful goal and I think it's doable; I'm hoping it's not too difficult a goal, and such a thing being achieved would also help the momentum towards pushing the required issues through the standards committee.
A: So the issue would be that you would want to create a separate loader per compartment. You would also want to create the pure loader for the root realm as a whole, the SES root realm as a whole. Bradley, I think you might have joined the meeting after I went through the rules of what it means for the pure loader to accept only pure modules, but it was very consistent with our previous...
A: Everything said today, I think, is consistent with all of our previous discussions on that. And then the manifest would declaratively describe the wiring and constraints of the allowed import graph, you know, wiring as in the mappings, and limitations of the import graph between these different loaders, and I think all of that can be expressed declaratively.
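A manifest of the kind being described might look something like this sketch; all names and fields are invented, purely to show wiring and import-graph constraints expressed as declarative data:

```json
{
  "compartments": {
    "app": { "root": "./src", "endowments": ["document"] },
    "friend": { "package": "friend", "endowments": ["document"] },
    "foe": { "package": "foe", "endowments": [] }
  },
  "wiring": {
    "app": { "imports": ["friend", "foe"] },
    "friend": { "imports": [] },
    "foe": { "imports": [] }
  }
}
```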
A: ...potentially incorrect static analysis. So there's a large number of asks, a large number of things that the platform could do that would help us do more reliably the things that we're now doing by other means, but I think all of those are secondary to the core issues that we've already raised.
A: That would be, in fact, a tremendous step forward, even without new platform mechanisms for the global. And I think that, as the realms proposal proceeds through committee... the realms proposal itself requires that kind of global hook, so at that point it becomes an issue of whether the V8 team will actively resist, and I'm more hopeful now than ever before that, as this proceeds forward in committee, they are not going to block it. I'm more hopeful for that, but there's no guarantee; we should proceed.
A: When you say customizing the import hook, I want to be very clear: the resolve hook that you showed is also, if I understood correctly, necessary. Which is: what is the logic by which a loader turns the name of a module into source code for a module? For example, where on a file system, or over the network, or wherever, does it go in order to dereference a module name into module source code? That logic has to be able to be provided by user code.
D: So we're trying to work through a nasty memory leak problem caused by holding on to refs in a bad way, so we're trying different models of ownership; ignore that for now. Roughly, let's say we have a loader, the echo loader, okay, and we have the default loader. These should ideally be able to be implemented such that, if you wanted to create an entire recreation of the default, it's possible to do from this position, which I think is what you're asking. Yes.
A: Right, right, right, and then resolve goes from the call site and specifier to another call site. Got it, okay, so from call site to... So it's good that this can be provided by user code, but it doesn't answer the question of where the logic comes from for actually dereferencing this into module source code.
D: There are a few ways we can do it. Currently, the buggy way we have it: essentially, you can unwrap a call site and get what its body is. I'm going to write it here. So roughly you would do something like callSite.getBody(), and this most likely returns something synchronously, but you'd have to actually await the body, because it could be streaming in.
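A purely illustrative sketch of the hooks being discussed; all names here (makeCallSite, getBody, the resolve signature) are invented to mirror the conversation, not a real API. resolve maps a parent call site plus a specifier to a new call site, and the call site itself carries the user-supplied logic for obtaining its source. getBody returns synchronously here; in the design being described you would await it, since the body could be streaming in.

```javascript
// A call site knows its specifier and how to obtain its own body.
function makeCallSite(specifier, fetchSource) {
  return {
    specifier,
    getBody: () => fetchSource(specifier),
  };
}

// A loader is built from a user-supplied resolve hook:
// (parent call site, specifier) -> new call site.
function makeLoader(resolve) {
  return {
    load(parentCallSite, specifier) {
      const callSite = resolve(parentCallSite, specifier);
      return callSite.getBody();
    },
  };
}
```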
A: Okay, so I can make a call site object where I provide the call site object with the contents. Yes, okay. And at the time that I'm making the call site, I have the parent call site and the specifier as the parameters, as you're showing. Wow. So that's all adequate; that gives me what I need to create an entire new loader out of whole cloth, including the logic for obtaining the source code from elsewhere.
C: As I change applications, if I don't fix the font size, please interrupt me and remind me. Okay. So first, what is Browserify? It's an app bundler. As I understand it, it dates back to when Node.js and npm were being introduced: you could pull down dependencies and import dependencies, that sort of thing, and web developers were like, this is great, I want to use it too, but how do we get all these files into the web browser? And Browserify...
C: It can take multiple entry points, and then it will analyze that code and look for require statements, either local or to dependencies, and it'll turn them... you know, it has a small sort of bootloader, or kernel, or TCB, or whatever you want to call it, though this is definitely not built with security in mind, okay. And then it'll output a single JavaScript file that encapsulates all those things, and when it's run it will run your entry point. Okay.
C: That's correct, okay. It also helps because, well, you can't run it in Node, but the intention is that you're gonna run this in the browser, okay, as the name suggests. And so it does shim some things, like the global object which is available in Node: you know, global now points at the window.
C: So this is the compile pipeline. Again, I'm not gonna look at all of it, but basically this walks the dependency graph, and it's sending JSON blobs through the pipeline that include the dependencies used by that file and how they're named, and the content of the file. And then you can use Browserify transforms; these would be things like, you know, converting CoffeeScript into JavaScript, or Babel, or some minify kind of things.
C: And so I guess the main thing I want to show in the pipeline is that at the end it starts to look like this, where it has an ID that is the ID for a file (this is a single file from your project, or a file from a dependency), it has a list of the dependencies that file uses and how they're named, mapping to their ID, and it has the content.
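The record shape being described (an ID, named dependencies mapping to IDs, and the file's content) is roughly like this sketch; the field values are invented for illustration:

```json
{
  "id": 3,
  "deps": { "./util": 7, "buffer": 12 },
  "source": "const util = require('./util'); module.exports = util;",
  "entry": false
}
```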
C: Now let's look at what happens. So this is index.js... even higher than that, here's index.html. We're including the bundle; the bundle is made from index.js. We're loading it here, and then we're running some code. The situation for this demo is that both these modules were good, useful, friendly modules, and then, sometime during the maintenance of this application, one of the dependencies went rogue, and that happens to be foe, as we've been looking at.
C: Oh yeah, I must explain: for the sake of this demo, I'm using a Browserify feature that lets you override some of your dependencies with a local file, a file from this project. That just makes it so I don't have to deploy these dependencies; I can just edit them locally, which is convenient.
C: When we load... so I'm going to ignore some things that are here and just look at this: we're exporting this action here that does the attack. So we are calling this from our code here; we're saying, hey, do your action, and the foe is given the document. And that's how we end up with this page here.
C: And so this is where, you know, I've started playing around with this in MetaMask. MetaMask is a very large app with a very large dependency graph, where we ship a megabyte or two of JavaScript, and it turned just the initialization time, the boot time, from less than a second to more than eight seconds on my machine. I'm sure there's some way to improve this, but this was the naive approach: a significant enough impact that it may not be viable.
A: The pure way of securing it would be to have both friend and foe export a pure function that takes a document as a parameter, and then the decision to give the friend the document, and not give the foe the document, would be a decision of how to invoke those outer functions. So basically, the entire existing content of each of those files would become nested as the body of an outer function; I'm going to use the E terminology: a making function.
C: I mean, I think to do it correctly involves using a parser and these sorts of things, just to make sure that we are correctly wrapping it and the code wasn't expecting that, and then enclosing it in another function header. But yeah, we have, we are, the compile pipeline, so if we want to mutate the source, we can do that.
A
Because
you're,
not
using
ECMO
Script
standard
modules,
we're
using
commonjs
modules,
nesting
them
all
within
a
function,
so
the
multiple
instantiate
Bowl
would
work
perfectly.
Well,
yes,
great
yeah.
One
of
the
things
that's
unfortunate
about
ECMO
script
modules
is
that
import
and
export
can
only
appear
at
top
level,
so
you
cannot
simply
wrap
a
module
in
a
function
in
order
to
make
the
module
multiple
extension
I.
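That nesting is the classic CommonJS wrapper trick, which can be sketched as follows; calling the wrapper again yields a fresh, separately stateful instance of the module.

```javascript
// Wrap a CommonJS module body in a function; this is possible
// precisely because CommonJS has no top-level-only syntax like
// ESM's import/export.
function wrapCommonJsModule(source) {
  return new Function('require', 'module', 'exports', source);
}

// Each instantiation gets its own module object and exports.
function instantiate(wrapper, require) {
  const moduleObj = { exports: {} };
  wrapper(require, moduleObj, moduleObj.exports);
  return moduleObj.exports;
}
```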
A: Great, great. So I think that would be... since you're not stuck with the legacy problem, I think that should be the first line of attack. And Darya's thesis, her dissertation, is specifically about the Wyvern module system as an ocap-friendly module system. The Wyvern language is a statically typed language with effect types as well, so there's more involved in Wyvern than would be relevant here, but everything relevant here, I think, can be learned well from Wyvern. Darya, could you... is that...
C: Why is it... it's like, within a module, when files require the same thing... now, I don't remember why I did this; maybe there was an infinite loop inside a module or something like that, but for whatever reason I needed to cache within a module, so I do keep copies of that for a dependency's position in the dependency graph.
C: When we're going to bundle, we have this: we create the require function that we use internally to instantiate it, and that has this module dependency path, which is like: this require, this record, this record, this record. You just turn that into a string, and we key by this, and then inside this global cache, which is, you know, why we have this large memory overhead, we have this local cache where we hold the module.
C: So what actually goes in the local cache is the module ID, which might be like, you know, node_modules/buffer/index.js, or it's just a number or something like that, and that holds the module object, and the only thing on the module object is its exports.
A: Under what conditions do two different imports of what would normally be thought of as the same module, under what conditions do they actually end up sharing or duplicating? Clearly, mostly you get duplication, which is why you're getting the bad performance that you showed us, but clearly what you're saying is, sometimes you get sharing, and I don't understand what the difference is. Yeah.
C: So if module A requires B and requires C, and C requires something from B... if C requires two different pieces, or it requires D twice or something like that, that will be a cache hit; that will, you know, reuse the same thing. But then if B uses D, that's going to get a fresh one. Does that make sense?
C: I'm not sure if that's... okay, so yeah, what I mean is, let's say you just say, you know, require('buffer') or some module like this. That will point at the file that is the main file, as specified in the package.json or defaulted to index.js for that module. So a module is a collection of files, and there is a default file.
A: In the legacy support scenario for the safe module system, we've been making the same assumption: that the things the manifest talks about, and does the wiring for, probably correspond to packages, and that all of the modules in a package would be loaded into the same compartment. They would therefore share a loader and share a set of globals.
A
So
so
I
think
we're
making
compatible
assumptions
there.
Well.
The
reason
why
I'm
being
hesitant
is
that
the
package
concept
is
very
much
part
of
the
JavaScript
ecosystem
and
you
know
comes
from
NPM
and
all
that
and
the
pervasive
use
of
people
talking
about
code
using
package
Jason,
but
it's
not
a
language
concept.
The
the
ACMA
script
standard
has
a
notion
of
individual
modules.
It
has
notion
of
package
well.
Nevertheless,
I
think
I
think
that
package
is
the
right
unit
for
us
to
be
talking
about
in
making
these
rista
thority
distinctions.
A: So, to react, now that we have the terminology distinction: let me state a hypothesis as to what you're explaining, and tell me if I got this right. When package A and package B both import the same module from package C, they each get a distinct copy of C, whereas within one package, if module A within package X and module B within package X both import module C within package X, in that case you get sharing rather than duplication. That's...
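A toy model of that hypothesis, with the cache keyed by importing package, so cross-package imports duplicate while within-package imports share. The keying scheme is a guess at the behavior being described, not SESify's actual implementation.

```javascript
// makeRegistry wraps an instantiate function with a cache keyed by
// (importing package, module id): same package shares, different
// packages get distinct instances.
function makeRegistry(instantiate) {
  const cache = new Map();
  return function load(fromPackage, moduleId) {
    const key = `${fromPackage}\u0000${moduleId}`;
    if (!cache.has(key)) cache.set(key, instantiate(moduleId));
    return cache.get(key);
  };
}
```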
A: When I see the name TOFU... let me ask: Bradley's been doing work on a TOFU tool, where the name TOFU comes from trust on first use. But Bradley's specific thing statically analyzes source code to identify dependencies and identify, well, basically the things needed to create a least-authority set of grants that should enable the current version to work, so that we can notice if a later version requires... if part of the later version requires more authority than that part of the earlier version requires.
C: The reason why I added them is just because, you know, such a large majority of modules want to use them that it just made the config file gigantic, but yes, obviously that is addressed. So a lot of the helpers are here, and then we're creating... the goal of this file is to create that same config file we saw in my example, the very simple one that was like: friend gets the document.
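That simple "friend gets the document" config might look something like this sketch; the shape and field names are invented for illustration:

```json
{
  "resources": {
    "friend": { "globals": { "document": true } },
    "foe": { "globals": {} }
  }
}
```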
C: No, it doesn't look... so we'd see, there's, you know, this one: this is part of this package called brorand, and it's detected trying to use crypto and msCrypto, and then there's this one called eth-json-rpc-filters, and its detected use is also recorded. And we're saying this is the level of abstraction: around creating a thing called an exposed module, and then we do it with the parent; it just gets used in a ton of different places.
C
We now have this other thing called exposeToDependency, and this is... basically, we set up this config here that brorand should get crypto and msCrypto, and then for each position in the dependency graph where brorand appears, we're giving the config we specified earlier to that location. And the reason for that is, let's say we determined... where's a good one... like async gets window. Obviously this is...
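The kind of per-package grant being described might look something like this. The key names here are guesses at the idea (a map from package to the globals it is allowed), not SESify's actual schema.

```javascript
// Illustrative config shape: each package is granted only the globals that
// static analysis found it using.
const sesifyConfig = {
  resources: {
    brorand: {
      // brorand was detected using crypto and msCrypto, so it gets
      // those globals and nothing else.
      globals: ['crypto', 'msCrypto'],
    },
  },
};

function globalsFor(config, pkg) {
  return (config.resources[pkg] || { globals: [] }).globals;
}

console.log(globalsFor(sesifyConfig, 'brorand'));  // ['crypto', 'msCrypto']
console.log(globalsFor(sesifyConfig, 'attacker')); // []
```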
B
A
C
A
C
A
C
A
C
So, oh yeah, so my point is: it makes sense where I use async that it needs window to do its job. But then let's say, for whatever reason, you think that window, that global, is okay to expose, like it's acceptable to expose for some reason, and then my attacker comes along and they add async as a dependency of theirs (mm-hmm), and that makes sense: they want to do something async, so I'll give...
A
A
I think so. I mean, let me see if I can restate it. CommonJS has an exports object, and the exports object has a set of named bindings which, even though in CommonJS it's a general object, we assume are statically determined and stable. So over here, just like we want to declaratively state what the global variables are that appear as endowments on a per-module basis, we also want to declaratively state what the names are on the exports object that a given module exports. Is that correct?
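The declarative restatement above could be sketched like this. All names (the manifest shape, `endowments`, `exports`, the module path) are illustrative assumptions, not an actual SES or SESify schema.

```javascript
// Per-module manifest: declare both the globals a module receives
// (its endowments) and the named bindings it exports.
const moduleManifest = {
  'node_modules/foo/index.js': {
    endowments: ['setTimeout'],      // globals injected into the module
    exports: ['parse', 'stringify'], // expected names on the exports object
  },
};

// Check an actual exports object against the declared, stable set of names.
function exportsMatchManifest(manifest, path, actualExports) {
  const declared = new Set(manifest[path].exports);
  return Object.keys(actualExports).every((k) => declared.has(k));
}

console.log(exportsMatchManifest(moduleManifest, 'node_modules/foo/index.js',
  { parse: () => {}, stringify: () => {} })); // true
console.log(exportsMatchManifest(moduleManifest, 'node_modules/foo/index.js',
  { parse: () => {}, stealKeys: () => {} })); // false
```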
C
B
C
It's an extremely powerful API, and so this is really handy, because it just makes it easy to use those APIs whether you're on Firefox or Chrome or Opera or whatever. And, you know, I want to use this helper in a few places in the code, but if an evil dependency suddenly required extensionizer, then it might seem okay: they want to use extensionizer, and that wouldn't seem nefarious, or you wouldn't notice it, but what you're accidentally giving them is access to the whole...
A
C
A
A
C
...need to not give it that, and the way we do that is by explicitly saying that those permissions we set above for extensionizer... it gets them if you require extensionizer directly from the root of your app, or if this module extension-link-enabler, which requires extensionizer, does it. If you do it from any other package path in the dependency graph, then it would not get those.
C
So we have a package called extension-link-enabler. If that requires a package called extensionizer, the instantiated module that it gets is instantiated with the endowments specified above. Okay, maybe I... but if a package called attacker were to request extensionizer, the extensionizer that gets instantiated would not get the endowments listed above, because it was not specified in this part of the...
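The path-scoped grant just walked through can be sketched as a lookup keyed on the position in the dependency graph. The key format and all package and global names are illustrative assumptions.

```javascript
// Grants apply only when a package is reached through an approved path.
const grants = {
  // key: "<parent> > <package>" position in the dependency graph
  '<root> > extensionizer': { globals: ['chrome', 'browser'] },
  'extension-link-enabler > extensionizer': { globals: ['chrome', 'browser'] },
};

function endowmentsFor(parent, pkg) {
  // An unlisted path gets no endowments, so an attacker requiring the same
  // package from elsewhere receives a powerless instantiation.
  return (grants[`${parent} > ${pkg}`] || { globals: [] }).globals;
}

console.log(endowmentsFor('extension-link-enabler', 'extensionizer')); // ['chrome', 'browser']
console.log(endowmentsFor('attacker', 'extensionizer'));               // []
```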
A
C
B
A
C
C
A
C
C
I guess the main point is: you want to give some permissions to certain modules, but it's important to only give those permissions to modules when they appear in a certain position. If they appear in a different position, then, you know, we didn't approve that; we didn't make sure that it was appropriate for them to get those.
A
So, if I understand correctly, something at or under extension-link-enabler should be able to import or require extensionizer and will get the real extensionizer, and anything trying to import or require it that's not in that position should not get the real one, should not get one that has any of the real endowments.
A
C
D
A
C
A
D
C
D
A
In any case, it's just given one that just has the endowments that we specified. We're used to writing attenuator code, where we're writing code that does that restriction, but in the simple cases like you're showing here, which is actually all that most of our attenuators currently do, stating those attenuators declaratively has some attraction to it.
A
But the main point is that you are creating two different instantiations of sensitive, both of which think they're accessing an object named document, but where the actual thing that is bound to document in each of their namespaces is a different attenuation of the genuine document. And then, when closely-trusted imports sensitive, it gets one of those sensitive instantiations, and when less-trusted imports sensitive, it gets another one. So this is perfectly good object...
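The two-instantiation picture described here can be sketched directly. `makeSensitive` stands in for instantiating the module with a given endowment; the attenuations and all names are illustrative.

```javascript
// Each consumer's copy of "sensitive" sees a different attenuation of the
// genuine document under the same name.
function makeSensitive(document) {
  return { readTitle: () => document.title };
}

const genuineDocument = { title: 'secret dashboard', cookie: 'session=...' };

// Attenuations: expose only what each trust level should see.
const fullView = { get title() { return genuineDocument.title; } };
const redactedView = { get title() { return '[redacted]'; } };

// Two separate instantiations, one per trust level.
const sensitiveForTrusted = makeSensitive(fullView);
const sensitiveForLessTrusted = makeSensitive(redactedView);

console.log(sensitiveForTrusted.readTitle());     // 'secret dashboard'
console.log(sensitiveForLessTrusted.readTitle()); // '[redacted]'
```

Both instantiations reference something called `document`, but neither can reach the other's view or the cookie on the genuine object.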
C
A
What's been generally going on is that people run out of time and they drop off. That's one of the reasons why we record the sessions. At some point we'll sort of have a general sense that now's a good time to adjourn, but why don't you go on for, you know, a small amount of additional time until you get to a good stopping point, and then I think we'll go ahead and adjourn for the day.
B
C
D
C
B
C
C
So the next thing I want to show is: you know, we were just looking at all that config generation; that's the code that's up here, and then at the bottom I've appended some overrides to that auto-generated config, and here's where I could constrain the endowments, or I could add some attenuations that, you know, static analysis was unable to...
A
A
C
A
C
B
A
You can overcome that locally by using defineProperty rather than assignment, but it's also the case that, for a small number of these things, like specifically Object.prototype, we're going to be repairing those so that assignments work. So if you want to make it work in the short term, you can change it to defineProperty, or you can wait for us to fix the bug for the small number of objects that should be overridable, like Object.prototype. Okay.
C
Great, so I'll look for that fix. In the meantime, I've created this option that lets you just disable or skip SES, but what it's doing is instantiating these modules via a plain eval in the normal root JavaScript context instead of inside the SES iframe. This could lead to identity discontinuity... oh.
C
And there are other reasons why... you're familiar with this one, so I wanted to just point that out and why we have it here. But I've had to do that for a number of other reasons; this one's doing it via toString, so it's sort of the same issue there. Some things are doing just really strange things, and I briefly described them here; I probably should have included links to their source, to the offending code. But basically, through trial and error, I added disableSES in a number of cases.
A
C
Yeah, so that's probably going to be your suggestion, and in a bunch of these cases I'm not the author of these modules, and they may appear in the middle of my dependency graph, so it's sort of out of my control, though I could, you know, rewrite them or fork them and then sort of insert them into my dependency graph that way. But in general, the goal of this project is to be able to operate on most stuff on npm out of the box.
A
Okay, this one goes back to the previous question I was asking about module styles. It sounds like you are trying to accommodate a significant amount of legacy code that should be SES-compatible, in which case the legacy style, best explained in the safe-modules document, is probably going to relate more closely to what you need for the legacy code.
A
C
A
C
B
C
C
A
A
C
...like accidental incompatibility with SES. Okay, so I'll look at the rest of these in less detail, just to get it over with. This one... no, it's trying to figure out what the global is. I'm not sure what it's doing specifically, but some of these modules are meant to run in a bunch of different places, and, you know, global might not be there, window might not be there, self might not be there; and then it's also trying to protect against, like, accidentally getting the wrong global.
C
D
We really need to figure out a way to do essentially ahead-of-time dead-code elimination, to kind of circumvent these UMD wrappers, as they're generally called, the letters standing for Universal Module Definition, which is not technically accurate. But this is something that's in a lot of packages: they're trying to detect your outside environment to determine what you should do, what you should evaluate to, and so the only thing I can imagine us being able to do is using some presets of, hey...
C
In the current implementation of SESify, I always expose a window object that's just an object standing in for the global; it's not the actual window. And then, when I'm doing my static analysis, if I see that they're just using window and it doesn't look like they're using any sub-keys on it, then I don't expose window to it; and if I see they are using sub-keys, I just ignore the window part and only expose those sub-keys on an object called window.
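The sub-key heuristic just described could be sketched as follows, starting from the output of the static analysis (which is assumed here, along with all names) rather than performing it:

```javascript
// Given the member accesses static analysis found (e.g. "window.location.href"),
// build an endowment object named `window` exposing only those sub-keys.
function makeWindowEndowment(accessedPaths, realWindow) {
  const subKeys = new Set(
    accessedPaths
      .filter((p) => p.startsWith('window.'))
      .map((p) => p.split('.')[1])
  );
  const fakeWindow = {};
  for (const key of subKeys) {
    fakeWindow[key] = realWindow[key]; // expose only what was observed in use
  }
  return fakeWindow;
}

const realWindow = {
  location: { href: 'https://example.com' },
  alert: () => {},
  secret: 123,
};
const endowment = makeWindowEndowment(['window.location.href'], realWindow);

console.log(Object.keys(endowment)); // ['location']
console.log('secret' in endowment);  // false
```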
A
B
C
C
Okay, I'll leave that one... it seems you're saying a constructor, okay. Aaron?
A
I think I'm going to declare adjournment unless there's an objection. Well, I think we kind of have the picture here, and we can go into more depth another time. Before we go into depth that other time, I encourage you to read our draft SES modules document and to take a look at the Wyvern documents that Darya posted.