From YouTube: SES-mtg: lightweight safe parsing of SES and Jessie
A: Okay, so since November I've been exploring parsing in a completely different way, you know, removed from common ways of parsing JavaScript. My idea was to create a regular-expression-based parser that at runtime can safely find certain features and not be, you know, impacted by sometimes unexpected things happening in between. So it's a balance between knowing enough, but not knowing everything and not knowing too little.
A: So the idea is obviously you have an opener and you expect the closer. The idea is that at runtime, if there is a syntax error, parsing or not parsing, the runtime will throw anyway. So we're basically running it through, I guess, validation, or I don't know what to call it, but we're exploring, you know, how well this fares and whether there are better ways to improve it.
B: My concern is that, you know, the use case that we started with is a new space that's tremendously important for a lot of our purposes, which is to cheaply, without the overhead of a full parse, accurately recognize things like import expressions, so that you can translate them into something else, leaving all of the text between them as just pass-through text rather than having to regenerate it from an AST.
B: I think that's a good goal; I think that's an achievable goal and a valuable one. But if you don't tokenize accurately, and the inaccuracy has the characteristic that a correct JavaScript program, which correctly should be seen as having one token sequence, is recognized by the recognizer as having a different, correct token sequence, then, just very specifically in terms of this use case, it might misunderstand what looks like an import expression that's inside a comment or a quoted string or a regular expression. You may misunderstand that to be a genuine import statement, and vice versa.
A: Yeah, and that is definitely... I'm at a point right now where I want to find cases where this occurs and see good ways to actually, you know, evolve the approach, to find good ways to avoid, you know, falling into these problems. But up until this point, and I'll be very, very honest, my goal was to not throw anywhere until I see output, and to see when the output looks wrong.
B: This is an example that comes from what we're doing with Jessie: just don't do semicolon insertion. Since semicolon insertion is only triggered by what would otherwise be a parse error, just treat it as a parse error and reject the program if the program would call for semicolon insertion. If you take that approach, then accurately tokenizing will become much, much easier.
A: So I think running it through code that exists out in the wild is about the best thing I can try this on at this point in time. Up until this week I could not really build up enough courage to refactor my experimental work, but I did this week. Having that part done has given me confidence to actually know that, okay, at least one or two people will be able to reason about this and tell me about something I missed. I think... I don't have a definition for this.
A: In my lists... sorry, yeah, I got distracted. I'll get it out of the way. Where was I? Where was it, yeah? It should be here, right. It should be after this one or before it. Let's see if we fix that; they tell you never to do that live, right. So where were we when that happened?
A: Yeah, so this has been kind of the artistic side: I like how this looks, and the color theme as well. Those are the creative parts, you know, for when I can't write code. So I picked a color theme with colors that basically work on both light and dark, with very, very little adjustment to the saturated colors.
A: So, obviously I'm keeping my level of knowledge up until this point to the bare minimum. I have not started to explore what we need to know beyond top-level, because I was thinking: I want to know exports that are destructured, you know, export const destructured, whatever. That has been the one thing that made me come to the conclusion that regular expressions alone are not ever going to work.
A: So the idea that people said regular expressions are not the way to go, that you have to do a while loop, you have to scan, tokenize and rewrite, you know, create the AST and then unfold the AST: all of that I thought was very good, until you actually have to use a third-party AST and they don't have a plug-in for something that has already landed in the browser, and you can't use it in the browser if you use a tool that uses this AST transform.
A: You know, I might not be able to give a good summary. There have been, I mean, seven months of thinking about details, and maybe I don't summarize well or explain well. So what I look for, when I'm checking whether or not I'm parsing correctly, is whether I see a keyword that is not supposed to be a keyword.
B: It does not satisfy the use case I had in mind, where the source code might be constructed by an adversary for purposes of fooling your tokenizer. Actually, at Caja at Google, an earlier version of SES was actually using the Acorn parser, and the Acorn parser actually had some mistakes in tokenization, and we had several responsible disclosures against SES. At least one, I think, was due to somebody figuring out a way to maliciously construct a source code string that hit one of those edge conditions that Acorn got wrong.
A: Yes, so I would like to maybe restate what you said about this not satisfying. It's not yet satisfying, because I'm at a point where, like, I've done a lot of work, but I've done all this work independently. I want to start checking where I need to focus, where, you know, even...
B: You're not going to get that by testing this on normal programs, even on a very large corpus of normal programs, because programmers, in order to write reasonable code, avoid writing code that hits these tokenization edge cases. You have to find a test suite that really probes these tokenization edge cases and try yours against it.
B: I don't think you're going to get to safety while doing incremental engineering on this approach by testing it against existing code. I think you have to start by constructing a state machine based on the grammar definition. And what I'll refer you to is Tim Disney's dissertation. He did a very interesting hygienic macro system for JavaScript, in which, you know, he does a full parse and all sorts of crazy AST stuff, almost all of which is not relevant to what we're talking about here.
B: And, if I recall correctly, you only needed something like four states. Each of those states had a different set of rules for recognizing the next token, and depending on what the state and token were, that put you into another state. It wasn't just a little state machine; you have to additionally have logic for one context-dependent thing.
B: The one thing that's context-dependent, which is the thing you've already done, is the bracket counting, yes? So that you know when a substitution hole inside a template literal is closed. So you've already got the, you know, context-dependent part, already doing the bracket counting, assuming that you've recognized a bracket. But I think, in order to avoid a full parse but accurately tokenize, I would start with the tokenizer in Tim Disney's dissertation.
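The bracket-counting idea above can be made concrete with a small sketch. This is a hypothetical illustration with invented names, not the code under discussion: a mode-aware tokenizer keeps a stack of openers so that the `}` closing a template-literal substitution hole resumes template mode instead of being reported as an ordinary closing curly brace.

```javascript
// Hypothetical sketch (invented names, not the implementation under
// discussion) of a tokenizer with a bracket stack, so that the `}` closing
// a template-literal substitution hole resumes template mode.
// Assumes well-formed input; error handling is omitted.

const TEMPLATE_HEAD = /^`(?:[^`\\$]|\\[\s\S]|\$(?!\{))*(`|\$\{)/;
const TEMPLATE_RESUME = /^\}(?:[^`\\$]|\\[\s\S]|\$(?!\{))*(`|\$\{)/;

function* tokenize(src) {
  const stack = []; // '{' for plain braces, '${' for substitution holes
  let i = 0;
  while (i < src.length) {
    const rest = src.slice(i);
    let m;
    if ((m = rest.match(/^\s+/))) {
      i += m[0].length; // skip whitespace
    } else if ((m = rest.match(TEMPLATE_HEAD))) {
      if (m[1] === '${') stack.push('${'); // entering a substitution hole
      yield { type: 'template', value: m[0] };
      i += m[0].length;
    } else if (rest[0] === '{') {
      stack.push('{');
      yield { type: 'punct', value: '{' };
      i += 1;
    } else if (rest[0] === '}' && stack.pop() === '${') {
      // this `}` closes a hole: resume template mode
      m = rest.match(TEMPLATE_RESUME);
      if (m[1] === '${') stack.push('${');
      yield { type: 'template', value: m[0] };
      i += m[0].length;
    } else if (rest[0] === '}') {
      yield { type: 'punct', value: '}' };
      i += 1;
    } else if ((m = rest.match(/^[A-Za-z_$][\w$]*/))) {
      yield { type: 'word', value: m[0] };
      i += m[0].length;
    } else if ((m = rest.match(/^\d+/))) {
      yield { type: 'number', value: m[0] };
      i += m[0].length;
    } else {
      yield { type: 'punct', value: rest[0] };
      i += 1;
    }
  }
}
```

The stack is what lets the same `}` character mean two different things, which is exactly the context-dependent piece a pure per-token regex cannot carry on its own.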
A: You know, there hasn't been peer review or anything, so I might not be using the right terms to describe things, and I might have misunderstood some concept. So my hope is, you know, that they do get refined. As long as we find cases that prove satisfactory, or at least interesting, then, you know, I'm definitely hoping I can put more effort into it, I think.
A: Well, it was initially... you know, I think we covered this earlier, at least: that at runtime, if you wanted to make very, very small modifications to code, it's reasonable, you know, to expect that you can install Babel and have it, at runtime, transform your code and run it. But if you're wanting that privilege in the browser, or in a place where you're restricted in memory or otherwise, then Babel is maybe not an ideal thing to have on your client side, but rather server-side.
A: I want to get to where normal text is okay, and then refine for the adversarial case. For my case, it was not the initial design, but it was definitely triggered by some people saying that if you parse with regular expressions, or tokenize with regular expressions, you will always be less secure than other kinds of, you know, approaches. And I really wanted to say that there are two things you're confusing here. Regular expressions that are written badly can be hijacked; a while loop that is not written with the right scanning logic can also be hijacked. So it was really driven by that: if you use a regular expression that is written badly, alone, a single regular expression is never going to work. But if you are switching between regular expressions and you don't write the right regular expressions, then you will be as vulnerable as a scanning tokenizer with a while loop that goes over every single character but doesn't have the right logic running. So regular expressions or not is not really the problem; it's just the...
B: So, and you should correct me if I'm off, but I believe that, once you know what mode you're in, you can use a regular expression to recognize the next token, but only the next token, because that recognition might put you into a different mode, and you need to know that you're in that mode in order to pick the regular expression to use to recognize the following token.
A: Exactly. And, aside from a glossing definition for regular expressions (because I didn't want to write the mode for a regular-expression parser yet, and honestly I didn't know how the design would look if we're going to switch modes): aside from regular expressions, I tokenize one token and see how it affects my context, whether it opens something, whether it closes something, you know.
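The "one token at a time, current mode picks the regex" idea can be sketched as follows. This is a hedged illustration with invented names, not the actual implementation; it shows the classic JavaScript mode case where a leading `/` is a division operator after a value but starts a regular-expression literal otherwise. JavaScript's sticky (`y`) flag anchors each match at `lastIndex`, so exactly one token is recognized per step, and the token just matched decides which pattern recognizes the next one.

```javascript
// Hypothetical sketch: mode-switched sticky regexes, one token per step.
// After a value (identifier, number, `)`, `]`), a `/` means division:
const AFTER_VALUE = /\s*(\/(?!\/)|[A-Za-z_$][\w$]*|\d+|[()[\]{};,=])/y;
// Otherwise, `/.../flags` is a regexp literal, matched as ONE token:
const BEFORE_VALUE = /\s*(\/(?:[^/\\\n]|\\[\s\S])+\/[a-z]*|[A-Za-z_$][\w$]*|\d+|[()[\]{};,=])/y;

function tokenizeLine(src) {
  const tokens = [];
  let mode = BEFORE_VALUE; // at statement start we expect a value
  let pos = 0;
  while (pos < src.length) {
    mode.lastIndex = pos;     // sticky: match exactly here, or fail
    const m = mode.exec(src);
    if (!m) break;
    pos = mode.lastIndex;
    const tok = m[1];
    tokens.push(tok);
    // the token just recognized decides the mode for the next one
    mode = /^[A-Za-z_$\d)\]]/.test(tok) ? AFTER_VALUE : BEFORE_VALUE;
  }
  return tokens;
}
```

With this shape, `x = /ab/g` yields the regexp literal as a single token, while `a / b` yields a division operator, even though both start the third token with the same `/` character.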
A: You know, compared to other languages that I'm interested in parsing, like HTML and CSS: a quote is something that starts a string, that's about it, and a comment delimiter is something that starts a comment. And then I have a closure, and this can be anything that shares mostly the grammar of the language, you know, like we're talking about literal curly braces or square brackets or, you know, round parentheses, regardless of whether or not they're a function body or an object. At this point I'm...
A: So, like the curly braces: there is a need to actually refine tokenizing in a closure, but I did not seem to hit a case that required a lot of, you know, experimenting or exploration in this area. I found that a closure did not affect my safe tokenization of things so far, from whatever closures look like in the JavaScript grammar.
B: So in the Tim Disney approach, he did not need to recognize closures. He was avoiding parsing; his state machine had to deal essentially just with what's necessary for the next token, you know, and whether you're in a closure has nothing to do with how to recognize the next token.
A: That's true, that is definitely true. But from trying to determine if it's an import statement or an export statement, as in static imports and exports, as opposed to import.meta and these other, you know, other uses for import: static import and export statements are always top-level; dynamic imports are, whatever you want to do, they could be top-level, but import.meta can really occur anywhere as well, just like a dynamic import.
A: Rather than, you know, a long process of determining top-level versus not: for an export statement, I basically said, if there is a closure on the stack, then it's not top level. It didn't take much more work for me to have this here, but it also allowed me to say that if I wanted to refine, based on closures, other language features that are not just import and export, I already have, in the design of the tokenizer, the efficient way to do it.
B: So what I think I'm getting from this is that you're not really just trying to tokenize. You actually are trying, to some degree, to parse, in order to create this nested context object, so that whatever transform you would like to do can still be conditioned on some parsing information. (A: True.) So, like, whether or not you're in a closure is information that conceptually comes from a parser, not from the tokenizer. (A: True.)
A: If I wanted partial ownership, yeah. So it is really, like I call it, a semi-contextual parser. It tells you it's a curly brace; if you want to know whether it's a function or not, find the parent opener and look before it: does it look like function? Then, you know, look after it: does it look like a function? That's all. There's no work that it will choose to do for you, but you don't have to start from the beginning; all you have to know is where the curlies occur.
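As a rough illustration of that "find the parent opener and look before it" idea (hypothetical code, not the actual implementation): given a flat token list, a consumer that wants to know whether a `{` opens a function body can walk back over a parenthesized parameter list and check for the word `function`, without the tokenizer itself doing any of this.

```javascript
// Hypothetical sketch: decide whether the `{` at tokens[braceIndex] looks
// like a function body by walking back over `( ... )` and checking for
// `function`. Arrow functions and methods are out of scope for this sketch.
function looksLikeFunctionBody(tokens, braceIndex) {
  let i = braceIndex - 1;
  if (tokens[i] === ')') {
    let depth = 1;
    while (depth > 0 && i > 0) {
      i -= 1;
      if (tokens[i] === ')') depth += 1;
      else if (tokens[i] === '(') depth -= 1;
    }
    i -= 1; // step to the token before the `(`
  }
  // `function (...) {` or `function name (...) {`
  return tokens[i] === 'function' || tokens[i - 1] === 'function';
}
```

The point of the sketch is the division of labor: the tokenizer only reports where the brackets are, and this kind of after-the-fact inspection is a cost paid only by consumers who need it.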
A: Well, maybe I couldn't find a better word to use here, but my "keyword" is not a JavaScript, not an ECMAScript keyword. My "identifier" is rather a maybe-identifier, and my "keyword" is potentially a word of significance. Determining whether it's an ECMAScript keyword or not is, again, just like determining what the curly braces mean: it is something that, if that's what you need to be doing, then whatever tokens you get here will have enough information for you to determine whether or not it's an ECMAScript keyword. I just used a prefix notation, I think that's what they call it, a prefix-notation strategy, to not identify a keyword in a position where that cannot happen. And it has held true: every time there has been a JavaScript keyword, it has been red; the only time where I've seen red words that are not JavaScript keywords is when they were keys in an object literal.
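That heuristic might be sketched like this. It is a hedged simplification with invented names, not the actual prefix-notation strategy: a reserved word is only reported as a keyword when its neighbors allow it, so `import` used as an object-literal key or a property name is left alone.

```javascript
// Hedged sketch: classify a reserved word by its neighbors. The strategy
// discussed above is richer; this shows only the object-literal-key case
// mentioned (`{ import: 1 }`) plus property access (`obj.import`).
const RESERVED = new Set(['if', 'else', 'for', 'return', 'import', 'export']);

function classifyWord(tokens, i) {
  if (!RESERVED.has(tokens[i])) return 'identifier';
  if (tokens[i + 1] === ':') return 'property-key';    // { import: 1 }
  if (tokens[i - 1] === '.') return 'property-access'; // obj.import
  return 'keyword';
}
```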
A: I've been trying to comb through it. I have, like, JSON data being collected from regular expression occurrences; that's what I've been interested in so far. But I still haven't gone through to sift and find, you know... I think I ended up matching against the Visual Studio Code matcher; it wasn't enough, and the matcher I used probably has a lot of false positives. It's a long way for me to distill a good set of, you know, expressions that would be worth, you know... yeah, it would be worth exploring.
B: But, so this is a great example. Once you mis-tokenize the opening slash, then all of the other cases I was talking about easily follow. You can construct a program where there are import statements inside comments, inside literal strings, inside regular expressions, and genuine import statements, and you can construct programs in which your tokenizer will misunderstand any of them.
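Those adversarial cases can be made concrete. Below is a hedged illustration (not the code under discussion) of the difference: a naive word search reports `import` inside a comment and a string, while a pattern that consumes comments and string literals as whole units skips them. This sketch deliberately does not handle regular-expression literals, which is exactly the hard "opening slash" case raised above.

```javascript
// Naive approach: every occurrence of the word `import` is reported,
// including ones inside comments and strings.
function naiveImports(src) {
  return [...src.matchAll(/\bimport\b/g)].map(m => m.index);
}

// Token-aware approach: comments and string literals are matched as whole
// units *before* `import` can match, so occurrences inside them are
// consumed and skipped. Regexp literals are deliberately NOT handled here;
// mis-tokenizing an opening slash is the remaining hard case.
const SKIP_OR_IMPORT =
  /\/\/[^\n]*|\/\*[\s\S]*?\*\/|"(?:[^"\\\n]|\\[\s\S])*"|'(?:[^'\\\n]|\\[\s\S])*'|\bimport\b/g;

function tokenAwareImports(src) {
  const out = [];
  for (const m of src.matchAll(SKIP_OR_IMPORT)) {
    if (m[0] === 'import') out.push(m.index);
  }
  return out;
}
```

On input containing an `import` in a comment, one in a string, and one genuine import, the naive scan reports three hits and the token-aware scan reports one.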
A: Okay, yeah, so I'm eagerly matching regular expressions again, so this is where I expect all the failures at this point. And, you know, again, it's not like I'm claiming that there are no weaknesses. Actually, no, there are weaknesses, and I appreciate this part. I'm going to have to think about that one. Okay.
B: Well, you know, my concern is that, in order to get to correctness with regard to these issues, you can't just incrementally adjust your code when faced with each individual difficult case like this. You have to go back to the grammar definition and construct a principled mechanism of some sort, a state machine if needed, or something. You have to really think through constructing a tokenizing mechanism, yeah, that is derived from and adapted to the ways in which the grammar is hard to tokenize. Yeah.
A: So I'm probably going to add a layer for automatic semicolon inference before tokenizing the next token. At this point I'm inferring by design, like, I designed things and it was inferred right so far, and then when I tested this case: no, like, I need to be more intentional about my semicolon affecting the next step.
B: If I recall correctly, in order to tell whether a semicolon is inserted, you have to look ahead one token, yeah. You distinguish, based on one token of lookahead, whether what you've seen before is potentially the beginning of a valid parse; and only if it is not, and, with the semicolon, the tokens so far without further lookahead instead do form the beginning of a valid parse, in that circumstance you do the semicolon insertion.
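The general rule described above needs parser state (is the prefix a valid parse with and without the semicolon), but one narrow piece of automatic semicolon insertion, the restricted production after `return`, can be decided from the token stream alone. A hypothetical sketch, with invented names:

```javascript
// Hedged sketch of one narrow ASI rule: `return` may not be separated from
// its argument by a line terminator, so a newline right after `return`
// forces a semicolon. The general lookahead rule described above needs
// parser state and is not attempted here.
// Each token is { value, newlineBefore }.
function asiAfterReturn(tokens) {
  const out = [];
  for (let i = 0; i < tokens.length; i++) {
    out.push(tokens[i].value);
    if (tokens[i].value === 'return' &&
        tokens[i + 1] && tokens[i + 1].newlineBefore) {
      out.push(';'); // ASI kicks in: `return \n x` means `return; x`
    }
  }
  return out;
}
```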
B: I remember, when I designed Jessie, one of the explicit goals in the Jessie grammar was to omit enough of the JavaScript grammar that I did not need many parameterized productions. I think there were four parameterized productions altogether: one was at the token level and three were at the parser level, or it might have been the other way around, I don't remember; it was somehow split three and one. It's documented in the original write-up.
A: You know, they don't have anything that would trip my logic, as far as I see, because I'm not trying to say that else is a JavaScript keyword. I'm just saying that else, if it's not a JavaScript keyword, it's surely inferring that it should have been one, and it's a runtime error to execute this code; and it's up to you to decide if you need to do something with the else, whether it's a valid thing or not, if that's important to you. That's my thought at this point.
A: Yeah, so I'm saying that, as far as my tokens are concerned, if you are looking for a for statement, you will come here. If you're interested in the for statement syntax and whether or not it's valid, that's something you would start to do with the previous and the next token, and the additional imperative work that you will do here is a cost that you incur if you want to incur it.
C: Gladly. So I took a different approach to the whole import/export transforms. Basically, I adjusted the Jessie quasi-parser grammar to be extended into a grammar that does not create ASTs for most of the terms of the source code; instead, just for the constructs of interest. So for Jessie, specifically, the transformation is from a valid Jessie program into an AMD-style define module.
C: So this is the grammar that I create from Jessie, and the main difference is that, for the different module items in the body of the module, I use this angle-bracket notation, which, in my PEG parser, means the semantic value of that production is exactly the source text between the brackets. (Oh, cool.)
C: This is specifically with... let's see, I have... oh yeah.
C: No, I have to look more carefully for that. But what I was finding is that I would like to export a maker, but also have some properties on that maker. Okay, that was difficult to do, because I couldn't... you take the function, right; once I define the function with the rules that we have around it so far, then it couldn't go any further.
C: Yeah, that's correct. Okay, yeah. So the way that the rewriter works is that, basically, from the import/export statements that we have below, this $h_define is going to be declared in the library function as the way to define modules. The $h underscore prefix basically means it's hidden for us; it's an inaccessible symbol otherwise. So then... what is this enforcing? Yes, it's...
C: Yeah, exactly. Okay, yeah. So this $h_define is basically the AMD define, okay.
C: So this imports from foo, imports some stuff from foo; so the first argument to define is the foo module dependency, and then it has a factory function that is called with foo's resolved exports.
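To make the shape concrete, here is a hypothetical, synchronous toy version of the pattern being described. All names are invented (this is not the real $h_define or loader): imports become dependency names, and the factory receives the resolved modules.

```javascript
// Toy synchronous module registry, illustrating the AMD-style rewrite shape
// discussed above. Invented names; not the actual $h_define implementation.
const registry = new Map(); // module id -> exported namespace

function defineModule(id, deps, factory) {
  // resolve dependencies (assumed already registered), then run the factory
  const resolved = deps.map(d => registry.get(d));
  registry.set(id, factory(...resolved));
}

// what a rewritten `foo` module exporting `f` might register:
defineModule('foo', [], () => ({ f: () => 42 }));

// what a rewritten module doing `import foo from "foo"` might look like;
// the factory body is the module body, passed through mostly as text:
defineModule('main', ['foo'], (foo) => ({ result: foo.f() }));
```

Note that nothing in this shape forces asynchrony: the dependency list is visible before the factory runs, so a loader is free to resolve it synchronously or in phases, which is the point raised later in the discussion.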
C: Yeah, so I need to... I've been trying to do research on this. I haven't found much that is conclusive, and I find the semantics in the spec to be difficult to grasp exactly for this concept. But when the module is instantiated, what should be returned? Like, what does the module record... and is default part of the namespace, or is default outside the namespace? That was one question I would have.
B: So part of the reason why I thought to have Jessie only have a default export was to avoid all those questions. When you want to export a function and a bunch of other stuff, within the Jessie that I had in mind, you would just name the function as well, and basically, you know, the normal case is that the only default export that you use is still a record of named members.
C: Okay, so I guess I do actually want to know what the real answer is, because I'm trying to find that out: do I need to somehow wrap the namespace in an object that can contain either a default or a namespace or both, or is the default actually supposed to be part of the namespace object? That's something I'd never thought about.
B: So this map: you're thinking of it as AMD-like, including the code inside here. For this generated code, is there any way in which the generated code embodies an assumption of asynchrony, or is it the case that, with a different definition of $h_define, you could actually run this in a synchronous manner?
C: There is, in the sense that you can see which the imports are before you actually evaluate a module. So the polyfill has an AMD loader that is significantly simpler than RequireJS, and actually they have phases to the loading, in which they define the module, initialize it, load it, and so on; and in any of those phases you can... that's basically how binding up the circularity works: making sure that it happens in a different phase.
B: Having a simple translation from imports into evaluable scripts, and then having a packaging logic which we were really super-confident preserved semantics. Because, for example, with Rollup we've continually gotten into problems where it's renamed something without checking, in some sense renaming something where the code itself was actually dependent on the name, right, yeah. So over and over again we've been screwed up by Rollup not being semantics-preserving, and we've gotten into different problems with different packages.
B: Obviously, for security work, we have to be confident that the program that's executed means what the programmer wrote, according to the programming-language semantics as interpreted against the original source. So it seems like what I'm seeing here would work equally well as a transform from SES modules to an evaluable script.
B: No, so for the immediate work purposes, doing the rewrite and packaging at build time satisfies our immediate needs, but for that we could just do a Babel full-AST-based translator, right. But eventually it would be very nice to have the kind of thing that Solomon's talking about, because it is the case that, if you can accurately lex, then you can just trigger the parsing that you need and ignore all the rest of it.
C: This is my test case; this is actually a Jest test, okay, exercising the translator. Yeah, I'm comparing my translate function's output to the translated text. The translator returns a promise for an object with the evaluated result, as well as the translated text, so I'm just comparing with the translated text. (I see, I see.)
A: Okay, sorry, Mark, you said a statement a few minutes ago about the eventualities. Yeah, so I basically would like to phrase this as: I thought of these eventualities as something people are staying away from, and I wanted to start driving towards that eventuality. I'm a long way from getting there, and I don't think anyone ever thought that I would be the person doing it. I really hope that, you know, that direction will have more interest and other people will start, you know, being involved.
A: Because, you know, as far as my own ends and goals for this, like, I've already used it to develop a lot of things that I've been trying to develop for a long time. But I know that the idea of, at runtime, doing as little parsing as you need, with a parsing approach that doesn't carry the overhead intended for development tooling, is obviously, you know, an eventuality that people are not trying to gear towards, and so, I don't know.
B: So Jessie modules, all of which, I'm sure, should be insensitive to whether you're evaluating them asynchronously with regard to each other or synchronously, and also probably pretty insensitive to the order, as long as the dependencies are respected. SES, in order to be compatible with more code, also admits resource modules, and resource modules importing resource modules can be stapled to each other. So if the linking of ECMAScript modules to each other doesn't run into user code, the semantics of it, and therefore the linking, should not be order-dependent.
A: So the initial design of the SES frame and Jessie frame will assume that you're loading code from a trusted origin that is only giving you trusted code, and then we start securing that mechanism in order for us to actually allow it to get code from untrusted sources. And I think it's a very, very big journey, you know, from my perspective, to go from the initial concept to runtime parsing that can definitely guard against that kind of malicious code.
B: So for Jessie, if both the benevolent and the malicious code are supposed to be in Jessie, and if, when they're not Jessie, it's okay to statically reject them, then I think you should pretty much already be there, because Jessie already has an accurate parser written by Michael, which will already, I believe (and Michael, correct me if I'm wrong), with confidence reject anything that is not syntactically valid Jessie, at least syntactically.
B: I think that what Michael showed does translate the import and export statements into something that you should be able to work with directly, and for Jessie, as opposed to SES, doing this even based on the full parse should still not be that onerous. Doing the full parse of SES is onerous, and I'd like to avoid it, but I wouldn't be too scared of doing a full parse of Jessie.