From YouTube: Node.js Foundation Modules Team Meeting - July 18 2018
Description
A
This is the July 18th Node Modules meeting. We've got quite a packed agenda today. Initially we've got 15 minutes to update on a lot of in-progress items, at three minutes each, so I think it's worth just getting straight into it so we don't go over. The first item we've got is that Myles has been working on a pull request that implements import.meta.require in Node core, and so we've got three minutes to discuss any thoughts and feedback on this approach and what people think of it.
C
So, import.meta.require: this PR is being used as the way to load CommonJS. We've often talked about the ability to import CommonJS; this is kind of an alternative route where you use import.meta.require as an entirely disparate module system, so you don't connect the CommonJS module system to the ESM module system. There are a variety of reasons you might want to do this, and there are still a lot of design concerns to talk about with it.
D
Sure, I just have a question for Bradley. I've been persuaded in the past that the fact that import.meta.require is an imperative function call, so you can't get it to happen before imports, is one argument for transparent interop, that is, importing CommonJS with import declarations.
C
So, to start with: it's not the only concern, but that does let you do the proper timing of what you want to do. We've had objections around using wrapper files for a variety of issues, and some of the same arguments apply there. People argue that things like named imports from CommonJS need to exist and that we can't use .mjs wrappers for those; it's kind of the same deal here. For me, we could encourage that behavior, but it's not the biggest problem to me.
E
So I don't want to dive into it now, but I have a philosophical objection to the PR, unrelated to its technical content, which is that it represents an implementation for something for which we do not yet have consensus. I really don't want us to build cars when we don't know what kind of car we're building, and so I'd like to have, like...
A
Okay, so we're a couple of minutes over. The next one is an important one: thinking about deadlines. This is the post that Myles put up, basically saying that there's quite a strict deadline to get in for the Node LTS release, and asking what we think about that, in terms of how we're going to steer a lot of the work that we're doing around those deadlines, and whether those deadlines can form part of our constraints or not. Any thoughts on that?
E
Yeah, same sort of thing. I think if expediency was the goal, we wouldn't have convened this working group, because decisions were already made. The purpose of this group is not to ship modules fast; it's to increase our confidence that we're shipping them correctly, for whatever feature set that means. So I think in no way should we be accelerating towards shipping until we've at least agreed upon roughly what we're gonna build, and we're not close to that right now.
F
Yeah, two quick things. First quick thing: I think it would help if we would try to focus just on changes since the last meeting, because I feel like we are basically having some of the discussion that we had last week again, which I don't think is a good use of those three minutes. The second thing is, I don't think we have anybody taking notes right now.
A
So, alright, let's have a look at the terminology document. This is something that we're making some progress on, in terms of trying to write a document to establish an understanding of a lot of the terms that are involved here. I think "transparent interop" was one of the terms that we decided to deprecate and no longer use, instead talking about "require interop" or "import interop", and in terms of maintaining this document going forward, there are probably going to be more decisions like that. Does anyone want to give updates on this document? Was it Sala who's managing this, yeah?
A
We don't have anyone here today who's been involved in this document, but if I can just speak to what I think might be useful: it would be to have the document become something that's referenced as a sort of living document, which is just what I was suggesting today, that we can then update as we come to decisions on terms, and keep updated as we're going through, so we make sure that we keep a baseline of where we are on terms.
B
In terms of getting a survey out to ask questions: I've been involved in a couple of the other ones and I'm part of the user feedback group, so I can help to facilitate that. But in terms of getting the questions, I think we need this group to basically agree on what those are, and then we can move it forward, right?
A
Sure, thanks Bradley. All right, and then the last one was one I managed to sneak in here on the agenda. I posted a PR with some adjustments to the features list, marking some duplications and just trying to update it, as well as adding some of the features that have been noted more recently that aren't on that readme page. It seems worth keeping them, from my perspective, just because those are the things people see when they land on the page.
A
What I would like to do ideally, as well, is maybe start moving towards splitting up the features list into goals, so ordering them by what type of features they are and what milestones they apply to. So that's what I wanted to do after this; it'd be great if we can do this as a first step, and then second, I wanted to try and split it up by, like: this is baseline, this is the next milestone, this is whatever phase the feature applies to.
E
If you have "import React from 'react'", how does the browser know where the string "react" comes from? A package name map would provide a kind of static mapping from the specifier to a final URL, and the package name map also has some structure in it, so that you can have nested things. You can say that anything that starts with "foo/" follows this URL mapping, and then it just appends whatever comes after it.
E
So when you're doing deep imports, for example, you could say that anything that's under "lodash/" goes under a certain URL, things like that. The relevance to us is that it seems useful that, number one, people should be able to easily create a package name map by default, so that they can take their code, the code that runs in Node and is installed with npm and so on, and use it in a browser; and then, separately, whether Node itself should support them is the question.
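The mapping described above can be sketched as a small manifest. This is only a rough illustration: the package name map proposal's exact shape was still in flux at the time, so the field names here (path_prefix, packages, main) are assumptions loosely based on the early draft, not a stable format.

```json
{
  "path_prefix": "/node_modules",
  "packages": {
    "react": { "main": "index.js" },
    "lodash": { "main": "lodash.js" }
  }
}
```

Under a map like this, "import React from 'react'" in a browser would resolve to /node_modules/react/index.js, and a deep import such as "lodash/fp" would resolve to a file under /node_modules/lodash/.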
B
The question just was: how does this fit into the overall priorities? It sounds like something that maybe helps us in terms of not having to worry about bare imports or whatever, but is it something that we need to prioritize in terms of compatibility, or is it something we can sort of think about later on?
G
So in that sense, it's a compatibility concern for me. One thing that was discussed was: well, if they're importing a package and it doesn't have a package name map with it, we can use whatever Node's current algorithm is, and then, if the package name map is present, Node will prefer that instead. That could work, but then, as packages add these things, there could be breaking changes in terms of other downstream apps that do imports from them.
G
Suddenly the imports don't resolve on the new version the way they did before. That could always have happened in general, if they were doing a deep import and the package author changed their paths, so maybe it's not such a big concern; I don't know, I'm just raising it as a concern. But I do think there's a lot of value in aligning with the browser on this, since having the same code execute the same way in both places is definitely a positive for the ecosystem. Does that answer your question?
C
I'm just saying I completely disagree and don't think it is the same level of compatibility concern as Geoffrey does. Node and the web browsers are going to vary in a lot of ways, and we need to determine what subset is compatible between them. We aren't ever going to enter a master/slave relationship, because of these incompatibilities in a variety of ways. So I think we're going to have compatibility concerns greater than package name maps, and using them as instruments...
C
For package name maps, I don't see compatibility problems with supporting the subsets of resolution in both environments. I see bigger problems with how caching works, how import.meta.url works, things like how importing JSON can work, HTML modules; there's this whole slew of things, and we aren't a subset of the web.
B
The last question was, and maybe it's because I haven't looked at them enough: I'm guessing that in the browser, some of those mappings go from the bare name to, maybe, an external URL, and I can't see that Node's gonna want to support that directly, necessarily, because that changes the whole security model, right? So that makes me wonder if there's a fundamental level of compatibility.
A
There's a definitional question there, and that's a whole story that's difficult for us to solve. I think we can focus on compatibility, and relative paths are the big compatibility one: whether we want to stop automatic extension resolution for relative paths, because that's the one thing where you'd be able to write something in Node that wouldn't necessarily work in the browser.
B
So I guess I was coming at it like: does it make sense to focus on, like Bradley just said, maybe providing input to the people working on package name maps to help address the compatibility issues, as opposed to thinking that our first priority is to actually implement them for Node? I don't know, just trying to find some way to prioritize; there's tons of work to get to where we're gonna get.
C
Loaders, yeah, cool; we're just gonna roar through this. The purpose of this presentation is to try to get people to stop thinking of loaders as some magical amorphous thing that lets you do anything you want, and instead think, in a concrete sense, about what they are, what they can do, and what they must do.
C
So what is an ESM loader? Whenever I'm talking about an ESM loader, I'm talking about something that uses the ECMAScript spec hooks in order to instrument the ESM module system, which essentially comes down to: you control how import works. You don't get to change fancy things about how modules work; you don't get to create new module systems and stuff like that. You can do that at the host layer, but that's not what a loader is doing. It's just how you instrument import.
C
There are some invariants that must be left up to the host platform. For loaders generally, the APIs we're going to use to create an implementation are not the same as the spec text, and we're not going to actually discuss anything about shipping a full implementation; we're just talking about how to instrument those import hooks, because that's all the spec gives us. There's a note on conformance; we have this come up every so often. Basically, there are some conformance constraints that the spec provides around optimization, especially hidden optimization.
C
Optimization is fine; there are even call-outs in the ECMAScript spec saying it's kind of expected, and there's even contradictory spec text which wouldn't work in the real world, because it would leak memory forever and stuff like that. So when we talk about conformance, we're talking about observable effects, not about the implementation; the spec is not an implementation.
C
So let's talk about the hooks that we've got to work with. HostResolveImportedModule: this used to be called something else; now it's called this. It intercepts all static import requests. You must return a module record for a request. You can return any kind of module record you want; it must extend Abstract Module Record, which is basically the abstract base class for all kinds of modules: for Wasm, for JS, for JSON.
C
Generally, we represent referring modules as a string, but really it's another module record, so just keep that in mind as we go on through this. This is generally what I call the resolve hook. HostImportModuleDynamically does the exact same thing, except for import as a function, dynamic import. It has the same idempotency requirement, so it had better work the exact same as static import, because once you statically import something, that also sets what dynamic import gives you. Okay.
C
So when you import "fs", you get the referring module, whatever module record your app is, and the string "fs", and you have to go and generate some kind of module record for it. It doesn't really matter how you generate the module record. So there are essentially two things we can do when we're generating module records. If we return a module record and we want to create a new type, we're gonna take something like, for example, module.exports = 123, and we're gonna actually create a Source Text Module Record, just because that's the only concrete class of Abstract Module Record that V8 lets us do anything with, and we're going to convert it to roughly the string "export default" plus a way to get that module.exports. So we haven't modified that CommonJS code at all.
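As a sketch of that facade idea: the CommonJS source is left untouched, and a separate ES module source is synthesized that merely re-exports the already-evaluated exports. The helper name import.meta.cjsExports below is purely hypothetical; it stands in for whatever internal bookkeeping the real implementation uses to reach the CommonJS module.exports value.

```javascript
// Hypothetical sketch: build the source text of a facade module for a
// CommonJS file. The original CommonJS source is never modified; the facade
// just forwards its evaluated exports as the ESM default export.
function createFacadeSource(cjsUrl) {
  // import.meta.cjsExports is an invented placeholder for the host's
  // internal lookup of the evaluated CommonJS exports.
  return `export default import.meta.cjsExports(${JSON.stringify(cjsUrl)});`;
}
```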
C
So we are strictly not modifying that module.exports = 123 when we create what I'm gonna call facades. The alternative is something that things like code coverage or Babel need to do, where you transform the source text into something else. I'm gonna call transformation of source text to source text "compiling".
C
So we have some clear terms: with a facade, you do not modify source text; with compiling, you do modify source text. There's a reason I want to draw that distinction; we'll see if we can get to it. In general, the reason why I want to point out facades is that if we are to ship ESM, we should do neither of these things.
C
So yeah, I'm going a little fast and skipping over some points, but we're going. So, what have we implemented so far? We have this --loader CLI flag you can use in order to register a loader. Currently, our loader hook has three parameters: the specifier, the parent module URL, and a default resolver. There is a PR to support multiple loaders, which changes the default resolver to a parent resolver. You always return some URL string right now, plus a format, which is an enum; "dynamic" is a specialized way of creating facades, basically.
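A sketch of a resolve hook with the three-parameter shape just described: specifier, parent module URL, and a default resolver, returning a URL string plus a format value. The builtin list and the exact format strings are illustrative assumptions; in a real loader file the function would be exported and registered via --loader.

```javascript
// Illustrative resolve hook matching the shape described above. In an
// actual loader file this would be `export async function resolve(...)`.
const builtins = new Set(['fs', 'path', 'http']); // partial list, for illustration

async function resolve(specifier, parentModuleURL, defaultResolve) {
  if (builtins.has(specifier)) {
    // Builtins come back as a bare (non-URL) string with a special format,
    // the "it's not always a URL" wrinkle mentioned below.
    return { url: specifier, format: 'builtin' };
  }
  if (/^\.{0,2}\//.test(specifier)) {
    // Relative specifiers: resolve against the parent module's URL.
    return { url: new URL(specifier, parentModuleURL).href, format: 'esm' };
  }
  // Everything else is delegated to the default (or parent) resolver.
  return defaultResolve(specifier, parentModuleURL);
}
```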
C
Yeah, and my overall view right now is that when we get multiple loaders, we do need to move away from this idea of the default loader and just have the parent loader that we delegate to. We return a URL string, but it's not always a URL; it could be something like "fs", which is a little weird, so we probably just want to rename that, and format is kind of this weird enum.
C
"Dynamic" is also kind of special, and we should make it more capable than it already is; that's probably a big API design change. Right now it's limited to facade capabilities, roughly, and we should probably think about more direct support for compilation, where you can ship a full body of source text back. So this is pretty much the halfway point; we can drain questions, and anything I said wrong.
C
We essentially run that module.exports = 123, and that's what we're going to evaluate. We do not touch it; right now, in our implementation, it is unchanged, and we create a separate module record, for which we do some extra bookkeeping (it's more than this), in order to create the module record that roughly equates to the source text.
C
We can do a bunch of other things depending on how we communicate stuff. How do you integrate with this resolve hook API? Right now we have the command-line argument --loader. Eventually we may get multiple loaders. The NODE_OPTIONS environment variable also lets you pass in loaders, which is great for APMs and such, because it's applied in front of any command-line argument. And there's an idea of per-package loader hooks; some people are opposed to it, but it's something we should think about.
C
There's a slightly less straightforward spec mechanism for this "dynamic" format to let us return a body. Instead of only having the facade capabilities, which have a few problems, I think we should totally just let resolve return a source text and format pair, so that it's like: oh, this is a well-known format that Node knows how to transform into ESM, Wasm or whatever; I know how to load that format, and here's the body. That way we can get the compilation workflows more directly. Yeah, so I have a research branch.
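The proposed extension could look something like this: resolve returns a source/format pair directly, and the host consumes the body for any format it knows how to turn into ESM. The shape and the stand-in "compiler" here are assumptions for illustration, not the actual proposal text.

```javascript
// Hypothetical: a resolve-style hook that returns a body directly instead
// of a URL, so compile-to-ESM workflows skip the facade step entirely.
function resolveWithBody(specifier) {
  if (specifier.endsWith('.fake')) {
    // Stand-in for a real compiler's output (e.g. TypeScript to JS).
    const compiled = 'export default 42;';
    return { source: compiled, format: 'esm' };
  }
  // Returning null here means "fall through to default resolution" in
  // this sketch.
  return null;
}
```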
C
And then there are some timing mitigations that we can do by moving the loaders off the main thread. In particular, I still haven't gotten clarification from V8 about off-thread parsing; people talk about it, but it doesn't look like it's actually possible without blocking the main thread, given the APIs that we have available. I will continue to bother them. And then we just need to be sure that whatever default loader we do, we don't put it on the main thread; otherwise it'll block the main thread.
C
We
can
do
that
in
its
own
thread
or
the
topmost
loader
thread,
either
or
so
yeah,
and
then
this
is
just
showing
you
can
use
a
Thomas
weight
to
block
the
main
thread
while
you're
performing
some
loader
tasks
such
as
instrumenting
require
the
API
proposed
has
a
lot
of
problems
around
caches,
because
the
web
is
not
going
to
perform.
Caching
the
same
way
that
node
has
done
ever
or
any
tool
that
I
found
ever
so.
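The Atomics.wait pattern mentioned looks roughly like this on the main-thread side: block on a shared Int32Array slot until a loader worker stores a result and notifies. No worker is spawned in this sketch, so the wait simply times out; the point is only the blocking mechanism.

```javascript
// Main-thread side of the Atomics.wait handoff. A loader worker (not shown)
// would do the work, then: Atomics.store(shared, 0, 1); Atomics.notify(shared, 0);
const shared = new Int32Array(new SharedArrayBuffer(4));

// Block while shared[0] is still 0, with a 10ms timeout so this sketch
// terminates; with a real worker you would wait without a timeout.
const outcome = Atomics.wait(shared, 0, 0, 10);
// With no worker to notify us, the wait reports 'timed-out'.
```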
C
If
we
want
to
allow
instrumenting
the
cache
kind
of
what
I
talked
about
earlier,
where
you
can
import
something
and
then
the
loader
returns
FS
or
something
we
need
a
way
to
control
where
that
eventually
ends
up
in
the
cache.
If
we
want
to
allow
full
web
compatibility,
I
personally,
don't
think
it's
worth
it
because
of
how
weird
it
is.
I
would
not
recommend
people
do
caching
how
the
web
works.
C
It works, and it's reasonably fast; I'm trying to make it faster, as it's a naive implementation, and I think we could build a better proof of concept of caching with it. The thing I want to prove out is that you can use this loader to do all the transformations ahead of time in a somewhat generic way. So if somebody does TypeScript, you can have this caching loader ahead of TypeScript and make it very fast.
C
So one of the big problems with all these loaders is just the raw amount of time people can spend doing transformations and things like that. Babel is not cheap, and we should try to avoid running it on every run, and that means we need to make caching the workflow simple, and that's hard. I haven't figured out how to make it simple; I've made it workable, but not simple.
C
And that's largely why I really don't want to put web caching in as a capability of these loaders, because it is a can of worms any time you let somebody control your cache. It's very different, I think, for the addition of compilation capabilities to loaders; I think that's just kind of a necessary feature, but you start having to account for people using this for source code transforms that are very expensive, and I...
A
...not much either, I think. Given that that's all for today, did anyone want to talk more about it? Otherwise we'll go back to some of the time-boxed items from the beginning of the meeting and follow up. I know we've got Sala present, if we wanted to discuss the terminology document again, and we could also discuss deadlines briefly again. Does anyone want to suggest anything?
A
No
suggestions
in
that
case,
so
are
you
available
to
chat
yeah
yeah
here?
If
you're
wanna
maybe
give
an
update
on
the
tech
terminology,
we
were
discussing
it
earlier
and
saying
that
it
could
be
worth
considering
what
the
next
steps
are
on
that
if
it's
something
that's
good
enough
to
either
mergence
the
reaper
or
something
like
that.
I
I really did nothing on this since last week; there has been no progress anyway. And I came in a bit late today, so I didn't really catch the conversation, but I don't think what we have... like, we have two documents at the moment: one where I went way overboard, and another that was an attempt to try to steer this in a new direction, and that was, I guess, just input from one or two people. So between those two there is a middle ground.
I
I think we should just hash out a middle-ground document before next week, definitely. I'm gonna consider actually doing it as a markdown document, and then we can just, you know... it depends on how everybody else feels, whether they want to add it as a markdown document or would rather be able to just edit a Google Doc. I think a markdown document is good enough; we can all edit that, maybe if we have permission to just make pull requests.