From YouTube: Node.js Foundation Modules Team Meeting 2020-07-29
A: Okay, we are now live on YouTube with the July 29th meeting of the Node.js modules team. We are joined by eight individuals on this call. We have a pretty full agenda today. For those who are in the meeting, I will drop a link to the meeting doc in the chat. If you could please help with taking notes, it would be greatly appreciated.
B: So we have basically two points relevant to the modules working group here. While going through and updating policies to integrate against a bunch of the new features that we landed this year for modules, we're trying to land conditions, and I noticed we use different prefixes for the internal scheme for built-ins: policies use the "node:" prefix and the loaders use the "nodejs:" prefix.
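To make the mismatch concrete, here is a small sketch; the constants and helper are invented for illustration and are not Node.js internals.

```javascript
// Hypothetical helper illustrating the two prefixes under discussion:
// policies accept "node:" for built-ins, while the experimental loaders
// accepted "nodejs:". Neither constant is a real Node.js export.
const POLICY_PREFIX = 'node:';
const LOADER_PREFIX = 'nodejs:';

// Strip whichever built-in prefix a specifier uses, or return null
// if it carries no built-in prefix at all.
function stripBuiltinPrefix(specifier) {
  for (const prefix of [POLICY_PREFIX, LOADER_PREFIX]) {
    if (specifier.startsWith(prefix)) return specifier.slice(prefix.length);
  }
  return null;
}
```

Under this sketch, both "node:fs" and "nodejs:fs" would name the same built-in, which is exactly the inconsistency being raised.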
B: I think we should align these. I personally slightly prefer just "node:". So that's one of the issues: whether we can align on which of the two we want. And then the other issue is that, while implementing tests for this, we found we don't actually share a lot of our exports traversal code between locations, and we should probably figure out how we want to share it.
B: I looked at extracting the CommonJS exports traversal, for how you do searching within a package.json "exports" field, but it was pretty tightly tied together, and it's using throws rather than return values to jump around in slightly odd ways, to my eye. So we should figure that out. So those are the two issues.
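The traversal in question walks a package.json "exports" map to resolve a subpath. A minimal sketch of that logic follows; this is a simplification for illustration, not the shared implementation being proposed (real resolution also handles conditional exports, encapsulation errors, and validation).

```javascript
// Simplified sketch of "exports" subpath resolution: exact keys first,
// then Node 14-era trailing-slash folder mappings. Conditional exports
// and all error handling are omitted.
function resolveExportsTarget(exportsMap, subpath) {
  if (typeof exportsMap === 'string') {
    // Shorthand form: the whole map is the main entry point.
    return subpath === '.' ? exportsMap : null;
  }
  if (Object.prototype.hasOwnProperty.call(exportsMap, subpath)) {
    return exportsMap[subpath];
  }
  // Among trailing-slash folder keys, the longest match wins.
  let bestKey = null;
  for (const key of Object.keys(exportsMap)) {
    if (key.endsWith('/') && subpath.startsWith(key) &&
        (bestKey === null || key.length > bestKey.length)) {
      bestKey = key;
    }
  }
  if (bestKey !== null) {
    return exportsMap[bestKey] + subpath.slice(bestKey.length);
  }
  return null;
}
```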
B: Do we have any thoughts on changing this scheme? Do we have a preference besides mine?
A: Yeah, I don't have a strong preference between "node" or "node.js". I could see people preferring fewer characters. I guess the two things that I personally care about here would be, one, that it is consistent (and I'm really glad that you're digging this up), and then two, although we have not really dug into this too much yet, I would love for us to think about this maybe being something that could build on supporting built-in modules.
C: All right, in the standards process, yeah. I think in that discussion we might even have initially used "node:" in the loader.
B: I would be surprised if nobody has registered that. I know that because I did write it originally as "node:", which is why I was surprised when I updated things.
B: Basically, I'm just asking: are there any objections, with that PR, to also altering the prefix?
A: Yeah, I guess just one thing there: since loaders are experimental, a change of that extent is fine, albeit it might be better to alias it with a warning, for example, for 12 or 14 or something like that. I do think a breaking change like that is totally in scope for the current stability of loaders. And Jan has their hand up; sorry for interrupting.
B: Eventually, yeah. I would prefer we eventually move to only one, and I'd prefer that one be "node:", but since this is a breaking change, I think it would be fine to support both in the intermediate term. We did something similar when we added the getFormat hook for loaders, where we did support the old workflow as well.
D: Well, okay, so then, yeah, thank you for clarifying. My position is that, colloquially, the binary is called node; it's referred to as node. I only see it written out as node.js when people have apt-installed it from the default Ubuntu repository.
A: Also, for what it's worth, my understanding is that it's IANA where we could put in a request to standardize the protocol. But the last that I checked, "node:" was not a registered protocol, so we very well could go through the process of registering it, and then, you know, it's on other people, not us, if they choose to overload it.
C: Yes, just to clarify, my concern was not that I personally think that "node" would be a dangerous protocol. My concern is that people have spoken up in the discussions about built-in modules and using namespacing for supporting built-in modules in require statements and in import statements, and in that context they have spoken up against "node.js". What I'm concerned about is that, yes, we can make decisions here about the loader API, but then the loader API might use a protocol that wider Node then objects to using in import specifiers.
A: So, putting up my hand here, John, I don't disagree with you. I actually think that Jordan had a pull request a while back about this, and I think that there was a lot of debate about what the mechanism would be, what it would look like, what the string would be. There has been some movement on the built-in modules proposal at TC39 since then; it's gotten to stage two, and as far as I know, they are...
A: ...looking at moving forward with protocols. I think we were talking about whether or not it would be a scope or a protocol or different ways of doing it. I think that if other folks, within this group and within Node, do not object to using a protocol for the purpose of namespacing Node, and we're okay with moving forward with supporting that protocol even if it's not the exact syntax that the language lands on for using built-ins, I don't know why we wouldn't just move forward with it.
A: I could see, though, why people would be okay with it landing inside of loaders, which is its own custom kind of side thing, and not in, like, the main script tags, if we're unsure about whether or not there will be long-term support for that protocol syntax. For what it's worth, I think that we should time-box this at another three minutes if there's more discussion to be had.
B: I think I'm just going to move forward with keeping it not aligned, and I'll make a separate PR to align them. We had a second thing: do we have a clear refactoring that we want to do to get all the loaders to share the exports traversal? Is anybody actually working on that? If not, that'll be a separate PR that I'll work on later. It'll be a fairly hefty refactor, though.
A: Brad, I'm not aware of any refactors happening in that space. The only thing that I would want to encourage before we do more major refactors is sitting back, looking at 14 and looking at 12, and trying to ensure that, as much as possible, we can backport all these changes that we're looking at making.
B: Alright, if the concern is backporting, we should not do the refactor in that timeline. It will not land cleanly.
A: Well, I mean, I don't think it even matters if it lands cleanly. I think it's more that 12 is going to go into maintenance in October, and that's fine. I think we probably want to sit down and do a quick evaluation of the feature gaps between 12 and 14 that we have right now, and what we're expecting to land in 15, and try to figure out what we can and can't backport.
A: What features are in flight that we want to land and do want to backport? And if we're going to do a very large refactor, perhaps wait until a few things land and are backported before we start moving a lot of things around.
A: Cool. Anyone else have anything to add on this for right now? Are we ready to go to the next one?
E: Hi everyone. Yeah, I'm here to update from the last meeting. I was asked to dig into it more in terms of what some alternatives are, other ways we could go about it, maybe pros and cons to this method. And on the PR it was also asked, or proposed, by Guy Bedford whether transformSource, the fourth hook, should be allowed to have the same kind of behavior...
E: ...to add consistency. So I'm just here to update on the exploration that I've done there, the alternatives, and to address that concern too. So you've seen the PR and the document with an exploration of alternatives, with the first alternative being the one my PR implements, which is the optional format in getSource. After exploring further into, for example, the loader chaining PR, I think it's compatible with that, with some work there.
E: But it's definitely conceptually compatible. To answer Guy's question about whether transformSource should be something that has the same behavior: I think that should stay outside the scope of this PR for now. I think it would go against the intended behavior of transformSource, and so keeping the behavior in this PR as-is would, I think, be the way to go for now.
B: I believe this is the one that exposes package.json by default within a package boundary, so that you don't have to include it in "exports". Do we have anything we want to say on this? Nobody wanted to pick it up last time.
D: At the last meeting, we discussed a bit about an API on module to get a package directory, which would provide a blessed path for tools that need access to the package.json, and sort of obviate the need for exposing it by default. My recollection is that we were going to move forward with evaluating a PR for that, and I had volunteered to work with Miles to write it.
B: I'll sort that out. Will somebody else take over and do the next topic?
A: Yeah, I can do that. Is this topic done for now? Are we good to go to the next one?
A: Which is CJS exports detection. Guy, would you like to take this one?
F: Yeah, I can do that. So I think the ongoing discussion around this PR has been how to get around objections to landing it. We tried with the flag, and then I think the conclusion last time was that we would try enabling it through a loader instead, and allow named exports from CommonJS using loaders.
F: And it's got one approval right now; it's been seven days, so I'll probably aim to land that a little later this afternoon, unless anyone else has some more feedback on that PR. And with that PR, I also put together a loader example.
F: I haven't published it to npm, just because I don't want to maintain it, but it's an example of a loader that can do this named export support exactly as in the PR. It's the wasm-based parser that will detect the exports, so people can try it out in userland, like they would have if we had landed it behind a flag, through this loader.
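The actual detection uses a wasm-based parser; purely to illustrate the idea of statically detecting named exports from CommonJS source, here is a much cruder regex-based sketch that only catches the simplest `exports.name = ...` assignments that transpilers emit.

```javascript
// Crude illustration only: real detection uses a full lexer and also
// handles Object.defineProperty, __esModule re-exports, string
// escapes, and so on. This sketch misses "module.exports.x" forms.
function detectNamedExports(source) {
  const names = new Set();
  // Match `exports.<identifier> =` not preceded by ".", "$", or a word char.
  const assignment = /(?:^|[^.\w$])exports\.([A-Za-z_$][\w$]*)\s*=/g;
  let match;
  while ((match = assignment.exec(source)) !== null) {
    names.add(match[1]);
  }
  return [...names].sort();
}
```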
F: I think I'm likely just going to let things lie on the other PRs otherwise, as discussed last time.
F: Wesley, last meeting you did voice some concern about named exports. Have you had any more time to think about that?
G: I mean, just that the community expects them, by and large. Like, you know, "publish the custom loader in the community and see if it's worth shipping in core" raises an eyebrow to me, because the community has been using named exports through custom loaders for years already. So I don't think we need more data points in that regard, but, you know.
F: There could be a path there, if this was something that you cared about strongly enough, to use feedback from that userland loader to say that, coming from a TypeScript perspective, this is working for the ecosystem as far as you're concerned, and to have it strongly supported in core. Because, ultimately, if it were to land in core, it does require someone to advocate for it strongly, and so there is still a role that this public loader could play in doing that, and I would strongly recommend you to do that.
C: That's okay, but that makes your argument of "so we don't need further proof" not quite as convincing, because that is the exact objection that some people had on that PR: that it does not exactly replicate the behavior that the ecosystem has been using. So getting some evidence that people find this reduced behavior still valuable, and that it does actually work out in practice, would maybe help convince some people that it could land in core, even though it is just a heuristic and it's not 100% covering all observable properties on the exports object.
H: If I can, I guess I can just go next and give an update. I've been working on a rig to test the PR against core (the one that's behind a flag), and I've been really swamped lately. So if anyone has any time to help with that effort, I would really appreciate the help. The latest thing I've been doing is trying to update it to figure out, from packages' readmes, what the intended public APIs are.
H: Well, if they even intend to have named exports: do they suggest that users do destructuring of the CommonJS, or do they have an example of an import statement in their readme, etc., to try to narrow this down? Because basically what we have is that this PR seems like it's only 66% successful or something like that, based on just comparing all the things that we find...
H: ...on the CommonJS exports object against what this detects in ESM. But the situation is actually a lot better than that, because a lot of those are just superfluous or not intended to be public, and so on and so forth. So I've been trying to find ways to programmatically determine what I think the intended public API is for a particular package, and then whether those named exports are detected. I'm still working on that, so yeah, Wes, if you or anyone else has time to help with this effort...
H: Or, you know, if someone wants to go through manually and define what the expected named exports are for 50 packages, or 100 packages, or however many people want to go through, then we can get a realistic success rate. I think that's the blocker; that's what Gus is using to block it.
H: He wants, you know, a success rate of over 90-something percent, and then he'll be like, okay, fine, this is good enough, and we'll let it land. So, yeah, I know this is sort of like defining the conditions to back us into the desired success rate, but I think these are the right conditions to use for evaluating this, because it's really about user expectations: when they type an import statement with a named import...
H: ...do they expect it to work? And I think that's where this is ultimately going.
G: We use essentially the same emitter as Babel now, so, you know, it should also be 100%.
H: So that would be another thing. There probably is some way to programmatically detect that this one was created through Babel, or this one was created through TypeScript, and then he's like, okay, for those cases it's a hundred percent, and then blah blah blah. You know, it's just a question of doing the research to fine-tune the algorithm that evaluates each package and to get these good statistics.
F: Is there maybe someone involved in TypeScript at all who'd be interested in helping with this? Or is there any way we could possibly reach out, or any lists you'd recommend to discuss this on?
G: I mean, the best place for TypeScript discussions at this point is actually probably the Discord server, the community Discord.
H: If you want to give me a link, I can maybe post something there, or if you want to post something there linking to the PR that I started, yeah, that would be great.
H: I mean, there are going to be edge cases. Guy can talk about it better than I can, but, you know, at least I think the only objection is Gus, and I think he's not demanding perfect purity, because I think we've ruled out all the potential ways to get perfect purity from all the other CJS approaches, like dynamic and out-of-order, etc., that have been banged around for years now.
H: So since those have been ruled out, I think at least Gus might be slightly flexible on not requiring that. I don't know of any other options besides parsing at this point, so.
C: One thing, well, two things, I would add. One: I don't think that Gus was asking for 100%; I think Gus was asking for something that is not likely to break for handwritten code in the face of refactorings. I think that was his principal objection, so I'm not sure if a percentage, however high, would necessarily overcome his objection, unless he just changes his mind. But I think in the same vein that Guy just mentioned...
C: ...it might be interesting to look at more aggressive failing, so to say. If there is anything where we see patterns that might lead to additional values on the exports object (it being passed into functions we don't understand, or it being looped over with dynamic keys, or anything), are we bailing out in those cases?
F: Yeah, I mean, sorry, it does sound like a viable route to get 100%. So, you know, if the argument is that there's something a user can do without realizing it that changes their exports, or that it's not going to be completely accurate, we can get past both of those arguments perfectly fine by saying we specifically detect transpiled sources. Obviously that detection can't be 100%, but if we bail out of anything that doesn't look like a transpiled ES module source...
F: ...we will have a hundred percent on those sources. We might over-classify, but I don't think there's an argument against over-classification; I think the argument is against under-classification at any level. So that would ensure we never under-classify. But yeah, help testing and moving this forward would be great. I mean, I've pushed like three proposals now for this stuff, and yeah, it's been a...
D: Yeah, just really quick, not to argue a position, but I feel like I remember Gus saying specifically that he's not concerned really with transpiled code, because those folks are the ones that are likely to update, he can make changes, but that he is, I recall him being, more concerned about old code that conceptually has named exports but might never be updated, and a desire to make those work.
F: Great. If his argument is that we should either make it work or not make it work, okay: let's just not make it work in those cases, and we'll only make it work for transpiled code, where we know we can get 100%. And then obviously the next debate is whether we are 100% at detecting transpiled code, right, but...
F: Okay, I can try and push some discussions around that before the next meeting.
F: The next one was kind of an open discussion I just wanted to have about what I've been noticing. I've been doing some PRs for the "exports" field for packages, and one of the things I'm noticing is that there are a lot of UI component libraries coming out these days that have ES modules: component libraries and SVG icon libraries, and some of them have hundreds of icons or hundreds of components, where each of those is an individual module.
F: So what ends up happening is that most users who define "exports" will probably define their exports explicitly and without file extensions; they'll define individual features. But as soon as these component libraries want to define "exports", they absolutely have to export entire folders and require their users to use file extensions.
F: If you had some kind of pattern system, you could include the extension for these hundred-or-so separate-module component libraries. I'd be interested if anyone has any thoughts on that: whether it's a concern that we are sometimes encouraging extensions and sometimes not, and if so, whether we want to think about more complex proposals that could possibly mitigate that.
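One shape such a pattern system could take is sketched below. At the time of this meeting this was still a proposal under discussion (a similar mechanism later shipped in Node as "subpath patterns"), so treat the syntax as illustrative, and the package name as invented:

```json
{
  "name": "some-icon-library",
  "exports": {
    ".": "./index.js",
    "./icons/*": "./dist/icons/*.js"
  }
}
```

A consumer could then write `import home from 'some-icon-library/icons/home'` without a file extension, while the package still maps every icon onto an explicit `.js` target.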
D: In other words, what you're describing is what I've been advocating for for years, which is: humans don't want to see extensions in their code, but the choice of requiring explicit file extensions forces that. Which means that we have explicitly, intentionally forced people to make very, very long, verbose exports maps specifically to remove the cruft of extensions from their code.
D: If we can come up with a way that meets all the explicit and static desires that drove this current decision, but also allows those extensions to be omitted, I'm super on board with that. The exact proposal in that issue, I think, doesn't work, because there's some confusion around what the slash actually does, but some mechanism would be great.
D: I never want any of my hundreds of packages to even allow people to use extensions when they import things, and so I write my exports maps that way. But I just wanted to remind everyone: this is the trade-off we chose by not allowing extension-based lookups.
F: One thing: we do support extension searching in the path exports for CommonJS, explicitly. So in that case it applies even when there's no slash provided, as well as when there is a slash provided.
D: That's what I'm saying: when the right-hand side ends in a slash, then that's basically the escape hatch, right? The CJS path gets the full range of Node specifier lookups, and the ESM side gets only exact paths. But that means that as soon as you have that slash, you no longer have parallel APIs. Which means that, if you care about having parallel APIs, then you're never going to use a slash at the end of the right-hand side, and you're always going to have to explicitly enumerate these things, which is, I think, the problem that you've run into, or that you've seen people run into. And it's certainly a problem...
D: ...I've run into. I have a package that doesn't have the "exports" field yet, but has 1700 files exposed, and that's going to be a real long package.json one day. I would love a way to keep that shorter without forcing people to dirty their code with file extensions.
D: So I'm on board with the desire. I just feel like we've intentionally boxed ourselves into this corner.
F: Can you just clarify again what your concern was? Sorry, Jan. I just want to hear from Jordan exactly what the concern he stated about the current proposal was, and then...
D: Yeah, well, it's that the slash at the end of the right-hand side already means "do the normal thing for your module system," and you're proposing, I believe, a slash at the end of the left-hand side, or some globs or something in there as well, something like that. And to me that just creates kind of confusion, because I have to keep a lot more in my head.
D: Right, like the current mental model is that everything on the left-hand side is the exact thing, the exact sub-path under the package name that appears in someone's code, except for when the right-hand side ends in the slash, because that's the escape hatch. And that is now going to be made more complex with your current proposal here.
C: Yeah, Guy might have heard this one, but I think that today people don't want to see file extensions in their code, and that is correct, and that is based on a decade of precedent and practices and what people are used to.
C: There are ecosystems, though, where that is shifting: for example, the Polymer ecosystem (not the most populous, but, you know, it exists), and definitely people like Snowpack, are focusing a lot on this.
C: So it is a question whether, in that world, bare specifiers not having extensions would actually feel natural, or whether that actually then sticks out as weird: for everything relative I'm using file extensions, but then I'm not using file extensions for anything coming from a package. Right now it's the same pattern for both, but in that new world, actually writing file extensions for your specifiers potentially isn't really that weird. But that is very speculative, obviously.
F: So would you be in support of better ergonomics here?
B: For some reason, yeah. So I do really like the idea of having templates and things of that nature. Sometimes you want to do some kind of odd skips around in directory structures; I've seen it before. I think there's definitely work to be done on the exact implementation, but there's a corollary with the file extensions, where you can't always update the specifier that things are imported by. I think we can all agree...
B: ...that if you get reasonable usage, not all of your consumers will update willingly (or ever) to a new specifier, regardless of whether you use package extensions or not. I think we shouldn't be trying to pass judgment on what the future will be. If we were to include a file extension by force, for example, then, if we get to things where we can redirect to a different format, which you can do with the "exports" field, you don't have to match the specifier with the eventual target.
B: You could import, you know, foo/bar.js but really be loading wasm, or even JSON files, and we get into a very awkward position if we don't start prepping for this, especially with import assertions. What was it they used to be called? Import...
D: Attributes.
B: Yeah, so we're getting to the point where we're actually adding metadata to the import site, and we're going to need to start figuring out how we're going to pass that metadata through to these static resolution systems. So some form of templating probably needs to occur, regardless of whether we enforce that it must have a file extension.
B: This is also affected, because if we were to do that redirect from a .js specifier to, say, a .json file, the import assertion would fail if you tried to assert that you're loading JavaScript, because you would actually be loading JSON. So I don't think we should state that we know what the future is, and I think we should actually make this much more flexible, because there's kind of a world of pain coming, in my imagination, from various other metadata being attached to module resolution that's coming in the pipeline.
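A toy model of the mismatch being described, with invented helper names (this is not Node's resolver): the point is only that an assertion keyed off the specifier's apparent extension can disagree with the format of the resolved target.

```javascript
// Invented names for illustration. An "exports" map may redirect
// "./bar.js" to "./data/bar.json", so the format implied by the
// specifier and the format of the resolved target can differ.
function formatFromPath(p) {
  if (p.endsWith('.json')) return 'json';
  if (p.endsWith('.wasm')) return 'wasm';
  return 'javascript';
}

// Does an asserted type match the format of the target the map
// actually resolves the subpath to?
function assertionHolds(exportsMap, subpath, assertedType) {
  const target = exportsMap[subpath];
  return target !== undefined && formatFromPath(target) === assertedType;
}
```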
F: So, just to very briefly go into the specific proposal: the reason why I didn't include a wildcard on the left-hand side, to match the wildcard replacement on the right-hand side, was that if you have both that and path exports, there's possibly some ambiguity about which one is meant to be selected with higher precedence, whereas by making them the same key it avoids that whole problem.
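For the precedence question, here is a sketch of wildcard matching where exact keys win and, among wildcard keys, the most specific (longest static prefix) wins. These semantics are one possible choice for illustration, not a settled design.

```javascript
// Sketch: resolve a subpath against a map mixing exact keys and
// single-"*" wildcard keys. Exact keys take precedence; among
// wildcard keys, the longest static prefix wins. One possible
// precedence rule, not a specification.
function matchSubpath(exportsMap, subpath) {
  if (Object.prototype.hasOwnProperty.call(exportsMap, subpath)) {
    return exportsMap[subpath];
  }
  let best = null;
  for (const key of Object.keys(exportsMap)) {
    const star = key.indexOf('*');
    if (star === -1) continue;
    const prefix = key.slice(0, star);
    const suffix = key.slice(star + 1);
    if (subpath.startsWith(prefix) &&
        subpath.length >= prefix.length + suffix.length &&
        subpath.endsWith(suffix) &&
        (best === null || prefix.length > best.prefix.length)) {
      best = { key, prefix, suffix };
    }
  }
  if (best === null) return null;
  const wildcard = subpath.slice(best.prefix.length,
                                 subpath.length - best.suffix.length);
  return exportsMap[best.key].replace('*', wildcard);
}
```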
F: But perhaps, if that precedence is just carefully defined, then a wildcard pattern on both sides that match up could work.
F: Yeah, if we then again define the conflict case and what it means to conflict. Because at the moment, for example, path exports have precedence over each other: you can have a nested path that overrides a higher-level path, and then how does that interact with wildcards and things like that? But we can; I mean, this proposal was very much a bikeshed. We are interested in pursuing these kinds of ideas.
F: While we still have a window for it, we could have those discussions separately and try to maybe flesh some of that out a bit more, and see if we can come up with something that might make sense, unless anyone has strong objections or concerns at this stage that would mean not pursuing it.
C: One quick thing that I forgot to bring up is the one property we risk losing with this: we are no longer statically convertible to what is currently proposed as import maps. Meaning, if you want to convert the wildcard prefixing to a currently proposed import map, there is no static way of doing it without scanning the file tree, which we may or may not care about, but I'm just mentioning it.
D: That is basically, I mean, the trade-off we made. Unless we can convince import maps to add some more logical features, to statically configure a dynamically generated import map, then we're tying ourselves to something that requires excessive boilerplate for anything complex, and we have to make that decision: do we couple ourselves to that, or do we...?
B: Just to be clear, this does not mean any searching on the disk actually occurs; it's just rewriting specifiers.
C: Right, but the problem is that the current rewrite cannot be expressed by an import map without creating an entry for every possible matching file, because the import map doesn't have a wildcard concept. So if you want to express that rewrite, to generate the import map you have to scan the file system to know what the actual mappings will look like, so that you can generate the import map.
A: Y'all, I'm sorry, quick point of order: we've got one minute left until the end of this meeting. Are people ready to move on for right now and call it, or is there a need to keep the discussion going over time?
A: Offline? Okay, cool. So, sorry for stopping the party so abruptly, but I would like us to keep to our timeline. Thanks, everyone, for joining today. Let's try to get work done asynchronously through the issue tracker and pick this up again in two weeks.