From YouTube: Node.js Foundation Modules Team Meeting 2020-08-12
A
In the August 12th edition of the Node.js modules team meeting we are joined by myself and six other individuals, and we have a fairly full agenda. So I'm going to jump right into it, but before we start, does anyone have any announcements that they'd like to make?
A
One thing that a couple of people ran into and found confusing is the fact that it is still flagged in the REPL. That's because the REPL is actually kind of its own goal slash runtime.
A
There is an open issue on the Node repo now discussing this. There's a PR to unflag the REPL, with discussion about what would be needed, and a separate one about supporting static import inside of the REPL, which is currently not supported and which could use somewhat similar infrastructure to supporting top-level await in the REPL. So if you're interested in either of those topics, there are discussions happening upstream.
A
So with that in mind, we may as well get right into our first topic, number 34414. Actually, you know what, I'm seeing something not on the agenda which I know is an issue we should discuss really quickly, if folks don't have any concern: we do have a pull request right now to add Duran as an observer. It has been open for four days. I do not believe, though, that we have quorum here, so Duran, I'm personally fine with letting you continue participating in the meeting for right now.
A
Does anyone object to having Duran participate in today's meeting? Cool. Sorry about being super formal there, but I'm just trying to be respectful of our rules and stuff. So next up we have number 34414. Bradley, do you want to take this one?
B
Sure. This, I believe, was talked about at the last meeting as well. The policy mechanism in Node is being brought in line with the current capabilities that we've shipped for the module system this year. One of those is the ability to vary based upon the import condition or require condition, so this is more of a heads-up at this point. I'm planning on landing this PR pretty soon (policies remain experimental), and doing so does affect this group somewhat.
C
So yeah, can you just clarify that? Are you saying that the policy is defining which conditions to resolve?
B
No, it's reacting to conditions it sees. If a policy is reached through the default loader, it's effectively making a static routing table. That's it.
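As a rough sketch of that idea (the resource path, hash, and redirect below are invented for illustration, not taken from the meeting), a policy manifest pins resolutions ahead of time:

```json
{
  "resources": {
    "./app.js": {
      "integrity": "sha384-AAAA",
      "dependencies": {
        "lodash": "./vendor/lodash.js"
      }
    }
  }
}
```

When `./app.js` requests `lodash`, the default loader consults this table rather than performing a fresh resolution, which is what makes it a static routing table.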
C
Right, that makes sense. As discussed in the PR, I was wondering if it should be a kind of higher-level thing. So if you think of import maps as a mapping of resolution in the browser, say, and you want to generate an import map for the browser from your node_modules structure, you would consider the mapping a function of the conditions.

C
And if you wanted to have mappings for different conditions, you would possibly have a base mapping and then extensions of that mapping that are restricted to the different conditions. It's basically just a slight inversion of control, where the condition is seen as the base and the mapping has a fixed structure, whereas here, I believe, the conditions are sub-objects of the resolution. So it's just a slight inversion of the model.
B
On that, I would state that that inversion is not able to represent context-sensitive conditions, like import or require.
C
Just to be very clear: if you have an object that defines mappings, and each of those mappings then forks off into an individual object with conditions, you can always equivalently write that as a base mapping and then individual extension mappings for the different conditions. Sorry, it's much easier just to write it out with a code example. I would do it in the chat, but I don't want to take up too much time on this. I understand.
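Since the example never made it into the chat, here is a minimal sketch of the two equivalent shapes being described; the `./feature` paths and targets are invented for illustration:

```javascript
// Shape used by exports today: each mapping forks off into an object
// keyed by condition.
const conditionsInner = {
  "./feature": { "import": "./feature.mjs", "require": "./feature.cjs" }
};

// The inverted shape: the condition is the base, and each condition
// holds a mapping with a fixed structure (a base mapping plus
// per-condition extensions).
const conditionsOuter = {
  "import":  { "./feature": "./feature.mjs" },
  "require": { "./feature": "./feature.cjs" }
};

// Both encode the same information; resolving "./feature" under "import":
console.log(conditionsInner["./feature"]["import"]); // "./feature.mjs"
console.log(conditionsOuter["import"]["./feature"]); // "./feature.mjs"
```

The limitation Bradley raises is that the outer-condition shape cannot express context-sensitive behavior, where the same mapping answers differently depending on whether the request arrived via import or require.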
C
I take it it's a concern that we are getting divergence from import maps?
B
Policies, as has been repeatedly stated, are not trying to emulate import maps. That's very clear.
C
Okay, but the reason I'm bringing this up as well is that I'm wondering if import maps should imitate policy, because that's what they end up doing in SystemJS. But we've got an agenda item at the end for that, if we get to it.
B
This does contain it, but we have moved it back so that they do not align. I don't have a separate PR for that currently.
A
Okay, thank you Bradley. Next up: I see that we did not have any notes in here, so if anyone's okay with summarizing that discussion, please do. And if folks can please take notes as we're going, that would be greatly appreciated.
A
The next issue that we've got right now is number 33460, special treatment for package.json resolution in exports. Does someone in particular want to lead this one?
F
Yeah, it just took me a second, because there are two issues, and this was the one that was left on the agenda.
F
The question was raised that there are a lot of tools that need access to package.json, and with the addition of exports there's no reliable way to get it in every situation. So there are roughly three paths being looked at here.
F
One is to do nothing: packages will have to explicitly expose package.json if they want this to be easy and they're in the bucket of cases where it's otherwise impossible to get to. Another option would be automatically exposing package.json implicitly as an export, but there are a number of folks who do not like that idea. And another option, which as far as I recall we were roughly in favor of, is adding some explicit API.
F
Bikeshedding aside, something like module.getPackageRoot, where you pass in a specifier and import.meta.url or __dirname or similar, and it gives you the path to the folder in which you can then grab package.json or whatever else you're looking for. I don't know, actually, if there's much to discuss here, because I haven't yet come up with a PR for it.
F
But my intention was to do that, and Myles, to pair with you when you have time to try to make that proposal. Effectively my plan is to do that: it's going to be some spelling of module.getPackageRoot that takes two arguments, a specifier and a directory, and there are some open questions, like whether it can be synchronous or must be asynchronous, and so on.

F
But putting that aside, it feels from reading all these discussions like that would be sufficient for the ecosystem to migrate to using it for reading package.json, and then they will no longer need to rely on getting to it through require or import.
A
I guess there are two competing facts that I have here. One of them is that we don't even have a way of loading JSON right now in ESM at all. We have it behind a flag, but there is not a standard way of loading JSON. It's being worked on, it's being standardized, and Jordan and I are painfully aware of how long that process can take. So I say that not as a negative, but more as context.
A
We can't be bullish on waiting for this to land in the spec and be specified in a super timely fashion. Maybe it happens in the next year, but a year is a very long time, and patterns bake in during that time. So on one hand, I could see the value in some sort of API for loading it that is different from: get the package root, create a URL using the package root, append package.json to it, read that from the file system, and then JSON-parse that. Oh, that's terrible. That's a really bad user experience.
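Spelled out, that chain might look like the sketch below. Note that `getPackageRoot` here is a hypothetical stand-in for the proposed API (it does not exist in Node today), and all paths are made up:

```javascript
// Hypothetical stand-in for the proposed API; a real implementation
// would resolve the specifier relative to the parent URL.
function getPackageRoot(specifier, parentURL) {
  return new URL("file:///app/node_modules/dep/"); // invented location
}

// The multi-step flow described above: get the root, build a URL for
// package.json, then read and JSON-parse it yourself.
const root = getPackageRoot("dep", "file:///app/main.js");
const pkgJsonURL = new URL("package.json", root);
console.log(pkgJsonURL.href); // "file:///app/node_modules/dep/package.json"
// Real code would then do something like:
//   const pkg = JSON.parse(await readFile(pkgJsonURL, "utf8"));
```

Every consumer repeating those steps by hand is the ergonomic cost being objected to.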
A
So that's one thing, but that's actually completely separate from the "package exports can't resolve it" thing. I have gone back and forth on this, and really at this point, with the number of people who have found this metadata not being included to be unintuitive, and the number of projects that have shot themselves in the foot over this, I recognize that we are dealing with people who are used to having this and are learning how this feature works, so these are kind of expected edge cases to bump into.
A
I understand that that kind of goes against the encapsulation of it, but especially if I think about tools that need to be able to figure out what is encapsulated and not encapsulated: the package.json needs to be exposed to get the exports to even determine that.
A
That
expects
this
to
be
used
and
the
absolute
massive
pain
in
the
ass
that
it
that
it
is
right
now
to
set
this
up
seems
to
me
that,
like
you
know,
perhaps
we
we
should
consider
that
as
an
option.
John,
you
have
your
hand
up.
What's
up.
D
It would be surprising to me if there were a way to opt out. If we say, yes, this is a reliable way that tools can use, and this is the API that we expect them to use, but then we also say, oh yeah, we do allow random npm packages to add a field in their package.json that just breaks it, that feels counterintuitive.
D
So I think if we decide that resolving package/package.json is the API that we accept, then we should not have an opt-out. That is my opinion.
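For reference, the explicit opt-in being discussed is a package listing its own package.json in its exports. A minimal invented example:

```json
{
  "name": "my-lib",
  "exports": {
    ".": "./index.js",
    "./package.json": "./package.json"
  }
}
```

With this, resolving `my-lib/package.json` keeps working even though every other internal path stays encapsulated; the debate is whether that entry should be implicit, explicit, or overridable.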
A
A really great example of this could be an .eslintrc file, or some sort of testing config: if tools are expecting to use require to grab those files, but those files aren't in the map, it just becomes a ball of yarn, being inconsistent there. It almost starts to get to the point of, oh, should the exception actually be for all static assets? But then it starts becoming, well, what's the point of even having this interface? So I definitely bounce back and forth.
B
I would be very careful about what we try to say is an assurance. I don't think, even if we wanted to, we could ever state with assurance that you will get package.json metadata. Even on disk, you are not actually required to have that in your node_modules folders. With data URIs you're not going to have that. James Snell is looking at blob URLs; they're not going to have that.
B
Well, that's a long shot, but they're not going to have it. If we support something like HTTPS, they're not going to have it. So no matter what we do, we have to come up with some sentinel value for "you don't get the data back."
B
So I think, just knowing that, it's not so bad to have an opt-out, because the data literally may be nonsensical or may not exist.
D
Those tools will break one way or another, even with the getPackageJsonPath built-in function that we suggested. We don't have anything with any kind of flexibility when it comes to getting package-based metadata; none of these solutions have it. My concern is less about there maybe being some edge cases, and more about somebody seeing in our docs that they can opt out and thinking, well, I want to make sure that people don't read random fields from my package.json.
D
They will just set it, not being 100% up to date on the exact implications, and then they publish that widely used package that is four layers down under Babel, and now suddenly all of npm is broken, because everybody is having failing builds, because that one author didn't quite think about it before they set the package.json export to null.
B
Well, we saw the accidental risk from is-promise doing effectively that with exports without knowing it. I'm not convinced that doing it on purpose is the same. The whole post-mortem blog post from is-promise was about how they did not understand that they were limiting things when they published, not that they were intentionally publishing.
D
That is only applicable to exports, and I'm pretty sure we stopped searching. Well, I'm pretty sure we use the exact package.json at node_modules/name-of-the-package/package.json; it does not do any searching for exports. So in the case of exports there is no searching, and if there are exports and there is a package.json in that directory, we know that.
F
So I think that, regardless of Myles's suggestion of making it implicit, it still seems important and useful to me to be able to get the location on the file system of the package root. Obviously, for modules like Bradley mentioned, where that doesn't exist, you wouldn't get anything. What are people's thoughts on going ahead with this sort of getPackageRoot API regardless, and then seeing whether we still think the usability of that for getting package.json is insufficient?
F
I see Duran and Jan have their hands up.
D
All right, Jan, it looked like you wanted to go. No? Dropped the hand. Okay. I think getting the root is valuable one way or another. Another example where it is valuable: right now, in the CoffeeScript loader, the only way that we can determine the type of the package is that the loader currently either needs to replicate a bunch of behavior or make up a fake file name and call the getFormat hook.
D
That is built in. That kind of API, of just getting the package root from a path or from a URL, would allow such a loader to actually use a lot of Node's built-in machinery to get a package.json, read the type, and actually make a determination from that. So I think that kind of API is valuable when it comes to making module loading programmable. One way or another, it makes sense. Okay.
F
So it seems, then, that I should proceed in making the PR for that API, and we can discuss it as needed. Of the two issues I referenced earlier, that would close one of them, but not the one that's on the agenda, unless everyone's convinced that the ergonomics around that are sufficient.
F
Cool. Unless, Jan, you just haven't dropped your hand yet, there's nothing else to talk about, so I think we can be comfortable moving on to the next topic.
C
Yeah, I can give an update on that. So Geoffrey did some more analysis, and he posted some numbers at the end of that issue.
C
So I kind of feel like it's chasing a criterion that isn't necessarily going to be possible to get consensus on, but we can go through that on the __esModule side. One suggestion that I made previously was: what if we just get the detection working for modules that are transpiled and have the __esModule branding, which is basically all transpiled modules from TypeScript and Babel and other tools.
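For context, the __esModule branding is a marker property that transpilers set on the generated CommonJS exports object. A simplified sketch (the exports object is built by hand here, and the member values are invented):

```javascript
// Roughly the shape TypeScript and Babel emit for a transpiled ES module.
const moduleExports = {};
Object.defineProperty(moduleExports, "__esModule", { value: true });
moduleExports.named = 42;
moduleExports.default = () => "hi";

// A named-exports detector can use the marker as a gate: only attempt
// detection when the branding is present.
const isTranspiled = moduleExports.__esModule === true;
console.log(isTranspiled); // true
```

The accuracy numbers discussed below are about how reliably named exports like `named` can be recovered from such output.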
C
So there are kind of two questions. One is: are we still trying to define the criteria with Gus, using something rougher about what users expect, or are we going to converge on the __esModule check? And then, depending on which way we go, there's improving the accuracy. My personal feeling is that the accuracy isn't there.
C
I think we can get it there for __esModule, but it's going to be a lot of work still, because we're only at 81% for __esModule; we're at 90% according to Geoffrey's classification here. He should really have introduced it, ideally today; unfortunately he can't make it. But yeah, we need to decide what we want to do, and it's a little bit tricky today because we don't really have all the stakeholders here: Wesley and Geoffrey and Gus.
A
I guess something that I could bring up right now, and I just pinged some people privately to see if we can get input from them: Geoffrey's latest comment from two days ago does claim that, for the __esModule case specifically, he's seeing a 99% pass rate, and the failures that are happening are arguably failing due to documentation, and it was 21 packages.
A
We can dig into those packages and see a little bit more detail. At a high level, and obviously there may be some subtlety I'm missing, those numbers seem to imply that we're getting accuracy at a level that should be acceptable to Gus, and at least to myself on this; folks may have differing opinions.
A
I think any small bit of work that we can do to improve this would be good. I know that it could be confusing if some CJS modules have it and some don't, but the cases we're talking about here are CJS modules that were written as ESM and are being transpiled by tools to specifically target CJS, and the one percent of failures that we're seeing are actually failures compared to the distribution.
C
To respond to that briefly: the results that he's posted are not based only on modules that are transpiled. They're based on all modules that fit these criteria. For named exports on transpiled modules, I think 81 or 83 percent is the number we're getting.
A
Talking to Gus, who just got back to me really quickly: it sounds like maybe the question is, do 99% of modules with __esModule have a hundred percent of their exports correctly detected, and do 100% of modules without __esModule have zero exports detected?
A
That seems to be the angle that we're looking at it from. It seems to me, and you can tell me if I'm wrong here, Guy, that we could do a check for __esModule and just not go down this code path at all if it doesn't exist, which would guarantee us that modules without __esModule don't even attempt to detect exports. And then it's just a question of, hey, what's the accuracy for the ones that do have that field in the current PR?
C
The current approach is kind of based on that, and Geoffrey did do the numbers on that as well. I don't think he posted them, but that's the one where I said it's 83%, I believe. Jan can possibly clarify.
D
Yeah, the 80% sounds about correct. I think the 99% number only happens when you filter by READMEs that match a regex that makes it look as if import is encouraged. It feels a bit like the numbers are designed to get a high percentage, not necessarily designed to track a user expectation or a package author expectation.
A
Yeah, so it seems like perhaps I misread those numbers a little bit; they weren't as clear. So I would say, if we're not getting to those numbers, maybe we need to, to the point maybe you were making, Guy, kind of rethink whether this is worth pursuing at all. Is that what you were hinting at?
D
Can you post the summary you got from Gus in the issue, just to have it clearly documented that this would be the acceptance criteria? We can definitely reference the issue that's there.
A
Yeah, my understanding, though I could be wrong, was that Gus has more or less already said this, but it just might be kind of buried.
D
I think the reverse that you also mentioned is important: if it doesn't have __esModule, it shouldn't have any named exports. As far as I can tell, that is currently not in this comment; the other direction is covered, but not that direction, which I think is an important second requirement that he has. I think the current implementation does satisfy that requirement, but I wanted to have it documented at least. Okay.
A
I just dropped that in the chat and in the issue.
A
Thanks. Cool, for the sake of time, are we good to move on to the next one? Great, very cool. So next up we have subpath extension pattern wildcard expansions, number 535. Guy, would you like to introduce this one?
C
Yeah, so this has been discussed over the last few meetings. It's kind of a pattern scheme in the exports definition to allow patterns, like having folders with index files or things like that, in your exports without requiring users to explicitly provide every entry. So it's a sort of manual extension searching, and obviously it touches on the whole extension-searching debate. I guess that's the difficulty with this one: it awakens those old wounds.
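To illustrate the pattern scheme (the package name and layout here are invented), a single trailing wildcard on the left-hand side is substituted into the right-hand side:

```json
{
  "name": "icon-lib",
  "exports": {
    ".": "./main.js",
    "./icons/*": "./dist/icons/*.js"
  }
}
```

Here `icon-lib/icons/arrow` would resolve to `./dist/icons/arrow.js` without every icon being listed explicitly.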
C
So, in structuring the discussion, one of the concerns is that it's close to extension searching, but this is not an agenda item on extension searching, which it can very easily turn into. The PR is actually a relatively simple pattern replacement. And what was interesting was, I completely forgot that someone made this suggestion for import maps as well, which Jan linked to: there's actually an issue that was suggested by Domenic based on a very similar pattern-matching system in import maps.
C
I'm just going to link to it there; I completely forgot about it, and then we ended up doing basically the same thing, which is quite interesting. Another interesting thing: in the early SystemJS days as well, I remember Jason Orendorff writing a very similar pattern structure. This was kind of a standard thing to want in these maps. So it's interesting that it keeps coming up, but the hard thing is how to discuss this without reopening extension searching at this point, I think.
A
Myself, and then Jordan and Bradley. I'm still significantly flip-flopping on this one. I am genuinely concerned about the amount of complexity that this introduces into the export map, and I recognize that import maps and export maps are already complex. I have expressed this in the past, especially when we were talking about imports and we tried to overload other things into the export map.
A
I'm not saying that I would be for the feature that I'm about to describe either, but I almost feel like I would be more comfortable with a separate extensions map that potentially could even map to folders. And I see you smiling, Jordan, because I know that this is something that you've suggested. But teasing these apart as separate things, rather than having everything in one place, may be more palatable to me.
A
But I'm still kind of on the fence about whether this amount of dynamism in the map, even if it results in a static output, is something that I can get behind. I'm definitely not at the point right now where I'm objecting to it, but I do want to express that my initial gut reaction is discomfort. And then we've got Jordan, Bradley, Jan, and then Guy.
F
So we've had these philosophical, worldview-like discussions about specifiers and stuff for all the years that we've been in this working group, and the current implementation prioritized staticness; essentially that's it, staticness. So we don't have extension resolution or index resolution, we don't have wildcards, and we have the ability to generate a static import map from just the package.json without needing to check the file system, and so on.
F
However, the trade-off that we accepted for that, I think, is that now a bunch of usability issues exist: people want to do all these things that you cannot do in that static world. So to me it doesn't really make sense to, for lack of a better word, half-ass one of those sets of trade-offs. I feel like we should pick one.
F
Either we went with the static thing, which means some stuff is hard to do and some stuff is not usable, and we explicitly decided we don't care that it's unusable because these other benefits are more important; or we need to say no, the usability is what matters, the things people want to do are what we need to enable, and if that's the priority, then we should have all of those resolution things, we should have wildcards.
F
We should not worry about trying to statically generate things, because the priority in that case is what people want to do; let's let them do it. And, as has been said many times, all of the dynamic-y things don't prohibit the static use cases. It just means you have to write tooling to lint for it if you want to retain those invariants. My preference obviously still is, and has always been, going all the way in that direction.
F
But given that we've spent all this time and effort landing on the current implementation, to me, if wildcards land, extension resolution should land too, for the same reasons: because we want usability, we want people to be able to express what they want to do, and we're not as worried about the staticness. I like Guy's proposal conceptually, because these are things I want to do, and I like extensions maps.
F
Obviously I proposed it, but I think we just need to decide as a group whether we want to stick with our original set of trade-offs, or whether we want to abandon them when usability issues exist, which they clearly do around this stuff.
B
So I really like this, because this proposal doesn't have the dynamism that Jordan wants. I'm very concerned, especially if we're having downstream tools like TypeScript or fuse add a file extension, because of how much load time it can add to their tooling, that if we do something that isn't static, not only can it not be precomputed, it also might simply be refused, because if you do something dynamic that requires iteration, it requires iteration at every potential point it could be used.
B
This proposal does not require that, which is what I like about it. It reclaims a fair amount of usability, but there are still some concerns about usability even with this. In particular, the appending of .js on the end means you can no longer use file extensions in specifiers.
B
If you use this feature, that would not match other environments. I actually think that's okay; that is a choice of a package author. It is still static; you're not going to be facing any real performance problems.
B
As we potentially grow the number of file formats we support, you're not going to face tools being unable to do something, because once again there's no runtime component really required to perform these lookups.
B
I really don't want to state that every feature has to be perfect. I love this feature and I recognize that it has limitations. That's all I want to say about that. The only other thing is there are some minor nitpicks about whether to use star on the right-hand side or dollar sign and stuff like that, but I think that's more just PR nitpicking.
D
Right, yeah. The slight disagreement I have with Jordan is about the implication. This is actually going further than extension resolution in some ways, and I think doing this is not a sign that we should do extension resolution; if we do this, it is kind of orthogonal.
D
Exports definitely create some indirection, but at least it's a one-step indirection: if you import something, you have a bounded amount of information you need in order to know exactly the one file that this import will load, whereas searching has forks in the road, which this unfortunately does not entirely avoid. What I find unfortunate about this is that, yes, we removed all the global rules of which files to try.
D
Now every single directory might have a different set of them, which seems weird to me, but I think I can get over it.
C
Tools can easily iterate the mapping and discover the one-to-one mapping between exports and their targets, and the important thing there is that it's one-to-one, whereas extension searching is possibly one-to-many. The fact that it's a one-to-one static mapping is the important thing, and that it can be inferred: for example, at publish time, when the package is published, you can iterate through the files and imagine it as sugar on top of the file system for all the paths that are supported.
C
So, to reiterate, the use cases that this came out of were things like SVG icon libraries, where each SVG icon is a JS file and you've got hundreds and hundreds, potentially thousands, where listing those all in the package.json is now going to slow down your npm install. So it's extensions or nothing. But yeah, in terms of the complexity:
C
I think it sounds more complex than it is, which is why in the last comments I just posted the exact commits, because it doesn't add that much complexity. To respond to Jan a bit on that: right now it's just a single wildcard at the end, on the left-hand side, so you can't actually have anything after the wildcard, to avoid any kind of precedence confusion.
C
We could relax that in the future, but I think the 99.99 percent use case is being solved by this. I don't see it getting extended too much, though it's a difficult thing to put a wall up against. But yeah, I also get the feeling that we are getting far into the realm of complexity for these mappings and that we shouldn't go further than this.
C
This would kind of be the last step in terms of the complexity of these export mappings, and we should draw a line after this and say, okay, that is peak complexity; now we don't want to add more complexity to the resolver. That's a very good thing to be aware of. And the precedence has worked out quite nicely.
C
I discussed it last time: you can, for example, have folder mappings and wildcard mappings, and precedence is just the longest mapping between them, and that works out quite straightforwardly in the end. So there were no implementation complexities that came up that feel like weird edge cases that I could see.
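A minimal sketch of the matching just described: exact keys win outright, and among single-trailing-wildcard keys the longest matching prefix wins. This is an illustration of the idea, not Node's actual resolver code, and the example map is invented:

```javascript
// Resolve a subpath against an exports-style map that allows one trailing
// "*" on the left-hand side. The "*" captures the remainder of the subpath
// and is substituted into the target.
function resolvePattern(map, subpath) {
  if (Object.prototype.hasOwnProperty.call(map, subpath)) return map[subpath];
  let best = null;
  for (const [key, target] of Object.entries(map)) {
    if (!key.endsWith("*")) continue;
    const prefix = key.slice(0, -1);
    if (subpath.startsWith(prefix) &&
        (best === null || prefix.length > best.prefix.length)) {
      best = { prefix, target }; // longest (most specific) prefix wins
    }
  }
  if (best === null) return null;
  return best.target.replace("*", subpath.slice(best.prefix.length));
}

const map = {
  ".": "./main.js",
  "./icons/*": "./dist/icons/*.js",
  "./icons/special/*": "./lib/special/*.js"
};
console.log(resolvePattern(map, "./icons/arrow"));        // "./dist/icons/arrow.js"
console.log(resolvePattern(map, "./icons/special/star")); // "./lib/special/star.js"
```

Because each subpath maps to exactly one target, the whole table can still be expanded statically, for example at publish time.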
A
Thank you, Guy. I'm going to add my bit really quickly and then go to Bradley, and just for everyone, a five-minute warning. I know, Julian, that you had an agenda item that you added down here for removing the getFormat hook. My gut is the import assertions RFC and the import maps in Node.js items could be skipped.
A
Cool. And then, Julian, was the remove-getFormat-hook stuff super time sensitive, or is that something we can punt to next week, or two weeks from now?
A
Right, cool. So if you could open an issue to just track that discussion. And are you an observer on the team, or have you just been kind of showing up? I can't recall.
A
So, back to the topic at hand, just for the sake of time. Thanks everyone for being flexible.
A
I've heard a lot of reasons why we would want to explore this feature: that it adds back a lot of usability, that it keeps things somewhat static.
A
Is there a reason that we would want to shove this into the exports map? For example, I could imagine a world, especially if we're talking about mapping folders, where the export map maps the folders, and then we could have an extensions map whose left-hand side is the equivalent left-hand side and whose right-hand side is just the extension.
A
I would see that as potentially a little bit more flexible, a little bit more explicit. I understand it's adding additional stuff, but it would allow us to avoid syntax questions around the stars, how many stars we use, and how we parse that. To me it keeps those concerns separate, similar to how we separated import and export maps, and this map could actually be reused for both.
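Purely as a hypothetical sketch of that separation (this shape is not a real proposal, and the "extensions" field name is invented here), it might look something like:

```json
{
  "exports": {
    "./icons/": "./dist/icons/"
  },
  "extensions": {
    "./dist/icons/": ".js"
  }
}
```

The folder mapping and the extension to append live in separate maps, and the extensions map could in principle apply to both imports and exports.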
A
That keeps what's going on there pretty consistent. I don't know, it could be a bad idea too; I'm just trying to figure out how I get less uncomfortable with it. So Bradley, you're up, then Jan, but as a heads-up, we've got two minutes and we've got a hard cut at the hour.
B
Sure. I just wanted to build on what Guy was stating. This is something that we've also faced in policies. The reason these SVG icon libraries, and things that look like monorepos that are not pointing directly to files but to specific directories, are problematic is essentially the manual data entry, and people don't want to do that.
B
People don't want to maintain it, and so you end up with subpar things where you ask: why am I importing an SVG that has a .js extension, or why, in order to use this subfolder within my package, do I have to also include /main.js or whatever is inside of it as its main entry point? We've somewhat solved both of those with export maps, but the ergonomics of them are kind of terrible.
D
Yeah, and one of the things that I would add is that, for me, the big value is actually portable specifiers between CommonJS and ESM.
A
Okay, we are a minute over. I'm guessing we'll have to kick this conversation back to the repo. Guy, you have your hand up. Is there an expectation or a hope to land this feature in the next two weeks, before our next meeting?
C
It is still compatible with import maps, and given that it's also something that has been suggested for import maps, it might even be the sort of thing we would want to consider there. Just to touch briefly on the last agenda item: a lot of what Node.js is doing is brushing up against import maps, and so I did want us to think about whether getting more involved in import maps or the spec process might make sense for the project. But that's for next time.