From YouTube: Open RFC Meeting - Wednesday, July 15th 2020
Description
In our ongoing efforts to better listen to and collaborate with the community, we're piloting an Open RFC call that helps to move conversations and initiatives forward. The focus should be on existing issues/PRs in this repository but can also touch on community/ecosystem-wide subjects.
A: This is Open RFC meeting 178 in the npm RFC repo. As per usual, quick housekeeping: all these calls and all communication that we have in the RFCs repo are governed under the Code of Conduct, so just be mindful and allow people to speak. We just ask that you raise your hand if there's discussion happening and I'll call on you to speak as well. The point of this meeting is ideally for us to be moving forward on those items raised by the community, and hopefully to have another channel for us to have discussions and move things forward together.
A: I know that Roy had helped clean this up a bit, so I just wanted to quickly say thank you for helping create the agenda today. So let's dive into the first issue, #160, which is the poll for preference on how to filter workspaces. Again, this is a poll everybody spun up to essentially dig in and have some discussion around the filtering mechanism and the design as proposed; there are, I think, three or four options there. Roy, did you want to give a quick update on that?
D: I had some comments on this poll. Basically, the poll is getting at two or three different questions, and then there are kind of three main options here. If we do it with `--workspace=foo`, that means it's a config value, and we can trivially add `-w` as an alias for that.
D: The fundamental question, then: if we're going to do something cute like `npm:foo` blah blah blah, we're introducing a new style of setting a config, or a separate command. That's the one that's kind of the costliest to implement, and I'd strongly push back on that, even though it is neat. I think it is very, very cute and I like it, but having to actually build and implement it — I'm like, oh, that looks like work.
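The shapes under discussion might look roughly like this (a sketch for illustration only — `foo` is a hypothetical workspace name, and none of these spellings are confirmed syntax):

```shell
# Option 1: a plain config value, so it works anywhere a config does
npm test --workspace=foo

# Option 2: the same config with a short alias, trivially added
npm test -w foo

# Option 3: the "cute" prefix style — a new mechanism, costliest to implement
npm:foo test
```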
D: I was writing that comment in the mindset that `npm workspace` should be a top-level command which defines which workspace you're working in, because running tests in a given workspace is, in some ways, a different sort of thing than running tests at the root of the repo. But by the end of outlining these questions — and the more I think about it —
D: I might want to have a command like `npm workspace add foo`, and that shouldn't try to run `npm foo` in the workspace named `add` — it should add a new workspace called `foo`. So if we're going to do that, and we're going to reserve that top-level command for workspace management, then it doesn't make any sense at all to have it also be the thing that defines what workspace you're working in. In that case, we might want to have, you know—
E: What is the whole repo called — a project? OK, so then it would be really weird for me if a workspace ran something different, or was able to do something different, when its script was invoked from the project level versus when I cd into the workspace and ran the same script. They should be forced to be identical.
D: That's how it is in npm 7, according to that RFC. In npm 6, every config is dumped into the environment, regardless of whether it's a default or an override, right? So if we wanted to say, well, the workspace should not have an `npm_config_workspace` environment variable — that actually could get a little confusing if it's going to run subcommands. You might have, like, a posttest script that runs `npm run lint`.
D: So if you're doing `npm run lint` and it's being run in a workspace, that environment variable is going to be contagious — you're effectively going to be running `npm run lint --workspace=blurgh`. So it's basically going to have to know to not update the prefix and make sure it only updates the workspace. Well, if it applies.
D: The other approach is we just set the prefix and the — you know, the folder prefix and the working directory are... sorry. The other option is we don't change the working directory and the prefix; that might actually be harder. Yeah, I mean, I think you bring up a good point. We probably should make sure that we never include that in the environment, and that actually resolves one of the issues I brought up — like, would you ever want to run—
E: Right, yeah. So it seems like a possible feature to add later, where from within a workspace you could invoke a command that tells npm to look up to the project and then invoke in a different workspace — like, maybe. But that seems like something that could be added regardless of this question, and you'd still want to run it from the working directory of the workspace, right?
D: We would just have to, you know, not set the prefix config, and be able to know: okay, it's a workspace that's calling this, so I need to go back up to my prefix and then do it. Having that as a default feels really surprising to me, so I think I'm coming around into somewhat emphatic agreement with you. But yeah, we do need to not put the workspace value in the run-script environment.
A: Sounds good. So, moving on — unless anybody else had anything else they wanted to touch on there? Nope, okay, moving on. Number two here, RFC 150: add a file-pack dependency protocol. I'm not sure, David, if anybody's updated this, but if you want to go back and maybe give us another update about this RFC.
D: So — I'd kind of like to go over where I think we landed in the discussion. I apologize, I did not go and update this ticket as I had planned, but yeah. Essentially, it would be nice to be able to say: I want to install a file dep, but I want it to be packed and unpacked rather than symlinked.
D: That being said, the open question for me is: should this be at a dependency level, where you say I want this file to be packed and unpacked, but I want this other file to be a symlink? Or is it more at a project level, where I say, in this particular environment where I am right now, running this command, I want you to install all my file deps—
D: —as, you know, packed-and-unpacked tarballs. I can see the arguments for either one, but for me it's kind of like: if it's more of a project-level thing or an environment-level thing, then it should just be a config value, right? There should just be, like, an `npm file-deps-as-symlinks=false` or something — or even something you can put in a `.npmrc` file, or in the environment, in your build configuration, or what have you.
D: If you really need the granularity of being per-dependency, and you want to make it consistent across all environments and all uses, then using a protocol might be a better approach — or at least something that's saved in the lock file, to say: this one is packed and unpacked, it's not a symlink. Yeah.
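As a sketch of the two shapes being weighed — both the `file-pack:` protocol name and the config key are hypothetical, not settled syntax:

```
Per-dependency granularity: a new protocol in package.json
  "dependencies": {
    "lib-a": "file-pack:../lib-a",   <- pack + unpack a tarball
    "lib-b": "file:../lib-b"         <- plain symlink, as today
  }

Project/environment level: a single config value, e.g. in .npmrc
  file-deps-as-symlinks=false
```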
B: What I was thinking about was the last option that you shared. I would like it to be consistent, so whenever I'm installing this package, I know that I'm always linking this package to another — maybe an application or some other library. I want it to be built or installed with its dev dependencies, so it can then be built to have some distributables, and then I want to link it back to the parent, or the consuming library.
D: If something effectively came from a tarball that was packed from that file location, then we treat it as valid. Right now, if you list something as a `file:` dependency and Arborist sees something other than a symlink there, it's going to say: no, this is wrong, this is invalid. So we would just have to be a little more cognizant, because what you don't want is, like — I run...
B: Okay, in that case, if we are adding a configuration — I don't know if it's in package.json or in the .npmrc — all this configuration would be read by the install. I don't know whose responsibility it is to do that. But if, for instance, I'm installing a package which has some dev dependencies that are required for it to be built, who is going to be doing this install before doing the pack? Oh, so—
D: Yeah, because otherwise — I mean, we've had that for a couple of years. It's required for installing git deps that have, you know, Babel scripts or CoffeeScript or whatever else build stuff. So doing that for file deps is not particularly challenging; we've got the code already there to do it. Yeah.
D: Actually, there's some logic in Arborist today to say: if it's a file dep, then create a symlink; otherwise do whatever we have to do to unpack into this folder. So essentially what we'd do is: if that config flag is set, don't do that switch — and that's the whole implementation.
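The whole change, as described, is one extra condition on that switch. A hypothetical sketch — not actual Arborist internals, and the `packFileDeps` flag name is invented for illustration:

```javascript
// Arborist-style decision, reduced to its skeleton: file deps become
// symlinks unless a config flag says to treat them like tarballs.
function placementFor(dep, { packFileDeps = false } = {}) {
  if (dep.type === 'file' && !packFileDeps) {
    return 'symlink';          // today's behavior for file: deps
  }
  return 'unpack-tarball';     // pack the folder, then unpack the result
}

console.log(placementFor({ type: 'file' }));                          // symlink
console.log(placementFor({ type: 'file' }, { packFileDeps: true }));  // unpack-tarball
console.log(placementFor({ type: 'registry' }));                      // unpack-tarball
```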
D: I don't know if we support `git+file:`, though. Oh, right — so we would have to add that. Maybe we should; that's really easy to add. But I think what you might actually have to do today is start a git daemon and then do, like, `git+http://localhost`, you know, whatever — that's pretty onerous, and now you're like, oh...
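The onerous workaround being described might look something like this (a sketch only — the paths are illustrative, and note that `git daemon` actually serves the git protocol rather than HTTP):

```shell
# Serve local repositories over a network protocol npm understands...
git daemon --base-path=/path/to/repos --export-all --reuseaddr &

# ...so npm can install from "localhost" instead of a plain file path
npm install git://localhost/my-pkg
```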
D: We don't support that. Yeah, we've talked about it — I think there's an open RFC for it. Actually, there might not be an open RFC; it might just be an issue comment I'm remembering. But the idea has definitely been tossed around, and again, it's one of these things that's a change to the dependency specifier, which is always tremendously costly, because we have to update, like, a million different things to support it.
G: One thing — I'm not super familiar with this proposal, but there is a new feature landing in the next version of Node for package imports, which allows you to override your imports internally in your own package, not from an external interface, and I believe we went forward with using the hash symbol for that.
G: The purpose of that is so you can take a specifier that's internal-only to your package and then specify an entry point that's specific to the runtime. That could allow you to override built-ins, override internals. My understanding, for what it's worth, from the import maps proposal, is that it's generic — it's just left-hand side maps to right-hand side, and you do scopes on a per-package basis.
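For reference, the `imports` field being described looks like this in package.json — `#`-prefixed specifiers are resolvable only from inside the package, and conditions pick an entry per runtime (the module names here are illustrative):

```json
{
  "name": "my-pkg",
  "imports": {
    "#dep": {
      "node": "dep-node-native",
      "default": "./dep-polyfill.js"
    }
  },
  "dependencies": {
    "dep-node-native": "^1.0.0"
  }
}
```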
G: So it's not based on any particular symbol to drive that — it just works — but the symbol that Node went with right now is that hash tag. Now, for what it's worth, it's still behind the experimental modules interface, so if we have feedback about that and are concerned, that's always something we could pursue. I actually am actively working — definitely not during this meeting — on the 14.6.0 release, which includes that change, so it hasn't even gone out yet.
D: What is the field, and what's the most useful thing that we could possibly do with it? Then let's try and figure out how to implement that. I really don't have strong feelings about what gets included. Mostly, if there's a field that people are currently putting in package.json to do this — great, then it probably already mostly works; we just have to add it to the corgi docs so that we can start hooking more features onto it.
A: Shall we go ahead, then — yeah, yeah. So essentially just ratifying this, and then queuing up the work on the registry side, is kind of the next step — if we're saying this is already how the TypeScript community is defining this in package.json, and essentially we're just providing this metadata so it can be used, right? So, yeah.
A: So in terms of actually queuing up this type of addition — I know that we can potentially make these changes really quickly; our team can do that. Are these types of additions something that we should be putting on the backlog of the registry team?
F: It's like the `main` field: that solves the redirect for where you go to find your JavaScript file, and the same concept exists for types in TypeScript — and the Flow team is open to having a similar thing. The thing is, because it's optional, you can't know from information in the registry whether a package has types or not, which effectively means you can't say on, like, the npmjs.com page that this has types, because the field for doing that is optional.
D: In order to increase the chances — and shorten the time frame — before you get what you're asking for, this work should be done at publish time, I think, because we can do that now, right? We can do that in npm 7; we could even do that in a 6.x minor release if we really felt so inclined. Then the next step could be that we run a, you know—
D: If we decide it's worth it, we run a script that will go and apply that to all the past published tarballs. Or we could just not do that backfill process and instead say: okay, all publishes moving forward, with this version of npm and up, will include this metadata. I think that would actually — yeah, that would get you what you need with much less wailing and gnashing of teeth.
F: That was my first attempt at this, when I went over to try and build it. The field that I was trying to add was the underscore version of it, but I can't just make it do the TypeScript one — because then, you know, I'm doing TypeScript-specific stuff.
F: Maybe you don't want to special-case them, because maybe TypeScript isn't as dominant as it is today — which makes it a registry concern. But I originally thought of it as a client thing: just going in, resolving this thing, and attaching it to the package.json before it goes up.
D: So we have a module on GitHub at npm/read-package-json. That would be kind of the place to put that extra metadata. It already does stuff like figure out if you're in a git repo, read your readme file to seed a default description, and a bunch of other kind of slow niceties that you don't want all the time. Yeah.
D: You can define a package.json that it inherits fields from, and then it just sort of overrides whatever is there. There are a couple, three pretty big issues in the RFC as written, which I think we could probably carve down and make more reasonable. One of them is that it contemplates using a name, or a URL, or a version of the parent package to use — so you could have, like, an `extends` of `express@10.x` — and where that gets kind of challenging is: what happens if it's deprecated?
D: What happens if it's deprecated? What happens if they just publish a new version that moves something, or changes its dependencies, right? So your package.json is no longer deterministic. The other problem: even if we could somehow say, okay, it has to be a specific version, it can't be a semver range — there's still, like... let's say I install 2,000 deps in my app, which is not an unusual number of deps to install.
D: If every one of those is extending a package.json from another thing, now I actually have to do two lookups that can't be parallelized, right? I have to do two serial lookups for each one to get its manifest. Now, we could somewhat get around that by making publish time sort of lock that down — so when you publish it, it just snapshots it; it says, this is this thing in its state right now.
D: Right. So then, even if we do lock down the manifest as it lives on the registry, what gets dumped into your package.json file — or what gets dumped into your node_modules folder — is still whatever's in the tarball. So if each one of those things is extending something, that's a reach out across the network.
D: Then we have this situation where running `npm ls` has to make, you know, a thousand HTTP requests, and that could be pretty onerous. Or we'd have to go back to the kind of bad old days of mutating the package.json as it goes into the tarball, or as it comes out of the tarball. One thing I was thinking might make it a little more manageable is to say that you can only extend—
D: —you can only extend a package.json with a file path. So I can have a workspace that extends, like, a sibling workspace or the root project, or just some other arbitrary file somewhere on my system. In that case, we do sort of snapshot the current state when we do a publish, and that's what goes in the tarball — and when somebody installs that package, they just don't have to worry about any extends stuff.
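Restricting `extends` to file paths might look like this — the field name and shape are hypothetical, since the RFC's syntax is not settled:

```json
{
  "name": "my-workspace",
  "extends": "../../package.json",
  "scripts": {
    "test": "tap"
  }
}
```

Because the target is a local file, the extended result can be snapshotted into the tarball at publish time, so installers never see the `extends` field at all.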
A: Sounds like we'll give folks some time to have a look and jump in, then. Unless anybody else has anything to add, we can move on to the last item: RFC 114, expand the list of ignored files. Did you want to give a quick update on this? And I know Jordan also added some comments here.
E: I think that if the goal is to prevent things from leaking, file names aren't a helpful solution. The only time I've ever leaked things I didn't want to leak was when I had — accidentally, or rather intentionally — dumped the output of `env` into two different files so I could diff them, and I forgot to delete the two files before I typed `npm publish`. So I had a bunch of secrets—
E: —that ended up getting blasted out, which luckily were all caught quickly, and nothing happened. But the real check is all the server-side things that GitHub and npm do to look for credentials. There's probably a subset of those that you could apply to look for credential-like things without actually knowing for sure if it's a real credential. That would actually be useful if applied to all files, not just to specific file names, because no file-name check would have caught my arbitrary before-and-after file names. Yeah.
E: In the human case, at the very least, when it's a TTY, you can provide a prompt — and it can, and should, I would say, provide a prompt for anything like that. That'd be useful. Individually, some of them might end up being annoying and you might want a config to bypass them, but that's — yeah, that's tomorrow's problem. Yeah.
E: Yeah — I guess, to be clear, I'm not volunteering to write that one; I already have two or three RFCs I've volunteered to write that I haven't done yet. But what I mean is, if the goal of your RFC here is to prevent leaks, then other than `.env` files, maybe, I don't know how much it's actually going to achieve. I think it's more likely to make package sizes smaller, which is good — people like that.
E: That includes a folder called `dist` — and I'm only halfway down the list — and that would break a large number of npm packages. So I think the bulk of the contents are useful, but I don't think any automatic merging is likely, just because `dist` seems to be the only one in that file that would break things. But the other thing is the `*.tgz` entry: anybody who intentionally has a tarball in their package — which I think lodash played with at one point — would also break.
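The entries under debate are patterns along these lines (illustrative — the actual proposed list lives in the RFC):

```
Proposed default ignores that would break real packages:
  dist/          <- many published packages ship only their dist build
  *.tgz          <- breaks anyone intentionally shipping a tarball

Entries that are more clearly safe to ignore:
  .env
  *.tsbuildinfo
```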
A: I think the concern with going forward with any net-new ignores is just the false safety people would have downloading packages that historically were already packaged with the previous set of rules. And truly, if we're concerned about package size and security, then I think maybe the warnings, and the approach that we're speaking to, belong in a net-new RFC that would take the, you know—
H: I would definitely echo what Jordan said and assert that I wouldn't look at this RFC as a security feature, but more as limiting the ecosystem in a way that enables people to fall into a pit of success. I mean, I know I've certainly published things that I didn't mean to — like accidentally over-published, or stuff like that — and I think this is more in line with that, and just good hygiene, rather than security.
H: Yeah, and I guess how I would look at that is: do you ever really want to publish your `.tsbuildinfo`? There are things there that you generally, probably, do not want to publish, and that's what you should be targeting — you shouldn't necessarily be targeting everything at once. Especially since, you know, comparing a `.gitignore` to an npm publish or `.npmignore`: one is mutable, one is immutable.