From YouTube: Working Group: 2020-07-23
Description
* Application Mixins: https://github.com/buildpacks/rfcs/pull/87
* Offline Build Packages: https://github.com/buildpacks/rfcs/pull/81
B
Yeah, I'm on a Mac. It seemed to happen most of the time on Windows. Today it happened the first time; I was like, oh, I can't reproduce it, and then I haven't been able to do it again since then.

Yeah, I wonder, when you say that, whether there are any number of things happening in between the start of detect and the finish of detect, and whether, in our efforts to reproduce, we're just facing timing issues identifying exactly where it is.
B
I was going to say, an issue I had reproducing it was that I found it really difficult to stop it during detect, because it was so fast — it's right when the detect phase came up, but before you see any output. The detect logs go by super fast, because detect is fast.
B
It logs after the detecting-phase log, so it's probably a trusted-builder thing too, you know, because it was a trusted builder. I don't know if that changes anything.

Right, well — and I suppose that might be worth investigating — because we trust the suggested builders by default, but the actual builder in your example was not that; it fell back to the Heroku one, so yeah.
F
We've got a few updates on the platform side. I think David's been working on getting the Tekton tasks updated. We now have two tasks that were merged, I believe yesterday, and they are now versioned at 0.1. We're still trying to figure out the strategy for how versioning will work there, but overall we now have the buildpacks-phases task, which runs the individual phases in Tekton, and the buildpacks task, which uses the creator, and those will be out there.
D
Still working on the lifecycle 0.9 release, hoping to get it out in the next couple of weeks. Merging the symlink fixes in from 0.8.1 ended up being a bit more of a debacle than I was expecting, because Windows symlinks add their own wrinkle. So for the release — I think last week I said two to three weeks — I'm still going to say two to three weeks instead of one to two at this point, because of the added time from dealing with that.
D
It now includes deprecated APIs in the supported set — it's a superset of deprecated and non-deprecated APIs — and I allowed for multiple experimental ones, just because there doesn't seem to be any reason not to allow it, even if we realistically think we're only going to use one. I think we are almost at a full slate of approvals here. Those who have not approved, please take a look; those who approved prior to those changes, make sure you're still comfortable with it.
E
Experimental features — this is mine; I just haven't had time to bring it up for discussion. It's also not urgent, so I guess take a look if you have a chance. Application mixins we'll talk about today. Inline buildpacks I thought was in FCP, but in any case I think this has all the votes. Okay.
E
So after the last time we discussed this, I made some significant changes to the CA certs example, to have the two-buildpack idea that we were using, where, like, a regular buildpack provides a cert — well, it's kind of weird: it requires a cert, and then the stack pack provides it by reading the actual cert contents from the build plan. Let's take a look at that.
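The require/provide flow described here might look roughly like the following build-plan TOML — a sketch only; the entry name, the metadata key, and the inlined PEM are made up for illustration, not taken from the actual RFC example:

```toml
# Hypothetical bin/detect output from the app-level buildpack: it *requires*
# the cert and passes the contents through build-plan metadata.
[[requires]]
name = "corp-root-ca"          # made-up entry name

  [requires.metadata]
  pem = "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----"

# Hypothetical bin/detect output from the stack pack: it *provides* the same
# entry, and at build time reads the PEM contents back out of the build plan.
[[provides]]
name = "corp-root-ca"
```

The two tables would come from two different buildpacks' detect phases; they're shown together here only to make the pairing visible.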
E
I think there's some stuff in here that, if we wanted to, we could clean up — we probably don't need the provides here and the mixin here — so we could make some choices that make it a little bit cleaner, but it definitely would work. The outstanding question I have about this is still the pattern matching, which I know we talked about.
E
Regexes — like, how do you know when it's a regex versus an exact string that you want to match? But I think the proposal as it is needs some concept of an apt buildpack that can provide an arbitrary list of mixins without specifying what they are. So I haven't had time to — well, no, I have had time; I just haven't come up with a solution to that. So that's something I'm still thinking about.
D
That's going to be controversial, though — it's sort of the stack-capabilities way of doing this. I know we've, like, migrated back towards a lot of things like that, but I could imagine: if a stack advertises two capabilities, one is certs and one is mixins, then you'd expect to have two stack packs, one that does certs and one that does mixins, and then when people ask for one of those things, we just invoke that one, without doing the build plan matching at all.
A
In the case of apt packages — megabytes of package names — they're, you know, passed to the stack pack. Like, yeah: the required ones are filtered by the ones already in the stack, and the remaining ones are passed to the stack pack that matches; then the ones remaining after that are passed to the next stack pack, etc. Right, something like that.
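That filtering pass can be sketched like this — the package names and the list handling are illustrative only, not how the lifecycle is actually implemented:

```shell
# Sketch: required mixins are first filtered by what the stack image already
# provides; whatever remains is handed along to the stack pack(s).
required="ca-certificates libpq curl"
stack_provides="ca-certificates"     # already baked into the stack image

remaining=""
for m in $required; do
  case " $stack_provides " in
    *" $m "*) ;;                     # satisfied by the stack itself: drop it
    *) remaining="$remaining $m" ;;  # pass along to the stack packs
  esac
done
echo "to stack packs:$remaining"
```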
D
It filters out everything it can realistically handle, gives the rest of the stuff to the mixin stack pack, and it's done — there's no provide/require matching; it's just, this one gets the mixins, and you assume it can do all of them. As we implement it, there might be a time where it fails if it can't do all of them, but eventually it probably should, so that as we define new mixins, they'd be supported in the mixin stack pack.
A
I like the idea that, starting with a stack image, you can layer on modular pieces that sort of dynamically make changes to the base image. Like, if you're a large enterprise and you want to have a special buildpack you distribute everywhere that installs your particular CA certs, right — then you know you can add that to your base images and put that logic right there.
A
It's like, what if you want to provide a change to the base image that maybe is mandatory, but maybe, you know, could be requested by the app through a build-plan buildpack, or could be requested by a custom user's buildpack, right? I like how we've interfaced this into mixins and into the build plan system like that, putting the whole contents of the cert in the plan.
D
Because we've tied this into mixins — like, I think what you're talking about, with arbitrary changes and modularity, gets into the broader vision of root buildpacks, right? But if we're making this specific to mixins, I don't think we want people making custom definitions of mixins, because we've said it has to be a stack author who's defining what a mixin is, and then it's also shipping on the stack image, I mean.
A
Say you're the person who's maintaining the stack image — you can just put all that logic in your one stack pack. Like, I think in many cases people might not want any stack packs, or might want, you know, a number of them that make different changes to the base image. And the thing that differentiates them from root buildpacks, for me, is that root buildpacks break rebasing, essentially — because you can't make guarantees anymore; you can do whatever you want. That's why there are additional restrictions for stack packs: they have to fit this —
A
You know, obey this API-compatibility contract, and they don't let you do everything, right? They have to run using the mixin system that's used to define what that API-compatibility contract is. But they're not, you know, intrinsically coupled to the person who defined the stack ID; they're just, you know, distributed by people who distribute stack images, yeah.
C
I would imagine stack packs that help them kind of not have that — I forget what they called their Rust binary that did that sort of stuff — but, like, I can imagine them wanting to convert all those things into stack packs, which would allow them to have a new base image that could basically automate all those things in a kind of reasonable way.
A
Maybe, say you have a stack pack that just installs, you know, packages from the official repos, right? But on top of that, you want to add support for installing from two different other PPAs. You'd create a stack pack for the PPAs that pulls packages from the PPAs instead, and then, when buildpacks require those packages, the ones that don't have the — whatever — PPA-special prefixes go to the, you know, regular generic Bionic-packages stack pack, and the PPA-specific ones go to the one that installs those.
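One way that PPA routing could look — purely illustrative; the table name, the mixins key, and the "ppa:" prefix convention are all assumptions, not anything the RFC specifies:

```toml
# Hypothetical buildpack.toml for a PPA-specific apt stack pack.
# Package names with the made-up "ppa:" prefix would route here;
# unprefixed names would fall through to the generic apt stack pack.
[stackpack]
id = "example/apt-ppa"
version = "0.0.1"
mixins = ["ppa:deadsnakes/*"]   # pattern claiming that PPA's packages
```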
D
But then, as we define it, can't we just add the installation of it to the one mixin thing, right? Like, I don't know — because we're not letting people define their own mixins on the fly, I don't know why they'd want to add their own mixin installers. Does that make sense? Or, like, if they're defined there, then they're sort of on the hook for all of it, I'd —
A
Here's an example without PPAs: say, instead of one, you know, generic apt stack pack, you only wanted stack packs that install specific subsets of packages that are allowed in your organization from upstream. So you'd want to filter, and there could be, you know, different modular pieces of functionality to install the different package sets. This way you could require all of those, and the right ones make it to the right buildpacks.
E
I was really happy with this buildpack/stack pack division, where there's, like, a generic provider of CA certs and then a user-space buildpack that — you know, in here I'm just including the content, but like I was saying, it could be a path to the platform dir or something; there are a lot of different mechanisms. But this felt a lot better to me than the type of construct that I was trying to introduce.
E
Sure, I mean — I almost feel like that's a separate discussion about how we actually implement that. What I'm trying to illustrate here is: are we giving that stack pack or buildpack author the power, the capabilities they need, in order to implement something like that? We don't have to make a hard-and-fast decision about it, and it doesn't even have to be the same everywhere, right? Like, the Heroku-18 stack could have its own weird mechanism for providing certs if we wanted it to.
E
You know, yeah — so the reason I have this here is because I was playing around with this in the current lifecycle, and I was just running this stack pack as a regular buildpack and pretending it was a stack pack, or whatever. So in order to satisfy that requires, I had to provide it. But I do want, like — that's —
A
If we took the bin/detect part out of the stack pack for now and decided later that we need to add the dynamic functionality back, I wouldn't be opposed to that. But it'd be easier for me to approve if, like, this is where the provide comes from: it's static, it can be on that buildpackage, so you can pre-validate. Either way, though — or, like, if you wanted to keep it in, just explain why we should allow the dynamic ones too, and that'll be okay.
F
It could still be, what is it, dynamic — but dynamic at the mixin level, right? So, like — ultimately, you could have both. I'm not really saying that I like it, but you can see that the mixin itself could use pattern matching for the static-validation aspect of it, and then what's in the detect could be more specific to what the actual concrete dependency
A
is. You shouldn't have to, because, like, the apt buildpack can provide, you know, two hundred thousand packages, right? And so it wouldn't output, during detect — before it gets the require — the two hundred thousand packages it could install. So in the apt-buildpack case, I would say you'd be outputting the pattern in both cases; or, if you couldn't use the pattern in the dynamic case, then you just wouldn't be outputting a pattern there.
E
Okay, so I'm going to revise this to have, like, a first-class pattern versus a list of explicit strings in the buildpack.toml. That seems pretty straightforward. And I'm going to think some more about bin/detect — I feel like it isn't sitting well with me either, so I think I'm with you, Stephen, now. I want to figure that out a little bit. I don't know what direction we'll go, but I want to think about it some more. Sounds good. Anything else that I should be thinking about as I rework it?
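The revision described here — a first-class pattern alongside explicit strings — could look something like this in buildpack.toml; the field names are guesses for illustration, not the final RFC syntax:

```toml
# Hypothetical stack pack declaration: exact names and regex patterns are
# separate fields, so nothing has to be inferred from the string's syntax.
[[stacks]]
id = "io.buildpacks.stacks.bionic"
mixins   = ["ca-certificates", "curl"]   # exact strings
patterns = ["libpq.*"]                   # first-class patterns
```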
G
A little bit of it's gone from memory, so probably a good idea — okay, sweet. So, kind of starting from the top: what this RFC is doing is pulling buildpack dependencies out of the buildpack archive, and there are some benefits we can get from this, namely space optimizations and this kind of one-to-one correspondence between ID and version from the buildpack author's perspective. You know, there was some talk about this point in particular in the last revision, but I think there's a little more context here that'll make it clearer.
G
So this is just an example of kind of a current-generation offline buildpack, where all of the dependencies are actually shoved inside of the buildpack archive, and we just kind of want to pull those out and make them first-class citizens, so that the lifecycle can interact with them and kind of decouple this idea of the buildpack bits and the dependency bits. And so how this works is that we are going to modify the package.toml format to have this new field for offline dependencies.
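Roughly what that package.toml change might look like — the table name and fields here are my reading of the discussion, and RFC #81 may spell them differently:

```toml
# Existing package.toml piece...
[buildpack]
uri = "."

# ...plus the proposed offline-dependency entries, keyed by ID and version
# so the buildpack can locate them on the file system later.
[[dependencies]]
id = "bundler"
version = "2.1.4"
uri = "https://example.com/bundler-2.1.4.tgz"   # hypothetical artifact URI
```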
G
So if we run pack package-buildpack, we're going to get, in OCI format, an image that's going to have these buildpack layers and some dependency layers. And what this will look like when all these layers are actually splatted out is: we have this new deps directory that sits adjacent to the buildpack structure.
G
And so when a buildpack is being run in this offline mode, we're going to have to provide the buildpack a little bit of additional information, namely around what this ID field is, right? It's going to select a version that it wants to install, and it can use this information to figure out exactly where on the file system
G
it should be looking for these offline dependencies. There's an additional environment variable that we can provide — it's called CNB_DEPS, and it's basically going to give us /cnb/deps. So at build time you can take this, concatenate it with your buildpack ID and the version you selected, and check whether that file path exists. If it does, then your buildpack can do a little bit of additional logic to do an offline build.
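That lookup could be sketched like this in a buildpack's bin/build — the CNB_DEPS variable name, the ID/version path layout, and the buildpack ID are assumptions from the discussion, not the spec:

```shell
# Sketch of the offline-dependency lookup: compute the per-buildpack,
# per-version deps path and branch on whether it was packaged offline.
CNB_DEPS="${CNB_DEPS:-/cnb/deps}"
bp_id="example/bundler-buildpack"   # hypothetical buildpack ID
version="2.1.4"                     # version the buildpack selected

dep_dir="${CNB_DEPS}/${bp_id}/${version}"
if [ -d "$dep_dir" ]; then
  echo "offline: installing from ${dep_dir}"
else
  echo "online: downloading version ${version}"
fi
```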
G
So in the last iteration of this, there was a little bit of discussion about why we want to use this ID/version combination as opposed to, like, a hash. And there are a couple of dependencies — one of them is this SDCV tool, which contains both a binary and static data — where it requires an absolute path when you build it.
G
So, in order to have the ability to build this dependency, put it in a buildpack, and have all of the information below this deps/ID directory, you actually need to be able to work that stuff out beforehand, so any sort of hash function isn't quite going to work for us. I guess one of the side effects of doing this is we get offline meta-buildpackages for free, as long as all the implementation buildpackages are packaged up correctly.
G
Your overall buildpack execution — the build — will just run completely offline. And so this outlines some of the label changes we're going to need, so that one of these offline buildpackages can identify the different archives that it actually wants to splat out onto the file system. We're just going to add an array; it's going to have a sha that points to each of these dependency layers.
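As a sketch, that label addition might look like the following — the key names are placeholders, not the actual metadata schema:

```json
{
  "deps": [
    { "sha": "sha256:1a2b..." },
    { "sha": "sha256:3c4d..." }
  ]
}
```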
G
Terrance, I think you brought up some points around layer ordering in the last version of this, so that builds would be reproducible, and this just outlines a lexicographic ordering by diff ID, so that every time you build one of these buildpackages it looks the same — it's reproducible.
D
Am I correct in assuming here that, for something like bundler-artifact.tgz — right now, when we have a dependency like this, it's usually just containing bundler, for example — now all these artifacts would be layers? Like, even before we package them into a buildpack, I'm assuming that, to be able to do this, the URI to the bundler artifact we're pointing at as an offline dependency sort of has to be a container-layer object. So —
G
I don't think it does. I think part of this is that you should be able to put any artifact you want here, and what's going to interact with it is your buildpack. And the only way it's going to interact with it is by saying: okay, is there something here? Then it's really left up to the buildpack to figure out: okay, this dependency was packaged with me, I should know what format it's in, I'd say.
A
It's the tgz from the URI, expanded into files 1, 2, 3, right? So then it's repackaged up as a layer, along with the contents that — I'm assuming — under offline dependencies will be put in the TOML file. So you end up with a layer with no tars or tgz's inside of it, you know, for sure, right? Or, like, no tars or tgz's inside of it unless you had a tgz in the artifact.
E
I don't have a strong feeling about how that should work. What I do have is very bikesheddy, but I think "dependency" is the wrong word, because in the other section of package.toml a dependency is a buildpack, and that's — like, when I first read through this, it was confusing to me: I was expecting those to be images, and then they're not. So I wonder if there's —
D
A thing I have wanted in the past is to be able to take an online builder and make it an offline builder, sort of without having to repackage each individual buildpack, and then repackage the meta-buildpack, and then repackage the builder. I'm wondering if, in the online piece, we were preserving URIs in metadata, like, we would later have all the information we need to take an image and just make an offline version of it, which would be a very convenient thing to do at, like —
D
— the platform level. I don't think the lifecycle can really do this, but if you could say "pack builder offline", give it an online builder, and have it spit out an offline image, that'd be very convenient. Or, at the platform level, you could even, you know, add optionality to only do that for some artifacts. I feel like you could do cool and convenient things in the platform if you sort of preserved the URIs in the metadata, so that later you could hydrate the offline assets instead of doing it up front.
C
I guess that was along the lines of what I was trying to hint at: if I had an online — well, I guess we're calling it online, but maybe that's also the wrong terminology — like, a non-hydrated buildpack, does the buildpack then need to build all the logic to hydrate itself, or is the expectation that this would be handled either by the platform or somewhere else, so that the buildpack could always assume that that deps directory is always filled?
G
Yeah, so as things stand right now, they're completely separate artifacts. There's a little piece in this alternatives section around forcing you to have these URIs — forcing the labels on each image to actually have references to where their offline artifacts would end up — so, if they're present on the registry, they can just get used.
F
Sorry, go ahead — I just want to make sure that I completely understand. The way that the current RFC is spelled out, the platform or lifecycle don't have to do anything, right? Because it would just be very specific locations within the builder itself, or the packages, that would have the additional dependencies, and the buildpack looks at those. So, for instance, pack or the lifecycle don't have to care about anything in relation to the metadata, and whether or not those layers should be hydrated.
G
So I think it would actually have to do some additional behavior, because these layers suddenly wouldn't exist in the online buildpackage that you're using — they're just sitting on, like, the registry. So I'm not exactly sure how pack would navigate that, but I'm going to assume that's different from just having everything in a buildpackage and basically treating all these new dependency layers exactly the same as buildpack layers, where we can just unzip them and they end up — okay.
A
You could think of it like: if your buildpackage has dependency layers in it, and you use your buildpackage to make a builder, the dependency layers flow into the builder, right, and hydrate that builder's cache, so that when builds are done on that builder those dependencies are present and the buildpacks can use them. And then, in a platform like pack, you could decide, when you're building your builder, what cached dependencies to include in the builder, and, you know, have control over the performance.
F
So I think one of the things that might be in line with what other people have been asking about — and might be overzealous, but I think might overall be necessary — is: if we look at the deps diff IDs right now, they're just arrays, where I think it would be helpful to have the entire metadata. So, almost like a map, where we know what the ID and version of each thing is, so that we could do more effective lookups if we need to. And then, I think —
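The map-style alternative suggested here — metadata-rich entries instead of bare diff IDs — might look like the following; the key names are illustrative only:

```json
{
  "deps": [
    { "id": "bundler", "version": "2.1.4", "diffID": "sha256:1a2b..." },
    { "id": "node",    "version": "12.18", "diffID": "sha256:3c4d..." }
  ]
}
```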
D
The kind of builder magic you're talking about, Stephen, is really the magic of subtraction. It's like imagining that there's a world where everything is packaged offline, and then, when you create a builder, you can sort of opt out of things — which I do think has a lot of value, because you might want a smaller builder image that doesn't have a bunch of dependencies you don't need. I'm wondering if there's a way we can frame it so that instead you can opt in, and online also works.
A
I think that's actually a very large change — I mean, it seems like it is, because the artifacts change a lot; it seems like it's different from what's proposed — but everything here about paths and how you construct those images stays the same. It just takes the thing in the buildpackage and pulls it out into a different image, for a little bit more decoupling, and there are definitely some benefits there too. I'm not fully convinced this is the direction we should go, but I'm becoming more so.
G
Yeah, I mean, I feel like this is, like, the exact solution for that — because I think we are doing that; we keep trying to figure out ways to do the same thing, and the inevitable problem you run up into is, like, managing a ton of different buildpacks that are all almost exactly the same, yeah.
F
And then the really cool thing here, if I'm not mistaken, is we could tie it into some caching improvements, right? When certain dependencies are shared across different applications, these would be very specific, globally unique identifiers, with which we could cache things like the JDK so it doesn't have to be downloaded N times.
C
Yeah, I know for us there's even simple stuff like yj, or something, for the buildpacks that we have — it seems like every buildpack uses it, because they need to do TOML stuff, right — and being able to have a globally unique thing that is only packaged one time, no matter how many times every single buildpack packages it, would be nice.
G
If we're going to allow for there to be any sort of hydration, there'd need to be, like, some kind of clause around always producing the metadata you need to find the offline dependencies, right, or the dependencies that exist. And then there'd be another piece — which I think Xavier brought up — which is: okay, how does pack actually reach out and grab this, if it's not just sitting in the buildpackage? So —