From YouTube: Working Group: 2020-05-28
Description
* Plan Merging: https://github.com/buildpacks/rfcs/pull/67
C
Awesome. So this RFC is aimed at, I guess, solidifying the format for the build plan and buildpack plan, and, with a solidified format, using that to our advantage to allow the lifecycle to do some work that is very prevalent and is kind of toil for buildpacks and buildpack author libraries. I guess that's what the aim of this was.
C
This emerged from how we were trying to communicate in which images a dependency should be available. So right now, in the current state of the world, we just put a flag called build or launch in the metadata in our plan TOML, and then we read that on the other side during build and assign it to layers after the fact. So we really wanted a way of doing that more expressly and more thoughtfully through the actual interface itself.
C
And then it got a little out of control, I'm not gonna lie, and we've ended up with what you see here. So I can do a quick walkthrough of what these sort of mean and what we're aiming to do here; I think that's probably a fair idea. So provides looks generally the same, except that we've added another field, which is a strategy version, and then we have a flag called collect now, which will become relevant once we get to the buildpack plan.
C
The source field is important for what we're proposing for the buildpack plan, which is to merge all of the version constraints together. If there is a failure during that merge, which happens at the very end of the detect phase, that will cause a failure in detect, and we will be able to surface what the source of that failing constraint was. Then there's capabilities, which is meant to be a map of strings to booleans, and this will be — we'll get into this, but this will be in the buildpack plan.
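To make the walkthrough easier to follow, here is a rough sketch of what a provider's detect output might look like under this proposal. The field names and values are illustrative guesses reconstructed from the discussion, not the RFC's final schema:

```toml
# Hypothetical detect output from a node-providing buildpack.
[[provides]]
name = "node"
strategy = "semver"   # the version-merging strategy mentioned above
collect = false       # false => lifecycle attempts to merge constraints

[[requires]]
name = "node"
constraint = "14.*"
source = "package.json"   # surfaced if the merge fails during detect

  [requires.capabilities]  # map of strings to booleans, OR-merged
  build = true
  launch = false
```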
C
These will be OR-merged together. It's like a sticky bit: once one of these is set true, then every subsequent one will be set true after that. I just have build and launch here as examples; this could be anything that would be a boolean operator for the buildpack. And then we have requires, and this one is meant to be string to interface, or string to arbitrary data, and it is meant to hold unique communication pieces between buildpacks.
C
I then talk about these keys and give a long explanation, but let me switch to the buildpack plan, which now has the same sort of array of entries, but they have been condensed such that the name is unique for each individual entry. So for any given name, all of its requires are condensed: the constraint is now the merged constraint, or it doesn't exist and is instead part of this collection — and that's where that strategy bit comes in.
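A sketch of what such a condensed buildpack plan entry could look like after detect — again, the keys here are illustrative assumptions, not the RFC's exact schema:

```toml
# Hypothetical condensed buildpack plan: one entry per unique name.
[[entries]]
name = "node"
constraint = "14.2.*"   # merged from all requires; absent if collected instead

  [entries.capabilities]
  build = true    # sticky bit: true once any require set it true
  launch = true
```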
C
We were not necessarily super confident that we could handle every single version-merging strategy ever, but we wanted to cover the most general cases — you know, semver-aligned cases — so we can give you a merged constraint. But we also give you a bailout, essentially, down here, where you can do it yourself. Then we have the OR-merged capabilities, and then we have a last-in-wins merged modifiers field.
C
Yeah, that's pretty much the crux of the whole thing. I could get into more detail if you all have any questions, and I have some examples below of what these look like between a merging and a non-merging strategy, but I don't know if I have a whole lot more to say without some questions about this.
C
So let's go through the merging example first. Here we're using node as our example. So we have our provides be node, and then we have our collect strategy, which will default to false, so we will always try and merge first; you have to intentionally go out of your way to set this to true. And it's worth pointing out that the provider is the only one that sets this.
C
So the buildpack that is providing the dependency is the one that is dictating whether or not the versions are merged, and the reason it's laid out like this is so that it can potentially be extended in the future, if we decide that we want to extend it to modifiers and capabilities and whatnot. So then we have two requires fields. We see that this is a constraint that we're pulling from a configuration file, buildpack.yml, and this is a constraint that we're pulling from package.json.
C
So then, when we look at the merge at the end, we will get that there is one entry — one entry filled in with the name node that has a merged constraint. And I don't know exactly what this merge will look like in terms of constraints; this is what I have for right now, and that is subject to change depending on the implementation that we choose.
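The node example being described could be sketched roughly as follows. The file names, keys, and the comma-joined form of the merged constraint are illustrative assumptions, since the speaker notes the merge's exact shape is still undecided:

```toml
# Two hypothetical requires from different buildpacks during detect...
[[requires]]
name = "node"
constraint = "~14"
source = "buildpack.yml"

[[requires]]
name = "node"
constraint = ">=14.2.0"
source = "package.json"

# ...condensed into a single buildpack plan entry at the end of detect:
[[entries]]
name = "node"
constraint = "~14, >=14.2.0"   # merged form is subject to change
```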
A
To confirm: the whiteboard functionality of anybody being able to put any value in is gone now, right? Since, as a buildpack author, I have basically no way to guarantee that people are putting values into modifiers that are unique from one another, there's no way for me to ever see all of the ones that all of the other buildpacks requested.
C
Otherwise, all other fields are the same as the previous example. So now, when we get to our build plan, we will see that there is no entries version anymore, because we are now doing a collection of what that would look like. So this is maybe where this would be expandable: if you wanted to have all of your modifiers collected, and then you could introspect on them or whatever, we could potentially add something like a provide strategy of modifiers-collect and then do the same thing with modifiers down here.
C
And there's like a conflicting strategy — I actually don't know what that looks like in the actual lifecycle itself. What happens when you have multiple provides with the same name, and then a requirement matched up with them after that, if that makes sense? I don't.
B
I actually don't know off the top of my head right now what happens when there are multiple provides with the same name. I think I remember that it sort of lets a buildpack claim that it provided that thing, and then maybe that metadata — but the requirement isn't provided to a later buildpack. But I could just be making that up; it would be nice to double-check.
B
Starting with what I like about this: I think it could be good to add a more structured schema to requires, but maybe limit the depth and arbitrariness of the data there, and then hopefully more consistent contracts between buildpacks can emerge from that. Because I think, with a little bit more guidance and structure, things will just end up looking more similar and be limited in their complexity, in a way that I think could be good for the ecosystem.
B
Yes, they can all merge together, I guess, but for every noun that could be provided or required, and the way people would specify that — can we actually, you know, come up with a strategy that we can apply that broadly everywhere? I'm not sure about that. And then also, it's really just making sure the requires don't conflict with each other; there's no matching against a provides version constraint.
E
I don't see it as a generalization; I see it as an extra feature. So by default you get essentially what you get now, but in a stricter structure, right? But the providing buildpack, whose job it would be to parse all the versions anyway, can opt in to having all of that handled.
E
If the providing buildpack knows, "I only accept semver-compliant versions, and my job is to merge them together to find the one that, you know, matches all the constraints," then it can opt in to having the lifecycle do exactly what it would do for it, because that's a pattern that's shown up in so many buildpacks. It doesn't bother me that, you know, that's some logic.
E
I think if you're trying to write a simple buildpack that provides a dependency — provides different subsets of versions of node, whatever — having the logic in the lifecycle makes it really easy to do that, because regardless of other contracts, you get one number back, or a range back, and you know that's what you're supposed to install. Again, it's purely additive; buildpacks don't have to use this, right — they can.
C
I think it's currently the reverse right now; I think collection being true, as is, is not the case where we merge, but we could easily swap that. But I think what you're talking about was modifiers in particular, right — that, as it stands now, you can't have colliding modifiers. Would a collection — not a merge of those, but a list of those — be sufficient to avert that in your eyes, or would you need something else on top of that?
A
I have absolutely no control over any other buildpack that's put in an order with me. The keys that they choose to contribute can be anything. I believe there is a piece of text in this that says these values are expected to be unique, and I have exactly zero enforcement of that particular requirement, because I don't control all of the other buildpacks.
A
The only reasonable expectation there is to be handed all of them and try a buildpack-specific semantic resolution of multiple values. Hoping for there only being one, or last-wins, is just not a viable option in a completely open ecosystem. I have no idea about order; we have no idea about who's asking.
E
You can't look up the buildpack that says it requires the dependency anyway, right? Like, the same thing applies when they're not merged together. You also have an arbitrary list of, you know, things, and the buildpack has to decide to pick some of them, and so, you know, so what?
A
A concrete example of this today is the Procfile buildpack. It reads a Procfile during detect, because it can't guarantee that the Procfile will exist when build actually happens — compiling Go and removing the source code of a Go application would remove the Procfile — so it adds every entry in the Procfile as metadata to the requirements. There is absolutely nothing stopping another buildpack from adding another entry to a Procfile requirement.
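The Procfile pattern being described might look roughly like this in a buildpack's detect output. The commands and key names are made-up examples; only the overall shape (process types carried as require metadata so they survive until build) comes from the discussion:

```toml
# Hypothetical detect output from a Procfile-style buildpack:
# each Procfile entry is stashed as require metadata, since the
# file itself may be gone by the time build runs.
[[requires]]
name = "procfile"

  [requires.metadata]
  web = "bundle exec rails s"
  worker = "bundle exec sidekiq"
```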
E
So you're saying you want to be able to have a whole bunch of buildpacks that contribute disparate things that you want to join together. Yeah, yeah, got it; that makes a lot of sense. Is there a reason — this is a little bit off-topic — is there a reason the Procfile buildpack uses the build plan for its process types? That seems... how?
A
The Procfile buildpack is traditionally the last buildpack to run. So you can say, "oh yeah, I'm a Procfile," and then you go to the Go buildpack, and the Go buildpack removes everything in /workspace, and then, when the Procfile buildpack comes along later on and says, "okay, let me go read that Procfile and find out what those commands were" — they're all gone. Oh.
A
But there are other places — and this is my overriding concern with this entire RFC — there are novel things that we cannot anticipate that will simply be closed down by the fact that we are making the build plan stricter than it currently is, right? It's not a matter of "oh, tell me the use case and we'll add some sort of functionality for the use case." That's the beauty of the whiteboard pattern.
A
The reason it's in the Gang of Four book is that it allows the most amazing flexibility, which is especially useful to a project that is as new as buildpacks. We have no idea how buildpacks are going to use the build plan today, and yet we are codifying behaviors that some small subset of buildpacks do today.
E
If we go look at what the structure looks like with stuff besides version — if you could create lists for the other things, if you could create lists for modifiers as well... oh yeah, we have collection, but if you could do that collection thing for modifiers and capabilities, would that make you feel better?
D
I mean, it probably would. For me, it doesn't make sense to actually merge these if you're going to collect them, right? Like, even in these examples, it's weird to me that if I require "special edition" to be true, but it happens to come first and last-wins applies, then as the buildpack that's providing node — if I'm able to provide a special edition — I never know that that is the thing that people want.
E
I wonder if we need to create a different system for arbitrary metadata across this, then. The system is very much designed for "I provide a thing and I require a thing; those things are versioned things, and the providing buildpack provides one of them and the others have a bunch of constraints." It feels like we're trying to cram a bunch of other use cases in, which makes that system overly complex as a generic structure.
A
But I don't think we're trying to cram things in. The point of the build plan originally was to give a whiteboard. Even before we switched to the declarative build plan, it was always supposed to be a whiteboard for arbitrary communication between buildpacks — that's the original inception of it — and a whiteboard pattern with zero structure allows exactly that. It's not cramming functionality into it; it's utilizing it for what it was designed for. And now what we're doing is tightening it down and claiming all those other use cases that we have are no longer valid.
E
Using it to create a different mechanism for buildpacks to provide process types — that's not launch.toml, so that a Procfile doesn't get removed at the end — seems very contrived to me. But I would like to see use cases outside of that that are about generic information being transferred ahead of time that can't be transferred at build time, that aren't dependency resolution.
B
There are two questions here, like having more structure around the schema of a requires versus the merge, and I think Ben disagrees with both of those, but just to take them one at a time. The one that I have a problem with is the merge, because it removes information from the system, right? Once you've merged stuff, you've lost information, and there's nothing to stop buildpacks — with the help of a library, if they want — from ignoring or merging that as they want later.
D
Yeah, that makes sense, man. I think an example, potentially even for capabilities, is: you can imagine a buildpack having a dependency on node, but only at build time, and you can imagine another buildpack that wants that at launch, because it needs it to run. But if that happens to come first in the buildpack order, you would lose that information and wouldn't be able to actually know.
E
You know, an example is: there can be different configuration properties, often disjoint sets, about a particular dependency, and they get merged together. So "node needs to be built with this flag," "node needs to be built with this other flag" — those could get merged together: node needs to be built with both these flags to satisfy all the requirements. That was the idea.
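The disjoint-configuration case just described could look something like the sketch below. The `build-flags` key and the flag values are invented for illustration; only the merging idea comes from the discussion:

```toml
# Two requiring buildpacks contribute disjoint build configuration...
[[requires]]
name = "node"

  [requires.metadata]
  build-flags = ["--with-openssl"]

[[requires]]
name = "node"

  [requires.metadata]
  build-flags = ["--with-icu"]

# ...and the provider would see them combined, i.e. node needs to be
# built with both flags to satisfy all requirements:
#   build-flags = ["--with-openssl", "--with-icu"]
```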
E
Hey, I like that idea. I still think that having an opt-in way for the lifecycle to do some version merging — merging that's optional and controlled by the provider-side buildpack — is fine, as long as it's opt-in. I want to try to break this down into smaller pieces: does anybody disagree that that should exist in the lifecycle at all, even if it's an opt-in thing on the provider side?
A
The point of constraints is they're literally constraints; they cannot be resolved without an understanding of all of the possible options available, right? The way semver has traditionally handled this is it concatenates them with a bunch of commas, and then something with that additional context figures out what satisfies the entire collection of constraints and makes a decision from there. But the lifecycle inherently can never do that second bit; the best it can do is concatenate all of the constraints together, because these aren't version numbers, right?
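The concatenation being described can be sketched as follows; the specific constraints are made up, but the comma-joined form follows the common semver-library convention the speaker refers to:

```toml
# Sketch: the lifecycle can only concatenate constraints, not resolve them.
#
# Individual requires:          The concatenated result:
#   ">=10.0.0"
#   "<12.0.0"          ===>       constraint = ">=10.0.0, <12.0.0, 10.16.*"
#   "10.16.*"
#
# Picking an actual version still requires the provider's knowledge of
# which candidate versions exist — context the lifecycle never has.
```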
B
I'd almost be more okay with it if we really wanted to fail early in detect — if that was the primary goal, rather than convenience of consuming data. It might be better served by another hook into the buildpack, like "here is what we got; can you do this before you get into building things?", rather than having the lifecycle do a small slice of that logic up front while a lot of it gets left to the buildpack.
E
Let's put the merging of the versions aside for a second and go back to the structure, because I think that might be something where we can make progress; there might be more constructive things about that. Because we have that original need — that you have to add the build and launch flags, and the flags are really weird. There's all this logic in buildpacks to wrangle together those build and launch flags from all the metadata, and it's really...
E
You know, it's commonly used, and before, a big argument against contractualizing those was that it would be contractualizing them for no reason: the buildpack would still have to read those values and then set them somewhere else, which is very weird. And that was a good point — that doesn't make sense, right? But now we have a different use case that's come up around this, which is: we want to be able to exclude all of the build-specific stuff.
E
That is, exclude everything that's not for launch from the build plan in the image, because otherwise we don't end up with reproducible images. The build tools, even if they have nothing to do with the image that's generated in the end — even if they're something that's just supportive of the buildpack doing its processing — end up in the bill of materials now and make the image not reproducible. So does that requirement then mean that we don't need the...?
D
If so, are you saying — I don't know if this is laid out in this RFC, but I assume you're implying — that if it ends up saying build is true and launch is false in capabilities, and that gets passed to the buildpack, is the buildpack responsible for removing that from the build plan? Or is this proposing that it gets removed? I mean, the thing you just said, not what's in the RFC.
D
Like, if the resulting thing you get at the end for the capabilities says build is true but launch is false — after everything is resolved, when it comes to the buildpack — is the buildpack expected to remove that, so it's not in the build plan? Are you proposing that, because it isn't in the launch image, we don't want it in the build plan, and that's a capability we want in the lifecycle at all?
E
If that makes sense — I think it makes sense. If the providing buildpack removes it from its bill-of-materials entry, then it pushes it to the next buildpack, or, I think, maybe fails if there is no next buildpack. So it's impossible to remove the build-equals-true entry and end up with a reproducible thing.
E
It could try to remove the version number of it, but the feature where you can push it to the next buildpack conflicts with the idea that you could remove it from the build plan in order to improve reproducibility. I think there needs to be something where it stays in the build bill of materials, but we know that this entry shouldn't be added to the image metadata; it should go into report.toml or some other place.
E
We could just — I mean, the simplest thing we could do is hoist build and launch to the top, along with the current version, and make them special. Then again, it does feel a little weird, because they don't actually control the build and launch; the buildpack still has to wrangle those into layer metadata. What they really control is whether or not they end up in the bill of materials. And you couldn't have an exclude flag at the top or something, because you don't know if somebody else is going to have something that's build.
D
Is it actually that weird if you hoist them to the top? I guess, kind of independent of this RFC, you're basically saying, "I require this thing to be on a build layer or a launch layer." I guess part of the thing is you're making assumptions about layers, where the dependency may or may not turn into a layer, I guess, right? Yeah.
B
Crazy thought: do we just want to be decoupling the build plan from the bill of materials, now that we've found these complications where they're not the same thing? Like, maybe we'd also end up with more buildpacks writing a bill of materials if it wasn't sort of this backdoor in the plan, but was an explicit output of a buildpack, yeah.
A
And I think, for a lot of things — like the fact that we put our dependency metadata into the build plan on the way out (well, really into the buildpack plan on the way back out) — we'd just put it into a different place. That's already a from-scratch, or whole-cloth, contribution to the build plan; none of that data existed on the input into a buildpack.
B
What I like about a little bit more structure: the way we have things currently, with build and launch flags at the top level in metadata, I could see an equally reasonable person saying, "under metadata, I have a table called layer flags, and that's where I put my build and launch by convention." So what I like is fleshing things out a little bit, so that people doing the same thing are more likely to look the same.
E
One thing I like — I agree with that — and one thing I don't like about the current way we had things laid out is, on the providing side, you receive a list of objects, and they repeat the same names, sometimes out of order, right? And you have to find all the things that have the same name and merge those together. At the very least, the lifecycle could grab all the things that are called node and put those in a list for you, right?
D
I mean, you could just have, in the buildpack plan TOML — if you don't touch the build plan, I guess — a TOML structure where the thing provided to the buildpack could be like entries.node, and that then has all the stuff in there, I guess. That's especially part of the pain, because then you have to iterate through all those things, but...
D
Yes — I guess, formally, I'm still unclear what an actual thing you would want on this RFC is. Are you a hard no, or do you want changes to it? Like, I'm...
A
These directional changes, I believe — and I commented just before this meeting, in fact — not only do they not take into account my concerns from the previous incarnation of this, about promoting things to top level; they go in exactly the opposite direction. Instead of suggesting that we promote just build and launch, now we're going to promote three of them. So I believe this has gone 180 degrees from my preference for this change.
E
So right now the buildpack receives a flat list of objects, kind of out of order, some of which may have matching names and some of which don't. The buildpack has to filter that list into things with similar names — it has to go through and say, "okay, this one's node, and this one down here is node, and this one over here is node; those are the ones that need to merge together."
A
I thought we had that at some iteration of this: the plan was there would be a single entry for node, and then the entry's metadata would effectively be a list of maps. Exactly, yeah — that feels right to me; I can totally see that. I understand the compelling reason, because a buildpack is typically looking for one particular key, or is looking for a coalesced state about a particular key. Yeah, awesome.
E
Still, for what it's worth, I still like the idea of the lifecycle doing more to help do version resolution, because once we've created a resolver, it feels like it could do something to make it easier for buildpack authors there. But I totally understand that that's controversial, and hey, I'm not suggesting that my perspective is right.