From YouTube: Working Group: 2021-01-20
B: On the pack side, pack just released last week; that's what we're considering.
A: This one's in FCP; it's got all the votes, literally all of them.
D: I think the only outstanding thing for me was just resolving Natalie's comments. We talked about it last week, but I think we just need a decision on Natalie's label stuff.
D: Yeah, and then I can just comment.
G: Could we not infer the date from when it was labeled, since it tells you that at the bottom? I like that.
A: I started an offhand discussion in the spec Slack channel about potentially closing this in favor of a simpler solution, so maybe I'll bring that back up if there's time today; if not, I'll push it to next time and put it on the agenda.
A: Right, this one. I saw some movement in the comments on this one. Is this one on the agenda to discuss anymore? It looks like we've got at least a vote on it.
A: Are we waiting for more movement on this from the PR author, or are we ready to collect votes?
B: It probably needs more work; there's a bunch of pieces of feedback. I can ping them.
A: All right, that one, that's a draft, right? Yeah. The rest of these are all drafts, so on to whoever has the first agenda item.
F: So where do we think this is at? Was there agreement that we want to have analyze before detect, and not a prepare phase?
A: That seemed to be the consensus of the group that was discussing it last week.
D: Yeah, I think the idea was that we're not opposed to a prepare phase, but we want to reserve it, if we can just use the stuff we already have now, which makes sense from your proposal.
A: You probably wouldn't want prepare to be optional if it did things like making sure that you had registry access, and things like that. So there are still some reasons to have analyze before detect, to do some of the registry-specific things that this RFC calls out.
H: I don't see any reason why it can't do inline buildpacks as well, or things that we've specced out. I was okay with this as it's written and have approved it.
A: Yeah, we can cut that, but I think it's fine to try to present it as is, and if there is disagreement on what should be in it, that's the part we would cut. I just didn't cut it because, as Emily said, we kind of discussed that it's maybe okay leaving some of these things in here that we're not exactly sure will be implemented in the short term, but I think it's a good stretch goal.
A: The next thing, if there's nothing else to talk about there, is the pack command to create a buildpack repo.
F: Yeah, so I'll share this; it's part of some work I was doing to make, I guess you'd call it, learning materials on how you create buildpacks. I don't even notice it, but I've been creating a buildpack every day.
F: It feels like the early days of any ecosystem, where it's like, oh, that could be a buildpack, I'll make a buildpack. So a couple of people asked me, oh, you know, what's your process for doing that, and I was too embarrassed to show them that it starts with like ten different touch commands and mkdir and all this, right?
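For reference, the kind of scaffolding being described (which the proposed command would automate) can be sketched as a short shell script. The file names and contents here are a minimal assumed layout, not the exact output of the PR:

```shell
#!/usr/bin/env bash
# Sketch of a minimal bash buildpack skeleton; IDs, stack, and API
# version are illustrative assumptions.
set -euo pipefail

mkdir -p my-buildpack/bin

# Descriptor identifying the buildpack and the stacks it supports.
cat > my-buildpack/buildpack.toml <<'EOF'
api = "0.5"

[buildpack]
id = "examples/my-buildpack"
version = "0.0.1"

[[stacks]]
id = "io.buildpacks.stacks.bionic"
EOF

# Detect script: exit 0 to opt in to the build, non-zero to opt out.
cat > my-buildpack/bin/detect <<'EOF'
#!/usr/bin/env bash
set -eo pipefail
# Always pass detection in this skeleton; a real buildpack inspects the app dir.
exit 0
EOF

# Build script: receives layers dir, platform dir, and build plan as args.
cat > my-buildpack/bin/build <<'EOF'
#!/usr/bin/env bash
set -eo pipefail
echo "---> My buildpack building"
exit 0
EOF

chmod +x my-buildpack/bin/detect my-buildpack/bin/build
```

This is the half-dozen touch/mkdir steps the command would collapse into a single invocation.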
F: So the proposal is a pack buildpack create command with three options: language, which would default to bash; stack, which could be provided multiple times; and then path, which would be where you generate the artifacts.
F: I've basically already written this, and there's a PR for it. So that's how it works.
F: The default of bash was derived from the CNB user research, which I think indicated the comfort level of people; part of the reason it's been successful is that it's bash-oriented, and a certain number of our users aren't necessarily Go users. Go could be an option, but if you're interested in this PR: I've been doing a lot of looking at what defaulting to Go would look like, and it's not really clear to me. I've looked at the Paketo bootstrapper, which is a project I link in prior art.
F: I don't know if you're all familiar with this. It works really well, though, because it's opinionated about the packit library and all that, and without that stuff I don't really know what to generate other than an empty build.go; even then, should I specify a package name, and that kind of stuff? So those of you who have actually implemented Go buildpacks, I'd love feedback; somewhere in here there's a "what should this do?"
B: One suggestion: would there be any benefit to having template repos instead of this? And as part of the buildpacks project, we would have a bash buildpack template.
F: Yeah, that seems totally compatible with what I'm proposing. I looked at cookiecutter really quick. The thing I would like to avoid is what I've always felt with Maven archetypes, where you get dozens of options you need to fill in. I want to keep this command simple, whether it's pulling from a template repository or whatever; I don't want you to have to give me the version, give me the ID, give me the license.
G: What I was suggesting was one single repo that would have different languages in it, and then people could just take it and do whatever they want with it. But individual contributors could also contribute different languages, so we don't have to worry about it all too much.
G: I think both, right? I don't see why they have to be mutually exclusive. I think if you set very specific placeholders, and that's all it's doing (really just mutating those placeholders via whatever templating library), then it's easy. A user can manually edit them, and our program could essentially pull them down and fill them in as well.
C: Even at VMware we have two internal competing libraries for, you know, bootstrapping buildpack templates, for buildpacks written in Go specifically. If that makes sense, I wonder if it's going to be hard to agree on what the default template for Go should be within the CNB project, because there are competing options out there.
B: I think that bash is a good starting point for people, rather than the standard buildpacks that you'd do with this or another PR; pretty simple bash is probably simpler for people to start with.
F: Yeah, so what if we were to start with this exact command and remove language, so it just always generates bash, and then in the future we add a --template flag? At that point we can have a discussion about a templating language and all those kinds of things, but that wouldn't change the essence of the command; and the original command, which doesn't have any parameters other than the ID, would still be totally acceptable and work exactly the same way.
B: I was thinking that as well, just because within this, and with the stack API PR, you know, we already have a builder create method, but this would also be adding a buildpack create method and a stack create method, and they do wildly different things. So, yeah.
H: My other question was: we've pushed kind of far into making everything a buildpack, including meta-buildpacks. Do we care about supporting a buildpack with an order instead of stacks? And because we've called all these things buildpacks, is it clear to people that this is only a subset?
F: Yeah, I was thinking about that, and at first I thought there would be, oh, one option to generate a regular buildpack and one option to generate the meta/multi-buildpack. But I don't really know what you would generate; it's hard to assume what the package.toml looks like, what the subdirectories are, if we have them, and what the order looks like. I kind of reduced it to:
F: Well, there's an empty buildpack.toml, and that's it. So I wasn't really sure where to go with that, and I was just thinking that maybe meta-buildpacks are a thing we don't generate, you know, because it's mostly just a toml file. I don't know if that's right; that's what I was thinking.
D: The meta-buildpack is just the toml file, right? Yeah, at the end of the day when you package it up, I mean, the buildpackage and other stuff may be different, but the actual thing on disk that you're coding against is just that toml file. I think it'd be useful to just generate the skeleton, like an init thing, but I don't want option explosion. I also think about: do stack buildpacks change things? Like, if I'm building a stack buildpack, is that different?
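As context for "the meta-buildpack is just the toml file": a meta-buildpack's buildpack.toml is essentially only an order definition. A minimal sketch (the IDs and versions here are made up for illustration) might look like:

```toml
api = "0.5"

[buildpack]
id = "examples/my-meta-buildpack"
version = "0.0.1"

# A meta-buildpack ships no bin/detect or bin/build of its own;
# it only composes other buildpacks into ordered groups.
[[order]]

  [[order.group]]
  id = "examples/node-engine"
  version = "0.0.1"

  [[order.group]]
  id = "examples/npm-install"
  version = "0.0.1"
```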
F: I mean, I could see an option to generate a template for a stack buildpack, but a couple of things with that. One, I don't see the typical new-to-buildpacks person who would use this command generating a stack buildpack. Two, what's generated so far is completely compatible with those documents; you just need to add the configuration in the buildpack.toml and do certain things in your build and whatever, so it doesn't conflict with what's being generated today.
F: Let me see if I can pull this up. This is not opinionated about whether it's a stack buildpack.
D: Yeah, I guess just to Emily's point about supporting all the different types: I think it could be an add-on to the RFC that isn't part of this RFC, maybe future scope, because I think the default of generating the basic one is fine. And if you added, say, a types flag or something after the fact, because you need to do something different like for a meta-buildpack, I don't think that would be too hard to add on, and it's just, again, the toml file.
D: Think about it: basically most language authors probably use something like npm init, no matter if you're a beginner or advanced; that's just what you use, because it generates the stuff correctly for you. And I think even for meta-buildpacks you'd probably want that.
D: Right, even like rails new, or cargo new, or .NET: they're not interactive, but basically even veterans use them, right? No one is generally writing that stuff by hand.
D: No, I'm not saying it should be in scope; I was just following along on what Emily was saying. But I also think supporting that would be like one flag, and it would be an optional thing that is not the default.
G: Slightly different, but talking about MVP: how do you feel about making the stack flag optional and just defaulting to "any"? Is it okay? Yeah.
G: But they don't want to have to think about it, right? If you bring something down and it happens to be, like, the OpenShift one or whatever, you know what you're playing with, essentially. I'm just thinking about the friction of trying to learn about the stack and the environment that you're running in.
G: I don't care for bionic, right? That's my thing, and again it's more about the on-ramp, the exploratory aspect of it. So it's, I don't know, very hypothetical, but that's what I would think when I start running this against, say, a very specific version of Heroku's builder, and then it complains that the stack doesn't match. Now I have to learn something new, right? It just doesn't feel as smooth.
F: But isn't that better than using "any" and it being a lie, so that the person who's using your buildpack has to go learn what happened? I think what I worry about with defaulting to "any" is encouraging the v2 ecosystem pattern, where it's like, yeah, sure, this buildpack works on any stack, but not really.
F: Yeah, there are a few guards in here. You can generate bash buildpacks on Windows with the PR, but not Windows buildpacks.
D: Are you just going to have a future work section?

F: Yes.
D: Okay, now that Steven's here, do you want to talk about the other thing, your previous thing, Joe?
F: Yeah, we were looking at analyze before detect, and I want to make sure we have consensus; I'll pull it back up. On the responsibilities of the analyzer, assuming that in addition to the analyzer there would also be some optional prepare thing: what are your expectations around what that analyze would or would not do?
F: I mean, yeah, so I was thinking I'd maybe let you talk before I tell you what was on my mind, but in particular it's the project.toml, I think, that's most in question.
C: So when we're reading it from the application directory that the user provides, we're not going to parse it on the platform. My worry with project.toml, I think I have two things. The thing I'm worried most about is making sure that we give platforms the flexibility to parse it at the right stage.
C: The thing I worry a lot less about, but still think we should consider, is that project.toml is an extension, and a platform should be able to opt in to supporting that functionality, or it would be good to modularize it to some extent; but I don't have very strong opinions about that. The second thing, if that makes sense, is much weaker than the first thing.
C: I worry that things could get very complex if we have a model where we rely on project.toml being present to the lifecycle in the root of the app directory, and we rely on it being present at the root of the directory that you're providing to the platform to say "I want to do a build", where there can be very big differences between those two things. One might have a .git directory, one might not.
C: If it were the platform's responsibility to parse project.toml, sort of outside, then there's metadata that the platform parses and then provides to analyze, in a format that could look very similar but is still more normalized into the things we expect the lifecycle to actually need to do, and then the lifecycle processes those things. That's kind of my preferred architecture, I guess, but I'm very open to whatever works.
F: So you talked about something else: a platform processing project.toml and then providing some other, maybe very similar, input to analyze. What do you imagine that would look like? The only thing is, I think inline buildpacks are a good example of something that wouldn't belong there; inline buildpacks would have been processed ahead of time, the ephemeral buildpack would be on the file system, and you'd give the lifecycle an order.
C: Yeah, maybe an example using the include/exclude thing, because I think that encompasses it; it looks different on different platforms, right? The pack CLI could understand how to parse project.toml and grab just the files that need to be included, uploading them into the container and ignoring the files that should be ignored. Or, you know, if you're using kpack and doing a source upload to your registry, you might have to ignore some credential files.
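For context, the include/exclude mechanism being referenced lives under the `[build]` table of the project descriptor (project.toml); the paths below are illustrative, and the descriptor treats include and exclude as mutually exclusive. A minimal sketch:

```toml
[project]
id = "example/my-app"

[build]
# Keep credential files and local tooling out of the uploaded source.
exclude = [
  ".git/",
  "secrets/",
  "*.pem"
]
```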
C: kpack definitely shouldn't store those files, so it may exclude them when it does the upload, by virtue of being a platform that understands project.toml. But another platform, like Tekton, or kpack pulling from git, or Shipwright pulling from git or whatever, may need an in-container step to exclude those files, because it's going to clone from a git repository. And that's where I see prepare coming in: it's a modular part of the lifecycle that can implement that project.toml extension and perform the steps that are in-container platform steps, get to the same normalized format, and then pass it to analyze, which implements things like pulling buildpacks, which we agree are a core part of the spec that the lifecycle needs to know how to do. Again, this is not intended to be prescriptive about following that workflow; it's just an example of what seems reasonable.
H: I just wanted to ask you, Stephen: when you're talking about prepare doing include/exclude, I assume in a lot of these simpler platforms, like Tekton, the repo has already been cloned, and if the clone happened to contain things that should have been excluded, the only option we would have is to delete them, right?
C: I think so, save perhaps for an implementation of creator: maybe we want to make creator so advanced that it can clone the app source code and exclude things from it and all of that, and in that case prepare could have cloning functionality. But in the Tekton case I wouldn't expect you to use that, and I don't think creator should really have that anyway, right? Yeah.
C: I think it separates the concern of providing the source code you want to build from the core lifecycle phases. If you build removing excluded files into the core lifecycle, then I feel like you're encouraging people to use something that's not a good place to do exclusions, because by the time you're doing the build it could be happening in an environment where it's really bad for you to have, you know, excluded credentials present, for example. So to me that step should be decoupled from the core lifecycle phases: you should be encouraged to process the input to be provided to the lifecycle first, and then provide it to the core lifecycle phases that must always run, like analyze. If that makes sense.
H: Yeah, in my mind there are two things. One is: although people should be doing this, does it hurt to have the lifecycle as a fallback that enforces it? I would lean towards no, but you could probably convince me it goes the other way. It's sort of like doing validation for a web page: you validate in the client, and then you also validate in the server; you should still do both.
C: I think maybe the specific example is conflating things, or, to make a larger point from a different direction: the cf CLI parses manifest.yml; it doesn't consider manifest.yml to be part of the application source code. It parses it as a configuration file for the Cloud Foundry CLI, which then passes that configuration further into the platform, and things wouldn't work very well if it didn't do that, because there's information it needs to know locally before it can upload the thing. It also adds more flexibility, in that you could have multiple manifest files in different directories and use them to operate against different source code if you wanted to. And because we can't process the file entirely in-container, I feel like that architecture seems kind of reasonable here, if that makes sense.
A: A great example for processing exclude, at least in my mind: if you run pack today and you've got a Go application, but you've also got a Ruby script in your default directory, pack is going to exclude that Ruby file you've listed in your project.toml. But when you kick it over to kpack, now the Ruby buildpack might kick in, because its detect now passes, and now you're installing Ruby in your kpack-produced image but not locally.
H: I wonder if part of the concern here is this: if we agree that there's some stuff in project.toml that the lifecycle is going to want, it'd be helpful to be able to provide it as a file, because then we don't have to do flags or environment variables, which creates a complicated step at the beginning.
H: If you want 90% of it, wouldn't it be simpler just to take the whole same file structure, instead of inventing one that could drift or be slightly different and that every platform needs to know how to translate? I don't feel like the file itself and what it contains is a problem. The location of it at the root of the repo may feel like a problem, but I feel like you could make that configurable.
H: So that if you find one at the root of the repo it'll be respected, but it could be provided in other places. In the case where you're pushing a jar or something like that, you can still provide the project.toml. And that would at least make it easier for platforms to implement, because instead of having to do a translation that they would have to keep up to date with the platform API, they just copy the file over.
C: But you're still proposing, and I'm not sure I understand everything, sorry, you're still proposing that that interchange file, that the analyze phase of the lifecycle, be able to understand the project.toml format, right? Yeah. I don't think I'm going to push back really hard on that, but...
C: Let me ask a related question, and this may be for Joe: would you expect buildpacks to be able to read project.toml? Should any buildpack look for project.toml, or, because it's an extension, should all buildpacks be okay with it not being there? Should we always expect it not to be there at all? And in the jar case, where you could get an exploded jar that never has a project.toml, it can't be there, right? Could that...
F: I'm uncomfortable with this question because I don't know. I think some of the Paketo buildpacks use metadata to store, like, files and SHAs and stuff, isn't that right? Oh, that's a buildpack.toml!
C: Well, I would kind of like to say we should delete project.toml from the app directory before the buildpack sees it, so that we don't create an inconsistent experience with different platform-specific descriptors and, you know, risk bifurcating the community. But that has its own set of discomforts, I think, to me.
C: I mean, you could build the application, and what if you had multiple, different build systems for your application? It's like: is a Dockerfile part of... maybe the Dockerfile is a good example. Is a Dockerfile part of a docker build? Well, if you do a docker build with the Dockerfile in the directory, you end up uploading the Dockerfile into the build context, but then you don't do anything with it afterwards, and it'd be really weird if you did.
H: I'm inclined to see it that way now that there's a proliferation of platforms. Just from a practical point of view, to avoid getting a lot of drift and different behavior, it would be convenient if some of this was in the shared component. It makes the ecosystem more cohesive.
C: This is easy to refactor later if we need to, because we can have the lifecycle parse the current project.toml format, and then later, if people don't like multiple things parsing the same file, and we have issues with, you know, mismatched components between the platform and the buildpacks reading different versions of the file or whatever, we can switch to another intermediary format that's consistent.
C: So that's why I said I'm not going to block on "no, we absolutely need a different format now", because I think there's a migration path to that later if we need it. But I do think it's possible that, even more practically, we have a problem where now we have to worry about the version of the lifecycle and the version of the platform both parsing that file, and we have to make sure that they're parsing the same version of that file and that they coordinate that.
A: Yeah, I think it's all in that one bullet point right below the screen there. To circle back: parsing the project descriptor, that's pretty much the thing I think is in question as to whether we would want it in this RFC. We talked about parsing the project descriptor for a long time, but we didn't talk about whether we would be okay with the analyzer downloading buildpacks, or creating ephemeral buildpacks, from a project descriptor.
C: Yeah, I said I'm okay with the preparer parsing the project.toml file format directly, if that makes sense. I guess, Emily, you're still arguing that the analyzer should parse project.toml directly; is that the case?
H: Yeah, but I'm happy to give in and say it's prepare; that's all fine with me. I have a slight preference, but I think it's really six of one, half a dozen of the other, and I think we'll all be fine in the end. Just very practically, I think we should take the controversial parts out, because we want this analyze thing to pass without any reference to the project descriptor at all, and we can copy-paste them into a new RFC and keep discussing them.
C: That's uncontroversial. Sorry, I didn't mean to go back to the discussion about which thing does which; I was trying to bring that up in the context of the things you highlighted there, Joe. Of those things, excluding and including feel like they're more on the preparer side, right?
A: Pending your approval: if we removed all of this except for maybe "producing an order.toml", which is the one I care most about if we're going to do a project descriptor, it could still happen in a follow-up RFC; that could be another discussion, and we can handle it other ways. But if we remove this entire bullet point, would you be okay with the responsibilities listed here for the new version of the analyzer?
C: Analyzing the image to get information about it, and then having restore parse that metadata and split it out into layer toml files, essentially; that's all, right? And yeah, I'm okay with that.
A: Okay, so we'll clean up some inputs here, because we probably won't necessarily need app source code, for instance, in this RFC. The project descriptor and all the stuff around it I'll remove from this RFC, maybe transplanting it to a draft for prepare at some point. Then you can reopen that discussion and decide what goes where.