From YouTube: CNB Sub-Team Sync: BAT
A
I don't know if I've seen you at this meeting, Manuel. Do you want to give an introduction? Or maybe you're here and I just missed it in the past, in which case I apologize.
B
I was here once, so no worries. I'm Manuel Foots; I'm working at Sidespots on the Java side of things. Also, I maintain parts of our libcnb Rust library.
A
Moving on to status updates: does anyone want to give updates about work that's been happening on the components owned by the BAT team?
A
For RFCs that we have in flight, I'm ignoring everything that is in draft, which brings us to "utility buildpack for .profile". There's some back and forth in here over implementation details, but I suggested that we just take that out, because I think we could decide that as a sub-team. I don't know if that needs to happen at the project level, and it might be the kind of thing where we, you know, evolve over time. It doesn't need to be recorded forever in this RFC.
A
We'll put the comment period label on this. Okay, great, so moving on to the next one. Our only other non-draft RFC in flight right now is "project scaffolding in pack". Aidan, this is your RFC around creating more templates for pack buildpack new. We did talk about this a little bit in working group yesterday, but we can continue; the conversation with this group of folks might be valuable.
D
Yeah, so this comes from conversations in the learning team, and builds on, I think, stuff that came from Natalie's survey. We'd like to have kind of golden paths, you know, a kind of one true way of creating these things. Ideally, it just makes it easier to document. So if you're going to try and create a new buildpack right now, the documented way is to use pack buildpack new.
D
That gives you a shell buildpack, which is great, but I suspect that we possibly want to push buildpack authors towards the Rust or Go or Python bindings in some instances, so supporting those in the same scaffolding tool seems to make sense, to me at least. How we go about implementing it, I'm really open to. One of the questions that seems to be open...
D
...right now is: do we want to bundle these templates, embed them in pack so that they're just there, or do we want to have effectively a set of curated templates that pack can go off to and pull down from, for example, GitHub, and use? And then, in both cases, we'd still support third-party scaffolding.
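The two options described here, templates embedded in the binary versus a curated set fetched on demand, could be sketched roughly as follows. This is a hypothetical illustration only (pack itself is written in Go, and all the names and URLs below are invented, not the real implementation):

```python
# Hypothetical sketch of the two template-source strategies discussed:
# templates embedded in the binary vs. curated templates fetched remotely.

EMBEDDED_TEMPLATES = {
    # shipped inside the binary; always available offline
    "shell": "<shell buildpack scaffold>",
}

CURATED_SOURCES = {
    # curated list the CLI knows about, but content lives elsewhere
    "rust": "https://github.com/example-org/scaffold-rust",
    "go": "https://github.com/example-org/scaffold-go",
}

def resolve_template(name: str) -> tuple[str, str]:
    """Return a (strategy, payload) pair for a template name."""
    if name in EMBEDDED_TEMPLATES:
        return ("embedded", EMBEDDED_TEMPLATES[name])
    if name in CURATED_SOURCES:
        return ("fetch", CURATED_SOURCES[name])
    # third-party scaffolding: treat the name itself as a location
    return ("fetch", name)
```

Either way, third-party scaffolding falls out naturally: anything not in the curated set is treated as a location the user supplied.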
D
Yeah, right now it's embedded in pack.
D
I think we want to document the golden paths of getting there, but I suspect we don't want to make the implementation-language choice for end users, so we're probably going to end up having curated scaffolding for shell, Go, I would imagine Rust, probably Python, and whatever else libcnb bindings are out there. So the question then, Daniel, I think becomes: is that just too much to put into pack as a binary, because it's going to increase the binary size?
E
So we discussed a couple of things. One was whether the templates should be completely offline, or just some pointer to a template repository, whether that's GitHub or GitLab or a local git repository or folder; it doesn't matter.
E
We also discussed that if we want something curated, we can still use this sort of link, but we can provide maybe shorthand notations to refer to templates that the project provides firsthand, similar to the Docker library.
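The Docker-library analogy, where a bare name expands to a project-curated location while anything that already looks like a location passes through untouched, might look roughly like this hypothetical sketch (the prefix URL is an invented placeholder, not a real project repository):

```python
# Hypothetical sketch of Docker-library-style shorthand expansion for
# template references: a bare name maps to a project-curated repository,
# anything that already looks like a location is used verbatim.

OFFICIAL_PREFIX = "https://github.com/example-org/templates-"  # placeholder

def expand(ref: str) -> str:
    # Bare names ("rust") get the curated prefix, the way "python" expands
    # to "docker.io/library/python" for container images.
    if "/" not in ref and "://" not in ref:
        return OFFICIAL_PREFIX + ref
    return ref  # full URL, owner/repo, or local path: leave as-is
```

This is the "avoid the registry" property mentioned next: the shorthand is resolved client-side, so no central service has to sit in the path.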
E
That's where you can avoid the registry, and the library repository would just directly specify that. I think there were also some concerns around having a generic project scaffolding tool maintained by the project, since there's nothing really buildpack-specific about this tool except for the templates. And there were questions whether the project can provide templates for popular scaffolding tools like cookiecutter or Yeoman instead of doing this: can we just create the templates so that others can use them? There were also some concerns about whether this provides...
E
So it's just about picking out which is the better path forward. And also, I think we may not want to own templates for bindings that we as a project don't support, because then we get into a weird state where, if something is broken, they'll come to us saying, hey, this seems broken, fix it. And then we can't do much.
E
I would at least prefer that, if there are bindings, we can suggest something, like: here's a way to discover templates for those repositories. But if we are saying that something is owned by the project itself, like it lives in the buildpacks GitHub organization, I would prefer if we have some control over the bindings.
D
That all makes sense to me. Yeah, ideally we don't want to own a project scaffolder, since it's not core business. We had a look at some other Go project scaffolding tools that we could pull in. There are a few, but they operate slightly differently than the way that we might want to operate. There's boilr, and a recent clone of that.
D
A
recent
fork
and
improvements
on
that
which
are
restricted
to
github
repos,
which
seems
like
something
that
we
might
want
to
avoid,
and
then
there
was
another
interesting
one
and
I'm
going
to
afraid,
butcher
the
german.
I
think
it
was
spring
early
or
something
like
that
which
has
its
own
weird
and
wonderful
little
format.
And
again
you
know
I
mocked
up
an
implementation
using
spring
early
and
it
was
it
works.
It's
fine.
We
can
embed
things
in
their
format,
but
it
is
just
a
different
format.
D
It
uses
the
text
archive
format,
so
it
works
differently
from
things
like
that
people
would
have
experienced
in
yeoman
or
cookie
cutter
in
other
technologies.
D
And,
if
we're
going
to
go
that
far
as
documented,
we
may
as
well
give
people
templates,
and
maybe
that
means
that,
if
we're
going
to
document
in
how
you
use
the
the
lib
cmd
python
bindings,
then
we
give
them
a
cookie
cutter
template,
because
that
is
kind
of
native
to
python.
I
I
don't
know
manuel,
but
is
there
a
rust
project,
template
tool
that
would
kind
of
like
a
standard
du
jour
in
in
that
ecosystem?.
D
And that way we could support the environment that people want to use and avoid maintaining a scaffolding tool, at the cost of having two or three different scaffolding formats: your Rust stuff would be in one format, your Python stuff might be slightly different, and, you know, it's trade-offs. Software development.
E
I would point out that just because people are developing a buildpack in a certain language doesn't mean that the scaffolding tool written in that language is the easiest one to use. I know firsthand that installing, like, Python CLIs can sometimes go wrong, because people are not aware of environments. Just purely from an installation standpoint, having a standalone binary of some sort that just works is easier.
B
Yeah,
I
didn't
agree
with
that.
In
the
sense,
like
I
mean,
maybe
men
will
has
different
opinions,
but
just
thinking
about
for
even
like
the
lip
synthe
rust
stuff
like
we
probably
would
just
build
a
template.
How
you
tell
us
to
build
it
and
just
have
repo,
whether
it's
like
a
repo
we
maintain
or
if
we
have,
the
pr
or
something
in
a
different
repo,
like
kind
of
just
tell
us
how
it's
done
and
then
we
do
it
and
kind
of
integrate
and
make
that
work
versus.
B
I
feel
like
the
kind
of
per
language
thing
means
every
language
binding
or
vendor,
or
whatever
has
to
figure
out
like
the
picatos
stuff,
is
in
go
right
so
like
they
would
have
to
figure
out.
What
is
the?
What
is
the
scaffolding
thing,
literally
the
path
you're
going
down
of
life?
What
which
of
these
ghost
scaffolding
things
is
the
thing
that
we
should
pick
to
get
the
best
thing
right
versus,
like
you
could
imagine
bill
pack,
author
died.
B
That
probably
looks
like
the
rfc
written
in
the
how
it
works
section,
which
is
like
here's,
how
you
do:
props
you're,
the
field
and
flags,
and
this
is
what's
available
and
then
I
don't
have
to
learn
like
a
scaffolding
tool.
In
my
language,
like,
I
may
have
a
lot
of
experience
with
something
like
a
cutter
or
I
may
have
like
zero
experience.
Like
manuel
said
of
like
what
the
state
of
the
art
is
in
restaurant
there's.
Also
the
point
of
discoverability,
I
think
like.
B
If
someone
uses
pack
and
says
packing
uber
pack,
they
can
see
like
oh
there's,
a
rust
binding.
Oh
there's
a
python
binding
in
there
weren't
aware
of
that
binding
before
and
then
suddenly
they're.
Like
oh
cool,
I
can
look
into
this
python,
binding
or
whatever,
which
wouldn't
be
possible
if
we
would
just
use
the
language
native
templating,
stuff.
B
C
Yeah, I think I agree that, like, from a user-experience standpoint, having it in pack is far superior, but I totally also get the argument that it's just not our business to be doing that. So I could see, like, if we at least standardize on one tool, that would be easy enough for users, and we can document it well.
A
Maybe
the
ideal
is
like
if
we
found
a
tool
we
liked
written
and
going
that
we
could
import.
So
we
have
the
pack
experience,
but
we're
not
like
making
decisions
about
templating
conventions
and
stuff
like
if
we,
as
a
group
spend
you
know
a
lot
of
our
time,
arguing
about
the
best
way
to
template
arbitrary
things.
D
So I'll look into getting one of the existing ones, and maybe extending it to be more generic about the sources it pulls from, and then see where we can go from there.
D
So what I'm hearing so far, and please correct me if I'm wrong, is that we know we have a problem with onboarding, and this could help as a partial solution to part of that problem. We're not quite comfortable with all the trade-offs yet, so somebody needs to go and just explore some of these trade-offs, maybe do two or three prototypes, to figure out which is the best direction to go down.
A
I feel pretty strongly that we shouldn't cram them into pack, even just thinking about the workflow of, you know, someone who wants to make their own variation on one. When it's a separate git repo, it's really easy to, you know, maybe fork it and modify a few things. When it's embedded in pack, it's both harder to find and harder to change and reason about.
E
I mean, we can provide an offline experience: for the list of repositories that we suggest, pack can just, at release time, capture a snapshot of the main branch and include it. And then, if you try to create something, it can try to reach out to the internet, or, like, you can have some config flag that says internet connectivity is false or something, and then it just uses the old, stale local version.
A
Yeah, we could cache: pack already has caches, so we could cache and then have a policy. I feel like right now I don't want to, like, optimize too much for totally offline, because I feel like we currently don't do that, right? It's not a tool where, if you download it, you can do everything offline: you need to pull your builders, you need to pull your run images. But it should at least be a tool where, if you've already got one copy of everything you need, it can work offline. That's the...
E
Also, I think all the Go tools, if I remember correctly, do have a local cache option or something; like, you pull it, and it'll pull it into someplace in the home directory, .cache or whatever, something like that. So that behavior, completely offline buildpack creation, is possible.
B
If you have the cache there, you can say: don't pull. But, like, potentially we are uploading the templates independently of pack, right? And you might want the newest templates, so kind of leave it up to the user.
B
The question is: how usable are these templates without an internet connection themselves? Like, for Rust, if you have a Cargo.toml with, like, 10 dependencies that are part of this template, then you need to download them as well. If you don't have them, then suddenly your offline experience ends at that point, after template generation, so there's probably nothing we can really fix at the pack level.
C
Just a quick note: I had put in a PR about logging in v2, just some kind of API changes based on some things we had talked about, and some things to try to make it easier for people that want to integrate or wrap libcnb with, like, their library of stuff.
C
So
if
you
got
a
little
bit
of
feedback
on
that,
but
if
you
get
a
chance
to
take
a
look
and
provide
feedback,
that
would
be
great.
C
Yeah,
I
just
saw
the
lifecycle,
rc
is
out,
and
it's
for
and
build
pack
08
is
coming,
so
I
just
thought
I'd
bring
that
up
and
just
see
if
we
have
any
plans
or
just
to
sync
on
on
implementing
that,
because
it
looked
like
there
were
a
couple
changes
that
we
need
to
implement
for
libcnb.
C
Do we want to just open up some issues on GitHub? Yeah, okay.
E
I
think
the
other
thing,
probably
we
should
file
an
rfc
for
this,
but
we
need
a
way
to
have
like
concurrent
releases
of
all
of
our
offerings
when
the
new
life
cycle
version
is
released
as
stable.
Like
lip
cnb,
we
can't
test
things
before
back
supports
the
new
api
versions.
E
So,
like
we
it's
it's
just
like
a
flow
from
from
life
cycle
to
lip
cnb
like
if
it's
just
the
buildback
api
bump,
we
can
still
mess
around
with
back
to
patch
in
the
new
lifecycle
image
and
then
use
that
to
test
the
new
buildback
functionality
but
like.
If
you
want
integration
test
at
some
point
for
lip
cnb,
we
will
need
to
figure
out
like
a
way
to
like
minimum
things.
F
Yeah, my only comment on that is, I think one month behind is a little generous, right? I think we're probably more behind on pack implementing some of the APIs.
F
But
to
that
extent
I
know
emily
recently
brought
up
the
idea
of
doing
something
similar
for
pac.
Where
follows
the
life
cycle
strategy,
where
we
try
to
release
with
a
lump
sum
of
very
targeted
features,
and
I
think
the
spec
updates
make
sense
to
do
that.
F
The
only
thing
is
that
that
in
itself,
sort
of
takes
work
right
or
or
there
would
be
a
sort
of
delay,
because
we
would
need
for
the
livestream
first
before
pac
could
even
start
the
work
unless
we're
going
to
start
them
concurrently,
which
I
feel
like
hasn't
been
very
well
executed
in
the
past.
E
Pack can still consume that version of the lifecycle. I think we produce container images for the lifecycle for any intermediate commits as well, right? We just don't have, like, a full-blown release with everything, but we generate the individual container images, or releases, that can be used.
F
I'm
definitely
interested
in
the
conversation
and
coming
up
with
some
ideas
on
how
to
resolve
it.
I
guess
the
other
part
that
I'm
thinking
about
is
is
pac,
isn't
the
only
tool
right
like
the
only
platform
that
we
support?
We
have
techton,
and
some
may
even
say
kpac
could
be
something
that
some
of
our
end
users
really
look
forward
to
or
at
so
a
different
strategy
that
just
came
to
my
mind.
E
I agree that pack is not the only platform, but it is the platform we use in our documentation. So whenever we push out new features, we use pack as the example. So, for example, with the SBOM stuff, people were writing blogs and stuff saying that our API didn't support CycloneDX and SPDX, when in fact we did, because our docs were not updated, because we couldn't use the functionality in pack, and so on. So, like, I agree pack is not the only thing, but it is.
C
I
think
it
would
be
at
least
short-term
helpful
if
we
had
documented
like
like
a
build
environment
setup.
You
know.
So
if
someone
wants
to
work
on
this
stuff
and
they're,
they
need
to
use
these
rc's
like
what
do
I
have
to
do
to
pull
all
this
stuff
together,
because
it's
it's
not
straightforward,
I
remember
doing
the
s-bomb
stuff
and
it
was.
I
still
don't
fully
understand
how
it
all
came
together.
F
Yeah-
and
I
don't
think
it's
just
you
right,
like
even
myself-
was
outside
of
what
was
happening
with
the
s-bomb
stuff
as
it
moved
fairly
quickly
throughout
so
yeah.
I
don't
know
how
to
resolve
that
very
specific
sort
of
problem.
It
seems
like
it
would
be
nice
if
drivers
of
certain
features
would
kind
of
drive
it
all
the
way
out
right,
maybe
into
pack,
but
I
know
that's
kind
of
a
pie
in
the
sky
solution.
A
I think the thing we can't do is drive something as far as the lifecycle and just assume that all the other components will pick it up, like pack and documentation and Tekton. We at least need to track how far through all the different components something has flowed, which is something I was trying to do, at least with system buildpacks, by creating tracking issues on the RFC repo. So at least we can see where something is.
E
Yeah, I think Natalie did, for lifecycle releases recently, a list of requirements that must be met before a lifecycle release is shipped off, which included docs updates and a bunch of other things. But I think we need more than that.
E
I
was
just
going
to
say
the
other
thing
that
stands
out
as
the
project
descriptor
we
implemented
it.
In
fact,
I
was
spending
a
release
on
these
specs
for
over
four
five
six
months,
maybe
which
was
also
bad
because
our
users
were
using
it,
but
we
couldn't
point
to
anything
canonical.
That
said,
here's
how
you
use
project
descriptor
4.2
so
like.
F
But I mean, I would say even that is a slightly better situation than the opposite, right, where there's a lot of maybe nice gems and features in the lifecycle that just haven't been exposed all the way and, even furthermore, made into a nice, distinct end-user experience.
F
I mean, I think that's interesting, but I don't know that that's necessarily true. A lot of the pack users that I've interacted with are not buildpack authors, right; they're more like app developers.
F
Yes,
yeah
and
and
they're
pretty
content
where
the
the
idea
that
things
just
work
but
they're
not
trying
to
leverage
it
as
much
as
what
some
of
the
people
that
are
trying
to
build
custom,
build
packs
or
trying
to
leverage
some
of
the
existing
build
packs
like
the
google,
heroic
or
potato
build
packs
and
do
their
own
thing
with
it.
A
Yeah, I think it's about, like, the direction requests come in. So, like, take an example of a big problem we're trying to solve with the spec, like some of the caching stuff, right? I think a lot of that feedback sometimes comes to individual buildpack authors, because it's not clear why there's a problem. It's like, you know, someone comes to Dan and it's like: the Paketo Java buildpack keeps downloading the JVM over and over for each one of my apps; why is your buildpack being so dumb? This piece here is slow, or it doesn't work on, you know, my M1 Mac, or stuff like that. Yeah, exactly.
F
Yeah, I think we still have one that would be nice, which is whitelisting registries, or something like that. Whitelisting is probably not the right term; restricted, maybe.
E
The move to Artifact Hub also takes away the only live major dependency that we as a project have. I know this is not the right place, but I'm just curious what the thoughts were, since we have authors here who would be impacted by such a change if we propose it.
E
No, it's just a generic registry; it's not just OCI images. OCI images are part of it. Helm charts, for example, are GitHub repositories. So it's not even GitHub repositories; it's like tarballs hosted somewhere. But you can host any artifact type there; that's their goal, just discovery of any artifact type. Helm, for example, is driven very much by GitHub, or, like, fetching things from the web, nothing to do with OCI registries.
A
Would we need to move to having, like, a media type that works in OCI registries for buildpacks, in order to distinguish them from images in this registry? I feel like this is always the problem where there's actually a great solution already there: drop the buildpack registry on the ground; our buildpacks are already in a registry that has, like, a decent syntax for accessing the things there, and it's just an image registry.
E
Yeah, so Artifact Hub does have a way of solving all of that; I've not looked in detail, but you can search by type. It doesn't have to be a specific media type, as far as I can tell, but my knowledge on that is a bit dubious, and I don't want to make claims that, hey, we don't need media types. But I've recently seen a bunch of projects integrate with it.
E
There aren't many people working on the distribution side right now; like, it's just Joe right now who's on the distribution team actively working. So, like, that worries me. For example, we had a GitHub outage last week and some of the registry publishing broke, and then it was only Joe who could try and fix that.
B
I mean, there was talk about trying to roll distribution into BAT or something at one point, just because distribution is probably not in a healthy spot with one maintainer, but maybe that's a somewhat separate conversation. I mean, there's also another conversation of just... I mean, GitHub's been down every day this week, let's be honest. GitHub Actions are under maintenance right now, so I imagine we're potentially not publishing buildpacks to the registry.
B
I've
also
been,
I
wear
pager
for
stuff
at
fx
github
and
I've
been
paged
many
times
recently
because
of
it.
So
it
is
they've
not
been
in
a
healthy
spot
for
sure.
A
I think it could be worth periodically evaluating, you know: we build things based on a hypothesis about how they will affect people, so we should periodically evaluate to what extent our predictions came true. I'd say so especially with the registry; there's a lot of, even some of the suggested buildpacks we have, like the Google ones, that I believe did not end up in the registry. And I know a lot of times, whenever pack is trying to pull something from the registry...
A
I was thinking in terms of images, and I would get annoyed that it wasn't thinking in terms of images as well. Like, I don't often feel like I'm getting a ton of value from the registry, but it could just be the way I use the tools versus the way other people are, so it'd be nice to get, like, a broader set of feedback there.
E
They have a hosted offering that lets them discover things. It runs similar to our registry, where it stores metadata but you store the actual artifact somewhere else.
E
So it's just a discovery mechanism, and it has an API, and it also provides statistics about downloads and stuff.