Description
wasmCloud Weekly Community Call - this week featuring a demonstration of wasi-nn, 0.15 roadmap discussion, building for arm64 and more!
A: Okay, all right, so welcome everybody back for the January 6th meeting for wasmCloud. Happy new year! We're trying to enable developers to build their functions and services in WebAssembly and run them everywhere. I think we've got a couple of new people joining us today, Milan and Felix. Do you guys want to take a second and maybe introduce yourselves?
B: Sure, you can go first, Felix. Thanks very much! Hey everybody, Felix here from Shopify. I'm part of Shopify's automation group, and another one of my colleagues, Jeff Charles, is on the call as well. I recently joined this group, and we are using WebAssembly to run low-latency scripts. I'm looking forward to meeting everybody and working with this group.
C: I don't think I got the chance, but thank you. Yeah, so I work with Felix at Shopify in our automations group, on logic inside of our monolith.
A: Okay, great. And then Milan, did you want to take a quick minute and introduce yourself? Yeah, for sure. Hi, I'm Milan. I work at Capital One, and I previously did a proof of concept using WebAssembly for an internal security engine at Capital One, and I'm just looking forward to re-engaging with the community.
A: That's great. In the tradition we're trying to run, we kick off each meeting with a demo, and I'm really happy to turn the meeting over to Andrew and Mingqiu from Intel, who are going to do a demo of wasi-nn. Andrew, are you ready to share your screen? I can give you permissions.
F: Sure. I think the way Mingqiu and I talked about it, Mingqiu was going to show some slides real quick, and then I'll jump into some of, you know, the code.
A: Okay, super. You guys should both have the ability to share your screens, go right ahead.
G: Yeah, somehow I'm having difficulty finding one.
A: It's the new year, and I appreciate you guys. I know we talked a little bit about this in December, but I followed up early. I do have a little backlog in our agenda; I'll link it here in the chat for folks.
A
If
anybody
wants
to
look
at
it
on
topics
before
demo,
it's
down
at
the
very
bottom
here,
while
you
guys
are,
let
me
know
when
I
can
stop
filling
airtime
here,
a
whittle
with
phil,
caddy
and
wapc
with
phil.
I
think
he's
gonna
try
to
do
one
or
both
of
those
next
week,
including
the
some
of
the
new
wapc
code
generation
stuff.
We
have
code
generation
itself,
we
will
have
the
wasmcloud.15
release,
which
is
hopefully
sometime
here
in
january.
A
We
have
the
website
work
that
jordan
and
I
are
working
on
and
we
have
tutorials
crosslit
wasn't
three
on
small
devices
and
possibly
a
talk
through
on
the
yeoman
wpc
generator
stuff
that
the
microsoft
folks
are
working
on.
If
there's
any
other
demos
just
feel
free
to
add
them
right
to
that
agenda
in
chat,
and
we
can
go
from
there.
G: Yeah, yeah, that's right! So thank you, thank you so much. For some reason I thought it was the next hour, but it's at the 10. So let me just... could you move to the second slide? Yeah.
G: So I think we don't need to introduce WASI here, because everybody knows that, right? This was a presentation that we prepared for the Web Machine Learning workshop, you know, end of last year-ish. We are part of the Bytecode Alliance, which is an open source community that's dedicated to creating a secure new software foundation. I think some of the members of this group are also involved in the alliance; we are one of the founding members.
G: You know, together with Mozilla, Fastly, and Red Hat. Intel's involvement in this effort is mainly on three different fronts. First, we contribute a small-footprint implementation of a WebAssembly runtime; we also contribute to the SIMD implementation in Wasmtime; and there's the wasi-nn neural network interface specification and reference implementation. That's the third front of our effort. Okay, next one.
G: So I think there have been some questions about why we are doing this, so hopefully this can help clarify the motivation. We think WebAssembly is a perfect format for machine learning deployment.
G: The importance of this: you need to have support for a variety of different devices, with different architectures and different operating systems. WebAssembly provides this ideal portable format for deployment. In addition to that, there's also usually a performance requirement, and that could mean special features of a CPU like AVX-512, and potentially GPUs and special accelerators like TPUs, on a variety of devices. So that's the reason.
G: The reason for that is that machine learning is still evolving rapidly, with new operations and network topologies. By some numbers, the set of machine learning operations increases 20 percent every year, and there's no sign of that ending anytime soon. Next slide.
G: Yeah, so because of those considerations, we think we can start with a machine learning model loader API first. That takes care of inferencing, which could happen on billions and billions of devices, rather than training, which probably has a relatively small deployment footprint.
G: So it's a simpler API, and it actually allows for better IP protection: you just load the model, and you don't need internal knowledge of the model's construction, and the weights of the model, for example, are not exposed at the API level. We're also closely working together with another effort.
G: You know, the one focused on the browser side of machine learning, the WebNN side of it, and we try to be in step with their definition of the model loader API. Okay, next.
G: Unless there are any questions before we move on?
F: Let me share this slide real quick. At a high level, if you're into block diagrams, what this explains is that if you want to run machine learning models using this wasi-nn thing, here's what it would look like. You'd have user application code, which would say, you know, hey, I want to execute this model.
F: Well, first load the model, then give it some inputs, then compute the inference. You're going to compile that, with a wasi-nn header, into this wasm file. At runtime you'll need to provide the model itself, a file or files, right.
F: Currently, in our current implementation, that's only going to be OpenVINO IR, but the spec allows for any number of different types of models, Caffe or TensorFlow or whatever you want, right; you just need to back that up with an implementation that would work. And then, in the current version of this stuff...
F: In Wasmtime, we implement wasi-nn, and we take the wasm file and the model file and we ship them off to OpenVINO to be run on the right device, and then back come your results.
F: Let's go over here. In the Wasmtime repo, I have this script, which has now been merged into the main branch, that runs the wasi-nn example.
F: I think it's an AlexNet model, and an image formatted in the right way for OpenVINO to understand it. Then we compile this example code, which is right here, into a wasm file, and we run it using Wasmtime. So this command here, this cargo run, will run the wasm file with the model, and off we'll go. Now, notice here we have this feature flag: wasi-nn is disabled in Wasmtime by default.
F: Currently, anyway. Perhaps at some point in the future it'll be a runtime flag and it will be compiled in by default, but this is definitely an optional feature; not everyone who's using Wasmtime is going to want to use machine learning capabilities. So that's at least where we are. Here's the code that actually runs the model.
F: And so here you see me reading the model files in. I load those model files using this wasi-nn call that Rust knows how to translate into a WASI call. Down here we have the tensor; we're setting the input; here we're doing the computation, so this will be the inferencing; and then we'll retrieve the output, and we'll sort the results so that we can print something that makes some sense to the user.
F: So, let's see here. That script does a bunch of downloading, so I pre-downloaded all that stuff. What's happening when I run this is that it compiled my example main.rs to a wasm file with those WASI calls in it, and now it's running it, and you can see that actually loading the model is sort of slow. That's an artifact of the fact that WASI currently doesn't memory-map files when it reads them; it reads them byte by byte, so it's doing a whole byte-buffer transfer.
F: But that's a separate issue; it's unrelated to wasi-nn. There's still stuff that needs to get sorted out in WASI. So what you're seeing here is it running through that example.
F: It executes the inference, and then it sorts the results, and you get these class IDs with their probability. So it's saying that there's a 53% probability that that image was class ID 963. It turns out that class ID 963 is a pizza. But you'll say, wait, we didn't even see the image yet. So if you look at those test fixtures that I've created: the whole openvino-rs repo contains those...
F: Yeah, test fixtures. This image was what was converted into the tensor, and here you see the AlexNet weights and stuff like that, which need to be converted for OpenVINO to consume. So all that stuff's there, and there are more links to all of this in the blog post that I posted.
F: But I'm happy that it thinks this is a pizza, so that's good. The two blog posts that we have related to this came out in December.
F: I haven't done any work on this since then, because I've been on vacation, but if you're interested in more details about what I'm talking about right now, here's one of the links, on using wasi-nn in Wasmtime, and then here's implementing wasi-nn in Wasmtime. The first is, as a user, how would I use wasi-nn currently; and the second is, okay, if I want to write my own WASI proposal, what would I need to do to implement that?
F: And in Wasmtime, because it's not exactly trivial; there are a lot of things to sort of work through to make everything work correctly.
F: I think, yeah, they're linked here; they should link to each other, but we haven't really publicized these links much out there.
A: Okay, that was awesome. Does anyone have any questions for Andrew or Mingqiu?
A: I've got a question: what is the thought, or have there been any discussions with any other engines on incorporating wasi-nn more broadly, or building it out? Or is the idea that, having contributed it to Wasmtime, other implementations should look here and then try to pull the work in?
F: It's still pretty early, you know. I would say it's early in a couple of different ways. In one way, WASI itself is still in a very early phase, and there's a lot of groundwork that needs to happen with the fundamentals of the system API, and wasi-nn is two or three steps beyond those fundamentals, right: file system stuff, networking, all of that needs to be sorted out. And we sort of took the approach that, if we're going to dip our toes in this WASI pond, we're going to do it in something that's not really going to be fraught with conflict.
F: Yet, right. The ML stuff is sort of down the road, so we have a chance to evolve the API without someone breathing down our neck saying, hey, when's this going to come out? In that sense WASI itself is very early, but wasi-nn itself is very early too, right. We need to evolve the API, and we've talked about switching over to a model builder API, where you could control the nodes in the graph directly from within WASI.
F: That would maybe look a little bit more like what TensorFlow and these guys can do. We've talked about other backing implementations besides OpenVINO, right; you could have TensorFlow or something like that implementing wasi-nn. What we really want to do is see how people would use this, or what changes they would need in order to use this in production, right. This is currently at the proof-of-concept stage: you know, we can do this.
F: We can sort of do some inference there, but there are probably some rough edges that need to be worked out, and we need feedback from actual ML practitioners in the community.
A: Got it. So I'll close with that question around what it is that you need, the call to action for us here at the end. But in the interim, I think you touched on another great question: how much collaboration or cooperation was there with other engine providers, like TensorFlow, to ensure that the API really is truly good middleware between you and lots of engines? Was there proactive engagement?
A: Did you guys look at it, or is this just a first pass? Maybe you could share a little more about that.
G: So, okay, thank you, let me take this one. We are really working very closely with the WebNN group, which is really a consortium, a collection of all those top machine learning practitioners from, for example, Microsoft, from TensorFlow and the Google group, and Intel, and a bunch of other companies that are involved. So we worked really closely with them, and there is a similar...
G: You know, a similar proposal for a model loader in that community, and we mirror that API really closely. So I would say that one of the driving factors for us to do this is really the performance disparity between native and WebAssembly.
G: WebAssembly is supposed to be near-native performance, but in terms of high performance computing, such as machine learning, the gap is still pretty big. We are talking about implementing 128-bit SIMD, right, but AVX-512 is a common practice.
A: Phenomenal. Any other questions across teams or people here? Then I think the closing is just to confirm: the call to action is for people to give you feedback on the approach and ideas and stuff like that. Where should they do that?
F: I would say issues in Wasmtime; I will respond to those pretty quickly. Any issues on the wasi-nn repo as well; I'll see those and we'll respond pretty quickly. If you want to discuss something that's maybe not necessarily an issue, I'm on Zulip pretty much all the time, the Bytecode Alliance Zulip. So you can find me on there. I'll send you the links later, Liam, and you guys can...
A: Yeah, awesome. I'll make sure that those are linked in our notes here, and I have your articles linked in. Great job, and thank you so much for the great demo today; I really appreciate it. Let's slide on down the agenda here real quick to the next topic, which is just community news. Tomorrow, speaking of Wasmtime, is the Wasmtime community meeting, Thursday at one o'clock; the meeting agenda is posted, and if you need an invite, you can ask for one on the Zulip board.
A
I
hit
me
up
and
I'm
happy
to
relay
that
relay
that
over
so
then
into
our
existing
follow-ups
here
on
current
stuff.
I
I
know
chris,
you
were
out
for
much
of
december
and
ralph.
Did
you
guys
connect
on
the
crestlit
and
crit
follow-up
or
just
punt
that
one
still.
A
Had
to
drop
okay
all
right!
Well,
I
guess
we
punt
that
one
that
one's
becoming
like
the
universal
one
we'll
get
to
that
one
later.
Good
news
is
on
the
next
item.
A
Our
azure
registry
is
complete
and
has
been
enabled
with
anonymous,
pull
access,
so
wasmcloud.azurecr.io
we're
still,
I
think,
migrating
artifacts
there,
but
that
has
been
enabled
I
as
a
next
update
chris
and
I
spent
some
time
over
over
break
working
on
the
new
wasm
cloud
website,
and
we
have
sketched
that
out
here
in
github,
with
a
set
of
milestones,
it's
connected
into
netlify,
so
that
if
you
visit
wasmcloud.com,
you
are
hitting
the
main
branch
and
the
dev
branch
is
linked
in
here
as
well.
A
There
are
some
early
early
shots
here
and
I'll
link
in
if
people
want
to
start
giving
us
feedback
on
the
on
the
design-
and
I
went
ahead
and
posted
up
in
the
community
notes
here
kind
of
the
discussion
that
we
had.
If
you
just
want
to
look
for
reference
in
community
meetings
here
under
the
23rd,
actually,
here's
the
link
to
the
netlify
dev
branch.
A
If
you
want
to
give
live
feedback
and
there's
the
link
to
the
decomposed
issues
here,
sort
of
like
you
know
where
we're
going
with
this
ending
with
like
a
nice,
interactive
online
experience,
new
docs
and
so
forth,
and
then
the
sort
of
notes
and
links
to
all
the
artifacts
in
here,
like
the
new
website
map,
the
new
website
proposed
view
the
templates
we're
using
some
of
our
thoughts
around
color
theory
and
logos.
And
you
know
all
the
sort
of
details
are
all
linked
in
there.
A
Awesome
any
of
the
microsoft
folks,
I
don't
know,
speak
for
ralph
if
there's
any
progress
on
the
months
of
interest
here,
which
is
sort
of
a
standing
topic
of
of
links
to
topics
and
ideas
that
are
around
webassembly
I'll
hit
ralph
up
next
time
early,
because
I
think
he's
got
a
conflict
at
1,
30.
and
then
on
to
dot
15
stuff
kevin.
Did
you
want
to
maybe
pick
up
on
a
discussion
around
the
sort
of
stuff
that
we've
still
got
open
here?
A
Our
list
from
december
was
the
actor
interfaces
which
are
linked
here
and
maybe
start
here.
I: Yeah, so there's been no progress at all on the actor interfaces; we still have all of those outstanding. There's been a decent amount of progress on the REPL, but again not really within the last week and a half or so, since most of us have been on vacation.
I: Okay, basically, the big outstanding thing for 0.15 is the automation of creating all the release artifacts and publishing all of the capability providers and the example actors to the registry. And there's one piece of functionality still outstanding, which is extracting the lattice cache functionality from being built into the host to being behind a capability provider.
A: Okay. For the build, I'll hand it over to you for packaging artifacts here in a second, and you can just let us know where things are. But on the ones that you mentioned, Kevin, that you're leading, is there any help or feedback that you need on that?
I: Which ones did you say? Which ones are you talking about?
A: All right, that's exciting. I know it's been quite the haul to get here. Bill, anything to add on packages, or anything you need help or testing on?
E: Yes. As far as I noticed, Homebrew is there, so Homebrew should be set; that one's kind of a bit more of a manual process. Essentially, when we do a GitHub release, which is automated in wasmcloud and wash, we'll update the wasmCloud Homebrew repository to point to the checksum and release file for that. As far as the ARM-type packages, those should be good. Currently it's kind of a hacky way to build them.
E
It's
like
docker
and
docker
china
build
arm
packages,
we're
looking
into
purchasing
like
an
arm
instance,
possibly
in
aws
right
now,
just
to
speed
those
builds
up
and
make
it
a
little
less
finicky.
So
those
should
be
looking
good
and
currently
working
for
the
test.
Automation
for
wasm
cloud,
currently
working
through
a
few
little
issues
with
that.
So
hopefully
I'll
get
a
pull
request
in
today
or
tomorrow
to
get
that
moving
forward.
A: Does anyone have any suggestions or feedback, anyone else that's building ARM packages, on how you're doing this? I think the Wasmtime team ultimately ended up just getting an AWS box themselves; that actually came up in their last meeting. Anyone else have any other suggestions on the best ways of building that scalably?
H: The provider for Krustlet: what we had to do effectively was, on the Linux host for the CI instance, have the entire C compiler and everything else cross-compiled for ARM. Those binaries are available; you can get the C compiler and whatnot, and Clang and a couple of other things, basically to be able to compile into LLVM for ARM, but on an amd64 architecture, and it works.
H
But
then,
if
you
have
to
deal
with
like
open,
ssl
or
anything
else
like
that,
like
you'll
have
to
basically
compile
that
from
scratch
to
make
it
work
with
arm
or
like
cross,
compile
it
for
arm.
So
it's
like,
it
really
depends
on
how
many
dependencies
you're
taking
on
at
that
time.
It's
really
not
scalable.
H
So
if
you
have
the
cash
flow
and
are
able
to
like
swing
in
arm
instance
on
aws
or
something
along
those
lines,
I
would
highly
suggest
doing
that.
Rather
than
doing
the
the
artisanal
way
of
crafting
a
cross
compiled
tool
chain.
A: I love the link here to the artisanal way of doing it, as well as the "there be dragons here" feedback. So we can definitely get a box; we could spin it up on the fly, but the MVP is probably just to turn on the box and, you know, SSH in and configure it manually. So thank you for that awesome guidance; that's huge. I'm going to link that into the notes.
A
Okay,
that's
great
and
then
bill.
You
said
homebrew.
What
was
the?
What
was
the
feedback
on
the
homebrew
as
far
as
like
does
brew
work
or
do
we
where,
where
are
we
with
that?.
E: Yeah, so basically there's a homebrew-wasmcloud repository out there right now. Currently it only has wash listed in there; that's been tested, and Jordan Rash helped me out with that. I'm just basically waiting on a GitHub release of wasmcloud. Once that is out there, we can point the Homebrew repository at wasmCloud.
E
Specifically
there
you
go
and
basically
all
you
do
is
point
to
the
release
file
on
the
checksum,
and
so
once
we
update
that
you
can
install
it
pretty
cleanly
to
or
using
homebrew
sorry,
okay.
So
it's
pretty
straightforward
to
get
it
up
and
running
all.
A
Right
so
in
the
near
future,
we'll
have
like,
like
a
brew
command
for
for
the
project,
yeah.
E
I
mean
you
can,
actually
you
can
install
wash
with
it
right
right
now,
just
waiting
on
creating
a
release
file
for
wasn't
cloud
to
get
that
added
and
okay,
quick.
A
All
right
super-
well,
maybe
a
mini
demo,
I'll
put
us
down
for
is
once
we
hit
the
next
milestone.
You
can
do
a
like
a
quick
little
video
it'd
be
useful
for
website.
Anyway.
Absolutely
that's
awesome.
Thank
you
so
much
for
all
that,
that's
great,
so
we're
really
close,
as
kevin
mentioned,
on
the
new
wasm
cloud
experience,
which
is
still
linked
here.
A
If
anybody
wants
to
give
feedback,
and
last
week
we
had
a
little
discussion
about
bindle
and
I
think
there's
going
to
be
some
more
discussion
on
that
and
we'll
cue
that
up
for
a
demo
and
taylor,
do
you
want
to
do
a
quick
just
verbal
on
what
bindle
is?
I
know
it's
linked
out
right
right
here,
yeah
I'll
go
over
that
really
quick,
but
also
before
I
forget.
A
As
soon
as
we
cut
this
next
release
of
crescent
we're
gonna
be
moving
over
the
provider
formerly
known
as
wasc
to
the
wasmcloud
org
we
just
it
took
longer
than
we
intended,
but
we
had
a
bunch
of
refactors.
We
were
doing,
and
now
all
those
major
refactors
are
done
so
as
soon
as
we
cut
that
new
release,
hopefully
by
the
end
of
this
month,
we'll
be
moving
that
over
so
that
it
can
be
more
based
on
like
hey.
A
This
is
this
is
a
wasn't
cloud
thing
and
kind
of
centralized
like
we've
talked
about
before
so
that'll
be
going
over
there.
So
sorry
that
side
note
done
now.
I
can
move
on
to
bindle
for
those
who
weren't
here
before
the
holiday
break.
Bindle
is
a
new
aggregate
object,
storage
idea,
experiments
thing
we
have
the
idea
is
that
especially
in
webassembly,
and
we
think
there's
many
other
users
cases
if
you
have
any
use
cases
and
you
want
to
try
it
out-
please
do
but
the
main
thing
is
around
webassembly.
A
So
the
idea
is
that
when
you're
putting
together
an
application,
it's
an
aggregate
sum
of
parts,
and
so
we
use
here
the
analogy
of
a
silverware
drawer.
A
So
we
have
parts
of
an
application
where
you
could
have
one
thing
that
you
want
to
run
locally
or
you
want
to
run
something
else
on,
for
example,
a
wasn't
cloud
host
where
you
have
more
processing
power
or
more
power
to
be
able
to
run
multiple
of
these
things,
and
so
a
a
bindle
is
just
a
way
to
indicate
here
is
a
version
and
it's
and
it's
a
strict
semver
and
naming
requirement
version
for
that
that
specific
thing
it
cannot
be
deleted.
A
It's
immutable
and
then
it
has
a
list
of
the
different
things
that
are
called
partials
in
it
that
are
the
different
artifacts
that
are
just
it's
binary,
blobs
of
some
kind
in
our
case
here
talking
about
wasn't
modulus
they're,
probably
mostly
wasm
modules,
and
so
anyway.
This
is
something
we
talked
about
last
time
we
will
be
cutting
a
new
release,
probably
in
the
next
few
weeks,
and
we're
redoing
one
of
the
api
endpoints
to
make
it
a
little
bit
cleaner
for
everything
to
be
done
with
like
less
objects
needed
and
whatnot.
A
So
I
know
that
we
talked
to
kevin
about
because
they
you
apparently
you
guys
had
already
created
mostly
a
a
similar
thing
so
anyway,
I
just
wanted
to
sync
with
that
kevin
and
see
if
you
had
any
feedback
yet
and.
I: Or any other things, yeah. One of the things that was on our list to explore, as soon as we got back from vacation and once we hit feature-complete on the 0.15 stuff, was to take a look at replacing our par files, our provider archives, with bindles, and see what that would entail and what the pros and cons are. But we haven't started that effort yet.
A: No, that's fine. Like I said, we have a major refactor of the API landing here, probably this week.
A: Now that we've had a little bit more experience of trying this out, and as we're trying to apply more stuff with the security model around signing and things. So yeah, that's all there; we're moving forward with that. If there's any feedback, or people give it a try or have any ideas, please let us know on the repo or in this meeting.
A: That's awesome. And then, when you guys are ready, just let me know; we'll cue up a demo and you can walk us through the tooling, how to use it, the sharp edges, and all that kind of stuff. That's great.

A: We already have the stuff, if you want to see some demos and examples and things, but it's really not super impressive to watch in demo form; it's more impressive in use, in how you can handle stuff and filter things and whatnot. We'll block out time and do it formally, because I think with you and Kevin getting through it, maybe we look at it, if wasmCloud moves to it, as an example or something like that.
A: I think there's definitely a huge value in it, because one of the things that we all have in common here, with a lot of folks that are around the ecosystems, is that we all see the same problems. I think us just agreeing on, hey, we're all going to work on this one thing and not three different things, is a big piece of the battle, a big piece of the challenge here.
A: So if it's bindles versus, you know, par files, let's just pick something, agree, and then make progress together; that's huge. So on our agenda I've got blocked out, and Kevin, I think you probably gave us enough of a story here: just a quick glance at open pull requests and issues for the project.
I: No, I think, like I said, the 0.15 is kind of our priority, and so getting that finished is the first thing up there. After that, we're going to start looking at things like the other host runtimes, like Swift and the browser and some of those. But we haven't done the backlog grooming to put those in order of completion, so we haven't figured out what the 0.16 milestone looks like yet.
A: Okay, all right, that's awesome, Kevin, thank you for that. For anyone that's looking ahead, thinking through what we'll be doing later this year a little bit, I think that would be great as we go through there. I think we're pretty close to the end. Any ADLs, architectural decision logs, or anything like that for anybody to discuss?
D: No, it's going pretty well. And I'm sorry, I'm in my truck, if it sounds like garbage.
D: No, I wish; that'd be cool to drive. No, I think the website's going well. I mean, I haven't started on the dot-com, just the docs, because I obviously need a lot of guidance there. But if everyone could look at the docs; the mobile piece of the feedback was really helpful.
D
I
hadn't
looked
at
it
on
mobile
at
all,
yeah
yeah,
so
yeah
just
getting
in
there
getting
opinions
and
then
hopefully,
by
the
next
couple
days
I
can.
I
can
move
the
docs
to
their
more
permanent
endpoint
the
docs.wasmcloud.com,
and
we
can
have
that
semi.
A: Okay, great. Any other topics, or does anyone else have anything they want to bring up? If not, I'll go ahead and stop recording, and then we can just move to a hangout discussion.
A
Okay,
there
are
topics
backlogged
here
I'll
get
the
recording
cut
where
andrew's
a
demo
just
like
magically
works
on
the
first
try
and
get
that
try
to
have
that
posted
up
for
tomorrow.
Thank
you.
Everybody
thanks.