From YouTube: Lightning Talk: WAMR, Intel - Le Yao, Intel
Description
Lightning Talk: WAMR, Intel - Le Yao, Intel
This session introduces WAMR, a high-performance and small-footprint WebAssembly runtime, in Envoy, and the advantages that can be gained by building Envoy with this Wasm runtime.
Hello everyone, welcome to my talk about WAMR and Envoy. I'm Le Yao from Intel, a cloud software engineer, and today I would like to introduce WAMR and Envoy from three aspects. First, I will give a brief explanation of Envoy Wasm. Then I will introduce the WebAssembly Micro Runtime, WAMR, which is developed by Intel. Finally, I will describe the benefits we can get from this work.
Why does Envoy integrate Wasm technology? Because Wasm can bring extensibility and flexibility to Envoy without rebuilding the Envoy binary. We can think about Envoy Wasm as these four modules.
First, Envoy has an HTTP Wasm filter extension to encapsulate the Wasm code and offer the Wasm API implementation, and it also has a project named proxy-wasm host. It encapsulates the Wasm runtime, exports the sandbox API, and exposes the Envoy API to the sandbox.
We all know WebAssembly is a stack-based virtual machine. Another project is the proxy-wasm SDK. It exports the sandbox API, declares the references to the Envoy API, and also encapsulates the Envoy API and the sandbox API. So we can use the SDK to develop our Wasm filter and implement the interfaces we defined in the Envoy Wasm filter. Then we can integrate this filter into the Envoy service mesh.
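The integration described above typically surfaces in Envoy's listener configuration. A minimal sketch of an HTTP Wasm filter entry might look like the following; the wasm file path and the runtime id are illustrative assumptions, not details from the talk:

```yaml
# Sketch: loading a Wasm filter in Envoy's HTTP filter chain.
# The wasm file path and runtime id below are assumptions for illustration.
http_filters:
- name: envoy.filters.http.wasm
  typed_config:
    "@type": type.googleapis.com/envoy.extensions.filters.http.wasm.v3.Wasm
    config:
      vm_config:
        runtime: "envoy.wasm.runtime.v8"   # swap in a WAMR-backed runtime id when Envoy is built with WAMR
        code:
          local:
            filename: "/etc/envoy/my_filter.wasm"
- name: envoy.filters.http.router
```

The filter module itself, built with the proxy-wasm SDK, implements the callbacks that the host side dispatches through the sandbox API.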
Currently Envoy supports the V8 Wasm runtime in its latest release, and one more is here: WAMR is what we bring into Envoy. The default build of the Envoy binary uses the V8 Wasm runtime, which is based on the V8 engine used in browsers.
WAMR is another runtime with different features, and it is a compatible module that can be linked into Envoy. So what is WAMR?
This slide shows an overview of the WebAssembly Micro Runtime, WAMR. From the architecture and platform support listed here, we can see that WAMR itself supports many CPU architectures and platforms, and it has a successful community.
WAMR targets a small and fast standalone Wasm runtime, and the module is designed to cover use cases from low-end devices to the cloud. It supports interpreter, ahead-of-time (AOT), and just-in-time (JIT) compilation. WAMR itself is a C-based implementation with LLVM-based compilation.
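As a sketch of the AOT path, WAMR ships a small compiler, wamrc, that translates a Wasm module into an .aot file which the runtime's AOT loader can then execute; the file names here are placeholders:

```shell
# Compile a Wasm module ahead of time with WAMR's wamrc compiler.
# my_filter.wasm / my_filter.aot are placeholder file names.
wamrc -o my_filter.aot my_filter.wasm
```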
One advantage it has is a small footprint and low memory consumption. The runtime architecture is shown like this, and there is a self-implemented ahead-of-time (AOT) loader. It also supports Intel SGX with no need for an SGX LibOS like other runtimes, it can manage applications using its application framework, and it is very easy to port to a new platform.
Here we give the WAMR performance numbers we have tested on the x86 architecture and platform, compared with natively compiled GCC code. We can see the ahead-of-time performance is from 0.4 to 1.5 of the GCC execution among the workloads. This is the benchmark, and this is the performance number; the baseline is GCC generic SSE code.
And here we will give the SGX and Kubernetes related adoptions and engagements.
First, most of the Intel projects, such as the Private Data Objects open source project, can get benefits from using WAMR. Ant Financial from Alibaba adopted WAMR on SGX for China's leading blockchain platform, and there are also Kubernetes related open source projects and Alibaba adoptions.
They use WAMR with SGX for a multi-party compute protocol, and Baidu X-Lab's MesaTEE also integrates WAMR and SGX.
So far I have given you a brief introduction to WAMR. So why do we bring WAMR into Envoy? Because the existing Wasm runtime may increase the binary size, and the performance numbers were not ideal in our tests, so we bring WAMR into Envoy and run some performance tests to show the advantages.
The first is memory consumption. We use the docker stats command to get the memory usage and its peak, and we can see in this table that using the WAMR interpreter mode or the WAMR just-in-time mode, the memory consumption has largely decreased.
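A measurement along these lines could be taken with docker stats; the container names below are assumptions for illustration:

```shell
# One-shot snapshot of memory usage for the two Envoy builds under test.
# envoy-v8 and envoy-wamr are placeholder container names.
docker stats --no-stream --format "{{.Name}}: {{.MemUsage}}" envoy-v8 envoy-wamr
```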
Also, the binary size based on the V8 runtime is larger than with WAMR. Using WAMR to build the binary, we can get the Envoy binary size decreased by about 50 and the stripped size by about 10. And for performance, we use Nighthawk to test: it generates HTTP requests to an Envoy gateway, and we get the performance numbers.
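A Nighthawk run of this shape might look like the following; the address, request rate, and duration are assumptions for illustration:

```shell
# Generate HTTP load against the Envoy gateway and report latency/throughput.
# Address, rate, duration, and concurrency are placeholder values.
nighthawk_client --rps 1000 --duration 30 --concurrency 4 http://127.0.0.1:10000/
```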
We can see that in most benchmarks we get about a five to ten percent improvement. Also, using WAMR we may enable SGX in the Envoy HTTP filter, which will increase the safety of the HTTP filter chain.
So here are the references I will give you about our work. Our pull requests to envoy-wasm have been merged into the proxy-wasm project. And this is our WAMR runtime; if you have any questions, you can contact me via this email. Thanks.