From YouTube: Load Generators - Extending Meshery to support wrk2
Description
Design session for incorporating @thilofm's friendly fork of @giltene's wrk2 into #Meshery.
A
Good. So we're going to get started. Today's topic is load generation. We had a wonderful discussion, specifically with Thilo, a little over a week ago, and following up on that I'll bring up the design spec. Krish, you might have been the last one to touch this, so you might be the best one to speak to it. We've got a couple of architecture documents as well.
B
At a very high level, the two diagrams here are just trying to depict the two models for the load generator. And there was a reason: in a way, I was actually trying to showcase this. That was partly because of what we knew at the time this diagram was drawn. It's probably more than a month old now, and at the time it was drawn...
B
...at that time I had no clue about wrk2's capabilities. I was under the assumption that wrk2 could run as a server, so this diagram was drawn under the pretext that wrk2 could work as a separate server, the way Fortio can. The point I was trying to make with the two diagrams is that the first one shows how we started, where Fortio was actually separate.
B
That is, Fortio ran as a separate service in our microservices architecture. But then later, in order to ease development and not overextend ourselves, we actually combined them, embedding it into Meshery. Given that Fortio is written in Go, we could just leverage that, which is very easy: we just had to find the right injection points and invoke them, and it works really well with Fortio.
B
So where we are, at least for the near future, is that we'll be using the embedded load generator; we're not going the route of moving the load generator out. But either way, one thing about the changes proposed in this document for implementation is that the main goal is actually to eventually, in the future, support both models.
B
The idea is to support both of these models, so that if at some point in time we decide to split Fortio and any other embedded generators out and have them run as independent services, we should be able to make the change fairly quickly, rather than having to do it in a much more elaborate way. So that's sort of what I tried to describe later in this document. And of course, there are multiple versions of wrk2.
B
We are going to be picking one: of course, the one with the capability to output JSON. So the ask here (I mean, you guys can read through it) is this: I've tried to put some milestone markers, because the way I started working was to try to spell out what we would eventually do to actually get the implementation going, and then put some milestone markers on that.
B
So if you weren't there, feel free to take a look at it. Fundamentally, the first one is, I think, partly done: Sacco has already figured out the way to add a multi-stage process for building wrk2 as part of the Meshery Docker build, the multi-stage build.
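A multi-stage build along those lines might look roughly like the sketch below. This is illustrative only: the fork URL, package list, and file paths are assumptions rather than the actual Meshery Dockerfile.

```dockerfile
# Stage 1: compile wrk2 from the fork (repository URL assumed for illustration)
FROM debian:buster AS wrk2-builder
RUN apt-get update && \
    apt-get install -y build-essential git libssl-dev zlib1g-dev
RUN git clone https://github.com/thilofm/wrk2.git /wrk2
RUN make -C /wrk2

# Stage 2: final Meshery image; only the compiled binary is carried over,
# so the build toolchain never ships in the container
FROM debian:buster-slim
COPY --from=wrk2-builder /wrk2/wrk /usr/local/bin/wrk2
# ... the Meshery server binary and assets would be copied in here as well ...
```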
B
If you go further, there's actually the execution plan, but before I put it there... I mean, this is more of a scribbling pad for me to build greater confidence. Anyway, this is kind of documenting where we are. You can see I've just categorized it into building/packaging and then the implementation, mainly because with the building and packaging you pretty much don't touch the Go code, whereas the implementation part, in a way...
B
...is where you're testing the Go code. So, fundamentally, the first part is milestone 0. I called it milestone zero because it's pretty much done: Sacco has already proved that he is able to compile wrk2 and put the executable in the Meshery container. Now, a question for Sacco:
C
Basically, now we just need to point the upstream to Thilo's one with the JSON format; that should be the big change. I will just test it once again to make sure it's all working, but yeah, that shouldn't be too difficult. Basically, the change is just to update the upstream we're building from in the container to the Thilo version, because as I understood it, the Thilo version already has the JSON output, right?
B
There is one thing, though. If you see the comments, if you see the third comment (apart from the ones where I've included your name), you can see an FYI about the periodic runner results. That comment is actually an important one. Now, this again is going to be based on the question which I'm going to be asking. I think we had a very preliminary session in this direction.
B
Now, if the wrk version we are using returns only high-level summary results, that's going to be a problem, because of the way our system works. Again, the other thing is that we do have a reasonably tight coupling with Fortio, and this library, the periodic.RunnerResults, is actually part of Fortio. Now, why are we using it?
B
Number one, it's just easy, because we started off with Fortio. But there is also a number two, which is the results: if they follow this particular struct pattern, if the results stick with this pattern, then we have the UI code, which is actually borrowed from Fortio.
B
Of course, we have now made some customizations to it, but there is this Fortio code which actually takes the result in this format, does some computation on it, comes up with high-level summaries, and then uses the whole thing to actually chart the data. So it's not just one thing. Essentially, I could say that the tight binding with Fortio is not in one place but in two spots: one is in the back-end server.
B
The other one is in the UI. If we want to change that, it's going to be quite some effort. No, it's not like it's spread over all the places; we just have these points of integration. One is the back end, which returns the result. In the back end we're not doing any further processing of those results: Fortio runs the load test and gives back a response.
B
Now, the good thing with the response is that whether it is HTTP or gRPC, both actually stick to this particular type; both of them stick to the runner results. Of course there are subtypes, two of them: one is an HTTP runner result and the other one is a gRPC runner result. But the way I'm doing it (if you look at my code you'll see it) is that I'm making the result...
B
...return as this one type, so that whether it's HTTP or gRPC, I don't care: I just send it to the UI and the UI takes care of it, kind of thing. You see the point? That has been working well. So that is one point of integration. The other one is in the UI: we have, I mean I have, a central JavaScript file, which we essentially borrowed from Fortio, and that's what is doing all the computation.
B
You could even call it sort of a... well, I didn't create a component around it; I'm just using it as a JavaScript library, like a list of functions. The main place it's used is the Meshery charting component in the UI, and the same Meshery charting component is used in two places: one in the performance page and the other in the results page.
B
So you can see it's not hard to make a change, because if you change it in one place, all the pages will get updated. But the only question here (and again, this is partly another ignorant question) is whether the wrk tool, whether it's the Thilo version or whichever version we're going to continue using...
B
...is giving us all the result data points. If not, if it is only a summary, then we might have to come up with a different strategy. Because, like I said, this is what I was trying to explain: if we can make the result stick to this particular type, the periodic runner results format, that would be ideal, because it has all the fields which are important, and it has all the data points: the starting time, the duration...
B
...the number of threads, the number of queries per second, and all that. It's all well defined, which is the other reason why I've been using it: it's well defined and pretty much covers all the interesting factors of running a test. That's the reason I've been using it, so I'd say we definitely need a strong reason to move away from it. That's kind of my point.
B
So, for example, in your POC you write the Go code which will run wrk2 with the right parameters and then get the result in JSON format; you unmarshal it into a Go structure. Now you need to do a mapping from that structure to this periodic runner results, because the structure that's coming from wrk2 is not going to be in this format, so we need to do that.
B
I mean, it's just easier to do it at the back end rather than in the front end, because if you do it in the front end, we will have to have parsing logic for every type of load generator, right? I mean, today it's wrk2; yesterday it was Fortio...
B
...tomorrow, or the day after tomorrow, it's going to be, I don't know, some other bench tool, and the next day it's going to be something else, right? I mean, we might support more, but the direction of my thought is that the front-end code for parsing the data and presenting it should be single, should be just one...
B
...irrespective of the load generator. That's kind of how I'm seeing it; it's not set in stone, of course, and in the worst case there's going to be another way to do it. There was actually another reason why we're doing this, or why I am thinking about this. The reason for me to think about this is that now in Meshery we have these comparison capabilities, right?
B
You can compare results. So now, when we have this capability to compare results: when one is a Fortio result and the other one is a Fortio result, we can easily compare them. But when one is a Fortio result and the other is a wrk2 result, unless they have the same pattern, you cannot compare them. It'll be hard for us to compare, or we'd have to start writing all this custom logic for doing comparisons. I don't like that.
B
It's not extensible, right? So this is my chain of thought. Again, if you guys have a strong opinion against it, absolutely, I'll definitely be very happy to hear it and update things. But what I'm projecting here is just my chain of thought, as in: hey, okay, these have been working well; these are settled, pretty much.
B
Say Sacco is going to run a benchmark test on Istio with the BookInfo application today, and now in a month, let's say, we have wrk2 embedded, integrated, etc., and Sacco is running a wrk2 test on a newer version of Istio with the BookInfo app. How would you... I mean, you see the point, right?
C
From the implementation perspective, we've been digging a bit into each of wrk2 and Fortio and how they're doing things, and even if we start with, let's say, a Lua script and try to make it a separate request, there could still be some code changes in the implementation that could make this difference. Absolutely.
B
I mean, like I said, don't get me wrong: if we have to make changes to get something to work, I'm totally for it. But my only point with what I've written here is to keep things a little bit clean, so that all the implementation goes on in a single place rather than being spread out in, like, ten places. You see the point?
B
I prefer doing it in the back end, and maybe a little in the front end. But having changes here, here, and there, and then, if we have to show a completely different interface for the wrk2 results, that is bad. The worst part will be that we'd then also have to write logic for handling the comparison possibilities, right? You see the point; it just keeps going on. Yeah, it'll be like a snowball: it starts off with something small.
C
No, the raw data was given in JSON format, but it was still the averages and all that, just a summarization. Yes, there was more information than the normal output, but it still wasn't percentiles. And when we were talking, he was also mentioning coordinated omission: that's the first thing, and then JSON output, which is good to go. But if you want percentiles, they didn't have that, I guess; at least he didn't give a commitment to help us on the percentile metrics.
B
Lee, your thoughts? The other reason is that I just don't want to talk about milestones one and two now; the time is about up, and we're pretty much on top of the hour. Let's just have a sensible discussion here, and we can come back to the milestones after Sacco has a decent POC.
A
Nice. Yeah, while I could talk to what you've said, there's nothing more that I would add about the direction that we're headed. I will clarify one point of confusion, I think: wrk is the first project, and there's a fork of wrk called wrk2, Gil Tene's; again, it's referenced there, and he worked on it. Then Thilo worked off of wrk2 and added to it; it's all written down in the spec. But anyway, yeah.
A
We've got the overarching issue, number 126, the one that sort of tracks the overall effort, and then Krish has broken it down into different milestones. I just created two new ones, and I'll be creating a GitHub issue for each milestone, with checkboxes, so that it feels good: we can actually turn in pull requests at the end of each issue. Yeah.
C
Just so it's clear, I guess: there's a milestone zero included there. First, forget about Meshery; it's just wrk2, making sure we can add the feature to return percentiles in JSON. Then we come back and continue from the integration of this into Meshery as milestone one. That is the proof of concept.
B
Right, yeah. Yep, that's precise. That's essentially it, and that's a good point: let's not start getting into Meshery yet, because once you step into Meshery you will start having all these integration challenges. Let's just work with the POC and tune wrk2 the way we want, get it working the way we want it to work, and then knock that out. Okay, cool: then we have everything figured out.
B
Okay, then let's jump into Meshery. And actually, interestingly, I think what Thilo mentioned about the Lua scripting, the wrk scripting, gives you a little bit of flexibility, but I don't know how flexible it is.
C
We'll go through the other PRs they have, and also their issues (Thilo can point out the right one) and see if there was anything already done. That's the first thing I will do, and if not, then I'll go to the code, try to understand it, try to do some stuff around it, and see if we can get that in JSON. And then...
B
Actually... Thilo, sorry, I just sent you a link in the Zoom chat. That actually describes how you can script with Lua; it describes all these methods. So I think all you have to do is maybe create a Lua file where you implement the right method and give it to the tool. I mean, you don't have to extend wrk, wrk2, or whatever Thilo's fork is. I mean...
B
...I don't think you have to do anything there. All I think we need is the capability to write Lua scripts, and then I think you can just pass the script to wrk at runtime. For example, here is a sample command (again, I don't know the exact details) of somebody invoking wrk with a Lua script.
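A sketch of the kind of script and command being described: the done() hook and its arguments follow the upstream wrk scripting interface, but the exact JSON fields emitted here are just an illustration.

```lua
-- report.lua: wrk/wrk2 calls done() once after the run; print results as JSON.
done = function(summary, latency, requests)
    io.write(string.format(
        '{"requests":%d,"duration_us":%d,"p50_us":%d,"p99_us":%d}\n',
        summary.requests, summary.duration,
        latency:percentile(50.0), latency:percentile(99.0)))
end
```

It could then be passed at runtime along the lines of `wrk2 -t2 -c10 -d30s -R100 -s report.lua http://localhost:8080/productpage`, where the rate and target URL are placeholders; the script would simply be bundled into the Meshery container alongside the binary.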
B
Which means, for the things we are working towards, we don't have to do anything to wrk itself; we just take Thilo's work. Alright, good point. Yeah, just write some Lua scripts, and then I can give the script to the wrk tool as part of the exec command; it just takes the path to the script. In which case you might have to bundle the script in the Meshery container, but that's no problem.
B
Yeah, okay. We'll just start off with a build of Thilo's wrk2, bundle it, and then with that I can see if we can write some simple Lua scripts and see if we can fine-tune the response, fine-tune the results, the way we want them. I think the JSON work is pretty much done, in a smaller fashion. I don't know... I mean, yeah. Let...