From YouTube: gRPC May Community Meetup / Demo: Performance Testing gRPC Services by Mostafa Moradian
Description
gRPC has gained a lot of attention and wide adoption since it was first released. With the release of k6 v0.29.0, a native client for gRPC communication was introduced. In this talk, Mostafa Moradian goes through the basics of performance testing in general and load testing gRPC services in particular.
Yes, so I have a few years of experience. I am a senior software engineer and also a security lead; I've been with the company for about three years now. My main area of expertise is Python, but I also code in any other language to get the job done, and I'm still learning. So if you want to reach out to me, I'm available on social media and also on my website, so feel free to reach out with questions, suggestions, feedback, whatever you feel like.
Okay, I'm going to present performance testing. I'm sure you've heard about it, so our system…
So while we are doing performance testing, we are basically measuring certain qualities with quantities; we are quantifying those qualities to be able to test them. So, for example, if we want to measure reliability, we should measure latency, response time, and failure rate: all those metrics, all those indicators.
…500 milliseconds, your objective should…
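The idea of turning a quality objective like "respond within 500 milliseconds" into a testable quantity maps directly onto k6 thresholds. A minimal sketch of what that might look like in a k6 options block (the metric names `http_req_duration` and `http_req_failed` are from the k6 documentation; the concrete values are illustrative, not from the talk):

```javascript
export const options = {
  thresholds: {
    // 95% of requests must complete in under 500 ms.
    http_req_duration: ['p(95)<500'],
    // Less than 1% of requests may fail.
    http_req_failed: ['rate<0.01'],
  },
};
```

If a threshold is crossed, k6 marks the test run as failed, which is how a quantified objective becomes a pass/fail signal in CI.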
Hope so. So it has a few imports; it imports some modules.
You saw this script is written in JavaScript; the supported syntax is ECMAScript 2015 (ES6).
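A minimal k6 script of the kind being described looks roughly like this (ES6 module syntax, imports from k6's built-in modules; the target URL is k6's public test server, used as an example):

```javascript
import http from 'k6/http';
import { sleep } from 'k6';

// Test options: 10 virtual users for 30 seconds (illustrative values).
export const options = {
  vus: 10,
  duration: '30s',
};

// The default function is the VU code that each virtual user runs in a loop.
export default function () {
  http.get('https://test.k6.io/');
  sleep(1);
}
```

Run it with `k6 run script.js`; this is the basic shape every k6 script builds on.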
Monitoring tools and so on. It has a lot of outputs, and it is extensible in Go via…
It takes your extension in, builds it, and gives you a binary. So let me give you an example of that: xk6. So this…
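The xk6 build flow being demonstrated can be sketched as shell commands. This is a sketch assuming a Go toolchain is installed; the extension module shown (grafana/xk6-sql) is just one example extension, not necessarily the one used in the talk:

```shell
# Install the xk6 builder (requires Go).
go install go.k6.io/xk6/cmd/xk6@latest

# Build a custom k6 binary with an extension compiled in.
xk6 build --with github.com/grafana/xk6-sql

# The result is a ./k6 binary in the current directory that includes the extension.
./k6 version
```

The key point is that xk6 does not plug extensions in at runtime; it compiles a new k6 binary with the extension baked in.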
And last, you can also…
If they do, I suppose you can also bundle some, because there are shims and wrappers and such to deal with those APIs; but all in all, you can bundle a few modules. We also wrote a lot of modules ourselves, so if you go to the xk6 documentation, you will see that there are a few internal modules: how to bundle modules, how to import modules, how to import remote modules from HTTPS endpoints, and so on. So that's that.
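Importing a remote module from an HTTPS endpoint, as mentioned here, looks like a plain ES import with a URL. A sketch using the jslib.k6.io collection (the module and version shown are an assumption about what's published there; check jslib.k6.io for current versions):

```javascript
// k6 can import modules directly from HTTPS URLs at script load time.
import { randomString } from 'https://jslib.k6.io/k6-utils/1.2.0/index.js';

export default function () {
  // Use the remotely imported helper like any local module.
  console.log(randomString(8));
}
```

k6 fetches the module once when the script is initialized, so the remote import adds no per-iteration overhead.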
Okay, let's see what we have here. Let me go back. So I do have k6 installed; you can install it, let me show you.
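For reference, typical ways to install k6 (a sketch; package repository setup steps are omitted, and the Docker image name assumes the current grafana/k6 image):

```shell
# macOS, via Homebrew
brew install k6

# Debian/Ubuntu, via apt (after adding the k6 package repository)
sudo apt-get install k6

# Or run it through Docker without installing anything,
# piping the script in on stdin
docker run --rm -i grafana/k6 run - <script.js
```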
You can, for example, go to the GET endpoint and see what parameters it accepts and the response it returns, and then for the response you can just see what's in it, so that you can check whether the things you're looking for are in the response.
But what is a virtual user? It's basically a way for the code to…
So I'm just initializing the variable to receive the response, and I'm using get to call the test server of k6, and then I also imported check. So check is a way…
A lot of other metrics. So we have sent, here, I can show you: we have sent 5006 requests.
This is the total duration, including connecting, blocked, receiving and sending, the TLS handshake, and probably waiting, if I'm not mistaken. Yeah, and…
I just showed you: there's the init code, the default function (the VU code, we call it), and then the options and imports and the teardown. So I didn't show you the teardown, because it's…
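The lifecycle stages just listed (init, options, VU code, teardown, plus setup) fit into one script like this; a sketch with illustrative values and a made-up base URL passed through the lifecycle:

```javascript
import http from 'k6/http';

// Init context: everything at the top level runs once per VU at script load.
export const options = { vus: 10, duration: '30s' };

export function setup() {
  // Runs once before the test; the return value is passed to default() and teardown().
  return { baseUrl: 'https://test.k6.io' };
}

export default function (data) {
  // VU code: each virtual user runs this in a loop for the test duration.
  http.get(data.baseUrl);
}

export function teardown(data) {
  // Runs once after the test, e.g. to clean up test data.
  console.log(`finished testing ${data.baseUrl}`);
}
```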
…of your request, and then record these and then send them somewhere, like, for example, like this.
For recording the duration it takes for a gRPC request to, you know, be invoked and then receive…
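Recording the invoke-to-response duration of a gRPC call in a custom metric can be sketched with a k6 Trend. The proto file, service name, and server address here are assumptions for illustration; k6 also records a built-in `grpc_req_duration` metric, so the custom name is deliberately different:

```javascript
import grpc from 'k6/net/grpc';
import { Trend } from 'k6/metrics';

// Custom Trend metric; the second argument marks values as times for display.
const invokeDuration = new Trend('my_grpc_invoke_duration', true);

const client = new grpc.Client();
client.load([], 'hello.proto'); // hypothetical proto file next to the script

export default function () {
  client.connect('localhost:50051', { plaintext: true });

  // Time the unary invoke and record it in the Trend.
  const start = Date.now();
  client.invoke('hello.HelloService/SayHello', { greeting: 'Bert' });
  invokeDuration.add(Date.now() - start);

  client.close();
}
```

The summary at the end then shows min/avg/percentiles for the custom Trend alongside the built-in metrics.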
We give it one second to, once again…
Also, there is the Chai.js assertion library; a port of it exists for k6, and you can use it.
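Using the Chai.js port (k6chaijs) looks roughly like this; the jslib URL and version are an assumption, so check jslib.k6.io for the current one:

```javascript
import { describe, expect } from 'https://jslib.k6.io/k6chaijs/4.3.4.3/index.js';
import http from 'k6/http';

export default function () {
  // describe() groups assertions; a failing expect() marks the block as failed
  // but does not crash the whole test run.
  describe('fetching the k6 test page', () => {
    const res = http.get('https://test.k6.io/');
    expect(res.status, 'response status').to.equal(200);
  });
}
```

This gives BDD-style assertions on top of k6's check mechanism.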
You can basically tag almost everything, and tags are very helpful when you are using…
…which is how long it takes for the whole group of requests. For example, say you are testing your login microservice.
You can also define your own tags, like this.
When you're making a request, you can say: okay, this is my tag, and this is the value of my tag, and it will be shown on your request. Let me just…
The next feature is scenarios. Scenarios are a way to describe how you want your load…
…VUs, to do a constant arrival rate, to buffer requests if the system cannot handle them, and so on and so forth.
So for testing gRPC, k6 has built-in support for unary requests.
I'm, like… I've stored these scripts, and I've already built this.
And then we also import sleep from k6; I'm using sleep basically because normal users…
Four endpoints, I would say.
This is the one we are interested in: SayHello, which takes a HelloRequest with an optional string greeting as the first field, and then it returns…
The iteration took one second, exactly, which is kind of obvious, and we… like, you know, three seconds each iteration.
Let me also introduce this: okay, so we are using 50 virtual users, and we are going to run…
…200 milliseconds, and abort if it goes above that. I'm going to set it to something like… I know that it will fail on 10 seconds, fingers crossed.
…less than 500 milliseconds. I introduced a Rate for the number of requests, so it records here…
Checking if the status of the response here is OK: it records the number of failed and succeeded requests and shows us that at the end. And then I also check if the response status is OK, which is a separate thing from the Rate metric, and I also check the message to see if the reply that I've shown you is "hello Bert".
So 100 sleeps, that's not a good idea. Probably you can introduce a sleep per group, and then at the end you can also sleep per virtual user per iteration, I would say. So then I will close the client after making this many requests.
Okay, wow. If you have a huge script, yes, because it uses a lot of memory; I hope my system doesn't crash, fingers crossed.
Okay, we have five thousand VUs.
Let's go back. Yes, that concludes my presentation, and I am happy to answer your questions. Thanks for listening.
Thank you, Mostafa. Thank you all for being here, and please, if you have any question, please… okay, so Paul, please go ahead.
…was on last Friday, with Paul, Leandro and…