Description
Get your espresso ready as we dive into API testing with Laurent Broudoux to talk about Microcks.io
A
Good morning, everyone, and hi, and welcome again to another episode of OpenShift Coffee Break. Today we have my dear friend Laurent Broudoux as a special guest, who is going to talk to us about everything around API testing and mocking, and I'm sure it's going to be a very entertaining episode. We also have our esteemed co-presenters who recently joined us, Fabio and Andrea, so welcome, everyone, good morning.
A
All right, so, Laurent, would you tell us a few words about yourself before we get started?
C
Yeah, for sure. Thanks a lot for the invitation, Jafar and team. As Jafar said, my name is Laurent and I'm working at Red Hat. I'm a solution architect focusing on everything related to cloud-native application development, so mostly the OpenShift platform, the Quarkus framework, and so on. In my spare time I'm also an open-source committer, and I'm working on a side project called Microcks, related to API mocking and testing.
A
All in one, all right. So yeah, again, guys, feel free to jump in if you have any questions. Also, everyone who's watching the session, please ask your questions in the chat. So yeah, let's get things kicked off. What does this Microcks tool do, if you can give a brief overview? Why did you feel the need to create such a tool, and what use cases does it answer?
C
Yes. In fact, the Microcks project started something like six years ago, just before I joined Red Hat. At that time I was working in a big financial company transitioning from service-oriented architecture to REST APIs, and we were struggling with, let's say, thousands of APIs, thousands of existing web services. We were looking for a scalable solution to mock services, because we were seeing that it was very hard for developers to have everything in line in order to develop their services, because there were so many dependencies between services and so on.
C
So it was really a mess, and just at that time I had also started working on Kubernetes, working with containers. So yes, it was the right time to start a Kubernetes-native tool that would allow mocking and contract testing of microservices and APIs, but in a very scalable way.
A
Very nice, very nice. So yeah, let's go a few years back and see some of the challenges that developers were facing. Say, for instance, you are building a monolithic application and need to connect to a huge database; let's step back to the early days of IT. One of the challenges, I guess, was being able to get the data. For instance, you want to create new features and you want production-like data, but of course not connected to the production databases.
A
I remember it was always a mess to get representative data that you could play with, and also something that would mimic the production environments in terms of scale, in terms of functional data, and such things. So, for instance, you needed to instantiate a proper database, and of course that was hard: it needed resources to spin up the OS, to spin up the database, etc.
A
Now, if we speak about APIs, APIs take that complexity even further, because instead of needing one system of record for your data, you now maybe have tens of APIs that you need to integrate with as a developer to get the data, and it becomes even harder. So if your API is hosted on servers somewhere, how do you solve that challenge of getting access to something as a developer?
C
Yes, exactly. As you said, gathering representative data is a very, very tough challenge. But the other challenge that we faced was also being able to address not only the developers' needs but also the API owners'.
C
The line-of-business needs, because we think it's really crucial, when you're designing an API, to have this guarantee that what you are developing is really aligned with the line of business's view. After all, APIs are products, and products should really fit, really be aligned with, the business point of view. So the approach we take with Microcks is a big difference.
C
We could have started with existing data: gathering data, listening, mirroring traffic and collecting it to get representative mocks. But we decided to go the other way, and for that we do think it's really important to put the emphasis on the design phases of microservices and APIs, using contracts. And that's what we've done with Microcks.
C
While you are designing contracts, there's nothing better than adding real-life examples to your contracts: real-life examples that can be specified by the business owners, or by the functional experts, so that you are really assured that what is specified is really aligned with the business needs. So the basis of Microcks is putting yourself into some kind of example-driven development.
C
So basically, what Microcks does is use all your API or service contracts and the attached examples to generate API endpoints, but also to generate a test data set, so that you can start playing around with your API with the data that has been collected from the business owners. We absolutely want gathering examples, gathering data, providing data to be very easy, so we accept very different formats in Microcks: data can be provided within OpenAPI contracts or within SoapUI projects.
C
If you are dealing with legacy stuff or tools, you can also use a Postman collection, a simple JSON file, and so on. The purpose is really to open up the possibilities, to let the business owner or the functional expert use the tool they know to edit and collect this representative data, and then feed Microcks with any of the different formats we support.
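As a rough illustration of this example-driven approach (a hypothetical Python sketch, not Microcks code; the pastry operation and data are invented for the example):

```python
# Hypothetical sketch of example-driven mocking: an operation from a
# contract, plus an example supplied by a business owner (e.g. imported
# from an OpenAPI contract or a Postman collection), is enough to
# derive a working mock endpoint.
CONTRACT_EXAMPLES = {
    ("GET", "/pastry"): {
        "status": 200,
        "body": [{"name": "Baba Rhum", "price": 3.2}],
    },
}

def serve_mock(method: str, path: str) -> dict:
    """Return the recorded example for an operation, or a 404 stub."""
    example = CONTRACT_EXAMPLES.get((method, path))
    if example is None:
        return {"status": 404, "body": {"error": "no example for operation"}}
    return example
```

A front-end developer can then call the mock and get business-validated data before the real service exists.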
A
Very, very nice. Thank you, Laurent, for this brief, I would say, intro on the principle. So, if I try to sum it up, it's like a contract-based, a contract-first approach.
C
Exactly.
A
Let's play some role play here: I'm going to be the developer, you're going to be the API designer slash coder, provider, architect, etc., and Andrea and Fabio can be, for example, the business owners.
A
As a developer, you and I... yeah, I'm just trying to rephrase what you said, so we are all on par in terms of understanding. So I, as a developer, I need the API endpoints so I can create, for instance, my front end or whatever. Exactly: you and I are going to agree on the contract. The contract, meaning: this is what you are going to provide me in terms of API data, content, structure, etc.
A
So I know, even if it's an empty set, exactly what fields are going to be provided by the API, and I can already have an understanding of how to interact with the API endpoint. Exactly. So that's the first part. The second part comes now that we have agreed on the contract.
A
We need some data, some example, some sample data that is representative of the business function. And that's where Andrea and Fabio come in, right? They are going to generate some data sets, exactly, using different formats.
C
They can use whatever tool they want; in fact, we accept different formats. But what's really important is to add these samples, because most of the time, when people talk about API contracts or service contracts, they just end up with all the things relating to syntax, to data modeling and to operation modeling, and they stop there.
C
They miss this extra step of gathering samples that match the operations and data models, and that's the starting point of Microcks. So Fabio and Andrea can put their samples within the contract, depending on the technology, or they can put everything into a Postman collection if they are using Postman, which is really easy to use even for non-technical people, and that's the artifact that will be used by Microcks.
D
So, let me sum up: design by contract, that's a key point in API development. Second, involve the, let's say, business owners as soon as possible, so you draw a real DevOps loop as soon as possible.
D
Now I have a question; I don't want the answer now, probably you will show us something later. In which step of this DevOps circle does the testing of the API using Microcks sit? So if you have a sort of step one, okay, deployment, speaking always about a basic OpenShift or Kubernetes project: creating the source code, the image, unit tests, then probably something with Microcks, and then all the other steps?
C
Yes. So, basically, the life cycle that we see with people using Microcks is that they're starting with a design phase, okay: elaborating the contract, the syntactic contract, and then adding the samples.
C
Then they are able to generate mocks, live simulations, using Microcks. Front-end developers, or developers that need your microservice to be up and running, can start developing. After that there is the code production, and, as you said, the CI/CD pipeline that may run unit tests, build an image, deploy this image, and just after that comes the contract-testing part that is applied by Microcks, so you can ensure that what you are deploying is really compliant, conformant, with the contract.
C
For sure, that's definitely the way we want things to happen, because when you start designing your APIs or providing samples, obviously you always start with the most common cases. Then, when going through development, you realize that there are edge cases, exceptions and so on, and so you have to iterate. One nice thing about Microcks is that it can interact with your Git repositories.
C
So as soon as you have changed your contract, or evolved your samples in your Postman collection, if everything is stored in Git, everything will be pulled by Microcks and will be in sync with your repository. As soon as you introduce changes, Microcks is synced with those changes, so it lets you iterate.
C
In fact, we do not trace the different iterations when they relate to the same API contract version, but we do keep the history of the different API versions. So, at the end of the day, in the Microcks repository you will have the definition of the 1.0 version, the 1.1 version, the 1.2 and so on, and the nice thing about that shows when you deliver your 1.3 version.
C
Remember that if you are following semantic versioning, this version should still be compliant, compatible, with the 1.2 contract, the 1.1 contract, and so on, and this kind of test can be easily automated with Microcks. It keeps the history of all the versions and it is able to check conformance against the different contracts with regard to semantic versioning.
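The kind of compatibility check described here can be sketched roughly like this (hypothetical Python, not Microcks code; the two contract versions are invented):

```python
# Hypothetical sketch of a semantic-versioning compatibility check:
# each contract version maps an operation to the set of response
# fields it guarantees.
V1_2 = {"GET /pastry/{name}": {"name", "description", "price"}}
V1_3 = {"GET /pastry/{name}": {"name", "description", "price", "calories"}}

def is_backward_compatible(old: dict, new: dict) -> bool:
    """A newer minor/patch version must keep every operation and every
    field the older contract promised; it may add new ones."""
    return all(op in new and fields <= new[op] for op, fields in old.items())
```

Here `is_backward_compatible(V1_2, V1_3)` holds because 1.3 only adds a field, while dropping a promised field or operation would fail the check.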
A
Very nice. So yeah, I don't know how many demos you have; I know you always have more than we can ask for. So maybe just a last question, and then, if you want to go into a more demonstration-oriented section, we can switch to that. I've worked in the past with tools that allow you to do mocks and virtualize your data and such things, and one of the issues, again, was deployment.
A
It was not very easy to deploy these things in a live environment so that every developer could get his own version in a very tiny environment and such things; it needed traditional application servers, which, you know, in the enterprise is not something you can get very easily. You have to open tickets and such things. So, you know, we are at OpenShift Coffee Break, we like to talk about containers, about Kubernetes: tell us a bit more about how Microcks makes that much easier.
C
So obviously this is why we wanted to have something that is Kubernetes-native, because we want to make the deployment very, very easy. And in fact it's very flexible, so it can be deployed on Kubernetes or on OpenShift using operators.
C
You can also deploy Microcks on your laptop using a simple podman-compose or docker-compose file. So it's very, very lightweight, and the nice thing is that this opens the door to many different use cases and deployment topologies. For example, we have users that are deploying Microcks as a central instance for the organization.
C
It is always up and running and it gathers all the API definitions of the organization. But some of our users are just popping up ephemeral instances of Microcks within their CI/CD pipelines, so that they are able to dynamically mock their environment, but in a very ephemeral way, just for the duration of the CI/CD pipeline run. So, as you will see, it's very flexible.
C
All good? Is it okay? Okay, just one more thing, because we didn't say a word about it: one nice thing about Microcks, which we absolutely wanted to implement, is that it's not a tool dedicated only to REST APIs or SOAP web services or whatever.
C
We truly want to embrace the diversity that exists in big organizations. So at the moment we are supporting all these different kinds of APIs: Microcks is not just for REST APIs, you can also simulate and test gRPC services, or async APIs that are based on Kafka or MQTT or WebSocket, and so on. We now also support GraphQL mocking with Microcks. And the nice thing is that you have the same approach, the same tool, whatever kind of API you are dealing with.
C
Okay, so now let's jump to the demonstration. Just for starting, I have an OpenShift cluster at hand, okay, and maybe we can just start with deploying a new Microcks instance, just to show you how easy it is. So I can create a project.
D
So is there a sort of multi-tenancy management, or is it better, in terms of resource consumption, to have one installation for each team or tenant?
C
Yeah, it really depends, and both deployment options are possible. I'll just create this and let it go. We have multi-tenancy built into Microcks, so you have the ability, if you have a huge repository, to segment and isolate different parts of this repository, be it using a domain topology, or by applications, or by teams, whatever.
C
So you can do it that way, or, we also have users in the community that are deploying, for example, a Microcks instance for each and every business unit, using those as work instances, and then having a promotion process in order to populate a more central repository with just the public APIs that should be used by the developers. So both work; it really depends on your organization and flow.
B
Just another question along those lines: can you envision using it as part of a service mesh? So there is just one instance, which is part of a service mesh, that's mocking all the APIs that haven't been developed yet.
C
Yes, exactly, this is definitely one of the use cases. We have users in the community that are starting new projects, setting everything up in the service mesh, putting virtual services in place and, for instance, starting their project with virtual services pointing to the Microcks instance. Then, as the implementations of the microservices become ready, they switch the virtual services to the real live instances. So this is also a definitely valid use case.
A
You can also use some tricks, like using headers, to say: if you put a specific header, then you're going to hit the mock.
A
And if you don't, then you go straight to the real service and such things. Exactly. It allows you to do some nice advanced testing features and stuff. Cool.
C
So now, switching to the developer perspective, you can see it has been deployed. It is a set of different microservices, and just here you have the access URL. Maybe the Keycloak instance is not yet ready, but just to show you, here are the results you get at the end: once you are logged in, you are accessing the dashboard.
C
Okay, we show you a little graph of how your API repository is composed, and you can see that we already have different kinds of APIs within it, and you have your API catalog just right here, with the different APIs you have imported into the repository, okay.
C
Maybe let's start with an example, just to show you how you can use Microcks; let this start up. Okay, this is ready. So, how to start with Microcks? Of course we've got a lot of samples on the website. But let's say, as you said before, Jafar, we are API designers and API developers and we are starting with a contract, okay.
C
So this is here a simple OpenAPI contract. This one is in YAML, but you may have used a WYSIWYG designer if you want, and we'll find the typical elements of a contract here, so we'll find the different paths. By the way, this is an API that is dealing with pastries, okay, I love pastries. So this is a pastry catalog, and in this contract you define the different operations: you can get all pastries, or you can get a pastry using its name, and update existing pastries, okay.
C
These are French pastries: baba au rhum, divorcés, strawberry tartlets, okay. So that's the only thing we need to get Microcks up and running: once you have added samples, Microcks is able to generate a live API for you. So let's get back to the Microcks UI.
A
So Laurent, I will have a question about this API file. Did you create it by hand? You said we can use some tools to generate it, but do you have something that can bootstrap a sample OpenAPI file, for example just the structure, so you can start easily?
C
Exactly what you want: one of my favorite tools is Apicurio, which is also an open-source tool sponsored by Red Hat. So, for example, if you are working with Apicurio: Apicurio is a WYSIWYG designer for API contracts.
C
You can just add new examples and edit existing examples in a very, yes, WYSIWYG way.
C
So, let's get back to Microcks.
C
A key point is that each operation uses a different dispatching rule, so that you do not have a dummy behavior; we try to have the smartest behavior possible. So, for example, if you are looking at the second operation, retrieving a pastry using its name, you can see that we have this Eclair Cafe pastry with a JSON representation and an XML one.
C
By default this is the XML one, but if you add the Accept: application/json header, this is the JSON one. And the same thing here if you want to have not the Eclair Cafe but, say, the Millefeuille.
C
So we really try to have the smartest behavior possible, with what we call dispatching rules, okay, and with different behaviors depending on the kind of API you have to deal with. With REST APIs it performs content negotiation; with GraphQL APIs it performs attribute filtering or alias queries, and so on.
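What such a dispatching rule does can be sketched as follows (hypothetical Python, not the actual Microcks dispatcher; the recorded examples are invented):

```python
# Hypothetical sketch of dispatching between recorded examples using
# content negotiation: the mock picks a response based on the path
# parameter and the Accept header of the request.
EXAMPLES = {
    ("Eclair Cafe", "application/json"): '{"name": "Eclair Cafe", "price": 2.5}',
    ("Eclair Cafe", "text/xml"): "<pastry><name>Eclair Cafe</name></pastry>",
}

def dispatch(name: str, accept: str = "application/json") -> str:
    """Return the example matching both the pastry name and the
    requested media type; fall back to the JSON example (assumed to
    always exist here) for unknown media types."""
    return EXAMPLES.get((name, accept), EXAMPLES[(name, "application/json")])
```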
D
Since we mentioned the testing of the contracts: in terms of the security part of the contracts, I'm pretty sure that in Microcks you already thought about it. Could you give some more details about these aspects of API testing?
C
We support it in the sense that, if you want to test a secured API, using OAuth or OpenID Connect for example, then, when running a test with Microcks, you can provide the bearer token it will have to use, and it will inject this bearer token into the different requests it issues to the system under test.
C
Exactly, there is a way; this is a very nice question, let me show you. The problem we face with POST requests is that most of the time we want to create a new resource, and to create it with different properties, different attributes, okay. And we want some kind of dynamic behavior, because we want the mock to return what we sent in the request.
C
This is something we handle in Microcks with what we call response templating, okay. So, for example, I have here an API that is about user registration.
C
Okay, so I can create a new user; it only has the POST operation, and you can see that I have a sample here that creates a new user called Chuck Norris. But I don't want a static response, I want a dynamic one, and in fact I can use some templating expressions within my response to specify this dynamic behavior.
C
It reuses the incoming data from the request: here I will re-inject the full name, I will re-inject the email address, and I will also add an extra field, the current timestamp, for the registration date. Okay, so this is something we really support. Let me show you an example of this. I have the payload copy-pasted here, so I can use the same mock URL right here, and just copy-paste again here, and you can see that I am specifying some new data.
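The templating behavior described here can be sketched like this (hypothetical Python using `string.Template`, not Microcks's actual template syntax; the field names follow the demo):

```python
# Hypothetical sketch of response templating: instead of returning a
# static example, the mock re-injects fields from the incoming POST
# body and adds a server-generated timestamp.
from datetime import datetime, timezone
from string import Template

RESPONSE_TEMPLATE = Template(
    '{"fullName": "$fullName", "email": "$email", "registeredAt": "$now"}'
)

def mock_register_user(request_body: dict) -> str:
    """Build a dynamic mock response from the request payload."""
    return RESPONSE_TEMPLATE.substitute(
        fullName=request_body["fullName"],
        email=request_body["email"],
        now=datetime.now(timezone.utc).isoformat(),
    )
```

Posting a different body yields a different response, which is the dynamic behavior the demo shows.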
A
Very nice, thank you, Laurent. So there's another question that I think I already know the answer to, but I will ask it because it's been asked in the chat. Somebody asks: okay, so we did the POST, how would that work?
A
How do we store the data? And I'm guessing what the intent here is; the intent, I guess, is: okay, I did a POST, so, for example, for your pastry API I added a new pastry, and then I want to do a GET and check that the POST has worked. But I think I know the answer, and probably the answer is that this is not the intent, to chain a POST and a GET; the intent is to validate unitary endpoints, and not to have a functional flow with this type of... exactly.
C
Across the different operations we do not keep state on the server side; we just use the examples, the data sets you provide. What we really want to ensure is that the contract, the syntactic and the semantic contract, and also the business rules regarding the contract, are valid. We do not go the way of keeping track of what has been recorded and what can be reused. It's just pure contract testing, okay.
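The stateless, per-operation checking described here can be illustrated with a small sketch (hypothetical Python, not the Microcks validator; the schema is invented):

```python
# Hypothetical sketch of stateless contract checking: verify that a
# response observed from the system under test carries the fields and
# types the contract declares, without keeping any server-side state.
PASTRY_SCHEMA = {"name": str, "price": float}

def conformance_violations(response: dict, schema: dict) -> list:
    """Return a list of violations; an empty list means conformant."""
    violations = []
    for field, expected_type in schema.items():
        if field not in response:
            violations.append(f"missing field: {field}")
        elif not isinstance(response[field], expected_type):
            violations.append(f"wrong type for {field}")
    return violations
```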
D
Sounds like you have big fun with it, yeah.
D
I do have one more question; if you want to answer later, that's okay. It's regarding something that is probably not so evident if we look only at the contract part of the testing. It's not so unusual in big integration projects to have more than 100 APIs that you need to test. What about the scalability of this?
D
Not the scalability of Microcks itself, but the scalability of this kind of process, API mocking and testing, in terms of control. How could Microcks help? Because I can imagine that if every developer starts to spawn his own pod or pods to test their APIs, that could be an issue for the platform.
C
Yes, it could be an issue from different points of view. It could be an issue from the resource-consumption point of view, but it can also be an issue from the drift-risk point of view: I think that if everyone starts to work with his own local definitions of mocks, trouble may arrive very soon.
C
So the answer, the solution that Microcks brings to the table, is that, because it's based on Kubernetes and because it has a very flexible deployment model, it is very scalable. We really have users in the community that are dealing with three or four hundred API definitions within the same repository without any scalability issues. That's the answer to the first point of view. And regarding the drift risk: what we absolutely promote is the ability to plug in and sync everything with the source of truth, which should be your Git repository.
C
So even if you decide to go the way where you deploy different Microcks instances, as long as you use the Git repository connector, this is not a problem, because every instance will still be in sync, and every developer, every team, will still receive the different updates of the different APIs and data sets. So everything should be fine.
C
This GitOps approach is really nice: it can be used to pop up already-populated Microcks instances, but it can also be used to ensure that, yes, all mock definitions stay in sync.
A
So, for instance, if you have three developers who have their own instances in their namespaces, and they sync with the Git repo, they're going to be updated afterwards.
C
And, just as a teaser, one thing we are working on is the concept of a hub, okay.
C
Okay, I can add an import job in my instance and be sure that I will always be in sync with the standard mocks, or I can just do a direct import once, okay. And here I have in my instance, in my repository, all the Open Banking contracts and the Open Banking mocks and tests, if you want to develop an application using Open Banking, for example. So it's really, really easy to integrate; we want to make it very easy.
B
Well, one further question regarding the suggestion of having a sort of centralized Microcks that would work for all the related projects, say, again, in the context of, for example, a service mesh. You've got projects that are all part of the service mesh, you've got multiple teams working on subsets of those microservices and their related contracts and whatever definitions they're having. You mentioned earlier that Microcks uses Keycloak; I just wanted to explore what it is used for.
C
Yeah, so yes, definitely. Keycloak is used for authentication, so you can deploy Keycloak as part of Microcks, or you can reuse an existing instance of Keycloak if you already have a realm set up or something. We are also using Keycloak for role-based access control, and finally we are using Keycloak to enable multi-tenancy.
C
That's what I was getting to, exactly. So, using the Microcks interface, you can classify your different APIs using whatever topology you want: it could be domains, for example, it could be teams, and so on. And with that you have the ability to define role-based access control on these domains, so you may have one user that is a content administrator for one domain, for the authentication domain for example, but is just a regular viewer of the other domains.
C
Exactly. So, if you have a look at the administration part: I have just three users, and I have the ability to view roles, okay, and manage groups. This way, when I want to manage groups, I will be able to assign roles in the different domains of my repository.
D
The next feature should be, and this is just a request for change: if you are testing a pastries API, you should be able to send all the attendees pastries.
C
Yeah, exactly, yeah. So, for the end of the session, I just want to highlight one nice feature. For the moment we just dealt with REST APIs, but with Microcks we also support async APIs, okay, asynchronous APIs. These are the APIs you want to define, to design, on top of Kafka, on top of AMQP brokers, MQTT brokers, and so on, and within Microcks we also support all these different protocols.
C
As an example here, you can imagine we have a user-signed-up API: an API where, once a user has been registered into a system, it will just emit an event to say a new user has been registered, okay. So you can define an operation using the AsyncAPI specification, and you can see that we support different bindings; here the default binding is Kafka.
C
That way, the consumers of the event do not have to wait until the implementation or the connectors really emit the event, and can start playing around with events as if they were real events. So, as an example here, you can see that I am connected to this Kafka endpoint.
C
Okay, and listening to this topic that is used by Microcks to publish mocks, you can see that every three seconds I receive new random messages and can start playing around with the events as if they were live events.
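The mock event stream can be pictured, roughly, as an endless generator of sample payloads (hypothetical Python, not Microcks internals; in the real setup these would be published to the Kafka topic on a timer):

```python
import itertools
import json

# Hypothetical sketch of AsyncAPI mocking: yield an endless stream of
# sample "user signed up" payloads so that consumers can be developed
# without any broker or real producer running.
def mock_user_signed_up_events():
    for i in itertools.count():
        yield json.dumps({"id": i, "fullName": f"User {i}", "source": "mock"})

# A consumer can grab a few messages on demand:
gen = mock_user_signed_up_events()
first_three = [next(gen) for _ in range(3)]
```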
C
Yeah, and something about the testing: let's finish with fireworks, okay. So let me show you one slide. Where is my slide? Just right here, yes, let's see.
C
This kind of system is very, very common, okay: we have a user-registration microservice that is exposing a REST API, okay, to create a new user, and once a user has been created, it emits a new asynchronous event that may be consumed by different systems in the organization: the CRM, the marketing system, and so on.
C
So of course we can use OpenAPI, as we have seen before, to specify this API, but we can also use AsyncAPI to specify the messages that are sent on the topic once the user is registered.
C
This is my user-registration demo, and this is the kind of pipeline you may define using Microcks, okay. You want to deploy your application, and then you have two parallel branches: on one branch you will test the OpenAPI part using Microcks, and on the other branch you will test the AsyncAPI stuff; you will connect to the broker.
C
You see that I have a test running in Microcks, and a test in Microcks is just connecting to this endpoint, okay, gathering all the incoming messages and validating those messages using the contract you're providing.
A
Yeah, so it's not checking the input against the output; it's checking that the input is valid according to the schema, and the output as well.
A
Yeah, so I'm a bit curious about that step: did you create a custom task, or is it... yes?
C
You can install this custom task into your cluster, and then it's just a bunch of parameters: you can specify the API name and version you want to test, you specify the test endpoint, okay, and the testing strategy, because different testing strategies exist in Microcks. So here I want to test the compliance of this API with its OpenAPI schema.
A
It
could
be
a
nice
announcement
yeah.
That
would
be
a
suggestion
because,
because
one
of
the
from
looking
at
this,
you
have
a
nice
understanding
of
the
testing
scenario
from
the
pipeline
perspective.
A
But if you just look at it from the Microcks perspective, you don't have an understanding of the whole scenario that has been tested. Exactly. If you go back to the pipeline and link to the test results, then it's going to help you assess basically the whole scenario. Yeah.
A
And maybe even from Microcks you could link back to the pipeline URL or something like that. Yeah.
C
A very nice idea; yes, it could be nice, yes. Because here I just have the ID of the test, the ID that you can find right here in the URL, okay, yeah.
A
Yeah, yes, I'm guessing it's just a matter of concatenating the Microcks URL.
D
Yeah, it was a great time, and thanks: Microcks is really cool, even more than I had understood from just reading your website. Thanks for the demo, thanks for the details. I will just give you some announcements regarding the next session: next week we are going to have a guest speaking about one of the new things coming in Red Hat, that is, Red Hat OpenShift Data Science.
A
Okay, so thank you very much, Laurent. That was an awesome demo and session. We'll probably call you back once in a while to have more info about, like, the pipelines and how Microcks integrates into a CI/CD pipeline, maybe in a Tekton series, because we are also going to have some Tekton enablement show episodes here. So thank you very much again.
A
Yeah, always a pleasure. Don't forget to subscribe, don't forget to go to OpenShift.tv to get the content for all the shows that we're going to have today. And if you are interested in knowing how to move from Podman local development to OpenShift, with podman generate, which allows you to generate Kube artifacts, check out our Level Up show this afternoon, where I'll be doing some demos about that at 3 pm CET. All right, thank you, guys.