From YouTube: Spec 3.0 meeting (February 15, 2023)
A
Does anyone have anything they want to start off with before jumping into the agenda? Samir, maybe you want to introduce yourself to the group.
B
Yeah, so I'm a newcomer on the AsyncAPI side at Postman. I'm here to help the team and support them with anything related to it.
B
My background is Engineering Management. Before, I was helping different organizations with their API program and strategy, and I will also contribute to the AsyncAPI initiative. So I'm really happy to be here with you guys, and I will just listen, learn from you, and see what I can help with.
A
All right. Lucas, do you want to take your issue first? I can recap afterwards, because I think it's a bit more important to bring that up.
C
First of all, there's no support for Proto or XSD or anything else that is not supported by $ref. And the other one was that currently you can put schemas in components only if they are AsyncAPI schemas; if you have Avro or RAML or OpenAPI or JSON Schema, you cannot put them in components.
C
You
can
put
them
in
separate
files,
of
course,
but
you
can't
put
them
in
components:
components
on
the
support,
doesn't
API
schemas,
so
I
basically
promised
to
Jesse
that
I'll
take
a
look
into
the
pending
issues.
We
had
related
to
it.
So
I'd
like
to
talk
about
it
now,
but
so
that's
the
context
in
loud.
Let
me
know
first,
maybe
share
a
screen,
so
you
have
to
give
me
a
second
because
I
wasn't
I
didn't
know
that
I
would
show
it
now,
but
it's.
D
C
Okay, I think I have it. Where's the bottom?
D
C
Okay, so that's the issue I was checking out, and where we are. Last time when we talked, I think the conclusion was that adding support for Proto or XSD or any other format not supported by $ref has a high probability of not being a breaking change; it's just going to be another feature. So that's why I was looking into it, out of the whole list of different issues.
C
I was basically looking at the old proposal from Machi. The discussions in the issue drifted to some different topics, but the main goal of the proposal was to actually support schemas other than the AsyncAPI schema, which is a superset of JSON Schema, in the components.
C
So I'll just share the final example, because I don't want to make it a boring statement from my side. I'll go through the final example, which I updated just before the meeting to change it from 2.0 to 3.0, and I'll take comments from there. The example is, I think, one-to-one with the proposal from Machi.
C
So
basically
the
idea
is
that,
like
So,
currently
in
payload,
do
you
want
me
to
show
current
as
well
like
next
to
maybe
I'll?
Show
you
one
next
to
each
other
like
do
you
prefer
me
when
I'm
gonna
show
the
proposal
and
how
it
currently
looks
like
in
the.
D
E
C
Okay, cool, I think I will.
C
I mean, I applied the 3.0 structure here, but what actually matters is the messages, and those did not change much in 3.0. So currently, when you go to channels, you use a message. Let's take a look at the structure of the message.
C
The payload is directly a schema, right? So in payload you either use a reference to a schema or you inline the schema directly. And the same goes for schemas in components: you just have a map with a string key, which is the name of the schema, and then the schema itself. So far so good, I hope.
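The current shape being described might look like this (a hedged sketch; the channel, message, and schema names are made up for illustration):

```yaml
# Current (pre-proposal) shape: payload is directly a schema,
# and components.schemas is a map of name -> schema.
channels:
  userSignedup:
    subscribe:
      message:
        payload:
          $ref: '#/components/schemas/userSignedUp'  # reference a schema...
          # ...or inline the schema object right here instead
components:
  schemas:
    userSignedUp:        # string key = the schema's name
      type: object
      properties:
        email:
          type: string
```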
C
So the idea from Machi, and correct me if I'm wrong but I think it's the same as you proposed, concerns the payload. Or is it something you don't see in this example? Do we have Avro here?
C
Oh, so it's a Kafka example without Avro. Normally the majority of people using Kafka use Avro, so what they have to do, if they reference a payload that is an Avro payload, is also specify a property here called schemaFormat.
C
And, excuse my sound, in the schemaFormat you specify Avro, whatever, the long string that I will never remember how to write. Actually, I can copy it from here.
C
And is it...
A
D
C
Oh yeah, I'm not going to be able to use it, because it's not possible to inline Avro in components. So let's just ignore the errors; let's assume they are not there. The most important thing is how it was in the previous version: on the same level as payload, you specify the schemaFormat, and that's how it works now.
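In 2.x terms, the shape being recalled here is roughly this (a sketch; the file name is invented, and the exact schemaFormat string should be checked against the spec):

```yaml
# 2.x shape: schemaFormat sits at the message level, next to payload.
message:
  schemaFormat: 'application/vnd.apache.avro;version=1.9.0'
  payload:
    $ref: './schemas/userSignedUp.avsc'   # external Avro file
```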
C
The
the
idea
of
The
Proposal
is
to
have
actually
so
oh
yeah
and
important
to
know
that
from
the
spec
perspective,
this
is
called
schema
object
if
it
like,
we
use
terminology
from
the
from
the
spec,
so
you
can
have
in
a
payload.
You
specify
a
schema
object,
but
the
idea
is
that
a
schema
object
is
not
a
async
API
schema,
but
schema
object
is
an
object
with
two
values:
schema
format
and
schema.
So,
basically,
schema
format.
C
It is an object with two keys: schemaFormat, and then again the schema, so one level further down you put the schema. And in the case of components, the same thing: you have the schemas key, and then you don't see a schema directly; you just say schema, and if a schemaFormat is needed, because it's Avro for example, then you put it next to the schema.
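The proposed wrapper could be sketched like this (hedged; the names are illustrative, and inlining Avro in components is exactly what the change would newly allow):

```yaml
# Proposal: a Schema Object becomes a wrapper with two keys,
# schemaFormat and schema, both in payload and in components.
message:
  payload:
    schemaFormat: 'application/vnd.apache.avro;version=1.9.0'
    schema:
      $ref: './schemas/userSignedUp.avsc'
components:
  schemas:
    userSignedUp:
      schemaFormat: 'application/vnd.apache.avro;version=1.9.0'
      schema:            # Avro can now live inline in components
        type: record
        name: UserSignedUp
        fields:
          - name: email
            type: string
```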
C
So
that's
that's
the
the
core
of
the
idea,
so
schema
object
is
not
any
longer
a
async,
API
schema
and
and
what
else
and
yeah
and
the
what
we
knew
as
schema
object.
It
becomes
literally
a
async
API
schema
object,
as
it's
described
in
the
in
the
spec
and
yeah
the
only
disadvantage.
C
We
could
say
that
it's
a
disadvantage,
but
it
can
be
fixed
with
tooling.
Is
that
the
reference?
Then,
if
we
reference
a
oh,
actually,
that's
a
I
quickly,
fixed
it.
But
it's
an
error.
C
So
this
is
actually
I
added
it
later,
but
yeah.
So
basically,
if
you
can
see
the
reference
when
you
reference
a
the
whole
object,
you
points
to
a
to
the
whole
schema
object.
But
if
you
the
the
disadvantages.
C
When
you
want
to
reference
to
a
specific
schema,
so
then
schema.
E
C
So you cannot have a magical solution that takes the name of the schema from the reference. But we have these issues anyway now with anonymous schemas, and in the tooling we can anyway assign IDs to the schema, like an x-parser-id or whatever extension. So I don't believe it's a huge problem, a huge disadvantage.
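The referencing wrinkle could be sketched as follows (assuming the proposed wrapper with schemaFormat and schema keys; the pointer paths are illustrative):

```yaml
# A $ref to the name now resolves to the wrapper, not the schema:
payload:
  $ref: '#/components/schemas/userSignedUp'         # whole wrapper
# Pointing at the bare schema needs an extra /schema segment:
headers:
  $ref: '#/components/schemas/userSignedUp/schema'  # schema only
```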
C
So
when
I,
when
I
looked
quickly
on
it
and
I,
think
it
was
reviewed
by
many
people
because,
as
I
said,
it's
an
old
proposal
from
from
Machi,
so
I
haven't
seen
anything
bad
about
it.
That
would
make
cause
issues
and
the
advantage
is
that
like
then
it's
it's
still
ready
for
adding
any
other
support
for
Avro
or
for
Proto
or
or
whatever
else
we
want
in
the
future
and
before
I
I.
Let
others
to
ask
questions
like
Machi.
Is
there
anything
I'm
missing?
F
Yeah, okay, no problem. We had this discussion, I don't remember when, but in previous meetings, maybe in the previous year, that the first problem is this nested referencing. I mean, if you want to reference another JSON schema, another schema, from a component schema, you need to use this schema suffix, yeah.
D
F
You cannot use the schema with the schema and schemaFormat inside it instead. And another problem: besides payload, message payload, we can also use schemas inside the headers, and those need to be JSON Schema. People can think: okay, if I can define, for example, an Avro schema or RAML inside the component schemas, maybe I also have the ability to use that reference in the headers. And we also have the parameters.
D
F
The next problem: some fields in the bindings, like the Kafka key in the Kafka binding, as I remember, can also be described by a schema. Okay, so that's another problem: people will probably think, okay, if I use Avro in the payload, I should also be able to use Avro in, you know, the bindings, or maybe also extensions. And yeah, that's the part... I mean, it's not problematic in the sense that, if you define that in a given place, you can define the schema, yeah, the schema object.
F
Therefore, you can also describe this: you can use the type record, fields, describe the strings, yes. But at the end, after parsing, you will have the JSON schema, which will be type object. So at the end, yes, you can also use them.
E
I mean, with headers it maybe even makes sense that we let people use Avro or any other format to validate headers, just for headers, because it's part of the message; it's at the same level as the payload, maybe. So yeah. But for channel parameters, for instance...
E
What I would do is not support this new schema object there, but instead support just, whatever you call it, the AsyncAPI Schema Object, or, you know, the JSON-Schema-specific type, the one that we're using now, right? So I would limit it to the one that doesn't have schema and schemaFormat inside, the one that is the schema directly. I would use this one for the channel parameters, and I would make it clear: no...
E
You
cannot,
you
cannot
use,
or
maybe
we
make
it
clear.
You
can
reference.
This
kind
of
new
schema
objects
with
schema
format
and
schema
keys,
but
we
make
it
clear
that
only
Json,
schema
or
type
of
Kinder
Jason
schema
that
we
have
it's
supported
right.
So.
A
E
Because the parameters you send in there are strictly related to AsyncAPI.
E
The payload has nothing to do with AsyncAPI. It is something that might be shared with other systems: how you define your message in other systems, like a schema registry or something like that. So you might want to use schema registry definitions using Avro for the headers and for the payload, but no one in their right mind will use a schema registry to define the parameters.
E
But, you know, headers and payload are a resource that is shared across multiple tools and services, including, you know, the schema definitions.
E
A
D
A
E
A user that hasn't seen, I think, what you showed before... it's going to be confusing for sure, many parts of it. So that is just not really solving anything.
B
E
There might be the case where some people are migrating from one language to another, and they might end up with a mix in the meantime, while they're migrating from one to the other. So...
B
E
D
E
It could be defined at the root level, for the whole document, like...
E
C
E
Yeah, but that's a different... so, yeah. I mean, in many cases we could have it like that, that's true, so that will also probably help. And that's another idea: we define it at the root level. The whole document would have to be either everything Avro, or everything JSON Schema, or everything whatever format. So we don't have to be including all these things, but...
A
One of the cool things is that you can mix them, if you have different definitions of those types in components. Because if you have an application that uses both REST and Kafka, for example, then you might have a mix between OpenAPI components and Avro schemas, and you wouldn't be able to do that if you limit it to just one specific format.
E
C
F
E
If we put it on the root level, a default schema format... not default, we will not call it default, we will call it schemaFormat, right. This is the one that will rule for the whole document; you can never override it on any single level, not like the content type. So if there is a root schemaFormat, then we will not need schemaFormat and schema inside the payload or inside the schema definition.
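The root-level alternative being floated might look like this (purely a sketch of the idea under discussion; a root-level schemaFormat field does not exist in the spec as-is):

```yaml
# One schemaFormat ruling the whole document: every schema in the
# file is interpreted in this format, with no per-schema wrapper.
asyncapi: 3.0.0
schemaFormat: 'application/vnd.apache.avro;version=1.9.0'
components:
  schemas:
    userSignedUp:        # read as Avro because of the root field
      type: record
      name: UserSignedUp
      fields:
        - name: email
          type: string
```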
E
We will admit that we could continue as it is, and the parser will interpret these objects depending on what it says at the root level. You know, it will interpret whether the schema is an AsyncAPI schema object, or Avro, or whatever, depending on the root field, right. So it's a little bit less flexible, but the reality is that maybe nobody needs so much flexibility.
F
A
C
Just to clarify: so we're talking now about a default schemaFormat as an alternative solution for having support for Avro, for example, in components.
E
F
F
F
E
My question as well is: what is the probability that you want to reuse stuff in components parameters? So think about it: you probably have a bunch of parameters that look mostly the same or exactly the same, right? But they're all scalar values, right? They're strings, they're numbers; it cannot be an object.
E
It has to be, probably, a string, a set of allowed strings, a string that has some pattern applied to it, a number between limits, something like that. Scalar values, right? And we have components parameters, right? So if the parameter is the same everywhere... we do have that, right? I think, yeah, I...
E
We have components parameters. So if we have four parameters that look the same, let's make them point to the same parameter under components parameters. What is the probability that one of these parameters inside components parameters is going to reference a schema under components schemas? It's really...
E
It's really weird. I've never seen that; I've never had this need, right? So probably we are overthinking the parameters one here, and we don't need reuse there, because what we need to reuse is the whole parameter, not chunks of the parameter definition.
C
E
Well, that was my suggestion as well. And another option is that we don't accept full schemas there; we do some kind of subset of schema and we only allow certain things in there. But even if we don't do it like this, in any case we don't accept schemas on the parameters; what we accept is a parameter.
E
But my point is that in this case it will not impact the issue that we're discussing here, so that will not be an issue. We can still change this in any way; parameters will not be affected.
C
E
But if, instead of referencing from the parameters to a components schema, we reference components parameters, then the problem is gone, right? And, of course, components parameters would only support one kind of schema, which is the schema that we support right there.
A
So
at
the
moment
you
can
both
reference:
a
property,
that's
under
the
computer,
like
the
same
that
you
have
in
studio
at
the
moment
where
you
have
the
street
street
light
ID
parameter,
but
at
the
moment
you
can
also
reference
the
schema
of
the
property
or
of
the
parameter.
Is
that
what
you
want
to
limit?
So,
instead
of
having
a
reference
there,
you
cannot,
you
can
only
specify
a
schema
or.
E
No, no, no, no! You can, but you cannot point to components schemas, because it will not be the type of schema that you can put there, right? It's not a valid schema, because that new schema that we're introducing will have the schemaFormat and schema properties, right. Instead, what you would have to do is reference, from the line 147 in that example, on schema, some components parameters, right, which will be defined as the current schema that we have right now. Yes.
A
D
E
You can reference, if you want; you can reference using the /schema at the end, but it will only work if it's a JSON Schema definition, right.
E
You know, so you're basically risking that at some point this stops working; you know, you're not pointing to the right place. I think parameter definitions shouldn't be there, so they shouldn't be under this definition, shouldn't be on the... can I say...
B
D
E
D
E
We already faced that problem in the UI: when we reference documentation, we have to check if there is a description field, like the one on line 146, or if there is a description field inside the schema...
F
E
Itself. It would be just an object that we create ourselves, with description and the type, which could be string or number, probably.
C
E
Yeah, we can add pattern, we can add, you know, the stuff that we know is going to be useful. But then we don't have to deal with things like: it might be an object, it might be an array, it might be a Boolean. Like, how do you serialize this, right? There are many ways to serialize that, right? So how do you signal this? And we're not defining this now.
E
So maybe we should simplify this object as well, because think about it: in most of the cases, I want it to be a pattern, I want it to be a set of specific, you know, options, or a number between two limits.
E
I don't know, something like that. But how many times will you want to do more complex stuff there on the channel parameter? I don't think we need it. So probably... and if you need it, that's cool; you can request the functionality in the future and we'll add it, and it will not be a breaking change; it will be an extension. So we can keep adding more stuff in 3.1, 3.2 if we forgot about something, right. So that can be done.
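A simplified channel parameter object, as discussed, might be sketched like this (a hypothetical shape, not the current spec; the keyword names are assumptions):

```yaml
# Channel parameter reduced to scalar-friendly keywords only:
# no full schema object and no schemaFormat.
channels:
  user/{userId}/signedup:
    parameters:
      userId:
        description: Id of the user.
        enum: ['admin', 'customer']   # a set of allowed strings...
        # pattern: '^[a-z0-9]+$'      # ...or a pattern
        default: 'customer'
```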
E
Basically, we probably have to forget about channel parameters in this discussion, in this whole discussion, because it doesn't have to be pointing to a schema at all, right. It's solved then; it's not needed, just like we don't have server variables pointing to a schema. It's not needed. So if we ever have a problem with what we do with the channel parameters...
E
Now, where we were before: we have two ways of doing this so far. Either we include schemaFormat and schema in each of the schema definitions, or we make a default schemaFormat, a schemaFormat at the root level, right, and it's the same for everything. In such a case, if we make it the root...
E
It's cool. And, on the other hand, there is a side effect that many people might not even expect, right: something that is defined somewhere else, not even related to the schema definition, is affecting the schema definitions. So on one hand it's good, it's convenient, and on the other hand it's like: careful. Same as with content type, right? We call it defaultContentType, but yeah.
F
We can fix that for parameters, yeah. I mean, but the problem is that, for example, here, I posted it in the chat in Zoom: now we support parsing the Avro schemas in the bindings, and my question is whether people will also be able to use that schemaFormat, another one besides JSON Schema, in bindings and extensions. Because in some places you need it; in the HTTP binding we probably have some place where we need to define the JSON schema, but...
C
But also, for the bindings it's actually going to make bindings' life easier as well: they can just reuse the schema objects, they can reuse the...
C
...object that will be created for parameters or whatever. Because the thing is, even now in the WebSocket binding you have, it's not called path, I don't remember the name, but basically in the binding you can specify something that you can't do in the channel, which means, you know, query parameters. Yeah, that's what I meant.
C
I don't recall if there are any validation rules; I don't think there are, because you don't have a message there in the binding, that's on the main spec level. So when it comes to the schema object, whenever it's used in the bindings, Kafka or WebSocket, it's mainly for these simple values where in the end you just use const, you know, and nothing else.
F
C
Again, it's the same situation as with the parameters: why would you use type object to describe the... I mean, I think I did it in one of the examples, but to provide the descriptions.
E
F
Yeah, and still, but, you know, the extension could, for example, allow some of this; the tool...
F
D
F
E
E
That format in that place, right. We will have a way to say: this is a JSON Schema object, and this is an Avro object, and this is an AsyncAPI schema object. Or, you know, the creator, the extension creator, will be able to say: I want this to be of this specific type. That's how the parser will know how to parse it, right, because it will be defined there.
A
From a validation point of view as well, it's rather easy to adapt the JSON Schema documents we have to conditionally apply different schemas. So if the schemaFormat is Avro, it's validated automatically against whatever Avro defines, and otherwise it's the other format. So it shouldn't be a problem, yeah.
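The conditional validation idea can be sketched with JSON Schema's if/then/else keywords (hedged; the definition names here are invented):

```yaml
# Validate the `schema` key against a different meta-schema
# depending on the value of `schemaFormat`.
if:
  properties:
    schemaFormat:
      const: 'application/vnd.apache.avro;version=1.9.0'
then:
  properties:
    schema:
      $ref: '#/definitions/avroSchema'      # Avro's meta-schema
else:
  properties:
    schema:
      $ref: '#/definitions/asyncapiSchema'  # default AsyncAPI schema
```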
C
Yeah, basically, bindings have to adjust if we make a change to the schema, yes, sure, as simple as that. In the case of extensions, yeah, they have to specify what they support; extensions, for me, are not any different from bindings anyway, same stuff. But now, what's the conclusion? Because I lost you, Fran, when you were talking about this alternative solution with components; I didn't get that one. So...
E
Why don't we have it on the root level? Like, we have schemaFormat on the root level, or the proposal for, not defaultSchemaFormat, but schemaFormat at the root level of the document, right. And if we do it like this, as somebody was saying, we just have to keep everything as it is. I mean, the structure will still be the same, because that schemaFormat will not have to be carried over into the message definition or the payload definition; it will be on the root of the document instead.
E
E
E
E
C
C
E
If we go for the solution that we proposed, then they will have to migrate all the things on their side first, before referencing either Avro or this schema; they would have to unify everything. As opposed to this solution, which would allow them to just point to whatever they have; it will be adopted. Well... no, actually, no.
C
F
E
E
E
All right. And even if someone... we can still leverage this enum schema on the tooling side. Even if it's not properly a schema, we can just create one on our side: we create the schema with type whatever we find there, enum whatever we find there, and so on, and it will be validated against it.
E
B
It would be interesting also to see it from a migration-cost perspective: for any solution that we implement, what will be the impact on the users if they adopt the solution, right.
E
B
A
E
And right now, the solution that Lucas was proposing might have a migration cost on the user side: they might need to migrate every schema they had on a schema registry. But they have a limited solution, which is: they use our converter tools, and they have a $ref pointing to a schema registry.
E
If some people have, and I'm sure that some people have, headers using JSON Schema and payload as Avro, then they will have to migrate everything that they have on the schema registry side to one of the formats. So the cost is going to be higher in that case, but only for those; for everyone else, who's not mixing formats, it's going to be much less, because, yeah, everything stays as it is.
E
A
E
E
Yeah, so it's always guessing, because we don't have numbers. At some point we might start getting some numbers on how people are using AsyncAPI, and then we could calculate the cost. But right now I'm just guessing: I'm assuming that most people who are using Kafka are using Avro, and I'm assuming that they have JSON Schema, either inline or referenced on the schema registry, for the headers.
E
Yeah, I think we have a few conversations pending, you know, before the end of February, and this is our last meeting; the next one is already on March 1st. Yes, so I think we should have this meeting again next week.
E
Because, yeah, either we agree on, you know, postponing the March 1st date, the code freeze on the spec, a little bit more until we fix this, or we agree that we just need to fix this and then that's the last thing.