From YouTube: AsyncAPI SIG meeting 19 (Mar 17, 2020)
Description
This is the recording for the AsyncAPI Special Interest Group (SIG) meeting #19.
Attendees:
- Fran Méndez
- Francesco Nobilia
- Łukasz Gornicki
Moderation:
- Łukasz Gornicki
Agenda & Notes:
https://github.com/asyncapi/asyncapi/issues/320
Chat Dump:
18:59:39 From Francesco Nobilia : Sorry guys I need to go! See you in 2 weeks time. Have a good one! =)
19:04:37 From Lukasz Gornicki : https://groups.google.com/forum/#!forum/asyncapi-users
A
B
So we have two items on the agenda today to discuss and share with you. One of the topics is the release automation we want to introduce this week, and an explanation of the workflow we designed for the release. The second topic will be the AsyncAPI roadmap. The second topic will be covered by Fran, and I will cover the first one. I think we should start with the first one, because it's pretty fast.
B
B
So what's basically happening this week is that, for now, we're starting, of course, with the first project. We're not rolling out the automated release to all the libraries we have, for the good reason that we want to have it tested first in one repo and then roll it out to others. With this precursor we're going to introduce automated releases done by a technical bot: no longer publishing to Docker Hub by me or to npm by Fran, but the AsyncAPI bot is going to do the publishing to npm and Docker Hub.
B
So the most important thing with this release, with this pull request, is that to make those releases fully automated we're going to introduce Conventional Commits from now on. You can already see the commit style being used in the subject of the pull request; it's a bit different from the usual commits you can see in other repos, because it has a special prefix.
B
It means it's following the Conventional Commits specification. The prefixes are human readable, because in the end, if you know the spec, you understand them, but they are also machine readable: for automation it's pretty easy to tell the machine what kind of release should be made, and whether a release should be made at all. So: should it be major, minor, or patch, or maybe the commit is not release-triggering at all.
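The prefix-to-release-type mapping described here is easy to sketch. The following is a minimal illustration of the Conventional Commits rules mentioned in the talk (feat bumps minor, fix bumps patch, an exclamation mark means major, other prefixes trigger no release), not the actual semantic-release implementation; the function name and prefix sets are illustrative.

```python
import re

# Prefixes that trigger a release under the default rules discussed here.
RELEASING = {"feat": "minor", "fix": "patch"}

# Conventional Commit subject: type, optional (scope), optional "!", then ": ".
COMMIT_RE = re.compile(r"^(?P<type>\w+)(\((?P<scope>[^)]*)\))?(?P<bang>!)?:\s")

def release_type(subject):
    """Return 'major', 'minor', 'patch', or None for a commit subject.

    Simplified: the real spec also allows a "BREAKING CHANGE:" footer
    in the commit body to signal a major release.
    """
    m = COMMIT_RE.match(subject)
    if not m:
        return None  # not a conventional commit: treat as non-releasing
    if m.group("bang"):  # e.g. "feat!: ..." marks a breaking change
        return "major"
    # docs/chore/refactor/ci/test etc. fall through to None (no release)
    return RELEASING.get(m.group("type"))
```

For example, `release_type("feat: add MQTT bindings")` yields `"minor"`, while `release_type("docs: fix typo")` yields `None`.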
B
So that's the most important thing. The cool part about handling it within AsyncAPI is that, because we already introduced squash-and-merge functionality on pull requests, we don't really have to look into the commit messages of a pull request author. So say you made a pull request with many commits, like this PR here, which I spent a lot of time on.
B
A
B
If we go to the commit messages, you can see I had a lot of them, but you don't have to worry, because once I click on "Squash and merge" I can actually clean that up. And even if the title of the PR were bad, one of the AsyncAPI maintainers can adjust the commit message before merging. So it's pretty handy.
B
Next thing: using this opportunity I was cleaning up some scripts, but I don't think we should go into the details here of how the Docker publishing looks and so on; you can just look into the PR later. More important is that the automation is done with the semantic-release package from npm. The workflow works in the following way: we have a GitHub Action that reacts to every commit on the master branch.
B
So the pull request is merged, the commit is created, and then the first step of the workflow is the analysis: we analyze the commit message to see whether it's release-triggering or not, and what kind of release it should trigger. If the commit is not release-triggering, then we skip the rest of the steps; there is a proper condition in the workflow so that we just stop.
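A workflow of the shape described (trigger on every push to master, analyze the commits, release only when warranted) can be sketched roughly as follows. This is an illustrative configuration, not the actual AsyncAPI workflow file; the Node version, secret names, and step layout are assumptions.

```yaml
# Illustrative sketch only, not the real AsyncAPI workflow.
name: Release
on:
  push:
    branches: [master]
jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v1
        with:
          node-version: 12
      # semantic-release analyzes the commit messages on master and
      # exits without publishing when no release-triggering commit is found.
      - run: npx semantic-release
        env:
          GITHUB_TOKEN: ${{ secrets.GH_TOKEN }}
          NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
```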
B
B
You have an example here in the screenshot. It's published by the AsyncAPI bot, but that's how it's going to look: basically a list of all the commits, each with a reference, a link to the pull request, where you can see all the details of what changed. Then we go to Docker, and the last step of this workflow is creating a pull request to master in the upstream with the change in package.json. Why?
B
It's because we should respect branch protection: we should not make a direct commit to the master branch, but go through the workflow like everyone else. So we have to create a pull request with the changes in package.json, and then there's a separate workflow that handles those pull requests. It's separate because, as you can imagine, right now we don't have much testing and so on, but in the future you can imagine a pull request that has a lot of checks before it is ready to be merged.
B
So we have to account for that here; we cannot really do it as one synchronous job. We have to be prepared for long-running checks in pull requests. So there is a separate workflow where we first of all check who created the pull request. If the pull request was created by the AsyncAPI bot, then of course we automatically approve it; the approval is done by the GitHub Action. And then we have an additional job that is not triggered until the approval is done.
B
So there's a lot of time for other checks to run, and at the end the last job automatically merges the pull request. And because, thanks to Francesco, we've set a proper setting in the repo to automatically delete merged branches, we don't have to be afraid that any branches will be left over after this automation; they will be automatically deleted after merging.
C
A question, and maybe it's not an issue in your case, but let's say you have three commits. You start with master, which is empty, and then you have three new commits, and only the one in the middle should be released. The same problem exists with just two commits: the first one we don't want to release, we just want to release the second one. With this kind of automation based on the commit message, you will release the entire master, right? The entire state that is on master?

Mm-hm.
B
B
It depends on the types of the commits. Basically, Conventional Commits allows you to have commits that do not trigger a release. Looking at the prefixes: feat means you're saying there's a feature, and the minor version is bumped; fix is a bug fix, so it's a patch release; and if you add an exclamation mark, it's treated as a major.
B
But if you have some docs change, or a refactoring, or a CI configuration change, or whatever, then you just use a different prefix, like docs. I've listed a kind of cheat sheet in the contribution guide for future reference, so you can basically use chore for most of the cases, or test, or refactor, or whatever we can come up with, and those will not trigger a release. But...
C
Let's say that you are now in a big refactoring situation and you're splitting your refactoring into two or three PRs, and in between two of your PRs a fix gets merged. That will produce a new version that also contains part of your refactoring. For you it's not a problem if the refactoring is not finished, right? It would be, or could even be, a major, but it should not affect anything.
A
B
You basically treat master as a release branch, right. That's why whatever you push into master should not break anything. So even a refactor, if it's partial, should not break the library, and then it can easily be released when there's another fix coming right after our partial refactor. Okay, yeah.
A
A
The chances that something happens in the middle, like having a release-triggering commit land in the middle, are high, right. So whenever we have this situation, where we want to split something into multiple pull requests, we need to make sure that whatever is merged into master doesn't change anything substantially, I mean from the user's point of view.
A
We're talking about libraries here, not an application or a software service, right. So I think having a temporary if condition (and by temporary I mean one week, or two at most) saying "don't do it yet", you know, like a feature flag in the code saying "this will not work yet", that's fine, right.
B
A
Otherwise we would always have to look for other workflows, like having a separate branch which acts as a temporary master for this feature only, which I really dislike, because then, if you have problems fixing merge conflicts, the conflicts start accumulating in these branches and sub-branches. That can be a pain, so I don't really recommend it, but yeah, whatever works.
A
I think if we end up doing something like this, it's a problem of organizing the tasks, not so much about the commits: splitting the work in a different way, in a way that doesn't make us suffer from these situations, right. Or we have one long-lived pull request and that's it, which I also dislike, but in some cases it's the best option.
B
The only open topic is that, like I said, we're not rolling this out to the other libraries yet, like the parser or the parser plugins. We merge this first, see how it works, and then roll it out to the others, because their workflows might differ a bit; in the case of the parser, we don't have a Docker image, for example. I guess the generator is the most complicated one we're handling here.
A
A
So let's say we're announcing here, now, the new roadmap for this year. It's coming a little bit late, but yeah, it still works. Let me just proceed to invite everyone on the Google group to the document.
B
A
A
We wanted to know first where we actually want to go with the initiative. I've been asked this many times, and I was always like, "I know, I have it in my head." In my head it was clear, but it probably wasn't clear for everyone, right. Even in my head I had an idea, a big picture, but it wasn't really written down. So that's exactly why I decided to do this now, and I decided to split it into goals and milestones.
A
So each goal is composed of a series of milestones. And just to make it clear: the goals are big in scope, the things we want to accomplish for the year, and the milestones are steps inside each of the goals, things we're going to do, not individual tasks but groups of tasks, to make the goal a reality.
A
So yes, let me start. One of the things we've been working on for quite some time now is tooling. After releasing version 2 of the spec, it was time to work on something people can actually use. The spec is beautiful, I like working on the spec, but without tooling the spec is nothing, right. So the first goal is: offer stable and extensible tooling to work with AsyncAPI.
A
A
This package is under development and it hasn't reached version 1 yet, which means that it might break: we might introduce breaking changes without prior notice, right. And that was on purpose; we needed to rapidly iterate to figure out what we actually need. I mean, if you go to, let's say, the generator...
A
That's not the one, okay, so in this one we don't have it. But if you look at the version number, you will see that we didn't reach version 1 yet, and as per semantic versioning, before you reach version 1 it's allowed to introduce breaking changes; you can actually expect them. But that's something many people were not understanding: "you're introducing breaking changes", and I was saying, yeah, we're not at version 1 yet.
A
We're not offering stable tooling yet, even though we're trying our best, but sometimes it's not enough. That means one of the things we want to do first is make the jump to version 1 for the generator and the parser, so that from now on we will be developing new features without breaking things so much, now that they are stable. We know exactly what we need to do to make the packages stable.
A
Of course, at some point we will come up with something new that we need to do and we will have to release version 2, but after almost a year now of developing the tools, we more or less see what's actually needed to have a stable version, and that's the focus right now. Then we also want to focus on automating as much as possible, which is related to what Łukasz was explaining. You know, we want to automate everything, or almost everything.
A
So, if you have played enough with AsyncAPI, you might have noticed there's something called specification extensions, the same as in OpenAPI. You have this "x-" prefix, and you can introduce as many extensions as you want, and they are not validated at all; they are ignored by the parser, so you can put whatever you want there. But we see many people using similar (not the same, but similar) extensions for the same purpose.
A
So we want to have a catalog of specification extensions that could somehow be reused and somehow standardized by everyone. That means that if, at some point, we want to try something new on the spec and see how it feels, we might create an extension and make people use the extension first. If it works out well, we play with it, we break it, we do whatever we want there; it's an extension. Once we're happy with it, we can probably incorporate it into the spec. So that's the idea.
A
That's also the idea behind the extensions, right. And of course we're going to have this extensions catalog: it's a catalog where you register the extension and a JSON Schema definition to validate the information in the extension. So it's like a separate mini-spec for the extension, but it's not tied to the core spec, and it can evolve separately, at a different pace. And of course we want to use the catalog with the parser.
A
So whenever we see one of these extensions, the parser will connect to the catalog, download the definitions, and validate them. All of this optionally, right; you have to opt in, so you have to explicitly tell the parser, "I also want you to connect to the catalog and download the definitions of the extensions to be validated."
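The catalog doesn't exist yet, so the following is a purely hypothetical sketch of the behavior described: an opt-in check that looks up a schema for each "x-" extension and validates the value against it. Every name here (the stand-in catalog, the simplified type check, `validate_extensions`) is made up for illustration; a real implementation would fetch JSON Schema definitions over HTTP and run a full JSON Schema validator.

```python
# Hypothetical stand-in for the extensions catalog: extension name ->
# (very simplified) JSON Schema definition.
CATALOG = {
    "x-twitter": {"type": "string"},
    "x-number-of-replicas": {"type": "integer"},
}

# Minimal stand-in for a JSON Schema type check.
TYPES = {"string": str, "integer": int, "object": dict}

def validate_extensions(doc, use_catalog=False):
    """Return a list of validation errors for "x-" extensions in `doc`.

    Extensions are ignored (no errors) unless the caller opts in,
    mirroring the opt-in behavior described in the talk.
    """
    if not use_catalog:
        return []
    errors = []
    for key, value in doc.items():
        if not key.startswith("x-"):
            continue  # not a specification extension
        schema = CATALOG.get(key)
        if schema is None:
            continue  # unknown extensions stay unvalidated
        expected = TYPES.get(schema.get("type"))
        if expected and not isinstance(value, expected):
            errors.append(f"{key}: expected {schema['type']}")
    return errors
```

For example, `validate_extensions({"x-twitter": 42}, use_catalog=True)` reports a type error, while the same call without `use_catalog=True` silently ignores the extension.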
A
A
And also, by the end of this year, something we want to do is make the Go parser stable as well, and that's, how can I say it, actually bigger than it sounds, because the idea behind making the Go parser stable is not for people to use the Go parser. You might use the Go parser, but the focus here is: from Go we compile to C, and then from C, from any language...
A
You
can
import
this
libraries
in
see
right,
so
we
can
from
there
we
can
have
a
Java
parser.
That's
let's
say
from
the
goal:
one
we
can
have
a
Python
one.
Obviously
they're
not
gonna,
be
like
super
perfect
like
well
the
perfect
experience,
but
it's
a
way
for
us
with
the
resources
we
have
to
to
start
offering
something
to
other
languages
start
with
them,
the
JavaScript
so
yeah
any
questions
so
far.
A
A
You will have it, but not yet, yeah. We actually just re-architected everything, so this is the first step. I just wanted to have the steps clear, and then we will elaborate on this. Actually, the Go parser was the initial idea, before we created the JavaScript parser, but then we realized that generating a JavaScript parser that works in the browser, starting from the Go one, was going to be harder than we thought.
A
One,
and
that's
why
we
decided
to
make
an
exception
with
gem
script
because
of
the
browser
and
that's
why
we're
gonna
maintain
to
write
for
now
in
the
future
I
I'm,
assuming
that
we
will
have
to.
We
will
need
to
have
like
a
parser
for
each
major
language
right
and
we
will
have
to
maintain
all
of
them
separately,
but
yeah.
We
need,
we
need
to
start
somewhere,
yeah.
A
A
Performance is something we need to take into account for the parser, because it may be used in SDKs that work with messaging, so we cannot have an AsyncAPI parser that is not performant, right. That's my point. It's good to have it as a starting point, and then we'll see how it grows; we'll see the interest of people in creating a Java parser, or a Python parser, or whatever is the most demanded one. But again, we need to start somewhere.
A
So
second,
one
second
goal-
big
goal
would
say
is
make
it
easy
for
people
to
start
contributing.
We've
been
discussing
this
for
a
long
time.
I
really
like
and
we've
been
patching
things
here
and
there,
but
we
know
we
are
aware
that
it's
not
really
easy
to
start
contributing.
The
project
is
already
it's
already
too
big
in
terms
of
scope.
Right
we
have
this
bag,
we
have
parsers,
we
have
a
generator,
we
have
the
catalog
or
we
don't
have
it
yet,
but
we
will
have
it.
We
have
documentation,
we
have
lots
of
things
bindings.
A
There
are
many
things
bindings
for
each
protocol,
so
that's
a
lot,
and
so
what
one
of
the
two
things
that
we
want
to
do
here
is
clarify
the
role
of
the
maintainer.
So
in
the
following
months
weeks
or
months
we
will
define,
we
will
clarify
its
already
defined,
but
we
have
to
define
it
better
and
we
will.
A
A
A
That will mean creating a guide, like: hey, look, in a few minutes you will understand everything we're doing. Here are the repos, here are the technologies we're using, this is how you should start contributing, here is where you can ask questions and what you should ask in case you have questions.
A
A
A
Make AsyncAPI become the standard specification for event-driven APIs. Somehow it already is, I guess, since there's nothing similar, so yeah. But we want to make it a standard specification, not the only one, right; we want to make it, like, official: this is a standard. And for this it cannot be...
A
A
The W3C, things like this. So that's something we're going to push in the next months as well: exploring different alternatives and seeing which is the safest, and also, at the same time, improving the definition of the governance model. We already have this defined, in case you don't know; if you go to the AsyncAPI repo, you will see it here, there's a governance file, yeah.
A
So
we
should
learn
from
them
and
maybe
take
these
governance
models
and
adapt
it
a
little
bit.
So
we
don't
have
the
same
problems
right
yeah,
that's,
like
you
know,
paperwork
that
we
should
be
doing
and
all
the
staff
and
then
advertise.
So
people
know
that
we
join
this
one
of
these
neutral
homes
or
another
one
that
might
not
be
listed
here.
A
Second,
the
second
milestone
is
increase
the
number
of
sponsors,
so
this
is
already
kind
of
happening,
so
I'm
already
having
conversation
with
with
some
other
companies
to
become
sponsored
and
they're
all
asking
more
or
less
the
same
thing
like
they're
asking
for
a
document
or
a
brochure,
something
with
information
about
the
initiative,
the
open
source
initiative,
how
many
people
are
using
it,
how
many
other
companies
are
using
it?
So
that
will
be
great
to
have
something
like
this
again.
A
A
The next milestone is about increasing adoption. We think that the best way to increase adoption here is by making other people support AsyncAPI in their products. For instance, Solace is working on that and has already launched, I think, something in private beta, and they are about to launch something GA this month. There's also TIBCO, with some micro-gateway functionality that supports AsyncAPI, and Red Hat is also working on some products that will support AsyncAPI.
A
A
This is actually almost done, I would say, but still pending: release the next patch version, with just patches here and there on the spec, explaining things a little bit better or fixing bugs in the spec and all this stuff. Yeah, that should be done really quickly. And then, after that, we will push the next minor release, but yes, it's still to be defined what's going to end up there. I have a rough idea that I will share, but I will share it in the future.
A
But
there
are
some
big
features
that,
like
that
could
be
included
like,
for
instance,
request
reply,
support,
also
Francisco
yeah
and
then
also
double
down
on
protocol
bindings.
It
means
that
we
need
to
actually
work
in
that
pretty
heavily
protocol
protocol
bindings
are
still
lacking
in
draft
states
if
you
want-
and
we
need
to
actually
work
on
that-
a
lot
of
them
and
finish
finish
the
first
version
and
then
write
any
questions
so
far.
B
A
A
We need to make something really cool here, and super useful. So that's something we will target very soon. Another source of financial income could be creating paid online courses; that's something we're also exploring, and it would help us financially, so that seems like a perfect fit. We were also already organizing workshops in different cities, but we had to cancel due to coronavirus. We will keep working on that.
A
A
So there's a lot of effort there, but I don't think there's enough support for different languages and frameworks, so we're going to dedicate the next months to working on creating, well, not a lot of code generators, but working a lot on different code generators, which is different, right. It's writing one for, I don't know, Java, for Python, for WebSockets with Node.js, and so on.
A
B
A
B
B
So this link will be in the meeting notes, but just to let you know that we have this mailing group, to which we are sending out the invitations to these meetings, and we also share some draft documents there, so you also have access to this roadmap if you join this group. And Fran, for your info, you probably have to change the sharing setting, because right now the invitation is set to "edit", and you should change it to "comment", so people can comment rather than modify the document.
B
But
yeah
it's
a
heads
up
for
you
guys
that
you
can
just
join
this
group
freely
and
then
get
access
to
post,
Doc's,
cool
cool,
yeah
and
also
something
important
I
forgot
to
mention
during
the
my
presentation
about
the
release,
automation
that
the
there's
an
important
change,
that's
going
to
be
introduced,
that's
going
to
be
a
change
in
the
name
of
the
of
the
package,
so
we'll
deprecated
current
as
an
API
generator
package.
Because,
starting
with
this
automation
flow,
we
will
introduce
the
packages
with
the
annotations.