From YouTube: Jean-Michaël Celerier: Creating interactive digital art with ossia — Rust in Arts 2021
Description
Learn how to create audio-visual art installations and interactive performances, live-code and live-patch in this talk focused on ossia.io's free and open-source digital art workstation, ossia score and its companion library libossia.
https://rustfest.global/session/47-creating-interactive-digital-art-with-ossia-score-and-libossia/
A
So yeah, I'm very, very happy to introduce to our digital stage Jean-Michaël Celerier — I hope I said that properly, I tried so hard. He's a freelance researcher from France with a PhD in authoring temporal media, which I think I know what that is, but I hope he will explain a little further, because it sounds super cool. He's a fellow event organizer who also does the good work of passing on the torch by teaching creative coding languages. So thank you.
B
Hey, so thanks for the introduction — can you hear me? Yes? Okay, perfect. So hello everyone, I'm super glad to be here at RustFest. I'll prefix my talk by saying that there isn't a lot of Rust in this: what I'm going to show is mostly C++, and we are trying to make inroads into Rust, so we have some small APIs that are putting a finger into the big Rust world.
B
I'll just start directly with a small demonstration and then we'll discuss a bit what this is all about. Alright, so first a small sound check: can you tell me if you hear sound right now?
B
Okay, it starts — great. Let's just see, because of course I'm on Linux and…
B
All right, well, I'll just pump up the sound of my loudspeakers — that works. Okay, here you should be hearing something. Yep, all right. So some of you may know music software like Cubase or Ableton, that kind of thing. This is something kind of similar, but for media art practice in general — that is, not only audio and music, but also video. For instance, I saw that some previous talks were about creative coding environments, which was a very nice presentation.
B
The first concert we had, I think, used open sound control, and with all these media technologies we are trying to build ossia score: a free and open-source sequencer which is able to automate things in a timeline. It starts extremely simply: for instance, like in most music software, you can just put a song in the timeline and play it — and if you've ever done music with a computer, that's nothing very interesting.
B
From there, you get time stretching, which means you can change the speed of the sound and adjust the tempo depending on what you have, that kind of thing, and you can apply effects. We support several kinds of effects — some of you may know VST plugins, or LV2, or Faust. Today I'm mostly going to use Faust. Faust is a domain-specific programming language for signal processing, which can be used to implement a lot of very common audio effects.
B
Faust, for instance, is interesting because a Faust program is basically source code, and here you can directly go and edit the source code and do some level of live coding within the software. I'll show a bit of all the various programming languages that are embedded in ossia score — and well, maybe one day there will be Rust too. At least that's something I hope to see in my lifetime.
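To give an idea of what such a live-codable audio effect boils down to, here is a minimal sketch in Rust — all names are illustrative, not part of ossia score's or Faust's actual API — of a gain stage with one-pole smoothing processing a buffer of samples:

```rust
/// A minimal sketch of the kind of DSP block a Faust program describes:
/// a gain followed by a one-pole low-pass filter. Illustrative only.
struct LowPassGain {
    gain: f32,
    coeff: f32, // smoothing coefficient in [0, 1); 0 means plain gain
    state: f32, // previous output sample
}

impl LowPassGain {
    fn new(gain: f32, coeff: f32) -> Self {
        Self { gain, coeff, state: 0.0 }
    }

    /// Process one buffer in place: y[n] = (1 - c) * g * x[n] + c * y[n-1]
    fn process(&mut self, buffer: &mut [f32]) {
        for sample in buffer {
            self.state = (1.0 - self.coeff) * self.gain * *sample + self.coeff * self.state;
            *sample = self.state;
        }
    }
}

fn main() {
    let mut fx = LowPassGain::new(0.5, 0.0); // coeff 0.0: just halves the signal
    let mut buf = [1.0_f32, 1.0, 1.0];
    fx.process(&mut buf);
    println!("{:?}", buf); // each sample halved
}
```

Live coding in the Faust sense means re-editing the equivalent of that `process` body while sound is running, with the host recompiling it on the fly.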
B
So sound is the first thing, but when you do media art, sometimes you have sound and you also want some visuals. How do visuals work? Pretty much the same way: you take some video file, you drag and drop it, and it appears in the timeline. Okay, let's add an output window in which we are going to be able to do some rendering. The rendering is hopefully efficient, because it uses all the nice technology, such as Vulkan.
B
That kind of thing. And so I hope you can see that the idea of the software is that you are going to be able to assemble various kinds of media in a timeline, in something that is useful for media artists whatever medium they are using. For videos we can of course just play a video like that, but just like we can apply sound effects, we can also apply video effects.
B
So what would a video effect be? For instance, let's take this one, which is very simple and nice: it makes things black and white — and as someone who listens to a lot of metal music, I like black and white. I don't know if it's related or something. So here we can see the same video with some kind of live effect applied to it, which can be useful if you do VJ'ing, that kind of thing.
B
So that's the core idea of this part. Then you'll notice that things are looping: on every medium you can say, okay, this is going to loop forever. Here, for instance, by default both rhythmic samples — which are detected with heuristics — and videos are set to loop, because experimentally this is something that works well.
B
So yeah — but then, once we have audio and video: one thing is that if you are using, say, Cubase, you hit an export button and it gives you either a PDF file or an MP3 file, which you can just send to your friends. But in the title of this talk the keyword is interactive, and interactive basically means that we want to write a score where someone can write down, inside the score: I want this thing to happen in response to something else. For instance, if I press a button on my computer, I want things to advance — to go to the next part of the song, or something like that.
B
So there are two ways to see interactivity. The first one is using live sensors — let's say a keyboard controller, that kind of thing — and the second one is directly telling the score: okay, I want this to react to that. We'll start very simply: for instance, I'll add a MIDI keyboard, so something with knobs that I can just tweak.
B
It's that little thing, and on that little thing I can say: okay, let's learn which knobs I'm going to use. And in this part of the software you'll be able to see basically all the external controls that you want to use in your score.
B
So here I can just say: okay, let's map this one, because this goes here.
B
This goes here, and this goes here, and from then — let's see, I don't know if you can see — yes, here we can see things move a bit. So you can use physical controllers to tweak things. But MIDI keyboards are actually not very good, because they only have very, very limited precision: it's seven bits, and seven bits is really not a lot. It goes from zero to 127.
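As an aside, the usual workaround for that seven-bit limit is standard MIDI arithmetic: pair two control-change messages, one carrying the most significant seven bits and one the least significant, giving a 14-bit value from 0 to 16383. A sketch of the combination (plain MIDI math, not an ossia API):

```rust
/// Combine a 14-bit MIDI control-change pair: `msb` and `lsb` are each
/// 7-bit values (0..=127). The result ranges over 0..=16383.
fn combine_cc14(msb: u8, lsb: u8) -> u16 {
    ((msb as u16 & 0x7F) << 7) | (lsb as u16 & 0x7F)
}

/// Normalize a 14-bit value to the 0.0..=1.0 range most parameters expect.
fn normalize_cc14(value: u16) -> f32 {
    value as f32 / 16383.0
}

fn main() {
    assert_eq!(combine_cc14(0, 0), 0);
    assert_eq!(combine_cc14(127, 127), 16383);
    println!("{}", normalize_cc14(combine_cc14(64, 0))); // roughly 0.5
}
```

This is the "hack your way to more precision" mentioned next — it needs controller support for CC pairs, which cheap keyboards often lack.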
B
You can hack your way to more precision a bit, but maybe you want to use, you know, some kind of more expressive device. So I have one of those devices right here, which is a plain old joystick, and here I can say: okay, let's add my joystick to the score. And here you should be able to see that when I press stuff on my joystick, the controls of my joystick become available for controlling things.
B
So another thing we may want to do — and it's what people like to do in music videos, that kind of thing — is using sound to control video effects. For this I'll add another effect after the first one, something which does some RGB glitching, and here we'll say: okay, the shaking of the image will depend on the kick.
B
So how would we do that? Now this is getting a bit into what we call visual programming. Earlier, in the previous talk, I heard the words Pure Data. If you know Pure Data, you won't be lost, because it's fairly similar — but in a timeline: instead of just being one graph, it's basically all the potential graphs that can happen during the score.
B
Okay, so not a lot happens, because the sound is not very loud, so what we have to do is say: okay, I want to multiply this value by something. Multiplication is quite a useful operation, I'd say, and so we can just do things like this.
B
So you can live-edit all the operations, that kind of thing, and that's amazing because it's also meant for, you know, live creation. So that's the basic, core idea of the software: you have this timeline, and then you have the graphs within the timeline, and if you want to edit things more precisely, you can just go and see things in a pure graph view.
B
So now what we may want to do is, as I mentioned, put some level of interactivity in the timeline. Like, I want to say: okay, I want to move on from this part of the song whenever something happens. Right now, "something" will just be a sound playing, and then "something" will be me pressing a button on my joystick. So let's say this one — here I'm pressing a button, and it's just a boolean value, and what I can do is say: okay, something like that.
B
Now I can move on to the next part. So that's the first thing. Then what I can say is: whenever we are done with that, okay, let's go back to where we were at the beginning. So we can write some kind of state machines within the timeline, and the visual language basically has elements to say which parts must keep running, which parts can be interrupted, and which cannot.
B
If I have dashes here, it means that I can cut — my event can interrupt this part of the timeline at any moment. But here I cannot interrupt my timeline: it means that in this score I really want these parts to run entirely before moving on to the next part. This visual language enables writing scores where things have varying durations, depending on when events happen — and that's necessary when you're working, for instance, with humans on stage, in a play or something like that.
B
Humans don't take an exact number of minutes and seconds to do something; they take approximately that long. We have to have tools that allow us some leeway depending on what is actually happening on stage, and that react to it as soon as we can. Okay.
B
So another thing we often see in this kind of environment is automation. An automation is just something that makes a control change over time. So here I can say: okay, I want to automate this, and this creates an automation curve, and if I hit play, this parameter will go from zero to one to 0.5 and change things accordingly.
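Conceptually, an automation like this is a piecewise-linear function of time, sampled at every tick. A minimal sketch of that idea in Rust — illustrative only, not ossia score's actual implementation:

```rust
/// A breakpoint automation curve: a sorted list of (time, value) pairs,
/// linearly interpolated between breakpoints. Illustrative sketch only.
struct Automation {
    points: Vec<(f64, f64)>, // (time in seconds, parameter value)
}

impl Automation {
    /// Sample the curve at time `t`, clamping outside the defined range.
    fn value_at(&self, t: f64) -> f64 {
        match self.points.windows(2).find(|w| t >= w[0].0 && t <= w[1].0) {
            Some(w) => {
                let (t0, v0) = w[0];
                let (t1, v1) = w[1];
                v0 + (v1 - v0) * (t - t0) / (t1 - t0) // linear interpolation
            }
            None if t < self.points[0].0 => self.points[0].1,
            None => self.points[self.points.len() - 1].1,
        }
    }
}

fn main() {
    // The curve from the demo: zero, up to one, back down to 0.5.
    let curve = Automation { points: vec![(0.0, 0.0), (1.0, 1.0), (2.0, 0.5)] };
    println!("{}", curve.value_at(0.5)); // 0.5, halfway up the ramp
}
```

Real automation curves in sequencers also offer curved segments (exponential, eased), but the sampling principle is the same.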
B
So, for instance, I mentioned the OSC protocol, open sound control, which is fairly common in all kinds of media art software: most environments for creative coding, generative art, digital and intermedia art all support OSC. And at some point there was an initiative, led by VIDVOX of VDMX fame — a very famous video software —
B
— which produced the OSCQuery protocol, which allows one piece of software to discover what kind of OSC nodes other software has. So ossia can, for instance, be used as a way to go and control other software and score it in time, just like it can score its own parameters. I'll give an example with Pure Data.
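For reference, an OSC message itself is a simple binary format: a null-terminated address padded to a multiple of four bytes, a type-tag string such as ",f", then big-endian arguments. Here is a minimal encoder for a single-float message in plain Rust, with no OSC library — a sketch of the wire format, not libossia's API:

```rust
/// Pad a byte string with NULs to the next multiple of four, as OSC requires.
fn osc_pad(mut bytes: Vec<u8>) -> Vec<u8> {
    bytes.push(0); // strings are null-terminated, then zero-padded
    while bytes.len() % 4 != 0 {
        bytes.push(0);
    }
    bytes
}

/// Encode an OSC message carrying one float argument, e.g. "/synth/freq" 440.0.
fn encode_osc_float(address: &str, value: f32) -> Vec<u8> {
    let mut packet = osc_pad(address.as_bytes().to_vec());
    packet.extend(osc_pad(b",f".to_vec())); // type tag: one float argument
    packet.extend(value.to_be_bytes()); // OSC arguments are big-endian
    packet
}

fn main() {
    let packet = encode_osc_float("/synth/freq", 440.0);
    // Address (12 bytes padded) + type tags (4) + float (4) = 20 bytes.
    println!("{} bytes", packet.len()); // 20 bytes
}
```

Such a packet would typically be sent over UDP to the receiving patch; OSCQuery adds the discovery layer on top, so you don't have to know the addresses in advance.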
B
If you don't know Pure Data, it's a system made by Miller Puckette, who also made Max/MSP, and it's one of the most used free and open-source graphical systems for, well, making artworks, that kind of thing.
B
Okay, so in Pure Data, what we can do is — well, we have all these little boxes, which are objects performing some operation, like minus 20, plus 100, blah blah blah, and we have special objects to say: okay, this is an ossia parameter, and this parameter will be available, with all the metadata it carries, on the network. That means we can do something like this:
B
We can just go and see a tree that gives us all the parameters of this Pure Data patch, and then we'll be able to go and score them in ossia. So here I can see all the available OSCQuery devices on my network, with their parameters, and I can just send something to my Pd patch, and my Pd patch — it could…
B
So you can synchronize multiple things. Pure Data is a first example; another one is Processing. Processing is a bit similar to, but much older than, Nannou — the framework that was shown in the first talk — and it's a framework to create mostly visuals. You can also do other things, but most people use it for creating visual generative art. And so here we have a…
B
…a Java sketch — Processing is written in Java, and you write your code in Java, just like with Nannou you write it in Rust — and it says: okay, let's draw a rectangle of a given size with a given color, and this size and this color will come from ossia parameters.
B
So it redraws every frame, and if I say, okay, my first parameter is 255, then it does what we expect. And here we can also automate things, to give some visibility to what is happening. So here, let's say that we want — okay, here I'll say: okay, you are going from zero to one, and here…
B
But that's not enough: what we also want is to change its color, and for this we can also do some color automation.
B
Let's put it right here. We can even load some presets — there are some color presets — and say: okay, I want this to go and hit that color, and here you can see that it sends the color over OSC to the Processing sketch.
B
It sends the color to the Processing sketch, and well, that's what you expect. You can also say: okay, let's repeat this forever.
B
So yeah — I have roughly 10 minutes remaining, so I'll explain a bit more in detail, to give an idea of what can be done with this. Here are some images from artworks that used ossia score, libossia, or both in their creation: installation work, plays, dance, music, video art. We do a lot of sound spatialization work; with robots, for instance — making robots dance depending on what happens on stage. Some artists use it for modular setups, for instance.
B
You can use it with different boards too: as soon as you have some sensor you can, for instance, directly send data to Arduinos — it supports sending raw data to a serial port, that kind of thing. This one is very fun; it was one of my favorite projects: a carousel where every seat is basically a musical instrument, and people are supposed to all play together, and ossia score tries to make it all fit together sonically. So yeah. So here is a quick datasheet of ossia score.
B
It's free and open source, and it's C++ — so still not Rust. Actually, when it was started in 2014, it was a big question: should we directly go for Rust, or take the safe bet at the time? One big thing that was missing back then was the UI story, and even today I'm not sure there's something as extensive. We use Qt for the user interface, and it brings a lot and allows us to do a lot very easily.
B
And it is built with a plugin-based architecture: every part of the software — really, really every part, even the central view, all the menus, left, right, everything here — is brought together through plugins, which right now are in C++. But of course C++ has its stability problems, and it would be, I think, really beneficial to start to see how we could integrate plugins written in Rust.
B
But I mentioned live coding, I think, and I got a bit ahead of myself, because this makes me think of one thing: we can also script things — for instance, we can embed Pure Data patches directly in ossia score, and this will open Pd. Let's hope it doesn't crash, because it's still experimental, yeah.
B
You can say: okay, let's send some notes — I'm sorry, it's going to be ugly. Okay, that was very fast. So this is sending things to Pd.
B
Here I can directly edit my Pure Data patch, and when it loops — much better. So I change this, and you can basically control things either from here or from the Pd patch.
B
That kind of thing. And it's experimental — I broke it earlier — but you can also do the same with C++. That is, if it doesn't crash, it works.
B
But theoretically, you can add a C++ class which will be instantiated as an object, and, yeah, be usable for that. And I think it would be great at some point to have the same in Rust, but I really don't know how to integrate Rust — for instance, for this we integrate Clang and build with Clang embedded in the software, just to compile the C++. Would the same thing be doable with Rust? I hope so — and if anyone knows, tell me.
B
So yeah, and the big question is how to bind the API, because, well, there is an extensive C++ API which is helpful for developing things, but that makes it harder to bind from other languages. So that was ossia score, and then libossia is the library which is used underneath.
B
It hides various creative coding protocols behind one single API which allows you to introspect it, as you could see in score — but other software uses it too — and it comes with things that are useful for people who make intermedia art. And libossia is implemented in, or has bindings to, most creative coding environments.
B
These sit on top of a C API, and the Rust crate uses the C API, not the C++-specific one. I think that now there are better ways — well, better crates — to allow automatic binding generation from C++ code, so I think that ultimately that would be the better thing to do, or else to implement the protocol from scratch.
B
In Rust, that's also something worthwhile, I guess, because the specification — well, the network protocol documentation is a bit thin, but at least it can give some basis. The idea is then to be able to create those parameters and say: okay, I have this parameter, which is some OSC address, and it's of type float, this one.
B
This
is
what
my
aperture
is
and
the
more
meta
that
I
give
and
the
more
separate
software
I
can
use
that
metadata
to
score
it
and
do
anything
with
it
across
the
network,
so
yeah
and
otherwise,
we'll
we
have
maximum
js
credit
objects.
B
In SuperCollider it's been rewritten from scratch, I think, because it was too hard to embed the native libossia library in SuperCollider's source code. There is a C# binding; I've been working on an Unreal Engine binding, but it doesn't work perfectly well yet. And the idea is that everywhere you use libossia, it should match the idiomatic way of doing things in the host environment. So, for instance, in Java it uses the builder pattern, because that's what people do there — and well, in Rust…
B
I don't know if that's idiomatic, as I'm not really much of a Rust coder, but that's what we strive for: having an idiomatic version in each media system. And another thing is that we are trying to make a way to write the actual media objects as free as possible of any dependencies. There is this Avendish project, which basically allows you to write…
B
…objects — for instance VST plugins, VST3, let's say a Max or Pd object — and the rule is that the actual code of the object shouldn't have any dependency on anything. So we can, for instance, write this code and say: okay, these are my inputs.
B
My
outputs
are,
for
instance,
I
take
float,
I
original
float
or
here,
for
instance,
my
input
is
an
audio
board
and
with
some
control
and
my
output
is
another
duo,
we
perform
the
computation
and
all
this
should
be
done
without
depending
I
believe,
on
an
existing
framework
and
library.
So
inspectors
20,
we
can
cheat
a
bit
there.
There
was
just
enough
reflection
to
get
by
and
I
wonder
if
something
similar
and
rust
would
be
meaningful
and
this
will
require
defining
you
know:
science.
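As a rough illustration of what such a dependency-free audio object could look like on the Rust side — a purely hypothetical trait and names, not Avendish's or libossia's actual API:

```rust
/// Hypothetical sketch of a dependency-free audio object, in the spirit of
/// what Avendish does for C++: the object declares its name and a processing
/// function, and a host binding would generate the plugin glue around it.
trait AudioObject {
    /// Human-readable name a host could display.
    fn name(&self) -> &'static str;
    /// Process one block of audio, input to output, with one control value.
    fn process(&mut self, input: &[f32], output: &mut [f32], control: f32);
}

/// A trivial example object: scales the input by the control value.
struct Gain;

impl AudioObject for Gain {
    fn name(&self) -> &'static str {
        "Gain"
    }
    fn process(&mut self, input: &[f32], output: &mut [f32], control: f32) {
        for (o, i) in output.iter_mut().zip(input) {
            *o = i * control;
        }
    }
}

fn main() {
    let mut obj = Gain;
    let input = [1.0_f32, 2.0, 3.0];
    let mut output = [0.0_f32; 3];
    obj.process(&input, &mut output, 0.5);
    println!("{} -> {:?}", obj.name(), output); // Gain -> [0.5, 1.0, 1.5]
}
```

Note that `Gain` itself depends on nothing but the trait; a VST wrapper, a Pd external, or an ossia node could all be generated from the same struct, which is the point being made about Avendish.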
B
There
is
concepts
and
rust,
it's
traits
that
say:
okay,
this
bundle
of
source
code,
that
really
looks
like
an
audio
processor
or
a
midi,
synthesizer
voice
or
a
visual
effect
processor,
and
let's
just
parse
that
at
compile
time
and
do
compile
time
stuff
with
it
to
create
various
video
objects,
and
I
think
I'm
getting
out
of
time.
So
thanks
everyone
for
listening.
I
hope
this
was
kind
of
clear,
so
please
start
the
club
repo
because,
as
you
all
know,
this
is
the
currency
on
which
we
feed.
B
We feed on GitHub stars. And you can come and say hi: we have a chat on Gitter, a forum, and the website, ossia.io — you can find everything there.
A
Thank you so much. I could hear myself, but all good — I guess it's part of the demo. Oh, it's because, yeah, you have your speakers on, so we could hear you. Cool, yeah. Thank you so much, that was really cool. I loved the little interesting facts you threw in — I had no idea about the MIDI keyboard limitations. That was pretty cool. No idea, yeah, yeah.
A
We don't have that much time, but someone did leave a question in the chat — well, I mean, I was going to ask it to you live, if you don't mind.
A
Yeah, so someone asked — you know, ossia looks really powerful — how they could get started, I guess as a first-timer just wanting to dabble.
B
Yeah, so — and I'll be honest, that's the sore point, because it's really all contributor-based — on the website there is a docs tab, and we have online tutorials. It's mostly workshops that we did with artists, which were recorded, and they explain everything from start to finish, and there is also written documentation.
A
Cool. So if someone's just wanting to get started, hop on the Gitter, and yeah, I'm sure they'll be super happy. Thanks again, it was really, really great having you — it was a great talk, really fun, yeah.

B
No, thank you!