From YouTube: Rust Zürisee Live Stream 2023-09-12
Description
Would you like to chat, ask questions or give a talk? Join us in our Matrix room:
https://matrix.to/#/#rust-zuerisee:matrix.coredump.ch
Support the Zürich community: https://estada.ch/support-my-work/
Follow us on https://rust-zürisee.ch/
Join us on our next event:
https://www.meetup.com/de-DE/rust-zurich/
A
The first one will be this little quiz, and then afterwards we have a couple of talks: a big one and two small ones, if I remember correctly. And we'll set up by either downloading all the things onto my machine, or, yeah, we'll use someone else's machine.
Wonderful, let's go with the quiz, hooray. So welcome to Last Tuesday. As you can see, this is a lake, and can you guess where it is? Because it's not the Zürisee.
A
Okay, everybody good? Okay! Here we go: the answer is nightly-crimes. It's a crate from Mara Bos, and you can basically enable any nightly feature in the stable compiler via her macro. It's a bit of a hack, but just in case. You know, also, if you're paranoid that someone in your dependency tree has this, maybe scan for it; not sure if there is a lint for it already, but we'll see. All right, next question: how to escalate privileges.
A
Nice, nice, yeah. Basically, you can define a pyfunction, and you can also go back and call into Python. I think the next big change will be when the GIL falls, the global interpreter lock. I've heard that the GIL will go away, like...
A
Good enough, good enough; I'm learning, that's nice. Okay, next topic: syntax and compilers, and here we go. The first question is: name this construct. The stuff that I highlighted (like, the exact parts) is not important.
A
Okay, who has an idea? Turbofish? Nice! If you don't know this website: it's called turbo.fish, and then you can add types. So this is an iframe, yeah, and it supports umlauts since, I don't know, a couple of years; it didn't in the beginning, but now it does, so very nice.
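For context, the construct being named is the turbofish, the `::<...>` syntax for spelling out a generic type parameter explicitly. A minimal sketch (the values here are my own, not from the slides):

```rust
fn main() {
    // `parse` is generic over its target type; the turbofish names the
    // type explicitly instead of relying on inference.
    let n = "42".parse::<i32>().unwrap();

    // Same for `collect`, which is generic over the container it builds.
    let squares = (1..=3).map(|x| x * x).collect::<Vec<i32>>();

    assert_eq!(n, 42);
    assert_eq!(squares, vec![1, 4, 9]);
}
```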
A
All right, next question: custom error handling, and this is hard mode, so you're not allowed to use thiserror and anyhow. You don't have to write code; I did that already. So it looks a little bit like this: we want to get from this to a point where we just have one error type that we can pass up, right? We make our own enum, and we'll just collect all the error types from our module or crate, doesn't matter.
A
So the second function, oops, this one: both of the functions do the same, but you see here we have an I/O function that we chain with the question mark, and then we have a parse function that we chain with the question mark. The first one, the read_to_string, will throw us a std::io::Error if it fails, and the second one, the string parse, will throw us a ParseIntError if it fails. Yes, so what can we do to fix that, or implement that?
A
Yeah, okay, so I think if you had something like "implement From for that": that gives a point.
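A minimal sketch of that answer: one crate-wide enum collecting both error types, with `From` impls so that `?` converts each underlying error automatically (the function and variant names are illustrative, not from the slides):

```rust
use std::{fs, io, num::ParseIntError};

// One error type collecting all the error types from our module.
#[derive(Debug)]
enum MyError {
    Io(io::Error),
    Parse(ParseIntError),
}

// These `From` impls are what let `?` convert each underlying error.
impl From<io::Error> for MyError {
    fn from(e: io::Error) -> Self { MyError::Io(e) }
}
impl From<ParseIntError> for MyError {
    fn from(e: ParseIntError) -> Self { MyError::Parse(e) }
}

fn read_number(path: &str) -> Result<i32, MyError> {
    let text = fs::read_to_string(path)?; // io::Error -> MyError
    Ok(text.trim().parse()?)              // ParseIntError -> MyError
}
```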
Next question: how do we attach a custom error message? Like, in the worst case, it will be something like this: you have your own function, and then we say: this is why it failed.
A
Oh, sorry for that, it's too fancy.
A
This is how it should have happened, so just the upper function: we can specify a map_err function. So if you had map_err somewhere in your chain of thoughts, then we can do that, and this is the cheesy way of just to_string: we stringify the error and concatenate the reason into the thing, yeah.
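The map_err variant described above might look like this (the "cheesy" to_string approach; the message wording and names are made up for illustration):

```rust
use std::fs;

// Attach a custom message by mapping each error into a String.
fn read_number(path: &str) -> Result<i32, String> {
    let text = fs::read_to_string(path)
        .map_err(|e| format!("could not read {path}: {e}"))?;
    text.trim()
        .parse()
        .map_err(|e| format!("not a number in {path}: {e}"))
}
```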
A
Yes, yes, that's my personal enum, yeah. I took this from my master's thesis, so I have my error collection here, and then this is a context error: that's just an error where I give more context to the user, yeah. But it would be handy if we had something like... so, for unwrap we have expect, but then it panics. It would be very nice if we had something like that on the native Result, without having to do anything else; I could just say: okay, this.
A
This is something that is stringable in some sort; just chain that up, but yeah.
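The wished-for non-panicking `expect` can be approximated today with a small extension trait on `Result`; this is only a sketch of the idea, with my own trait and method names (anyhow ships essentially this as its `Context` trait):

```rust
use std::fmt::Display;

// An `expect`-like helper that attaches context and returns Err
// instead of panicking.
trait WithContext<T> {
    fn context(self, msg: &str) -> Result<T, String>;
}

impl<T, E: Display> WithContext<T> for Result<T, E> {
    fn context(self, msg: &str) -> Result<T, String> {
        self.map_err(|e| format!("{msg}: {e}"))
    }
}

fn demo() -> Result<i32, String> {
    "abc".parse::<i32>().context("this is why it failed")
}
```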
A
So anyhow is not great for library-style crates, that's for... yes, certainly. From the same author, thiserror nicely wraps up what I did here with ugly strings, and anyhow is pretty much the same, where you just say: try this, anyhow, and here's some context. So you can execute any given thing, and if it fails, it will just chain up the error with anyhow. It will try to keep the tree of reasons stacked, but if it can't, it will just give you the last one, yeah.
A
And it's a very common pattern. So you can panic, yeah, but ideally you don't panic; or you have some sort of system exit, and std::process::exit also has the never type. All right.
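On that last point: `std::process::exit` is declared as `fn(i32) -> !`, and since the never type coerces into any other type, an exit call can sit in a branch that otherwise has to produce a value (the function here is my own toy example):

```rust
// `std::process::exit` returns `!` (the never type), so this arm
// still type-checks even though the match must yield an i32.
fn parse_or_exit(s: &str) -> i32 {
    match s.parse() {
        Ok(n) => n,
        Err(_) => std::process::exit(1),
    }
}

fn main() {
    assert_eq!(parse_or_exit("42"), 42);
}
```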
Then here we go to social, dum, dum, dum; so programming languages are something around... that's why we're here. So, onto the first question, recent news: what was the serde scandal this year?
A
And apparently I'm a hipster now, so cool, I guess. Moving on: did anyone of you have "pre-compiled" in your notes? Yes, you get a point. Binary blobs: same same, very fine. And in case you ever come into the situation where you think, I need to be faster, I want faster builds, I want something pre-compiled: maybe have a look at Bevy, because they have their dynamic linking feature, which also speeds up linking a lot. So you still have to compile it once, but it's opt-in and you can do it.
A
There was a massive overreaction of purists, and I'm not going to look into the chat for another minute; let them have it out, so yeah.
A
Yes, that's how it started, yeah. It started with the discussion about a Cell that wasn't in an unsafe block, and people got really, really unfriendly and mad, and then at some point he just said, I'm done with open source, tore down all the repositories for a week, and then, after that, appointed new maintainers and basically left the project. And it boils down to this: a lot of people relied on code that was not set up to be maintained properly; this is just one guy, and yeah.
A
So basically, that's what they say; I haven't met any one of them personally yet. But it was a fork that is like a little shot across the bow of the Rust Foundation, and some of the critiques they voice are valid: like, the Rust Foundation didn't do that great of a job of communicating with the community, and yeah. It's also hard to see why certain stuff is done in a certain way in Rust, and then some people, like...
A
We have Rust jobs now; bad news: corporations want to do stuff. But I think we're still ahead of most corporations. So I'm not shaming any C++ developer that is hindered by legacy; I'm just saying there is a chance that Rust will have legacy soon, and I think that will start the moment we integrate an async runtime into the standard library, because that one will get out of fashion at some point.
A
It was 2017, and we're looking into one next year. Good news is, I won't be talking that much; other people will be. But it will be, in theory... and we're currently collecting dates from other events, so that we don't collide, or so that we collaborate, we'll see. And if you're interested in giving a very big talk: that would be a talk, first, like, this is a 400-people conference, and hundreds of people will watch it live and thousands will watch the recordings.
A
Yep, I did a mess-up; okay. So that was the quiz. Who did well, who was doing well in the grand quiz?
A
Okay, let's start with you: how many points do you have? Six points? Who has more than six points? How much do you have? Okay! So if you don't want to be in the stream, just step behind this one here. So, who has a bit more than eight points?
A
If you want, feel free to grab it meanwhile.
C
I'm Robert, and I would like to talk quickly about my personal project, what I'm working on in Rust. So I'm building up a startup with my friends; it's called excellent.
C
In the end, what we do is: we're building up a platform, and the cool thing is, everything you see here (these are three screenshots and examples of the platform) is written in Rust. So, as you see down there, we have right now 38,000 lines of code, and out of that, 35,000 are pure Rust, so front end and back end: everything is done in Rust.
C
What are the cool features about it, just quickly? We have a scripting language (I'll go into it later on) which you can use in the front end and in the back end to execute logic and to work with the objects of the platform. We have real-time collaboration features.
C
So when someone is changing a variable on the screen, for example here on the side, and on the iPhone someone is changing a date, changing something, it automatically updates on every other device which is looking at the data in that moment. And we have a layout thing, which you can do in an HTML-like language, with automatic mobile support, or whatever you do, and it is automatically, as you see here, translated to a mobile device screen or tablet device space. Correct. So this is, from the business side, a little bit high-level.
C
That's what we're doing. And hello, it's working. Quickly about the tech stack, how it's built up: we have a server crate, which is completely in Rust, and we have a front end which is, as I said, also written in Rust, and we have a shared crate where everything is developed inside which can be shared between front end and back end, which is quite cool. Because then (I'll go from bottom to top) the shared crate allows us to build the general data structures.
C
So when we send stuff over WebSocket for this real-time collaboration, we're using pot, and the cool thing about pot is that it's a serialization format which is self-describing, like JSON is, so you can really get out of pot, also, what variables, for example, you are sending right now. Then a cool thing which I really enjoy: we actually can write reqwest code in the shared code. So with reqwest (you most likely all know that crate) it's possible to write any kind of request...
C
...you want to make to an external resource, most likely other REST interfaces, and it works in the front end and in the back end without any additional magic you need to do: just take the crate, import it, and it runs out of the box, which is really, really nice. And then the last dependency which we rely on really heavily, and which we really enjoy, is pest, for parsing custom code, or for parsing custom languages.
C
In our case, we parse SQLite-like code, because you can write SQL queries, for example for the charts you have seen in the picture before, and you can create them on your own; and we parse this SQL code, do some magic with it, then generate real SQL code out of it and send that to the database. Correct. Then the front end, as I said, is also completely written in Rust, which is really cool, and here we rely on Yew, and Yew is really, really cool.
C
Of course, it has the possibility on one side to really use lightweight frameworks and components; on the other side there is also the possibility to go deeper, with full struct components, which are heavier to use, but allow really deep integrations into existing JavaScript functionality, and this is what we do quite a lot.
C
All the nice UI you have seen is based on the SAP UI5 web components, so we're actually using web components from JavaScript or HTML, which we import or use, and then we have bindings to them from Yew; and all the logic on top of it is done purely in Rust. We're also using gloo at a few points where Yew is not deep enough, or where we have really custom events which sometimes come up from web components, which we then need to handle and tackle with the gloo crate. On the server side...
C
...I think it's quite boring, from my point of view. The only thing which is maybe not that common is that for our REST interface and for our WebSocket interface we use Salvo. Salvo is, yeah, a crate which is a web server, also depending on hyper, but it's never really mentioned anywhere; when I look at Reddit or at comparisons, somehow it falls quite often under the radar.
C
We chose it because it has a lot of functionality; so there's a lot of functionality we can bind into it with just a feature flag, it has a really rich feature set, and it's completely async. That's why we went with it. And then we execute SQL queries against a Postgres database, and we're using SQLx for that.
C
So, as I said, quite a standard use case, but I think in combination it's hopefully a little bit interesting: to have really back end and front end completely written in Rust, and then everything which can be shared (which is not specific to the database, not specific to the API server, not really specific to rendering) is all in this shared library. Everything is together in one GitHub repository, and in the end we're building one Docker container out of it, roughly 10 MB, and with that everything can be...
C
...put on the server and run, yeah. The last thing which I really enjoy, which is actually the heart of what we're doing, is the crate which is called Rune. rune-rs is a scripting language, not developed by us but by someone else; it's a scripting language which is a mixture of JavaScript and Rust, and you can execute it in the front end as well as in the back end. The really cool thing about it:
C
It has Rust bindings to all the Rust data types out of the box, and you can integrate with any function you write, either in Rust or in Yew or in Rune, and have bidirectional communication between them, which is really nice and really easy to use; that's the main reason why we went for it. And the second really cool thing: it completely supports asynchronous code execution.
C
So what we're doing is: we provide, in the end, database functionality, for example to load a specific record from the database with an identifier, with a UUID. This is a function we provide, or we develop, in Rust, and expose to Rune, and then our customers, or we ourselves, can just configure it in this scripting language, or just call this function, and then, yeah, we return the data as a data set coming back from the database. On the other side, this is now a Rune example we have here.
C
Another reason why we chose Rune: it's safe, so there's really no escape from this Rune VM, which is really good, especially when we run user-written code on the back end, on our server; so always good if they cannot break out of that. Correct. And the last feature, which I'm really waiting for (it's not there yet, but I saw a pre-build of it already), is that Rune can, in future, generate API documentation automatically, so you get a Rust-like API documentation automatically generated out of Rune.
C
You can put your own comments to it in the source code itself, so all this nice stuff, yeah. And we're relying on that, because right now we have a huge pain in building up our own documentation in Markdown, which is not that great; but with that, yeah, we're really getting new functionality in the future, yeah.
C
It's a colleague of mine, who's doing mainly the marketing and doing almost all the customer handling, and myself, mostly as a developer, yeah. If you have any questions, please talk to me later; ask me whatever you like. But I hope that's an interesting case of what you can do with Rust.
D
My name is Adam. This is just... there's just a story of, like, yeah, something interesting we came across during my work at Divinity, which is here in Zurich, yeah.
D
Okay, yeah. So there was a reason that we wrote some code that looked like this, which is basically just copying from one slice to another: it checks that the lengths are the same and then, one item at a time, copies them over. And so, does anyone know what Clippy says if you write this code? Well, I'll tell you: it doesn't like it, because there's a standard library function that does this for you, called copy_from_slice, or clone_from_slice.
D
If your type doesn't implement Copy, clone_from_slice will call clone to get them from the source to the destination, but yeah.
D
It's basically complaining because you're manually doing something that a standard library function should be able to do for you.
D
So anyway, I was curious about whether it's actually that much worse to write it manually: like, is this going to be a huge performance hit, or is it kind of not?
D
Is it just ugly, or does it actually make a difference, right? And when I looked into it, I found that the manual one was actually faster. So this graph shows benchmarks run with Criterion on Vec<u8>, so just vectors of bytes of various sizes; the sizes are on the bottom, and the bars are showing the percentage speed-up that the manual loop had over the standard library implementation.
D
So in some cases it was like a few percent; in some cases it's, like, you know, more than 10%, even. So that was a bit surprising, and, well, I don't have a great answer for you or anything, but we took just a quick look at what the manual implementation does in the assembly. So this is the assembly here of the hot loop.
D
When you write it manually. This was taken by just, like, you know, running it under GDB and taking a random breakpoint in the middle; you'll end up in the hot loop. And looking at the code there, well, we can see that at least it's doing something better than copying it one byte at a time, right?
D
The two moves to these xmm registers: those hold 128 bits, and, like, you can see each time, when we do the add, it's adding hex 0x20, so that's 32 bytes at a time. So, like, each time it goes through this loop it's copying over 32 bytes and then incrementing the counter, and then we fall through at the end. So this at least explains why it's not, like, stupidly slow.
D
The manual one, right: even though I wrote "copy one byte at a time", the compiler is smart enough to do the vectorization and actually copy 32 bytes at a time. So this at least... I mean, it doesn't explain why the manual one is faster; it just suggests that the manual one at least shouldn't be slow, that it's doing something reasonable, right, because the compiler is that good. But I'm still not sure, actually, why the manual one should really be faster.
D
So there's a couple of things: one is that the standard library's copy_from_slice will actually call into libc's memmove.
D
Another interesting thing is that memmove will handle cases where the source and destination overlap, right? And it needs to check for that, and then, depending on which one is first, you have to do the copy in one direction or the other, because if they're overlapping and you do the copy in the wrong direction, you're overwriting... you're not doing the thing you expect to do.
D
So there's some extra stuff there that memmove does which it doesn't need to do, because in my original function, right, if I go back: one of these is a mutable reference, which means that it cannot overlap with the other one, right? The mutable reference is the unique pointer to that slice, so you just know, right there, that they don't overlap, and all the stuff memmove does to handle overlapping things is just not necessary. So maybe that's another thing.
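That aliasing argument is visible in the std API itself: `copy_from_slice` takes `&mut self` plus a separate `&[T]`, which the borrow checker forces to be disjoint, while an overlapping copy inside one buffer has to go through the memmove-style `copy_within`. A small illustration of the distinction:

```rust
fn main() {
    // Disjoint copy: two separate borrows, so they can never overlap.
    let src = [1u8, 2, 3];
    let mut dst = [0u8; 3];
    dst.copy_from_slice(&src);
    assert_eq!(dst, [1, 2, 3]);

    // Overlapping copy within one buffer: `copy_within` is
    // direction-aware, like memmove.
    let mut buf = [1u8, 2, 3, 4, 5];
    buf.copy_within(0..3, 2);
    assert_eq!(buf, [1, 2, 1, 2, 3]);
}
```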
D
I'm not sure. I also thought... like, this happened, I don't know, maybe a year ago, so I was looking into it, I wrote up all the notes and stuff, and that was with cargo, or rustc, version 1.60.0; and I actually just re-ran it now to double-check, with, like, a recent one.
D
It's not even, like... these results aren't even the same, right? Like, it's changed. So it was probably very specific to, like, you know, something in that compiler version, and, like, maybe specifically the CPU, and, like, you know, who...
D
...knows what it could be. But maybe there's something in the release notes; maybe someone just noticed this problem and fixed it in the last year or something, I'm not sure. Yeah, yeah, of course, it could be an LLVM thing too, right, yeah, because I guess the memmove thing is also, like, an LLVM intrinsic or something, right? Yeah, so yeah.
D
Yeah, yeah, maybe. Well, anyway, it's still kind of a funny thing. That's basically it. I put the code up on GitHub, so you can play around with it yourself if you're curious; it's at this link, or the QR code just goes to that link, so you can just take a photo if you want to save it. Yeah, question: how did...
D
...I compile? In release mode, when I did the benchmarking. I just did... I don't know, I mean, like, when you do cargo bench, right, it automatically does release, because obviously you want to benchmark in release, but I don't know what other options are, like, enabled there, yeah.
F
Oh hey, my name is Mike. My talk will be much lower-level... no, much easier to digest than the cooler ones that were before me. So: scratch your itches with Rust. Why would I come up with this title?
F
I ran into this question quite a bit, when people want to learn the language: what is a good project to learn Rust? And some people are competent enough to, I don't know, write a full-on game or come up with a cool startup, like we just saw, right, and write the code and maybe learn on the go like this.
F
But if you don't, you might as well take some small code that you would write in a scripting language, or just do in the shell even, and do it with Rust on purpose, to learn the language. So this was my approach, for example, when I added the support for Cobertura output for cargo-llvm-cov, the function, or, our feature. And this existed already, right: there are Python libraries and binaries that can convert that stuff, so in theory you could run cargo...
F
The thing is, though, there was an open issue on the GitHub of the maintainer, and I figured: yeah, why not try to translate the Python code and see what happens. And luckily the maintainer, Taiki Endo, is very welcoming, so he explained to me, like, where I could probably plug in my stuff, and also actually helped me to design the API of my now-library nicely, and yeah, that's how I learned quite a bit about Rust, right: I learned how to publish to crates.io, which I hadn't done so far.
F
Then: read another project's source code and figure out, like, how can I actually merge my stuff in there; then discuss solutions, like the social part of getting this feature to work; and write and parse XML (I don't touch XML in my real life otherwise, but here I had to); and, yeah, learn about the GitHub CI workflow, because I use GitLab most of the time. That was interesting too.
F
And here are the differences now. So my naive, for sure not most optimized, solution is already quite a bit faster and uses much less RAM, and I compared the runtime of the actual thing running. So the Python code itself actually also finishes within 0.17 seconds, but the Python runtime has to start up, so it is slower; in your CI you would lose this time.
F
So that's why I kept it this way, and, yeah, the RAM usage is this much more. And there's another tool, because for GitLab at least, if you have huge Cobertura XML files, you cannot upload them: if you have a substantial code base that generates more than 10 megabytes of XML, you're not allowed to upload that to GitLab. So you have to split that up into separate files of smaller size, and the Python version used a strategy where it uses two gigabytes of RAM and quite a few seconds to do it...
F
...and I came up with the idea to use an event-based XML parser, so I'm not loading the whole file's DOM tree into RAM, and then I kind of split it up. And my solution also creates files that have 9.5 megabytes in size, while the original one just takes every, well, coverage block and writes separate files for each.
F
So you could end up with hundreds of files with the original Python code, and in my case, well, it's divided by 9.5 megabytes from your original XML, more or less, because, of course, the XML code size changes. Yeah, that was also interesting, to see the difference in speed. And I did that also for other stuff. So, I don't know, a friend of mine won quite a bit of money in the lottery, so he asked me, like: where should I invest...
F
...my money? And I was like: yeah, no clue, right. But what I can do is, like: give me the stuff that you want to invest in, and I will use Monte Carlo simulation to see where this can go, potentially, right. So I wrote this in Rust; of course, you would normally use a Jupyter notebook to do that, but this way I learned about plotting time series and reading CSV files, to have the historical data of the different stocks. Turns out, though, that my friend didn't care about the plots a lot.
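A toy version of such a Monte Carlo run, kept std-only; the tiny LCG generator, the uniform return model, and all the numbers here are made up for illustration, not from the talk:

```rust
// Tiny LCG pseudo-random generator so the sketch needs no crates.
fn next_uniform(state: &mut u64) -> f64 {
    *state = state
        .wrapping_mul(6364136223846793005)
        .wrapping_add(1442695040888963407);
    // Take the top 53 bits and scale into [0, 1).
    (*state >> 11) as f64 / (1u64 << 53) as f64
}

// Average final value of 1.0 invested for `years`, over `trials` runs,
// with yearly returns uniform in [mean - spread, mean + spread].
fn simulate(trials: u32, years: u32, mean: f64, spread: f64) -> f64 {
    let mut state = 42u64; // fixed seed: reproducible runs
    let mut total = 0.0;
    for _ in 0..trials {
        let mut value = 1.0;
        for _ in 0..years {
            let r = mean + spread * (2.0 * next_uniform(&mut state) - 1.0);
            value *= 1.0 + r;
        }
        total += value;
    }
    total / trials as f64
}
```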
A
Well, I can do that. I shall say goodbye to the stream; let's gloss over the last comments: hi to everyone who joined us late, and yeah, there's still no cookies in the stream, so that's a reason to join us here. I think we'll head for some food, maybe a drink, the ones who want to; and if you want to, you can always join us. And, of course, once we have a new logo...