Description
Iroh - presented by @b5 at IPFS Thing 2022 - IPFS Implementations - https://2022.ipfs-thing.io
Hello everybody, my name is b5. You might have heard me earlier today. I have the distinct pleasure of introducing some new stuff, and new-stuff day is my favorite day. Today I have the distinct honor of doing not one but two things, because we're going to talk about an implementation of IPFS, but for you to really get the full context we need to tell you who's making it, and for that we get to introduce a new company, which is fun.
So let's start with Number Zero, which is, in fact, the name of the company. Number Zero was incorporated on April 1st, as n0, Inc., in the state of Delaware. April Fool's Day was a great day to name your company; "n0, Incorporated" felt right. We've not been around for long and we're a relatively small team, these four people for the moment. If you're in the IPFS community, you may recognize some of these faces.
A number of us have been working in and around IPFS for years and years and years. We're not going to talk about the company too much, but the bit that you need to know is that, at a high level, Number Zero is really focused on building efficient distributed systems, and efficiency is the keyword we want to key on. As a team, we're very interested in the idea of considering the efficiency equation holistically. You can optimize for performance; you can optimize for decentralization.
We want to optimize for the idea of something being wildly efficient, like the most resource-efficient thing we can think of. We haven't finished any of the work here yet, but we're really interested in this idea of sublinear scaling, which is what happens when you start to intersect the concept of efficiency with networks. We haven't even figured out how to quantify this yet, but this is the degree to which we're chasing wild ideas in the efficiency space.
With that in mind, we can now get on to the actual implementation itself, which we are calling iroh. iroh is a new open-source implementation of IPFS. It is written entirely in Rust, because Rust is better, fight me. It is open source, open source today, at n0-computer/iroh on GitHub. We're targeting an actual release sometime around the end of this year. The reason we say "actual release" is: if it isn't documented, it doesn't exist, and so until the documentation is up and ready, it's not done in our eyes.
We have three high-level goals for the project: it's got to be efficient, it's got to be robust, and it's got to be platform-specific. In terms of efficiency, we are building the most efficient implementation of IPFS on any planet. This is our high-level goal; this is what we want to accomplish. We have been measuring everything since day one. So far the numbers are in: we're not the most efficient implementation today, but we're doing pretty well. This is, again, what happens when you start thinking in terms of efficiency.
We just take the number of requests per second, which is sort of the requirement side of it, peg it at a thousand, and ask for 100-megabyte files, and this is how iroh is performing today. We're sort of hitting 80% CPU, as we want to be; our memory consumption is a little lower, and our average latency, interestingly, is a little bit higher. So we won't lie to you when the numbers aren't entirely in our favor, but we're pretty proud of this after, what has it been, a couple of months?
Just because something's efficient doesn't mean you can't go further: when you get an efficiency gain, you can take that and just dump it into throughput. So we're doing all right on throughput already as well. I should mention what these charts are showing. We have built a gateway, and the gateway is really there to make us comparable to other implementations; the gateway checker and the gateway comparison tools are really robust, and so this is where we start.
We have a bunch of tests that we run, and we'll get into more detail in a talk tomorrow in the Connecting IPFS track that Casey's giving. But we measure a bunch of gateways out there, and we've learned a whole bunch of things about which gateways work and which don't. When you actually start deploying stuff, when you actually start looking at network infrastructure, there are a lot of things that go into that equation. But the takeaway for us so far is that we're already delivering consistently faster than ipfs.io and dweb.link.
We obviously haven't tested that with the same level of load that ipfs.io and dweb.link have, but one day, maybe. Let me say something about robust: we want to be able to do fewer things, and we want to do them with a high level of polish. We're really aiming for that.
We want an "it just works" experience with this project, and that's really important to us, to the point that we are not going to support the entire section of utility, or set of features, of go-ipfs that Kubo supports. But the things that we do support, we're going to do them really well; we're going to pay attention to them and really make sure that they're executed to a high level of excellence.
And then, last but not least, we think platform-specific, and what we really mean by this is: build to the platform. For the platforms that we want to support, we want to take this core set of code and actually write to the intended platform. The easiest way to describe that is through iroh Cloud, which is iroh for the cloud platform.
Specifically, a number of things come out when you start to say: we're going to take a core code base and target it at a platform. These are the things that we're optimizing for. The most important one is actual horizontal and vertical scaling, so we've taken what would normally be inside of Kubo and broken it out into separate processes that then talk to each other over gRPC. The upside of this is that it behaves a lot more like a traditional set of cloud tools.
You have horizontal and vertical scaling, so you can scale any one of these pieces, the gateway itself included. And hopefully you can pick up on this: you can start to stick things in between those services and do what you need to get this to behave the way that you want. We'll go into more detail on this in another talk; that's why it's going to be composable. This is why you check your slides beforehand, but yeah, that's iroh.
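The process split described above can be sketched with a toy example. This is not iroh's actual code: the service and message names are made up, and an in-process channel stands in for the gRPC boundary between the gateway and store processes.

```rust
use std::collections::HashMap;
use std::sync::mpsc;
use std::thread;

// A request from the gateway service to the store service; in the
// real system this boundary would be a gRPC call between processes.
struct GetBlock {
    cid: String,
    reply: mpsc::Sender<Option<Vec<u8>>>,
}

// Spin up a toy "store" service and resolve one CID through it,
// the way a gateway would.
fn fetch(cid: &str) -> Option<Vec<u8>> {
    let (store_tx, store_rx) = mpsc::channel::<GetBlock>();

    // Store service: owns the blocks and answers lookups. Scaling it
    // horizontally would mean running more copies behind the boundary.
    let store = thread::spawn(move || {
        let mut blocks: HashMap<String, Vec<u8>> = HashMap::new();
        blocks.insert("bafy-example".to_string(), b"hello from the store".to_vec());
        for req in store_rx {
            let _ = req.reply.send(blocks.get(&req.cid).cloned());
        }
    });

    // Gateway side: one request/response round trip over the boundary.
    let (reply_tx, reply_rx) = mpsc::channel();
    store_tx
        .send(GetBlock { cid: cid.to_string(), reply: reply_tx })
        .unwrap();
    let block = reply_rx.recv().unwrap();

    drop(store_tx); // closing the channel lets the store thread exit
    store.join().unwrap();
    block
}

fn main() {
    let block = fetch("bafy-example").expect("block not found");
    println!("{}", String::from_utf8(block).unwrap());
}
```

Because each side only sees the message types, anything (a cache, a load balancer, a metrics proxy) can be inserted between the services, which is the composability point made above.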
We are aiming for these things, and we'd love to get into the mix on that with you this week. But before we get too much farther into this, I think it's fun to address the question of: why IPFS? Why a new implementation of IPFS in 2022, and what's our reason for even building out this technology in the first place? I'm really convinced of this statement: the most exciting days for IPFS are ahead of us, not behind us.
It can feel really exciting to see the explosion of where IPFS has been and how far we're going, but I've spent a lot of time thinking about this. I think about the adoption curve a lot, and what it means to drive a protocol deep into the early majority: the idea where we're thinking of tens, hundreds of millions of IPFS nodes, or IPFS-oriented systems, and what it takes to get us there. And so, as a community, I'm really excited to understand that.
How do we drive adoption when the slope of that curve is really hard to get through, and how do we get over this thing called the chasm, which I will rant about a lot this week if you ask me? For that, as Number Zero, we want to have some way to contribute to that adoption curve, and we think that IPFS needs to be in many, many more places and on more platforms.
Specifically this: there are 6.6 billion smartphone users on the planet in 2022, and it just, point blank, has got to be a thing that we solve properly. If IPFS is going to make its way down the adoption curve, we have to figure out mobile in a very serious way, and so for that we start at index zero.
We have three new things to talk about today. There's another platform that we've been targeting from day one, and that's iroh Mobile. This is the same set of code that is going into iroh Cloud, but it is a ground-up rethink of IPFS for the mobile platform. This does not take everything that we know today and just jam it into a phone, or compile it down onto a phone. This is: think about high churn, think about the characteristics of a mobile device.
What you can and can't do, and how you can make something that is IPFS but also runs in that environment. This is early days and we are nowhere near finished with this, but this is what we want to get out of it. We want it to be native OS APIs: drop any of these Swift, Objective-C, Kotlin, or Java APIs into your native app, and it won't make your phone hot (which we're working on). We're going to back it with iroh Cloud.
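One common way to back Swift/Objective-C and Kotlin/Java wrappers with a single Rust core is to expose a C ABI layer. The function names below are hypothetical, not iroh's actual mobile API; this is just a sketch of what that boundary looks like.

```rust
use std::ffi::{CStr, CString};
use std::os::raw::c_char;

// Hypothetical C-ABI entry point that a Swift or Kotlin wrapper
// could bind to; the real iroh mobile surface may look different.
#[no_mangle]
pub extern "C" fn iroh_version() -> *mut c_char {
    CString::new("iroh-mobile 0.0.0 (sketch)").unwrap().into_raw()
}

// The caller hands the string back so Rust can free it; freeing it
// on the Swift/Kotlin side would mismatch allocators.
#[no_mangle]
pub extern "C" fn iroh_string_free(s: *mut c_char) {
    if !s.is_null() {
        unsafe { drop(CString::from_raw(s)) };
    }
}

fn main() {
    // Exercise the boundary from Rust the way a native wrapper would.
    let ptr = iroh_version();
    let version = unsafe { CStr::from_ptr(ptr) }.to_str().unwrap().to_owned();
    iroh_string_free(ptr);
    println!("{version}");
}
```

Compiled as a `cdylib`, functions like these can be called from generated Swift or Kotlin bindings, which keeps one Rust code base behind every platform's native API.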
dignifiedquire, who will be here later, has been working very hard for the last two or three weeks on a system called iroh Share, to prove out the whole system itself and show that we can actually make some inroads on mobile early. So we have an actual working subsystem of this today: a mechanism where we can use QR codes to drive a data transfer process that utilizes a whole host of technologies that we all know and love inside of the IPFS universe, to do data transfer across devices seamlessly.
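The QR-code flow implies some kind of "ticket": a single string the QR code carries, bundling the content root and where to dial. Here is a minimal sketch; the format, field names, and values are made up for illustration, since the talk doesn't show iroh Share's real encoding.

```rust
// A "ticket" bundles what the receiving device needs: the content
// root and a dialable address for the sender. Rendered as a QR code,
// scanning it is enough to start the transfer.
#[derive(Debug, Clone, PartialEq)]
struct ShareTicket {
    cid: String,  // root CID of the DAG being shared (hypothetical value below)
    addr: String, // address of the sending device
}

fn encode(t: &ShareTicket) -> String {
    format!("iroh-share:{}@{}", t.cid, t.addr)
}

fn decode(s: &str) -> Option<ShareTicket> {
    let rest = s.strip_prefix("iroh-share:")?;
    let (cid, addr) = rest.split_once('@')?;
    Some(ShareTicket { cid: cid.to_string(), addr: addr.to_string() })
}

fn main() {
    let ticket = ShareTicket {
        cid: "bafy-example-cid".to_string(),
        addr: "192.0.2.7:4433".to_string(),
    };
    // This string is what gets drawn as the QR code.
    let encoded = encode(&ticket);
    println!("{encoded}");
    // The scanning device reverses it to learn what to fetch, and from where.
    assert_eq!(decode(&encoded), Some(ticket));
}
```

Once the receiver has the CID and address, the actual transfer can use the familiar IPFS machinery (content addressing, block exchange) mentioned above.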
This is working today: we have a version of this that is running and relatively efficient, and we're starting to measure it. We're very excited to talk about this very early thing; we're at sort of a demo phase for it, and so over the week we'd love to show you iroh Share, getting it onto our actual devices.
But we have big plans for this, and very serious ambitions for what we think it can do to drive IPFS, the protocol, down the adoption curve toward the early majority. So that's just a quick gloss. We're very excited: we've got three new things to show, to work on, and to get your feedback on. We are really fired up about where things are going; the last couple of months have been some of the most exciting months.