From YouTube: 🖧 IPLD weekly Sync 🙌🏽 2020-06-08
Description
A weekly meeting to sync up on all IPLD (https://ipld.io) related topics. It's open for everyone and recorded. https://github.com/ipld/team-mgmt
B
Welcome everyone to this week's IPLD sync meeting. It is June the 8th, 2020, and every week we go over the stuff that we've done in the past week and the stuff that we might work on, and also discuss any items or questions there might be. So I'll start with myself. Last week I finally found some time to put into the Rust IPLD stuff. There's currently a PR open about custom codec tables — hey Louis — which means that things are then fully generic.
B
So you can just say: okay, for example, IPFS uses its own table, which supports, like, protocol buffers and CBOR — they don't care about dag-json, for example. Or you build a special-purpose application that only works on dag-json or your own codec, and you can pass in those kinds of table implementations, and then you don't bundle the whole thing. And now, with this PR, the code is so generic that it just works for this stuff.
B
If you implement the right traits. That's a huge step, and it also then enables me to work on the Block API. To me, the Block API is still a bit separate, like small, truly application-level stuff, but then also the Block API should move into the core of IPLD. This is what I'll work on next, and now I'm really close: all the other stuff that I've needed for the Block API is in this PR, and it looks good, so it will hopefully be merged soon.
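The pluggable-table idea described above can be sketched in a language-neutral way. The actual PR is in Rust and uses traits; the `CodecTable` class, function names, and registration API below are hypothetical illustrations of the concept, not the real crate's API (0x0129 is dag-json's registered multicodec code):

```python
# A minimal, hypothetical sketch of a "pluggable codec table": instead of
# bundling every codec, an application passes in only the encode/decode
# implementations it needs, keyed by multicodec code.
import json

DAG_JSON = 0x0129  # multicodec code registered for dag-json

def dag_json_encode(obj) -> bytes:
    return json.dumps(obj, separators=(",", ":"), sort_keys=True).encode()

def dag_json_decode(data: bytes):
    return json.loads(data)

class CodecTable:
    """Maps multicodec codes to (encode, decode) pairs supplied by the app."""
    def __init__(self):
        self._codecs = {}

    def register(self, code, encode, decode):
        self._codecs[code] = (encode, decode)

    def encode(self, code, obj) -> bytes:
        if code not in self._codecs:
            raise KeyError(f"codec 0x{code:x} not in this table")
        return self._codecs[code][0](obj)

    def decode(self, code, data: bytes):
        if code not in self._codecs:
            raise KeyError(f"codec 0x{code:x} not in this table")
        return self._codecs[code][1](data)

# A special-purpose application bundles only dag-json:
table = CodecTable()
table.register(DAG_JSON, dag_json_encode, dag_json_decode)
block = table.encode(DAG_JSON, {"hello": "world"})
assert table.decode(DAG_JSON, block) == {"hello": "world"}
```

The point of the genericity is visible in the error path: a codec that was never registered simply isn't there, rather than being dead weight in every build.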
B
Also, on the Rust multibase side, Peter had done a bit, and I also implemented it for us, and then found out that the test fixtures could use some cleanup, so I did those — basically went over it and looked into it. There are also some discussions about whether the base encodings that have only one kind of character — lowercase or uppercase — should be case-insensitive or not. So if you are into those discussions, feel free to join.
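For context on that case-sensitivity question, here is a small sketch using Python's standard library (the `casefold` flag is Python's own knob, not a multibase decision) showing what a case-insensitive decoder for a single-case encoding like base32 looks like in practice:

```python
# Base32's alphabet is single-case (uppercase), so a decoder may choose to
# also accept lowercase input. Python's stdlib exposes that choice as the
# `casefold` flag on b32decode.
import base64

encoded = base64.b32encode(b"ipld")
upper = base64.b32decode(encoded)
lower = base64.b32decode(encoded.lower(), casefold=True)
assert upper == lower == b"ipld"

# Without casefold, lowercase input is rejected outright:
try:
    base64.b32decode(encoded.lower())
except Exception as exc:
    print("strict decoder rejects lowercase input:", type(exc).__name__)
```

The trade-off under discussion is exactly this: a forgiving decoder accepts more real-world input, while a strict one guarantees a single canonical form per payload.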
A
What happens when you decompress the parts that can be decompressed within, let's say, Debian files or stuff like that, and what is the effect on this — while, at the same time, keeping the resulting files usable? So, for instance, .deb files can actually be uncompressed and can still be verified, and the data is being deduplicated. So there's an important comment there, also on what this means for the multibase stuff.
A
There is also an interesting discussion that I started — and I just wrote to it earlier, so I imagine you surely did not have time to read it yet: what can we reasonably put on the DHT these days? So, for example, the Bitcoin work — how much of that may actually be put on the network to be available? And the answers to that are a little bit concerning, but we can sort of circle back to this after that.
C
There's three main things, but also a bunch of just little things. I'm still working on the Bitcoin docs for the specs repo. That's tedious work, but I have to get this stuff out of my head and into the spec docs. And it's fine — there's no big deal there. It's just a matter of actually writing coherent documents that are structured well enough to describe what's going on at a sufficient level. So I'm working on that, in bits and pieces.
C
Going back a bit, I've started working on Zcash, dumping their blockchain. Thankfully it's much smaller than Bitcoin and much faster, and so far things are going smoothly. My code's not up to scratch for decoding and encoding yet, but it's a lot of the way there, and a lot of the other utility stuff that I wrote to manage the Bitcoin blockchain just translates directly over, so that ends up making it much smoother so far. So that's that!
C
Oh — there was a... no, no, skip that one. I've also started looking at Tim's selector syntax and doing that in Go. Tim's having a little lull at the moment with being able to get anything meaningful done, and we need this selector syntax — the shorthand and the long-hand string syntax — parsable into Go data structures.
C
So I've started that, and, you know, that process of jumping from language to language when you've been deep in one is a little bit jarring — getting back to Go again after a few weeks very deeply in JavaScript — so I'm balancing those things, trying to get meaningful work done. That's me.
D
Yeah, I'm kind of writing up Google Docs, so I didn't really get much code done, but a little bit at the end of last week, over the weekend, and today. I've gotten some JS multiformats stuff done: I got some of Rod's work merged and some of their stuff reviewed, got started on it, and merged in some documentation, like a nice table of all the plugins we've been writing. And then dag-cbor is ported.
D
It's on the new multiformats now, and I'm working on the rest of them right now — I've actually got some of that stuff up beside it. Rod, did you have to do any hash functions or anything like that for the multiformats stuff, or for Bitcoin, or was it all just straight-up codec work?
D
Okay, I'll just rethink how some of this is documented. If you look at the new docs, there's a section where we have bundles that have, like, a bunch of different hash functions and codec stuff — like the basics bundle that it comes with — and then a breakdown of each package of base encodings or codecs or whatever. And so it makes it a little awkward to have a package below that, that's not a bundle, that has both codecs and hashing functions. So we'll just figure that out — that would be nice.
D
...instead of requiring deep into the package. What do we think about ESM right now? Oh — so it came out from behind a flag in LTS Node, so we're good to use ES modules in our Node code now, if we want to. I'm trying to figure out how much work it would be to get all of our tooling up to date — like, I don't think that polendina will work, in particular.
E
Yes, yeah.
D
But the reason that this is important for us is that in the multiformats library in particular, but really across our packages, we've seen a lot of problems with importing files within a package that weren't explicitly exported — people relying on things that were package-internal specifics. With ESM you actually get the ability to explicitly export individual files and nothing else, so you can actually keep internal modules internal. And also those imports are a lot less expensive now as well, for importing individual exports.
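The explicit-exports mechanism described here is Node's `"exports"` field in `package.json`. A minimal hypothetical example (the package name and file paths are invented for illustration, not the real js-multiformats layout) might look like:

```json
{
  "name": "example-multiformats",
  "type": "module",
  "exports": {
    ".": "./index.js",
    "./bases/base32": "./bases/base32.js",
    "./hashes/sha2": "./hashes/sha2.js"
  }
}
```

Anything not listed under `"exports"` — internal helper modules, for instance — cannot be deep-imported by consumers at all, which is exactly the "keep internal modules internal" property mentioned above.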
B
So I would say we should port the newer stuff. I mean, we have the older stuff — like, in that case, probably the ipld dag-pb and so on — and we have the new stuff, like the new JS multiformats. So I would say we should port the newest stuff to ES modules, but definitely not touch the old stuff — the old stuff should just stay as it is. Agreed?
B
And I think we could basically take the chance: as we already have the new stuff, and it will break things anyway and people need to move to the new stuff, we can also just go ahead and use ESM. I don't know if it breaks too many things then, but yeah — if I had to make the call, I would say we should feel free to use ESM for the new stuff.
A
How do we deal with that? Because the current thinking on the gateways is essentially that you have some kind of root — you give your static CIDs as a root — and then you do a name-based selector on top of that, which means that when your node resolves this selector back to the root CID, it knows who to ask — that being the currently connected peers — for each CID resolved off of that root CID.
D
So, I mean, there's threads going back several years now that it's not sustainable to put every CID of a large data structure in the DHT, and there's a lot of different thought people have put into alternative discoverability mechanisms. So whether that's just, like, "oh hey, I'm in this network that does Bitcoin-y stuff — ask this network for CIDs", and that's how we're discovering people, or all the way to, okay...
A
I was actually asking a little bit differently, and I guess this is a question to Rod mostly: when you implemented your way of breaking down the blockchain into blocks, was it taken into consideration that some of the parts of the blockchain are only reachable by a selector and not by CID directly?
C
A bunch of stuff there. To start with, the way I broke it down was simply to match the way that they have built it — Bitcoin has its own built-in idea of what a hashed chunk of data is, so it's matching that. But secondly, the aim was never to put it on IPFS. I'd like to put some of it on IPFS, just as an example, but I think a lot of the work that we do in IPLD doesn't account for IPFS.
C
We treated IPLD as an independent thing that could be implemented on its own — it could be part of IPFS, but also it could be separate, or IPFS could have its own distribution mechanism. So, you know, that was never a constraint for this work. And thirdly, one of the goals of this work was that you could take existing Bitcoin block IDs and transaction IDs and find them by CID in whatever storage mechanism this stuff was stored in — and that constraint's been met.
C
It's just that the way that Bitcoin decomposes introduces a lot of noisy intermediate blocks, because of these binary Merkle trees inside of the block graph. So there's just not really much of a way around that to make this nice and neat. And even if you were to get rid of the noise, you know, we're still talking a lot of transactions in every block graph — this is a lot of data.
C
You could re-encode all of these things and come up with an entirely new CID for every Bitcoin block graph, but those CIDs would have no relationship to anything in the Bitcoin world — they'd just be essentially random. But you could do that, and then end up only having 600,000 blocks on IPFS, and each block would contain everything.
C
This is the problem in the Bitcoin world: the terminology of "block" is really weird, because they've taken "block" to mean this thing that is produced every ten minutes — the entirety of that thing. But that's not what they're hashing to produce the digest that becomes the address. What they're hashing is the 80 bytes. So a Bitcoin block is really a block header, addressed by 80 bytes, and then you've got this graph attached to it.
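The 80-byte point can be checked directly: hashing just the genesis block's header with double SHA-256 reproduces its well-known address. A minimal sketch, with comments marking the header layout:

```python
# The Bitcoin "block hash" is the double SHA-256 of just the 80-byte header
# (version, previous-block hash, merkle root, time, bits, nonce) -- not of
# the transactions. Demonstrated with the well-known genesis block header.
import hashlib

genesis_header = bytes.fromhex(
    "01000000"                                                          # version
    "0000000000000000000000000000000000000000000000000000000000000000"  # prev block
    "3ba3edfd7a7b12b27ac72c3e67768f617fc81bc3888a51323a9fb8aa4b1e5e4a"  # merkle root
    "29ab5f49"                                                          # time
    "ffff001d"                                                          # bits
    "1dac2b7c"                                                          # nonce
)
assert len(genesis_header) == 80

digest = hashlib.sha256(hashlib.sha256(genesis_header).digest()).digest()
block_hash = digest[::-1].hex()  # displayed big-endian by convention
print(block_hash)
# 000000000019d6689c085ae165831e934ff763ae46a2a6c172b3f1b60a8ce26f
```

Everything else reachable from that header — the transactions, the merkle internals — hangs off those 80 bytes as a graph rather than being part of the hashed payload.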
C
That graph is other blocks. And so what I've done, and what I'm doing in the documentation that I'm writing now, is being really clear about the definitions here, and saying that in the Bitcoin world, when they say "block", it's not what we mean by "block". So I'm saying "block graph" to refer to what they would say is a block: one is produced every 10 minutes, but it is a graph of multiple blocks, because they also address individual transactions, and you can go in.
C
You can query Bitcoin Core, or one of these block explorers, for transactions. What are they? Well, okay, they're within a block, but they're also addressable individually. So this is a confusion of terminology that I don't think was ever cleared up properly in the Bitcoin world, and that I'm trying to be much more clear about in this work: there are multiple blocks within a block graph, and a block graph is produced every ten minutes.
C
Yeah, it's messy — it's really messy! But when you break it down into "okay, what are the chunks that they produce these digests for?", then you can see that there is actually a coherent graph of what we would call blocks, and it's a single hash function, and it makes a well-formed graph — except for the SegWit garbage. It's still a graph, so yeah.
C
The main messy piece about that — SegWit aside — is that Bitcoin treats the binary Merkle tree that traverses from the header to the transactions as implicit, and so it doesn't store the internal nodes of that binary Merkle tree in its own binary format. It just says that all of that exists and you can compute it, and waves it away. And so what I'm doing is actually computing it and storing it.
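A minimal sketch of what "computing and storing" the implicit tree involves, using Bitcoin's standard merkle construction — pairwise double SHA-256, with the last node of an odd-sized level duplicated. The placeholder leaves here are not real txids, and the `merkle_levels` helper is invented for illustration:

```python
# Bitcoin's transaction merkle tree is implicit: only the leaf txids and the
# root (in the header) appear in the wire format. This sketch materializes
# every intermediate level so each node could be stored as its own block.
import hashlib

def dbl_sha256(b: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_levels(txids):
    """Return every level of the tree, leaves first, root last."""
    levels = [list(txids)]
    while len(levels[-1]) > 1:
        level = levels[-1]
        if len(level) % 2 == 1:          # Bitcoin duplicates an odd last node
            level = level + [level[-1]]
        levels.append([dbl_sha256(level[i] + level[i + 1])
                       for i in range(0, len(level), 2)])
    return levels

# Three placeholder "txids" (real ones are 32-byte double-SHA-256 digests):
leaves = [bytes([i]) * 32 for i in range(3)]
levels = merkle_levels(leaves)
assert len(levels[-1][0]) == 32                 # the root is a 32-byte digest
assert [len(l) for l in levels] == [3, 2, 1]    # intermediate level materialized
```

Those intermediate nodes are exactly the "noisy intermediate blocks" mentioned earlier: they exist in the hash structure whether or not Bitcoin's own format bothers to store them.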
C
The way I view it, this is IPFS's problem, not ours. You know, we want to address data, and so we're just doing that, using the tools available to us, and we're trying to be agnostic about the storage format. Which is why, in that chart about what IPLD is that's floating around — the specs repo has a copy of it in a pull request — it has IPLD as being, you know, the thing the applications sit above, while the storage layer sits below.
D
If IPFS just did something along the lines of "oh, here's a data set and here's a selector, and just the things that match the selector are what you should put in the DHT", that's enough to take the whole Bitcoin data set and then basically put in the DHT all of the block graph identifiers, which are what Bitcoin tends to look at as its addresses. So that would be enough — and from there you can get anything under the graph, right?
D
So it's actually pretty trivial to go, like, "oh, I need some data and I know that it's somewhere in this transaction — okay, now I can find the peers that have it, and then go down the graph and ask them for it", right, as I get the data. It's all fixable, but the features that exist in IPFS right now are really general, and we don't really have any control or all that much input into when that happens. So it's best to just not really think about that.
C
The piece that really sucks with Bitcoin is this: if you have a transaction ID that you've stored because it was your transaction, and you want to look it up on IPFS — say we did load this all into IPFS — you could get that transaction, but it would be devoid of witness data. So if there was witness data attached, it wouldn't be where you look for it, and you would have to navigate backwards to find it, which you just can't do — there's no way to get from there.
C
If you only had that transaction ID: okay, I can see the transaction, but so what? I don't know what header it's attached to; I don't know how to go back up so I can go back down and find the witness data. It's just a thing in isolation. So that's the sucky bit about SegWit: the transaction IDs don't give you the full transaction. You've got to find the full version of a transaction elsewhere.
D
So there's basically, like, a new block now that references both of these — the index and the primary store — and often you may have a primary store or have an index and want to get at one or the other, and there's no reference inside of either of those trees to each other. There's only a reference from that header. That's about it!
D
From where, though, right? Because you can have n number of things that link to it, so you never know which one you're talking about. But one thing that I do have in dagdb is that all of the storage layer stuff — so S3, file, like, everything — stores a reverse index. So you can always find all of the blocks that link to a particular block, and that's really important if you want to do smart garbage collection, because it's the only way for you to go and figure out, like...
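The reverse-index idea can be sketched minimally. The `Store` class and its method names below are hypothetical illustrations, not dagdb's actual API:

```python
# A minimal sketch of a reverse index: alongside the forward links of each
# block, the store records which blocks point *at* a given block, so garbage
# collection can ask "who still references this?" directly.
class Store:
    def __init__(self):
        self.links = {}       # block id -> ids it links to
        self.referrers = {}   # block id -> ids that link to it (reverse index)

    def put(self, block_id, links=()):
        self.links[block_id] = set(links)
        for target in links:
            self.referrers.setdefault(target, set()).add(block_id)

    def collectable(self, block_id) -> bool:
        """A block with no remaining referrers is a GC candidate (unless pinned)."""
        return not self.referrers.get(block_id)

store = Store()
store.put("header", links=["tx-root"])
store.put("tx-root", links=["tx1", "tx2"])
store.put("tx1")
store.put("tx2")

assert store.referrers["tx1"] == {"tx-root"}
assert store.collectable("header")       # nothing links to the header
assert not store.collectable("tx-root")  # still referenced by the header
```

Without the reverse index, answering "is anything still pointing at this block?" requires scanning every forward link in the store, which is what makes naive GC expensive.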
C
I wonder how useful schemas would be with this, because they tell you the important relationships. One thing that I haven't done with Bitcoin yet, which is doable, is the links between historical transactions. Every transaction links to the previous transaction where the coins came from, and so you've got the headers that form the coherent blockchain from one block to the next, but also, within the transactions, they link to each other in this sort of arbitrary manner.
C
But there is this other DAG that links these blocks together — or the block graphs together — and it would be nice to be able to traverse from one transaction to the previous transaction that had held those coins, and to the one before that. But they sort of sit aside, and currently in my schemas I don't even represent those as links yet. So that's doable, but not important right now. Still, though...
D
A thing that you could use schemas for is this: if you were doing that query that I talked about earlier, where you said "give me the blocks that link to this", you could say "give me the blocks that link to this that match the schema", and then it would at least pare it down to just the things that match what you believe the parent schema is supposed to be.
C
Well, what about the case — I'm thinking Bitcoin here, obviously — where I add a graph to, like, dagdb, say, and I add it according to a schema? So I've got a set of blocks that goes in, and, by the way, this set conforms to this entire schema mapped onto the set. Here's the whole lot — because that's my use case: here's the whole lot of all these blocks into whatever your store is, and here's the schema for how they link together.
D
No, I mean, like, you could always create an index, right — that's how this gets handled. So, first of all, what you store is always referenced from the root of that schema when you store it in the primary store. But I imagine that you're putting it in the database because you can get some kind of better indexing on all the values, and then those values are in an index, and you know the semantics of a match are whatever that index does, which is open-ended.
B
No, that's specific to GraphSync in Rust. All the information there is that, basically, the team that wanted to work on GraphSync in Rust has just postponed it a bit and has other priorities, and will get back to it. That's some data point, but yeah, it's open as to when — so there's no more information about it.
D
Everybody is sort of aware of his situation — I mean, he's got, like, a family of five under quarantine right now and everything, so it's just been a bit too much to stay on top of everything. So his kind-of-part-time contract with us is ending now, and so he won't be working on the selector stuff anymore. He may come back at a later date.
D
It's just sort of up to him when he decides he has the bandwidth to come back. So that particular selector stuff is just getting set down for a minute while we figure out who's going to do it and when. It may end up being me at some point in time, but we'll see what the priorities look like.
B
There's the team around KLH — and I think someone from xyx — who is also, I think, interested in GraphSync, and they also do a ton of Rust stuff. So it might be interesting to catch up with them if they do anything in this regard, because they might also look into it, or they might have ideas about it — just to make sure that you sync up with them in case you're going to get into it, and that you aren't stepping on each other's toes.
D
One private thing for the team, real quick, about some of the internal politics — wait.