From YouTube: 🖧 IPLD weekly Sync 🙌🏽 2020-08-03
Description
A weekly meeting to sync up on all IPLD (https://ipld.io) related topics. It's open for everyone and recorded. https://github.com/ipld/team-mgmt
B
Welcome, everyone, to this week's IPLD sync meeting. It's August the 3rd, 2020, and as every week we'll go over the stuff that we've worked on in the past week, and then we'll go over whatever is on the agenda and any action items. Cool, so yeah, I'll start with myself this week. I actually got plenty of time to work on IPLD stuff, most notably the Rust multihash stuff.
B
It's currently in a library called tiny-multihash, but the plan certainly is to get it upstream. That's kind of agreed upon, so it will go upstream; it's just a matter of time. I still want to improve the documentation a bit and fix some minor things, but it should land really soon. And on the multiformats stuff in JavaScript:
B
There's also plenty of stuff going on there, because achingbrain is currently working on moving the code from using Node.js Buffers to using Uint8Arrays instead. Of course it's a breaking change for the whole API and all the libraries, because we then have Uint8Arrays moving in and moving out. It's not as bad for Node.js users, because a Node.js Buffer is also a Uint8Array, so you can at least pass in a Buffer, but you will get a Uint8Array out.
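A minimal sketch of why the migration is gentler for Node.js users: `Buffer` subclasses `Uint8Array`, so existing Buffers can still be passed in while the library hands back plain Uint8Arrays. The `toUint8Array` helper is hypothetical, for illustration only, and is not the actual multiformats API.

```javascript
// Buffer is a subclass of Uint8Array, so existing callers can keep
// passing Buffers into the migrated APIs.
const buf = Buffer.from('hello');
console.log(buf instanceof Uint8Array); // true

// Hypothetical helper showing the shape of what the libraries do:
// accept any Uint8Array (including Buffer), return a plain Uint8Array.
function toUint8Array(bytes) {
  if (!(bytes instanceof Uint8Array)) {
    throw new TypeError('expected a Uint8Array or Buffer');
  }
  // Re-view the same memory as a plain Uint8Array (no copy).
  return new Uint8Array(bytes.buffer, bytes.byteOffset, bytes.byteLength);
}

const out = toUint8Array(buf);
console.log(Buffer.isBuffer(out));      // false: callers get a plain Uint8Array back
console.log(out instanceof Uint8Array); // true
```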
B
But I actually have one request for Rod: there's an open PR on js-cid with the breaking change, and it would be good to get a thumbs-up for that PR. I will post it in the notes later on. I basically thought I want to get at least a thumbs-up from the IPLD team, and it's kind of blocking him from other changes.
B
So it would be great if you could do it today, because I told him that tomorrow, his day (he's in the UK), he'll probably get the approvals, as I don't expect any major blockers. Cool, yeah.
A
I'll buy it. Having dealt with this in the past: with a value-type change like this, it's better to just be able to give people an error, rather than have it work most of the time until they try to access a method that's not there. It is better. No, I mean we should move js-multiformats and the CID to use bytes as soon as possible, and deal with that change in all the multiformats stuff, because we're using .buffer.
B
It's not like you have to agree because you don't have a choice; I really want your input on it, so yeah.
A
Yeah, I commented. I think that we should add some other properties that are going to make the future migration easier as well, but in general it's a good change; we should do it.
B
Yeah, I think Alex would want to make this change as minimal as possible, but I agree that we should introduce those as well. So I guess the trade-off will be: if you are not happy with the bytes change and he's not happy with the other changes, we just do both as a compromise, yeah.

B
I will tell you for sure, but so yeah, cool.
C
Right, so the bulk of my week was spent in the AMT: grokking what it's doing and what the implications are, writing docs for it, having discussions about it, filing some bug reports and some pull requests just to fix some very minor things. I still haven't quite finished those. Because we don't have a spec for this, I'm ending up
C
writing spec-style docs in the thing itself, but I've also said that I will probably just write a spec doc for it, so I'm sort of still working my way up the stack of docs. I keep saying to people that the trade-offs here are much more interesting than in the HAMT, and you would have a much bigger impact by tuning parameters with this one. You can push it into really perverse directions depending on the parameters, the input types and the usage style.
C
Whereas a HAMT is a lot more forgiving, because you've got the hash to randomize things. With this one, you can abuse it and make some really odd-shaped trees, and also lots of small blocks: if you really do have very sparse data, then you can end up with lots of little blocks, and I think that's worth documenting.
C
I'm trying to do it in a very clear way so that the busy Filecoin folk can read and understand it and think about the trade-offs they're making, because I know they're concerned about chain size right now.
C
So I've just got to finish that up. Now, Eric and I have been talking about a test fixture system for cross-implementation use, and this has been something that was on the agenda last year, that I was working on. It's how I got into CAR and zipcar in the first place, and I never got back to it. But now, with this HAMT and AMT work, it turns out
C
Filecoin needs this, because they have these implementers doing stuff, and there are some confidence issues with the code, its portability, and the ability to implement things safely.
C
So we planned out the basic framework for what this thing might look like and how we might put it together, and came up with a really interesting thing to shoot for: fairly basic, but it should be fairly useful. It might be worth synchronizing with Volker on this at some point, to think about how quickly a Rust one could be done as well.
C
So the idea is, we have these simple fixtures where there's a file per fixture, per case, and you could pick one out and run it against your implementation, but you could also use it to debug where things break. There are a bunch of interesting features about this thing, even though it's fairly minimal as it is. So that's been on my mind since we talked about it, and I'm trying to clear the decks to get to it.
C
One of the issues I had with that was having to revisit the work I was in the middle of doing with the JavaScript CAR library, migrating it to js-multiformats, which I had put down for the time being. I need to get that finished, so I went back and started seeing what it was like to pull in more of Mikeal's
C
ESM migration stack into that. That's a whole rabbit hole in itself that I've been avoiding, and now I can't avoid it. And then just a bunch of minor adventures along the way: other things that keep coming up and needing to be dealt with that are not worth itemizing.
C
So right now we're focused on a minimum viable thing, so we don't have as much thought put into the negative cases; it's much more focused on the positive. And it's working initially on the assumption that it's not going to replace all of your tests for things. It's focused on a narrow use case: your thing has some basic testing that covers its common cases,
C
the things that you are concerned about in your implementation, and probably some negative testing as well. And the primary use case (I didn't mention this, but multi-block data structures are what we're thinking about, although I can see its utility across other kinds of IPLD applications) is very much: you have a thing that needs to behave in a very particular way and have very stable results.
C
So you can find the edges, push to extremes and see what it does, in the cases where you, as the tester, have identified that this thing needs to be tested and what its edges are. So, for example, we'll have that for the HAMT, from my JavaScript implementation anyway, so that we can just pull it in and say: using these parameters and these operations, you should get this really perverse version of it, and it should handle it.
C
Okay, there is space for doing things like "we expect you to be able to produce an error if you push it this far", but we haven't thought very much yet about the error cases. The priority is to get this thing working, and then maybe come back and start considering how we add in some of the negatives.
A
Yeah, I like it. It's hard to test some of the stress cases and some of the more specific things without getting really implementation-specific, and it's hard to get cross-language. You can't test error conditions in a cross-language way; that's going to be really difficult, though not impossible. You can't even test stuff like "this mutation operation should produce these blocks in this order", because you're assuming that the implementation yields blocks as they get created, which is not how all implementations necessarily work.
A
It's mainly: if I give you these operations to do, is this the new root hash of the tree? That's the machine. Or: if I give you this tree of data and tell you to do this mutation operation on it, what is the new hash of the tree? Those are really good and worthy tests, and they get at most of the stuff, but some of the crazier stuff is going to have to be more implementation-specific.
C
Yeah, in my JavaScript stuff I test the number of loads and saves from the data store, and that's not something you want to do across implementations, because everyone will do something different. So you just want to test that the results it yields are consistent. That's it.
D
Yes, so another week mostly spent on Filecoin stuff. The only IPLD-related portion there is that I started helping out a little bit with the Filecoin specs of how things fit together on the data-storage side, and we'll actually have more meetings about it, because it turned out there is a lot to talk about in specking this stuff out.
D
In the meantime, when I had a chance, I started looking into what a module for the chunker part of dagger would look like, and spent a little time over the weekend trying to compile different things. Yes, compiled Go is a little bit fat, but actually it's much smaller than I anticipated,
D
to be honest. Another thing I was working on, while talking through with Janice how our CommP is calculated and generated, and while working on the dumb-drop additions to actually validate files, was why the portion that we're using right now is so slow, and what it actually takes to implement it to
D
not be as slow as it is right now. And it actually is quite possible to do it about ten times faster than what it runs at right now, in Rust. I haven't yet looked into cleaning that Rust up in any way; it was just an exploration. And then there are some Go-specific issues with streaming that are also weird, and I need to pare down an example for maybe Eric to look at. But this is all for when I have time left over from rethinking Filecoin chains and testing deals and stuff.
F
Y'all, so I'm still working on DID stuff: DIDs, Decentralized Identifier documents. I managed to get a subsection about CBOR, and specifically DAG-CBOR and IPLD, into it. There's some feedback that the way I wrote it mirrors a lot of the DAG-CBOR constraints, including the deterministic, canonical mapping of CBOR and keys as text.
F
So there's a lot of talk right now about how to optimize, and make more concise, the keys-as-ints features of CBOR without necessarily requiring a separate registry, and so there's a lot of talk about CBOR-LD, with an external dictionary for keys mapping to attributes.
F
There's a huge vulnerability in that, in that it's a mutable context that describes the JSON-LD, and I have major reservations. It was talked about this week in the IETF CBOR session, and I'm actually trying to do some work with extending the @context to include a DAG-CBOR entry, where the DAG-CBOR actually is the mapping between the int and text fields, for conciseness.
F
But again, I welcome any feedback. They're giving me a lot of pushback on my section, saying it's out of left field and doesn't have enough implementations. But it's just CBOR: there are tags, you can actually use tags, it's well represented. I'm not inventing anything new in cryptography.
F
DAG-CBOR has CBOR tag 42, which is registered with IANA; it's there and actually batteries-included. Mostly, I think, it seems as though there's just a lot of corporate jousting going on about implementations and special interests. So I think I do need to update the example I gave in the document, in the CBOR section, to have a very clear COSE key format, but the real work is the mappings between DAG-CBOR, and in particular a DAG-COSE format, and pure COSE with the appropriate tags. And so I'm writing such a library in Rust.
A
Cool. Johnny, are you in the loop with the folks writing the DAG-COSE and DAG-JOSE codecs?
F
No. I think there are a couple of different people: certainly the Textile group, I think, are probably using this, and then the 3Box people, I think. But Christian Lundkvist and I have actually talked about this, and wrote a paper together years ago at Rebooting the Web of Trust, so I think there are a lot of similarities in the approaches. I think mostly it gets into standards and semantics for interoperability, and making sure we're all on the same page as far as how we solve it.
A
Okay, yeah. I actually just asked to have a call with them, in a thread that I now can't find, and I wanted to add you to it.
A
But yeah, I do want to get some clarity on that. I'm starting to question whether or not the way that we've been thinking about the representations in the data model is actually the right thing, and there may be something else that we want to do here. So I'd really like to hear more about how they're using the format, and how we expect it to be used in the wild, before we dig in.
F
Yeah, I think I was just thinking of it as basically DAG-COSE. The hard thing is that it can be an acyclic graph, and I think that's tricky: how do you prove that it's an acyclic graph?
F
Yep. And the way I did it in the paper I showed you guys before, from the Rebooting the Web of Trust in Buenos Aires that was cancelled, was that in the tag 98 COSE signature, you basically use a tag 42 as a byte string, which is a CID. But it's a little bit redundant, because in the algorithm section you're saying, let's say, a SHA-256 signature with EdDSA, but then the CID actually carries redundant information.
F
It's a multibase prefix plus the SHA-256, so it's double encoding. And I'm also not sure, as far as the payload, how long that payload can be; right now that would be like 55 bytes or something. So you're doing the SHA-256 plus prepending all those different separate bytes. So is that conformant, is that proper? Will people understand that: a CID-signed COSE signature with tag 98 pointing to a payload in tag 42? Is that cool?
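The "prepending all those separate bytes" can be sketched concretely. This is how DAG-CBOR encodes a CID link: CBOR tag 42 wrapping a byte string whose first byte is the 0x00 identity-multibase prefix. The hand-rolled encoder below is a minimal illustration, not a full CBOR library, and the 36-byte "CID" is dummy data.

```javascript
// Minimal sketch (not a full CBOR encoder) of a DAG-CBOR link:
// tag(42) wrapping a byte string of [0x00 identity prefix | CID bytes].
function encodeCidLink(cidBytes) {
  const prefixed = Uint8Array.from([0x00, ...cidBytes]); // identity multibase prefix
  const header = [0xd8, 42]; // major type 6 (tag) with 1-byte argument: tag(42)
  if (prefixed.length < 24) {
    header.push(0x40 | prefixed.length);   // short byte string
  } else if (prefixed.length < 256) {
    header.push(0x58, prefixed.length);    // byte string with 1-byte length
  } else {
    throw new RangeError('sketch only handles CIDs shorter than 255 bytes');
  }
  return Uint8Array.from([...header, ...prefixed]);
}

// A CIDv1 with a SHA-256 multihash is 36 bytes; dummy bytes stand in here.
const fakeCid = new Uint8Array(36).fill(0xaa);
const encoded = encodeCidLink(fakeCid);
console.log(encoded[0].toString(16), encoded[1]); // d8 42  -> tag(42)
console.log(encoded[2].toString(16), encoded[3]); // 58 37  -> 37-byte string
console.log(encoded[4]);                          // 0      -> identity multibase prefix
```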
D
For what it's worth, we actually have entries for this. I think we already have them in the multicodec table, or maybe they're just PRs, but I think they got merged, where the CID will essentially be the hash of the payload plus the signature of it, as a single CID.
A
There are a couple of PRs. The ones in multiformats got merged, which just add the integers, but then there are open PRs that we're still talking about in IPLD for the spec for the actual codec, and those have a fair amount of unresolved questions.
E
Okay, okay, thanks. Next is Eric.
G
So I've got more codegen news. We can now generate structs with tuple representations. That is done, which is kind of exciting, because that is one of the representations that's really commonly used by folks who are sensitive to the size of their serial data (looking at you, Filecoin and Lotus people); they've used them almost everywhere.
G
It's like 99% of their code, so this is an important one to get for their usage, and in particular it's shown up in some HAMT stuff that we've been taking a look at, that we mentioned last week, so we're poking around that. In this world, the codegen for unions with kinded representation is also about 85% done.
G
I'd expect it to come out shortly, but it needs a little bit more error handling, which, as always, is actually the trickiest part. And in a pleasant surprise: the Any type, which we've known for a while we'll need, but which we sort of balked on specifying as thoroughly as we wanted to a couple of months back.
G
It turns out it's easier than we thought: it's just a kind of union, and that is about it. It turned out to be really easy to whip up a union that has all the basic types, then add a map type that is just a map of Any to Any, and Any composes over this. Of course you've got a cycle in that definition, but that turns out to be sort of fine. And I did a little prototype of codegenning that, and it just kind of worked.
G
So that was a very pleasant surprise. In the future there might be some more work to do there: something that has come up as a pattern is that we might need general configuration for whether or not to use pointers when assemblers are embedded in other assembler types.
G
So sometimes we need to say: I don't want to embed this assembler, I actually do want it to be a pointer. Right now there are a couple of cases where that happens anyway; in particular, it happens in unions when they're configured to use this memory layout internally. But that is kind of a coincidental solution, and it shouldn't be the only solution.
G
So some of the features are probably working there, but nothing's blocking on it soon. I also looked at test fixture stuff a little bit, as Rod already talked about. And Mikeal and I have been quietly poking at what it would be like to implement a new HAMT in Go and use some of this codegen stuff, but that's very early, so we're going to see how that goes, and maybe I'll talk more about it later, but probably not too much today. And that's about it.
A
Yeah, so the big thing was a lot of progress on docs: a lot more stuff there for the JavaScript stack. The tutorial is getting pretty locked in, getting to the point where other people need to do stuff now. I moved over all the schema docs that Rod had written and all that config stuff into Vue. It took me a while to figure out those sidebars, Rod; that's a weird format in Vue. But it's all nice now. Yes, the docs site is really coming along.
A
I really need contributions from other people now before I can move forward any more. And then I took Thursday and Friday off, which I spent polishing up DagDB, and showed it to many more people. Surprisingly, a lot is there; I'd forgotten how much was there. There's a lot of code in there and a lot of it works, so people are starting to use it and poke around now. I'm sure there are bugs, but they're the findable kind of bugs.
A
So yeah, that's also looking pretty good; feedback so far has been really positive. I showed some stuff to Carson a while back, actually, and he said that a lot of what they were looking at doing in Threads v2 is actually already there. So we've been talking about what it would look like to do Threads v2 on top of DagDB, so that's a fun conversation moving forward, and that's where we're at.
E
Cool. Does anyone have any agenda items?
B
I might have one, but Peter's is a more important one, so I'd like to defer. I just want to know from Mikeal and Rod about the CID changes, because I will probably talk with Alex tomorrow about it. Is it just properties we can already populate? Is it basically more or less an internal change within CID, so that you don't need to care about a lot from the outside?
A
Okay, yeah. The things that I'm proposing to add are just feature additions, because they are features that you need to migrate to in the future. So there are no breaks; well, there are breaks when you migrate to the next one, but.
B
Yeah, but take, for example, the codec code: it is basically then just populated when you create the CID. We currently use strings, but we have the code from somewhere, so when you create the CID we would just populate it; we'd have it anyway. Yeah.
A
Yep, yep. I mean, you can look up the string by code from the multicodec table, because you have the whole thing, and the same in the constructor: when you take an integer, you can just convert it to the string and keep the string behavior there. But at least for the external usage, you can migrate to the new forms. So.
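The "look up the string by code" idea can be sketched like this. The three code/name pairs are real multicodec table assignments, but the tiny table and the `describeCodec` helper are invented for illustration and are not the js-cid API.

```javascript
// Hypothetical sketch: accept either the codec name string or the
// numeric multicodec code, and populate both properties on the CID.
const codeToName = new Map([
  [0x55, 'raw'],
  [0x70, 'dag-pb'],
  [0x71, 'dag-cbor'],
]);
const nameToCode = new Map([...codeToName].map(([code, name]) => [name, code]));

function describeCodec(codec) {
  if (typeof codec === 'number') {
    if (!codeToName.has(codec)) throw new Error(`unknown code 0x${codec.toString(16)}`);
    return { code: codec, name: codeToName.get(codec) };
  }
  if (!nameToCode.has(codec)) throw new Error(`unknown codec name ${codec}`);
  return { code: nameToCode.get(codec), name: codec };
}

console.log(describeCodec(0x71));      // { code: 113, name: 'dag-cbor' }
console.log(describeCodec('dag-pb'));  // { code: 112, name: 'dag-pb' }
```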
C
And then, if you pushed out a version that just did Alex's changes, you could easily just sit on that version and you'd be fine, mostly. But then you'd have an extra level of pain if you wanted to migrate to the newest stack and you hadn't started that work. Whereas this gives you the optionality to start that work at any point, so at any point you can say: oh, maybe we should switch to integer multicodecs, and you can do that.
B
Until people can actually use the CID, I mean. If we did a release tomorrow, you couldn't really use the code base, because you don't have all the other dependencies that need buffers and so on.
B
Basically, what I'm saying is: we still have a lot of time to implement the 1.1 release, because you can't use it anyway; no upstream dependency could use the new CID version tomorrow.
A
So, Volker, I'll put it this way: that repo doesn't have automated releases on each check-in, so we could merge that PR and wait. But when you put out the major breaking release version, we may want to get this code in before that, because that's when it'll be messaged, right?
B
Yeah. So even if we're basically halfway done (and it's basically almost done, only small things, which I can also fix), let me see: if we're talking about a range of days, that's totally fine. But if it's "well, we will do it eventually, sometime", then that's too long, because I really don't want to wait; it's not a big change. It really isn't a big change. It's okay.
B
Okay, cool, yeah; that's perfect then. Okay, Peter, you wanted to say something very quickly?
D
Yes, yeah. So this is an idea that I had about a week ago, and I just want to solidify that we want to do this going forward. I realize Johnny's here; don't be alarmed.
D
What I'm proposing does not actually affect what you're doing directly. We seem to be running into a problem across the ecosystem, in Go and in Rust (I don't know what the story in JavaScript is, but it's probably similar), where we have these CBOR encoders and decoders which we have a hell of a time coercing to actually do the right thing for DAG-CBOR: being more strict, more resilient, more this, more that. It seems that we are basically handicapping ourselves by saying that the stuff is based on CBOR, or worse, adding "cbor" as part of the name.
D
So several times it was floated that we should have a different, better format for objects; sorry, for nodes, whatever you want to call it, for blocks.
D
My thing is that we should call it something that does not have any connotation with JSON, nor with CBOR, nor with XML, nor whatever you want: a completely different name, so that nobody implementing this gets the idea "well, I'll just reuse this library that I have over here in my language already, fix the things I think are important, and be done".
A
Yeah, this is a general thing. Wrangling the block format to do the right thing is pretty difficult. I think that CBOR was definitely the right choice at the time that it was made, and it's probably still the right choice right now, but we now know enough about how that format works, and how to write formats, that it would make sense to do our own block format in the future.
G
It's not that CBOR is a bad format; it's that the nomenclature around this is sometimes drastically miscommunicative, because it causes people to bring all sorts of baggage with it, and to assume that certain forms of extension are going to work when they absolutely are not. That's the really problematic one.
A
I mean, so much of this, though, is that we have an existing format and we're trying to make it deterministic, and that's a really painful thing to do. Taking an existing format and adding determinism is just not fun, and we've kind of found all the corners of that now, or at least I hope that we've found all of them, but this will continue to come up, especially with people doing blockchain work.
A
I think this is most visible when you're doing blockchain work, because you always have to be agreeing on the same serialization. I'm not actually worried about deserialization, or being loose on deserialization; I'm only worried about being strict on serialization. And the bugs, where these bugs present themselves, are so difficult to test for, and so problematic. Yeah, it's not going to be good.
A
I think that, yeah, it's very clear that at some point we're going to need to write a block format, and we're going to need to think about this really hard, but we're stuck with what we've got for now.
B
Yeah, but I think the cool thing about the whole thing is that you can have your own block format, basically. So I think we should at least then do two different ones at the same time, because otherwise you basically end up optimizing the format, or optimizing IPLD and the stuff around it, for a single format, which is totally not the point of IPLD.
A
Yeah, yeah. Whatever new block format we do, it should not do anything outside of the data model. It shouldn't do anything you can't do elsewhere; indexing would just be a little bit more efficient, a little bit easier to ensure determinism, all of that stuff. It shouldn't be anything that you absolutely have to use, because these codecs don't work everywhere, and the stack shouldn't assume it. Yeah, I totally
D
agree. Even having, as a byproduct, normal CBOR libraries being able to decode it, that would be fine. It's just that we really can't advertise it as CBOR, because it leads us to what Rod was saying (I think it was Tuesday), like "oh, we're going to do super strict" or something like that. You can't do super strict with existing libraries; you need new libs for that.
G
There have been some good, maybe, things to study here. People have proposed things like TJSON, for "typed JSON", which was JSON plus some annotations, but they made a clear point of calling it something else, and then describing clearly the ways in which you can parse it as regular JSON. That whole "I can parse it as this old format you're familiar with" was a separate clause under its own H2 heading, right after a name that said: this is something new.
F
Well, isn't that how the JSON-LD algorithms work? And wouldn't it be... because, as I understand it, when you're storing the block as CBOR, it's actually not ordered; it's basically just bytes in whatever the convenient ordering is. But when you dereference and read it, and you're transforming it back into a human-readable format like JSON, that's when you actually perform the deterministic mapping, ordered as lowest bytes, right?
F
Okay, because I think that could be part of it: when you're deserializing it, you're applying a particular ordering algorithm, or you're conforming to the specific algorithmic format that you apply during the transformation process.
A
Yeah, I mean, what are the issues that we run into? One is the map ordering stuff: not everybody actually implements the map ordering correctly. We do now, everywhere, but we found that a lot of implementations don't, because it's kind of a weird corner of the spec. And then the other one is the integer type stuff, where the spec says that you should always be using the smallest representation, and a lot of people don't. And I think part of this, too, is that it's not even what you're saying, Peter, where people pull in their CBOR library and use it to do whatever; literally, people don't even implement all of CBOR. They implement the features in CBOR that they care about, and that's it, so it's incredibly naive and specific, and that's what...
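The two strictness rules just mentioned can be sketched by hand: DAG-CBOR's canonical map-key ordering (length-first, then byte-wise) and smallest-representation integers. The helper names are invented; consult the DAG-CBOR spec for the authoritative rules.

```javascript
// 1. Canonical map-key ordering: shorter UTF-8 keys first, ties
//    broken by byte-wise comparison of the encoded keys.
function sortMapKeys(keys) {
  const enc = new TextEncoder();
  return [...keys].sort((a, b) => {
    const ab = enc.encode(a);
    const bb = enc.encode(b);
    if (ab.length !== bb.length) return ab.length - bb.length;
    for (let i = 0; i < ab.length; i++) {
      if (ab[i] !== bb[i]) return ab[i] - bb[i];
    }
    return 0;
  });
}
console.log(sortMapKeys(['bb', 'a', 'ab'])); // [ 'a', 'ab', 'bb' ]

// 2. Smallest-representation integers: the header must use the
//    fewest bytes that can hold the value.
function intHeaderSize(n) {
  if (n < 24) return 1;          // value packed into the initial byte
  if (n < 0x100) return 2;       // 1-byte argument
  if (n < 0x10000) return 3;     // 2-byte argument
  if (n < 0x100000000) return 5; // 4-byte argument
  return 9;                      // 8-byte argument
}
console.log(intHeaderSize(23), intHeaderSize(24), intHeaderSize(500)); // 1 2 3
```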
D
Or they implement more: they implement unlimited lists and things like that. But, to Johnny's point, it's not about what you can deserialize; it's always about what's on disk, because what's on disk is what's on the network, and what's on the network is what you can validate, whether you're being attacked right now or you actually got the right CID.
A
I mean, the nice thing about doing a new block format is that we can basically front-load all the links, and so you can get all the links out without parsing the whole block. That's really cool, but it also just wouldn't be as easy to write these naive implementations, because you have to do a bunch of upfront work to understand the block.
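A hypothetical sketch of what front-loading the links could look like. The layout below is entirely invented for illustration (it is not a proposed spec): a link count, then length-prefixed link entries, then the opaque body, so links come out without the body ever being parsed.

```javascript
// Invented block layout: [u32 linkCount] ([u32 len][len bytes])* [body...]
// All integers big-endian. Links are readable without touching the body.
function encodeBlock(links, body) {
  const parts = [];
  const count = new DataView(new ArrayBuffer(4));
  count.setUint32(0, links.length);
  parts.push(new Uint8Array(count.buffer));
  for (const link of links) {
    const len = new DataView(new ArrayBuffer(4));
    len.setUint32(0, link.length);
    parts.push(new Uint8Array(len.buffer), link);
  }
  parts.push(body);
  const out = new Uint8Array(parts.reduce((n, p) => n + p.length, 0));
  let offset = 0;
  for (const p of parts) { out.set(p, offset); offset += p.length; }
  return out;
}

function extractLinks(block) {
  const view = new DataView(block.buffer, block.byteOffset, block.byteLength);
  const count = view.getUint32(0);
  const links = [];
  let offset = 4;
  for (let i = 0; i < count; i++) {
    const len = view.getUint32(offset);
    offset += 4;
    links.push(block.subarray(offset, offset + len));
    offset += len;
  }
  return links; // everything after `offset` (the body) is never parsed
}

const block = encodeBlock(
  [Uint8Array.of(1, 2, 3), Uint8Array.of(4, 5)],
  Uint8Array.of(9, 9, 9, 9) // opaque body
);
console.log(extractLinks(block).length); // 2
```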
A
So people would not be writing a lot of half-baked libraries for serialization and deserialization; I think that would be nice. And adding some application-specific compression in there too would really help. It should just be a little bit hard to not implement the whole thing; I think that would be great.
A
For example, the first byte in the array: a bunch of IRIs or URIs actually all start with a very similar thing, so you can compact that, basically. But I don't know that that would be a deterministic way of doing it. So.
B
I guess for the DAG-CBOR stuff that we need, we should probably do what Rod probably did in his implementation: implement only the CBOR that we need for DAG-CBOR. That would be interesting. For example, tag support would just support one tag; it doesn't even need proper tag parsing, just a hard-coded tag 42, because I think then...