From YouTube: libp2p weekly sync 2019-05-06
Description
No description was provided for this meeting.
A
With our pull-stream-based mplex implementation — that's been open for a long time, but it's finally ready to go into js-ipfs. We should see some performance boosts switching over to that. I also implemented message signing — signing for js pubsub; the base of that is done.
A
Also did some DHT CPU testing for js-ipfs. I'll finish that up this week and then probably sync up with Raúl on priorities, and then, if everything looks good, I'll probably tackle the js-libp2p AutoNAT, so we can get some more direct connections. As we were going through the DHT updates, we saw a lot of issues with the js nodes wanting to connect over circuit relays to a lot of things, so we're just trying to avoid that, so we can get some better performance there. That's it.
B
Awesome. I just started recording as you went, so for posterity I'm going to very quickly summarize my update again. Essentially, the testlab is usable. I'm looking for partners, and to establish an actual roadmap that I can be accountable to — and that the teams that want to do benchmarking can be accountable to — and to get that started. Then I'm also going to start working on the self-dial issue, which I've linked in the notes, and with that my recap is done.
C
Right — so there's somebody next door using an impact driver and a jackhammer. Okay, I'll spare you guys. Basically, where my head is right now is transitioning fully to specs for the next four weeks or so. I'm going to try to put my whole effort into the specs repo, more or less; I'll still work on the other bits — the docs — especially if something gets clarified in the process. But that's my thing: to try and commit to that.
C
So in service of that, I made a work-tracker spreadsheet, if anyone is interested in chiming in on priority order, and I'm going to try and think about that this week, so that I can have a clear head. Basically the big plan is: do some cheerleading for the PRs that are stuck, do edits as needed, and then try to form a plan for what the main specs document should look like.
C
What is the first thing you see when you look at these specs, and how do they work? You can outline that and then figure out how we get there. That planning process is probably going to be ongoing throughout the next week or two, but I'm going to try to make progress on the sort of works in progress that we can clean up.
C
So I mean, we could render it to look nice on the docs site, on the domain. I still think a lot of people will just want to go to the GitHub repo and browse the specs that way, and that's fine, but I've been working through, like, the Filecoin approach of taking markdown and massaging it a little bit so it renders nicely. So I think it's a pretty easy job to take the specs repo and format it nicely there, yeah.
D
So that was a big success — we stopped crossing relays for peers we can reach directly. I also added some functionality to put a deadline on the handshake for the relays, so that, basically, if you can't establish the hop stream, the stop handshake times out within a minute. I bounded it because the caller has probably timed out within a minute anyway, and by then the connection is going to be bad regardless.
D
Okay, and then a bunch of things. I opened a PR for the [unclear] data pipeline — we want to bubble this up, basically, to expose it on the API; there's a bunch of follow-up work there if you want it, yeah. And I also did the strict signature verification, which prompted taking up implementing it on the JavaScript side as well.
D
What happened is that the throughput climbed to 800 megabits, and then we ran out of memory, because there were, you know, some ten million goroutines going on. So I added a guard — a bounded limit — on the goroutines that serve hop streams, and that stabilized it, so it seems to be doing fine. And then, you know —
D
— some memory buildup appeared out of nowhere; it's eating memory, and we're trying to debug that. I'm not going to repeat myself on the record about my debugging adventures, but basically it's what appears to be a memory leak, or something related, that goes through the relays while the traffic grows.
D
You know, doing some low-level [unclear] debugging, because it's very — anyway. The other experiment before that is one I ran with [yamux] only, because I'm just on a single protocol there, and it surfaced the problem that there are a couple of [unclear] — messages constantly backing up such that they cross the stream and it gets clogged [unclear]. That prompted me to downgrade the logs, because the log volume in the relay had gotten way out of hand once the load started.
D
I also made a little patch, actually, such that the [unclear] errors that appear to be [unclear] — basically, some streams time out and we log an error that says "oh, it reset when it reset me", I don't know — so we'll get a fix after we make the relays [unclear]. I've also been wondering why we're getting so many [reset] errors, and whether it's the relays being slammed with requests and the machine getting hungry.
D
The next thing is — like I said, we started running into it because we're hitting the DHT a lot more frequently, so it's more likely to show up in the queries for finding relays. That's my surmise about it, anyway. But I want to leverage the passive discovery that's built into the relays — so that's what we'll do, like, in autorelay.
E
Yeah, so a bunch of pubsub updates last week. One is that the last-writer-wins configuration is now much faster on initial resolution — so, like, 100 milliseconds plus finding a peer over the DHT, which is about what it should be. The pubsub value router — which, as far as I can tell, is only really being used for IPNS — was fixed up; you can sort of check the issues there.
E
It was just being really slow for kind of unnecessary reasons. Next up, I'm going to be trying to move the bootstrap logic into pubsub, using the discovery interface, because it seems reasonable that when you initialize pubsub, you want it to bootstrap — so why not provide a way to do that, instead of leaving it external?
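A rough shape for that wiring — with a hypothetical Discovery interface, a static stub, and a made-up bootstrap limit, none of which are the real go-libp2p-pubsub API:

```go
package main

import "fmt"

// Discovery is a hypothetical stand-in for libp2p's peer-discovery
// interface: it yields addresses of peers to bootstrap from.
type Discovery interface {
	FindPeers(topic string, limit int) []string
}

type staticDiscovery struct{ peers []string }

func (d staticDiscovery) FindPeers(topic string, limit int) []string {
	if len(d.peers) > limit {
		return d.peers[:limit]
	}
	return d.peers
}

// PubSub sketches a pubsub instance that bootstraps itself at
// initialization time when a Discovery is supplied, instead of
// relying on an external caller to connect peers first.
type PubSub struct {
	connected []string
}

func NewPubSub(d Discovery) *PubSub {
	ps := &PubSub{}
	if d != nil {
		// Bootstrap on init: dial whatever the discovery layer knows about.
		ps.connected = d.FindPeers("bootstrap", 4)
	}
	return ps
}

func main() {
	ps := NewPubSub(staticDiscovery{peers: []string{"peerA", "peerB"}})
	fmt.Println(len(ps.connected)) // 2
}
```

The design point is the constructor-time hook: passing a discovery source makes bootstrap part of initialization, while passing nothing preserves the old external behavior.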
E
The issue is linked there if you're interested in that. Then I'm going to try and wrap the ongoing rendezvous PR in the discovery wrapper and see if I can get all of those things to play nicely together. And then finally, I've got to rewrite some of my pubsub PR changes to be a little more channel-y, so they flow nicely with the rest of the gossipsub code.
F
Folks, dialer version two of [go-libp2p] is moving along. I made another push on this; it's in very good shape, and I feel really good about this one. It has incorporated basically all of the design decisions and review that were part of the first iteration, which were really good. We now have a set of four components. We have a preparer, which in turn can be a sequence of preparers, so we're not dealing with, like, multiple cardinalities for each element in the dialer — each element can itself be an aggregate...
F
...if you want it to be. Right now that only makes sense for the preparers, which in turn can be a pipeline of preparers. Then the next element along is an address resolver, which can then very easily be replaced — so the routed-host construction, right now, we can basically remove it, we can scrap it, and whenever you instantiate a discovery component with libp2p, the discovery process can inject its address resolver, and that's it. So it becomes really simple.
F
The next piece is a planner, and basically this is the thing that schedules tasks in time. It receives the known addresses, and then it receives a channel where it will keep getting more addresses as they're discovered. When you plan a new dial, the planner commits and returns a [plan], and the planner is kind of like a state-machine sort of thing. It's intended to be reactive, so it gets events from the pipeline as things are happening, so that it can react to them, right?
F
So it gets an event when a new address is discovered; when the discovery is finished and there are no more addresses to be discovered; whenever a new dial job [starts]; whenever a dial job is completed, whether it's an error or a success — so it can take decisions, right? And last but not least — sorry — there is the throttler, which is the one that is in charge of guarding resource usage, essentially. And finally, there's the executor, which is the thing that actually dispatches the dial and talks to the [transports].
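Pulling the four components together, they might be sketched as Go interfaces like these; all names and signatures are illustrative guesses assembled from the description above, not the actual dialer v2 code:

```go
package main

import "fmt"

// Illustrative types; the real dialer works with peer IDs and multiaddrs.
type Addr string
type Dial struct{ Target Addr }

// Preparer mutates or vets a dial request before planning. A
// sequence of preparers can itself be exposed as one Preparer,
// which is the aggregate property described above.
type Preparer interface{ Prepare(d *Dial) error }

// seqPreparer is the aggregate form: a pipeline of preparers.
type seqPreparer []Preparer

func (s seqPreparer) Prepare(d *Dial) error {
	for _, p := range s {
		if err := p.Prepare(d); err != nil {
			return err
		}
	}
	return nil
}

// Planner decides which addresses to dial and when, reacting to
// events (new address, discovery done, job finished) as they arrive.
type Planner interface {
	Plan(known []Addr, updates <-chan Addr) []Dial
}

// Throttler guards resource usage (file descriptors, per-peer limits).
type Throttler interface {
	Acquire() bool
	Release()
}

// Executor actually dispatches the planned dials.
type Executor interface{ Execute(d Dial) error }

// tagPreparer is a toy preparer that annotates the target.
type tagPreparer string

func (t tagPreparer) Prepare(d *Dial) error {
	d.Target += Addr("/" + string(t))
	return nil
}

func main() {
	p := seqPreparer{tagPreparer("resolved"), tagPreparer("ranked")}
	d := &Dial{Target: "peerA"}
	_ = p.Prepare(d)
	fmt.Println(d.Target) // peerA/resolved/ranked
}
```

The split keeps policy (planner), admission control (throttler), and mechanism (executor) in separate, individually replaceable pieces.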
F
So this is kind of the structure. I have pasted a few links. Right now there is no PR — I'm going to close the first PR; it holds the discussion that was had, but it's quite stale now, so rather than refresh it, I'm going to open a new PR today. I've just linked the source file for the interfaces, which is where I think you'll be able to get a feel for it — it's very well documented; I tried to polish this particular file really well, and before I file the PR I'm going to do documentation.
F
[The dialer was] very undocumented in the past and has been very, very spaghetti, and we really want to start off, you know, with this sort of policy — so before I open the PR I'm going to do that. Okay, so that's really the dialer. Aside from that, kind of the usual stuff — you know, calls and so on — yeah, I did some debugging with [unclear] and so on, but I think what's important is what I'm going to work on next, which is: finish the dialer, finish the core refactor.
F
Yeah, well, we had a discussion — what [unclear] was referring to, if you haven't seen it. We had a few discussions about, you know, software practices in general — engineering practices: the way that we do releases, the way that we bubble up those releases, whether we're going to have weekly or monthly release trains, what needs to be true before we actually tag a release — like how much testing it needs to have undergone, not only in the top-level project but also in the projects beneath it. So, yeah.
F
We had some really good discussion about this. We need to summarize it for the community, and to let the community also weigh in on that discussion. Part of that is, you know: let's start automating more stuff — so that's where the [unclear] comes in. And finally, I need to present [unclear] to you guys, which is the libp2p visualizer. We got the design mock-ups three or four weeks ago, I think, but we honestly haven't had time, with, like, OKRs, a thousand other things, performance reviews, and everything else.
G
It would be very awesome to have this sort of pipeline of events that would be handled by, sort of, another layer, so that we don't have to muck up our code. So basically that would be my sort of requirement — something like that. I'm now getting through the documentation, or, like, the issues you guys have written before, trying to catch up with what the state is now.
F
One thing that I wanted to point out is that somehow we've been conflating the host refactor — the service-oriented refactor, basically — with the event bus, but they don't need to happen at the same time. They're not actually dependent on one another at all; we can work on them in parallel.
F
We can start with either. What we need to unblock the event bus is the core refactor, because we do plan to have all the events that can be emitted — the abstract events that can be emitted by components — defined up front. We don't want any dependent that is going to be subscribing to those events to have to dig into a particular module, a particular implementation. These events should be out in the open, and they should live under core. So once we have the core refactor — once that has landed — we can create...
F
...you know, an [events] package, and we can start modeling all the structs there. And essentially, the event bus is a simple abstraction that could be part of the host, and basically all components in the system have access to the host. So, in the same way that the peer store is kind of a top-level component, the event bus could be a sibling of that, right? Now — does this align with your current thinking?
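A minimal sketch of what such a host-owned bus could look like; the Bus type, the string-keyed events, and the EventBus() accessor are assumptions for illustration, not the eventual go-libp2p event API:

```go
package main

import (
	"fmt"
	"sync"
)

// Bus is a tiny name-keyed event bus: emitters publish under an
// event name, subscribers receive everything published to it.
type Bus struct {
	mu   sync.Mutex
	subs map[string][]chan interface{}
}

func NewBus() *Bus { return &Bus{subs: make(map[string][]chan interface{})} }

func (b *Bus) Subscribe(name string) <-chan interface{} {
	ch := make(chan interface{}, 16)
	b.mu.Lock()
	defer b.mu.Unlock()
	b.subs[name] = append(b.subs[name], ch)
	return ch
}

// Emit is fire-and-forget: with no subscribers it is a no-op, so
// components can always announce what happened.
func (b *Bus) Emit(name string, evt interface{}) {
	b.mu.Lock()
	defer b.mu.Unlock()
	for _, ch := range b.subs[name] {
		select {
		case ch <- evt:
		default: // never block the emitter on a slow subscriber
		}
	}
}

// Host sketches the accessor discussed above: components reach the
// bus through the host rather than depending on each other.
type Host struct{ bus *Bus }

func (h *Host) EventBus() *Bus { return h.bus }

func main() {
	h := &Host{bus: NewBus()}
	sub := h.EventBus().Subscribe("peer:connected")
	h.EventBus().Emit("peer:connected", "QmPeer")
	fmt.Println(<-sub) // QmPeer
}
```

Hanging the bus off the host mirrors the peer-store arrangement: every component already holds the host, so no new cross-package dependencies are introduced.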
D
That's in line with what we've been discussing and planning with Raúl about the first phase of the service refactor. So it's important to remove the dependencies, you know, for the consumers of the actual leaf packages — they would just pull in the event bus, and eventually we'd just have a method on the host that hands out the event bus.
F
So if you want to, like, you know, make a contribution here, you'd be super welcome, and we'll make sure to guide you through it, in terms of what events are valuable in each component and, sort of, the architecture that we have loosely discussed — because we haven't, like, actually discussed it, right? So, yeah.
C
— the bus. And generally, one pattern I noticed: where you have, like, a delegate or a callback, you would also post an event with the high-level details — even if you don't have any listener ready or whatever — just as, like, a broadcast: hey, you might not be directly subscribed, but you might hear it. So we could go through and look at places where we have, like, existing hard-coupled callback relationships, and see those as opportunities — this seems like a very useful place for it.
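That dual-path pattern — keep the existing hard-wired callback, but also broadcast an event for anyone who might be listening — could look roughly like this (all names here are hypothetical, not libp2p code):

```go
package main

import "fmt"

// Event carries the high-level details of something that happened.
type Event struct {
	Name    string
	Details string
}

// emitter stands in for a non-blocking event-bus publish: with no
// listeners it simply drops the event.
type emitter func(Event)

// Notifier keeps the existing hard-coupled callback, but mirrors
// every notification onto the bus as a broadcast.
type Notifier struct {
	onConnected func(peer string) // existing tightly-coupled callback
	emit        emitter
}

func (n *Notifier) Connected(peer string) {
	if n.onConnected != nil {
		n.onConnected(peer) // the old direct path still works
	}
	if n.emit != nil {
		// Broadcast regardless of whether anyone subscribed.
		n.emit(Event{Name: "connected", Details: peer})
	}
}

func main() {
	var callbackPeer string
	var events []Event
	n := &Notifier{
		onConnected: func(p string) { callbackPeer = p },
		emit:        func(e Event) { events = append(events, e) },
	}
	n.Connected("QmPeer")
	fmt.Println(callbackPeer, len(events)) // QmPeer 1
}
```

Because the broadcast is additive and fire-and-forget, the hard-coupled call sites can be migrated one at a time without breaking existing behavior.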
F
Absolutely, yeah — so definitely [that work] can inform a lot of this. Much of, like, the autorelay, AutoNAT, the identify push — all of these, you know: the initialization, the bootstrapping of the DHT, and all the state that relies on that — are things where we have encountered challenges that have led us to think, hey, we need independence here, right? So this is kind of in our heads, and I think we can do that scan of the codebase.
F
I think, you know, the Pareto principle applies here: with just 20% of the effort we can cover 80% of the scope of this. Yeah — so I don't know how we should proceed with this. I'm okay with opening, kind of, an issue in go-libp2p, or potentially even specs, because ideally this could be something that we spec for all implementations.
F
Like, we'll come up with requirements. I'm thinking of things like: before subscribing to an event, you want to know if there is an emitter registered for that event, because you could be waiting for something that will never come. So there are many requirements like that, and we can find inspiration — I would say time-box it, you know: maybe look at how Netty does it, how [unclear] does it, how other things do it.
F
Just, like, time-box it to one day max, you know, to gain some inspiration from the kinds of challenges they're solving — just do a breadth-first scan of the ecosystem, and once we have that, I think we'll have pretty good insight. There's, for example, the [unclear] event bus; in the Java world there are many examples.
D
So, in terms of where we do it: I think we should do it in go-libp2p first, because it's very specific to the go-libp2p architecture. I'm sure the other implementations have a different architecture, a different context. So let's keep it in go-libp2p, at least for the start, and if we see that we're using the abstraction elsewhere as well, then we can make it a spec — but there's no point making it a spec before we've implemented it, yeah.
F
We don't want to be too heavyweight with this. What I would say is: let's open it in go-libp2p, have the implementation-focused discussion there, and open an issue in libp2p/specs that points to it — hey, you know, there is this discussion going on, and it's potentially something that we might want to standardize on in the future — so that it's at least in that arena.
F
I mean, how are you, like, right now in terms of bandwidth? Because I think we have some momentum with this particular issue, with [unclear] also needing it, and, like, we have a lot of people on board here. If you're okay with bandwidth and you can do this quickly, then it's fine; otherwise we can find another solution.
D
So, I mean, my bandwidth is limited, but this is something I've been thinking about for a while, and it's also important, so, you know, I'll do it. Okay — I mean, the only implicit delay is AutoNAT, which I think is pretty much critical before we turn on autorelay by default, you know, like in ipfs — which is going to happen this release or the next, somewhere close; probably the next one.
D
The event bus is going to be part of it too, so I'm going to spec out how the event bus will look when I make a roadmap that will say: hey, here are the things that we want to implement. It's going to be a three-step process. The first step will be: okay, let's get the internals of the libp2p host refactored into services internally, and have them nicely [isolated]. Then the second step is going to be to remove...
D
...you know, the kludges — like the context argument — from our services, and make all services self-contained. And the third step is going to be: okay, now that we have that, we can add a dependency, which is the event bus, and then we can have the event bus plugged in — injected in the constructor, or coming from the host; ideally coming from the host, so you don't have to change the constructors. And then we can spec out the event bus. Okay.
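The wiring choice in that third step — constructor injection versus getting the bus from the host — can be contrasted with a small hypothetical sketch (all types invented for illustration):

```go
package main

import "fmt"

// Bus is a placeholder for the event bus abstraction.
type Bus struct{ name string }

// Host owns the bus; services can reach it through an accessor.
type Host struct{ bus *Bus }

func (h *Host) EventBus() *Bus { return h.bus }

// Option A: constructor injection — every service constructor
// grows a *Bus parameter, so all call sites must change.
type ServiceA struct{ bus *Bus }

func NewServiceA(bus *Bus) *ServiceA { return &ServiceA{bus: bus} }

// Option B: host accessor — constructors keep their signatures and
// the service pulls the bus off the host it already holds.
type ServiceB struct{ host *Host }

func NewServiceB(h *Host) *ServiceB { return &ServiceB{host: h} }

func (s *ServiceB) Bus() *Bus { return s.host.EventBus() }

func main() {
	h := &Host{bus: &Bus{name: "events"}}
	a := NewServiceA(h.EventBus())
	b := NewServiceB(h)
	fmt.Println(a.bus == b.Bus()) // true: same bus, different wiring
}
```

Option B is what "ideally coming from the host" buys: services that already take a host need no signature changes to gain access to events.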
D
Let's open an issue about the event bus — something like "nice event bus in go-libp2p". In the meantime, we can start working on the design doc, and we can link to that document from both places. It will sum things up and say: hey, here is the design — we don't have to have the entire design, but once we want to have the event bus interface, so that we can actually spec it, here's how it's going to look. And then we can —
F
It sounds to me like what we need to do is a bit of requirements gathering, so to speak. So what I'll do is get an issue started specifically for the event bus, so we can all pitch in and at least get some parts rolling — because so far they've all been brewing in our heads, so we need to get them down in a common place right now.
F
So, regarding the discussion that we had: I thought that was great for, you know, aligning and doing a bit of brainstorming, but we need to take this out to the community, for, you know, openness and transparency. So who would like to take the lead to summarize that — yourself? You okay with that? Yeah.