From YouTube: IPFS All Hands 🙌🏽📞 Jun 18, 2018
A
And we should be recording — so welcome, everyone, to the June 18th, 2018 IPFS All Hands call. Kicking off the agenda today, I'd like to talk about some of the changes we've been discussing recently, because we've been running these calls without many of the regular people here and without the recordings going up. The major problem we've been having is that nobody with the right permissions is around, so while the meetings are happening as planned, they're kind of hastily thrown together, and we're spending a lot of time scrambling. I think we could be a bit more prepared, and that can be achieved with some simple changes: posting the agenda ahead of time, assigning the moderator ahead of time, and having a contingency plan for when people are going to be away. That's basically all we really need to do. I'm going to post it in the chat here.
A
I'd like everyone here to at least look at that and give your feedback on it, so that we can be a little more prepared for these meetings. Oh, it's already posted — nice. So, typically we would have picked a note-taker already, but the problem we're having is that the bot is down and we don't seem to have an up-to-date list of who is willing to volunteer for that, so I figure...
A
This one here — oh, there it is. I posted the same one, I think. No, never mind, sorry — I'm a little scattered over here today. We need to rebuild that list of note-takers so that we don't spend a good five minutes, or however long this has taken, setting up for the meetings — we spend too much time preparing before we even get off the ground recording. So if you'd do that, that'd be great, but for now we'll just have to do it the old-fashioned way of someone volunteering.
A
If you would like to take notes today, please raise your hand. Jacob, you're first — that'd be perfect, and we can just put those in the Google Doc for now. So, with all that out of the way, let's actually get on with business as usual. For the agenda today, it looks like we have one item with the Twitter pin bot — take it away, Juan.
E
While that was happening — for those of you who didn't know, we had Data Rescues, which were these loosely coordinated groups of people trying to build something that looked a lot like a data commons by archiving government data. So we had librarians, academics, data scientists, and programmers in the same room, but we hit a real lack of tooling.
E
We had a good conversation about how IPFS could contribute to that, and how the notion of peer-to-peer distributed systems could help address some of this. So, if you don't mind, I'll take about five minutes and mostly just give updates on some of the things we've brought into the mix with Qri. Can someone confirm you can see my screen?
E
Okay — does everybody see an editor here? All right, cool, thumbs up, awesome. So we've been focusing on bringing in a lot of the best bits that we could find from those days: we had librarians contributing the notion of what good metadata should look like, and we had computer scientists contributing how to structure datasets and how to work with them.
E
What I wanted to show today is this notion of transformations that we've blended in. We've implemented the Skylark programming syntax, which comes from the Google Bazel project — I don't know if any of you are familiar with it. Skylark is a Turing-incomplete subset of Python, so it looks a lot like Python, and what we're using it for in this context is to automate a repeatable script that generates a dataset. In this case, the dataset it generates is a list — let me show you what this does.
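As a rough illustration of the repeatable-transform idea described above — a deterministic script that turns a fetched payload into dataset rows — here is a sketch in plain Python (this is not Qri's actual transform API; the function name and row layout are invented, and the input shape just mirrors GitHub-style contributor stats):

```python
# Hypothetical sketch of a repeatable dataset transform, in the spirit of a
# Skylark/Starlark script: a pure function from a fetched payload to rows.
# The input shape mirrors GitHub's contributor-stats API; the function name
# and row layout are invented for illustration.

def weekly_contributor_counts(stats):
    """Collapse per-author weekly commit stats into (week, commits, authors) rows."""
    weeks = {}
    for author in stats:
        for wk in author["weeks"]:
            if wk["c"] == 0:
                continue  # skip weeks where this author made no commits
            total, authors = weeks.get(wk["w"], (0, 0))
            weeks[wk["w"]] = (total + wk["c"], authors + 1)
    # Deterministic ordering is part of what makes the transform repeatable.
    return [(w, c, a) for w, (c, a) in sorted(weeks.items())]

sample = [
    {"author": "alice", "weeks": [{"w": 1, "c": 3}, {"w": 2, "c": 0}]},
    {"author": "bob",   "weeks": [{"w": 1, "c": 1}, {"w": 2, "c": 4}]},
]
print(weekly_contributor_counts(sample))  # [(1, 4, 2), (2, 4, 1)]
```

Because the script is deterministic given its input, anyone re-running it over the same source data gets the same dataset body — which is the property the talk leans on.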
E
We have IPFS paths proper, and then we also have Qri paths, which are just something we'd like it to resolve to. So what I'm going to do is run `qri add`, and I'm going to need to supply a couple of things here. Just to quickly walk through this: this dataset file is the file I was just showing you, which defines some basic structure.
E
This is our standard definition of a dataset. We have a meta section, which draws a lot of its knowledge from library science; a structure section, which defines the machine-readable structure of the data; and then this transform script, which is the most important part. This is a link to that transform .sky file, and we configure it to say: hey, I want to grab from the ipfs GitHub organization and the go-ipfs repo.
E
What I'm going to do is run this `qri add` function, providing that dataset YAML file. And while this is a public endpoint, I just thought it would be good to demo that we can actually plumb secrets into this — so in this case we're plumbing in a GitHub API token. It's going to warn me and say: hey, you're passing an API token into a script that someone else has written; this is probably something you want to check before you actually run it.
E
But I'm going to say yes, because I wrote it — I think it's safe. I'm going to run it, it's going to go out on the internet, and it's going to create a new dataset. And now, if I do a `qri ls`, we can see this go-ipfs contributors-by-week dataset. What this has done is generate a hash for it on IPFS — the data is being stored on IPFS.
E
If I do `qri use` to select this thing, and then `qri get` — let's do the formatted output, that's nicer — this is what actually got generated. What happened is that we did a lot of things in the background for you: we generated a commit, with an author based on an ID that we provide for you, and we signed that commit with a cryptographic signature, so you can check the data that was generated. And it's repeatable, so I can just run `qri update`.
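The commit mechanics described above — hash the body, record an author, sign the result — can be sketched abstractly. This is not Qri's actual implementation: a real system would use a public-key signature tied to the peer's identity, and HMAC is used here only as a self-contained stand-in.

```python
import hashlib
import hmac
import json

# Illustrative sketch of a signed, content-addressed commit, standing in for
# the flow described in the talk. HMAC replaces a real public-key signature
# purely to keep the example self-contained.

def make_commit(body, author_id, secret_key):
    """Hash the dataset body, then sign (author + hash) with the author's key."""
    body_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    payload = f"{author_id}:{body_hash}".encode()
    signature = hmac.new(secret_key, payload, hashlib.sha256).hexdigest()
    return {"author": author_id, "bodyHash": body_hash, "signature": signature}

def verify_commit(commit, body, secret_key):
    """Recompute hash and signature; tampering with body or author fails."""
    expected = make_commit(body, commit["author"], secret_key)
    return hmac.compare_digest(expected["signature"], commit["signature"])

key = b"author-private-key"
commit = make_commit([(1, 4, 2)], "peer-id-abc", key)
print(verify_commit(commit, [(1, 4, 2)], key))   # True
print(verify_commit(commit, [(1, 9, 2)], key))   # False
```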
E
It'll go and rerun that script — the transformation that's now been embedded inside this dataset. Just to make this a little more visible, I can finally run `qri render`, which will execute templates against datasets. I've already loaded this, so that's not that big a deal, and it gives you a chart that looks something like this — I think I went through that a little fast. Basically, what happens is...
E
I pre-wrote some HTML that takes the Qri dataset and turns it into this chart. If you know how to write that, it's very similar to the world of static-site rendering templates or anything like that. But in the world of Qri, we think the dataset is sort of our fundamental unit.
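The render step described above can be sketched with the standard library — the template markup and field names here are invented, and Qri's real renderer works differently; this just shows the static-site-template idea of pushing dataset rows through a template:

```python
from string import Template

# Minimal sketch of dataset-to-HTML rendering in the static-site-template
# style described in the talk. The row layout matches the contributors-by-
# week example; the markup is invented for illustration.
ROW = Template("<li>week $week: $commits commits from $authors authors</li>")

def render(rows):
    """Substitute each (week, commits, authors) row into the row template."""
    items = "\n".join(ROW.substitute(week=w, commits=c, authors=a) for w, c, a in rows)
    return f"<ul>\n{items}\n</ul>"

html = render([(1, 4, 2), (2, 4, 1)])
print(html)
```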
E
Datasets being the fundamental unit means we can do all kinds of stuff, like `qri diff` and `qri validate` and `qri log`, which will actually do a git-log kind of thing — oh no, whatever, anyway. And then, last but not least, the stuff that's more interesting to those in the libp2p world: I can run `qri connect`, which will spin up an IPFS node and then, on top of that, add the Qri network — sort of an overlay network on top of an overlay network. It gives me my profile ID.
E
That's actually separate from my IPFS peer ID. Then, to quickly get a feel for this, I can do `qri peers list`, and this will show the Qri peers that I've connected to. But, again, for this crowd: I can pass `network=ipfs`, and this will show all the IPFS peers I'm currently connected to — we were using this for debugging the way the connection manager prioritizes peers. And I can see a few more peers show up here as the network finds more people and hydrates all of this.
E
At the end of the day, Qri is designed to sit on top of IPFS and make datasets interoperable. And, last but not least, we've done all of this on the front end as well. We can see it connecting here, showing my actions — there's the dataset we just generated, this is the actual information that was being loaded, and there's the structure that was determined. And then, if I need to, I can see the peers I'm connected to — ooh.
E
That's
me,
sorry,
this
is
the
protected
you
I
can
grab
their
datasets
I
had
a
dataset
I
could
just
hit
add
and
that
bins,
it's
my
local
IP
FS,
know
so
yeah,
that's
kind
of
where
we're
at
with
query.
It's
taken
us
a
long
time
to
get
to
this
place,
but
we
finally
feel,
like
we'd,
figured
out
some
important
problems.
F
Brendan, this is great — I am so happy to see this stuff coming along and landing. In the past you worked on having the Qri nodes that sit on top of IPFS nodes also be indexing content, so that you could query it on the fly. Are you still doing that, or did you set that aside?
E
We intend to, yeah. For a long time — for those of you who maybe remember the photo background of these calls — we had a SQL-like selection engine built on top of this. We intend to go back there, but we realized we needed to get our fundamentals right first, and that's what we're really excited about with the Skylark implementation.
E
There's something called map and reduce that can then be called by peers, and you could create a sort of on-the-fly swarm of peers that you compute with. We're very excited about the notion of semantic chunking: getting right down into IPFS blocks and making them line up with Qri rows, so that you could say, take these hundred rows and go do this thing. So we think there are some really exciting possibilities that come out of that.
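The "take these hundred rows and go do this thing" idea — aligning work units with row boundaries so chunks can be handed out independently — can be sketched like this (the chunk size and dispatch model are illustrative, not Qri's actual design):

```python
# Illustrative sketch of semantic chunking plus map-over-chunks: split a
# dataset body into fixed-size row blocks, then apply work to each block
# independently, as peers in an on-the-fly swarm might.

def chunk_rows(rows, size=100):
    """Split rows into blocks of `size` rows, aligned on row boundaries."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def map_over_chunks(rows, work, size=100):
    """Apply `work` to each chunk; in a swarm, each chunk could go to a peer."""
    return [work(chunk) for chunk in chunk_rows(rows, size)]

rows = list(range(250))
totals = map_over_chunks(rows, sum)  # three chunks: 100, 100, and 50 rows
print(totals)  # [4950, 14950, 11225]
```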
F
I'm excited to see commit graphs implemented in Qri. Have you dealt with any of the — like, allowing multiple people to author commits and merge those changes? Say Matt and Lars and Michelle are all editing — all adding and modifying — and we want to be able to merge Michelle's changes, or allow her to push her changes into the repo. Any of that sprawling access-controls consideration?
E
We've started to roadmap that — we're calling them change requests, because we're not very original. We don't quite know how that will work yet, but we have definitely laid all the foundations for making it work properly. The biggest thing, obviously, is tracking this chain of hashes and knowing who's authoring what. Currently, when you add each other's datasets, they just stay authored as that original person. That's the recommended workflow.
E
While we work on these change requests, you can use `qri export` to spit out any dataset as it currently exists and say: cool, that's somebody else's transform script — I'll just rerun it as my own and create my own dataset. Eventually we'll figure out some way to merge them back together over time.
E
Yeah, it would show, because we embed the author's peer ID into the actual commit that you're building upon, so, just like a normal commit chain, you can reference previous commits that are your own. Does that make sense, Rob?
E
So the script itself rides with the dataset definition — when you do `qri add`, it comes with it. We think that's what makes them repeatable and reusable: you could take the thing that I just wrote, configure it, and set it up for — I don't know, say, js-ipfs — and you could run that exact same script to generate a different dataset.
E
I should mention that the Skylark syntax we've implemented explicitly denies access to things like the file system — there's some basic security stuff that you can't do, for obvious reasons. We also gate things so that the download step is the one that has access to the internet, and we've turned off all access to the actual underlying node — what's on IPFS or whatever. You have to hand the data back, and then you can do the more interesting stuff in the transform. So we've explicitly tried to have a nice security model for that, because we want the scripts to travel with the datasets and be something that you can take and build upon as you move forward.
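The two-phase model described above — an internet-facing download step that hands data back, and a sandboxed transform step with no outside access — can be sketched as follows. The phase names come from the talk, but the enforcement mechanism here is a deliberate simplification (a real runner sandboxes the interpreter itself):

```python
# Simplified sketch of the two-phase transform security model described in
# the talk: only the download phase receives network access; the transform
# phase is a pure function of the data handed back to it.

def run_transform(download, transform, fetch):
    """`fetch` (the network capability) is passed only to the download phase."""
    data = download(fetch)   # phase 1: allowed to fetch from the internet
    return transform(data)   # phase 2: no fetch in scope, pure

# Stand-in for a network client; a real runner would inject an HTTP client.
def fake_fetch(url):
    return {"https://example.test/stats": [{"w": 1, "c": 2}]}[url]

result = run_transform(
    download=lambda fetch: fetch("https://example.test/stats"),
    transform=lambda data: [(row["w"], row["c"]) for row in data],
    fetch=fake_fetch,
)
print(result)  # [(1, 2)]
```

Passing the capability in as an argument, rather than letting scripts import it, is one simple way to make "only the download step can touch the network" hold by construction.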
E
Yeah — anyway, we're excited to start seeding this stuff with some real content, with some really interesting things coming forward. I'm excited to connect with you on that, Hector, especially, because the next thing we have to build is some service that pins massive sets of these datasets to keep them available. We've been really pumped about it — building on libp2p and the underlying IPFS technologies has been a really fun experience in trying to build something that will hopefully facilitate a data commons for the people who are interested in and excited about such a thing. Anyway, yeah.
E
This afternoon, yeah. The transformation stuff is all in it right now, but we just have some cleanup — like, I showed `qri use` and `qri get` today, which land this afternoon. The peers stuff is all there. People can grab it right now and start building — knock yourselves out; we would be deeply appreciative, obviously. Our GitHub repos are under the qri-io organization — the main one is qri-io/qri. Any comments or suggestions are super welcome.
G
So it says "demo", but it's more of a presentation, because it's just about rewriting the WebRTC transport. I have invented this thing called interface-data-exchange, which basically abstracts away the logic of exchanging something like a handshake. The motivation behind this is that when a webrtc-star server is down, the whole thing falls over — and this is, after all, supposed to coordinate p2p, not depend on HTTP, so it shouldn't crash the whole application. So I thought, let's create this data-exchange interface.
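A data-exchange abstraction of the kind described — one interface for getting a small request/response (such as an SDP offer) across, with pluggable implementations — might be sketched like this. The class and method names are invented for illustration; they are not the actual interface-data-exchange API:

```python
# Illustrative sketch of abstracting "exchange one small payload with a
# peer" behind an interface, so a WebRTC transport isn't bound to a single
# signaling server. Names and shapes are invented for illustration.

class DirectExchange:
    """Exchange over an existing connection to the peer (e.g. a relay circuit)."""
    def __init__(self, connections):
        self.connections = connections  # peer_id -> callable(request) -> response

    def request(self, peer_id, payload):
        return self.connections[peer_id](payload)

class RelayExchange:
    """Exchange via a shared third peer when no direct path exists."""
    def __init__(self, relay):
        self.relay = relay  # callable(peer_id, request) -> response

    def request(self, peer_id, payload):
        return self.relay(peer_id, payload)

def dial_webrtc(exchange, peer_id, sdp_offer):
    """The transport only sees the interface; either implementation works."""
    return exchange.request(peer_id, sdp_offer)

answerer = lambda offer: {"type": "answer", "for": offer["type"]}
direct = DirectExchange({"peer-b": answerer})
relay = RelayExchange(lambda pid, payload: answerer(payload))
print(dial_webrtc(direct, "peer-b", {"type": "offer"}))
print(dial_webrtc(relay, "peer-b", {"type": "offer"}))
```

The point mirrored from the talk: the dialing code never learns which path carried the handshake, so losing one signaling path doesn't take down the transport.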
G
It's basically exchanging a single handshake — or signaling, or something like that — as a request, through an abstracted-away interface. It also has many advantages over webrtc-star, because it's not bound to a single server, and it's also not using socket.io, which apparently is a good thing for a libp2p transport. So I've made two implementations. The first one is libp2p-exchange-direct, which I'm going to show via its tests as a demo; it basically uses the existing...
G
...p2p-circuit connection to exchange the SDP offer for WebRTC. There's also libp2p-exchange-relay, which uses a peer shared between the two other peers to exchange the SDP offer — not to be confused with the other thing I made, the rendezvous protocol. And this is basically a graph of how it will work.
G
So, with exchange-direct, there's already a connection: a p2p-circuit is established through a node that both browsers are connected to, and it's secure. It simply exchanges the SDP offer and upgrades that connection via a dynamic protocol — which is going to be implemented soon — and then it's a normal WebRTC connection, without any need for the server anymore. The relay module is similar: there's a server in the middle.
G
The nodes are not connected via a circuit, and they don't need to actually connect. Instead, the first peer asks for the public key of the other peer, then sends an encrypted offer to it through the server; the server forwards it, the whole procedure comes back the other way, and then the WebRTC connection is established. That's basically the point of this rewrite. Later I will try to integrate this into libp2p, so it's integrated in a similar way as discovery is. There are also a few features that need to get added in the community, like supporting passing the swarm as an argument to a transport, and merging this. Maybe in the future the rendezvous server can be enabled by default, and it can get even more decentralized. I also have a future idea to build something like sharing on top of this.
G
G
So what it did is: first it connected to the p2p-circuit relay; then it connected those two peers via the circuit; and then it exchanged the signaling offer via the exchange protocol. I guess it's more like an interface, so any module that implements the interface could be used for it — it's not strictly limited to the relay protocol or the direct exchange; anything else is possible too. So, yeah, this basically shows that it works, and — before I forget — there's another test, but it just repeats the same thing.
C
So last week I spent a day where I tried a bunch of different converters, and none of them worked really well, so I wrote one — there's a link in the notes. It's pretty simple: just copy/paste from Google Docs, drop it in the left-hand side, and it'll give you Markdown on the right-hand side that should be pretty reasonable. If you have issues with it, feel free to file issues or submit PRs. It's really simplistic, but yeah.
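A converter like the one described works by walking the rich-text HTML that Google Docs puts on the clipboard and emitting Markdown. Here is a minimal sketch of that idea, handling only headings and bold — the real tool mentioned in the call is more complete than this:

```python
from html.parser import HTMLParser

# Minimal sketch of the Google-Docs-HTML-to-Markdown idea: walk the pasted
# HTML and emit Markdown for a couple of constructs. The converter linked
# in the meeting notes handles far more than this.

class MarkdownSketch(HTMLParser):
    def __init__(self):
        super().__init__()
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag == "b":
            self.out.append("**")
        elif tag == "h1":
            self.out.append("# ")

    def handle_endtag(self, tag):
        if tag == "b":
            self.out.append("**")
        elif tag in ("h1", "p"):
            self.out.append("\n")

    def handle_data(self, data):
        self.out.append(data)

def to_markdown(html):
    parser = MarkdownSketch()
    parser.feed(html)
    return "".join(parser.out)

print(to_markdown("<h1>Notes</h1><p>This is <b>important</b>.</p>"))
```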
A
I have a question on that, Rob: I know Victor's not here, but do you have any idea if this can be integrated with our notes?
A
Looking around, it looks like everything is good, so before we go I just want to remind everyone that the agenda and such for next week is going to be posted in advance. I will probably post it in the PM issue — or open up a PM issue for next week — immediately following this, so if you'd like to schedule anything ahead of time, you can do that right after we're done here. With that out of the way, that concludes the meeting. Thank you all for coming and bearing with us. Bye-bye!