From YouTube: Filecoin Plus - Nov 9 2021 Notary Governance Calls
Description
Want to get involved in Filecoin and Filecoin Plus? Check out our community Slack channels at https://filecoin.io/slack, and learn more about Filecoin Plus at https://plus.fil.org/
B: There have been a couple of issues brought up recently that we want to flag, and then we'll talk through the large dataset process, a couple of reminders and some outstanding things there, and then time for an open discussion. But before we jump into that fun agenda, we have a new old face: we want to welcome back our good friend K-Ray. So, K-Ray, welcome. Take a second, say hi.
C: Oh, I will take you up on that. Thanks, Galen. My name is K-Ray. I've been with the Foundation for about eight days now; it's been nice to be here. Prior to coming over to the Foundation I worked at Protocol Labs, so I have a very solid understanding of what's going on with the program as a whole, but I'll be joining to see if I can help out from a support perspective: helping move things along and keeping pace on what you need.
B: Awesome, we are so stoked, and I think this is really going to add huge, significant value, both in how far we can go, how fast we can go, and how everyone feels along that journey. So very, very excited to have more supporting members on this cast of characters.
B: So, looking at some of our metrics: we pulled this snapshot yesterday, and there are a couple of things here that fill my heart with such deep, deep joy, specifically looking at our average time to DataCap. Right now we're seeing it at two days and 18 minutes, and our average time to DataCap in the past 30 days is a whopping one day, 16 hours.
B: So this is great. We're starting to see that the changes we are making really are having an impact on decreasing that time to DataCap, which, from a lot of the things we hear from clients, feels like a strong direction to go for that more seamless onboarding experience. Even if it's that first smaller tranche of DataCap, it's enough to get them started and get onto the network from there.
B: We're still seeing the overall average with direct notaries at 11 days; in the past 30 days that is an improvement, down to nine. We want to keep coming up with ways to drive that down. Specifically, we think one of those is faster response times, so we're wondering about ways to get better notifications and what kind of tooling we could make.
B: Looking at our LDNs, we've gotten about 3.3 to the clients, which feels like a small percent in comparison. But again, it's sort of just a different flow: when we create that notary with something like five pebibytes, that is kind of reserved up at the top and then trickles out over time. That said, from the amount going to clients into deals, we're seeing a pretty quick throughput, so we're very happy about those metrics.
D: Cool. So we're continuing to benefit from the work that was originally done by the Textile team, hoping to eventually replace this with the notary leaderboard kind of thing that we talked about in previous governance calls, which I'll be partnering on with K-Ray and the rest of you folks who are interested in helping build something like this out. In general, the amount of content coming in continues to be at a pretty steady rate, with solid response rates.
D: The due diligence processes are resulting in not just everybody getting DataCap automagically, which is fantastic. I know Andrew and Anya are in this call, so I do have a quick question for you, which is a bit of a segue. I noticed that the table itself is an aggregation of every GitHub account that's ever been assigned an issue, and so I actually PR'd this to try and get rid of some of the names that were only there for testing purposes. Like, I am Super Mario, and I'm on this table as well, because I tested some stuff by assigning it to myself. So what I wanted to know was: whenever you get the chance, take a look at the PR. Not an urgent thing.
D: I literally did this yesterday, but based on whether or not I did it right, I'd love to write up a line or two of docs on it as well, or ask you to do that, so other folks can fall in line and adjust accordingly if they have changes. And then the team that's running the verifier list, because we have a list in the back end that's driving the integration between the app and GitHub, can probably consolidate and snap to this as well, so we have a consistent mapping of who is who for each notary: what is their identity on the Filecoin Slack, what is that identity on GitHub, and then their preferred contact method or whatever.
D: So, no major surprises or changes on this one, I think; in general things are pretty consistent. And I see chat from Andrew: if they can publish a JSON or CSV, we can put it in. Okay, even better, we already have a JSON; actually that's how we're searching it in the back end for the app. So maybe let's chat about this one offline, yeah.
D: We can get a thread going somewhere. And I guess, Andrew, are you the right person to ping on this, or should we coordinate with Anya or somebody else?
D: Go with Anya, yes; from what I understand she is the notary pro, so we'll keep her in the loop. Yep, makes sense. Awesome, let's go to the next slide. Sweet. So, recapping on the progress that we're making towards the general, like, Q1 sort of milestones. I keep forgetting to update the title again; we should really remember to do that in the template.
D: The six-month thing, yeah. I know, every time this comes up it's like a joke and then we forget to do it. But yeah, we're making good progress, at least on the first one, which is just trying to get more DataCap out there. So I ran some numbers yesterday. Basically, how I calculated the LDN DataCap was this.
D: I took every LDN that had effectively given DataCap to its project and summed up all their max allocations (whatever the number was between zero and five), and then every LDN that had already given an allocation and may have been ready for a second or third allocation, and added that to it as well. And so, if you want to replicate my math, you would go to the large-datasets repo.
D: You would sort issues by label: "approved", which are granted (actually "state: granted"), which means they've already received some DataCap, and sum up all of those. Then you could do "state: ready to sign", which includes both the set that have never received DataCap and the set that are applying for their second, third, or fourth allocation. I then manually went through those and checked for the ones that have received DataCap versus the ones still going through due diligence or waiting for notaries to sign, and that's how I got to about 75 pebibytes, which is, like, massive.
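The tally D describes (summing max allocations over granted and ready-to-sign LDN issues) can be sketched as a small function over already-fetched issues. The label strings and field names below are illustrative assumptions based on the call, not the repo's actual GitHub schema:

```python
# Minimal sketch of the DataCap tally described above, assuming issues
# have already been fetched from the large-datasets repo. Label names
# ("state:granted", "state:ready-to-sign") and the "max_allocation_pib"
# field are assumptions for illustration.

def total_datacap_pib(issues):
    """Sum max allocations (0-5 PiB each) for LDNs that have received
    DataCap or are ready for their next allocation."""
    total = 0.0
    for issue in issues:
        labels = set(issue["labels"])
        if "state:granted" in labels or "state:ready-to-sign" in labels:
            total += issue["max_allocation_pib"]
    return total


sample = [
    {"labels": ["state:granted"], "max_allocation_pib": 5.0},
    {"labels": ["state:ready-to-sign"], "max_allocation_pib": 2.5},
    {"labels": ["state:rejected"], "max_allocation_pib": 5.0},  # excluded
]
print(total_datacap_pib(sample))  # 7.5
```

In practice the issue list would come from the GitHub issues API filtered by label, with the manual "has received vs. still in diligence" check D mentions layered on top.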
D: Like, a month ago we were in the 30s and the 40s, and a month before that we were talking like 15. So I think it's pretty awesome to see good growth there.
D: We're making steady progress towards three digits, which is fantastic. I think it'll be a big milestone for us to say that, in general, DataCap is not the blocker: there is DataCap in abundance, available for clients that are worthy of receiving it. Clients doing the work, talking about their projects and doing the KYC that's necessary, should be successful in the system and receive that sort of low-friction onboarding path that we want to build out via Filecoin Plus. Second up, on TTD, my favorite metric, talking about how long it actually takes to get DataCap out to a client: this one, as I stated last month, is still hovering around the same numbers. I think we've got a long way to go on this. We have one open discussion.
D: Since I started that discussion, by way of the governance process I'll probably take it from being a discussion into an actual issue, and then loop in the folks that are on verify.glif.io to see if that's something they're interested in doing, or if we need to go via a different path to create an automated notary or something. I think that should help with that zero-to-one-tebibyte range specifically, but I think there's the one-to-one-hundred-tebibyte range as well, where that more manual verification part is required. I'd love to get some conversation going, especially now that we've got K-Ray here too, who's been deep-diving into the general notary experience in Filecoin Plus. I'd love to hear from the notaries that are here today: what are things that could exist in the system that would make your due diligence turnaround time lower, or your due diligence burden easier to handle? Specifically, I don't think we need to compromise on due diligence.
D: I don't want that to be an outcome of what I'm saying. But I would love to hear if there's tooling that could exist, or software that could be written, that provides to you the data you'd want to look at, so that instead of needing to spend several minutes or hours with each client, you go down to spending seconds, or can review things much more quickly, especially in cases where it's likely a good client.
D: So yeah, maybe I'll go quiet for a minute and see if anybody has any ideas immediately. We can take a look at the chat and see if folks have prompts or thoughts for ways in which we can improve your process: with support, not replacing it, just augmenting it. Yeah.
D: There's a question from Faye in the chat: "I know the new notary application round is on hold right now; maybe consider some notaries for LDN only." That's interesting. So, Faye, basically you're saying: instead of waiting for an entire notary election cycle, maybe we just start taking notaries that can participate on the LDN front without getting an individual allocation. Is that right?
D: Yeah, that's pretty interesting, right, in terms of how we think about notary status and the role evolving. I know Meg's not here in this call (she'll be in the afternoon one), but she's brought up the point of generally looking at different levels of engagement.
D: Maybe this is an opportunity to draw that distinction as well and say: hey, you want to participate in LDNs, so do you also want to do manual verification? Because, theoretically, one of those is lower-lift than the other: with LDNs, even though you still have to do arguably the same amount of due diligence, you just do it less frequently, because those applications are being shared across multiple notaries. And so that could be an interesting way of providing a path for that notary, which is kind of awesome.
D: What are your thoughts on that, others? Or, in general, I guess, if there's anything else we could do from a tooling standpoint.
D: Cool. This slide is largely unchanged from last time, and I actually don't want to spend too much time on it, because I think we're better off keeping some time for discussion at the end. But in general, the three core focus areas, in addition to TTD and DataCap abundance, are risk mitigation in the system, improving the client onboarding UX, and general improvement of the Filecoin Plus governance processes.
D: These are meant to also serve as prompts for your thinking about ways in which you'd like to contribute to the program and ways in which you think we can improve things. Thinking about which of these general focus areas or goal areas an idea accrues towards is helpful for framing and prioritizing within the community.
B: We'll pause in a second for open discussion, so any thoughts that come up around these (risk mitigation, onboarding, and governance), we'd love to hear more. A couple of things have popped up recently that we just want to flag for the community: a sort of bug issue with the amount of DataCap remaining in some of the places where we're calculating it. Most of you are aware that there was a recent network update; this has had a cascading impact on a variety of people.
B
Some
of
their
nodes
may
or
may
not
have
been
updated
to
the
most
recent
version,
and
some
different
api
endpoints
have
changed
so
specifically
as
of
right
now
on
our
dashboard.
Some
of
the
values
are
incorrect.
We're
expecting
a
shift
change
today
or
tomorrow.
That
will
correct
this,
but
where
this
is
predominantly
showing
up
is
on
the
interplanetary
dashboard.
B: The remaining-DataCap value is not correct. The other statistics, like DataCap allocated, times to DataCap, and values of DataCap overall, are, to the best of our knowledge, correct. It's just the DataCap remaining, which was coming from a certain API endpoint that is no longer up to date. That had an impact on the bot that we were using to kick off subsequent allocations for LDNs.
B: This impacted a number of people; I think Andrew and Anya on the Textile team felt the impact of this over the last week. That bot had been looking at the dashboard to see how much DataCap remained, and because it was not pulling the correct amount, the bot was not kicking off the subsequent allocation. That has been changed: the API is now updated and corrected, and the subsequent-allocation bot should be successfully looking at the amount of remaining DataCap and kicking off those allocations appropriately. So that's an update there.
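As a rough illustration of the bot's decision described here, a minimal check might look like the following. The 25% trigger threshold and the function shape are assumptions for illustration; the actual bot's rule and data source are not specified on the call:

```python
# Hypothetical sketch of the subsequent-allocation check described above.
# The 25% threshold is an illustrative assumption, not the real bot's rule.

def needs_subsequent_allocation(remaining_pib, last_allocation_pib, threshold=0.25):
    """Trigger a new allocation once remaining DataCap falls below a
    fraction of the last tranche. A stale 'remaining' feed, as in the
    API outage described, keeps this returning False and stalls clients."""
    return remaining_pib < threshold * last_allocation_pib


print(needs_subsequent_allocation(0.2, 1.0))  # True: nearly used up
print(needs_subsequent_allocation(0.9, 1.0))  # False: plenty remaining
```

The outage was essentially the stale-input case: the check itself was fine, but the `remaining_pib` it read never decreased.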
B: The bot, to our knowledge, is working, and those dashboard values should be updated by the end of the week, but we just wanted to bring that up.
B: The governance team and the bots will check for just basic completion: we look for things like, is this a duplicate submission, are all of the answers filled out? That's really all we're checking. Then we work with the root key holders to actually create the multisig notary, so there's an amount of back-and-forth processing.
B: Some of this is automated, and what we come up with is then a multisig notary with an amount of DataCap, with a weekly amount that we expect it to use. I think it's 23 active notary addresses on that multisig. Once that clears the root key holders and that multisig is on the verified registry, the bot then kicks off the first allocation and changes the label to "bot ready to sign", so that it now shows up in the app. So all of this prep work is happening, and we are tracking it. We have various API-based ways of tracking this time, and we're going to start publishing how much time passes between each of these changes, to look at where the slowdowns are. Before, this was taking months; we're now looking at more like a week to go from a client submitting an application to the first allocation being started.
B: That's the prep work; then the notaries can jump in and start to action and do due diligence on these client applications. So, as a notary, you may start at the app, signing in as a notary and looking at the large requests (I think right now there are something like 18 showing up in the app), or you may start in GitHub, searching open issues on that large-dataset repo with the label "bot ready to sign". Either is a valid starting point; either is going to get you to the same place at that time.
B: If you have questions, feel free to ask them as a comment and tag myself, Deep, or K-Ray, and we can change the label if you think it warrants it. We can change the label to "needs review", or actually I think there's a different one that we're using; sorry, I'll have to go check exactly which label it is, but that will take it out of the app.
B
So
other
people
won't
sign
it
until
we
get
some
back
and
forth
with
the
client
themselves.
To
answer
your
questions
to
your
satisfaction,
if
you're
comfortable,
we
encourage
you
to
go
sign
the
allocation
and
we're
trying
to
kind
of
bias
towards
maybe
signing
a
first
allocation
for
a
smaller
amount
of
data
cap
and
then
let
the
client
start
making
deals
and
then
before
the
second
allocation.
Maybe
we
could
ask
some
more
follow-up
questions.
B
Maybe
when
they
go
to
make
the
second
allocation
we
can
see,
was
their
deal-making
behavior
up
to
what
they
said
in
the
application
itself,
or
is
there
something
suspicious
happening?
Maybe
a
first
allocation
is
a
safe
enough
starting
point
to
get
more
information
right
now
there
is
a
there
is
a
process
expectation
that
is
not
hard
coded,
but
it
is
just
a
standard
that
we
are
working
towards
where,
ideally,
an
individual
notary
should
not
sign
two
allocations
in
a
row
for
the
same
client.
B: So, for example, when Textile's allocation comes up and they're ready for their first set, Danny could sign that first allocation; they get a second signature from Rears; they run through that allocation, and they're ready for their next allocation of DataCap. Ideally, Danny and Rears don't sign that one; two other notaries jump in. Danny and Rears could sign the third one, and Danny could sign any of the other 18 open large datasets.
B: We want to do this to try and let other notaries have a chance of looking at these different deals, or different client applications, and seeing if there is maybe something that another notary missed. It also just helps distribute the load. So if you have signed one, go sign another instead of signing the same one twice in a row. Like I said, right now it's either 18 or 19.
B
One
of
them
may
have
actually
cleared
last
night
or
this
morning
and
shifted
it,
but
you
can
see
we
have
a
number
of
allocations
for
a
variety
of
sizes
with
an
audit
trail
to
investigate-
and
this
is
where
we
want
you
to
jump
in
either
sign
or
ask
those
questions
or
both
sign
it
and
say
I
went
ahead
and
proved
this,
and
I
have
some
questions
about
it.
B
So
hearing
from
faye
like
the
comment,
the
process
is
indeed
much
faster.
That's
great!
That's
exactly
like!
As
a
client
with
you
know.
Faye
now,
I
think,
has
one
ldn
that
has
closed
two
other
ldns
that
are
in
flight.
That's
what
we're
trying
to
do.
It's
getting
more
of
these
deals
flowing
faster
through
the
network
and
onboarding
more
of
this
like
valuable
data.
B: When there were only seven notaries, we would ask for there to be a lead notary to take charge. Now that all 23 are notaries, everyone is equally responsible: there's no lead notary per client; that's not a current part of the process. The threshold is two; the threshold before was four. This is a significant change: we only need two notaries to jump in, one proposal and one approval, for that DataCap allocation to push through. There have been a couple of places where this has caused a slight issue, and we're working on ways to better catch that.
B: What may also happen is that three notaries all jump in really close together and all try to sign around the same time. What can happen there is: the first notary proposes an allocation message transaction; the second notary approves it; it hits the threshold; that message clears; that allocation lands with the client. The third notary, who jumped in right after those two (it could be 30 seconds later), would kick off a new transaction ID for a new allocation amount.
B: We may then go in and remove the label, and there's now a floating proposed transaction ID for an amount of DataCap. When that label goes back on and it shows up in the app again, the next signature (that fourth signature) would clear that other transaction ID. So again, I think this is what happened with Textile most recently. Not the end of the world; we're working on some ways to fix it, and it's still significantly faster than the previous process with only seven available signers and a threshold of four.
B: So we think that, with this trade-off for speed, where we run into some of these situations where we move maybe a little too fast and get this extra next transaction, at the scale of DataCap we're talking about this is still safe enough to try. This is not, in our opinion, a big threat or risk to the network. Other things to note here: we have seen, and heard from other people, that sometimes the app can be slow.
B: This is because the message has to land on chain, so it can sometimes take over 30 seconds from when you start to send a message for that approval indicator to appear. Again, we're working on some tooling things, but you'll see some of those cycling dots letting you know that the page is still running. You can also open the console in your web browser, whatever you're using, and watch that message get built and then watch for it to update with either an error message or a success message.
B: You can refresh the app once you have seen one of those messages, when it has been a minute, and you can also go to the GitHub issue, refresh it, and see if that message has landed. So there's a variety of ways to check whether the message has gone through. And the last thing that I'll note is: with these new process changes, our documentation in a couple of places is out of date. This is a work in progress; we've been trying to get to it.
B: If you find some out-of-date stuff, propose some copy to update it, and we're happy to share that burden. So, pausing here before we go into open discussion, I wanted to see if there were people on the call that had questions about this large dataset process, either from the front (the governance part), the middle (the notary diligence), or any of these notes that I called out.
G: I don't see any hands; just one small question, yeah. When is the documentation expected to be done?
B: A great question. It's one of the things we've mentioned in our updates on the client onboarding experience: to have better documentation. So hopefully within the next month we will have some overhaul. It may be a slow trickle, or maybe a big launch of a bunch of new documentation.
B
Okay
last
thing:
I'm
going
to
say
about
ldns
if
you're,
notary
and
you're
on
this
call
go
check.
Some
out
sign
some
of
those
ldns
shifting
gears
here
to
we
have
these
still
outstanding
notary
governance
discussions.
We've
touched
on
these
a
lot.
I'm
gonna
pause
here
and
see
if
anything
had
come
up
recently
on
these
specific
discussions
deep
had
mentioned
the
you
know,
automated
verification
increase
any
comments
or
questions
from
the
room
about
one
of
these
or
all
of
them.
D: So I have something to add, I guess, that's not on the screenshot, because somebody actually made a new discussion topic this morning and then pinged me about it about 30 minutes ago, so I figured I'd flag it in the conversation today. It's very exciting. Maybe it's worth just going to the live discussions page, and we can talk about it, if you don't mind pulling it up. But yeah, under notary governance, Jenny Juju has created a FIP about some topics that we've often talked about, specifically around DataCap ownership once it gets to the client. So today, functionality is limited: notaries can either add or remove, and our root key holders basically just set the DataCap available to notaries. But once it gets given to a client, it's just with that client address forever, until it gets used in deals. And so this is about basically saying, well:
D
That
client
should
be
able
to
transfer
data
gap
among
the
addresses
and
and
introduces
an
interesting
idea.
You
know
around
like
if
somebody's
using
textiles
bitbot
or
using
estuary
as
an
intermediary
or
a
deal
broker,
can
they
transfer
that
data
cap
to
the
intermediary
instead,
so
that
I
I
don't
yet
have
a
strong
opinion,
but
it
is
an
interesting
point
where
it's
like.
D
Oh,
like
what
is
the
ownership
of
data
capital,
client
and
like
what
rights
do
they
have
about
moving
it
around
like
what
happens
if
they
lose
an
address
or
want
to
transition
to
a
different
address,
and
then
there's
like
the
whole
like
remove
that
they
have
from
a
client
which
could
be
in
activity.
It
could
be
actual.
D
You
know
issues
for,
like
you
know,
malicious
data,
cap
usage
and
things
like
that,
but
generally
providing
a
mechanism
whereby
data
cap
can
be
returned
from
a
client
back
to
the
system
or
back
to
the
nursery
that
granted
it
or
otherwise.
This
is
especially
interesting
on
the
ldn
front.
D
Two,
as
we
start
to
give
out
bigger
and
bigger
chances
of
data
cap,
how
do
we
continue
to
earn
confidence
and
let
the
clients
thrive
in
the
confidence
that
they
have
earned
versus
have
consequences,
for
you
know
not
being
honest
within
the
system,
et
cetera,
and
so
in
general.
I
think
this
is,
you
know
not
controversial,
not
really
surprising,
for
anybody.
That's
in
this
call.
It's
like
stuff,
we've
been
talking
about
for
a
while,
so
I
think
it
makes
sense.
I'm
glad
that
jenny
giuji
was
able
to
drop
this
up.
D
I
don't
think
she's
on
the
call
right
now,
but
it
might
be
good
to
just
spend
a
few
minutes
chatting
about
this
as
well
to
see
what
people's
reactions
are,
and
you
know,
there's
I
think,
there's
still
a
couple
of
open
questions
called
out
like
how
do
we
actually
like
what
is
the
mechanism
by
which
these
things
happen?
Who's
it
allowed
to
do
what
what
are
the
bounds?
G: So, for the first one, the transfer of DataCap: we should avoid these clients, I mean, selling their DataCap through this mechanism, right? So if we could have some kind of customization, like for an S3-style service, where they have to meet some kind of requirements; if we could have some kind of whitelist, then we actually don't have to approve each transfer, right? Just let them transfer DataCap to that service.
D
Yeah,
that
makes
sense
I'm
going
to
take
a
couple
of
notes
here
and
then
add
it
in
into
the
issue
later.
D: I do think it makes for an interesting challenge. For example (and I would love to hear Anya's and Andrew's thoughts on this too), previously my opinion was sort of: Estuary and Bidbot, for example, have disproportionate knowledge about who their clients are, right? They know who they're working with, and so giving them DataCap, or having DataCap transition to them, actually doesn't sound that wrong to me.
D
That
would
make
a
lot
more
sense
to
me,
where,
like
instead
of
having
clients
like
allocate
transfer
data
back
and
forth
between
addresses
you
just
make
that
deal
broker
notary,
that
your
broker
then
knows
where
to
send
the
data
cap
like
they.
They
know
whether
or
not
that
data
cap
has
to
go
to
a
specific
client
address
or
to
a
shared
address,
etc,
and
so
I
yeah
that
that's
like
one
of
the
thoughts
I
have
it's
not
a
super
strongly
held
opinion.
I
think
we're
still
learning
but
yeah.
What
do
people
think
of
that.
H: Yeah, sure. I'm kind of leaning with what Deep just said; this use case seems a little tricky. I'm trying to think of who it's solving for. If I was a client and I applied, and then I wanted to transfer my DataCap, it means either I applied prematurely and didn't quite know how to use the DataCap (which is fine, no big deal; maybe you should burn the DataCap), or I applied with the intention of giving a broker more DataCap. But that doesn't necessarily seem like the right way for the broker to do it: the broker should be vetting the clients. Yeah, I mean, I guess... okay, sorry, I'm just thinking out loud here. I guess one idea here is that it takes the burden off the broker, so they could just stop applying for DataCap generally, stop vetting clients, and just tell the clients to go get vetted themselves by notaries.
H
So
that's
that's
beneficial,
but
that
also
sounds
a
lot
like
the
large
data
cap
process
is
a
little
broken.
If
brokers
can't
use
the
large
data
cap
process
to
onboard
more
clients
through
their
system,
so
yeah
a
little
bit
of
a
conflict
there.
I
don't
know
I'd
like
to
know
I'd
like
to
see
like
some
concrete
users
for
this
before
you
spend
too
much
engineering
time
making
it
making
it
real
yeah.
B: Yeah, I think that, again, philosophically, the question of "should we build tooling for the ability to transfer DataCap from client address to client address" is one question, and then there's thinking about why we need this tool. For this use case of going to a broker, I feel like the argument invalidates our sort of philosophy that DataCap should be readily available and abundant and not a scarce resource. Saying you would want to transfer this to another entity to use in your stead is, again, to Andrew's point about the broker: there's something not functioning with how the broker is able to do their role and their service. And it becomes a contention point: we're either going to shift it to say, okay, well, now the brokers just aren't going to do that, and it's going to go only to clients, and then that becomes a bigger process, or the brokers are just going to do it.
B
I
think
that
there
there's
a
bigger
question
around
like
what
are
the
other
use
cases
where
this
might
be
valid
of
a
legitimate
client
who
loses
access
to
an
address,
but
would
need
to
transfer
it.
But
again
my
question
there
is
shouldn't
they
just
be
able
to
easily
apply
for
more
and
say
I'm
going
to
claim
this
address.
B: What does it look like for DataCap to flow back through the funnel in various ways? If DataCap is not used by a client, maybe it starts to degrade; their diligence is no longer as valid over time. But then, looking at the flip side of that: if they use it quickly, and that deal expires, meets completion without any flags or issues, what would it look like for that client to get some portion of that DataCap back in kind of this flow?
B
And
similarly,
what
would
it
like
for
a
notary
to
top
up
their
allocation?
You
know
when
a
client
successfully
has
used
or
removed
that.
So
I
think
I'm
like
more
interested
in
you
know
what
is
it?
What
are
the
mechanisms
we
want
to
remove
a
verified
client
and
remove
their
amount
of
data
cap
and
then
also
what
are
the
ways
that
we
want
to
think
about
data
cap
flowing
back
through
the
system,
or
maybe
we
don't?
We
only
want
it
to
flow
one
way
forever.
D: It also, I think, raises the general question of whether we even want to change what it means to qualify to be a notary, and how to continue being a notary over time. I know people have tossed around ideas like staking, or incentivizing good notary decision-making behavior, and stuff like that, and so I think it's certainly worth exploring. I don't know what a good format is to do that.
D
Maybe
what
we
need
to
do
is
like
have
a
governance
call
where
it's
just
like
open
brainstorm
on
like
some
of
these
topics,
and
we
like
give
people
a
chance
to
think
and
plan
their
opinions
for
like
a
weekend
and
show
up
and
talk
through
them
or
something,
because
what
ends
up
happening
is.
I
think
we
end
up
spending
this
time
like
bringing
up
stuff.
D
We
should
talk
about,
and
then
we
don't
really
talk
about
it
as
much,
because
the
expectation
is
we'll
do
it
on
like
github,
and
I
think
the
engagement
there
tends
to
be
lower
than
in
the
calls,
and
so
either
we
need
to
encourage
more
people
to
use
github
better
or
we
just
need
to
restructure.
Some
of
these
calls.
Maybe
that
have
enough
time
to
have
like
those
harder
conversations
and
we've
also
talked
about
like
creating
focus
groups
or
work
like
guild.
D
Just
as
a
general
aside,
I
guess
I
think
charles
is
unmuted
charles
what's
up,
is
there
anything
you'd
like
to
add.
I: You know, just about the transfer-DataCap part: I'm kind of in support, because our system also has this issue. Not really an issue: we have DataCap, but some users want their own, because our project is like IPFS transfer, right, and we provide something like FS3, so they can build their own FS3 instance, but they can use the bot system. So then it becomes an issue: how can we enable our users to do this on their own instead of through us?
I
So I'd just say I'm supporting the transfer idea. All right, for the removal, removing a verified client, I don't have much of an idea about it, honestly, because there are lots of reasons people don't use it. Like us: we gathered about 50 terabytes, but we're only using five terabytes at the moment. It isn't that we're going to have some really big issue at the moment; we need to figure it out.
I
It's going to become something that feels slow, and yeah, so I think we can give them a little bit longer to see what's happening. But yes, if they still are not using it, then it's necessary to start some kind of process of asking them why they don't use it, right. And we also noticed that the notaries are pretty busy recently, but yesterday someone went through some applications and noticed that some applications are in due diligence, and yeah.
I
One common thing with some projects like this is that they are doing this, and we don't really know: do they own the data, or are they also authorized to use this data? It's something we don't really know. So even though they give proof, there's no direct relationship shown between the party that owns the data and the party authorized to use the data. We need something like this to know. In some scenarios public data is fine, but some of them are not really public data.
I
I think there was a data discussion in a previous GitHub issue or somewhere. I said there's a data doc; what's going on with that, or is it just a deal discussion?
D
Yeah, I think there's a difference between the data question right now and the general one. I think Filecoin Plus in general is learning and restructuring itself in a way that would make more sense to function like that.

Basically, Charles, it's just that, in general, we're sort of moving towards notaries being, as we've talked about, more like guardians or watchpeople rather than gatekeepers, and so borrowing some concepts from what it would be like to run as a DAO, where the actual notary role is more focused on policy setting, creating rules and frameworks and organizational structure, and then the execution happens with software. Take, for example, the LDN process.
D
It's already a step in that direction, where a lot of it is starting to be automated, but the notaries still sign allocations on-chain. Ideally, in the future, notaries could create standards that clients have to comply with, and bots could just do the execution on-chain for them, right. So it's mostly in that direction: how do we take lessons from what we know about how DAOs work and apply them in a way that would be productive here?
D
There are separate conversations that I know Andrew and JV and others have had about how we create a concept of a data DAO in Filecoin: what would that look like, what would it mean? In some ways I think it's very similar to what we're talking about and could be the same, and in other ways I think it's different and in kind of a different scope. So I don't actually know what the latest has been on that; maybe Andrew can shed some light.
D
So I'll repeat the question. The question from Charles was basically that he's heard the term data DAO and was wondering what that actually meant, or what was happening on that front. And basically I was just saying that we've been talking about Filecoin Plus as a DAO, which, depending on who you ask, is either the same topical conversation or a completely different scope. So I don't know if you've still been having conversations on that front, and in what way.
H
No, I haven't heard much progress there. I think a lot of it seems to be hinging on the FVM work, the idea that those things would live more natively on the Filecoin blockchain. So I don't know; I think it would be worthwhile as a community to do more experiments here. I think I've mentioned this in the past: even creating a mock DAO on Ethereum, where we use Snapshot to handle proposals.
H
The challenge there is just that notaries would need to maintain both their Filecoin key and their Ethereum keys, and we'd need to do some mapping, but you know, not crazy, and it would allow us to already be seeing how that works. I feel like you guys do an amazing job here keeping Fil+ moving forward, but it's clearly still a Protocol Labs project, you know, it kind of is.
H
I think it's a little hard for you to get us all to take over some of the work that you have to do day to day to keep this moving forward, whereas if we were experimenting with some of those things now, maybe people could jump into taking on more tasks, like keeping infrastructure running, getting new interfaces built, all those things that are maybe a little hindered now because we're not quite there.
B
Yeah, I would like the record to show, for YouTube and all those watching, that this meeting is being run, managed and hosted by the Filecoin Foundation.
H
Just to even add more to that: I feel like when I have problems, I go and complain to Galen, or I ping Galen to help me find the right people to unblock that. Which, I mean, Galen, you do awesome work and I appreciate all of it, but I feel like some of that could be shouldered more by other people in the community if we were to experiment more.
H
Have you ever heard of Conway's law? It basically says that the organization of your code will mirror the way your organization is structured. And so I feel like when you look at Fil+, you can kind of see that playing out. If you don't know the shape of the organization, you could also look at the shape of the code and infer a bit what the shape of the organization is. So yeah, just an insight there.
B
I think that echoes some of the things that I've heard from Faye as well at other times: that sometimes the number of notaries that we have, or the speed with which they reply, or the amount that each notary is taking on, is kind of where we are at this early stage.

There are still a few people doing a lot of the lift, and we're wondering how we scale that, not just by adding more people to the mix. In typical organizations we have this flow of people, process, programs: first you just have to throw more people at a problem to solve it, then you can start building better processes around it, then you can start building better programs. And I think there's sort of this idea of: are we at the point where we need to start experimenting more with programmatic changes, like various DAO structures, so that as we add more people, the program can scale up with it? Rather than asking what it looks like to do everything the same way, but with 50 notaries, or with a hundred notaries.
H
Yeah, so I have a couple of questions on my mind, just open-ended questions to give you kind of where I'm coming from. I participate in some other DAOs, mostly lurking, but as a member trying to follow how these things are unfolding, and when I look at Fil+ I think there's a ton of opportunities. So some questions that I have: are there roles that aren't notaries that should be here, people that are enthusiastic about the network who would be collaborating with us to make this move forward?
H
My answer is yes. So how do we get them in here? Another question I have: if the Fil+ team and community here is dedicated to distributing this data cap and getting clients onboarded, do we have a business development function, and do we have business development people in this group, on this call, helping us think about how we get this distributed? I haven't seen it emerge yet, and I know there are business development people in lots of different orgs in the community.
H
Those ideas start as a document, as a proposal in Notion, and then they start snowballing, and people actually say: oh, that's my passion area, I'll do that, and they take the lead and go do it. And there's a little bit of that here where we're a little logjammed, kind of just getting the pieces all in place, but that organizational stuff isn't quite sorted out yet, so it's not allowing that sort of flourishing of parallel ideas and experimentation.
H
Every DAO I know has a membership team, and their job is to help new people onboard and give them tasks. So that idea of what other people could be doing here: we don't really have that function. Nobody's out there trying to find the Filecoin enthusiasts and saying, hey, you should be part of this Fil+ thing, because it's more than just data cap. It's business development, it's ecosystem growth, it's high-quality data.
I
Yeah, but I think that part is currently being led by the working group. The working group is doing lots of business integration, getting clients and data onboarding, and it's not limited to just data cap, including the Gravity project, which is encrypted data with incentives to encourage doing more. So that's part of it, but sometimes, yes, there are cross connections, because data cap is also a way to encourage people to store useful data, while the Gravity project is for more business data. So the working group takes a lot of responsibility at the moment.
I
Building the business cases. But notaries can perform more, because it's not really just a name or a class, right; it's a plus to give them more power. I think it's a good conversation. I'm also part of the working group, so I know that there are storage providers in different places; for now we grow more on the storage provider side, but the other side is more about working with data clients. So I think if we can have both together, it will be a good scenario to work out something interesting.
B
All right, Tuesday, November 9th, 4 p.m. in California, the second one of these today. Jumping swiftly into the agenda, same as usual: we've got some metrics to go over, some program updates, and then flagging a couple of issues that have popped up, which are mostly resolved at this point, going over some reminders about the large dataset process, and then time for some open discussion, including some discussion of some FIPs that potentially have impact on Filecoin Plus, to share and hear more about those.
B
But before we do, we'll introduce K-Ray back to the team. For the notaries on the call, I'll pass the mic over to K-Ray to introduce himself.
C
Very warm hello. I was with PL previously, and now I've joined the Foundation. I'm a little bit familiar with what we're doing here on the program.
B
Awesome, thanks K-Ray. Similarly stoked about some of our current metrics. Specifically, we have recalculated average time to data cap once we found some outliers that were incorrectly skewing the metrics, and we're seeing two days. But even better than that, across the past 30 days we've seen one day and 16 hours, so a little bit better.
B
We want to keep driving those down from a client-experience standpoint, just to get that first amount of data cap to them, so they can start making some deals and we can learn more from that.
B
We're also seeing the overall average time for direct notary involvement at 11 days, with the past 30 days hovering closer to nine. So still improvements, which is great, and we want to keep improving. One area where we think we could have some impact is around the time to first response: thinking about ways that we can notify notaries better, so that there's a clearer, faster, easier action on their part.
B
Things like that. Also, looking at our data cap, sorry, volumetric flow, the funnel: we're seeing two pebibytes going to clients, which is a decent amount from the notaries out to clients, and of that, about a third is getting to storage providers in deals. Pretty good.
B
Then looking at our LDNs, it's slightly different, because this is calculated by looking at that top large dataset pool. We're seeing about 3.3 pebibytes to clients, with 800 tebibytes making its way into deals. So the percentages are a little bit more skewed and harder to rely on, but the values themselves give us a pretty good picture and inspire some confidence.
D
Sweet. Just sharing the table that we aggregate, with Textile's help, via the auto-generated script. In the morning session today we actually spent some time chatting about how to make some upgrades to this, as well as educating ourselves and the community on how to update this table, so that moving forward it continues to be an actively updated, accurate source of data. Just to catch you folks up, for those that couldn't make it to the morning conversation:
D
there is actually a JSON registry of all current active notaries that drives the back end of the actual app that sits between plus.fil.org and GitHub, and the different bots that need to do operations between GitHub and the app. So consolidating that JSON with this particular table is probably a good next step in ensuring that we have consistent tracking of notaries across the board, and that's a good action item from here. No major changes from the last time this slide was presented two weeks ago.
D
Nothing really to call out on this front. Cool. And as you folks know, we've been tracking towards this end-of-Q1 slide, which I swear we're eventually going to update, the goals that we've been talking about for a month, a month and a half now. At this stage, the main thing I want to call out is that we spent some time doing the math a little bit more accurately for the total amount of data cap that's been made available to clients.
D
And then, if you pull up the label for, sorry, "awaiting due diligence", I believe, or is it, no, I think it's "needs signing" or something; hold on, let me just double check.
D
I think that was just around 75 bytes, and then, of course, as we know, manual allocations are in the magnitude of about eight pebibytes.
D
Cool, actually, because as we called out in the morning: two months ago this program was hovering at like 30, 40, and then a month or two before that it was in the tens. So it's tremendous progress in terms of clients that are actually being approved to receive data cap via this path, and now, hopefully, we continue to work together as a community to improve that experience and unblock them more for their consequent allocations.
D
One thing I've been meaning to introduce here is a metric that I refer to as DWD. I mentioned this in one of the past governance calls, but the idea is that we should start thinking about days without data cap: what does it mean for a client that actually gets blocked because they're waiting for the next set of data cap to be made available to them? So we should start thinking about access to software and tooling, tools to augment decision-making.
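The "days without data cap" (DWD) idea described above can be sketched as a simple computation over a client's allocation history. This is an illustrative sketch only: the `(granted, exhausted)` date-pair format is an invented assumption, not the actual Fil+ tooling or data model.

```python
from datetime import date

def days_without_datacap(allocations, today):
    """Count the days a client spent at zero data cap, waiting for a grant.

    `allocations` is a hypothetical, chronologically sorted list of
    (granted, exhausted) date pairs; `exhausted` is None while an
    allocation still has data cap remaining. A client is "without
    data cap" between one allocation running out and the next landing.
    """
    dwd = 0
    for (_, exhausted), (next_granted, _) in zip(allocations, allocations[1:]):
        if exhausted is not None:
            dwd += max((next_granted - exhausted).days, 0)
    # If the most recent allocation has run out, the client is waiting now.
    _, last_exhausted = allocations[-1]
    if last_exhausted is not None:
        dwd += max((today - last_exhausted).days, 0)
    return dwd
```

Tracking a number like this per client would make it easy to spot where the allocation pipeline, rather than the client, is the bottleneck.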
D
That would ensure that that number stays low for clients that we believe are good actors in the system. And then, of course, for ones that are not, formalizing that process: empowering notaries to feel like they can actually ask questions of clients, actually do due diligence, and have consequences for how data cap gets utilized in the network in the future.
D
I know Galen's going to touch on this a little bit further later on, so I won't get too deep into it right now. But in general, a prompt for the notaries here and those watching this recording: continue thinking about software or tooling that could be built to augment that due diligence process, mostly to make it more efficient. So not in the interest of any compromise in the quality of due diligence, but rather in the interest of making it lower friction.
D
So, shaving off a few minutes, or giving you one-click access to the data that you need rather than five clicks or ten clicks, to ensure that your experience is good: increasing the likelihood that you do the due diligence you think is necessary, but also reducing your overall time commitment while you continue serving at a standard that you believe is correct.
D
On that note, talking about time to data cap: no significant change from two weeks ago on this front, other than that we've consistently talked about increasing the amount of allocation being given out through automated notaries. So I think the next step here is probably to transition that discussion to an actual proposal and then follow up with verify.glif.io to see if they're interested in doing that, or if a different notary might want to take that on instead. But in general it seems like the community is pretty supportive of doing that.
D
Even sticking to the current KYC-based mechanisms on GitHub, in the future we can continue to explore more efficient things. Cool. On the next slide are the three other themes; I don't see any changes on these. I just want to still highlight the focus areas, though, because I think they're interesting prompts for notaries to keep thinking on.
D
Ideation, creative brainstorming, in these calls, or on your own, or in GitHub and Slack. These focus areas are meant more as a mental model or framework of areas that, as a community, we can default to prioritizing, and as a framework for how we choose to divide up the work that we have in front of us and then prioritize it. So these areas are: risk mitigation, which is generally the things we want to do in the community to make sure our eyes are wide open about what's actually happening.
D
Client onboarding UX, so generally what it actually means for a client to onboard through the app or through GitHub, and how we can improve that to make a very, very low-friction experience for them. And then, of course, the Filecoin Plus governance side of things, which is how we want to improve and mature our own practices and make ourselves more easily held accountable, but also, in general, feel more empowered to make a difference to this program that we're all contributing to. General shout-out:
D
I mean, we just celebrated the one-year anniversary of mainnet, which is also the one-year anniversary of Filecoin Plus. When we're talking about software that needs to exist for 10, 20, 30, 50, 100 years, that's very, very small on the scale of the long-term horizon, and so I think we've got a long way to go. We've already made great progress together, but we've certainly got plenty of time to continue experimenting, iterating and improving our processes. Yeah, unless there are any immediate questions on this, I'd say back to Galen, and let's continue.
B
Okay, awesome, thanks Steve for those updates. Jumping in, calling attention to a bug that some people were facing and ran into last week around the remaining data cap. So, as most of you are aware, there was a network update.
B
This had some cascading impact, where various nodes may or may not have been updated in time, and some different endpoints changed. What this led to is that on our interplanetary dashboard, the value specifically around data cap remaining was incorrect. Other pieces of information on that dashboard were still accurate; it was just this one. That affected the subsequent allocation bot, because it would look to that calculation to see: should it kick off the next allocation request?
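The bot's decision described above boils down to comparing a remaining-data-cap reading against a trigger threshold. The sketch below is a minimal illustration with invented names (`should_kick_off_next_allocation`, the 25% low-water mark), not the real bot's code or endpoint; its point is that a stale or wrong `remaining` reading makes the trigger fire incorrectly or never fire, which is exactly the bug that was described.

```python
def should_kick_off_next_allocation(remaining, tranche_size, low_water_mark=0.25):
    """Decide whether the subsequent-allocation bot should request a new tranche.

    `remaining` is the client's remaining data cap as read from a chain
    endpoint; `tranche_size` is the size of one allocation. The 25%
    low-water mark is an illustrative assumption. If `remaining` comes
    from a stale endpoint, this check misbehaves even though the rest
    of the pipeline is healthy.
    """
    return remaining <= tranche_size * low_water_mark
```

Pointing the reading at a corrected endpoint, as described next, is what made the check behave again.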
B
So at this point in time, actually, both of these things have been fixed. Last week the bot was corrected by pointing it to a different endpoint, so now the subsequent allocation bot is successfully pulling data cap remaining and kicking off those allocations correctly as well. When I checked just before this call, our interplanetary dashboard values are correct. You will see some places where there is a notary listed with zero
B
pebibytes of data cap remaining. Those are situations where that specific notary address has been deprecated for one reason or another. Some of them are notaries that are not participating in this election cycle, but most of them are from large datasets that were on the previous v1 process, which have now had their data cap removed and been deprecated. We're working on updating the names.
B
But, for example, LDN number one, that multisig notary that had seven signers, is no longer an active notary, so we've removed that data cap and removed them from the registry. So you'll still see some zeros; those are accurate. Jumping into some large dataset process reminders: we've had a number of questions from different notaries over the past few weeks, and we've also seen a lot of new applications. I think we're somewhere at 45 or 46 open client large dataset applications, which is great. So the flow:
B
the way that it works is, a client will submit the application. The governance team and some bots will first check only for basic completion, which means: are all of the questions answered, is this a duplicate submission, that sort of very outstanding flag. Once those have been resolved, the governance team works with the root key holders to create the multisig, with all of the existing notaries as signers and a threshold of two. So these multisigs have about 23 signers on them and a threshold of two.
B
We give them the amount of data cap, they get added to the verified registry, and once that is complete, the bot goes in, kicks off that first allocation, and changes the labels on the GitHub issue so that it shows up in the app. All of these steps are happening in preparation for the notaries, and we're now tracking
B
all of these steps for timing, so that we can start publishing details on the timing. But this is also so that the multisig itself can be created and ready with data cap, so that once the notaries have reviewed it, the allocations can start rolling through. I see a hand from Meg, so I'm going to pause.
J
Cool, thanks. My question is just about the app. When I go in there, I see a whole list, pages and pages of LDNs that need verifying; this has changed a bit. So just tell me: are we driving off this list? I'm looking at both. I'm using your issues list in GitHub, which I'm finding quite useful, actually; it's telling me what's coming up. But on this list there are zeros and ones, so I'm assuming... I'm going to pause and let you direct me.
B
Start either at the app or in GitHub. If you go to GitHub, you can search for this label, "bot ready to sign", and you will then see the same thing that's populating here. I think right now it's at 18, because I think one of these cleared after I took this snapshot, but it's about 18 large datasets that are ready to start getting allocations. So you could start at either of those places.
B
You will then have the issue, or the audit trail, and you can review the client application itself. You can ask questions like: is this a real client? Is this real data? Is the amount of data cap that they're requesting appropriate, based on what I think they are saying they will need? And if this is a subsequent allocation, is there any past deal-making behavior that I deem questionable or suspicious, such as a violation of their application terms? Did they claim they would perform a certain type of deal-making distribution, but instead we're seeing something else?
B
We can then ask all those questions directly in the issue and flag them in that GitHub issue. Or, if you don't have questions, you can sign the allocation in the app. You could also sign the allocation and acknowledge that the question you have shouldn't be a blocker, and say: I was curious about this, I went ahead and signed it, but I would love to hear more about your answer to this question.
B
You can do that. If you think that your question should prevent people from signing it, then you can tag myself and Deep, and we try to stay on top of these comments and will remove the labels to pause the allocations until we get sufficient evidence from the client. Now, another note here about this process, which is an agreed process but not a hard-coded rule: we don't want a notary to sign two allocations for the same client in a row.
B
So if Internet Archive comes up for an allocation, and, say, Meg and Danny sign for Internet Archive, and then a week goes by and Internet Archive is ready for their next allocation, the bot tells us who signed the first one, and ideally Meg and Danny don't sign Internet Archive's second allocation; we let two other notaries jump in,
B
do some diligence, and spread that out. Then, when the third allocation comes through, Meg and Danny could absolutely jump in again. So we just kind of want to have a little bit of a rotation back and forth. Before we take questions: as Meg was saying, this is what you're seeing; there are 18 or 19 open and ready to receive, and the approval counts are ones where, for this one, no one has yet signed it.
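The rotation convention above (the pair who signed a client's most recent allocation sits out the next one, but may return the allocation after that) can be sketched as a simple eligibility check. The history format here is a hypothetical illustration, not what the bot actually records.

```python
def may_sign(notary, client, history):
    """Return True if `notary` may sign `client`'s next allocation.

    `history` is a hypothetical map from client name to a list of signer
    pairs for past allocations, newest last. A notary who signed the most
    recent allocation should let two others take the next one; signing
    alternate allocations is fine.
    """
    past = history.get(client, [])
    if not past:
        return True
    return notary not in past[-1]
```

Since this is an agreed convention rather than a hard-coded rule, a check like this would only warn, not block, a signer.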
B
I want to flag some different notes here so that we're on the same page. In the previous process there was this concept of a lead notary. This is no longer the case on LDN version 2.0. That was when we had only seven notaries with a threshold of four; we asked one notary to step up and be the lead before creating that multisig. We've gotten rid of that kind of programmatic expectation and role, and now all of the notaries are responsible for all of the open
B
large datasets. If you as a notary feel very, very strongly about one, absolutely, you could help shepherd that client through and go empower other notaries to jump in. You may sign the first allocation because it's a project you really know and love, and then, when the second one comes in, you may go champion it in some of our channels and say: hey, I signed this first one,
B
can someone please jump in? And you can call some people out; it's fine for the notaries to do that and advocate. But there is no lead notary anymore. The other thing to note: before, the process was seven signers with a threshold of four. It is now 23 signers with a threshold of two, and that means one proposal transaction and then one approval on that transaction ID, and then the message clears and they receive the allocation, they receive
B
the data cap. What we have seen happen is that we may get three messages back to back from three different notaries quickly in a row. This may happen because there's currently an issue where the bot doesn't always remove the label, so it still shows up in the app, or because the notaries are jumping in in quick succession and signing at the same time. Someone posts that a client is really running low, three notaries jump in to sign it, and what could happen is the first notary proposes,
B
that transaction is still there when they start to run out of data cap, and when we put the label back on and it shows up in the app again, it's going to have an approval count already of one, and it's pending that second signature. This happened last week with Textile, and we're working on ways to prevent this, but it's just something to call out: with a threshold of only two, sometimes we're going to hit that threshold quickly while other people are still signing other things.
B
So in that way the app appears slow. We're working on better user-experience signals to make it more clear that it's still running in the background, but it's not as instantaneous as we may have become used to with technology, because it won't post an approval or a failure until it has landed on chain. We're still better than some other blockchains; just, you know, not throwing any shade.
B
So some things that you as a notary can do: you could open the console in your browser and watch for the messages as they cascade through. You could also check and refresh the GitHub issue; once that message lands, it will be posted in the GitHub issue. Last thing to note: we are aware that our documentation is out of date. This is going to be a work in progress for a while; we just haven't had the time to sufficiently prioritize it.
B
We would be happy to crowdsource some of this and get some PRs in GitHub. This kind of falls into one of our client onboarding UX goals, so it is something that we're aware of, and we understand it's not ideal.
B
It's on the map to have better documentation. So, pausing there, I see some questions in the chat. "One client can only apply one time before their data cap is used up." What we're seeing from some people is: a client is applying for a dataset, and they have two datasets that they are bringing to the network. So the client is representing two different datasets, and they have two different large dataset applications, and those are running concurrently, because each is for a specific set of data that they're getting approved.
B
This just helps them manage their process. It helps the application be more clear as to what it is and who they're storing it with. So it just creates that separation, and a client is able, within reason, to have multiple applications open, based on why they might need that.
B
The other question: "they met with no response after approval and tried to review again". Can you clarify some more of what you mean with your comments?
B
No, that was your opinion, but we may have lost them. But Meg, also to your point, one of the other things we can do is reload Filfox to see if the messages have landed. So if you are looking at the f080 address on Filfox, you'll see whether a message is a proposal, what the transaction IDs are, that sort of thing. So there are a couple of ways to look.
J
Like, I'm aware I shouldn't have signed that again, but Tai had the time-zone advantage, because I was up and everyone else might have been sleeping, and then it seemed urgent. But, you know, I nearly gave up after the second attempt, but I had one more and then it worked. So it was like...
B
Yeah, yeah, no, I mean, it does the same thing for the root key holders; we've heard it from notaries for direct allocations too. So I would just say, the biggest advice, as with lots of this technology, is: once you've hit it, wait a little bit, come back in a few minutes, refresh, and see if the status has changed.
B
Well, we are working on better UX tooling to try to make it more clear when it's running in the background, when it has failed, and when it is successful.
B
Here we go: the FIP for data cap ownership management.
B
So there's the issue on our page, and here's the FIP itself. We also have here with us Caitlyn from the Foundation, on governance, who can also address these and answer questions that come up about the FIP itself or about the FIP process. So, Deep, take us through the FIP itself.
D
Yeah, I think the main thing is that today, data cap itself is a very static resource that doesn't actually have much ability to be changed or messed with once it gets to a client. It's much easier to give somebody notary status and to remove their data cap; but once it gets to a client, there's nothing to be done about it. So this FIP is attempting to solve at least two types of problems: one where, if a client address is no longer the right address for that client, can we move that data cap over to a different address; and then there's the case where the client should no longer have data cap:
D
How do we remove their status as a verified client? Talking about these as two different use cases, I like to think of them as separate, and maybe we should think about each of these as a separate FIP, but the second one, I think, is less controversial for our community.
D
So one: what does it mean to actually strip them of existing DataCap that could lead them to create deals that enable further abuse of the system? Or two: we've also talked about DataCap sitting latent with clients for extremely long periods of time. What happens if a client or notary doesn't actually give any DataCap out for months or years? Is there a concept of revoking or returning it, so that the DataCap that is going out is actually useful when it goes out? And then the second, which is effectively this:
D
There's this idea that client addresses change over time, either because they lose access to them or because they decide they're going to make deals in a different way. The example given in this FIP is basically: if a client decides they're going to go via a deal broker instead of via their own address, then maybe they can transfer that DataCap to the broker. Maybe that should be an option we provide, and if so, who has the rights to actually move that DataCap around?
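The two operations under discussion can be modeled as a toy ledger. To be clear, this is only an illustrative sketch, not the actual FIP design or the Filecoin verified-registry actor API, and the `f1...` addresses below are made up:

```python
class DataCapLedger:
    """Toy model of client DataCap balances; not real Filecoin actor state."""

    def __init__(self):
        self.balances = {}  # address -> remaining DataCap (in bytes, say)

    def grant(self, client, amount):
        """A notary allocates DataCap to a verified client."""
        self.balances[client] = self.balances.get(client, 0) + amount

    def transfer(self, old_addr, new_addr):
        """Use case 1: the client's address is no longer the right one,
        e.g. lost keys or moving to a deal broker; move the balance over."""
        amount = self.balances.pop(old_addr)
        self.balances[new_addr] = self.balances.get(new_addr, 0) + amount

    def remove_verified_client(self, client):
        """Use case 2: the client should no longer hold DataCap at all;
        strip the remaining balance and return how much was removed."""
        return self.balances.pop(client, 0)
```

The sketch makes the open question concrete: both `transfer` and `remove_verified_client` mutate someone else's balance, so the real design has to decide who is authorized to call them.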
D
So in general, I can do a quick recap of the conversation from the morning and then we can take it from there. Or no, you'd rather have the unbiased conversation? Okay, that sounds good. So then I'm just going to stop talking now, because I already have some opinions, and instead of sharing them I'd like to hear what people in the room think.
J
Our thoughts on this? I'm just reading it. Do you want us to read it now or comment later?
D
Actually, if you have a minute to scan through it now, it might be good to just chat, because we do have some time left in the session; it might be good to do some of it live if possible. Maybe we'll just give a minute of silence, because it looks like a couple of others in the chat are still reading. We can all watch Galen make some entertaining faces in the meantime. Hello, YouTube.
J
I guess your open questions are good questions. On the second point, this is where we need a bit of UX, right? Rather than just removing or revoking an inactive client's DataCap, why not just reach out to them: why aren't they using it? And the same with the expiry and the DataCap. Why can't we ask what they need, and should someone not be using it, then it expires.
J
Oh well, I'm actually just saying we don't need to guess; we can ask the client. This is the UX side of things. Let's agree the expiry up front, and if they're not using it, then let's find out why.
D
And give them the support that they need to be able to use it, yeah. I think our DataCap utilization, once given to clients, is hovering around 30%, I want to say, which is pretty good but should be a lot better, especially if we're doing automatic subsequent allocation releases at 75%. So I would expect that number to be higher.
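The 75% subsequent-allocation trigger mentioned above amounts to a simple threshold check. The 75% figure is the only number taken from the call; the function itself is an illustrative sketch, not the actual LDN bot logic:

```python
def should_release_next_tranche(granted, used, threshold=0.75):
    """Release the next DataCap allocation once the client has used
    at least `threshold` (default 75%) of the current one."""
    if granted <= 0:
        return False  # nothing allocated yet, nothing to top up
    return used / granted >= threshold
```

Under this rule, a utilization figure well below 75% means most clients never trip the trigger for a subsequent allocation, which is why the ~30% number reads as low.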
J
Yeah, there was some talk about a customer success or client support role that could work with those that have the DataCap, just to oversee this on your side. Did that happen? Is that a role that you think is worthy?
B
The Foundation has brought in one solutions architect so far, and there's talk of bringing in some additional support under the ecosystem working group umbrella to effectively help bridge the gap between clients and storage providers. An ideal applicant pool of clients would be ones that have already been verified, so that sort of gets at that point.
F
Yeah, it's really helpful to have this sort of targeted audience, who have very hands-on use cases, for the potential impact of these types of proposals.
F
So if you have opinions, even if it's just to say yay or nay, it would be really helpful if you could add those directly to the thread in GitHub. That way Jenny, as the FIP author, can see that this is something that potential Filecoin Plus participants and notaries support or have issues with. And I think the comment that research should be done to check is also helpful.
B
Yes, and also, as Caitlin said, Deep and I are not the original authors of this issue. So while it is a FIP that relates to the Filecoin Plus program, there are other people submitting it, and we hesitate to speak on behalf of someone from this call on this issue. So, encouraging you guys: thank you again for the comments here and for jumping in; looking forward to starting to synthesize some of those. And thanks, Caitlin.
B
While we have you, one of the outstanding questions that Deep raised, and this is more of a process question: what would it look like to split this into two FIPs? Is that easy, is it just a process thing of opening another FIP and reframing this one? Can you approve only one part of a FIP? What does that look like from that side?
F
Yeah, this is a really great question. You can only approve a FIP in its entirety. This is what we saw previously, if anybody was involved in the decision to extend the faulted sector period from two to six weeks: effectively, a FIP had been pushed to the repo that had a lot of different components to it. Some were technically complex and some were a little bit more controversial for different stakeholders within the Filecoin ecosystem.
F
So for this, it's helpful again to have the feedback captured specifically, so we know where you stand. If you support this measure as it's currently written, that's great. If you have general support for the transfer-DataCap idea but perhaps outstanding questions about remove-verified-client, making those questions and caveats known is the only way we know whether or not we actually have to split a FIP into two. But the process of doing that is really simple and straightforward: Jenny could quite quickly go and just create a second issue, and we could capture feedback on these as separate topics for consideration.
F
Oh, thank you. Yeah, process questions for FIPs are what I'm here for. We're actually going to be rolling out a new process in the coming weeks that I hope makes this a lot clearer for everyone, and I'll actually add this question about whether we should split a FIP to our FAQ section on our discussion forum.
F
I don't want to open a can of worms, and I know we only have a few minutes left in this meeting, but there is actually a second Filecoin Plus FIP topic that was opened.
F
I flagged this for you very quickly, and I will share it in the chat as well. This one was written by someone who is not, to my knowledge, involved in the Filecoin Plus project. This is a storage provider, but I don't believe he has ever worked directly with notaries. I could be wrong.
F
I think there are also some misinterpretations about the program and what it's supposed to provide for the network. I'll hold back my opinions on this proposal generally, since I'm supposed to just be a steward of the process, but I did want to flag it, because I think there are some points of guidance that notaries themselves could add here to help clarify the purpose of the notary process, as well as the improvements that have been logged in recent weeks.
F
Yeah, again, the only way we're able to actually capture this feedback, for good or bad, is on the thread directly. So if you read through this and you're saying, wait a minute, that misinterprets what we're doing, or this seems to discount some of the improvements we've made, or misunderstands the purpose of Fil+: I think a lot of that guidance is missing on this post, and it would be helpful to have it as well. Or maybe you agree; that's obviously valid too. So I did want to call attention to it.
F
Yeah, so separately, for any notary that may be looking at this application or may be interested in reviewing it: I believe this person does participate in the storage provider working groups, so they are relatively accessible, and they're always on Slack. It should be pretty easy to speak with them if that's of interest. I'm not going to pry into the notaries' processes, but yeah, there's obviously some bias to this post, and the point of FIPs is to contribute improvements to the network.
F
So again, it's helpful to capture alternative perspectives, and there are also some more normative perspectives here that I don't think capture everyone's point of view.
D
Yeah, good question. When you say engage on Slack and so on, the best practice here is still to respond in GitHub, though, right, to capture the discussion?
F
Oh yeah, it is, sorry. I was saying if you wanted to talk to Batman 13 in particular about his LDN application, or, yeah, his application, you could contact him; he's pretty active in the Filecoin Slack. But you're right, Deep, for FIPs themselves.
F
The most important thing is that we have these comments and feedback on specific items on the thread itself, since that's what we use to gauge the FIP and where it is in the process. We're not going to steward a FIP through a security audit or a crypto-econ audit if there's no feedback on it, for example.
F
Likewise, if a FIP represents someone's own perspective or frustrations, which I suspect this one might, but doesn't actually have support from the broader community, it's much easier for us to close that issue and provide some redirection or alternative solutions for that person.
A
So, two good tips.
B
Homework for people to check out, as well as about 18 open large dataset applications, so a lot of action in the community. On top of, what did we say? Sorry, let me scroll back up here: about 19 new applications in the last week and 36 open direct applications. So lots of people trying to get into that DataCap world.
D
So we briefly talked about this last governance call, I think, where we thought about structuring the notary working group into smaller focus groups. Meg, I don't know if you remember this. And then sort of enabling them to push things forward. I think before that can effectively happen, a little bit more organizational work and planning needs to happen to set the community up to do that.
D
So there's an action item from this morning's call, which was: hey, can we do a little bit more on our end? Because right now Galen, Gary, and I end up doing a lot of the operational overhead. Opening that up a little bit more, so we can share it better with people and involve them more in the operational processes. And then, based on that, once the info or whatever resources are required to enable somebody to participate become easier to access for all engaged members of the community:
D
Maybe we'd start defining roles to push some of these discussions to completion. I think that's a reasonable next step, where an action item from that might be that we set aside one of the upcoming governance calls for discussing and driving to resolution, or driving to a proposal, basically, for these open discussions or other topics that have come up; and then elect individuals from the community who are interested in specifically solving those problems or fleshing out the details of a proposal that they can bring back to the rest of the community.
J
Yeah, what's the short answer? I like defining the DAO and some of these things and moving on this.
D
Yeah, sure, the short answer is basically: one, let us do a little bit of homework on how we can be more transparent about what is required for operations, so that we can start sharing that responsibility with others; and then two, let's plan an upcoming notary governance session.
D
Probably the next one, to be focused just on the discussion topics, to drive towards proposals, or sets of working groups on proposals, that can then follow up and build out the details that are required before we ratify things. Not a very short answer, but shorter.
D
I guess a further TL;DR would be, yeah, I think it's actually pretty crisp: next governance call, maybe we specifically set it to figure out how we want to proceed with each of these, elect people that want to work on them, and then we take on the burden of sharing more of the operational breakdown of the work that we do.
D
Again, is that reasonable? Does that sort of map? Okay, cool. There's a question in the chat: we need to deal with the problem with clients' DataCap applications and give more support; follow-up visits to clients are important. Do you mean in terms of when they receive DataCap and don't make deals, or do you mean after they've made deals and they want to continue getting more DataCap?
B
Well, we are also running up against time, and to be mindful of everyone's schedule: if there are some other follow-up questions, you could post them in the fil-plus-notaries channel or in the broader fil-plus channel, just as some extra places to follow up. And we have those open discussions; you could absolutely add another discussion topic if there's a larger issue. I'm going to go ahead and stop the recording for those that need to go, and thanks for everybody's time.