From YouTube: Filecoin Plus January 11, 2022 Notary Governance Calls
Description
First of two governance meetings for 2022. Slides for this presentation, including all links referenced:
https://docs.google.com/presentation/d/1WLY0sVkwyVFKnKaumswcsJDyT9ci_K3o34kqxzTJGJw/edit?usp=sharing
Topics Covered
- Soliciting input on tentative dates for the next round of notary elections
- Two FIPs in "Last Call" status
- Final input on the allocation step increase
- Alex North joined (1:22:03 mark) to discuss an upcoming FIP to rework the Fil+ network subsidy
A
All right, well, good morning. Welcome to the Filecoin Plus notary governance meeting. This is the first one of 2022, it is January 11, and we are going to get right into it. I'm just going to make sure that I can add anybody who joins. Just a quick housekeeping item before bringing in Galen: can you or Caitlin see anybody joining the meeting and add them in, or should I just monitor that? I don't know that I am a co-host. All right, it looks like I can add them.

A
So this is January 2022. Wow, it's strange to say that. Here are the big points on the agenda: we're going to walk through the introductions, just a quick recap of who the team is and, if you're new, a chance to say hi; a very high-level look at the metrics and what we did in the last two weeks; and then a look at the activity and what open issues we have to work on. Then we have some community items to flag for you: new FIPs, the new Slingshot Restore, and some discussion that you flagged for us in the GitHub repo. So with that, let's take a look at saying hello. Deep, do you want to go first?
B
Sure. I've been working on Filecoin Plus for slightly over a year now. I'm excited to get into 2022 and have a strong year with all of you in terms of maturing the program; there are a bunch of interesting ideas that people have brought up, especially towards the end of last year, so I'm pretty excited to integrate those and continue scaling the program up. This year I initially want to be a little more focused on DataCap in general: its presence, its flow, allocations, and the general client verification side of things, including auditing, to ensure that there's no misuse of the system.
C
Yeah, similarly. I think a lot of you know me: Galen, with the Filecoin Foundation. I've been with the Foundation since June working on Filecoin Plus, and my primary focus at this point is really around large data sets: trying to unblock that process, making it faster, more efficient, and higher trust, and leveling up how we can get that enterprise-level DataCap pipeline going.
A
And I'll say a wonderful hello. My name's Kevin, but I go by K-Ray since there are so many Kevins. I'm relatively new to the team; this is my second or third month. My role is looking at how we can strengthen the ties with the community, improve the communication, and make sure we're giving you systems that work, whether it's the notary elections, this meeting, or speeding up the processes we look at. So I'm still relatively new, still onboarding, and looking forward to it as we go on. With that, let's see if there's anybody else who would like to say hi.
A
If this is your first time on the call, feel free to come into the chat, add your name, and say hello, where you're from, and what you may be working on. If you have been here for a while and this is your 30th call, there might be someone new on the phone, so feel free to introduce yourself. If you're looking to expand on that, we definitely want to give you a chance to introduce yourself and say more; at the end of the call we'll have time for community discussion as we go through it. All right, I'm going to turn the floor over to Galen, and he's going to walk us through what our metrics look like.
C
Yes, thank you. So we're using our DMOB interplanetary dashboard here; we're still working on cleaning this up and getting some updates. As we've mentioned a couple of times, we started a public Slack channel: if you are building a dashboard, or if you have questions or ideas about dashboards, you can join.

C
Let me just grab that channel name and drop it here: it's #fil-plus-dashboards, with hyphens, however you want to say it. Feel free to go there and ask questions. If you see a stat that looks off, sometimes across these different dashboards things are getting calculated in a couple of different ways, and so it's great to have more open dialogue about how somebody reached a certain number. Looking at this, our average time to DataCap is about three hours faster than at the end of December, which is great; I want to continue seeing that number go down. First response from notaries is five hours faster. I'm still driving that down; the average time is still around five days, so we're trying to get that closer to a three-day turnaround. As for the total amount of DataCap that has gone to clients, there's not a huge shift in the client flow here, but we've had a few new large data sets get activated.
A
Hey, thanks, Galen. I can speak to this, and I just want to say props right off the bat. It's been a busy period, especially over the holidays in the States with everything going on: we've had a lot of large data sets and a lot of applications, so we appreciate the timeliness and the effort looking into them. It really was a lot right around Christmas, but if we take a look at this, I think what I wanted to show is that we have 34 direct applications, and 33 of those are large data sets awaiting signature. So if you have bandwidth after this call and you want to get on, we have a lot of them waiting for review. And if you look here at the chart on the right, this is a rough draft of the geographic state of that leaderboard we talked about coming.

A
So with all these new applications coming in, it's a busy time. The program is doing well, and we're really looking for feedback from you on ways we could make this go faster. We might be reaching out along the lines of: hey, if you have bandwidth, we literally have 31 of these that are ready for your input, so let us know if you need anything. With this I'll pause for a minute. Galen or Deep?
A
Rock on. So here's something that's coming. We talked about this in Slack, and this is the first mention of it here: the election cycle, which makes us all happy, because it's never a dull moment when it comes to this. We're getting ready for the next round of 2022 notary elections, and this is just the heads-up that it's coming, and we're looking for feedback, because the tentative suggested schedule we're looking at is this: applications would open up on February 21st. That's about five weeks from now, and the reason for that timing is that it lets us get feedback from you to make sure this schedule works, and it lets us build up what we're actually going to put together, like the expectations for notaries going forward in 2022, and make sure that aligns. Once we give two weeks for those applications to be open, we're going to give another two weeks for all the scoring and additional follow-ups.

A
Through those questions we need to rate these, take a look, and make sure we have the right people in the right slots at the right time: two weeks to go through that, and then, pending feedback, we'll announce on March 22nd. So that's what we have tentatively. I'm going to open this up in the GitHub repo shortly after this call, probably tomorrow, to let everything settle, and I'm really looking for your feedback to make sure this works, that this is appropriate, not too truncated, and all set to go. One quick note, just to always point out: if you're a notary with existing allocations and you're not seeking additional DataCap, you don't need to reapply. This is just if you want to keep going in the program. If you have any questions on this, we'll be happy to take them at the end of the call during open discussion; also feel free to put them in chat, and that way we can make sure we get to your questions.
D
Cool, thank you, K-Ray. Yeah, I've popped into this meeting before, but for those of you I've never met: my name is Caitlin, I work at the Filecoin Foundation, and it's my job to handle all things related to FIPs.

D
We are looking forward to our upcoming network upgrade, which is going to happen on March 1st, and I'm really excited to begin the Last Call process for some of the FIPs we're hoping to incorporate during that upgrade. Fortunately, this is sort of a smaller upgrade, in the sense that we only have two FIPs up for community consideration.

D
So with that, there are two FIPs up for final consideration right now, and I want to make sure we call your attention to them, so you can review them, weigh in, ask whatever questions you have, etc. The first one I'll call attention to is Snap Deals.

D
This is a FIP that has been worked on for many, many months now, so you may have heard about it before, especially if you've been involved in any of the storage provider working groups or follow their communication channels in Slack. This is a one-message protocol that actually changes the way sectors are sealed, so that committed capacity sectors can be upgraded to actually hold client data.

D
Nothing else about the sector type has changed. It does not allow you to move data between sectors, and it has nothing to do with sector sealing times, but it does allow you to take any committed capacity sectors you already have, which have already gone through the sealing process, and upgrade them relatively quickly, so that you can accept new client data without the sealing latency we've come to expect on the network.
B
Totally. So we've been talking about this one a few times over the last several governance calls as well, so for the usual suspects who have been here before there shouldn't be any surprises, but we did have one change, and I do want to highlight that. Before we dig into it, a quick overview for those of you who haven't heard about it: this FIP is primarily aimed at creating a path to remove DataCap from a client. This was done for two reasons.

B
One was inaction or inactivity, in the case that accounts accumulate DataCap and do nothing with it over a period of time. As a way to reduce network risk, or accumulation risk, we wanted an option where participants or stakeholders in the community could find a way to remove the DataCap given to a client address. The second was as an actual consequence for fraudulent use of DataCap or abuse of the Filecoin Plus system.

B
We wanted a combination of notary and root key holder signatures, and so the proposal put forth was basically that two notary signatures and one root key holder signature would be enough to remove DataCap from a client. However, after a lot of discussion with the L1 protocol design team, what they came back with is that instead of tracking an individual root key holder, we should just have the root key holder multisig on it, similar to the way we have the root key holder multisig on notary-creation actions. Even though this is a little conservative up front, it falls more in line with the way we've structured actor interactions for Filecoin Plus on behalf of the root key holder set of folks, and it doesn't violate any of the core tenets of the design.

B
It's still in line with the principles of the design of the solution, but net, what that means is it's going to be two notary signatures and two root key holder signatures, because the threshold on the root key holder multisig is still two. That's been the only change so far. It looks like there's been pretty good progress on the design and implementation front, and hopefully we'll have some great updates once the bigger network upgrades land in the coming months.
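The approval rule described above can be sketched as a simple check. This is a hypothetical illustration, not the actual actor code; the function and parameter names are assumptions.

```python
# Sketch of the DataCap-removal rule discussed above: a removal proposal
# needs 2 notary signatures plus approval by the root key holder multisig,
# whose own threshold is 2 -- so 2 + 2 signatures in total.

NOTARY_THRESHOLD = 2
ROOT_KEY_MULTISIG_THRESHOLD = 2

def removal_approved(notary_sigs: set[str], root_key_sigs: set[str]) -> bool:
    """Return True if a DataCap-removal proposal has enough signatures."""
    return (len(notary_sigs) >= NOTARY_THRESHOLD
            and len(root_key_sigs) >= ROOT_KEY_MULTISIG_THRESHOLD)

print(removal_approved({"notary1", "notary2"}, {"rkh1"}))          # False
print(removal_approved({"notary1", "notary2"}, {"rkh1", "rkh2"}))  # True
```

The point of the change Deep describes is visible in the second argument: one root key holder signature alone no longer suffices, because the multisig threshold is two.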
D
Great, thanks, Deep. Yeah, so if you are able to review the slides after the call today, you'll see there are two links available for each of the individual FIPs being proposed. The first one will bring you to the draft FIP itself; this is where you can see the full spec change, the motivation, and simple-English details about what would actually change and how it would work within the network. The second link, under "discussion post," is where you can go to register your support for or disagreement with the proposal.

D
Ask questions there, etc. Because we are in this Last Call period, I can assure you that that discussion post is monitored by myself and many others very, very closely, so GitHub is actually the best place for you to register any kind of concern you may have, whatever shape it takes, and actually get a resolution very quickly.

D
If this is the first time you're engaging with this process, or if you don't frequently dabble in governance, do know that the way the Last Call process operates is that we have a two-week window right now. That window will end on Monday, January 24th, which I will add in the chat after I'm done talking, and at that point we will look to ensure there have been no new, significant, or previously unaddressed concerns brought up by any community member about either of these proposals.

D
If there have been, we will address them publicly and openly with the person who raised them, and potentially extend that Last Call period to see whether (a) this is something we can address prior to the network upgrade, or (b) this is something we need to reconsider, and it's not a FIP that's ready for on-chain acceptance. Otherwise, in that period of time we also have all of our implementation teams doing their testing.

D
They've also tested this, and do know in general that although Last Call is sort of the last step of the FIP process, it represents the culmination of many months of security auditing, testing, technical feasibility reviews, etc., and that these steps are considered complete.

D
All right. Again, review these and let us know what you think by January 24th. If you have any questions or additional follow-up after this meeting, you're always welcome to reach out to me on Slack. I'll add my Slack username in the chat as well, or you can just tag me in the #fil-fips channel in the Filecoin Project Slack; it's pretty easy to find me there.
C
Yes, awesome. So another thing: not a FIP here, but still important. We've been talking about this a number of times. This is proposal #345, the LDN allocation step increase. A number of clients have expressed that the runway is not long enough. So this is what we consider a low technical lift and low risk: basically just adding two more allocation calculations, not changing the first three at all. It would still start at five percent, or fifty percent weekly, but it would let the clients that are acting in good faith and making those multiple allocations get to 800 percent weekly DataCap. So, pausing here.
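The doubling schedule described above can be sketched as follows. The exact rubric percentages are an assumption based on what was said on the call (first three tranches unchanged, two doubling steps appended to reach 800 percent weekly); treat this as an illustration, not the official allocation formula.

```python
# Sketch of the LDN allocation step increase (proposal #345): each tranche
# is capped at a% of the total DataCap requested or b% of the weekly rate,
# with both percentages doubling per step starting from (5, 50). The
# proposal appends two steps to the existing three, ending at 800% weekly.

def tranche_percentages(steps: int = 5) -> list[tuple[int, int]]:
    """Return (total_pct, weekly_pct) caps for each allocation tranche."""
    out, total_pct, weekly_pct = [], 5, 50
    for _ in range(steps):
        out.append((total_pct, weekly_pct))
        total_pct, weekly_pct = total_pct * 2, weekly_pct * 2
    return out

print(tranche_percentages())
# [(5, 50), (10, 100), (20, 200), (40, 400), (80, 800)]
```

The first three entries are the existing schedule; the last two are the added steps, which is why the change is described as a low technical lift.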
C
We would really like to push this through in the next two weeks. I have not gotten any comments in that issue; I've posted it in the chat here. We'd just like a quick question, concern, or show of support from the people on the call.

C
Masaki's in favor, and I'm not hearing any dissenting opinions, so that's great. If you do have questions about it, please go post in the issue. We will probably not discuss this again at another governance call; if we don't hear any major concerns, we will probably close it out by the end of the week, roll it out and implement it, notify people in the Slack channel, and update the README as well. So that's that. Moving fast.
B
Awesome, thanks so much. Hey folks, many of you are already aware of the Slingshot Restore program, and several of you have already seen a lot of DataCap applications come in for it.

B
If you want to learn more about the general program, I would say watch the recording of the last governance call, where we went more in depth on it, but I did want to give an update, without taking too much time, on where we're at with it. Right now we've been making steady progress.

B
The main determinant for that actually tends to be whether or not the storage provider has enough FIL to take on verified sectors, so that they can provide the collateral for them, but in general it does seem to be a popular option. A lot of people are able to go get DataCap and then store useful, valuable, open public data sets onto the network at an unprecedented pace; this is the fastest we've worked towards data onboarding onto the Filecoin network in general.

B
So, lots of really good learnings from that as a process, and something we should probably think about leveraging in the Filecoin Plus ecosystem overall as we think about client onboarding and client success later this year. Other than that, clients continue to be held to a very high standard and are able to comply with it. Some interesting things that could be worth thinking about from a Filecoin Plus DataCap-distribution perspective as well: with some amount of KYC, we're basically testing for at least three storage provider entities (groups or organizations interrelated in any way) crossing five cities with the replicas, so five individual data centers in five different cities, spread out across at least three countries and on at least two continents.
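The distribution test above can be sketched as a simple check over a replica list. This is an illustrative sketch only; the field names and data shape are assumptions, not part of any actual Slingshot Restore tooling.

```python
# Sketch of the replica-distribution rubric described above: replicas must
# span >= 3 storage-provider entities, >= 5 cities, >= 3 countries, and
# >= 2 continents.

def distribution_ok(replicas: list[dict]) -> bool:
    """Each replica dict carries the SP entity and data-center location."""
    entities   = {r["entity"]    for r in replicas}
    cities     = {r["city"]      for r in replicas}
    countries  = {r["country"]   for r in replicas}
    continents = {r["continent"] for r in replicas}
    return (len(entities) >= 3 and len(cities) >= 5
            and len(countries) >= 3 and len(continents) >= 2)

replicas = [
    {"entity": "sp-a", "city": "Tokyo",  "country": "JP", "continent": "Asia"},
    {"entity": "sp-b", "city": "Seoul",  "country": "KR", "continent": "Asia"},
    {"entity": "sp-c", "city": "Berlin", "country": "DE", "continent": "Europe"},
    {"entity": "sp-a", "city": "Paris",  "country": "FR", "continent": "Europe"},
    {"entity": "sp-b", "city": "Lyon",   "country": "FR", "continent": "Europe"},
]
print(distribution_ok(replicas))  # True
```

Note that the same SP entity may appear in multiple cities; the rubric counts distinct values per dimension, which is why five replicas from only three entities can still pass.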
B
So we've got a lot of combinations of Oceania, Europe, Asia, and NA, and then even within those continents, lots of presence in specific countries where we see a lot of traction and availability of high-quality storage providers. What we're expecting is that probably by mid-to-end of February there will be around five pebibytes onboarded onto the network, which is pretty awesome. For context, past Slingshot phases used to onboard around that much, or actually a little bit less, over roughly eight weeks.
B
So this is a little bit quicker, plus we're stress-testing the network and also seeing a lot more high-quality storage services and relationships being formed between clients and storage providers, which should hopefully benefit the client deal brokering, client support, and client tooling to be built and made available for Filecoin Plus clients in the future as well. In terms of next steps, the first thing is, K-Ray already pulled this up on his second tab, but one client is currently awaiting signatures.

B
I think we just need one signature for that, or maybe two; I don't remember now. But if you're a notary and have the time, please go check this out. We're definitely working closely with them and ensuring they're being held to the same standard. So yeah, two signatures required on this one; if you've got the time to take a look, we would greatly appreciate it, so we can unblock them.

B
They'll be storing at least one unique data set of around 100 TiB five times, but they're also requesting additional ones, and so it'll probably end up being at least one pebibyte from this client specifically. And if you go back to the slide, K-Ray, a couple of other things I want to call out: we in general are seeing a lot more LDN applications that came in over the last four-ish weeks, even though it was over the holidays.
B
I think some of it is because people are seeing a lot of momentum in the LDN process, especially for Slingshot Restore applications. So if you're a notary signing these, please keep an eye out.

B
We have an official spreadsheet with participants on it that I try to keep an eye on and update daily, or every other day or so. So if you don't see a name or an address on there, don't feel like you have to sign it, because it's likely that people are pretending to participate, or are optimistic about participating and preemptively going to get DataCap. That's not necessarily a bad thing, but then we do end up in a situation where clients have DataCap and are not participating in the Restore program. In that case, I'm also looking forward to that FIP finally landing, so that we can correct for some of these cases where people ended up dropping out of the program as well; we'll have to do some housekeeping, probably, once that FIP lands.

B
We know where the various bottlenecks are, and so I'm excited to walk through some of the takeaways as they relate to the Filecoin Plus client onboarding experience. We've also gotten a lot of indirect feedback from the clients participating in Restore, by way of the questions people are asking and the way in which they've understood, or not understood, the process.
B
And so I'm looking forward to sharing some more specific and concrete takeaways, probably not at the next governance call but the one after that, so that we have some tactical takeaways we can work on as a community.

B
If you have any questions on this anytime, or if you're a notary and you want to talk to me about a client that's applying for DataCap and claims to be in Restore, my DMs are always open, but I would also recommend you just drop it in the Filecoin Plus channel. It's better to do this stuff in public and make sure people are looped in.
B
Faye, yes, there was definitely some drama about DataCap. Let's chat about that for a second. Just for context: one client claimed to be participating in Restore, was not at that point participating in Restore, but got DataCap preemptively. Then, upon it being publicly mentioned that they were not yet in Restore, they changed their application to say just Slingshot, and they are now participating in Slingshot 2.7, which is fine. But then they were added to Restore, which is happening on a parallel timeline, and instead of reapplying that DataCap to the Restore data set, they went and applied for another LDN to get DataCap for Restore. So the resolution of it is basically this:

B
The client now has two LDN applications, one specifically for Restore and one for Slingshot in general. The Restore one is the one where I'm ensuring they're being held to an extremely high standard: hey, you are one of the clients, you know how we meet every week; we've been discussing and ensuring that the replicas are actually being distributed correctly and that people are processing data in a way that's useful.

B
So that's the one I'm willing to stake my reputation on and say, yes, we're likely to see good use of DataCap there. The regular Slingshot one goes back to the default status we've had, which is: notaries, you should decide on a case-by-case basis whether you'd like to support that client and their particular methodology for processing and storing the data on the network. So hopefully no more drama there; things will get resolved and we'll proceed in a productive way.
A
Sweet, sweet. I think we'll keep the questions going in the chat and come back to them if needed. Next was the community discussion. We had an issue raised in the GitHub repo in preparation for this meeting, and this came over from Patrick at Factory Solutions. Essentially, the question was about the possibility of a second tier of notaries; summarizing, this would be letting projects hand out DataCap to their own users.
B
Yeah, happy to. So this is not dissimilar in some ways to the proposal that Meg put forth several months ago, where she talked about service levels for notaries that want to participate at different tiers or have different expectations.

B
Patrick was coming at it with some similar implications, but definitely some differences as well, and inspiration from a different project that he linked. The general idea, at least from my understanding of it, is effectively that there's a model in which notaries can interact with the Filecoin Plus system that has multiple tiers in it. You have the generic notary behavior that you have today, where the flow is: a client comes into the network, handles all their deal making as far as the Filecoin Plus system cares about it, requests DataCap from an existing notary, and the notary provides DataCap to them. We then introduced a second thing called an LDN, and the type of notary that participates in the LDN is typically the same as the notary that participates in direct client applications. Meg was coming at it from the perspective that notaries might have different expectations, might participate at different points in the funnel, or might be engaged in different ways, in terms of how many hours you're spending and how engaged you are in the system. Patrick is effectively thinking about changing the definition and scope of the notary to also include deal-brokering services, or services that provide tooling for clients and do the deal making on their behalf.

B
This kind of notary, I would imagine, would be much more automated and would sit where the client sits, as opposed to off to the side, where the client breaks away from the natural deal-making path to come get DataCap and then returns to that path to make deals with it. Instead, the service or the tools that clients use to make deals themselves give them DataCap.
B
So services with a high degree of compliance, or a high quality of deal making on the network, would then be granted power to effectively become notaries in the system, award the DataCap allocation they're given to their own clients, and then publish some sort of report or statistics in a public way, so that we can continue to ensure it's auditable and transparent. One flag that I have here is that today, most of these services all make deals using the same client address.
B
Supporting something like this becomes really interesting, because the client now only uses one tool, as opposed to needing to use the tool they've chosen to make deals and also separately needing to go to GitHub, or to the Fil+ app, to get DataCap from a different notary and educate them about what they're doing. So that is my understanding of this. I think it's a very pertinent thing, and I think in general we have to start thinking in this way, where DataCap can be allocated at different stages of the process, or via different stakeholders that have different levels of engagement with that path, so that it's efficient, positively impacts the client onboarding experience, and in general keeps the network at its most productive.
C
Yeah, I would also say that, you know, we have a notary with Glif and Infinite Scroll that functions this way. They are a notary that follows a set of rules and runs a project, right? They use a slightly different process, and all of that went into their application and met the same rubric criteria, but they function on kind of a different path as a notary to award DataCap to a client address. So there are some examples where things like this already exist.
B
I think it depends. Part of this will involve maturing how those services interact with their clients, as well as building some sort of agreement on the min spec of the metadata that's tracked, the information that's collected, and in what way a client within that particular service qualifies for this. I don't know yet what those individual deal-brokering services actually collect about their clients. For example, with Estuary, the bar right now is not that high, but maybe for clients that would qualify for a path like this, there is a higher bar.

B
They have separate tiers of clients, where oftentimes you have premium deal-making offerings with different requirements expected. For example, if you use a custodial wallet solution in the general crypto world, you have tiers of KYC that you have to complete, which enable different things you're able to do and oftentimes unlock better access, better rates, and a faster ability to make deals or transactions in the market. So I kind of view this similarly: this wouldn't necessarily force every service to look and feel the same, but it does require that there's some min spec agreed upon by a working group of people who are trying to solve, or at least improve, this problem space specifically.
B
So I do agree with you that there has to be some sort of standard. That standard has to somehow be vetted and agreed upon by the Filecoin Plus community as well, and it should include some minimum amount of information that these client solutions have high confidence they can collect, and high confidence they can share in some safe way with the rest of the community, so that there is some ability to continue auditing and tracking the net clients coming into the system and the way in which they're participating.

B
It also has other implications in terms of storage provider DataCap distribution, for example. A lot of these services operate based on reputation, with SP selection based on reputation, which then leads to...
B
But I think it's very important to at least start thinking about this sooner rather than later, because if we reach a point where those services are able to support a bring-your-own-wallet or bring-your-own-client-address offering before we have the ability to support this, then I think those clients are going to have a very, very painful ride with Filecoin Plus onto the Filecoin network. The story of having a custodial wallet but then needing to go to a different service to get DataCap allotted to a wallet whose keys you've granted control of to a different service is a pretty whack story. So yeah, I think Patrick raising this now makes it a very good time to start this conversation, but I completely agree with you that it's not so simple; I think we're going to have to have a lot of conversation on this.
C
Cool. Well, there is the governance issue there, #364, if you think of something or have questions, and I believe we'll try to have Patrick also discuss it in the next governance call this afternoon as well. Maybe we'll see you there, or you can check that recording.

C
So, K-Ray, I think next on the agenda is open discussion, right? Open discussion. All right, 13 minutes. Any discussion topics that didn't make it onto this agenda? We also mentioned the notary election cycle; any questions around that, we're happy to field those here.
B
I mean, there are a lot of notaries on this call, and there are a lot of clients on this call that are relatively well known in the system as well. I'd love to just hear opinions, open season, on how things are going in general from your perspective of the Filecoin Plus system: any massive pain points that you're really excited to think about and solve early in this new year?

B
Yeah, I think it's great that you have a high standard. I think we are starting to see some notaries slowly run out of DataCap; I think Tim Williams, for example.
F
B
For one, there's also been a set of folks — from my understanding, even certifiers and clients — that are explicitly looking to make deals in different regions as a result of this, to ensure that they can still continue doing stuff. So maybe that should influence a little bit of how we think about regional allocation, because right now we have, specifically, a GCR region in the way that we select notaries.
B
But I don't know how pertinent that will continue to be. As an example, we only have a few notaries in Asia that are not in the Greater China Region, and some of those are standard.
B
On running out of DataCap, for example: Masaki, I think you're pretty close to running out of DataCap, and to me it would be more useful to have those clients come to you then, or have you work with one of those notaries to make allocations in tandem, or operate as an effective group, as opposed to just splitting China out on its own, if the general risk affinity there is lower in the near term, I guess.
F
Nearby, and yeah — some did move to Hong Kong, which is still GCR, but some moved to Korea and Singapore. And I think we might help them, or let other notaries refer their clients quickly, and we can work together, yeah.
B
So I think, in general, this idea of groups of notaries working together is something that I've been pretty enamored by recently, and so I was thinking about formalizing that thinking a little bit and just starting a discussion on it. It puts us more in the direction of useful working groups that can share resources and ideas, similar to some thoughts
B
we had closer to the end of November 2021, where we were talking about inspiration from DAOs and how we structure Fil+ as a governance body. And I definitely think this is a pretty interesting direction to go in, right? Where, instead of holding an individual notary accountable — already in the LDN process we need two notaries every time — why aren't we instead holding groups of notaries accountable?
B
Groups that can be formed based on region, country, or expertise, and on DataCap availability. And instead of a client having to pick a notary, we assign based on knowledge that we have about a client application. I think that could be a much better user experience overall.
B
H
Yeah, okay, yeah — so we implemented in the Fil+ app a section where you can see the logs regarding your issue — your GitHub issue, for example — right now. If you go to this —
H
you can input the issue number in the search bar, then hit "search logs", and you will see what's happening on the issue. Sorry — so maybe I can share the screen. Okay, you're there. So if you, for example, put in issue number 162 —
H
you can see that, like, just now, yeah, Galen just commented on the issue. So this is useful because sometimes, when you don't know what's going on — why a certain behavior is not triggered in GitHub — you can just have a look at the logs and maybe get a better understanding of what happened. And this can help me, it can help you, and occasionally notaries can help the governance team. So this is one, for me —
H
this is one good thing, yeah. It's also useful for making this thing more open to everybody, because before, the only one who could see the logs was me. So yeah — next, awesome things are coming, so stay tuned. Nice.
B
You can now check, on a per-client or per-issue basis, what the actual Filecoin Plus experience has been like for them: where the bottlenecks are, what's stopping them from succeeding, where automation is helping and where automation is not helping enough. And so I'm excited to continue pursuing more transparency on this front and sharing this out as a resource. Really great work from Fabry and the team working on the app — thanks, guys, really appreciate it.
H
B
C
B
A
We have those 33 issues, and the one we posted in Slack. So if we do end this call a minute or two early and you have that Slack link — I'll post it one more time — we did need two signatures on quite a few of these. It's the large dataset number; I'm just sharing that issue right here. So if we end one minute early, you can come here and sign that — hugely appreciated.
A
All right, well, good afternoon, good evening. This is the second call of the January 11th notary governance meeting today, and welcome. We're going to be walking through an updated deck and flow. Before we start: this is the abridged version, right? If you're curious about the full version, just rewind your YouTube a few minutes — the first call from this morning has the majority of the content as we go through this.
A
I
Oh, you want me to? So, yes, I'm Stefan Maglinski. I have now joined the Filecoin Foundation as a senior fellow working alongside Danny. I'll do the short version here.
C
Yes, lightning round: metrics. Numbers are getting better, going in the right direction — we're very happy about that. Some of the ones we're happy about specifically: we did not see a slowdown over the holiday and new year, so that's great. Overall average time to DataCap is still under two days — spectacular.
C
Our average time to first response from a notary is still around five days, so —
C
that's the kind of thing that we really want to be looking at with this next round of notary elections, and sort of the next iteration of the notary process: how do we unblock clients faster to get them to DataCap, even if it's a smaller amount to get started? And then looking at the total amount of DataCap that has gone to clients — not a significant amount going out directly to the clients, but a fun statistic.
C
Let me just pull it up so I get the number right. We updated our interplanetary dashboard, and for total data stored through Filecoin Plus overall, we're seeing 7.26 pebibytes.
C
So seven and a quarter pebibytes of Filecoin Plus verified data — very excited for that number to keep growing with the network. As you're watching this, if you have questions about these dashboards, we have a public Slack channel for the Filecoin Plus dashboards. People can join and ask clarifying questions about how something is calculated. We now have four different dashboards that we pull from, so we'd love to get some more communication happening there. That's our high-level statistics. K-Ray?
A
Yeah, thanks, Galen. If you take anything from this slide, it's this: what we're looking at is what some of the success metrics for notaries are, and how we're looking at building that leaderboard. So huge props to those of you on this list that are just continuing to crush it. We have a lot — a lot, a lot — of open issues right now: 34 direct applications and 33 LDNs.
A
So we're going to be kind of reaching out — you might see some things come through; we're working on some different tooling to do that — but to you notaries: thank you so much for the continued work. And goodness gracious, there was a lot that came through, so we're still burning through it — really appreciate it. And as a quick plug, that leaderboard is coming along nicely; we're interviewing candidates.
A
We took a softball poll in the morning and this was all thumbs up. If you're watching this recorded video and this timeline looks horrendous, feel free to comment in the GitHub repo or let us know on Slack — we're looking for your feedback. But this will be the time where, if you're keen to join the process — if this is your first time watching, you've never been involved before, and you want to really take a stand and get involved — the application to apply for notary governance would be the place to really dive into it.
D
This looks like my slide. Charles — thanks, Charles — we've met before, but it's been a little while. So, my name is Caitlin.
D
I work at the Filecoin Foundation and I'm the TPM for all of the FIPs. Right now we're preparing for the upgrade to Filecoin network version 15, which is being called OhSnap, and there are two FIPs that are currently open in Last Call status. We're gathering discussion and final feedback, hoping to incorporate both of these in that upcoming network upgrade. You'll see on the slides that I've linked to both of the FIP drafts.
D
So the idea is that it allows storage providers to save money and save some time, and it also makes it a lot quicker for clients — especially those who are new to the network, or have really, really large data sizes that they're looking to store — to quickly know that they do have those sectors ready to go and that the deal storage is active.
B
Yeah, so those of you that have been coming to the last couple of governance calls are pretty familiar with this already. And for those of you watching the recording, if you're confused about why we keep calling Charles out by name: we've got a couple of the usual suspects on vacation this week, so we're singling him out while he attends this afternoon session. But yeah, the update for this one, primarily, is that when we were initially drafting the FIP, in the conversations that we were having — that's true, actually:
B
if you can see the chat in the recording, there are still several other folks on the call; we're just here to make Charles feel special today. Too bad, Danny. On a different note, yeah — the DataCap management FIP that's up for review at the moment is specifically for removing DataCap from a client address. For those of you that came to the previous calls, you'll
B
remember that the proposal I was going for was basically requiring two notaries to sign a message that was then validated by a single root key holder to remove DataCap from a specific address. That would enable us to correct for situations where a client is not using DataCap, or can't use DataCap — for example, they've lost access to the keys for that address, or they've decided they don't want to proceed with that address for whatever reason, or they've changed their plans — and so that address shouldn't
B
have any DataCap associated with it and accumulate potential network risk. Or, in the case that we find misuse or abuse of DataCap or of the general Filecoin Plus system and we need some way to impose a consequence on the client, this gives the community the option to pull back
B
a certain amount of DataCap, or to remove the verified-client status from the network entirely by setting the DataCap balance down to zero. The main difference worth highlighting here is that, after a conversation with the implementers in the sync that we typically have — Caitlin, I don't know the exact name of the meeting, but the general one where the community of developers meets — one of the takeaways was:
B
for now, for the first implementation, two notaries have to send a message to the network with specific parameters that include the client address and the amount of DataCap being removed, and then the same bits are effectively used by a proposal and an acceptance to the key holder multisig, which then enacts that change. And so we still get the benefits of tracking and auditing everything, ensuring that people are able to understand what's happening and sign.
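The flow described here — two distinct notary signatures on a message carrying the client address and removal amount, then a proposal and acceptance on the root key holder multisig — can be sketched roughly as follows. This is an illustrative model only; the class, field, and function names are assumptions for this sketch, not the actual FIP message schema or actor API.

```python
# Hedged sketch of the proposed RemoveDataCap flow. All names here are
# hypothetical; the real on-chain message format is defined by the FIP.
from dataclasses import dataclass, field

@dataclass
class RemoveDataCapProposal:
    client: str                          # client address to remove DataCap from
    amount: int                          # amount of DataCap (bytes) to remove
    notary_sigs: set = field(default_factory=set)

    def sign(self, notary: str) -> None:
        # A duplicate signature from the same notary does not count twice.
        self.notary_sigs.add(notary)

    def ready_for_rkh(self) -> bool:
        # Two distinct notaries must sign before a root key holder
        # proposes and approves the change on the multisig.
        return len(self.notary_sigs) >= 2

def execute(proposal: RemoveDataCapProposal, balances: dict) -> dict:
    """Model the multisig enacting the change after validation."""
    assert proposal.ready_for_rkh(), "needs two distinct notary signatures"
    new = dict(balances)
    # Removing more than the balance drives it to zero, which in the
    # discussion also corresponds to stripping verified-client status.
    new[proposal.client] = max(0, new[proposal.client] - proposal.amount)
    return new
```

A usage pass: one notary signing twice is not sufficient, a second notary unblocks the removal, and removing more than the remaining balance zeroes it out.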
G
B
And we ensure that there are at least two notaries participating in an agreement before proceeding to remove DataCap from the client. So yeah, looking forward to getting this implemented — really excited. The timeline is great for us, with the elections coming up and a bunch of other changes that we've all been talking about in the Filecoin Plus system, and so it'd be great to have this lever as we continue to try new things and
B
operate at a higher order of magnitude than we have in the past, at least. Thanks, Steven.
D
Thank you, Deep. Yep — so again, these are both proposals that have been worked on for weeks now in this context, and for SnapDeals it's been several months. So, Charles, I don't think either of these is going to be particularly new to you, but if you or anyone else would like to weigh in — Danny, Galen, etc.
B
So I do have one question about the SnapDeals one that is probably worth flagging for now — for those of you watching the recording as well, just worth thinking about: if we were to have, if you were to make —
G
B
deals that are verified deals using SnapDeals, I think we're soon going to get into the deal renewal conversation as a result of this, because the lifetime of that snapped-in deal, I think, is constrained by the initial life of the sector, which could be shorter than if you were to make a net-new deal. And so I think there's going to be more interesting
B
math happening behind the willingness of a storage provider to use liquid Filecoin as collateral for a SnapDeals-based Fil+ deal versus a net-new deal to the network. In some ways it's actually better and easier and requires less locked up for less time, and in other ways you might prefer higher risk for higher reward over a longer period of time. And so I think there are going to be some interesting consequences to having this as an option.
B
D
Yeah — that is a great point. Just to point out: I am pretty positive that the SnapDeals spec does also confirm that you can extend sector lifetime, even if it is a snapped sector itself. So there is some control over the longevity of a sector in total, but you're right that it's not going to be as long as if you had an entirely new sector being sealed for the first time.
B
Does that make sense — there's at least a discussion on limits to how far you can extend it, and then what happens once it's close to expiring, like further extensions. And I think, as we continue to discuss that stuff, which will probably happen in the coming months, we should plug back into this call and loop in some of the folks from the Filecoin Plus community that are interested in having those conversations.
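The constraint discussed above — a deal snapped into an existing sector cannot outlive that sector, while a deal in a freshly sealed sector can run the full sector lifetime — can be expressed as a small helper. The epoch arithmetic below is a simplified sketch of that bound, not the exact on-chain rule, and the function names are illustrative.

```python
# Hedged sketch: upper bounds on deal duration for SnapDeals vs. a
# net-new sector, in epochs. Extension mechanics are deliberately omitted.

def max_snapped_deal_term(current_epoch: int, sector_expiration: int) -> int:
    """A snapped-in deal is bounded by the sector's remaining life."""
    return max(0, sector_expiration - current_epoch)

def max_new_sector_deal_term(max_sector_lifetime: int) -> int:
    """A deal in a freshly sealed sector can run the full lifetime."""
    return max_sector_lifetime
```

For example, snapping into a sector with 500 epochs of life left caps the deal at 500 epochs, and a sector already at (or past) expiration offers no term at all.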
C
Okay, my turn: allocation step increase. This has so far been a very non-controversial proposal; it's been discussed a number of times. It does not need to be a FIP because it's happening solely within the Filecoin Plus realm of influence, and it's not really a protocol change. We are just proposing adding two additional steps to our large dataset allocation calculation.
C
All that we're doing is proposing to be able to get up to 80 total, or 800 weekly, and this is really for our high-volume, high-throughput, enterprise-level clients that have demonstrated great deal-making behavior and keep coming back for more allocations.
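The allocation calculation being extended here is a stepwise schedule where each subsequent tranche grows until it hits a cap; adding two steps raises that ceiling for high-throughput clients. A rough sketch of how such a doubling schedule with a cap behaves is below — the starting size, cap, and doubling factor are illustrative placeholder assumptions, not the exact Fil+ parameters.

```python
# Hedged sketch of a capped doubling tranche schedule, as used for large
# dataset allocations. Units and exact parameters are assumptions here.

def tranche_sizes(first: float, cap: float, n: int) -> list:
    """Each allocation doubles the previous one until it reaches the cap."""
    sizes, current = [], first
    for _ in range(n):
        sizes.append(min(current, cap))
        current *= 2
    return sizes
```

Raising the cap (the "two additional steps") lets the tail of the schedule keep growing instead of flattening early, which is the whole point for clients that keep coming back for more.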
C
So we would really like to move this basically into, you know, Last Call status and have it implemented before the next notary governance call. So please go check out this issue in the notary governance repo and add some comments
C
if you have questions or concerns. And if we don't hear any questions or concerns, we're going to work on moving it forward, and we'll notify people in Slack and so on, and update the READMEs. When this gets implemented, it'll be a very low-technical-lift change to the subsequent allocation bot. So I'll pause, just for like five seconds, and see if the notaries on the call — Charles, oh —
B
C
J
Yeah, actually, about the allocations — what's happening recently is that, because of the Slingshot Restore project, lots of people are trying to find a notary, so you can get a lot of requests. Maybe it's because the first allocation is pretty small; I think the later allocations get really better. Even myself, I have difficulty getting people's signatures for me. Another thing: we need to move really fast — each time we need two notaries, and the first time you need two, the second time you need another two.
J
Well, I doubt that we really have so many notaries available after the third or fourth round — I really doubt that. Sometimes I'm really scared that this time I'm not going to get it at all. And also, we noticed that some notaries are not so active. This is just some rumor, but it's information for you guys in case you're interested — just consider it as rumor:
J
they said some Chinese notaries in mainland China, due to political reasons or, like, governance reasons, don't really want to sign a lot. This is the information I get from them. So we may have a reduced capacity of notaries who can sign DataCap. Yeah — just something I know.
C
Just a quick clarifying thing on the two-notaries-for-large-datasets rule: it's two notaries for an allocation, and then for the second allocation we want two different notaries. For the third allocation, it could be the same two notaries from allocation number one. So it is possible that a
C
client forms a relationship of diligence and communication with four notaries and rotates through those four. We just want to have some more eyes on these client applications. So it is not that you have to have two different ones for number three, then two totally different ones for number four, etc. Because I agree, you know — some notaries don't want to engage in the large dataset process, for one reason or another.
C
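One reading of the rotation rule just described — each tranche needs two notary signatures, the pair signing tranche N must differ from the pair that signed tranche N−1, and earlier pairs may be reused — can be checked mechanically. This sketch assumes "two different notaries" means the consecutive pairs are fully disjoint, which is an interpretation, not a quoted rule.

```python
# Hedged sketch of the LDN notary rotation rule: consecutive tranches
# must be signed by disjoint notary pairs; non-consecutive tranches may
# reuse a pair, so four notaries suffice for an indefinite rotation.

def valid_rotation(pairs: list) -> bool:
    """pairs[i] is the (notary, notary) pair that signed tranche i."""
    for prev, cur in zip(pairs, pairs[1:]):
        if set(prev) & set(cur):   # any shared notary breaks the rule
            return False
    return True
```

Under this reading, alternating between pairs (A, B) and (C, D) forever is valid, while reusing notary B on the very next tranche is not.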
E
Danny had a hand — Danny? I did. I have an opinion — actually, it was a question. I always appreciate the statistics you folks are collecting. So, there'll be some time between this happening and the new notaries coming on board, right? Because I think there are two factors here, and the reason why I mention this is that I was thinking about how it's going to go.
E
I wonder if having more notaries is going to make this go faster, because of course there'll be more people around. One thing I definitely feel is that sometimes there's a little bit of a dilution of responsibility, you know. And specifically — specifically, I mean, I'm always a little bit hesitant to sign off on mainland China requests, just because it's pretty hard for me to do the diligence from where I'm at.
E
C
Yeah — I mean, the other thing to call out is that when we finish the election cycle, any new notaries wouldn't be on the existing multisigs. And so we will need to determine: do we create new multisigs with all of the new notaries, or do we adjust the process in some other way? But there is a technical question here, you know — when we finish this election cycle, we will have new addresses.
C
J
Yeah — I can see that you are all working really hard, because I get a lot of messages saying you're asking them for due diligence.
J
Sometimes I'm seen as being a bit too picky, because I want to ask things like: your website is not accessible, your data looks a little bit fishy. And sometimes they just have a one-page website — not even really one page, just several images — and claim they're doing big business. It's really hard to convince me that they really own that data. I don't know where it came from, and I don't know what entity is behind the business, yeah.
C
Yeah — and that's, I mean, that's the whole point of having multiple eyes on this diligence. Because some people are going to look at it and think: my spidey senses are tingling, this seems a little suspicious and lightweight on what they're showing me, and I want to ask for more. And other people may have a relationship with that person, or understand that company, or understand the space, or just have another way, you know, of doing the diligence that gives them confidence. So, yeah.
C
B
I don't know where Deep is on my screen — I'd like to have my hand up. So one thing, Charles, that you brought up that I wanted to address as well: this allocation step increase has not been a reaction to latency that we've seen, so much as a reaction to realizing that, as we get more and more serious enterprise use cases wanting large-scale DataCap allocations over time, it's unrealistic to give them a runway of, like, two weeks' worth of work. If they've genuinely
B
proven themselves to be a good client and are likely to be trustworthy, then giving them a runway of a month or a month and a half worth of work makes a lot more sense, in terms of how they can structure their operations and how they can plan for cases in which there is an interruption — they get a much bigger runway.
E
Yeah,
so
I
one
thing
I
definitely
noticed
is
that
we've
shifted
from-
or
at
least
in
my
case,
from
the
small
allocations
to
many
more
in
the
the
large
data
allocations,
and
I
think
that
that
that
makes
sense.
I
I
do
wonder
if,
if
we
are,
if
there's
a
risk
of
us
getting
kind
of
like
overloaded
with
requests
for
large
data
allocations
and
make,
maybe
some
of
them
are,
you
know
not
not
as
legit
as
we
would
want.
E
G
G
E
You
know
200
terabytes,
I
might
as
well
go
through
this
process.
My
question
is:
do
you
think
it
makes
sense
or
would
be
worth
thinking
about
to
have
like
a
small
fee
attached
a
falcon
fee
attached
for
the
large
data
application
process?
You
know
just
to
disincentivize
that
kind
of
that
kind
of
thing.
J
B
Personally, I think that is a fair reaction, but I would almost say that we're probably better off doing something else instead. I think the pain point there is that it's creating a lot of extra work, and we can reduce that work in ways that are safer for the system without having to create friction for the client. So I think one thing that we're specifically — well,
B
I was chatting about this earlier today, actually, and in the past couple of weeks as well, but it hasn't been formally put forth as a discussion — is probably looking at a much higher degree of automation for LDNs, where subsequent allocations, after a certain number of notary signatories, do not require a notary step. We still have the opportunity to run analysis in an automated fashion, block a tranche from going out, and in that case bring a human into the loop.
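The direction proposed here — automated subsequent tranches, with an analysis step that can block a tranche and pull a human back into the loop — might look schematically like this. The threshold, the flag input, and the action names are all assumptions for illustration, not a description of any implemented bot.

```python
# Hedged sketch of automated LDN tranche gating: after enough
# human-signed tranches, later tranches auto-approve unless automated
# analysis flags the client, which escalates back to notaries.

def next_tranche_action(human_signed_tranches: int,
                        flagged_by_analysis: bool,
                        threshold: int = 2) -> str:
    if flagged_by_analysis:
        # Automated analysis found something off: human back in the loop.
        return "escalate-to-notary"
    if human_signed_tranches >= threshold:
        # Enough prior human review: this tranche can go out automatically.
        return "auto-approve"
    # Early tranches still require the usual two notary signatures.
    return "require-notary-signatures"
```

The design point being made is that the friction lives at the *repeat* step, so the gate removes humans from routine approvals while keeping them available for anomalies.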
B
But requiring humans in the loop at every subsequent tranche probably doesn't make sense and creates a lot more work. So to me, that is more in the direction of: okay, let's reduce the overall burden. My fear with putting additional steps at the start of the process is that we already have so much friction when it comes to data onboarding that I'm coming at this from the angle of trying to reduce it.
B
However, I do think that what you're proposing is probably a faster way to weed out people that are trying to cheat the system. But to that I would say: maybe we consider this after we get the chance to try and build some sort of audit-based bot, or a data-analysis-based bot, which we're trying to do right now — see how that pans out, and depending on whether or not that is able to track what we consider malicious or abusive use of the system,
B
at that stage, then consider it. And in general, the concept of client collateral is not new — we have talked about it in the past for general deal-making: having clients post collateral, or contribute collateral to storage providers so that they can even afford taking a verified deal. And so I think all of this is reasonable.
B
I just think that our initial reaction should probably err on the side of more manual work that we can make automated, and riskier actions that we can learn from and get data from, before reacting. If that makes sense — that's an operating philosophy, I guess.
C
Asking them to put up some amount of collateral is not going to be a big enough deterrent to them, versus the DataCap reward that they could be getting — compared to a legitimate client who is already maybe a little wary about what this thing is asking of them. Pay-to-play behind a paywall, while for some people it might give a higher degree of confidence — "this is a legitimate enterprise thing; look, I'm having to pay for it" —
C
it also goes to the ongoing question of what problem we're trying to solve. And to Deep's point about the philosophy of, you know: can we do more diligence up front and learn from these different use cases, to then have better tracking and fraud monitoring — basically turn on a faucet that we can monitor, and then, when something starts to happen, shut it down.
C
So I think we spent — you spent — a lot of time on this one, and we have a special guest presenter to help talk about this. I think we should go forward one slide to hear from Deep about Slingshot Restore, because it is related to these large datasets, and then kick it to our guest.
B
Sure, yeah — I think I'll keep this quick. In general, Slingshot Restore is making progress. I'd say it's not as fast as we'd initially hoped, but it's resulting in a lot of really interesting things — in case there's anything fun you're wanting to schedule, we could chat about that after. But I did want to call out that, in terms of immediate next steps, we have one pending signature on a client that's currently blocked: issue number 167.
B
There were two signatures needed starting this morning, but I think one particularly active notary on this call signed it once, and so we just need the second one. So I'm looking at the other notaries on this call — maybe it was —
B
I think it's — I think it's a certain Danny. Other than that,
B
I also want to call out that there have definitely been a lot of people who've been a little opportunistic when it comes to this restore in terms of getting DataCap — either proactively in some cases, or preemptively — and then there are other people that have gotten DataCap and then left the program as well. Right now we're not seeing too much rampant abuse of that, because the amounts going out in the first allocation were deliberately set quite low.
B
But I am looking forward to, one, vetting that — and especially once we get this FIP implemented, the ability to remove DataCap will probably come in handy to make sure people are getting it for the reasons that they cited. But I also want to call out that there's an up-to-date list available: if you have any questions, or if you feel like you're giving DataCap to somebody that doesn't look like they're participating in the restore but claims to be, please feel free to ping me anytime or check that list.
B
There are also several other, bigger-scale key takeaways that I'd love to share, but we'll probably hold those for a future governance call once we have even more feedback. So yeah — thanks so much for all the support you've already given to the program. I think it's going to pay lots of dividends in terms of learnings and takeaways for future Filecoin Plus clients, Fil+ notaries, and Fil+ storage providers.
A
Okay — during this morning's call we spent a lot of time talking about the possibilities, like a second tier, and how we're looking at data. I think, pending anybody on the call having questions — Charles; Alex North just joined — I would refer you to rewind the YouTube video; I'll bookmark it in the link so you can see it. It was a very thorough discussion, with a lot of miners weighing in, and we had really good feedback from the community. If you have any input that you want to share, this is open in the repo on ticket 364.
A
B
Yeah, I think we'd love to start out with Alex. Alex joined us from the PL team; he's been doing a bunch of thinking around crypto-economics in general, and recently posted a draft for a FIP looking at redesigning the Filecoin Plus subsidy. So, Alex, we'd love to have you just chat through it — give us a bit of an overview.
G
Sure — hi, thanks very much, and hi everyone, nice to see you. Yeah, so just briefly: my name is Alex. I've been an engineer on Filecoin for about three years and was responsible for the actors implementation at the network launch.
G
Although I didn't write the Filecoin Plus registry actor, I've been spending a lot of time now looking ahead to what will be possible when the Filecoin VM — the FVM — enables user-programmable contracts. It's our dream that a huge amount of innovation will be unlocked in contracts written within the Filecoin VM, including lots of innovation in storage markets and features for deals: things like deal extension, multi-sector deals, renegotiating deals,
G
transferring deals, deals for capacity, option markets, repair markets, all kinds of things like that. Currently, most of those things could only be implemented as built-in contracts, because that's the only kind of contract we have. The FVM will, in theory, enable all those things to be implemented by
G
smart contract developers anywhere in the world, in just the same way as on other blockchain systems like Ethereum and Solana and so on — which, of course, we think will be a huge boost to the economic possibilities of what can be done on Filecoin, and hence to the economic activity and the utility of Filecoin as a piece of Web3.
G
Anyway, much of my work has been focused on how we can arrange our existing smart contracts so that people can write useful things on the FVM once it exists, because our current set of contracts — the built-in contracts that we launched with and have iterated from — are not open to external development. Even if we did have the ability to write new contracts, they couldn't do anything with the storage market or the storage miner as they exist today, because of how they were coded.
G
So, a couple of things, I guess. One: I wanted to flag this future that is coming, where I expect and hope that there will not be just one storage market actor — the built-in storage market actor that we have today — but that many different people can write smart contracts that function as storage markets. They can have different market features and innovate at their own pace, via smart-contract upgrades, without requiring a network-wide consensus upgrade in order to make a change.
G
So a large part of my work has been trying to figure out ways to remove the special privileges that the built-in market actor currently has, so that other people could write markets that are on par — that have the same access as the current market. One of those privileges, of course, is access to verified deals. And so a future that I see coming is one where multiple parties deploy storage market actors that have different functionality but want access to be able to broker verified deals. And so we're —
G
I'm looking into ways to make this easy. So one feature that I imagine might happen is that the Filecoin Plus notary committee maintains a list of approved market actors — which we can maintain on chain — and approved market actors can make calls to the Filecoin Plus registry on chain, get DataCap, and then earn rewards for that DataCap. It opens up the trust a little bit:
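The approved-market-actor idea could be modeled as a simple allowlist gate in front of the registry. Everything below — class names, method names, addresses — is hypothetical, sketching the governance mechanism being described rather than any actual actor API on the FVM.

```python
# Hedged sketch of an on-chain allowlist of approved market actors:
# only markets vetted by Fil+ governance may broker verified deals by
# requesting DataCap from the registry. Names are illustrative only.

class FilPlusRegistry:
    def __init__(self, approved_markets):
        # In the real design this list would live on chain and be
        # maintained by the Fil+ governance process.
        self.approved = set(approved_markets)
        self.datacap = {}

    def grant_datacap(self, market: str, client: str, amount: int) -> None:
        if market not in self.approved:
            raise PermissionError("market actor not approved by Fil+ governance")
        self.datacap[client] = self.datacap.get(client, 0) + amount
```

The design trade-off is exactly the one stated in the call: rather than trusting a single built-in market, governance reviews each candidate market's code and adds it to the list, so trust widens without disappearing.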
G
Instead of trusting just one actor, we can review the code of the alternative markets, and if they implement the kind of mechanics that we believe are concordant with our vision for Filecoin Plus, then we can allow them to broker verified deals. Markets that do something that's too weird, or that don't implement enough robustness, wouldn't qualify.
G
We'll need a way of defining the standards that a market should maintain in order to be eligible for Filecoin Plus deals, and then we might need some kind of technical and policy review process for new markets in order to whitelist them for Filecoin Plus. I'm working on the mechanism so that, if we had that governance process, other market actors that have different functionality, most likely much better functionality than the current built-in storage market actor, could benefit from verified deals as well.
G
Let me pause briefly there for questions. Deep also nudged us towards a specific FIP that I've just proposed today, which is a step towards this, but I'll pause for questions generally and then I can briefly explain the Filecoin Plus subsidy.
E
Yeah, do you have any rough examples of what alternative markets you could envisage? I realize that's up to the creativity of this wider audience that will be able to build them, but I'm not quite sure what brokering DataCap means, or where the actor fits into the process that we see from the outside.
G
Sure. So the market actor defines what a deal is, and the non-existence of a whole range of features that we might want is a function of our market actor being simple and also not designed in a particularly scalable way. So one dimension along which I see different markets is something that has a different internal implementation.
G
Perhaps it uses a different state representation, or perhaps it uses some kind of off-chain roll-up or zero-knowledge proof, or perhaps it anchors some state in another blockchain and uses a bridge, I don't know. But then it adds features like a deal that can span multiple sectors, or the ability to transfer the client side of a deal from one party to another, or the ability, with the client's permission, to transfer a deal from one provider to another and enable those kinds of migrations.
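One of the features just mentioned, transferring a deal to a new provider with the client's permission, can be sketched as a toy like this. It is illustrative only, not the actual storage market actor; every name here is invented.

```python
# Toy sketch of a market feature discussed above: transferring the
# provider side of a deal, authorized only by the deal's client.
# Hypothetical structure, not the real Filecoin storage market actor.

class Deal:
    def __init__(self, client, provider):
        self.client = client
        self.provider = provider

class Market:
    def __init__(self):
        self.deals = {}
        self.next_id = 0

    def publish(self, client, provider):
        """Record a new deal and return its identifier."""
        deal_id = self.next_id
        self.deals[deal_id] = Deal(client, provider)
        self.next_id += 1
        return deal_id

    def transfer_provider(self, deal_id, new_provider, signed_by):
        """Migrate a deal to a new provider; only the client may authorize."""
        deal = self.deals[deal_id]
        if signed_by != deal.client:
            raise PermissionError("only the deal's client can authorize a transfer")
        deal.provider = new_provider
```

In the real system the authorization check would be a signature or message sender check on chain; the sketch just shows where the permission gate sits.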
G
Those features can in theory be implemented in the current storage market actor, but they would require a lot of work, and it's likely that a bunch of them ultimately wouldn't scale unless we did essentially a complete overhaul. Rather than doing a complete overhaul, we can just allow many parties to write new ones from scratch, possibly including Protocol Labs and other core developers.
G
Beyond the one we have right now, I have a big list of features that I think might be desirable, but the type of difference that I'm predicting first is just different internal representations that enable certain features to be implemented efficiently.
E
So, just to be concrete about this: suppose we envisage an actor, a contract, that, if a particular provider is no longer proving that they are storing a piece of data, would be able to maintain, say, five redundant copies of the data. And what you would want, given that that deal was originally using DataCap, is for it to act as a kind of proxy for the original client, or some such. You see where I'm going; it can play different roles in the scene.
G
Yeah, I don't know if that would be built into the market or would be a thing built on top of markets, but yes, those kinds of high-level features, like automated repair and automated maintenance of replication, are exactly the kind of thing that couldn't be built today and will hopefully be much more tractable after these kinds of changes.
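The replication-maintenance idea discussed above could be sketched like this: a hypothetical contract that, given a target replica count, works out which providers to add when copies are lost. Purely illustrative; nothing like this is implementable on Filecoin today.

```python
# Sketch of "maintain N redundant copies" logic, as it might run in a
# contract built on top of markets. Hypothetical and simplified: real
# repair would also verify proofs of storage before counting a copy.

def repair_plan(active_providers, target_copies, candidate_providers):
    """Return the providers to add so the data returns to target_copies replicas."""
    missing = target_copies - len(active_providers)
    if missing <= 0:
        return []  # replication target already met
    spare = [p for p in candidate_providers if p not in active_providers]
    if len(spare) < missing:
        raise RuntimeError("not enough candidate providers to repair")
    return spare[:missing]
```

A contract like this would run the plan whenever a provider stops proving storage, then publish new deals with the chosen replacements on the original client's behalf.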
J
Yeah, actually, during Sensor Crystal we were already trying to do something related to smart contracts, because of something we found while doing the Sensor Crystal project: we need five participants, and only then can we get a reward.
J
So there are a couple of issues in front of us, like how to make a contract about sharing the revenue, because it doesn't work if one party just says, "I'm the client, I get 100% of the reward, and I don't give anything to you three providers." That doesn't work, so we need a contract, and right now it's really inefficient.
J
First, how to share the revenue is the top issue. Second, once one participant quits, all the other four people cannot get a reward and we need to find a replacement, and there's no penalty for the participant who quits. So we want to have something like collateral, a price the participant has to pay, so that a participant can't simply walk away without cost.
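The contract being wished for here, pro-rata revenue sharing plus a penalty for quitting, might look roughly like this sketch. All names are hypothetical; nothing like this exists on Filecoin yet, which is the speaker's point.

```python
# Hypothetical revenue-sharing contract: participants stake collateral,
# rewards are split in proportion to stake, and quitting forfeits the
# stake. Illustrative only.

class RevenueShare:
    def __init__(self):
        self.stakes = {}     # member -> staked collateral
        self.forfeited = 0   # total collateral lost by quitters

    def join(self, member, collateral):
        """Stake collateral to join (or increase a stake)."""
        self.stakes[member] = self.stakes.get(member, 0) + collateral

    def quit(self, member):
        """Leaving forfeits the member's collateral as a penalty."""
        self.forfeited += self.stakes.pop(member)

    def distribute(self, reward):
        """Split a reward pro rata by stake."""
        total = sum(self.stakes.values())
        return {m: reward * s / total for m, s in self.stakes.items()}
```

The two issues raised map directly onto the two methods: `distribute` makes the revenue split transparent and automatic, and `quit` attaches a cost to abandoning the group.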
J
I think those kinds of things can be part of the FVM work. Also, as a framework question: I heard that the FVM will be compatible with the EVM. So if we have some smart contracts built in Solidity, will they work on the equivalent?
G
Yeah, I believe so. I'm not deep in the FVM team, but yes, that is definitely the plan. Internally, the implementation is a WASM VM with an EVM interpretation layer on top of it.
G
But yes, the goal is that basically any Solidity or EVM-compiled smart contract should be deployable onto Filecoin, perhaps with minor changes to account for the different gas models, but ultimately aiming for very close compatibility. So yes, exactly the same kinds of things can be built in contracts. You've also pointed to another limitation of the current market: the client of a deal in our current market has to be an off-chain address.
G
It can't be a multisig or any other contract. So there's all kinds of flexibility like that that is currently excluded but that new market implementations could build.
J
Yeah, I'd really like it if one day we can use smart contracts to regulate people's behavior, like not quitting easily, and make sure we have a standard, transparent revenue-sharing protocol online. We could even have some kind of voting system, so that if everyone supports a token being published, we can use that token, like DAO tokens, to vote with weight and decide: this is the money I want to put in, and this is the share I want to gain. The more you put in, the more shares you can gain, or you can even earn revenue from the activity's challenges.
G
Yeah, sure. Some of these things that we would like are technically possible today if you wrote some bridge contracts to another blockchain and then implemented all the complicated stuff on that other blockchain, but that's a huge barrier to rapid innovation. So yeah, I think the FVM will make a huge difference there.
G
Okay, if there are no other questions on that, I'll briefly point out a FIP that I just proposed today. It doesn't have a number yet, but there's a pull request in the FIPs repo, which is to rework the reward for verified deals.
G
Very briefly: today, the mechanism for incentivizing Filecoin Plus deals is that the space they occupy in a sector accounts for ten times as much power in the network, and power is what determines a miner's share of the block reward from the network. So a sector that has lots of verified deals in it is more likely to earn a block reward. This is an unfortunate coupling of two different things: network power is really about network security and the functioning of the blockchain.
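The current mechanism just described can be summarized in a small sketch. This ignores the duration (spacetime) weighting of the real quality-adjusted power formula; it only shows the 10x multiplier on verified deal space, which is the coupling being discussed.

```python
# Simplified sketch of today's coupling: space occupied by verified
# deals counts for 10x power. The real quality-adjusted power formula
# is spacetime-weighted; this deliberately ignores deal durations.

VERIFIED_DEAL_MULTIPLIER = 10

def sector_power(sector_size, verified_deal_space):
    """Quality-adjusted power of a sector, ignoring duration weighting."""
    unverified_space = sector_size - verified_deal_space
    return unverified_space + VERIFIED_DEAL_MULTIPLIER * verified_deal_space
```

So a fully verified 32 GiB sector counts like 320 GiB of raw power, which is why moving a verified deal between sectors ripples into consensus power and fault penalties.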
G
We've piggybacked the subsidizing of Filecoin Plus deals onto that mechanism, which makes both of them hard to change independently, and it makes a lot of these features difficult. So, for example, the feature of moving a deal from one sector to another is horrendously complex if it's a verified deal, because it changes the power of both of those sectors, which goes into consensus, and it changes the penalties.
G
If the sector faults, and all kinds of other things. It is what I would call technical debt from the implementation at mainnet launch. So my proposal is to instead have the market actor, and in the future other market actors, explicitly account for the amount of verified deal space that each provider is providing, and for the market actor to dole out a reward to those providers who are maintaining verified deals. We take that reward from the block reward.
G
We take it just once, up at the root, as the block reward is calculated for each epoch of the blockchain. We split that into the consensus reward, which goes to just the raw byte power of all the miners that are securing the network, and the Filecoin Plus subsidy, which is split and paid out to the market actor, or market actors in the future, who disburse it to the providers who are providing verified deals.
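The proposed split can be sketched as follows. The subsidy fraction here is an arbitrary placeholder, not a number from the FIP draft, and the pro-rata disbursement is an assumption about how a market actor might pay providers.

```python
# Sketch of the proposed mechanism: each epoch's block reward is split
# at the root into a consensus reward (earned via raw-byte power) and a
# Fil+ subsidy disbursed by market actors to verified-deal providers.
# The fraction and payout rule are illustrative assumptions only.

def split_epoch_reward(block_reward, subsidy_fraction):
    """Divide one epoch's reward between consensus and the Fil+ subsidy."""
    consensus_reward = block_reward * (1 - subsidy_fraction)
    filplus_subsidy = block_reward * subsidy_fraction
    return consensus_reward, filplus_subsidy

def disburse_subsidy(subsidy, verified_space_by_provider):
    """Market actor pays each provider pro rata to its verified deal space."""
    total = sum(verified_space_by_provider.values())
    if total == 0:
        return {}
    return {p: subsidy * s / total for p, s in verified_space_by_provider.items()}
```

Note how this decouples the two concerns: `split_epoch_reward` is the only place consensus and the subsidy meet, so either side can change without touching the other.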
G
This makes the fact that the network is subsidizing verified deals very explicit and direct in the flow of funds through the accounts, and it decouples these two things, storage-powered consensus and subsidized storage, so that both of them can change in the future. This is also something that needs to change before it's possible for anyone else to build a storage market that could broker verified deals. Right now there's a very tight coupling between the miner actor and the storage market actor, but we need the miner actor to be able to participate in many storage markets, and with this coupling that's very hard to do.
G
So that's the proposal I've made. It ends up being somewhat complex internally to unwind all this debt, but I'm very happy to answer high-level questions about it here. Of course, there's full detail, and I hope a lot of discussion in the near future, in the FIPs repo against the draft.
B
Deep here. Yeah, my question to you is: the general network block payout happens when you do the winning PoSt and win a block reward. Would the Fil+ subsidy have a similar lottery-like construct, or would it always be a consistent amount that's paid out proportionately every epoch?
G
Yeah, this changes the deal subsidy to not be a lottery; it's paid out consistently, irrespective of block rewards.
G
I think that's an improvement for small miners, who very rarely win a block, because they'll be able to rely on their reward subsidy even if they don't participate in block mining.
B
Okay, cool. I'm assuming that we've probably done, or might be planning to do, some kind of modeling on how this would impact miners of different sizes or different relative power distributions, because I'd be curious to see what the incentive ends up being: whether it creates a perverse incentive for people to move towards Sybils, setting up more miner IDs that individually store DataCap-based deals, versus putting their effort into a single larger storage provider, as they're currently incentivized to do.
G
The goal is to be neutral to the aggregate incentives that a miner faces, and that the network faces as a whole. So, for example, a bunch of the complications are replicating the pledge amount, replicating the penalties, and replicating the reward payout, so that a miner ends up receiving very close to the same reward in the old system versus the new system; it's just accounted for differently internally.
B
Nice. And this question, I know, is implied in your write-up already, but I'm going to ask for the benefit of the recording: there's no change to the way collateral works as a result of this draft, correct?
G
There's no change to the total collateral that will be required. There will be a change to how it works, in that part of the collateral will remain as it is and be posted with the storage miner, but the part of the collateral that is securing the Filecoin Plus subsidy will be posted with the market actor instead, and will be lost if...
G
Yeah, my very next to-do item is to go and model these collateral, reward, and penalty functions to demonstrate the equivalence, or whatever variance there is, between the old model and the new one, for miners with different profiles of verified deals.
B
Thanks. I think people will be extremely interested in seeing that part, because a big blocker today for clients is finding storage providers that have the amount of liquid FIL required to actually onboard the amount of data that they have.
G
Yeah, I mean, the goal here is to not change anything at a macro level. Exactly right. I'll of course be working with the crypto econ team to verify that's what we wanted.
J
To add to that one before we move on: several times I've watched, while doing LDN approvals, some anonymous community members, like a newly created account, leave a comment, usually "check this node, its data is not accessible," or "check that node, this data does not work as expected."
J
So I'm just wondering, and I do appreciate what those people are doing: do we have tools where we can easily just fetch one piece of data, given a CID?
B
That's a great question. So, firstly, I do have a tool that does this. I use it for Slingshot, and we'll be using it for Slingshot Restore, where I can check for specific CIDs and try to retrieve them. That's actually part of the dealbot functionality, which is a completely open-source piece of software that you can find on GitHub. I can share a link to that, but anybody is welcome to run a dealbot.
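A dealbot-style check of the kind described could be approximated by random sampling, sketched here. `fetch` stands in for a real retrieval client; this is not the dealbot's actual interface, just the shape of the idea.

```python
# Sketch of a random-sampling retrieval check: pick a few CIDs from a
# client's claimed deals and attempt retrieval via a supplied fetch
# function. `fetch` is a placeholder for a real retrieval client
# (a dealbot-style probe); everything here is illustrative.

import random

def sample_retrievals(cids, fetch, sample_size, seed=None):
    """Try to retrieve a random sample of CIDs; return the success rate."""
    rng = random.Random(seed)
    sample = rng.sample(cids, min(sample_size, len(cids)))
    successes = sum(1 for cid in sample if fetch(cid))
    return successes / len(sample)
```

A notary tooling wrapper could run this against a client's published deals and surface the success rate before approving the next DataCap allocation.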
B
However, there's also a much more interesting and deeper conversation happening around how to support private data, for example, or encrypted data, which has similar implications here. One of the ideas put forth was: get the content stored on IPFS first, encrypted or in whatever shape it will be on Filecoin; share decryption with specific subsets of notaries, who can fetch it from IPFS gateways and look at it; and then watch the same CID effectively being dealt with on the Filecoin network.
B
With some kind of sampling, we can build trust in a client that's not able to publicly show its data. By starting to track CIDs, or the flow of CAR files, and knowing that there are indices that tell us what's in those CAR files, we'll be able to get a lot more information about the likelihood that the client is complying with what they've claimed in their application.
B
Similarly, with the advent and actual shipping of indexer nodes, I think this will also become much better and easier to track, because people will have to publicly create indices of the information they have and where they're storing it. Then we can start to build random-sampling retrieval tools dedicated to Filecoin Plus, for example, where as a notary you can say, you know:
B
"This client looks suspicious; before we allocate or transfer the next DataCap, press this button," or write it into this tool, for a bot to try to retrieve the data, or something like that. So I think this is a very pertinent question and a very pertinent topic. I don't want to say we have this in a good way yet; we sort of have it. I think we've got to do a lot more thinking on it, and it is something we should aim for.