From YouTube: Filecoin Plus - Nov 23, 2021 Notary Governance Calls
Description
For reference to the items discussed, please check out
https://github.com/filecoin-project/notary-governance/issues/322
Content Covered on this call
- Goals and Updates for Q1 2022
- FIP 203: Remove Fil+ Notary Process
- FIP 204: DataCap Management
- Holiday Schedule and 2022 Meeting Dates
A: November the 23rd, 8 a.m. Pacific. There has been some debate back and forth as to what time that is in UTC because of daylight savings, so we'll get that clarified. Sorry about any confusion with the new Zoom link with our new vanity Zoom URL; everyone that made it as a participant to this call, you're in the right spot. Anyone who is struggling in another Zoom room, waiting for me to start that room: I apologize, and we'll hopefully get that clarified as well. All right, jumping in, we have a similar agenda to what we normally run: talking through our metrics, some program updates, then a number of open FIPs, some community updates, and then some discussion topics.
A: So I think first we'll jump into our metrics from one of our dashboards, and then we'll have some time for a conversation about some other dashboards that are popping up. Pulling the metrics from our DMOB dashboard snapshot as of yesterday, we're seeing average time to DataCap down to two days, with time to DataCap in the past 30 days at one day, so feeling really good about that.
A: Only two percent of the DataCap that has been put at the top band of LDNs has gone to clients. That's because each LDN that gets created is currently requesting up to five pebibytes, so that's the top of the funnel, and then it flows down from there. So it goes from two percent down to only one percent being used in deals. This funnel has also been running for less time than the direct one.
A: But the interesting thing we noticed, that I want to call out: there's a pretty similar percentage drop from client to miner/storage provider, at about 40 percent. Of the amount of DataCap that has been given to clients in both pipelines, 40 percent is being used in deals. There's some interesting stuff to unpack there. A few things could be happening: either, of the clients that get their DataCap, 40 percent are using all of it with storage providers, so they get it and use their total amount; or all of the clients that get DataCap are using some of it, down to about 40 percent, while in some ways holding on to a portion of it. It's probably a combination of both of those things; it's not going to be that cut and dry that it's one behavior or another, but it is an interesting similarity. So, pausing here, we're going to talk through some other dashboards that have come up. Deep, was there one that you wanted to pull up first?
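The funnel arithmetic described above (a small share of the top band reaching clients, and roughly 40 percent of client DataCap landing in deals) can be sketched as follows; the pebibyte figures here are illustrative placeholders, not the dashboard's real numbers:

```python
# Hypothetical sketch of the DataCap funnel discussed in the call.
# Input figures are made up for illustration.

def funnel_rates(requested, granted_to_clients, used_in_deals):
    """Return conversion percentages at each stage of the funnel."""
    to_clients = 100 * granted_to_clients / requested        # top band -> clients
    to_deals = 100 * used_in_deals / requested               # top band -> deals
    client_to_deals = 100 * used_in_deals / granted_to_clients  # clients -> deals
    return to_clients, to_deals, client_to_deals

# Example: 100 PiB requested at the top band, 2 PiB reaching clients,
# 0.8 PiB actually used in deals.
to_clients, to_deals, client_to_deals = funnel_rates(100, 2, 0.8)
print(to_clients, to_deals, client_to_deals)
```

With these placeholder inputs, 2 percent reaches clients and 40 percent of that client DataCap is used in deals, matching the shape of the drop described above.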
C: Not necessarily, we can go in either order you'd like. I was also going to flag these as part of the six-month, or I guess Q1, goals, but I'm down to discuss it now too.
A: Yeah, go ahead and chat; I'll get those pulled up. We'll go through the direct notary activity and then we'll talk about the dashboards in the program updates. Yeah, changing the agenda live. So, direct notary activity: we're seeing 253 granted issues, 15 in the last seven days, which is about similar to what it was before, and the number of applications getting a response from a notary is about 63, with clients being similar to what it was two weeks ago: about 24 out of 79 of the clients are getting DataCap per month. So here's our wonderful table breakdown, and another update that we'll talk through: we are making progress on a notary leaderboard, to have another great, clear, visible web surface to highlight how fantastic all of our notaries are doing. From there, let's talk through... someone finally updated this slide header! We've been talking about it for months, and we did it: our Fil+ Q1 2022 goal updates.
C: Thank you very much. So yeah, just running through these. I think you're all getting pretty familiar with this general framework of thinking across these different areas, so I'm going to get right into it. The first is volume, and the general goal that DataCap should be abundant and available for both notaries and for clients. We have this goal of wanting 15 pebibytes of DataCap available for granting to clients at the end of Q1, and we're making pretty solid progress toward that.
C: The current number has just crossed 100 as of last week, which is massive, so big congratulations to you all. And then, just in terms of DataCap that's been generated in the ecosystem, that's now around 150, which is a big deal. The delta between those two is effectively that when LDN apps come in, they may often get an LDN that has DataCap before they're even approved to receive DataCap, and so in some cases it works out like that.
C: But then I take the number on top, which is the total amount that's been put into the multisig for them, and add all of that up, and that number is 98.5. We're now at about eight pebibytes through the regular notary allocations that's still being dished out, and only about, I think, 48 of that is actually out and used. So, making good progress.
C: Of course, we're not there yet, but we do have several months, and we're no longer an order of magnitude off from this. The utilization of this DataCap is also going much better, in that, even though from that 98.5...
C: ...only about two or three pebibytes have made it to clients right now. In total, I think client addresses have received more than five pebibytes of DataCap as of last week, and that's also a massive milestone. Congratulations to all of you! That is epic, in terms of the amount of DataCap that is available in client addresses today for deal making.
C: So even though there's no longer that much of a delta: six months ago, that's how much DataCap existed in the entire network, so it's great to see the order-of-magnitude improvements since then. We still have a lot more to do here. The main things I want to call out are improvements to the LDN flow.
C: I think there will probably be even more to come on that, and in general, figuring out how we can make the notary role a little bit more lightweight and more consistent in operation, from the standpoint of where it creates overhead. Not necessarily making the role easier from a thinking and subjective-analysis perspective, because that's the whole point of having humans in the loop, but rather making the things the humans need to make decisions more easily accessible, with better UX.
C: So that the actual cognitive overhead of you as a notary saying "oh, I need to go view my applicants, look at the LDN applicants, and decide one way or another" goes down, because the tooling is fun to use, or at least nice to use, and everything is working, which is not the case today.
C: As many of you have seen, you've run into a bunch of issues with the interop between the bot and GitHub. There are random things: certain messages that are being signed aren't showing up, some of the DataCap that's been granted doesn't show up correctly in GitHub, and so people end up signing a third message as opposed to approving the second one, and we now have a third proposal in the mix. There's still work being done on that actively, every single week, to try and improve that pipeline.
C: There's also building a little bit more transparency, because a lot of what's happening with the tooling today is that we'll hear feedback from you, or see issues, and then we go disappear off into this little room with the team (and you know the team, because we've tagged them in a bunch of threads and things like that). It would be great to get a little bit more transparency there, so that you can also see how some of that tooling is working and where it's being effective or not, and so we're going to make an effort to make those things a little bit more public.
C: Second up: time to DataCap, my favorite metric. This is directly correlated with building a low-friction client onboarding experience and ensuring that clients actually have a good time getting that DataCap and getting to use this network to onboard their data, because we need that to be a good experience for demand to actually show up and be served correctly.
C: On this one, I think we're still quite a ways away from our goals, other than on one front, which is that I think in general the community is in agreement that we'd like to get the automated verification path up to much more than just 32 GiB. The number we've been tossing around is about one tebibyte; there's a whole GitHub discussion on this from last month, and lots of folks on these calls are generally supportive of that, so we'd love to put that into the next steps in terms of implementation.
C: Now, it would be interesting to see if verify.glif.io wants to take that on as part of their notary function, or if it turns into a different way in which we achieve this. So I think the push on that front, at least on my end, will probably be figuring out with you all what the right shape for that is, and what the level of KYC needs to be: is GitHub still enough? Maybe we start with GitHub and change it.
C: I don't know, but the more tactical discussions need to start happening on that, and I'm excited to see the difference that will make. Other than that, I think we still have a ways to go, but at least we're improving the transparency side of things; the reporting is improving, and we'll get to be better informed about how things are actually happening.
C: Okay. Ray has also joined the team, for those of you that didn't meet him at the previous notary governance call, and he's specifically focused on the notary experience. He and I have been talking a little bit about the general notary dashboard/leaderboard thing that I think I showed you all about a month ago, and hopefully that will come to fruition in the next month or so. Now, on to the next slide. Sweet. From a risk-mitigation standpoint, we've got two dashboards that we've been showing for a long time: the excellent filplus.info, built by Laura and the FileDrive team, and then the long URL that Galen and I have been working on.
C: It also looks like, as of last week, there are actually two new dashboards: there's pluskit.io, and atpool.com made one where, on their main Athena explorer for the blockchain, there's a DataCap statistics page you can see (Galen is showing you with his mouse), so they've got a whole DataCap statistics section as well. Firstly, great work by both these teams; awesome to see. I didn't even know this was being worked on.
C: I know somebody had mentioned they were looking into the atpool one, but I don't even know the team behind pluskit.io, and I like the unique stylization of both of these: the UX, the UI. It's pretty neat. It's also great to see additional independent thinking in this space, but it also matters for the integrity of the program.
C: The integrity of the data, right: having over-reliance on one methodology is probably bad, and we definitely want access to more reliable data, tracking, and transparency in how DataCap itself is working and how the system is working, so that we can build more informed opinions without bias, or with reduced bias, ideally, on the general state of the system. So if you're watching this video and you built one of these, please let us know in Slack.
C: It would be great for you to come present them at a call like this, and this goes for all of you here that didn't build these but might be building other things. So, for example, the Matrix Storage team was here about a month and a half ago and talked about some of the tooling they're building for notaries.
C: These calls are a great place to share progress that you're making on projects you're building in the system, so please, please, please feel free to come do a little presentation, chat with us, and see how you can continue improving, developing, and partnering with different folks in the community. But also, secondly, if you're working on this, make sure you give us a way to give you some feedback, whether that's in Slack or elsewhere.
C: Actually, I wanted to propose: maybe it's worth doing a dashboard sync, either at this call or at a separate call that we stand up, maybe once a month or something, where we chat about data discrepancies and data integrity, so that the pipelines themselves are robust and people can rely on them. Alternatively, it would also be good to have a minimum requirement for what a dashboard needs to be for us to talk about it a lot or use it, and I think one of those things is probably:
C: We need to start publishing methodologies: how are people actually tracking stuff on the network, how is it being processed, and how is it therefore showing up on a dashboard? Galen, you've got a hand up.
C: Cool, yeah. I did ask in Slack to see where we can give some feedback and learn more about it, so hopefully we hear back from the team soon; I think that'd be good. There's an efficiency parameter, right, which is like a per-region thing.
D: What about the... how do you usually measure it? Because this efficiency is measured over, I think, a long time; when we applied last time we talked about this as well. So I just wanted to know: do we have to close the issue as soon as possible, or, each time there is a new comment, is it just pausing until the client is answering again, and all of that?
D: I have an example, right: I have one client, and they sent two requests in parallel. So I just work on one, and I say I'm pausing the second one until the first DataCap is completely consumed. Is that what I'm supposed to do, or should I just close the second one and then apply again later on?
C: Are those requests for the same project?
C: I think that one is up to you. In general, the thing is that there's an aspect of this that needs to be better, where the clients know what to give to a notary up front. Today we collect such basic information, and then each notary has a set of questions that they post, and then the clients go away and come back like a week later.
C: I think we could improve a little bit there, and it's the general topic of conversation I wanted to have in the open discussion today, so I'm not going to preempt it too much, but it was around the idea of having more consistent...
C: ...types of things being asked in due diligence. And I think the right boundary is probably geographic, on a per-country or geopolitical basis: what can we do in the app that basically says, oh, it looks like the notaries of Europe, or the notaries of France, believe that at least these eight questions have to be answered, and then maybe there's more beyond that?
C: That's being lost today, in reading those questions and having to do the analysis to get back with that data. And then I think there's the second angle: even if I agree with you that the amount of time it takes a client to do their own preparation should not be counted against a notary's capability in delivering, then the delta between when they thought they could start making deals and when they actually get to make deals is going to be longer, as is the amount of preempting a client has to do and the overhead they take on to anticipate needing DataCap increases. So finding things we can do in the system to prepare them better up front, to get that data up front and reduce the amount of interactions, is probably my interpretation of what you're saying.
C: That makes sense. So, if you pull up pluskit.io and scroll down, this is very similar to what Andrew's hosted dashboard also does, which is: you've got allocated...
C: Oh, and you've got average time. I don't know if the average time in this one includes allocated and unallocated, as with the TTD that we are measuring, which we have not published on the partner dashboards yet, because I don't think we're actually tracking it on a priority basis; we're currently just doing averages for the ecosystem. We look at allocated specifically.
C: So we look at the time at which the first request came in from a client to the time at which the message was posted to the chain; that amount is the TTD, and then we average that across all the applications. So if a client delays by a week in responding to you, that counts against you, or rather against the system, I guess, in this case, because we're aggregating that for everybody and then saying: oh, Filecoin Plus on average takes 11 days to get it.
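The TTD averaging described here can be sketched as follows; the timestamps are made-up placeholders, and the sketch assumes, per the description above, that the clock runs from the client's first request to the allocation message landing on chain:

```python
from datetime import datetime

# Hypothetical sketch of the TTD (time-to-DataCap) methodology described
# in the call: per-application duration from first client request to the
# on-chain message, averaged across all applications.

def average_ttd_days(applications):
    """applications: list of (first_request, message_on_chain) datetime pairs."""
    durations = [(on_chain - first).total_seconds()
                 for first, on_chain in applications]
    return sum(durations) / len(durations) / 86400  # seconds -> days

apps = [
    (datetime(2021, 11, 1), datetime(2021, 11, 12)),  # 11 days
    (datetime(2021, 11, 5), datetime(2021, 11, 16)),  # 11 days
]
print(average_ttd_days(apps))  # 11.0
```

Note that, exactly as discussed, any client-side delay between those two timestamps is included in the average, so it counts against the aggregate number.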
C: We're also trying to do that for multisig; it's currently aggregated and rolled up in the same thing. What we're working on right now is separating it, so that we can say: oh, for LDN versus not LDN, and for individual notaries, because we'd like to see that broken out. I think that will also be a good lesson, from a systems-design perspective, for the community: do we even like this model? Is there a better model, or do we want to move more? Because what if it turns out the multisig is working better?
C: It's closer to the step of more automation, more like a DAO. Maybe we just start figuring out mechanisms to turn that into how groups of notaries work, right? If it turns out a lot of notaries have a consistent due diligence process, then maybe, instead of applying to one notary...
C: ...you apply to a group of notaries every time, and that group is determined by having the same, consistent type of due diligence process and the region they're coming from, or the language they speak, or something like that, and that turns into a better mechanism. So I think it's absolutely paramount that we track that separately, and we're working on that pretty actively today.
C: So, on the previous slide, sorry to do this. There you go, on the left: this is our goal, which is that for an allocation that's less than one tebibyte, it should take less than an hour, and for one to 100 tebibytes, it should ideally be within a day. I think anything beyond that, which is a bigger scale of massive-scale data sets, will probably need more time, and so that turns into more bespoke review. Today, where we are is that even for one tebibyte...
C: ...the average TTD is 11 days, and for anything that is more than that, which often ends up in LDN (greater than 50 pretty much ends up in LDN all the time), it ends up being three to four weeks. So actually, what we also should do is split this out into three buckets: a zero-to-one, a one-to-fifty, and a fifty-plus, or a one-to-one-hundred and a hundred-plus, something like that, so we can differentiate across them.
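The size buckets proposed above can be sketched as follows; the thresholds and target times are the ones tossed around in this call (under 1 TiB in under an hour via automated verification, 1-100 TiB in about a day, larger sets reviewed bespoke), not a finalized scheme:

```python
# Hypothetical sketch of the proposed TTD reporting buckets by
# allocation size, in tebibytes. Labels and targets mirror the call's
# discussion and are illustrative, not a decided policy.

def ttd_bucket(size_tib):
    if size_tib <= 1:
        return "0-1 TiB: target under an hour (automated verification)"
    if size_tib <= 100:
        return "1-100 TiB: target about a day (individual notary or LDN)"
    return "100+ TiB: bespoke review, more time expected"

print(ttd_bucket(0.5))
print(ttd_bucket(50))
print(ttd_bucket(500))
```

Reporting average TTD per bucket, rather than one ecosystem-wide average, would make the 30-second automated path distinguishable from the multi-week large-dataset path.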
C: Like: oh, you came in with an automated verifier, that's 30 seconds. You came in to apply to an individual notary or a group of individual notaries, that's like a day. You came in and applied to LDN, that's also a day, or something like that; or maybe LDN is expected to be faster because there are even more notaries. But in general...
C: ...I think there's also this aspect of needing to correlate, and I think this was brought up earlier by Copy, the notary from Finbushi: it's ironic that sometimes the LDN process is easier than some individual notaries' processes, and that should not be the case, because we should better correlate the amount of DataCap with the amount of trust being established with the client. That's a general problem space I've been trying to do a little bit of thinking about in the last couple of weeks, which is: how do we propose...
C: ...the more trust you can establish, the more likely it is that you get verified deal making on the network at scale? You're almost trading off the completely anonymous nature of the internet for benefits in deal making, and so, if you can prove that you are an entity with high reputation in the world, and you can do that in many ways such that we can't doubt it...
C: ...then, arguably, you should be considered somebody worthy of receiving whatever DataCap is required, versus somebody who is unable to do that or doesn't have as much generic reputation in the outside world. So correlating the trust that you establish with the access to this resource a little bit better is probably a general thing I would recommend everybody think about, because it is an existential thing for this program: we have to be able to do that right in order for this to be truly useful.
A: So there are some open FIPs that we want to talk through and bring to everybody's attention, because they are for the entire community but are sort of specific to the Filecoin Plus community. There's FIP 203; I haven't read through all of the comments to date yet, but effectively I think they're proposing that we remove notaries entirely and just give anyone any amount of DataCap, which is, I would say, a controversial opinion for the people on this call.
A: Their argument is that the notaries are creating centralization, and so they are proposing that we get rid of notaries. Some of the other takeaways we flagged here: what is the notary role and the election process, and how could those things evolve? Is it centralized? How could it be less centralized in the long term? Is it a current high priority that it be as decentralized as possible, given the status of the network?
A: Those are some questions, which also lead to questions around the goals of the Fil+ program. So this one is in here as a FIP discussion, removing the Fil+ notary process; it currently has eight different comments, so go check that out. I'm going to pause here and see if anybody had any top-of-mind questions or opinions, having already read this one, that we wanted to capture here, knowing that I am not the author and can't defend or argue any of the statements made. Pausing to see.
C: Yes, I've been going back and forth a little bit on this one, and will probably continue to. I do think there are some interesting claims being raised, and some thinking areas for us, but so far I have not yet been surprised: a lot of the stuff being flagged as valid criticism is stuff that we've all talked about and are actively trying to work on.
C: There's the more generic claim: oh, Filecoin Plus itself is centralized, because we elect notaries, and then those notaries have their own process and therefore dictate, in a centralized way, the future of a client. I think that's fair, but I don't know if I would term that centralized, because ultimately the process for selecting those notaries is discussed and decided in a slightly more decentralized fashion, and that role is today very, very select.
C: But I think that's probably going to change: as we scale up the operation, there are probably going to be more notaries, some notaries that are able to engage at a much higher level and some that aren't, and so, as we continue to develop the program, I think the original fear behind this post will slowly abate.
C: It also raises this interesting point: do we just want to revert to people who can stake making decisions, following the decentralized model of, if you have FIL, you stake the FIL, and then you get voting power and everything depends on that? Or just give everybody DataCap, because everybody should just get to make deals for free, kind of thing?
A: Okay, I don't see any other hands coming up. So the other open FIP, which I think we talked about two weeks ago (we may have talked about both of these), was DataCap management.
A: I think a lot of the conversation on the question of whether we should be able to transfer DataCap from client to client has trended to no: we don't feel that that is a necessary or appropriate use of DataCap in the program. However, having the ability to remove DataCap from a verified client, and further to remove a client from the verified registry, is something that would be useful, and there are a variety of different use cases for doing so. So what we wanted to talk about here today with this team of notaries is specifically: what if we were to add a new command? How would that specifically work, if that command is to remove DataCap from a verified client address? Now I want to pause and make sure that we're all talking about the same thing.
A: Should we, can we, and how would we change things by removing that amount of DataCap? And, like we've said, there have been various use cases where I think this is an appropriate thing that we should have the ability to do. Some of the ideas that we've come up with here:
A: If a client address has had DataCap for a certain amount of time and not used it, or not made deals. We could also decide what "inactive" means: if they're making deals on the network, but not making deals with DataCap, are they active or not? So there are some questions there. Before we share our opinion on some of these ideas, I just wanted to see comments from the experts here: show of hands, unmute, and jump in.
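The "inactive client" idea floated above can be sketched as follows; the six-month window and the fields are illustrative assumptions, not a decided policy, and the open question of whether non-DataCap deals count as activity is left out:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of an inactivity check for a verified client:
# DataCap held past some window without any of it being spent in deals.
# The 180-day window is a placeholder, not a program rule.

INACTIVITY_WINDOW = timedelta(days=180)

def is_inactive(granted_at, datacap_spent, now):
    """True if the client has spent nothing and held the grant past the window."""
    return datacap_spent == 0 and (now - granted_at) > INACTIVITY_WINDOW

now = datetime(2021, 11, 23)
print(is_inactive(datetime(2021, 4, 1), 0, now))   # held > 180 days, unused
print(is_inactive(datetime(2021, 10, 1), 0, now))  # still within the window
```

A real policy would also have to decide, as raised above, how to treat clients that are active on the network but never spend their DataCap.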
C: Also, if there are any other ideas: this is just from me chatting with Galen yesterday, after we went to one of the implementer syncs, which is the teams working on various implementations of the spec, resulting in things like Lotus and Venus being built. Their ask was effectively: yeah, feel free to recommend something here, because I don't think they have an opinion, and they're looking to us as a community to do that. So these are just to get the conversation started.
F: Hey, I hope I can be heard; it's Danny here. So I guess my question here is: how many key holders are there? When we say two root key holders, does it make sense to express that as a number of key holders, or as a ratio of the total number?
C: Yeah, so I think right now there are four or five organizations that are signed up to be root key holders, and we're actively in conversation with at least two more. The aim is to get to seven. In general, there are some proposals for re-architecting how that would look that will show up as FIP proposals in the near future, but the denominator that you're looking for is still probably going to be seven, per the original system design.
C: So when it says two root key holders sign, that's two signers, representing two entities or two organizations, out of the seven organizations that are currently part of the set of root key holders, which is the same as what it takes today to get a notary created. So that's the same threshold, and arguably it could be a little bit too conservative, because this is just removal from, or granting to, a client, which is, I think, slightly less consequential than removing or granting DataCap to a notary.
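The 2-of-7 threshold described above, and Danny's question about expressing it as a ratio, can be sketched as follows; organization names are placeholders, and the total of seven reflects the planned root key holder set, not the current four or five:

```python
# Hypothetical sketch of the root key holder threshold discussed above:
# an action proceeds once signers from 2 distinct organizations (out of a
# planned 7) have approved. Names are placeholders for illustration.

APPROVAL_THRESHOLD = 2
TOTAL_ORGS = 7

def is_approved(signing_orgs):
    """signing_orgs: set of organizations whose key holders have signed."""
    return len(set(signing_orgs)) >= APPROVAL_THRESHOLD

print(is_approved({"org-a"}))                       # one org is not enough
print(is_approved({"org-a", "org-b"}))              # two distinct orgs suffice
print(APPROVAL_THRESHOLD / TOTAL_ORGS)              # the ratio form, roughly 0.29
```

Expressed as a ratio of the planned set, the threshold is about 29 percent, which is the framing Danny's question points at.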
F: I guess it would be hard to programmatically define a sort of ratio, but yeah. Speaking as someone who was initially, instinctively, kind of against this proposal, because it's been kicking around for a while: I think that at least having some collective action, if we're going to do it, should be the way of doing it. But that's probably obvious.
C: Do you still have any serious concerns that are worth flagging? It's also a good time to do that. This is specifically for removing verified status or removing DataCap from a client, and the cases, as Galen described, are inactivity, or known fraud, slash a dispute thing, because we finally have a consequence that we can use in the system.
F: Yeah, I want to think it through a little bit more, but I don't quite see, and maybe there's a discussion on this, what the issue is with inactive client accounts, because it's not like we're returning that DataCap back, or that it has any use if the client is inactive.
A: The two sort of inactive use cases that have come up in my mind, that people discuss: one of them is that you have lost access to that address, right? You lost that wallet or whatever, and so in a situation like that you're like: oh, I can no longer get to this amount of DataCap, and it sits there and is inactive, and we want to clean that up. But the other reason for inactivity goes to a sense of statistics and usage: if we ask, how are these mechanisms working?
A: How effective is our system? How efficient is it? If we are giving DataCap to people through a certain pipeline, and it is sitting there for six months and not being utilized, that is going to color all of those dashboard metrics for what the program as a whole is accomplishing. So by having a mechanism to remove those inactive holders, it shows a better picture of: for the people that are actually trying to use DataCap, how is the system working for them?
A: So in that way, our dashboard metrics are more effective. It also, I think, creates a mindset in a client that, while this is supposed to be a limitless, abundant resource, it is also a resource with purpose, and the purpose is not to hoard it; the purpose is to have it flow through the system. So this creates an incentive to make deals with your DataCap, and that incentive is:
A: We will take the DataCap away if you're not using it the way it was intended to be used. And so, in that way, it's also meant to be a bit of a prod not to hoard a thing that does you no good sitting at your client address.
A: And then: if you are a client who has gotten some amount of DataCap, haven't gotten any more, and are just sitting on it and not doing anything with it, is that a different behavior than the person who is just going out requesting, requesting, requesting, trying to accumulate DataCap without doing anything? Those are two different behaviors as well.
C: There's also the risk of someone wanting to sell an address, right? This has been a long-term concern: DataCap is lucrative. There were, at points, markets for DataCap, where people were paying people to make deals with them for DataCap. Why create a risk where someone says: oh, I got a hundred tebibytes of DataCap, decided not to use it, just going to sit on it for a year and wait for somebody to offer me cash for it? It's like: no, that's not a good setup for the integrity of the system.
F
Do we interact at all with the cryptoeconomics team at Protocol Labs? I have some questions like this because, again, I don't know the ramifications, but I'm sitting here thinking: another issue is the effectively inflationary process of just constantly handing out datacap. I wasn't around when the Filecoin Plus system was built out, and I wonder if there's some aspect of this that we're not considering that the cryptoeconomics people would spot.
C
Good question; I think that's a valid point, and it's probably worth flagging. I would also expect that engaged cryptoeconomics folks should already be looking at these discussions, because they're among the most important things in the network, but I will re-flag it to a couple of folks just in case. That's fair, yeah.
F
I mean, maybe just sitting down and having a sort of public discussion where we run through these things, because I think that might also be useful for the other discussion, which, I mean, I haven't read yet.
F
Thank you for drawing it to attention. But on FIP 203, you know, the main compelling argument for me for this whole process is cryptoeconomics, so I think it would help to draw out both how things like this might affect that side of things, and also to kind of reaffirm that that's the reason why this exists.
C
I think we should continue that. I did want to just say, though, sorry to keep getting you on this calendar, that it would be good to see folks here at least take this as food for thought on, one, whether or not you have a strong opinion on...
C
...whether or not this should happen, because you should share that immediately on GitHub. But also, secondly, on the implementation standpoint: what do we actually need for it to be considered fair to tell a client, "yeah, you're going to get it, awesome, thank you"? What do we actually need as a community to feel safe about a decision like this?
C
So we should ensure that there's some aspect of multisig happening here, where it maybe takes in signatures from multiple multisigs or something. We have the opportunity to experiment here and also create a new kind of signing construct that could be beneficial for us down the line in Filecoin Plus overall. And so I really like the idea of combining notary signatures and a root key holder signature, because that's something we haven't done before and it could be interesting and beneficial, where it's like:
C
Not only is this a higher bar, where two or more notaries have to sign, but they also have to ensure that it's documented and auditable in a way that an unbiased observer who's not plugged into the network, say, security folks at highly influential organizations, can look at that data and say, "yeah, this seems reasonable, this all makes sense."
C
There's an audit trail here that I can follow and give a thumbs-up to, kind of a thing. So my opinion would be: let's do some combination of notaries and root key holders. I don't know what those numbers should be. I see Charles is suggesting one notary plus two root key holders. Charles, then my question to you would be: does that notary have to be the one that gave the datacap to the client, or can it be any notary?
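One way to picture the combined-signature idea being discussed is as a simple threshold rule over two signer groups. This is a hypothetical illustration only; the addresses, function names, and thresholds are assumptions for the sketch, not the actual FIP 204 design.

```python
# Hypothetical sketch of the rule discussed above: a datacap-removal proposal
# passes only when enough notaries AND enough root key holders have signed.
# Thresholds and addresses are illustrative placeholders.

def removal_approved(signatures, notaries, root_key_holders,
                     notary_threshold=2, rkh_threshold=2):
    """signatures: set of addresses that signed the removal proposal."""
    notary_sigs = len(signatures & notaries)
    rkh_sigs = len(signatures & root_key_holders)
    return notary_sigs >= notary_threshold and rkh_sigs >= rkh_threshold

notaries = {"f1notaryA", "f1notaryB", "f1notaryC"}
rkhs = {"f1rkh1", "f1rkh2"}

# One notary signature is not enough on its own, even with both RKHs.
print(removal_approved({"f1notaryA", "f1rkh1", "f1rkh2"}, notaries, rkhs))                # False
print(removal_approved({"f1notaryA", "f1notaryB", "f1rkh1", "f1rkh2"}, notaries, rkhs))   # True
```

The appeal of a rule like this is exactly what is said above: it raises the bar (no single party can remove datacap) while leaving an auditable trail of who signed.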
C
Because then I would say that you probably want the same one, to preserve the due diligence and the learnings from that experience. But then there are repercussions: what if that person is no longer a notary? So I think we have to do a little bit more thinking on this. I'll continue to engage on GitHub and Slack in the next couple of weeks, but I would love it if by the following governance call...
C
...we have a strong opinion that we can take back to the implementation team and say, "hey, here's our initial hypothesis for what we should do." That way it allows us to get into the next big network upgrade, which will come in roughly the February timeframe; that will be timed well with all of our other end-of-Q1 goals, and then we can iterate from there together.
A
Okay, we do have to move on. K-Ray is here with some community updates. K-Ray, you want to unmute and jump...
H
...in on this one? Yeah, unmuted. Galen, thank you for that. The first bit of information I want to bring to light is that on January 4th there will not be a call. You're going to see this posted in the repo, you'll see it posted in Slack, but you hear it here first: January 4th, there won't be a call. The next week, however, we will resume, so on January...
H
...11th, that'll be the first call of 2022. Now, the other thing to bring to light is how we are getting this information out to you and collecting your signal. Those of you on the call can listen, you might thumbs-up with emojis, you might unmute and weigh in, but sometimes we want to look at what other ways we can get that feedback.
H
How are we enabling you to participate in the community? So in the next couple of days or weeks you'll see a feedback survey come from me, and I might ask for time one-on-one with you, just to chat, get to know you, how you communicate, what you're looking for in the program, and what thoughts you have that might not always be easy to post to a GitHub repo or to bring up in a call like this.
H
So just a friendly heads-up that I'll be reaching out to you in the coming days and weeks to get a feel for what we can do and to get your feedback on some of these questions we're asking. The last item is just to keep you updated: we're submitting the RFP for the notary leaderboard, so in the coming weeks you'll see this come through.
H
We'll keep you updated as this process unfolds, but we hope to solicit bids in the coming days and weeks, and we hope to have that leaderboard up in early 2022. So if you took nothing else from this raspy, cold-throat voice right now, it's that you'll see that January 4th slot disappear from your calendar and a brand-new series start up on January 11th, along with some new invites to start talking one-on-one about what you like and what you want from this program. Thanks, cheers.
A
Awesome, thanks, K-Ray, fantastic. There were some other notary governance discussions, not in the form of FIPs, that we needed to talk through. We have ten minutes, so we're going to try to do this quickly. Number 319: datacap distribution expectations across multiple storage providers. We've gotten two examples of this, large dataset application #36 and large dataset application #79. #79 is Shinshu and #36 is Ecology Limited. In both of those, in their application...
A
...when asked about this, you can go read our questions and their responses. I'm sort of speaking for them in summarizing this, so I encourage you to go read it yourself as well.
A
One of the claims they make is that they have attempted to reach out to more storage providers to make deals, but have not been successful, and that is why their deal-making behavior is the way it is. So we are in a place where we need to decide:
A
Are we still comfortable awarding more datacap to these LDNs that are not operating on the terms of their initial application, with the hope being that, if we give them more datacap, they will be better able to make more distributed deals? Or is this the kind of thing where we want to pause them? When do we as a community want to move forward, specifically with these two, but then broadly with large datasets that don't follow something they put in their application?
A
So I want to stop here, and let's see if there are two or three minutes of questions or conversation around this issue.
I
Hi, Galen and all the governance members. I'm Tim from Ecology Limited; I'm a systems engineer there and participated in all the onboarding steps for the Filecoin network. Do I have several minutes to describe the difficulties and the severe issues we have met during the past several months?
I
Okay, if time allows. Okay, thank you. So we have been struggling to onboard our data onto the Filecoin network. At the very beginning, before we started this process, we looked at the Filecoin storage provider miner list: more than 3,000 miners are sitting there, so we thought finding 30 of them would be easy. But we were wrong. At the very beginning we tried the online storage deal method, which seemed reasonable to our engineering team, because everything would be automatic.
I
We would just publish some storage deals and everything would be automatic. But unfortunately, unless we divided our data blocks down to a size of around 128 megabytes, and even then with those small data blocks, we could not get a stable result. A lot of miners just replied that they don't accept storage deals, and some of them bounced back.
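The splitting constraint the client describes can be sketched as a simple chunking pass over a large dataset. This is purely illustrative: real Filecoin deal tooling works with power-of-two piece sizes and padding, which this sketch ignores.

```python
# Sketch of splitting a large dataset into pieces no bigger than ~128 MiB
# before proposing online storage deals, as the client describes having to do.
# Illustrative only; actual piece sizes and padding rules differ.

MIB = 1024 * 1024

def chunk_offsets(total_bytes, chunk_size=128 * MIB):
    """Yield (offset, length) pairs covering total_bytes in chunk_size pieces."""
    offset = 0
    while offset < total_bytes:
        length = min(chunk_size, total_bytes - offset)
        yield offset, length
        offset += length

# A 1 GiB file becomes 8 chunks of 128 MiB each.
chunks = list(chunk_offsets(1024 * MIB))
print(len(chunks))  # 8
```

The pain point is visible in the arithmetic: hundreds of terabytes per day at 128 MiB per deal means millions of individual deal proposals, which is why the client moved to offline transfers.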
I
Our company relies on a large amount of online trading data from all kinds of exchange platforms. Every day we collect hundreds of terabytes of raw data into our system, and our data engineers use that data to train our models. Mostly we use the recent data within one year, but the cold data we need back on roughly a quarterly basis, when we revise our models and bring the old data back.
I
Given the data we generate every day, we found a couple of miners, first from North America, because our data is located in the United States, so we thought North America would be a good start.
I
Then we found that even when we find a couple of miners willing to collaborate with us, we have to engage the technical teams from both companies deeply to get the whole process done.
I
It took us around two weeks for both teams, engaging with each other deeply, to transfer the data to their site and have them seal the data for us. Finally we got it done. Then we started looking for more miners who could provide offline-mode data storage.
I
We found some more miners from China, but then we found that the data transfer speed is so slow that it took forever to push even 10 terabytes to the other side. So eventually it led to the situation where we only have three miners serving us, and the datacap is also very imbalanced, because the North American ones grabbed the majority of the data, and the China miners...
I
...only have a small portion, like 30 percent. But still, we are trying to look for more miners who can support us. And again, because we generate so much data per day, when we're looking for miners who can support us, we need to check in great detail to make sure they can really seal that much data for us per day. We also found that some miners claim sealing capacity they don't deliver.
I
I don't know what's behind those claims, but the majority of the miners cannot satisfy our speed and quality requirements. We're still trying to reach out to more. The only issue here is that we find the whole process kind of painful for our engineering team.
I
We don't know the quality of these miners, so we have to do our own data analysis. We try to grab data from the network, analyze how much data each one hosts, and assign a quality score from our own side. But we don't understand the protocol very well, so it's quite superficial and quite inefficient.
I
So even today we're still struggling to find more high-quality miners willing to work together with us to hold our data. But then we ran into this datacap issue, so here I am, trying to explain everything to you. We have been in the crypto industry for a while, and we really want to satisfy our storage requirements within the crypto world, but yeah.
I
We have been working on this for quite some time, almost the past two months, and we are looking forward to the final decision from the community: whether we can get the datacap and move on, or whether we should find other miners for our data. So thank you, thank you all for the time, and hopefully this can give you another point of view from the client side. We really have a large amount of data and need the storage, but we are suffering. Thank you.
A
Yes, thank you, Tim. Sorry to cut you off; we are at time here, but we really appreciate you coming to this call and sharing all of that. I think this gives us a lot of really good insights because, as you explained, we don't always have the whole picture. Deep and I both have some thoughts; maybe we could get some back and forth going in the public Filecoin Slack, and then we can also make some comments back and forth in the GitHub issue. But as we are at time and we're losing some of the notaries, maybe we could just see in that issue...
A
...whether we can get some comments from some other notaries on what we want to do to move forward. I do want to encourage other people: go check out these discussion topics, which link to those different issues, and you can see where they're at and put some more information in. The other one...
A
...which came from Andrew Hill, is around reducing constraints on requesting additional datacap; check that one out. I think that one's a little less juicy, more around some documentation and some things that need to get corrected around documentation. And then there's potentially a new discussion, which we definitely don't have time for, around different types of addresses.
A
But we may just have to talk about that in two weeks. So, as we are at time, I want to thank everyone. Check out the open FIPs, check out the open discussion topics, check out the open client issues and large dataset issues, and please comment there. Thanks, everybody, I'm going to go ahead and stop the recording.
A
Great, happy 4 p.m. Pacific time, everybody. Welcome to the notary governance call for the 23rd. We're going to just go ahead and jump in; action-packed agenda. We're potentially going to run through these top two sections a little bit faster on this call, because we didn't quite have as much time to discuss some open issues as we wanted to earlier.
A
Moving on, so be prepared for some faster metric updates. Pulling some stats from our interplanetary dashboard, we're seeing some happy average times of two days, and one day three hours to allocation in the past 30 days. That's spectacular!
A
A lot of that is still taking into account our automations, so our average time for notaries is still at 11 days; hoping to get that number a little bit lower. We're also still working on getting some statistics on LDNs, so two of our teams are working with them on having endpoints and status changes tracked, so that we can then start reporting on some of the timing.
A
You know, time to root key holder action, time to first allocation, time to second signature, that sort of thing. Looking at our datacap funnel at a really high level: the amount allocated to clients from the direct notaries, we're seeing at about...
A
...two and a quarter pebibytes, which is about 30% of the datacap that has been given to the notaries. Comparing that to LDNs, we're already at 3.8 pebibytes to clients from those LDNs. An interesting thing to call out: we're seeing a fascinatingly close similarity in the percentage as the funnel decreases from client to deal with a storage provider. For LDNs that's at 1.6 PiB, which is about 40%, and it's very similar for direct client behavior as well.
A
985 tebibytes, again about a 42% drop from what has been given to them by the notaries to what they've sealed in deals. So there could be some interesting behavior patterns there that we may want to investigate. Speaking of those notaries, some direct notary activity stats: we've got about 27 open issues...
A
...as of yesterday, of which about eight in the last week are new; that number dropped a little bit. There are 253 granted in our client onboarding repo. That number may not perfectly line up with all of the actual client applications that were granted, since sometimes those get deleted, but that's how many are there, and we're seeing about 15 in the past seven days. 63% of those applications are getting a first response.
A
That's been pretty consistent, and we're seeing about 24 out of the 79 clients that put in an application getting datacap per month. So, not bad. We'll talk a little bit later on about some of the leaderboard things we've mentioned before, around better ways to display some of these stats on a live web surface. From there I'm going to pass it to Deep to quickly talk through some of our Q1 goal updates.
C
Crazy that we're talking about 2022 already; I feel like this year has absolutely flown by. Nice to see all of you for the second governance call today. I'll just talk through the same parameters that you're used to hearing from me on over the last several governance calls now. The first of which is volume, where we want datacap to be abundant and available to all clients that are worthy of receiving it.
C
A cool stat to update here: when tracking through LDN applications that have either been granted in the past or are currently in the granted state, and adding up the total amount of datacap that was allocated for them, that number is now at 98.5 pebibytes. That includes, of course, the applications that went through the first LDN process, which was maybe, I think, in the high 20s or low 30s of pebibytes...
C
...if I recall correctly, and then net about 60 additional. That's out of the 100 PiB that we earmarked for this program, so we should also talk about the continuation of this, whether we think it's good or not and how we want to adjust it. Perhaps we can keep that for the next call, or for the discussion section of this one. But the TL;DR is, between what's been given to the notaries and what's actually started to be granted to clients...
C
You know, six months ago that number was at two pebibytes, three percent. So this is a big achievement in terms of whether we feel like we're at least getting to the point where we're comfortable allocating datacap at a much larger scale and supporting clients that are bringing on much larger use cases.
C
Of course, there's a lot of work to be done still, not only in terms of getting this number substantively further up, but also ensuring that we're doing it in a way that is safe, that it's being tracked, and that the clients are empowered to actually use that datacap well. So, right now: we passed another milestone a couple of weeks ago, where five pebibytes of datacap had actually made it into the actual addresses of clients, which is also a great milestone.
C
Considering that three months ago we were at less than one pebibyte, that's more than 5x in the last quarter. But the utilization of that datacap is still hovering around 40%, which means that clients are not able to action 60% or so of the datacap that they're receiving today, and that number has been pretty consistent.
C
It's been between 30% and 40% over the last four or five months. And so, while this goal specifically tracks the volume and availability of datacap, ensuring that it's there for the people that need it, there's always that little asterisk to keep in the back of your minds: ultimately it all comes down to supporting clients, ensuring that they have a good experience and are able to leverage this as a tool to be set up for success on the Filecoin network.
C
Second up, my favorite metric, TTD, where we track the time to datacap.
C
This is kind of steady-state from the last call: it still takes about 11 days on average to get datacap when there's a notary involved in the loop, and this average includes people that go to the automated verifier as well. Then for things that go LDN, there are applications that have gone through in about three days, but at the tail end of that we're still looking at three to four weeks for open applications.
C
There are two things that I want to call out here. The first I did call out this morning; the second I failed to. The first one was basically that, over the last few calls, we have generally decided to support increasing automated verification, to have a much higher scale of datacap being granted. Right now the only automated verifier is verify.glif.io, and that grants 32 GiB of datacap.
C
Let's see if verify.glif.io is interested in doing that, or if we need to put out an RFP. But I do think that getting to that magnitude will also push us a little in terms of structuring the notary role and evolving it to fit those needs, and figuring out how we want to serve the community that way: whether we really believe that GitHub-based KYC is still the right thing for an automated verifier, and so on. So please, you know, chime in on that discussion.
C
The second thing I wanted to call out here was that today, when we track TTD, we're really only doing it for successful applications. But there's an aspect of this where, if a client is not going to get datacap, them knowing sooner is also an important and good thing. And if a client is getting datacap, then engaging with them in a way that sets them up for success at the start is better, where we get more of the data on what they need to provide to us up front.
C
Today, when a client goes through the onboarding flow, they basically just end up posting basic information, right: what website they're coming from, how much datacap they need, what their name is, and so on. But I think it would be excellent for us to start thinking about when we can actually get to a stage where there's more consistency across due diligence processes.
C
This is also highlighted by a FIP that we'll talk about later in this call, but it's about leveraging all of your individual expertise to build a much more robust due diligence process. Especially because we've also found that, in some cases, the due diligence of an individual notary, which contributes to this 11-day number, is sometimes even more robust than what we're doing for some of the clients applying for LDNs. And that doesn't make a lot of sense, because the amount of effort going into due diligence and KYC should be correlated with the datacap that a client is getting. So, just some ideas and prompts for thinking.
C
I would love to hear your thoughts at the latter end of this call. Let's move on to the next slide.
C
This slide has an exciting update, which is that in the last week we had two new dashboards shared with us, and as far as I'm aware there might actually be a third one being developed. The first of which is pluskit.io, a very visually pleasing dashboard that speaks to the datacap funnel and has interesting stats.
C
They have this metric on efficiency, which I think is probably similar to TTD, though I don't know if it's actually a renaming. It would be nice to also see the methodology for how they're calculating that. I'm not sure if anyone on this call knows; please do share if so. If not, putting the ask out into the world: if you're watching this recording and you built this, please get in touch with us. Come present it at the next governance...
C
...call. We'd love to hear from you; engage with us on Slack and tell us how to give you some feedback, because it'd be awesome to develop this further. And then, in a similar vein, the Athena Pool folks also released a datacap statistics tab on their Athena Explorer.
C
So at the top you can now see there's a datacap statistics tab, and this is also similar in terms of tracking the datacap funnel and datacap flow. It's great to see more sources of this data, because it's super important to preserve the integrity and the decentralized nature of decision making, both in the immediate term and longer term for Filecoin Plus, as we move closer and closer towards automating.
C
As a lot of this functionality gets automated, having multiple opinions in the room, individual opinions, on what the state of truth is will matter, because that's how we will build a converged state of truth. And so between these two, filplus.info, which was built by Laura and team, and then the Filecoin Plus dashboard that Galen and I are partnering on right now, we've got four solid foundations to build off of and improve our understanding of the state of the program.
C
There was a thought this morning as well that maybe we want to start having a dashboards sync for Filecoin Plus, maybe once a month or even less frequently if required. Some kind of sync on, at the very least, establishing standards for what it takes to be a dashboard for Filecoin Plus: for example, publish your methodology and how you calculate certain statistics, and share your data via an API so that other people can also use it in automation, things like that. And/or let's just sync up and talk about discrepancies that we're seeing in the data, to make sure we're on the same page, or at least align on how we define certain terminology so that it's easier to interpret and compare data across inputs.
C
The following two focus areas are largely unchanged so far. Client onboarding UX, you know, that's a huge focus area for us, and I'm always open to discussing and improving how we can do that. And then, generally, Filecoin Plus governance, for which K-Ray has an update later in this call, so I'm not going to steal that thunder. I'm going to hand it back to Galen to take it away.
A
All right, thank you. So we have a couple of open FIPs to discuss. You know, FIPs normally happen around, and impact, the broader community, and it's interesting that right now there are two that are really specifically geared towards Filecoin Plus. Just a caveat here: I am not the author of either of these FIPs, but we do actually have one of the authors here on this call.
A
So maybe we will put them on blast here in a second. But before we do, going in chronological order: FIP 203, to remove the Fil+ notary process. There have been a few conversations and comments back and forth since it was opened 20 days ago. I encourage you to go read through and weigh in; there are some interesting ideas. At kind of a very high level...
A
...an argument being made is that the notary process has created centralization, and, you know, there's kind of an argument to be made for getting rid of notaries and just giving anyone any amount of datacap that they want. Some things that have come out of this: what does it look like to evolve and change both the notary role and the notary election process, and how does this map to the overall goals of the Filecoin Plus program?
A
So before I move on to the next FIP, I would love it if people on this call who have either questions or strong opinions about this would share them, so that we can capture those here. Again, this is not my proposal, but I'm happy to hear them. We'll just pause for a few seconds to see if there are any hands or questions.
C
I'm raising my hand because I have strong opinions on this one, but I have voiced my opinions already in the discussion and I'm going to continue doing that. I do think there are some useful takeaways here, though, and a lot of the feedback is good. Even though I don't necessarily agree with the intended changes being proposed, I think the feedback is valid, and as the discussion has continued to evolve, the feedback has become more crisp in terms of the problem areas that we have.
C
I think a lot of it is actually in alignment with the discussions that we've been having as a community over the last six to eight weeks. I wasn't particularly surprised by a lot of it; it's more of a "yeah, this makes a lot of sense, we should think about it" kind of thing, at least for me. I do think there are some really interesting prompts for thinking. So, as many of you here have been invested in the development of this program for a year now, a year...
C
...plus, I do think it is worth your while to read through this thread and see if you have any thoughts, because I think there are some really cool ideas being shared, especially around more distributed and decentralized governance, and around coming up with mechanisms where participation in the program is more fairly evaluated and set up for success.
C
There are also questions around process things, like: how should we do disputes and audits? How should we verify private datasets? And, to be honest, in these calls we have talked literally about these exact topics; we've all had ideas and we've talked through examples of how people are doing them.
J
You know, just quickly skimming through this FIP, I definitely feel like it's very interesting. My personal take on this is that it's hard for us to take immediate action to actually do what this FIP is proposing, you know, develop a new decentralized algorithm to replace the notary process. I definitely feel like things will get easier next year, when we have the FVM landing in the network.
J
You know, with smart contracts, and also being able to easily compute on the data that is stored via verified deals by those clients. Or, next year, if we have the retrieval market, the indexer, and tooling built around retrieving the data that is actually stored to verify the process: all those tools will make this FIP seem easier.
J
I just don't see, as of today, how we can move forward and make very, you know, useful changes to the program, like to the notary process. But I definitely feel like we should keep this discussion open, and then, whenever we have a new technology introduced in the Filecoin network, we should keep brainstorming the ideas here.
A
Yeah, I think, at kind of a high level, if I'm understanding correctly, it sounds like you don't fundamentally disagree with some of the things that are raised here. It's just that the technical solutions to be able to implement some of these changes don't currently exist, and couldn't exist, you know, in a month. So we keep this open as a place to have the conversation, while also knowing that the technical lift to do this is something that we're still working on.
J
Exactly. And another thing I was just thinking of is that this FIP maybe works really well with that other FIP that was opened a long time ago: being able to do a regular deal first and then upgrade it to a Fil+ verified deal, getting the extra power boost.
J
If we do that, we can maybe offload some of the burden of the notary process as well. But that FIP was also sort of controversial, partly because on the protocol level and the tooling level we can't support it easily. That's why, you know, I'm not sure whether to be positive or not, but I feel like these two could link pretty well in the future, and I would love to see more discussion on that.
A
Awesome, thank you. And while we have you, why don't we talk through this FIP, 204, around DataCap management. We've talked about this one previously on another call, and I think the two specific proposals being made are really about technical implementation: how would we transfer DataCap from one client address to another client address, and how would we remove DataCap from a client address and/or remove a client from the verified registry?
A
Before I turn it over to Jennifer: one of the things that we've sort of landed on as a community, across a couple of different calls and conversations, is that we don't really understand a valid use case right now for transferring DataCap from one client to another client, given the parameters of the program and the idea that DataCap is readily abundant and available.
A
So as it stands, we're focused on removing DataCap from a verified client, and the question that we have for the notaries on the call is how we would tactically want to support doing that — what would the actual system rules be? These are just some ideas that people came up with; they are by no means meant to be an exhaustive or even a prescriptive list. So something for the notaries to think through right now: what would it look like for it to be some subset of notaries, or notaries and root key holders?
A
And how would that work? But first, I'm going to pass it to Jennifer to talk through this FIP — some history about it and where we are right now, that sort of thing. Take it away.
J
Thank you. Yeah — so when I first had this thought, basically I was thinking that there are more and more data brokers in the network. We have Estuary and a number of others, and a lot of them have started to get large DataCap allocations, which I think is great — though I haven't kept up with the whole Filecoin Plus program that much these days.
J
Sometimes they will run out of DataCap, and I really don't want them to lose that momentum in getting real data onboarded onto the network. I also know there are a lot of existing clients that already got DataCap and are actually using the data broker services. So I was thinking: if they already have the DataCap, why do they have to use the data broker's DataCap? Maybe they can just transfer their own DataCap to the data broker instead.
J
So the data brokers can have one DataCap account for that — that's the main reason I created the transfer-DataCap proposal. But I went through the thread, and I totally understand the concerns that people have been raising.
J
So I definitely agree that we can put a pin in transfer DataCap and not worry about it too much right now. Like Galen said, we do want to push for revoking DataCap from clients, for multiple reasons. Can we go back to your slide, maybe? I feel like the community has a good understanding of why we need that —
J
— you know, to ensure good behavior by verified clients, to make sure the data is actually put to use on the network, all those reasons. So I don't think it's controversial whether we implement that at the protocol level. Based on the questions here, what we can do is basically anything. From an actor perspective, we can have multiple messages.
J
For example, someone can propose a revoke-DataCap message, and then we can require two or more notaries, or N root key holders, to approve that message before we actually take the action. That's one option. The other option is to make it a single message:
J
we would collect signatures offline from multiple notaries or root key holders, and then have that sent to the network as one message. That's all possible. So it comes down to a governance problem for Filecoin Plus: if this is a trust program, who can make the decision, and how many signatures do we need? As long as the Filecoin Plus community can reach consensus, I think that on the protocol level the core dev team should be able to implement it in the way you all want.
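Both of the implementation options sketched here — an on-chain propose/approve flow, or one message carrying signatures collected offline — come down to the same threshold policy. A minimal Python sketch of that check, with invented names and structures (this is pseudocode for the discussion, not the actual verified-registry actor):

```python
from dataclasses import dataclass, field

@dataclass
class RevokeProposal:
    """A hypothetical revoke-DataCap proposal awaiting approvals."""
    client: str
    amount: int                          # DataCap to remove, in bytes
    notary_sigs: set = field(default_factory=set)
    rkh_sigs: set = field(default_factory=set)

def approve(proposal, signer, is_root_key_holder):
    # Sets make each party's signature count at most once.
    (proposal.rkh_sigs if is_root_key_holder else proposal.notary_sigs).add(signer)

def can_execute(proposal, min_notaries=2, min_rkh=1):
    # One policy floated on the call: two notaries plus one root key holder.
    return (len(proposal.notary_sigs) >= min_notaries
            and len(proposal.rkh_sigs) >= min_rkh)

p = RevokeProposal(client="f1client", amount=32 << 30)
approve(p, "notary-a", False)
approve(p, "notary-b", False)
assert not can_execute(p)                # still missing a root key holder
approve(p, "rkh-1", True)
assert can_execute(p)
```

Whether the approvals arrive as separate on-chain messages or as one aggregated message only changes where this check runs, not the rule itself — which is why, as Jennifer says, it is ultimately a governance question about the threshold values.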
J
Yes — I do want to talk about expiration of DataCap a little bit more. I don't know how the Filecoin Plus community feels about all the stale DataCap that is not being used on the network right now. Is that something we want to address in the future?
A
Yeah — so why don't we pause there and just see. We have some notaries on the call: feel free, if you have strong opinions or clarifying questions, to drop them in the chat or raise your hand. So let's just pause for a second and let people digest.
C
I just want to say that I think this is a pretty interesting opportunity for us to also test root key holder plus notary interaction for signing something. It's kind of unprecedented for us — we haven't done this before — but it could be cool,
C
you know, with multiple signatures coming in from different stakeholder groups. So personally I would put in a plug: even if we do this just to experiment with it and get the implementation done, it creates interesting combinations of things that we can also do in the future. It also ensures that the root key holders — who today sort of show up and just batch-sign in notaries — get something interesting to do as well.
C
In addition to that, I do think it's a pretty cool way to integrate multiple stakeholders. I would definitely propose maybe the second one, where it's two notaries — or one of them is the notary that originally gave the DataCap to the client — and maybe one of the root key holder entities, or something like that.
J
Actually, you bring up a very good point with what you said about batch transactions. Right now, I'm pretty sure that for allocating DataCap, one transaction — one message — can only handle one allocation, one grant. That is something we can fix as well: we can say that in one message you can actually grant DataCap to multiple clients, and that in one message a root key holder can verify
J
multiple notaries. That is something worth a FIP — it's a pretty easy protocol change in the actors, I would say. And if you want that in the next network upgrade, we should get the process going. Then we should do the same for revoking DataCap. I think that can solve a lot of the issues with the manual process that notaries and root key holders need to go through.
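The batching change described here — one message carrying several DataCap grants instead of one grant per message — can be illustrated with a toy registry model. This is hypothetical Python, not the real actors; `grant_batch` and its shape are invented for this sketch:

```python
def grant_batch(registry, grants):
    """Apply several (client, amount) DataCap grants as one message.

    `registry` maps a client address to its DataCap balance in bytes.
    The whole batch is validated up front, so one bad entry does not
    leave the registry half-updated — mirroring how a single on-chain
    message either applies entirely or aborts.
    """
    for client, amount in grants:
        if amount <= 0:
            raise ValueError(f"invalid grant for {client}")
    for client, amount in grants:
        registry[client] = registry.get(client, 0) + amount
    return registry

registry = {}
grant_batch(registry, [("f1aaa", 1 << 40), ("f1bbb", 2 << 40)])
```

The same batched shape would apply to the root-key-holder side (verifying several notaries in one message) and, symmetrically, to batched revocations.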
C
That does remind me of something I flagged on the morning call, so I should flag it now as well: getting this implemented for the next network upgrade, in a FIP, versus waiting another two or three months after that, is super compelling for us, since we're tracking towards all of this stuff for Q1.
C
This is the most aggressive period of change that we've had in the system, running from around August–September through Q1, and I think this is yet another really cool thing that we'll get to integrate as part of our process-safety — our system-safety — side of things, especially as we expand the scope of the program substantively. So I'm very much in favor of:
C
let's talk more now, let's talk more at the next governance call, and then at that stage go back to the implementation team with a proposal that says, hey, let's try this first. Even if it's something we change later, at least we'll have something to start out with — erring on that side.
B
Listen, I'm sort of scanning these now, and I think there are lots of questions about what happens and how it might impact storage providers and clients. You might remember me saying a while ago how useful it would be — and I know, Galen, you did a map a while ago, but it was more process-focused and it was on our side. What if we flipped it and started to put the lens, the focus, on the client journey, so we understand, beginning to end, what their process is? There's a formula for this:
B
what happens when they fall out of the system, when they're not using it? It might help us solve some of the underutilization parts as well — how do we identify them? So I'm volunteering to do it. I can sit down with yourself and Galen to start on the first version.
C
Actually, we had Tim from Ecology Limited dial into the call this morning. He ran into the very interesting challenge of having committed too many of his deals to only two storage providers — because nobody else wanted to take his deals initially — and then getting flagged by the guardrails we have in place that say you can't over-commit deals to fewer than, like, three or five storage providers. Their original goal was to have something like 30 storage providers, and so their DataCap tranche was not shipped.
C
The second tranche was not sent out, because it's like: hey, you used two storage providers when you said you'd use 30. And so he came to the call to talk through his pain points on how you do data transfer for miners in general — how do you find storage providers that can reliably make deals at the magnitude they're looking for?
C
I think getting it done by a client would also be ideal — either working extremely closely with some vocal clients, or just requesting that they write down some of the thoughts they've shared. Even Tim: if he were to bullet out some of the things he shared this morning, I think that would be a really good step in this direction. But yeah, if this is something you're excited about, Meg, I don't see why not.
B
We pick up where there's friction and where there are opportunities, and that's how we improve the process — then it becomes a living thing, right? And as guardrails and rules and automation happen, the journey changes as well. It might also make it simpler for anyone catching up on the changes, because we're not all in one place at one time, and so it's sometimes hard to follow. It's a suggestion — think about it, and I'll ping you guys offline a little bit more about it.
C
Any other notary thoughts on the implementation side for this? Like, what sounds like a compelling requirement, or a compelling audit trail, to say: yes, we as a community decided we should revoke verified-client status, or remove DataCap from a client — and then, who needs to be liable for that ending up on chain?
J
I personally feel like having multiple notaries should be good enough, because once a root key holder has granted DataCap to the notaries, they basically trust the notaries to monitor and handle the whole DataCap allocation across the network. So I don't think we have to have root key holders in there — but that's just my personal opinion.
C
Yeah, I think that makes sense. There was a similar opinion this morning too — that root key holders may not need to be involved in this. I think the only benefit of involving them is, one, getting to test that combination for future, more automation-based decisions and decision making
C
as we move towards that. And I think the only other reason would be to have a pure, execution-based audit trail. The root key holders are not plugged into the process; they don't really know what each individual client is doing. So to fire off a request to a root key holder — at least when Galen and I do it, for example when we're adding a new notary — it's like: you have the whole application,
C
we have all the data we need to execute on this. Their role is not necessarily to have an opinion, but rather to get enough data to say, "we can execute on this." So it ends up being a slight gatekeeper on the quality of the audit trail that we do. I think that could be valuable, but it may not be a requirement.
A
Right — I could expand on that. To me, as I think about these options, the one that is most compelling is two notaries and one root key holder, for the reasons he mentioned around being able to test that, but also because of the question: what is the unhappy path this could go down? How could this become a threat to the program and the network, right?
A
If we don't require a root key holder and it's just two notaries — what does it look like for two notaries to suddenly go rogue, revoke all of the DataCap from a ton of large-dataset clients, and shut down their operations, even if it's just for a day until other notaries jump in and realize what's going on? What are the stop-gaps against this process going off the rails? And I think one of the interesting things is thinking about what the root key holders can do.
A
These are people that are less available, less plugged in, and who maybe have a different set of stakes in the program compared to the notaries and their interests in the program overall. So by having a root key holder in the loop, it sort of — as he's saying — forces more of that justification audit trail: the two notaries need to come with clear proof, and then the root key holder can say, "yes, this passes my bar."
J
Oh, I think your concern is very real. I do want to point out — maybe in another FIP; we're literally changing Filecoin Plus FIP by FIP — but another FIP could be for a root key holder to be able to remove a notary's membership, which I don't think exists today. If those notaries really did go rogue for no reason, they are actually harming the network.
A
RemoveVerifier exists as a command there. There's not — right now — a clear "if this happens, then this" rule; there's not necessarily a clear parameter. However, there is a technical tool where a root key holder can remove a notary, which we
A
used to roll notaries over to the new process: we deprecate the old notary. So the technical tooling is there for root key holders. Basically, we built a tool that lets a root key holder make a notary, and we built tools that let notaries empower clients. Now we're talking about what it looks like to go backwards, right? We've got a tool where a root key holder can revoke a notary — so then what is the tool that allows notaries to revoke clients?
J
I'm aware that a root key holder can set the balance of a notary to zero. But you can create an action item for me to actually check whether, when a notary's DataCap balance is zero, he or she is actually removed from the notary list — because that's how we propose it.
C
Yeah — pretty sure that I did, Jen. Actually, just so you're aware, we should double check, and I will bring it up to double check this. But today, the proposal mechanism to the root key holders that we draft is "set this balance to zero", and that gets interpreted as remove-verifier/set-balance-to-zero. So we should double check what happens — because I don't think we've seen notaries that stay on the registry.
J
Yeah, it does exist — you guys know the actor better than me, which is perfect. Another option, if we do this combination — because it sounds like you want to keep things flexible, and also want to play with the notary/root-key-holder responsibility-sharing experience — is that we can say: in total, we need three signatures, and it can be a combination of any number of notaries and any number of root key holders.
A
One of the things that we've brought up a few times that would be really interesting is a multi-sig of multi-sigs. We're running out of time to spend a lot of it on this idea, but what does it look like to basically say: if we create a multi-sig, and there are, for example, two people from one organization on that multi-sig,
A
how do we say that only one person from the Filecoin Foundation is necessary, but we need two signatures overall, right? It could be any of these five people at Filecoin, but only one of them counts — and then we need someone else from an outside organization as well. So that starts to become an interesting thing: if we say the rule is you need two notaries — any two notaries — and one root key holder out of any of this list of root key holders,
A
how do we set the threshold of the message to three, but with two from column A and one from column B — and they should also bring their passports and their COVID vaccine cards with them.
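The "two from column A, one from column B" rule — with at most one signature counting per organization — is straightforward to express as a policy check, even if wiring it into an on-chain multi-sig is the hard part. A hedged sketch; all names and the exact thresholds below are illustrative, not a spec:

```python
def meets_rule(signers, notaries, rkh_org):
    """Check the column-threshold rule floated on the call: at least two
    notary signatures ("column A") and at least one root-key-holder
    signature ("column B"), where multiple root-key-holder signers from
    the same organization only count once."""
    notary_count = len({s for s in signers if s in notaries})
    rkh_orgs = {rkh_org[s] for s in signers if s in rkh_org}
    return notary_count >= 2 and len(rkh_orgs) >= 1

notaries = {"notary-a", "notary-b", "notary-c"}
rkh_org = {"rkh-1": "filecoin-foundation",
           "rkh-2": "filecoin-foundation",
           "rkh-3": "other-org"}

# Two notaries plus one root key holder satisfies the rule...
assert meets_rule({"notary-a", "notary-b", "rkh-1"}, notaries, rkh_org)
# ...but two root key holders from the same organization cannot
# substitute for the missing second notary.
assert not meets_rule({"notary-a", "rkh-1", "rkh-2"}, notaries, rkh_org)
```

A flat signature count of three, by contrast, would accept that second case — which is exactly the difference the "column" framing is meant to capture.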
But anyway, it sounds like Jen has already solved this and knows how to do it. I do want to move on — I think we have K-Ray right here to talk about some fun holiday information and upcoming things for the community. K-Ray?
H
Thanks, Caitlin — new voice, new topic. Here's what I'll hit you with first: as you see on the slide, the holiday schedule is coming up. On January 4th we are not going to have a governance call, so we're going to update the calendars and post this in the repo. You might see this come through on Slack, but be it known — you heard it here first: January 4th, no call. The week immediately following, the cycle will resume with the call on the 11th. Now, to the point on feedback:
H
as you see these FIPs come across, it's really important that we're capturing the journey for notaries as well as clients. So in the coming days and weeks you might see — actually, you will see — an outreach from me. We're going to be trying to get signal from you: what do you think about certain initiatives? How do you communicate?
H
What we're trying to do is make sure that we're coming from a place that really communicates in a way that captures your signal — so stand by, you'll see that interview request come through. And lastly, just keeping you updated on the status of the notary leaderboard that's coming: the RFP is going out in the coming days and weeks as well. What this will do is give an actual display of where we stand and where we're working, and you can actually see what deals are coming through.
A
Thank you, thank you, thank you — great. Deep's over there writing a new FIP as well as minting some NFTs for us. There are some open discussion topics I want to hit. On this one — DataCap distribution expectations across multiple storage providers — like you mentioned, we heard from Ecology Limited specifically. They're LDN number 36, and there's also Shinshu, LDN number 79. Those have been flagged as "needs diligence", partially because in their applications they said they were going to be able to make a certain amount of deals with a certain number of providers.
A
What we have seen so far of their deal-making behavior does not match that, and so we paused them to give notaries a chance to weigh in. So I would encourage you to go, once these get posted, and listen to Ecology Limited speak for themselves — but they have also been replying to comments in their issue, and Shinshu has been replying to comments in their issue as well.
A
We want to hear from the notaries on whether or not we feel comfortable removing that diligence label and starting a new allocation request, or whether we think we still have questions for them. We're running out of time, so I'm just going to wait here like three seconds to see if anyone has an immediate thought on either of these, or on the parent issue.
A
Not seeing hands, so I'm just going to encourage you to go check out that discussion, 319, and those two specific LDNs. The other topic that has come up is around reducing the constraints on requesting additional DataCap. This one is a little bit easier to talk through: part of the problem is some out-of-date documentation around how the bots currently function for requesting subsequent allocations.
A
The documentation says usage needs to be at 90 percent. That isn't actually true anymore — the subsequent-allocation bot is supposed to kick off once a client has used 75 percent of their allocation. Specifically for Textile, there were a couple of technical issues that came up that sort of led to some of this brainstorming — good ideas here overall. If there are questions, comments, or concerns on discussion 321, I'll pause for a second.
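For reference, the corrected trigger condition is simple enough to state precisely. A small sketch, assuming the 75-percent figure from the call (the function name and byte-based inputs are illustrative, not the bot's actual interface):

```python
def should_request_next_tranche(allocated_bytes, used_bytes, threshold=0.75):
    """Trigger described on the call: the subsequent-allocation bot is
    meant to kick off once a client has used 75% of its current
    allocation. (The 90% figure in older docs is out of date.)"""
    return allocated_bytes > 0 and used_bytes / allocated_bytes >= threshold

assert should_request_next_tranche(100, 75)       # 75% used: request more
assert not should_request_next_tranche(100, 60)   # only 60% used: wait
```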
C
I think one of the takeaways from this was just that we need to update some of the docs too, right? Again, we're running a little bit behind on that, so we've got to prioritize it — good takeaway. Jennifer's got a question in the chat that's not related to this, which is: what expiration period does the Fil+ community need right now for DataCap? Jen, that's a great question.
A
I would also say that a related question — this came up this morning — is what we define as inactive, right? Say a client has asked for DataCap, and let's say they get 32 GiB in, you know, March, and that client goes on to not use that 32 GiB — but they got more DataCap, they're making deals, they just haven't used that amount.
A
Is that an inactive client or not? Where's the threshold of activity? Is it any message, any deal-making, or is it a specific amount of DataCap? I think a starting point might be saying: if a client address gets an amount of DataCap and does nothing for X amount of time, revoke that DataCap.
A
That, to me, is a very clearly inactive client. But it does bring up the question: what if they're making deals and just not using their DataCap — should we revoke that DataCap, or wait? So, some questions there. You've got real close to the camera — is he about to give a strong opinion about something?
A
No? Okay. There was also another sort of placeholder discussion. Deep, did you want to touch on this one, or save it, since we didn't get a chance to talk about it this morning and we don't have the discussion topic yet?
C
Yeah — Jennifer actually kicked off the discussion topic for this, as number 324, between this morning's call and this evening's call, so we should probably do a quick discussion on it while we have the time. Okay, cool — and we can catch people up in Slack after. The TL;DR is basically: clients can use multiple types of addresses, of course — f1, f2, f3 addresses.
C
We see a variety of all of these coming in. Unfortunately, some clients come in with f2, only to realize after getting DataCap that they can't use those addresses for making deals today — so even there, removing the DataCap and reissuing it would be helpful. But between f1 and f3, it turns out there are implications to that choice, and so there's an ask from some of the folks that work on markets for Filecoin to say:
C
maybe we should consider pushing clients towards using f1-based addresses, as opposed to f3 ones, because f3 results in a higher PublishStorageDeals message cost to a storage provider, thereby making batch deal-making at scale with storage providers even trickier. Given that there is some friction today with getting deal-making done at scale reliably for clients, this could be one incentive mechanism used to help with that. Jen?
J
Of course — yeah, and thank you for the background context. It was actually very accidental: we just got bored one day and started looking at the gas traces of a PublishStorageDeals message, in a Lotus call we have. And we found, in this exact message we randomly picked, that a lot of the gas usage went to signature verification — which, we realized, is caused by the BLS wallet address, and it costs something like 10x
J
more than the signature verification for a secp256k1 wallet address. Now, related enough: as we all know, community members have been sharing the opinion that the PublishStorageDeals message being expensive — very gas-costly — has been a pain point for them in onboarding clients and controlling the base cost. We even have a FIP from,
J
I believe, Matrix Storage — a FIP discussion from Q1 — specifically asking to reduce the PSD message gas cost. On that proposal I have left my opinion about how it may create an attack vector for the network, and so on. But a very simple thing to do to actually reduce this message cost for the storage provider is for clients to use a secp256k1 wallet address instead of BLS.
J
A lot of the deals on the network right now are verified deals coming from the Filecoin Plus program, and we are hoping to work with the Filecoin Plus community — including the verified clients and the notaries — to start saving some gas for the storage providers, which can potentially even lower the ask price from storage providers in the long term. Yeah — and the ask is pretty simple: whatever client you are, just start using secp256k1 addresses.
C
Yeah — but we do need to get some data added here, based on what we talked about: f1 addresses can also end up being slightly more expensive — a tiny, tiny bit — for certain operations, so just making sure that we're transparent about that would be good.
J
We'll share more data, but it's something like a 300,000 gas cost, which, based on the price today, is something like 0.0002 cents. But yeah — we definitely should be sharing more data.
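To put those numbers together, here is a rough back-of-the-envelope model of the signature-verification share of a PublishStorageDeals batch. The ~10x ratio is the rough figure quoted on the call; `SECP_VERIFY_GAS` is a made-up placeholder, not a protocol constant:

```python
# Illustrative figures only: the 10x BLS-vs-secp256k1 ratio comes from the
# gas trace discussed on the call; the base per-verification cost is invented.
SECP_VERIFY_GAS = 100_000
BLS_VERIFY_GAS = 10 * SECP_VERIFY_GAS

def psd_signature_gas(n_deals, client_uses_bls):
    """Signature-verification share of a PublishStorageDeals batch,
    assuming every deal in the batch is signed by the same address type."""
    per_deal = BLS_VERIFY_GAS if client_uses_bls else SECP_VERIFY_GAS
    return n_deals * per_deal

# Under these assumptions, a 50-deal batch from BLS (f3) clients spends 10x
# more gas on signature checks than the same batch from secp256k1 (f1) clients.
saving = psd_signature_gas(50, True) - psd_signature_gas(50, False)
```

This also frames the counterpoint raised above: the ~300,000-gas extra cost of f1 addresses for certain other operations is a one-off per-operation figure, while the per-deal verification gap compounds with batch size.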
C
Thank you. If you don't mind — I think this is a good discussion; let's chat about it further on GitHub. I encourage folks to go read it and ask any questions you have. TL;DR: making this a recommendation is probably a pretty low lift for us as a community, and something we can be supportive of.
C
It becoming a requirement is certainly trickier and a little more controversial, but it being a recommendation is probably reasonable from a Fil+ scope side of things. So, notaries in the room: take a look, see what you think, and see how it impacts your client experience. Given that we have four minutes left, Galen, maybe we need to pop back — I don't know, unless we can talk about these further. Up to you.
A
Was
in
the
head,
I
will
also
say
that
I
I
thought
that,
in
the
zeitgeist
of
things,
blockchains
that
have
very
expensive
gas
fees
are
really
popular.
Am
I
misunderstanding
that
does
that
there's
one
that
comes
to
mind
where
the
gas
fees
seem
to
be
discussed
very
often,
so
I
thought
that
would
be
a
be
a
pro,
but
maybe
I'm
misunderstanding:
all
all
press
is
good
press
no
I'll
jokingly.
Decide,
though.
I
think
that
this
is
an
an
interesting
thing
to
bring
up
and
it
does
have
ramifications.
A
I
I
tend
in
in
this
instance.
I
do
agree
with
deep
that
making
as
a
community
making
a
request
to
a
client
seems
seem
good
and
telling
them
kind
of
the
the
situation
versus
making
it
a
parameter.
A
That
says,
in
order
to
be
verified,
you
must,
you
know,
have
a
certain
type
of
address,
and
so
thinking
through
are
there
other
ways
that
we
could
approach
solving
this,
and
you
know
maybe
it's
something
to
the
extent
of
are
there
other
ways
to
subsidize
or
calculate
gas
fees?
A
And
you
know,
just
all
of
these
hurdles
are,
are
opportunities
to
also
think
creatively
too
right.
So.
J
Yeah — because this is recorded, I want to be very clear that, at least for the Lotus team and the protocol team at Protocol Labs, we have been wanting to look at the PublishStorageDeals message — how we can optimize it, how we can make the deal-onboarding price cheaper for storage providers in the long term — for the past couple of months, and there is more investigation going on these days.
J
However, most of those ideas need a lot of research and, you know, protocol changes. That's exactly why this — an immediate action that doesn't need a protocol change and can easily be done with some help from the community — is something we feel brings a positive impact for both clients and storage providers in general. It's not a long-term solution, for sure, but yeah.
C
I think for us, Jen, the number one priority has been the client experience right now: we need to make sure that the client onboarding experience is as painless as possible.
C
So some due diligence that we probably need to do together would be around the actual snapshot of the state today. I don't even know, off the top of my head, the ratio of people using f1 versus f2 versus f3 addresses, for example, so I don't know how many would be impacted. There's also an aspect — to Meg's point earlier — of where in the client journey they come through this path: how many of them already have an address
C
that's already activated, versus how many don't have one and are getting activated? A good example of this is that the automated verifier can actually be used to activate addresses on chain, right? People go to verify.io with a new address, request automated DataCap, and that gives them legitimacy on the network. That is a very interesting, probably high-leverage data point — to educate a client to say: hey, you're entering this kind of address, and it has these implications. Or in our documentation — we need to
C
do a better job of this, but we often link off to generic Lotus docs — so making sure this is covered in those Lotus docs as well, or that other implementations have it in their respective documentation, where this is a little more transparent, is something I think we could benefit from as a community. It also reduces the net effort per notary, per client, right? Because, per application, having a notary educate each client is very difficult, but pointing to a piece of documentation that says, "hey, please consider this," becomes extremely easy. So I would basically like to take this discussion in the direction of: how can we do this in a lower-friction way? And then, how do we get the data to prove that it's actually worth the time?
A
And the sun has set here in California, and it is now 5 p.m. — we are at time. I wanted to thank everybody for joining. This will be recorded and posted online, and there are plenty of links in here for other discussions — please go check out those GitHub issues, FIP discussion issues, and our repos, and share your opinions there. So thanks, everybody — oh, and as Kerry mentioned, be looking out for some surveys and interviews. Great — thanks, everyone.