From YouTube: Filecoin Plus - March 16 Notary Governance
Description
Community governance call to discuss the DataCap allocations for improving TTD (time-to-DataCap), also covered is the formation of a Governance Tribunal for dispute resolution and increased transparency.
A
All right, welcome everybody; so glad to have you here for the next edition of the notary governance and Filecoin Plus community calls. We still have a couple of people popping in, so, Kerry, I'm gonna ask if we can trade off as we're presenting. Yeah, just to kick things off: so good to see everybody here as usual. We've got a slide deck; we'll cover some of the issues that have been open for a while and need some follow-up.

We'll talk a little bit about some new issues, but before we get into that, we actually have a couple of updates, so I'm going to share my screen and just kick things off here. Okay, March 16th, just shortly after Pi Day; hopefully you had some pie a couple of days ago. I guess that only counts in this date format. So yeah, we'll talk a little bit about content, and then some updates from the notary front, at least from what I've seen, and then, if anybody else has anything to contribute as a notary, we'll hear that too, and then talk through the issues that have been opened. First stop: Filecoin Plus content. I'm going to hand it over to K-ray, if you want to give us a quick update on some of the stuff you've been doing.
B
Yeah, a warm, friendly hello. So, two points to really put on everyone's radar. The first has to do with how we're recording these meetings. In the past we were using Zoom's cloud; the only problem with that is there's an expiration date on it that we weren't quite aware of. So if you check the notary repo, you're going to notice only the last two meetings are hung there for historical purposes. Going forward, what we're going to do is capture these meetings and hang them on the Filecoin YouTube page.

There's a dedicated playlist called The Libraries that we have set up. You can see the link there, and if you're keen, I can show that after this chat in the link that we have for the announcement. The second thing is a little more pressing for the community: we noticed that there are common questions that keep coming up, so over the last couple of weeks what we did was gather all of those incoming questions and put them into one centralized FAQ.

These were issues that were coming from the notary repo itself, or some of the more common ones that we saw in the Filecoin Plus Slack channel. We're going to open this up for community feedback, because these are the issues that we saw ourselves spending a lot of time on, and we want to make sure it's addressing what the community is also seeing. So in the coming days you might notice that link gets posted; we're very open to feedback.
A
Cool. I just want to make another plug that, in general, educating the community and improving the quality of the content that exists is an ongoing journey for all of us. So if you have ideas, if you've got time, please feel free to start a thread on the Filecoin Plus channel in Slack and share your ideas, your thoughts, and things that we can all contribute towards and improve. As more people enter this ecosystem, more people need to get educated about how Filecoin Plus works and become curious to get involved.

As some of you may remember, we were also looking at the idea of having a Filecoin Plus day or summit type thing at some point. That idea is still very much on the books, and we're going to be doing some planning; hopefully I'll have some more tactical updates to share in the next notary governance call on how we can get closer towards that. Cool, so that's it on the content front, unless anybody else has anything they'd like to share. I'm gonna give ten seconds to anybody in the room.
A
Cool, let's keep going. So on the notary front, a couple of things have been flagged, or are being flagged actively, in the channel, as well as on GitHub and in the last notary governance call.

It looks like DataCap wasn't allocated correctly in the first pass, so I'll be helping both of you to make sure that that happens, and we'll be working with the Keyko team, as well as anybody else who can help validate the flow, just to ensure that things are happening correctly. This was, of course, the first time that we were doing DataCap allocations to notaries, so it's quite possible that something trivial was missed, and we'll go through, do an audit, and ensure that these are corrected. As far as I'm aware, neither of you has, as a result, allocated any of your DataCap, and of course, in general, we've often talked about DataCap liquidity being a thing we want to improve in the ecosystem, so this should hopefully enable at least two more notaries to get going.
A
The second was something that came up last week, which is that Fleek asked to be removed as a notary. They found that they didn't have the bandwidth to be contributing and following client applications, and so we had a discussion at the last governance call where we agreed to set their DataCap allocation to zero, and Fleek would effectively be removed without additional penalties.

So, effectively, if they do want to be a notary again in the future, they just have to reapply, and they would go through the entire rubric and scoring process, just like any other notary would, as an independent application without this historical context. This was based on discussion that we had in the last governance call.

Of course, not everybody in this room today was in that last governance call, so I just wanted to share those updates and also open up the floor for any additional comments or thoughts, if anybody had any on either of these. Masaki, I know you're in the call today as well; if you want to share anything on your end, or if you're good to go, that'd be great as well.
C
Hi Deep, yes, thank you. I'm good.
A
Well, that bought me enough time for a sip of coffee, so I appreciate it. Nice to have you here. Sweet. So the next thing I briefly want to bring up was, just in general, this DataCap application side of things. In the last couple of calls there's been a theme where allocations themselves may not be going out as fast as would be ideal, and of course the steps that a client goes through today to actually end up with DataCap in their account, unless they're going to an automated verifier, are non-trivial. It often requires a lot of interaction: your client has to understand and learn what DataCap is, navigate the right website, and make an application.

Then a notary has to send them whatever appropriate due diligence is required. A client may then go do additional research and come back with answers. This back and forth can happen N number of times until the notary decides that, yes, this client is bringing a valuable use case to Filecoin and deserves some amount of DataCap, and then that DataCap gets allocated. And so this has been one of the ongoing themes, from various angles.
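The client journey described above can be sketched as a simple state machine. This is purely an illustrative model of the flow as narrated in the call; the stage names and the `advance` helper are assumptions, not Filecoin code:

```python
from enum import Enum, auto

class Stage(Enum):
    """Illustrative stages of a DataCap application, per the flow described in the call."""
    LEARN = auto()          # client learns what DataCap is
    APPLY = auto()          # client navigates the site and files an application
    DUE_DILIGENCE = auto()  # notary asks questions; client researches and answers
    APPROVED = auto()       # notary judges the use case valuable
    ALLOCATED = auto()      # DataCap lands in the client's account

def advance(stage: Stage, notary_satisfied: bool) -> Stage:
    """One step forward; DUE_DILIGENCE can repeat N times before approval."""
    transitions = {
        Stage.LEARN: Stage.APPLY,
        Stage.APPLY: Stage.DUE_DILIGENCE,
        Stage.APPROVED: Stage.ALLOCATED,
        Stage.ALLOCATED: Stage.ALLOCATED,
    }
    if stage is Stage.DUE_DILIGENCE:
        return Stage.APPROVED if notary_satisfied else Stage.DUE_DILIGENCE
    return transitions[stage]

# Example: one unresolved round of due diligence, then approval and allocation.
s = Stage.LEARN
for satisfied in (False, False, False, True, True):
    s = advance(s, satisfied)
```

The point of the model is the loop on `DUE_DILIGENCE`: every unsatisfied round keeps the client in place, which is exactly the back-and-forth that stretches time-to-DataCap.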
A
The thing that I did want to have a little bit of a conversation about right now is also just, in general, the perspective from the notaries on DataCap fluidity. In past conversations we did have some clients talk about bringing in DataCap taking a while for them, and I know that's still been an ongoing issue, especially as some notaries are also reaching the caps of the first round of DataCap that they were given to allocate. I did want to hear from notaries that still have a substantive amount of DataCap to allocate and aren't blocked (so not Masaki or Stone): for those of you that still have a decent amount of DataCap left to allocate, are you seeing any patterns, anything emerging?

Anything that would help as guidance, or serve as general advice and suggestions to the community of clients out there: how they reach you, how they're able to apply for DataCap, whether that's how they structure an application or how much DataCap they ask for. I wanted to open up the floor to the notaries in this call for a couple of minutes to share some perspective and insights.
D
Yeah, I think one of the things I could share is that, for me as a notary, there are not so many clients trying to apply for DataCap from me, so that's kind of the reason why I still have a lot of DataCap handy. But I would say it really depends on the clients. If a client doesn't want to apply through me, then I don't intend to reach out to them and say, "Hey, do you want any DataCap?" But I would say it's probably not really client-friendly; my process is kind of complicated for them to apply through, and I'm thinking about whether there is anything I could improve to make it easy for clients to apply for DataCap from me. So that's a question both to you guys and to myself as well.
A
Yeah, I think that's a really good question: just generally opening up the platform to get some feedback as a notary, to see how we can continue to optimize and improve, reduce friction, but also increase the quality of the work that we're all doing here together to build a productive ecosystem.

Yeah, and Andrew dropped in the chat that it would be the same as what was mentioned last call, which would be: it would be great to bump up the automated DataCap from eight gigabytes to something more substantive. Andrew, I believe you opened up an issue, which we will be discussing I think two slides from now, on bumping it up to 32 gigabytes, so let's definitely take some time in this call to talk about that.
A
Cool
I'm
gonna
head
to
the
next
slide,
based
on
this
thanks
neo
for
sharing.
As
always
folks.
This
is
an
open
ecosystem
and
open
community
always
great
to
hear
feedback
always
get
work
together
to
iterate,
to
make
this
more
effective.
To
ensure
falcon
plus
does
deliver
its
value
on
you
know,
bringing
useful
data
to
falcon
and
ensuring
that
falcon
can
serve
the
world
in
a
productive
way.
A
So we have a set of issues that we flagged as the new issues that came up in the last call, which I wanted to touch on just to see if there were any updates in thinking. The first one: The Fat Man opened up an issue on a huge DataCap discrepancy between some applicants in China versus the rest of the world.

The conversation we had in the previous notary governance call was that, fundamentally, this is still up to the notaries and how they choose to distribute and allocate DataCap to clients that they find to be trustworthy, or ones that align with their process of due diligence. I don't think The Fat Man is in this call, but I did want to make a callout nonetheless that this could be an interesting point of feedback for some of the notaries that are operating in China specifically, for example, or in other regions that may not have the same rate of DataCap allocation: flag it and think about ways in which maybe the minimum amount of DataCap that's being allocated could be higher, or things like that. It's also important to note that, of course, this is the first time we're testing this system.
A
I also want to use this as an opportunity for us to start bringing issues to resolution. If we agree that we are all on the same page and there are generally no actual action items or changes that need to be proposed, we should be able to agree together, through GitHub or through these calls, to close things out. So this is an issue that I would propose is ready for closing, unless there are other opinions that have not yet been shared.
A
Cool. Feel free to share on GitHub as well. It's not going to be an instant action to close something out, but it's just something to think about as we improve the operational practices that we all have in Filecoin Plus. Cool. The second thing that we talked about for quite a while in the previous notary governance call, for those of you that weren't there, was this idea of having some kind of process for really, really large, massive use cases, which I loosely have been defining as requiring more than one pebibyte of DataCap, for example. A great example of this is Starling Lab's work and what they're doing with the Shoah Foundation to upload, on top of Filecoin, some of the refugee testimonial interviews that they've been doing. This is a multi-purpose project.

At the same time, I believe most notaries are in agreement that use cases like this are defensibly valuable and should be served by Filecoin Plus. And so this issue, specifically issue 94, discusses the idea of maybe having a sort of faucet that's dedicated to a client that's been vetted by many notaries and is constantly checked by notaries, in the form of maybe a multisig that can allocate DataCap to that particular client over time. And, oh, it looks like JV is actually here. JV?
E
Yeah, sorry, I'm outside. I think the high level on this one is that there are a number of use cases on our radar that will need super large allocations, and right now the default process is basically to have the client go to every notary and ask for DataCap. That could happen, but what it'll do is basically just drain every notary. With the size of the allocation that they'll need, they'll talk to every notary, they'll use up all the DataCap that's in the ecosystem, and that's just going to throttle everyone. So there is one path where we do that, but I think something that may be less disruptive, especially for these super large allocations, is this. In order for this to work, we should write up a full PR of the proposal and then have people look at that, but I think the rough structure would be that we define a process, separate from normal notary applications, where a project can apply to the community. Hopefully the barrier to entry is much higher, because it'll have to go through something like this governance call, so that people can approve or disapprove or call out potential risks.

There's an allocation plan that the client should already have defined: how they are going to allocate the DataCap, and with what sort of strategy. Then, ideally, the DataCap allocation is made to a multisig, which can be made up of the existing notaries; we can figure out what number of notaries, and whether all of them have to sign to allocate DataCap out from that multisig to the client. Hopefully that creates a parallel pathway that doesn't actually disrupt other clients trying to get access to DataCap, but still enables these big use cases to be supported. Yeah, so that's the high-level idea.
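A minimal sketch of the m-of-n tranche idea described above. Everything here is an illustrative assumption (the class, the 4-of-7 threshold, the tranche sizes); it models the proposal's shape, not the actual Filecoin multisig actor:

```python
class DataCapMultisig:
    """Toy model: notaries co-sign tranche releases from a pre-approved grant."""

    def __init__(self, notaries, threshold, total_grant_bytes):
        self.notaries = set(notaries)       # notary IDs sitting on the multisig
        self.threshold = threshold          # m signatures required out of n
        self.remaining = total_grant_bytes  # DataCap still held by the "trust"
        self.approvals = set()              # signatures for the pending tranche

    def approve(self, notary_id):
        if notary_id not in self.notaries:
            raise ValueError("not a member of this multisig")
        self.approvals.add(notary_id)

    def release_tranche(self, tranche_bytes):
        """Release a tranche only once the signature threshold is met."""
        if len(self.approvals) < self.threshold:
            return None  # not enough co-signers yet
        if tranche_bytes > self.remaining:
            raise ValueError("tranche exceeds remaining grant")
        self.remaining -= tranche_bytes
        self.approvals.clear()  # each tranche needs fresh approvals
        return tranche_bytes

# Example: 7 notaries, 4-of-7 threshold, 1 PiB total grant, 100 TiB tranche.
PIB, TIB = 2**50, 2**40
msig = DataCapMultisig([f"notary-{i}" for i in range(7)], 4, PIB)
for i in range(4):
    msig.approve(f"notary-{i}")
released = msig.release_tranche(100 * TIB)
```

The design choice this illustrates is the "parallel pathway": the one-time community vetting approves the full grant, while each individual release only needs a lightweight threshold check instead of a fresh application.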
A
Any
thoughts
on
that
I'm
megan-
I
know
you
plus
one
that
in
chat,
but
if
there's
any
other
opinions
and
thoughts
immediately,
it'd
be
great
to
hear.
F
Yeah, just saying, I think that's a good call, and for a lot of these projects we know that they're coming in advance, especially ones like Starling's that are getting funding from the Foundation. So for big things that are coming through, at least through channels related to things like the Foundation, we can have them approximate the amount of data that they're going to use and bring it up in this kind of process.
A
Yeah
that
makes
sense
generally.
I
think
that,
like
we'll
probably
have
like
a
few
of
these,
I
don't
think
there's
going
to
be
like
a
multitude,
but
there's
definitely
going
to
be
probably
a
handful
of
of
clients
and
use
cases
for
massive
deployments
like
this,
especially
given
like
the
value
that
falcon
can
add
to
the
world
in
this
manner.
A
A
We'd
also
have
to
do
a
bunch
of
like
rounds
of
additional
like
elections,
for
example,
for
notaries,
and
so
this
could
help
reduce
like
the
friction
for
that
client
substantively,
but
also
increase
additional
use
case
diversity
and
increase,
like
the
usefulness
of
each
like
notary,
when
they're
elected
to
service
inventory,
without
having
to
apply
like
all
of
their
data
cap
towards
like
a
single
client
for
example.
So
I
I'm
definitely
in
support
of
doing
something
like
this.
A
One-
and
I
know
you
can't
come
off
mutes,
I'm
just
gonna,
read
out
your
chat,
you
said:
should
you
guys
perhaps
consider
this
recurring
data
cap
together
with
this?
I
don't
know
the
current
state
of
that,
but
it
should
be
less
work
today
new.
So
I
believe
one
and
you're
referencing
the
fip
right,
which
is
currently
to
help
like
client
addresses,
get
refreshed
instead
of
a
client
having
to
supply
a
new
address.
A
Cool, yeah. So I do see why it would make sense to consider them related, because, theoretically, a massive client who wants to get topped up would probably want to get topped up at a similar set of client addresses, instead of continuously making more and more addresses. So I do think that they're related; I don't think there's necessarily a dependency, unless I'm misinterpreting where you're coming from. I think we're on the same page.
G
Yeah, the thing is, if 14 people have to decide on this DataCap, the renewal should not take 14 people to decide again.

A
Excellent, yeah. I think...
E
I think the way that this would work is that the client sort of pre-defines a lot of the general strategy, and then I think it's as simple as the client needing to submit back deals, or deal info. For the 14 people, it's not like the client has to submit a new application, because their original pitch to the community was for the full 100 petabytes, or whatever. So you can sort of think of it like, I don't know, a trust that's making small allocations out to that recipient, and all you have to do is prove that you've done the thing that you said, according to your allocation plan. I think the main work for something like the Shoah Foundation would be using it as a sample starting case and building out frameworks to make it easy for other clients in the future to replicate, so maybe that's something we'll have to figure out. But for future clients, hopefully it's pretty easy, and maybe even for Filecoin Discover we can do the same thing. It's basically just: predefine the full strategy of what you are going to do and how you are going to do it. For the people, the notaries, controlling the multisig, I think it's just about making it super trivial for them to check whether you did the thing that you claimed. I think that's as simple as defining a spec for the deal info that has to be posted, and then having an easy way of mapping that to the strategy that they defined.
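The "check the posted deal info against the predefined plan" step could look roughly like this. The record fields and thresholds are hypothetical, chosen only to illustrate the shape such a spec might take; no agreed Fil+ format exists in the discussion above:

```python
def plan_compliant(deals, plan):
    """Check posted deal info against a client's predefined allocation plan.

    `deals` is a list of dicts like {"miner": "f01111", "bytes": ...};
    `plan` holds the limits the client committed to up front.  Both shapes
    are illustrative assumptions.
    """
    total = sum(d["bytes"] for d in deals)
    if total > plan["max_total_bytes"]:
        return False, "total stored exceeds the approved grant"
    miners = {d["miner"] for d in deals}
    if len(miners) < plan["min_distinct_miners"]:
        return False, "data is concentrated on too few miners"
    for miner in miners:
        share = sum(d["bytes"] for d in deals if d["miner"] == miner) / total
        if share > plan["max_miner_share"]:
            return False, f"{miner} holds more than the allowed share"
    return True, "ok"

TIB = 2**40
plan = {"max_total_bytes": 100 * TIB, "min_distinct_miners": 3, "max_miner_share": 0.5}
deals = [
    {"miner": "f01111", "bytes": 20 * TIB},
    {"miner": "f02222", "bytes": 15 * TIB},
    {"miner": "f03333", "bytes": 10 * TIB},
]
ok, reason = plan_compliant(deals, plan)
```

A check this mechanical is the "super trivial for notaries" property the speaker is after: each signer runs the same deterministic test before co-signing the next tranche.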
A
Should it be an encompassing group of notaries, where every single currently elected notary, for example, is on this multisig? Or would it be the existing notaries in the region? Or would it be randomly selected, or volunteer notaries, based on an application? How should we go about thinking about this? Or, to rephrase that a little better: are there any opinions on how notaries would like to be grouped and exposed to a DataCap faucet of this sort for a massive client?
E
Yeah, something like that. I mean, I think the high-level question is basically: what's the surface area for risk? If we're talking about a multi-pebibyte allocation, it probably requires at least a couple of people to verify, because at the end of the day, if we're thinking about the gameability of this, you want it to be sufficiently hard that multiple people would have to be compromised, right? And so, I don't know. It may also be the case that we say there's no scale at which this is reasonable, or that they have to go through the application again for another big allocation from the trust.
A
Yeah
any
other
notaries
on
the
call
that
have
an
opinion
on
on
how
they'd
like
to
participate
in
something
like
this.
Like
do
you
envision
yourself,
volunteering
to
be
part
of
a
group
of
like
seven
others,
for
example,
or
six
others?
So
we
have
an
odd
number
for
consensus,
or
would
you
expect
like
an
explicit
application
to
you
and
then
you'd
sign
on
to
a.
A
Multiseg,
okay,
I'm
gonna
assume
people
have
some
wonderful
ideas
that
they're
gonna
share
on
github
as
we
start
to
make
progress
on
on
the
on
what
like
this
could
look
like
from
a
spec
perspective
or
implementation.
A
I'd
also
be
curious
to
hear,
if
there's
anybody
that
disagrees
with
doing
something
like
this,
because
we
haven't
we've
had
a
lot
of
people
share,
that
they
they
are
an
agreement,
and
that,
like
this,
would
be
useful.
But
if
of
course,
if
somebody
has
feedback
that
this
is
not
necessarily
something
we
should
be
looking
at.
I'd
be
curious
here
that
and
I'm
sure
that
would
be
worthwhile
for
the
community
to
talk
about.
F
Not to jump in ahead of actual notaries, but it would definitely be nice from the side of the Foundation, because we will have things that we know are coming up in the future, and we've already done a fair amount of work to vet the projects. So just to be able to preempt that and get the quorum of notaries together in advance, you know, so that these large projects are not coming in, throwing people off, and sucking up all the DataCap.
A
Yeah
yeah,
I
think,
plus
one
in
favor
of
it
still
right,
though
megan.
I
guess
from
from
your
perspective,
though,
like
as
you
as
somebody
who's
seen
like
multiple
applications,
that
this
could
be
useful
for.
Do
you
do
you
see
like
a
slice
that
we
could
use
to
inform
the
notary
selection
process
for
it
like?
A
Are
they
typically
bounded
by
geography,
for
example,
because
you
know
notaries
today
exist
in
in
different
regions
of
operation,
or
would
you
envision
like
it
or
is
it
just
that
they're
typically
much
more
global
and
desperate,
and
so
we
we
should
just
think
of
a
different
way
in
which
we
would
have
liked
notaries
to
be
part
of
the
multi-site.
F
Currently,
the
stuff
that
we're
dealing
with
is
mostly
us-based,
but
we're
also
pretty
early
in
the
process
of
of
getting
projects
together.
So
I
think
that
will
likely
change
kind
of
towards
the
end
of
this
year.
But
but
that's
the
current
situation.
A
Cool
that
makes
sense
in
the
interest
of
like
getting
to
the
other
topics.
I'm
going
to
unbox
the
conversation
on
this
here,
but
I
think
issue
number
94
is
where
the
conversation
was
started.
I
would
love
to
see
some
follow-ups
there
and
we
can
continue
iterating
together
as
a
community
on
like
how
this
could
get
implemented.
A
Specifically,
we
talked
about
issue
98
last
time,
which
was
on
when
a
when
somebody
applies
as
a
client,
but
is
also
a
minor
like
to
what
extent
should
they
be
allowed
to
make
deals
with
themselves,
and
so
there
were
opinions
presented,
and
there
was
some
good
conversation
on
issue
98
a
couple
of
weeks
ago.
We
then
discussed
it
in
the
call
two
weeks
ago,
but
since
then,
there
hasn't
been
as
much
discussion.
A
However
riba
the
river
sushi
actually
shared
some
analysis
that
he
was
able
to
do
earlier
today,
which
should
hopefully
be
another.
You
know
great
step
in
just
general
data
cap
transparency
and,
like
the
work,
that's
going
into
building
better
tooling
for
this
ecosystem.
A
So
I
wanted
to
bring
up
this
issue
again,
mostly
so
that
he
could
share
some
of
his
analysis
and
and
see
if
anybody
was
interested
in
taking
that
forward,
especially
on
the
notary
side
is,
as
it
may
inform
your
future
allocation
decisions
as
well
reebok,
if
you're
comfortable,
we'd
love
to
have
you
take
over
a
screen
share
and
just
walk
us
through
how
you
like
what
you've
done
and
how
you
did
it.
H
Yeah, totally, cool. So, based on a number of conversations that have been happening recently, I put together a very, very simple analyzer of Fil+ deal data. This is a simple script that will run on pretty much any Linux machine; it pulls data from the nodes provided by Glif. For us, that is basically all the verified deals, updated every ten minutes, and then we query their node to resolve addresses and all that kind of stuff. I sourced all the requests for DataCap from both the current GitHub registry and the older one from Keyko, and additionally I sourced associations between a miner and the owner of that miner from places like the Filecoin Slack and the entire miner cohort, and stuff like that. The result of this is two CSVs.

This was run literally an hour ago. So now, if you can open the clients CSV first: this is a CSV that has every client that we currently have involved in Fil+ deals. If you scroll down further (further, further, yeah), whenever I don't know who somebody is, it basically says "unknown", and they're all lumped together. But as a whole, this is a spreadsheet that tells you, for each individual client, how much data they are storing in total, and then, towards the right, all the miners with which they're storing this data, sorted by size, with the percentages of how the breakdown works. This entire space is obviously sortable, and you can figure out what to do with it, if you are interested in figuring out, after the fact, how much a particular client has already used and how well they're honoring the decentralization principle, and so on.

On the other side, if you can open the miners CSV: the same analysis is done from the perspective of a miner, where basically I take all the miners that currently store Fil+ deals, I throw away any miner that has less than one terabyte of Fil+ data, and the remainder is, again, a CSV where we have the miner, their owner (all of those unknowns are things that I would love to fill in; I just don't have the information for them), and then, to the right, which clients are storing with this miner, again with percentages, just like in the other spreadsheet. So yeah, that's pretty much the entire repository. You can totally run this yourself, and it will regenerate the CSVs for you on the fly. And yeah, that's all.
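The per-client and per-miner breakdowns described above can be reproduced with a small aggregation pass. This sketch operates on a hypothetical in-memory list of verified-deal records (the field names are assumptions) rather than the live Glif node data the real script pulls:

```python
from collections import defaultdict

def breakdown(deals, key, other):
    """Aggregate verified deals into totals plus a sorted counterparty split.

    `deals` is a list of dicts like {"client": "f1aaa", "miner": "f01111",
    "bytes": ...}; the field names are assumptions for illustration.
    Returns {key_id: (total_bytes, [(counterparty, bytes, share), ...])},
    with counterparties sorted largest-first, like the CSVs described above.
    """
    per_key = defaultdict(lambda: defaultdict(int))
    for d in deals:
        per_key[d[key]][d[other]] += d["bytes"]
    out = {}
    for k, counterparties in per_key.items():
        total = sum(counterparties.values())
        rows = sorted(counterparties.items(), key=lambda kv: -kv[1])
        out[k] = (total, [(c, b, b / total) for c, b in rows])
    return out

GIB = 2**30
deals = [  # hypothetical sample data
    {"client": "f1aaa", "miner": "f01111", "bytes": 64 * GIB},
    {"client": "f1aaa", "miner": "f02222", "bytes": 32 * GIB},
    {"client": "f1bbb", "miner": "f01111", "bytes": 16 * GIB},
]
by_client = breakdown(deals, "client", "miner")  # the clients CSV view
by_miner = breakdown(deals, "miner", "client")   # the miners CSV view
```

The symmetry of the two calls mirrors the two spreadsheets: the same deal list, keyed once by client and once by miner, with per-row shares that make concentration on a single counterparty easy to spot.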
A
Awesome. I think this is also a great opportunity for us to do the usual plug: on this journey of increasing DataCap fluidity, transparency, and so on in the ecosystem, if you have any ideas, or if there are projects that you're interested in working on, please feel free to share your ideas with the community and get started on them. I know Philip at one point had mentioned that the Foundation also has some grants to help with some of this stuff. Megan or Clara, I don't know if you know what the latest is on that, but in case there's still funding available to help people who want to build interesting tooling in the ecosystem, I think it's still an option, and we'd love to encourage, and continue to encourage, that in the community.
F
Yeah, absolutely, we absolutely have funding available. I mean, truthfully, if there's anything that you're seeing that you think might be useful, that's how broad a net it is: feel free to come to us for a developer grant.
I
Yeah, so normally we give developer grants, as you guys are all aware, every quarter, and so right now is a good time for us, as we're thinking about our next wave of grants. So if there are new ideas that you guys have, this is exactly the right place; we're here to really make sure we can find the right tools. So definitely just contact us directly, or you can write to team at fil.org, which comes to all of us, and we can figure out from there how to make the best ideas available as RFPs.
A
Cool.
Thank
you.
I
don't
want
to
spend
too
much
time
on
this
issue
because
I
think
we've
been
having
quite
a
bit
of
conversation
and
slack
on
it
as
well.
But
if
you
do
have
an
opinion,
you
know
please
feel
free
to
hop
into
github
and
we
can
continue
providing
updates
and
upcoming.
A
Notary
governance
calls
all
right.
So
with
that,
let's
talk
a
little
bit
about
the
new
issues,
so
the
first
thing
I
sort
of
launched
them.
I
took
the
liberty
of
lumping
them
into
three
different
categories.
The
first
one
is
specific
to
this,
this
concept
of
like
improving
like
time
to
data
cap
for
clients
that
are
interested
in
getting
data
cap,
so
there
are
three
different
issues,
all
of
them
filed
by
andrew
hill.
So
I
think
the
easiest
thing
to
do
is
actually
just
open
the
floor
to
andrew.
A
If
he's
still
here,
if
you
want
to
chat
through
these,
that
would
be
awesome.
If
not,
I
can
do
that
as
well.
C
Yeah, sure. So the first one I already mentioned in chat; it's pretty basic. The Fil+ notaries are reviewing applications from first-time applicants, and especially if the applicant is an individual not associated with an organization, it's hard to have any rubric that's going to give them a substantial DataCap to use. That even counts for organizations that are new; imagine projects going through incubators, things like that. Eight gigabytes is what's given out by the Glif project, and all you have to do is verify with a GitHub account. It seems like eight gigabytes is nice, but not very meaningful for real experimentation and projects on Filecoin, especially if you're gonna use DataCap for things where you might want to try recurring deals or try different miners. So anyway, that's just a proposal to allow them (I don't even know if they'll be willing to) to grant larger DataCap for first-time users, sort of just lifting the first rung for those people, so they can then apply for larger DataCaps and use their distribution of that initial DataCap as credibility on the network. So that's that one. Do you want to stop for some discussion?
A
Sorry, yeah, I think we should. Okay, sorry, I'd like to change my mind halfway: it's probably better to do this issue by issue. Okay, so yeah, if anybody has any ideas or thoughts on this, please feel free to share.

One, I know you played around with the Glif verifier initially, and you found that the eight gigabytes was also a little too low for your testing. I know you're in the audience; if you think 32 would have been better, it'd be nice to hear that. If there are any other opinions, let's open the floor for 30 seconds.
G
E
Yeah,
I'm
curious
is
anyone
like
I
think
the
the
one
thing
that's
weird
is
like
in
the
original
notary
applications,
we
said
that
people
would
be
like
judged
based
on
their
like
predefined
strategy
and
how
well
they
execute
against
that.
It
feels
like
I.
I
don't
know
if
anyone
has
opposition,
but
it
does
feel
like
we're.
Most
people
at
least
are
sort
of
flagging
that,
like
it
would
be
better
if
we
could
increase.
E
And I'm also of that opinion. I've been talking to some of the Miner X folks about their minimum deal size, and especially if clients are supposed to store multiple copies of their data, say four copies, a two-gigabyte file will just not get stored, because it doesn't make sense for most people. So is anyone opposed? Maybe we have to define a process here for amending.
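The replication arithmetic behind this point can be sketched as follows. This is a minimal illustration; the four-copy and eight-gigabyte figures are the ones mentioned in the discussion, not fixed protocol parameters:

```python
# Effective usable file size under a DataCap allowance when a client
# stores multiple copies of the same data for redundancy.

def max_file_size_gib(datacap_gib: float, copies: int) -> float:
    """Largest single file a client can store `copies` times within the cap."""
    return datacap_gib / copies

# With an 8 GiB cap and 4 copies, only a ~2 GiB file fits, which is
# the kind of small deal size some miners won't bother accepting.
print(max_file_size_gib(8, 4))   # -> 2.0
print(max_file_size_gib(32, 4))  # -> 8.0
```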
E
Maybe we could get Jonathan Schwartz to file an issue saying "I'd like my strategy to bump up the size," and then, if someone who's not on this call has an issue with it, they could respond there. But it feels like, generally, I don't know if anyone is opposed.
C
So there is someone in chat who says: eight is fine; if you need more, apply for DataCap, otherwise it's an incentive to cheat, if you ask me. And this actually points to something I think I mentioned in my issue, which is, well, I'm not going to read it all right now, but...
C
I also agree with that sentiment: before we jump into this, I don't really know any strong analysis of what can be gamed here, of at what level this becomes too gameable. Is it 32 gigabytes or is it eight gigabytes? I don't actually know the distinction between the two in the eyes of the network, and so it may just take a little bit of justification from somebody, and that might be Jonathan Schwartz.
A
Yeah, I think that's a fairly valid point. I did see what you added to the issue as well: that it would take a lot of effort for somebody, even at 32 gigabytes, to make a significant dent in the network or do a lot of nefarious activity, because the bar is still relatively high, right? You'd need to have a...
C
Yeah, I did put that in the GitHub issue: you'd need 32,000 fake accounts to create a petabyte of deals if we raised the bar to 32 gigabytes. So for me, that doesn't seem incredibly gameable. In fact, we could possibly even go higher, and that might be why that analysis could be nice; maybe 32 gigabytes is cutting it short, or not at all. But either way, as somebody who's experimenting with non-verified wallets all the time, I feel like I would like a faster route.
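The Sybil-cost figure quoted from the GitHub issue can be checked directly. A quick sketch, using binary units (1 PiB = 1,048,576 GiB):

```python
# How many fake (GitHub-verified) accounts would it take to amass a
# petabyte of verified deals at a given per-account DataCap?

GIB_PER_PIB = 1024 ** 2  # 1,048,576 GiB in a PiB

def accounts_for_one_pib(datacap_gib: int) -> int:
    """Number of accounts needed to reach 1 PiB of DataCap."""
    return GIB_PER_PIB // datacap_gib

print(accounts_for_one_pib(32))  # -> 32768, the ~"32,000 fake accounts"
print(accounts_for_one_pib(8))   # -> 131072 at the current 8 GiB tier
```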
C
So I also want the renewable DataCap, but also just being able to get more on the first try would be really helpful for doing experiments.
A
Yeah, I think there's another angle here as well, which is that we can experiment, get it wrong, and then correct it. These things don't have that much permanence. If, for example, Jonathan Schwartz wants to try 32 because we're all sort of saying this could help developers or help testing, we could try 32 for a week, and if it looks like it's actually not good and not promoting good activity...
A
I think we as a community can also flag that and influence it down the line. But I do agree that the right next step is probably for Jonathan Schwartz to see if this is something he's interested in doing and, if so, to file an issue, because that would be fundamentally changing what his initial notary application looked like: I think he was granted DataCap specifically for automatically granting eight gigabytes.
C
I'll ping him on that issue, because I don't even know if he's aware of it; I was actually just looking for general sentiment. So I think that's good, and it's nice to see some people in chat have some concerns. Let's just explore it a little bit, and maybe next call we'll have another update. The next issue I had was 103, and this is about criteria for removing inactive notaries. I think it's just a bad...
C
...when they then don't move DataCap, and that could be for a lot of different reasons. So this one needs some more thought, but I just wanted to get some input here, in particular on notaries that are approved and then have open applications but don't grant a single DataCap allocation in some amount of time. It seems clear that they're blocking DataCap, and so I think we should be proactive on removing them. And then there are some reasons why that might happen where we might want to have ways for them to not get blocked.
C
There are the people you already mentioned on the call, whose notary addresses were actually messed up; they just need a way to get around that and get more time. But for the other ones that are just piling up issues on GitHub, I think it's a really bad user experience, and I think we should aim to unblock those users by moving their tickets or applications as quickly as possible to notaries that aren't blocking them.
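The inactivity criterion being proposed here can be sketched as a simple check. The record shape and the 30-day grace window below are illustrative assumptions for the sake of the example, not an agreed policy:

```python
# Flag notaries that hold approved status and have open applications
# but have made zero DataCap allocations within a grace window.

from datetime import datetime, timedelta

def inactive_notaries(notaries, now, grace=timedelta(days=30)):
    """Return names of notaries blocking DataCap under the criterion."""
    flagged = []
    for n in notaries:
        last = n.get("last_allocation")  # None if they never allocated
        if n["open_applications"] > 0 and (last is None or now - last > grace):
            flagged.append(n["name"])
    return flagged

now = datetime(2021, 3, 16)
sample = [
    {"name": "notary-a", "open_applications": 3, "last_allocation": None},
    {"name": "notary-b", "open_applications": 2,
     "last_allocation": datetime(2021, 3, 10)},
]
print(inactive_notaries(sample, now))  # -> ['notary-a']
```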
A
And, of course, the conversation can continue on GitHub after the call as well. Yeah, I agree. I remember there was also this topic that came up at some point about whether we just want to have tickets be reassigned between notaries, but I think the outcome of that conversation was basically that that is not necessarily good, assuming notaries are sticking to some of their original ideas of serving particular use cases or particular regions.
A
However, this at least gives a really clear path for a client who might feel like they're just stuck and not seeing traction on their application. Notaries are now going to be held to a bar of responsiveness in a way that enables clients to get DataCap that they may deserve at a more reliable speed.
C
Yeah, for me it's just that if we all want Filecoin to be successful, and this ends up being the first thing a user hits, it's a shame if it takes them three weeks to be able to use the network when really we want them using it in the first hours or day. So it's very related to issue 100, but going further.
C
So what are the next steps there? Because I'm even proposing criteria for that. This actually relates to 104, which is that maybe with new notaries we could set up these criteria before they even join, but what do we do now?
A
I think it's probably worth having the conversation in GitHub a little bit to ensure everybody's opinion is being heard, and so the tactical next step I would recommend would probably be to drop it into Slack and say, "hey, curious to hear if anybody doesn't agree with these criteria, or thinks these are not good criteria."
A
I think one of the interesting implications of what you were proposing would also be the removal, down the line, of notaries that haven't necessarily finished allocating all of their DataCap, and that's an interesting one, because then we have to... I think we do need to have a conversation on this: being a notary is not a full-time job, so what are the implications of being held to an SLA of some sort, for example?
E
It does seem like getting to the point where clients can get some DataCap within the first hour, or something similarly trivial, implies that we need other ways of verifying people automatically, which means you have to come up with more strategies for this. The GitHub one is a good one, but...
E
...we need more. And so that might be another way where we could change the balance of what manual review is required versus things that can be automatically reviewed, which is maybe something to think about too.
A
Sure. In general, I think there's just the theme of a little more transparency on what the application process looks and feels like for a client, and ensuring that that UX actually improves, so that clients who are worthy of receiving DataCap do get it in a reliable manner, and notaries are held accountable for serving as fiduciaries to the network and enabling the success of clients.
C
It'll give you a sense of the usability. So yeah, and then 104 is just a recommendation on future notaries, which actually gave me an idea for a third thing: they should add an anticipated response time and throughput to their application. Basically, they should say how much work they're going to put into being a notary, and how much...
C
...how many applications we can expect them to be able to deal with, because it may influence who we want to accept as notaries. And then the third one, which you maybe realize, is that we should include an anticipated start date, because some notaries may build new technology in order to deploy the notary they're imagining, and it may take them three months, but they're not going to build it unless they get approved.
C
But that can actually work; we can work with that and make it possible. So it's just extending the application to do those things. The first two criteria are just: how many applications are you going to be able to deal with, and how quickly do you think you can respond to each one? Then we can hold them accountable if, in some months' time, they have a giant bottleneck and they've only let two applications through, so they're not actually living up to their application.
A
Yeah, I think one of the things that's interesting about notary selection in Filecoin Plus today is that it's based on this very quantified rubric, right? Based on how you score on certain parameters, it spits out a number that effectively defines your ranking in a set of notaries, and we'd be adding in a parameter like this.
A
Actually, I'm wondering if another way to look at this is that maybe there's an expectation, like here's a maximum response time and a minimum throughput, and before you even self-select yourself into being a notary, you should be comfortable meeting these requirements. My fear is that otherwise it could negatively affect the fluidity of DataCap in certain regions versus others, or for certain use cases versus others, because that may then influence how quickly a certain notary needs to be topped up versus others, for example.
C
The only pushback we'd have there is around what kind of projects they're aiming to focus on. If you're aiming to focus on, just making something up, early-stage startups with one-to-five-terabyte DataCap, you might imagine that you can get 300 of them a year, or a thousand of them a year. But then somebody else who wants to focus on hackathon teams or school groups, or things like that, maybe they're imagining thousands of smaller DataCap grants. So they're not...
A
So the next steps for adjusting the rubric are actually very similar, where, instead of opening a discussion thread, you'd actually just create a modification issue where you'd propose to change the rubric, and then we can have a conversation on the tactical changes. I think once we hear maybe a few more opinions, probably by the next governance call, anybody can help with what the implementation might look like, because people might have different ideas on how that gets represented on the rubric or on the application.
A
I think we could just align on what the parameters are that we want to include. It sounds like there's at least two, if not three, that we've mentioned today, Andrew, and there are probably more as more conversation happens. And then there's the representation on the application and the rubric. I sort of view that as the more tactical question, once this is resolved and once we're aligned: does it get represented in the scoring...
A
...even, or is it just part of the application and the expectations that we set up? Because there's a different angle to this, where I don't know if we should disqualify somebody for picking a use case that requires serving at a different rate, as long as they're serving a diverse use case, or able to adequately provide DataCap for a different kind of use case, and do it in a way that is adequate for that use case.
C
I could tell you one way. Sorry, John, one last thing. One way I'd imagine you'd add this to the rubric is to say: okay, if you imagine a throughput of one deal a week and an average deal size of one terabyte, then don't give them a DataCap of over 52.
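The "52" here is a year's worth of the declared throughput. A minimal sketch of that sizing rule, assuming a 52-week allocation round:

```python
# Size a notary's DataCap grant from their self-declared throughput,
# so the grant covers roughly one allocation round and no more.

def datacap_tib(deals_per_week: float, avg_deal_tib: float, weeks: int = 52) -> float:
    """DataCap in TiB implied by the declared throughput over one round."""
    return deals_per_week * avg_deal_tib * weeks

print(datacap_tib(1, 1))  # -> 52, the figure quoted in the discussion
```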
A
But I think one of the interesting things that does, also, is that it gives a lot more predictability to the ecosystem on when we need to do new elections, for example. You can now foresee things better: if we have data on how quickly notaries think they're going to run out of DataCap, that makes it a lot more predictable and a lot more fluid.
E
Yeah, I was just going to say, I think one of the things that is challenging, and I don't know, maybe we are at a place, but I think it's so early that I don't know if many notaries can predict what their average throughput should be.
E
I do like the idea, though, of saying how many hours, or some sort of metric of how much time you can spend to review, because the other thing that we don't have to keep the way it is, is how we think about... sorry, this is just New York in the background... how we think about the number of notaries per region. So another way of looking at this is:
E
For this problem, I'm hesitant to say people should be forecasting their DataCap, because I don't know if anyone can do that really well until we actually see the requests that are coming through. But I do think that at least the sizing of how many notaries there should be could be based on how much effort people can put in, so that we have some distribution.
A
Cool, yeah. I do think there's plenty more conversation behind this, but I think the general direction certainly makes sense. We're two minutes from wrapping up, so I'm going to continue pushing the conversation to the next topics. Andrew, thanks for sharing. I think you had the question of tactical next steps for the rubric change; are you good on that?
A
Okay, perfect. So then there was this issue on actual decision-making and governance, where somebody proposed having a tribunal of Filecoin Plus stakeholders. This issue was actually created under Lotus, and so the ask to the person that filed it would be to also open it under the notary governance repo, so that we can have a discussion there. But yeah, if you're on the call and want to talk it through...
A
It would be great. If not, the ask to those of you on the call would be to take a look at it and see if you have an opinion, if you find it to be interesting. I'm going to write a comment on this real quick to see if we can migrate the conversation over to the notary governance repo. But I did think it was an interesting conversation and worth having as we think about ways in which we can continue to make changes and iterate in the ecosystem.
A
Quick TL;DR on this: it's basically to have some sort of process by which we could actually make faster decisions and then implement those decisions in a way that is, quote-unquote, fair.
A
Cool, all right. And then, last up, I think we already did a quick plug on whether there are any ideas for additional tooling and dashboards. The only thing worth adding is that Jonathan Schwartz opened up an issue where DataCap was accidentally granted via the Glif verifier as he was testing something, and so he's written up a quick description of what happened, why it was an issue, and why it won't be an issue going forward.
A
So for those of you that are curious and want to continue contributing to additional transparency in the system, and helping resolve issues in a way that is good and productive, I would recommend taking a look at issue 102 and figuring out some next steps there.
A
With that, I did want to ask, as we have time: are there any burning topics that anybody wants to bring up? It would be great to hear them; if not, we'll call it. So I'll give it a couple of minutes in case there is anything anybody wants to mention. For those of you that could make it, thank you so much for being here. It's awesome to see the community and the engagement continue to grow bi-weekly over bi-weekly.
J
Actually, I do have some questions. I just got on; I thought I got mixed up, so you started an hour ago?
A
I think it's because of daylight savings time; we should have anticipated this. I think it's completely fair, yeah, the weird clock switch just as we're wrapping up, no worries. Of course this will be recorded and uploaded, but yeah, if you had any topics that you want to have some discussion on, it might be worth raising them in Slack, just opening up a few threads, and we can have people engage there. Or, if there are substantive changes to the ecosystem, as usual, I would recommend just opening up an issue in GitHub.
J
Okay, hey JB. I posted some questions on LinkedIn for you two weeks ago, if you could kindly respond to them.
A
I can go in and see; thank you for flagging that.
J
Okay, so can I have your LinkedIn?
A
Sure, yeah. If you just look up Deep Kapoor, you'll find me on LinkedIn, happy to respond to stuff there. But...
A
...if it relates to Filecoin Plus, you know, I'm much happier to have the conversation in Slack as well. I have the same name in the Filecoin Slack; you can find me there, or we can just chat in the Filecoin Plus channel as well.
J
Okay, okay, I'll post a question for you now on this thing, maybe you can...
G
Thirty seconds of your time: as you're probably aware, Philip is out of DataCap at the moment. So is there any notary around here who can take over from him for my DataCap? Because I'm waiting at the moment, and I don't want to spam a GitHub issue to 30 notaries. So if somebody is willing, can you please raise your hand, so I know who to contact tomorrow when it's daytime again here?
A
Yeah, I guess that's a great question for the notaries, if there's anybody that wants to nominate themselves for receiving another application. I know Neo mentioned that he's certainly down for more applications to come his way. Yes, that's happening.
A
All right, folks, we're over the hour, so I'm going to call it here. Thank you so much for coming. It looks like some people are still joining the meeting, so I do think the time swap messed some stuff up, but hopefully things will be more consistent, at least for the next few months, before we go through another one of these changes. Have a wonderful week ahead of you, and looking forward to catching up at the next one of these. Thanks so much, take care.