From YouTube: Filecoin Plus Day: May 2021 Notary Governance Call
A: We're gonna kick off our Filecoin Plus notary governance call. This happens every two weeks. We wanted to do one at Filecoin Plus Day specifically, so those of you that are here get to experience what it looks like and the kind of conversation that we have on a bi-weekly basis. Of course, it's an open call, so any of you, feel free to raise your hand or participate via chat. If you have any thoughts or ideas, feel free to share.
A: We like to do these, as I mentioned before, at alternating times, so we don't always keep people awake at midnight in different parts of the world, but today's is at this time slot. So thanks for bearing with us; this is not the usual time slot that we use.

A: The agenda for today's Filecoin Plus notary governance call is to get an update on how data cap is flowing, discuss a couple of open issues, specifically the topic of notary elections, and then have a conversation on any of the topics that are flagged.
A: I'm going to do my best to do this one a little bit quicker; I'll try to keep it to roughly 20-25 minutes. And then, for those of you that are hanging around, we have one last talk from the famous ZX on Filecoin Plus as an economic lever for a more useful network, so I'm pretty excited for that as well, and I want to make sure I give him enough space to share some of his ideas. Cool, so: data cap flow.
A: There were actually 11 allocations in the last week when I checked yesterday, a very busy week, and roughly 20 in the last two weeks through the GitHub repo. That's almost double the prior two-week period, resulting in greater than 50 tebibytes of data cap being allocated via GitHub in the last week, which is awesome.
A: Five new applications came in, and 36 currently open applications for notaries to address are in the repo. That leaves us with the following distribution: starting out with about 1,950 tebibytes of data cap to begin with, 1,390 of it is still with notaries. 1,560, sorry, 560 of it has now been granted to clients, which represents just shy of 30% of all the data cap, and of that, about 40% has been used by clients in deals with miners. The current average time to data cap is roughly four and a half days, almost five days. So yeah, that's the latest from Andrew's analysis on the Textile side.
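As a rough sanity check on those figures (assuming, per the ~1.92 PiB total mentioned later in the call, that the units are tebibytes):

```python
# Rough arithmetic check of the datacap figures quoted above (all in TiB).
total = 1950          # datacap granted to notaries to date
with_notaries = 1390  # still held by notaries
to_clients = 560      # granted on to clients

share_to_clients = to_clients / total   # "just shy of 30%"
used_in_deals = 0.40 * to_clients       # ~40% used by clients in deals

print(f"{share_to_clients:.1%} of datacap granted to clients")  # 28.7%
print(f"~{used_in_deals:.0f} TiB used in deals with miners")    # ~224 TiB
```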
A: Here is a quick snippet of last week versus this week in terms of manual tracking of time to data cap through GitHub issues. You can see here, for specific notaries, what the average days to grant currently looks like.

A: And again, these are notaries who are spending their extra time volunteering to serve the network, so we're super grateful for those of you that are there. This is not an intent to call anybody out.
A: The main thing I want to show here is that we're actually seeing a pretty nice downward trend across the board in how long it takes to get a grant, and that's awesome. Clients are forever grateful for the time that you're putting in as a notary to ensure that their needs are met in a minimal-friction experience on Filecoin.
A: So thanks for the hard work that you're doing. We're going to continue tracking this as an effort to see ways in which we can optimize and make the experience better. This data should be read as "where is there room to improve the tooling and the experience, both for clients and for notaries, to make this process smoother", not as "you're a slow notary" or "you're a fast notary". So, yeah.
A: As Andrew also mentioned in his talk, if you have ideas for ways that we can improve the processes and the standards in this program, it would be awesome to have that conversation. Cool. So with that, let's talk a little bit about issues, because there is a lot of interesting, fun stuff to talk about in today's call. For those of you that have been coming to the calls, you hear me say this every time, but that's because every notary governance call has some awesome and fun stuff to talk about.
A: So, without further ado: for the following issues, we are making progress on the implementation. Issue 109 was raised to add an auto-close bot for client applications that haven't had any traction in 20 days. Previously we were exploring adding this functionality to the bot that already exists; we have pivoted away from that and are now using a standard GitHub Actions flow for closing out stale issues, so you should see this starting this week or so.
A: Issues that don't have any traction from clients or notaries in 20 days will automatically be closed out. This will not only clean up stale issues in the repo but also serve as yet another notification to both notaries and clients to make sure that they are addressing applications as they come in.
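The auto-close rule described here is simple enough to sketch. This is a hypothetical illustration of the logic, not the actual workflow the team uses, assuming each application issue carries a timestamp of its last client or notary activity:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

STALE_AFTER = timedelta(days=20)  # close after 20 days without traction

def should_close(last_activity: datetime, now: Optional[datetime] = None) -> bool:
    """True if an application issue has had no client/notary activity
    for at least 20 days and should be auto-closed."""
    now = now or datetime.now(timezone.utc)
    return now - last_activity >= STALE_AFTER

# Example: last touched 25 days ago -> closed; 5 days ago -> stays open.
now = datetime(2021, 5, 15, tzinfo=timezone.utc)
print(should_close(now - timedelta(days=25), now))  # True
print(should_close(now - timedelta(days=5), now))   # False
```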
A: Issue 104 was on the topic of adding an anticipated response time and throughput, so that client applications get sped up. In previous calls we discussed setting a goal of being responsive within three to five days, more or less, so I have a PR adding an expectation to the set of objectives and responsibilities for notaries, saying that you should try to be responsive within three days. Again, this is not a hard-and-fast rule, but it is guidance that we have agreed on as a community.
A: This is especially important for those of you that are here considering joining as a notary: we are looking to find and promote people that are able to be active on the network, and part of that should be responding to clients as fast as possible. So I have a PR open. I didn't get any specific feedback on it. It's very simple: it just adds this verbiage to the set of objectives and expectations.
A: So if anybody sees anything here that doesn't look right or doesn't look good, please let me know; otherwise I'll probably be merging that PR tomorrow, because I think it's been open for a couple of weeks for feedback, and so far it looks like people are generally okay with it.
A: So let's talk a little bit about some of the topics that require a little bit more discussion. For those of you that have been coming to the calls, you're familiar with issue 94; for those that are new here today, I'm going to give a brief introduction to what it is.
A: The goal here: there are a lot of projects out there that are interested in using the Filecoin network that operate at a much larger scale than what's available in Filecoin Plus, the world of data cap, today. A good example: Jonathan Dotan just presented on Starling. They have multiple pebibytes of data that could be brought onto Filecoin, and granting them data cap would probably be a really good use of data cap, because that's a legitimate use case: an open, publicly viewable data set.
A: However, the total amount of data cap that was given to notaries was only about 1.92 pebibytes, and that's obviously not going to be enough at the scale that we're looking at. So this was an initial proposal from Julian, who is a notary, as well as Jonathan Victor, around finding a way in which we could create a path for clients that have exceptionally large data sets to basically start applying independently of the regular notary flow.
A: A client that has such a large data set could apply for a large amount of data cap specific to that data set, and the way it would be dispensed is through a multisig assigned to that client: a set of at least seven notaries would sign on to becoming signers on that multisig.
A: There would be throttles on how quickly that data cap would actually be dispensed, and a lot of transparency and clarity on how that client is planning to use the data cap. But that would result in a flow whereby we could start supporting much larger data sets through this new multisig concept, where existing notaries can participate, and we can start introducing much more data cap, or liquidity, into the system.
A: There are a couple of interesting challenges here, so let me briefly walk you through what the flow looks like, and then we can talk about some of the changes that I'd like to continue proposing. The flow today: we actually have another repo that we just stood up, called something like "filecoin plus onboarding large data sets", with the intent that it would be tracked similarly to the client onboarding repo, but be specifically for these particular kinds of applications.

A: In this repo, a client would submit an application. In the last governance call we covered what that application contains and its complexities, so if you were around for that, check it out.
A: If not, I would also recommend that you take a look at the recording of that call if you're interested in seeing what we discussed. But yeah, the flow, effectively, is that a client would fill out that application template, which is a rather extensive one, and it would be an open application for anybody in the community, and any notary, to comment on and do due diligence on.

A: The intent is that, over a two-week period, we as a community ask all the questions required for us to vet this data set and this client as something that we'd like to support in the Filecoin Plus community and ecosystem. If the community is in agreement, and seven notaries have opted into becoming signers, then we'll kick off the process of creating a multisig, which the root key holders will help establish, that can serve as a dedicated data-cap-requesting faucet for this particular client.
A: There's also a need to have a lead notary of sorts, to whom the issue will be assigned, who will manage the primary communication with the bot that helps with assigning data cap to the client. That's more of a pattern of interaction on GitHub that some of you are already familiar with, but obviously there will be some instructions in a README for how we can make this work.
A: The implementation team has been doing some pretty good work to make this possible, and so the intent would be that a client would be able to request data cap from this faucet as they need it. Actually, let me just pull it up. The throttles that we discussed at the previous governance call were around having the first allocation be the lesser of five percent of the total data cap requested by that client for that data set, or half of what is expected to be used in a week. The second allocation would be the lesser of 10% or what is to be used in a week, and the third allocation onwards would be the lesser of 20% or what could be used in two weeks.

A: The intent is that, for each subsequent allocation, the client has already used 90% of the previous allocation, and so we as a community will have a good amount of visibility into what that client is attempting to do, and enough time to validate that the client is actually doing what they claimed they would, and that we're happy to continue supporting them in this venture of allocating a large amount of data cap.
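Those throttle rules can be written down directly. This is a minimal sketch of the schedule as described on the call, assuming all the "lesser of" comparisons are in the same unit (e.g. TiB); it is not the actual bot implementation:

```python
def allocation_cap(round_num: int, total_requested: float, weekly_usage: float) -> float:
    """Max datacap dispensable in a given allocation round, per the
    throttle schedule discussed (units are arbitrary, e.g. TiB)."""
    if round_num == 1:
        # lesser of 5% of the total request, or half a week's expected usage
        return min(0.05 * total_requested, 0.5 * weekly_usage)
    if round_num == 2:
        # lesser of 10% of the total request, or one week's expected usage
        return min(0.10 * total_requested, weekly_usage)
    # third allocation onwards: lesser of 20%, or two weeks' expected usage
    return min(0.20 * total_requested, 2 * weekly_usage)

def next_allocation_allowed(prev_allocated: float, prev_used: float) -> bool:
    """A subsequent allocation requires ~90% of the previous one to be used."""
    return prev_used >= 0.9 * prev_allocated

# Example: a hypothetical 2,000 TiB request with ~60 TiB/week expected usage.
print(allocation_cap(1, 2000, 60.0))  # min(100.0, 30.0)  -> 30.0
print(allocation_cap(2, 2000, 60.0))  # min(200.0, 60.0)  -> 60.0
print(allocation_cap(3, 2000, 60.0))  # min(400.0, 120.0) -> 120.0
```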
A: It's a private repo right now, because we're doing a bunch of testing, but it's part of the project. It has a brief explanation of what this is and how it works, and the intent is to get it into a shape that we're all in agreement on and make it public, hopefully in the coming days, so that we can start serving applications. I think the things that we need to discuss in order for that to happen are the following.
A: The first is that we had a great suggestion from the notary from Fenbushi, thank you for that, to improve the actual application. So the draft application that was proposed in here will probably have a couple of changes. I've taken the recommendations and think that at least these two should be integrated into the application, so I'll probably update the draft for that. Thanks for sharing; that's awesome.
A: If any of you want to take a look at the application as well, please head over to this issue; I can drop the link in the chat. Let me get this... yeah, it would be good to have any of your ideas on the data we'd like to collect from clients as part of this flow.
A: The second thing: of course, a client could apply multiple times, but at least initially this would give us some amount of control over the rate at which data cap is being dispensed and let us prove out a bunch of these things without being overly cautious, or not cautious enough, to effectively find the balance. And then we'd add in a total throttle of about 50 pebibytes, which would effectively be a forcing function: once 50 PiB has been distributed through this process, we should have a conversation in an early governance call where we say, you know, here's how we think it went.
A: Did it go well? What are the things that we can improve? What feedback are we receiving from people, and how can we actually improve the process? So yeah, that's the second thing. And then the third thing I was going to propose today: we had previously talked about this already, so this shouldn't be news, but the intent is that those seven notaries should be distributed across regions. We previously talked about having a notary from every region.
A: I was instead going to propose that those seven notaries should span at least three regions. That's mostly because we don't have notaries in every single region today. I'm also looking at, you know, the continents of South America and Africa: if you've got any notaries that are interested, come on over. But yeah, as an effort to ensure that we're unblocked, I think we should restructure this to span at least three regions.
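To make the constraint concrete, here is a hypothetical sketch of the check; the notary names and region labels are illustrative, not an official list:

```python
from typing import Dict

def valid_signer_set(signers: Dict[str, str],
                     min_signers: int = 7, min_regions: int = 3) -> bool:
    """signers maps notary -> region; the proposal requires at least
    seven signers spanning at least three regions."""
    return (len(signers) >= min_signers
            and len(set(signers.values())) >= min_regions)

# Hypothetical signer set: 7 notaries across 3 regions -> valid.
proposed = {
    "notary-1": "North America", "notary-2": "North America",
    "notary-3": "Greater China", "notary-4": "Greater China",
    "notary-5": "Europe", "notary-6": "Europe", "notary-7": "Europe",
}
print(valid_signer_set(proposed))  # True
```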
A: Okay, I've talked a bunch; this is the point where I usually keep quiet and see if there are any opinions hanging around in the community. If anybody has any thoughts on this, I would love to hear them; again, feel free to raise your hand or drop something in chat. I'm just going to pull up the chat and see if there are any interesting comments.
A: I think we should just make that repo public as we continue to do testing, and it might be good to actually just open up applications, because we've built in this construct of having multiple weeks to do due diligence on an application anyway. So, assuming that we're generally on board with this, I'll probably post these updates to issue 94, see if there are any reactions in the next 24 or 48 hours, and ping a few of you in the Filecoin Plus channel, the people that did have opinions on this in the past, to make sure we're on the same page. Then, based on that, we should probably open up applications. No action needs to happen on those applications other than communication and due diligence, so for the next two weeks or so, hopefully they'll foster some good conversation.
A: This holds a lot of promise. I want to thank Julian and the other community members that have partnered on building this, because it's going to unblock a lot of larger projects from having to go through the manual flow and reach out to multiple notaries, and it will also prevent notaries from running out of data cap as fast as possible just because they found a good client to support. Cool. Next up is this new issue that was filed this week by landsat file20.
A
I
don't
know
if
you
are
in
the
room.
If
you
are,
can
you
raise
your
hand
because
we'd
be
more
than
happy
to
have
you
talk
through
it?
If
not,
I
can
give
a
quick
summary
and
for
those
of
you
that
are
watching
for
the
first
time
oftentimes.
What
happens
here
is
that
somebody
has
written
an
issue
that
they
want
to
discuss
and
they're
at
the
call
and
then
they're
the
ones
that
drive
the
discussion.
So
you
don't
have
to
be
stuck
listening
to
my
voice,
the
entire
time
yeah.
So.
B: I just wanted to say that I fully agree with what you just said, and I think it's a very good proposal. I'm just wondering if we should... well, I think we should start with this, and in parallel we should have another issue open for the audit piece. With these large customers we will probably have to track, and to decide which tool we're going to use to track and share, how the data cap has been sent, I mean spent, because there would be multiple notaries working with the same client.

B: I'm just wondering, I mean, whether we should have a separate one for how we track and how we audit. We know that we need to dry-run some audits, because nobody has done that before, so maybe this could be the good occasion. But not to slow down the process: I think we should start with this and just open another issue for the auditing and tracking piece.
A: Yeah, that makes sense. Okay, so I'll probably write up a new issue based on that, and then, Julian, you can put any ideas that you have on it. I have a few in terms of things that we should be auditing, and I think filplus.info itself has become a lot more sophisticated, per the presentation that Lara just gave, and the dashboard that I've been working on should help with some of those things as well.
A
So
probably
using
these
in
combination
plus
like
new
tooling
that
might
need
to
be
built
is
a
good
place
to
start,
and
that
can
be
an
ongoing
thing,
especially
if
we're
running
this
more.
As
like
a
scoped
experiment,
we
probably
have
an
opportunity
to
run
it
that
way,
yeah
so
that
that
makes
sense
to
me.
I
don't
know
if
there's
any
other
reactions
or
thoughts
to
that
one
from
reba,
which
is
post
auditing,
is
the
data
set
still
public
and
achievable
by
anyone
yeah?
That
is
a
really
good
point.
A: Julian, to your point, it might mean that the lead notary tries to do a retrieval, or finds somebody in the community that can do a retrieval, once every week or every couple of weeks. There are also efforts to automate some of these things in different projects, and so we could probably leverage those as well.
A: There's a question from Nelson: are large data cap clients like Starling being assigned to a specific notary, or getting priority for data cap? Great question. Today, the flow is that a client, regardless of their size, would basically pick a notary and then apply to that notary. With this new flow, if they're looking for a data cap amount that's greater than 500 tebibytes and less than five pebibytes, which is, you know, a massive data set, they would instead apply directly to the entire pool of notaries. The pool of notaries would then pick a subgroup, effectively: seven notaries would opt into being signers on a new address that would serve as the dispensing point for data cap for that specific project. So, in a way, they would be getting priority and have a specific notary for them, but it wouldn't be an individual person; it would actually be a multisig address on chain.
A: Okay, I'm going to briefly cover this issue, and for those of you that have thoughts, please drop them in chat or head over to issue number 119. The suggestion here was having a mechanism for removing data cap from a client and then potentially returning it to the notary. This is really interesting; we've talked about it in previous calls as well, as a way of having retribution options for audits and for disputes that are filed.
A
We'd
want
to
have
like
a
mechanism
whereby
either
the
data,
at
least
the
data
cap
is
removed
from
the
client
and
then
in
some
cases
we
may
want
to
bring
it
back
to
the
notary
as
well,
because
in
the
case
of
like
the
the
private
keys
are
lost.
For
example,
you
probably
want
the
note
3
to
be
able
to
reallocate
that
data
cap,
so
I
did
some
digging
into
the
actor
implementation.
A: It seems like there's no easy way to do this, but it can be done. So if we as a community feel aligned that this is something we want to build out, then it'll probably require a FIP, and then working with the individual implementation teams, which we've done before and will probably do again. So I would request: if you've got specific thoughts on this, head over to issue 119 and drop a comment there; that would be great. With that, I wanted to spend a little bit of time talking about notary elections.
A: We ended up in a situation where North America and China had more than three notaries, but every other region had only three, and then we also had a few notaries who stepped down because of the time commitment, so Europe was left with two, for example. Given that, and the desire to have more data cap, more activity, and faster responsiveness to clients, I think as a community over the last weeks we've agreed that we want to increase the number of notaries.
A
So
in
the
last
call,
I
proposed
that
we
do
five
and
we
had
some
interesting
conversation
where
I
believe
it
was.
It
was
emma
from
the
coda
project
who
mentioned
that
it's
better
to
anchor
around
the
amount
of
data
cap
that
we
want
to
give,
rather
than
just
the
number-
and
I
think
that
was
echoed
by
andrew
and
megan
and
a
couple
others
as
well,
and
so
I
did
some
analysis.
A
I've
detailed
this
analysis
in
issue
117
very
extensive
lots
of
tables
if
you're
interested.
Please
excuse
me
if
I
made
any
mistakes
and
please
feel
free
to
correct
me,
but
the
intent
here
was
effectively
to
say
you
know
for
the
previous
round
of
notary
elections
like
how
is
datacap
actually
allocated
and
how
is
it
used?
So
how
is
it
used
across
regions?
A
Based
on
that,
I
made
an
assumption
that
we're
probably
going
to
give
like
grant
at
least
the
amount
that
we
had
before
so
say
that
we
target,
like
a
total
amount
of
five
pebby
bytes
of
data
cap
with
notaries
after
this
election
cycle.
Just
as
an
example,
and
we
want
to
ensure
that
you
know
europe
has
twenty
percent,
the
greater
china
region
is
at
least
forty
percent.
North
america
is
20,
the
rest
of
asia
also
has
10,
and
then
hopefully
we
start
to
see
presence
in
africa
oceania
and
south
america.
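Taking the hypothetical 5 PiB target and the example percentages above at face value, the per-region targets work out as follows (a worked example, not the actual figures from issue 117):

```python
TOTAL_PIB = 5.0  # example target: total datacap with notaries after the election

# Example regional shares from the discussion; the remaining ~10% would be
# headroom for new regions such as Africa, Oceania, and South America.
shares = {
    "Europe": 0.20,
    "Greater China": 0.40,
    "North America": 0.20,
    "Rest of Asia": 0.10,
}

for region, share in shares.items():
    print(f"{region}: {share * TOTAL_PIB:.1f} PiB")
# Europe: 1.0 PiB, Greater China: 2.0 PiB,
# North America: 1.0 PiB, Rest of Asia: 0.5 PiB
print(f"unallocated: {(1 - sum(shares.values())) * TOTAL_PIB:.1f} PiB")  # 0.5 PiB
```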
A: I've had conversations with people, especially in Australia, that are interested, so I know that we'll start to build more geographic distribution, which is awesome. Yeah, based on these numbers, and I want you to look at the first four columns here, if we take the target data cap availability and look at the current existing data cap, we get these numbers for what we'd want to ensure gets allocated in this round of elections.
A
Again,
I
would
flag
that,
like
clients
from
different
regions
have
applied
to
notaries
in
other
regions
and
that's
also
a
fine
way
to
go.
So
this
is
not
like
a
massive
blocker
in
any
way,
but
having
this
as
a
as
a
target
seems
to
be
something
important
that
the
community
is
voiced
over
the
past
few
months,
and
so
this
is.
This
is
the
direction
I
think
we
should
go
in.
I
would
love
to
hear
your
thoughts
from
existing
notaries
that
had
opinions
on
this
as
well
before
we
proceed
with
the
next
steps.
C: Hi, okay. I'm quite clear about the rules about the data caps. So, including Greater China, it means another two notaries will be added in Greater China, so in total it will be at least five, yeah?

A: At least, yes.
C: If not, up to seven?

A: Yeah, yes, the maximum is seven.

C: So, is there any information about the way clients would like to allocate it, for a big data cap? By big, I mean more than 10 TB.
A: For that, I would refer to the discussion we were having on supporting large data sets with issue 94. So maybe what we need to do is discuss bringing the definition of "big" down to 100 tebibytes instead of 500. Do you find that there are a lot of clients that are at 100 tebibytes, or at a few hundred, but not at 500? Or are most of them looking for more than 500?
A
500
sounds
good,
so
for
that
we
will
have
this
individual
flow,
and
so,
if
a
notary
qualifies
to
this
round
of
individual,
like
notary
elections,
they
can
participate
on
the
multisig.
That
would
support
that
flow,
and
so
for
those
clients,
I
would
say
they
should
go
down
this
path
and
then
for
clients
that
need
less.
Like
you
know,
single
digit
or
double
digit
debit,
bytes
or
data
cap,
they
would
go
through
this
set
of
five
to
seven
notaries
yeah.
So
I
think,
with
that,
like
we
should
probably
open
up
applications.
A
I
think
that
you
know
we've
talked
about
this
for
a
couple
of
months
and
in
the
last
call
it
was
basically
that,
like
the
timeline
was
sort
of
plus
one
to
say,
like
may
11th
onwards,
we
should
open
applications.
So
what
that
means
is
for
those
of
you
that
are
interested
in
being
a
notary
head
over
to
this
repo
open
up
a
new
issue
and
choose
new
node.js
application.
A: That already has a template for everything that you need to fill out, and you can look at closed applications to see examples of what that looks like. This would be a four-to-five-week process from here on out to elect the next round of notaries.
A: I do want to flag that if you're an existing notary and you want more data cap, then you do need to apply again; that was something we decided in a call about a month ago. But if you are an existing notary with data cap and you don't want a new allocation, then you can just continue as you are. We'll probably take two weeks for the initial scoring, and then we'll have results announced in the following notary governance call.
A: I think it's good. We should do this; we should get more notaries. Any additional thoughts or comments in the last couple of minutes, before we hand this over to ZX? Julian?
B: [inaudible]

A: ...out the door? I don't think that's the case. I think the only complication this may create is that FIP 12, I believe it was, the one that solved the data cap top-up thing, still hasn't been released in any of the implementations yet. I think that's intended to go out in June, so if it still hasn't been released by the time the next round of notaries need to be confirmed, then all the existing notaries will probably need to get a second address notarized on chain. That's fine!
B: Okay, cool. And another thing: I'm very happy that we will have more notaries, but I think we should ensure that we are all aligned. So do we plan to do trainings for the notaries, so that we all apply data cap the same way, or something like this?
A: That's awesome, that's a great idea. I know that Andrew was thinking about this as well, in terms of onboarding notaries a little bit better and more easily, and Neo has some great ideas, Fenbushi has an awesome write-up, and Dr. Anne had some good ideas too.
A: It might be good if, in the next notary governance call, we spend some portion of the time discussing how we want to improve notary onboarding. Maybe we can come up with a set of resources, or we can schedule a future notary governance call, or even a separate meeting, for all the existing and new notaries to get together and talk about processes.
D: There's a question in the chat, so we got a question about applying for new data cap. As Fenbushi, we actually have allocated, I think, 130 terabytes of data cap, and the total amount we have is about one petabyte, right? So right now we actually want to apply for more data cap. The reason is that some of the clients we have talked to actually want more data cap, though their requirements may not be over 500 terabytes.
D: But that's still too big, because, per the previous guideline that we put on GitHub, we set the maximum per application to be 50 terabytes, which is five percent of the total data cap allocation.
D: So if we want to approve this kind of request, from 100 terabytes up to 200 or 300 terabytes, that would be too much for us. So for this new round of applications, should we apply for more data cap, even though we still have lots of data cap remaining? Yeah.
A
That's
a
really
good
question,
so
I
would
actually
ask
like
for
these
clients
that
are
at
like
100
to
200
debit
bytes.
Do
you
think
that
they
could
come
through
this
process
of
like
supporting
the
larger
data
sets
like
does
that
comply
here
like?
Is
it
all
like
open
massive,
like
publicly
retrievable
data,
because
maybe
the
answer
is
that
we
change
the
threshold
for
using
the
large
dataset
solution
instead.
D: I think most of these clients won't apply for that. Most of the companies we talk to are listed companies, so they still have lots of considerations around, you know, making their databases public.

A: Interesting. Okay.
A: Yeah, so I would say in that case the correct path is that you should probably apply again. We should probably have a conversation in the next governance call about the maximum amount of data cap we want outstanding with a notary, so maybe, instead of another pebibyte, we can discuss whether the right path, based on what the community feels, is 500 tebibytes or something like that.
A: Thank you. So I think, for now, what we should do is start taking applications. Of course, we've got another governance call where we can discuss items here as well, if needed, towards the end of the application window, in case we need to change anything, and of course we've got the Fil+ channels on Slack where we can have this conversation. And so, yeah: Megan?
E: Filecoin Plus community: official notary elections are now open. Please apply! Okay.