From YouTube: Filecoin Plus - April 27 Notary Governance Call
Description
Filecoin Plus governance call in which we discuss the upcoming Fil+ day (May 11), recent DataCap allocation trends, support for clients looking to onboard very large datasets, upcoming Notary elections, as well as a review of the open issues.
A
Welcome, everyone, to the governance call for April 27th. Let's get started. Today is, as usual, relatively action-packed; there's a lot going on and a lot to talk about. A couple of things we'll make sure to cover today: Fil+ day, a quick update on the flow of DataCap as I've been doing in previous calls, and a discussion of the open issues. This time around there's actually only one issue that's really open for discussion. Many of the other issues are lined up closer to implementation, so we'll have slightly more tactical discussions than we usually do, and then the big item is notary elections. So yeah, lots to get through. We're starting out with Fil+ day, just so you guys are aware.

We had to reschedule it to May 13th so that we could get all the event support we needed for the recordings and things like that. The initial date was May 11th, a Tuesday, because we wanted it to replace the governance call for that week, which is actually the next governance call. At this point it will be on Thursday instead of Tuesday, so I'll put out a bunch of reminders as well so people are aware, but we'll probably just replace the notary governance call that week.

Part of Fil+ day will have a significant chunk dedicated to a community showcase, which should be open to anybody in the network, regardless of what role you play. Andrew had this awesome suggestion of effectively providing a forum for notaries who are interested in sharing more about their experiences: how their processes have changed over time, as well as learnings they've had from the surrounding elections and the standard DataCap allocations. I think this would be an excellent place to do it.

We're considering running this more as a moderated open-floor type thing, open for 30 or 40 minutes or so, where people can show up, share their screens, and share their interesting learnings and takeaways. If this is something you'd be interested in doing, Andrew, I assume you're in, so I have you on the list, but for others that would also be interested in sharing, please reach out to either me or Megan so that we can organize and curate it and make sure there's enough space. Of course, this is also relevant for clients and miners who might have interesting learnings, or if you've participated in similar programs around generally contributing towards Filecoin's usefulness, such as Slingshot.

This would also be a good opportunity for you to talk about the interfacing between those things and your takeaways and learnings. At least, that's the approach we're currently taking; we'll keep you updated as that changes, but please feel free to reach out if you have any questions.
A
Yeah, exactly, and we need that, because right now the community is, you know, the 20 or so of us that show up on these calls, and while that's really important, we're also shaping the most important program for Filecoin, so the more the merrier. In a similar vein, there are a couple of efforts to increase the quality of the content around Filecoin Plus, so I'm re-sharing the link that I put up on the previous call, which is in the Filecoin docs; we now have a Filecoin Plus section.

Please review it, share your feedback, send it to your client, send it to your buddy, send it to your mother's brother's sister. Let them read up about the fun things that you do, and if they have any feedback because they don't understand something, it would be awesome if we could work on it and improve the standard of this documentation so that it becomes a valuable resource for everyone who's trying to learn more about Filecoin Plus.
B
Yeah, plus, if we do our job right on Filecoin Plus day, it should make everybody's life easier, because then when you, as notaries, are getting questions, you'll have places to point people to, as opposed to answering similar questions over and over.
A
Yeah, all those sessions should be recorded, and I think we'll be live streaming the whole thing as well, so that should be pretty cool. With that, let's talk a little bit about what's changed in the last couple of weeks. In the last two weeks there have been about nine DataCap allocations that actually went through the repo, five of them in the last week.

That amounts to about 26 terabytes in the last seven days, which is slightly above half of what it was the last time I showed this stat two weeks ago. So I think the rate at which applications are coming in is probably slowing down a little bit; I'm not entirely sure why.
A
It looks like responsiveness generally seems to be going up; we'll talk about that in a second. But I do see that in general the trend line of new applications and new DataCap allocations is slowing down. If anybody has hypotheses, feel free to share. If not, I think these things will just change over time, and as long as we're aware and tracking it, we'll keep learning and growing.
A
So that's great news. In terms of the total DataCap distribution, just updating the stats: I actually had a couple of errors in this last time, for which I'd like to apologize, but these should be the corrected numbers, and I've switched to writing everything in tebibytes as well, so it's easier to follow. Basically, at the moment, this is what the breakdown looks like. And then, Emma, I know we were chatting on GitHub and in the Filecoin Plus channel about this yesterday and a few days prior as well. This is the current set that I'm looking at: currently about 540 tebibytes have been allocated across roughly 290 allocations, and of that, about 35% has been used in deals with miners. That stat continues to creep up, which is good.
D
One question, Deep, or one remark on that. Actually, I've personally been stuck allocating additional DataCap, because it seems like now we have to, or maybe that was the case before, but we have to pay some fees just to publish the allocation on the network, and maybe my address is now running out of FIL. So that may be the reason, but the last two allocations I tried both failed, and I can't see them on the interface anymore.
A
Yeah, so there is a fee associated with it. Initially, after the round of allocations, we just had a wallet send all of the notaries some amount of FIL as part of the general grant that went towards setting up the Filecoin Plus system. I assume that model will continue for the time being until we come up with a better solution. So if you're blocked on that, I think the immediate thing we can do is, right after this call, let's go ahead and sort it out.
D
No, the reason was that there was not enough FIL on the address, actually.
A
Okay, got it. So I am meeting with the team that runs the Filecoin Plus app in about an hour, with Megan as well, so I'll also flag this and see what happens in the case of an error. The GitHub issue is still there, right? So it's possible that you can regenerate the request.
A
Cool. Based on this, I also wanted to bring up the fact that things are still actually improving, which is great. This is the handy little table that Andrew put together a few weeks back. On average, the days to grant is reducing quite a bit, and the days left open is also generally dropping quite a bit.
A
So thank you, notaries, for the great work you're doing in being responsive, and if there's anything we can do to help with that, feel free to let us know. Andrew, I did have a question for you: when I grabbed an image for this call, I saw that one of the granted counts here showed zero, whereas in the past, this is from last week, it was at one. I don't know if this is meant to be time-bound. Is the table time-bound?
C
The granted count should be all-time. I think that's probably a bug; I'll take a look sometime this week.
A
I see, thank you. No rush, but I think what we're really interested in is the third column, and that seems to be decreasing, so that's awesome.
A
Do you all generally feel like things are going well, or is anybody feeling like they need more support or are getting swamped? One of the things we talked about was that, in general, you should feel comfortable just sharing in the notary channel, for example if you feel like you can't get to a particular application and want to hand it off to somebody else, effectively reassigning it to a different notary. That kind of engagement should be perceived as acceptable and, in fact, encouraged, because it will help the client get to the DataCap.
A
If at any point there's anything we can do to support you, or your fellow notaries can step in and help, feel free to reach out on that channel. Hopefully people are generally feeling good. I see Sonic nodding, so I'm going to take that as a yes, but I don't see anybody else, so hopefully you're all nodding as well.
E
I want to add something. Can you remove the first one, and also the other old one, so it's all represented correctly? I think there are two old accounts there for Fenbushi Capital. We registered a new one, so we will no longer use those accounts; we can remove them.
D
Yeah, thanks, good flag. One thing, sorry, but I see that the average number of days to grant is something like 40 for me. I think there's a mistake there. Is it based on how long the issues have been open?
A
The other thing is that, as part of that dashboard I previewed in the past, which we'll hopefully be presenting and sharing for public access at Fil+ day, the math there for time-to-DataCap is actually measured from when the GitHub issue is opened to the epoch at which the DataCap was actually granted on chain. So even if the GitHub issue stays open, or conversation continues to happen in the GitHub issue, the stat will run to when the message was actually posted from your account to the address of the client. That should hopefully be a more accurate representation, which we can then feed into this table and into other places as well.
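To illustrate how that time-to-DataCap stat could be computed, here is a minimal sketch; the genesis timestamp and 30-second epoch length are standard mainnet parameters assumed here, and the issue date and epoch number in the example are made up, not taken from the call:

```python
from datetime import datetime, timezone

# Assumed mainnet parameters (not stated on the call): genesis at 2020-08-24 22:00:00 UTC,
# one chain epoch every 30 seconds.
GENESIS_UNIX = 1_598_306_400
EPOCH_SECONDS = 30

def epoch_to_datetime(epoch: int) -> datetime:
    """Convert a Filecoin chain epoch to a UTC timestamp."""
    return datetime.fromtimestamp(GENESIS_UNIX + epoch * EPOCH_SECONDS, tz=timezone.utc)

def days_to_datacap(issue_opened_at: datetime, grant_epoch: int) -> float:
    """Days from the GitHub issue being opened to the DataCap grant landing on chain."""
    return (epoch_to_datetime(grant_epoch) - issue_opened_at).total_seconds() / 86_400

# Hypothetical example: issue opened April 10th, grant message landed at epoch 683000.
opened = datetime(2021, 4, 10, 9, 30, tzinfo=timezone.utc)
print(round(days_to_datacap(opened, 683_000), 1))  # ~8.7 days
```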
A
I see lots of stuff coming in on the chat, probably from Zuha and Andrew. Cool, all right, let's talk about the issues, the action-packed fun part of the call. There are three issues that we need to chat about in terms of implementation next steps, and I would imagine that will be a slightly shorter part of the call; then we have a full section dedicated to notary elections, and then one more open issue. So, for the first one, this was raised previously.

I think it was generally well received on the first call, and it seemed like a pretty quick thing for us to align on, so I just want to give an update: the Filecoin Plus bot will be modified to effectively have functionality that posts a little snippet saying, hey, nothing has happened on this issue for 20 days, feel free to reopen it if it's still relevant, and it will start closing out issues that are older than 20 days. That should happen in the coming weeks.
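Purely as an illustration of the kind of check described (this is not the actual Fil+ bot; the repo name and token below are placeholders), a minimal sketch using the GitHub REST API might look like:

```python
import datetime as dt
import requests

REPO = "filecoin-project/notary-governance"  # placeholder target repo
TOKEN = "ghp_..."                            # placeholder personal access token
HEADERS = {"Authorization": f"token {TOKEN}", "Accept": "application/vnd.github+json"}
STALE_AFTER = dt.timedelta(days=20)

def close_stale_issues() -> None:
    """Comment on and close open issues with no activity for 20+ days."""
    now = dt.datetime.now(dt.timezone.utc)
    issues = requests.get(
        f"https://api.github.com/repos/{REPO}/issues",
        params={"state": "open", "per_page": 100},
        headers=HEADERS,
    ).json()
    for issue in issues:
        if "pull_request" in issue:  # the issues endpoint also returns PRs; skip them
            continue
        updated = dt.datetime.fromisoformat(issue["updated_at"].replace("Z", "+00:00"))
        if now - updated < STALE_AFTER:
            continue
        number = issue["number"]
        requests.post(
            f"https://api.github.com/repos/{REPO}/issues/{number}/comments",
            headers=HEADERS,
            json={"body": "Nothing has happened on this issue for 20 days, so it is being "
                          "closed. Feel free to reopen it if it is still relevant."},
        )
        requests.patch(
            f"https://api.github.com/repos/{REPO}/issues/{number}",
            headers=HEADERS,
            json={"state": "closed"},
        )
```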
A
So unless anybody objects now, I guess, forever hold your peace. Okay, or don't forever hold your peace: this is the experiment we're probably going to go through with, so as a community, feel free to react to it and share your thoughts, especially as it evolves. But as of now it seemed like everybody was aligned on this one, so we'll be trying it out. Cool. The second one was also opened by Andrew,
I think two governance calls ago, around the idea of adding an anticipated response time and throughput to notary applications. I apologize, I think my text is being covered slightly by a picture here, but effectively what I did was this: we had a little template snippet that was being posted at the end, when a notary was about to be confirmed. To that snippet I added a section borrowing some of the language from Andrew's issue; it says something like a commitment to efficiently serving the network, and so effectively, you know, it would set that expectation.
A
All right, perfect. So with that, I'm going to close these two issues; I think 104 is actually already closed out. I opened up a PR, so if anybody is interested in modifying the language, I'll keep that open for a couple of days and take any feedback before we merge it in. For 109, just to keep you updated, the code for the bot is tested and the functionality is added. Currently we're down to 15 open issues on the repo, which is great; every week we make some progress.
A
So for those of you that are interested in adding topics or looking at what else is open, please feel free to check it out, share your thoughts, and let's continue working towards iterating on this community and making sure that we're moving forward.
D
Two weeks ago there was an issue about being able to reapply, I mean, clients being able to reapply with the same address to get more DataCap. What happened with this one?
A
So that was already approved, yeah; it was approved I think nearly a month back now, that's FIP-0012, and it's supposed to be part of the next big actors release. The latest from the implementers sync yesterday, across the different implementations, is that they agreed the fix will roll out sometime in mid-June, so we're about two months-ish away from that, I would say. I also want to note that if an address actually goes to zero DataCap, then you can get more DataCap on that address; it effectively gets removed from the verified registry actor when it's at zero. The problem is, if it's non-zero, then you can't get more DataCap on it yet.
A
Cool. So this is the fun one that we dragged Julian out for today, along the lines of onboarding projects that have demand for a really large DataCap. I did a little bit of research and actually met with a couple of clients in the last couple of weeks that I would anticipate wanting to use this, including Starling and Discover, plus a couple of other projects that were introduced by other notaries on this call, actually.
A
So thank you to those of you that have been doing active outreach and looking for clients that would be interested in moving some of their data onto Filecoin. Based on this, I wanted to propose a slightly modified version of the scope, as compared to what we were previously looking at, which was a bit more cookie-cutter: just flat throttles and flat policies that would have applied to every single client.
A
I guess maybe the easiest way is for me to call out a couple of these bullets. The first one is that I think a client should have a clear allocation strategy outlined in their application. That includes their deal-making strategy, how they'd be allocating to miners and the rate at which they'll actually be onboarding their data, but also a little bit of insight into the methodology for how they'd be identifying their miners. If that's some sort of objective measure, like "we'll be using FilRep," or if they're using some other interesting mechanism by which miners will be selected, then that should be shared, because not everybody will have the same rate at which data gets secured across different miners.
A
So having a flat percentage that is not nuanced to the way a particular project works may not actually yield value, which here I'm defining as: this would be useful, in my opinion, if we actually enable clients that have several pebibytes of data that could genuinely benefit from running on Filecoin.
A
In that spirit, I'm tweaking these to say it's more about understanding each of these clients, and there probably won't be that many anyway: creating room for them to express how they'd ideally like to function, and then having the notaries discuss and set limits accordingly, based on what we feel is reasonable.
A
That
means
not
only
is
the
data
like
not
weirdly,
encrypted
or
hidden,
like
anybody
that
has
access
to
that
data
should
not
need
any
special
permissions
to
be
able
to
access
what's
in
it,
but
also
that,
like
in
general,
the
miners
that
are
being
selected
to
store.
This
should
be
supporting
retrieval
of
that
data
on
the
network,
and
so,
if
somebody
does
want
to
go
and
get
like
a
styling
video
clip,
for
example,
they
should
be
able
to
do
that
and
in
fact
like
through
this
process,
we
should
have
both
manual
and
automated
verification.
A
It's
have
a
couple
of
ideas
there
around
like
some
of
the
things
that
we'll
be
doing
for
slingshot
as
well,
that
I'd
like
to
share
around
how
we
can
do
some
of
the
verification
around
retrieval.
Even
if
it
means
like
me
or
someone
else
makes
a
deal
like
once
a
week
for
some
of
these
clients,
there's
not
that
many
clients.
A
The
goal
is
that
we
should
ensure
that
it's
the
right
kind
of
data
that's
being
brought
into
the
network,
and
then
the
last
bit
I
wanted
to
call
out
is
that
previously,
I
proposed
like
a
five
percent
or
ten
percent
like
allocation
rate
total,
but
that
didn't
seem
to
be
super
well
received
by
some
of
the
clients
because
like
for
them,
it
depended
on
like
how
quickly
they
could
get
data
cap
onto
their
network
and
serve
what
I'm
proposing.
Instead,
it's
effectively
that
we'll
have
two
data
points
from
the
application.
the first is the total DataCap they've requested, and the second is how quickly they think they can actually make deals with that DataCap. What I'm proposing is a bit of a ramp-up, where the first allocation is the lesser of five percent of the DataCap they've requested or half of their weekly data onboarding rate; the second allocation is the lesser of ten percent of the DataCap or one week's worth of their onboarding rate; and from the third allocation onwards they get the lesser of about a fifth of the DataCap requested or about what they'd need for two weeks. My expectation is that the really large clients will probably fall on the weekly-onboarding-rate side of that, so they'd be coming back to notaries about once every two weeks.
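To make the ramp-up concrete, here is a minimal sketch of the rule as just described; the percentages are the ones proposed above and still open to discussion, and the example client (1,000 TiB requested, 100 TiB onboarded per week) is made up:

```python
def allocation_tranche(total_request: float, weekly_rate: float, tranche: int) -> float:
    """DataCap (same unit as the inputs, e.g. TiB) for the Nth allocation, 1-indexed."""
    if tranche == 1:
        return min(0.05 * total_request, 0.5 * weekly_rate)   # lesser of 5% or half a week
    if tranche == 2:
        return min(0.10 * total_request, 1.0 * weekly_rate)   # lesser of 10% or one week
    return min(0.20 * total_request, 2.0 * weekly_rate)       # lesser of 20% or two weeks

# Hypothetical client: 1,000 TiB requested, expecting to onboard 100 TiB per week.
print([allocation_tranche(1000, 100, n) for n in (1, 2, 3)])  # [50.0, 100.0, 200.0]
```

Under those numbers the client would be back in front of the notaries roughly every two weeks from the third tranche onwards, which is the cadence described above.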
A
I thought that was a pretty reasonable cadence: it gives notaries time to do due diligence as needed on whether the client's operations are good, and it also takes some amount of time to get multiple signatures on a multisig. So, any thoughts or reactions on this? I would love to get some feedback, because this is something that I think we should operationalize very soon.
D
Yeah, so as I said on GitHub, I'm fully in line with this. I think it's a very good starting point. We may change it a little bit when we see the first projects and take in some learnings, you know.
A
Awesome, really glad to hear that. I actually shared a link in that issue as well for what the application could look like. The request we got from the team that works on the bot is that we do this in a different repo, so that they can write a dedicated app that works for it, because the flow will effectively be that every time a client is approved for something like this, it's as if a new notary is being stood up, right? There will be a notary in the form of a multisig wallet, made up of the set of notaries from the issue that have opted into it, and the root key holders would be involved.

So the first part of the flow feels much more like what needs to happen in the notary governance repo, but the rest of the flow is much more similar to the client onboarding repo, where the client is saying "I need more DataCap" and the notary is agreeing to send more DataCap. I think we will probably just create another repo, called something like Fil+ large clients, and then I can probably paste in the application format there sometime this week as well, write up some basic documentation on how that would flow, and maybe ping it out in the Filecoin Plus and Filecoin Plus notaries channels so that I can get feedback as we're editing it over the next few weeks.
C
Yeah, sorry if you said this in the introduction and I just missed it, but wouldn't this be a good candidate for what all notary processes look like? Wouldn't we want clients to just go through, say, Glif, and have one of these created automatically for when they run out of their first 32 GiB?
A
Yeah, I think that's a pretty reasonable statement, actually, and one of the thoughts I was having is that ideally, over time, we increase that threshold. I don't know if just over 32 GiB would warrant something this complex being set up for each client, because if you look at the number of client addresses that exist today, the total number of allocations minus Glif, we're still at about 120, and having 120 allocations each requiring their own wallet to be set up brings a bunch of extra overhead.
A
We
might
introduce
like
additional
automated
notaries,
because
that's
probably
gonna
help
like
cover
like
a
bunch
of
the
gap
and
then
maybe
the
requirements
for
qualifying
for
this
shrink
a
little
bit
over
time,
and
so
maybe
this
comes
down
to
like
greater
than
100
debit,
bytes
or
something,
and
we
get
like
automated
verification
up
to
as
much
as
we
possibly
can
and
then
like.
What's
left
in,
the
middle
is
like
what
requires
that
manual
verification
which
hopefully,
over
time
becomes
a
very
small
portion
yeah.
A
I
think
that,
like
this
is
going
to
be
a
very
good
thing
for
us
as
a
community
to
understand
what
are
the
things
that
we
actually
care
about
and
like
understanding,
how
we're
going
to
track
those
things
and
build
automation
around
those
things.
So,
whether
that's
like
getting
things
into
a
dashboard,
getting
alerts
up.
Getting
reports
published
like
these
kinds
of
things
as
we
become
aware,
and
we
find
ways
in
which
we
can
openly
share
and
publish
these
these
things
in
a
transparent
way.
C
Yeah, I think I mentioned this to you out of band, and I was curious if it made it into the issue or into your thinking. We want this for one of our projects, but it's pretty clear that, because of the gap between when I commit my DataCap to a deal and that deal making it on chain, there are many hours in between. So every project doing this, if it was in a live situation, would have to go offline essentially for a number of hours, because they wouldn't have any DataCap left, unless they make a gamble, or I don't actually know how that works. So it's almost like we're going to need two client addresses that are both ready, so that we can have some sort of blue-green deployment where, when one is waiting for new DataCap, the other one is standing in.
A
Yeah, so I think the fourth bullet here is, well, actually I think there are two angles to this, Andrew. One is that your address can't receive more DataCap until its existing DataCap is fully spent. That's effectively a temporary bug, given that the FIP passed and therefore the community agreed that we should support DataCap top-ups on the same client address.
A
Yeah, nice. And so for now, even though that bullet exists, maybe 90% is too high; maybe that number should have been based on the allocation rate. But I think, as Julian mentioned, we should expect that these things will get tweaked; the framework is there to say that as long as you've allocated a majority, you can ask for your next allocation. Ideally that would all be on the same client address, so that as a client operating at scale you don't need to worry about managing multiple addresses and then managing your own deal indexing, figuring out which wallet the DataCap is flowing out of and what will happen over time. But the date today is, like, end of April,
and what we'd share at Fil+ day is how clients can apply, and maybe this becomes a short lightning talk for large clients that watch that content, which I think would be a great outcome. Then what I'd propose is that an application for a client stays open for about two weeks, so that there's at least one governance call where it's brought up and, if needed, it can be discussed synchronously among the notaries that are interested. So at the earliest, if clients open up applications on the day of the Fil+ summit, I would still imagine it would be June 1st, or early June, before one would actually go through, and the actual implementation in the network upgrade that comes in mid-June does result in the top-up issue being fixed by then.
A
Yeah, okay, so I think we're going to proceed on this. We've talked about it a few times, and as is the case with the other things, it feels like there's general consensus that this is worth trying out at least, and I haven't really had any significant pushback in this direction.
A
A request I have for you: I will be reaching out to all of you in the coming two weeks, effectively to do a quick interview about the dashboards that we'd like to get stood up with the last grant.
A
Cool, all right. So the one issue that is a bit of an ongoing conversation, this is a slightly older issue that's been open for a while now, is around decentralization requirements for deal making and self-dealing. The delta, or the recent change, on this is, well, okay, before we get into that, I would say the status quo is that generally there is a principle of decentralization and distribution in Filecoin Plus, and as a result most notaries are adhering to it.
A
But the latest on this is that Elio wrote a proposal which basically says, why don't we limit the amount of verified deals going to any one miner as a mechanism to reduce the incentive towards self-dealing, and he had this proposal of a ramp-up.
A
I think we can continue to discuss that one, but what I did want to flag for today's conversation was that Nelson, I don't know if you're here, basically asked for a clear audit and compliance process for disputes against what is deemed to be excessive, or at least a way to raise a flag or blow the whistle so that the rest of the community is aware and a notary can take correspondingly appropriate action.
A
That's what I wanted to talk a little bit about today: ensuring that we're providing the right mechanism for notaries, and I guess for other clients and miners as well. So the question I have for the room is this: on the notary governance page there is a section around audits and disputes, but I don't think anybody has exercised that flow; it hasn't happened yet. So has anybody felt like they needed to exercise it and it wasn't clear enough, or they didn't even know where to look? Or has anybody looked at it and has feedback on how we can improve it? Because I do agree that we should have that, regardless of the outcome of this particular issue and how we're thinking about it. That is a pretty important thing for the community to feel they have access to, an opportunity to raise their voice or raise an issue on the platform. So I'd love to get some feedback or any thoughts on this.
A
So, because I think this issue is going to be an ongoing issue and an ongoing discussion, the audit part of it, I think, should be addressed as soon as possible. People should know that, I mean, it's basically: file an issue, and here's how you share what you need to share. But I don't know if that works. We're doing this as a community for the first time, and we as a community are doing this, in the world, basically for the first time, so we all need to work together to figure out what the right model is. But yeah, you're not the only one; I haven't exercised it either.
D
For a number of good reasons I haven't spent any time on that, to be honest. Today it's time consuming; we don't have any tooling for it, so we have to do everything manually, and I personally didn't take that time on my side. But of course we could think about doing it. Also, I think when it's public data, I mean, there are two types of data we have now: there is public and private data.
D
When it's private, we will need to understand how useful it actually is, and I don't know how we can achieve that, so each time we just ask the clients to agree to work with us, so we can understand how the data is used, and we can ask to dig into one sector to see what's happening. When it's public data, if it's public data that's stored but there's no website for it, you have no way to, you know, get the data back and check it. I mean...
A
Okay. So I think it might be good to have him participate as well when we're discussing this, but from what I understood from reading it, it was basically that that number could go up; it could be, like you said, say three deals or whatever the community thinks is right.
A
That was one take, where his expectation would be that you would just work harder to find those other miners and make more deals with them. And if you physically can't, and there aren't enough miners that can meet your need, is that what you're flagging, that such cases exist?
G
More or less. I think it depends on your application, what you're building. You're going to store data, and maybe for one person it can be over three miners, for another it can be over five.
A
I think the other pivot that he had in his proposal, and just as a reminder, the original proposal, which was from me, was a lot more simplistic; it was just: if there are more than three miners that can actually take your data in your region, then ensure that you make deals with those four miners, including yourself, right, so something like 25 percent can end up going to your own miner. Instead of having to moderate it like that, going straight to the miner, I think the other proposal he had was actually adjusting the quality adjustment factor on a per-miner basis as well, which is effectively: if it's the client's own miner, then maybe it's not 10x, or if it's a miner that has received multiple verified deals of DataCap, then maybe incremental deals decrease from 10x to smaller values. That's at least how I read what he had in the issue.
G
But the downside of this, at least for us, is that we applied for a DataCap of one terabyte per day.
A
Yeah, I think that's reasonable. I'm going to time-box this conversation to maybe a couple more comments, because I want to spend more time on the notary elections. I don't think we're going to be closing this one out anytime soon; I think the existing status quo is probably going to continue for some time, which is that it's sort of up to the notaries, as we all learn through this process.
D
Your statements are fine. One thing I think could also be included or merged with this, and I think I mentioned it in a previous issue, is the ability to remove the verified flag from a sector, if we could achieve that. So instead of constantly controlling all the different clients, it's more like: if we find there is an issue at some point, then we can just slash the client by removing the verified flag from all their sectors.
D
I don't know if it's achievable or not on the network as it's designed today, but it's something we could do. Because what's happening is, let's say, as an example of what we see here, we have one miner that has been created with just verified deals, where we know the client didn't respect what we asked them to do: they take, I think, something like 10 TiB, and they create a 100 TiB miner in just one day, allocating everything to the same miner, so they didn't respect the rules.
A
Yeah, I think maybe we should have a separate issue dedicated to discussing this as well, just for the sake of tracking the conversation, but I think that is a valuable point. And Julian, I think you mentioned this, but that might also be the mechanism for the dispute and audit framework, right, where once we find something like that, yeah, exactly.
A
You can't game the same sector, but I think your point is that we should be able to remove that flag from an existing sector, right? I don't know. There's also an interesting ask that came in this week around whether we can add that flag later on, so that a client can start making deals sooner, go and get DataCap at the same time, and then come back. I think that's also a very interesting proposal; I need to see if it's even remotely feasible.
A
That's also compelling from the perspective of seeing client actions before committing additional DataCap to clients, so I do think it's interesting. In the interest of time, Sonic, I'm going to suggest that we continue discussing this either on Slack or on the next call, just because I do want to spend some time discussing the notary elections and we're at about 10 minutes left.
A
This has been shared in the last couple of calls. Emma, Emma from the Koda team, you had some interesting suggestions in Slack yesterday, and I think you're here today, so it would be great if you don't mind sharing what your proposal looked like. My hope is that as a group today we can reach a consensus on which direction we want to take, because the notary elections themselves will be a several-week-long process.
H
We would just look at the DataCap, not, you know, the same number of notaries per region as was suggested before. Also, I think for North America and China there are bigger ecosystems: there are more miners, more data centers, more tech people, more tech supporters.
H
So I think we should give more DataCap and more notaries to those regions. The reason is, for Koda, we are a robot company, and we're also going to start Koda Drive, which of course is on IPFS and the Filecoin network. What we're trying to do is sell that decentralized storage; we're going to find storage clients, and we want to get DataCap.
H
I mean, we want to apply to be a client, and to do that we actually got a very good spokesperson, whom we're going to announce tomorrow. We're going to invest a lot to get our storage clients.
H
Yeah, that's pretty much what I was proposing, yes.
A
Let's say so, then I think the conversation would be: say we want X pebibytes of DataCap this round; last time it was like 1.9 or 1.8 pebibytes; say the next round we want to hit, like, 10 pebibytes, and distributing that would then depend on the activity in each of the corresponding regions. And then we pick as many notaries as are required to reach that. Is that what you're suggesting?
B
I can read the chat, but nothing relevant; Andrew said he had to run.
A
I can see why this would be better. I think it will be a little bit harder, but I do think it would be better, and I think it would also prevent the situation we're in today, where a region like Europe is close to being out of DataCap, whereas other regions still have a lot of DataCap available.
A
So
I
I
understand,
and
I
and
I
think
it
makes
sense
any
other
opinions
are
we
sort
of?
Are
we
agreeing
that
this
is
the
direction
we
want
to
go
in.
A
I see heads nodding, and I see people are interested in the robot, so I guess we're going to go down this path. So maybe what I'll do is, as Emma is suggesting, do some of the math and actually share what that breakdown could look like in the channel, and we can use that as a mechanism to decide next steps.
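As a rough sketch of what that math could look like (the regions and activity weights here are placeholders; the real inputs would come from the dashboards and application data):

```python
def regional_split(total_pib: float, activity: dict) -> dict:
    """Split a round's DataCap across regions in proportion to observed activity,
    e.g. recent verified-deal volume or client applications per region."""
    total_activity = sum(activity.values())
    return {region: total_pib * weight / total_activity for region, weight in activity.items()}

# Hypothetical 10 PiB round with made-up activity weights.
print(regional_split(10.0, {"North America": 4, "Greater China": 4, "Europe": 3, "Other": 1}))
```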
H
Sorry, also, I can reveal a little bit, it's pretty fun: tomorrow we've got Maye Musk to announce. She will speak for Koda Drive and the Koda robot dog; I mean, Koda Drive is for IPFS storage, so I got her to really speak for the IPFS community with this announcement.
B
Oh no, don't be sorry. Robot dogs are very cool, and yeah, robot dogs are hard to compete with.
A
Awesome. And then there are a couple of other issues that have been open for quite a while, from the last round of notary elections. For the bottom two, I feel like there are pretty clear next steps that were outlined, so I don't think we need to spend too much time on those. But issue 78 was an interesting one that had a lot of conversation last time, which is that a notary's entity, the organization that they represent, should be in the same region that they apply for. Last time around this was not the case; it was actually based on where the notaries themselves are operating.
A
Personally, my opinion is that it makes more sense to stick to where they choose to operate, as opposed to where the company is registered, because if a person is operating in a different geography than their parent organization and they're the one serving as a notary, then they're probably going to have better contacts and a better ability to serve that region, whether that's because of language or local business contacts and things like that. So that's sort of my gut reaction.
D
Okay, I think what you just said makes sense, so this is more about where they operate, but today it's very difficult to check this. The only thing we can really check is where the company is registered; that's the easiest thing.
D
So as notaries, I mean, we need to find a way to ensure that when we qualify something, it's properly qualified, and today it's very difficult to understand where the different teams are operating. I just received one today, which is a valid one, but they said they provide this for Ethereum and Filecoin users, so it's worldwide, and they decided to apply in all the different regions.
A
So there were a couple of projects suggested on this issue that I think we can go and investigate. Then, on the note of clients, I think you're right, it's again very difficult to verify, and clients will just go where they can get DataCap, as opposed to needing to conform on a region-by-region basis. The status I've seen is more that we've fallen into a pattern where each notary decides for themselves, and many notaries are now deciding to be fine with applications coming from multiple places.
A
I
don't
know
how
that
will
work
in
the
future
like
if
we're
specifically
opting
to
pick
notaries
to
ensure
that
data
cap
demands
are
being
met
in
a
region,
then
maybe
that
will
happen
less
in
and
of
itself
like
it
will
be
based
on
like
minors
that,
like
maybe
part
of
the
client
selection
logic,
is
like
what
minors
are
you
sharing
with
where
those
minors
located.
A
I
think
I
think
we
have
more
to
more
opinions
and
experimentation
to
do
that,
but
I
think
that,
like
what
you're
suggesting
is
like
somebody
comes
to,
you
says
we're
a
global
app
and
it
looks
like
they're
a
global
application,
then
to
some
extent
you're
gonna
just
say
yes,
and
then
we
can
go
and
see
like
who
they
made
deals
with
right.
Once
we
have
like
a
better
process,
he
said
in
place:
yeah
yeah
cool,
we're
also
pretty
much
at
time.
A
The only thing I wanted to flag was effectively what the timeline would be. If we went and did elections, it would probably take about two weeks of open applications, two weeks for scoring, and then about a week at least to get everything approved and allocated by the root key holders; as some of our notaries who were just activated last week know, sometimes it takes longer than a week to get everything in line. But I don't think we need to rush this unnecessarily.
A
I was trying to see if we can frame this around Fil+ day, because we will have some sessions from the notaries, I think, that will be interesting to share there. So either we can open applications for new notaries immediately after Fil+ day, or we could have it so that Fil+ day falls in the middle of the application period. But I think this can be a work in progress based on how the conversation evolves on the other issues.
A
Also, please note that this is after the agreement we had last time: existing notaries that have existing DataCap allocations and don't necessarily want more DataCap right now don't have to reapply. They will be fine; DataCap won't be removed or anything. This is for existing notaries that are reapplying for more, or new notaries that are looking to apply.
E
Okay, can I ask one question? So, we at Fenbushi actually have, I think, more than 800 terabytes left, but we do still want to have more DataCap. The reason is that one of the clients we have been talking to is a listed company in China, which has a huge requirement for DataCap, and under our current guideline we only allow clients to apply for at most 50 terabytes, but 50 terabytes is clearly not enough for this listed company. We only have about one pebibyte in total, and if we want to give them, say, 100 terabytes, that would be about 10% of our total DataCap. So in this kind of situation, could we apply for more DataCap so that we can handle this kind of huge application?
A
Yeah, you're totally able to open a new application. What I would suggest, though, is that for clients of that size, if they are getting to multiple hundreds of tebibytes of your DataCap, they should go down the flow of being a large client. This is precisely why Julian suggested that we start thinking about this: we didn't want to be in a place where one notary team is allocating all of their DataCap to just two clients, for example.
A
So I would say, if you have more clients coming in that are, you know, maybe you can share, actually, if they are all between 100 and 500 tebibytes, then maybe our large-client flow should work for clients that are above 100 instead of above 500. I would rather tweak that than make you feel like you need, say, three pebibytes of DataCap yourself, because eventually these clients will want much, much more.
A
I think we should have a dedicated flow for that. So if you have any data you can share on what their maximum DataCap allocation might look like, and if you can review the conversation that's happening on issue number 94, let's start there, and then accordingly we can chat about what might be a good way to proceed.
A
Thank you, cool. I think we're at time; as usual we're slightly over, but that's okay. Thank you so much for being here, I really appreciate it. As usual, looking forward to the next chat, which will actually be at Fil+ day, so a reminder: if you have anything you'd like to share there, please reach out so that we can make sure there's enough time allocated. Thanks, have a good one.