From YouTube: Filecoin Plus - June 22 Notary Governance Call
Description
Filecoin Plus governance call, in which we discuss recent DataCap allocation trends, client applications for Large Dataset Notaries, and the ongoing notary elections.
A: Someone in chat says yes, and you can see the Filecoin Plus deck, correct? Cool. Thank you so much, folks. Welcome to the Filecoin Plus community notary governance call for June 22nd. As usual, an action-packed agenda, but with some stability in the agenda itself and in the conversation we'll be having. The focus points for today, as they have been for the last couple of calls, are the Large Dataset Notaries and the re-elections.

A: We also have a ton of learnings from this round of the notary election that I think should result in a bunch of new issues being filed. So I think the next notary governance call will be pretty active in terms of open issues to discuss, but today, at least from what I saw on GitHub as well as on Slack, the focus is definitely on the notary elections. So let's get into it. In terms of DataCap, not a lot happening: two manual allocations in the last week through GitHub, resulting in about 25 tebibytes of DataCap being allocated, and only about four net-new applications this week. Of the 15 total applications open at the moment, eight of them are duplicates or applications from existing notaries that ran out of DataCap. So not that many coming in.
A: I suspect this period of slowness is primarily because of the elections, and I'm still expecting that this will probably pick up, hopefully this week; I think we should be wrapping up everything this week. But just so you're aware, at this point in time we're at 780 tebibytes allocated across 405 total allocations to clients, and 265 of those are coming from the automated verifier.
A: So, you know, that's up about 100 allocations from a couple of weeks ago, still going strong, and about a third of that has been used in client deals, which still only represents something like 13 to 15 percent of the total. But especially with the larger signatures coming, I expect these stats to start changing quite a bit. The average time to DataCap is now down to four hours and six minutes. This has been going down by three to four hours literally every two weeks that I've shared this slide.
A: It's been down four hours, down 3.5 hours, for the last two months or so. So, nice work, notaries, for being responsive. We've also started publishing stats around responsiveness in general on GitHub. This is similar to what Andrew was doing with the tooling on the Textile side, but it basically shows you the average time to first response from a notary, which is currently at almost ten and a half days, and the time to DataCap allocated through GitHub, which is about 15 days.
A: So it does show that it takes about four or five days between a notary engaging and a client getting an allocation done, but there's definitely a lag in terms of when the first response comes in. And looking at the data, it's pretty bimodal: there's a set of notaries that are pretty active, and then there's a set that do these in chunks, waiting for several weeks and then doing a lot of allocations at one time.
A: So, as we continue to learn more, I'm happy to hear from any of you if you have feedback or suggestions on improvements that can be made, to the tooling or to the general notary flow and experience, to ensure that things are moving effectively and efficiently for your needs. Because ultimately, our success here is: were we able to get clients onto Filecoin, people with real use cases?
A: Were they able to have a low-friction experience adopting the network and this technology, and onboard their data onto this particular storage offering? So yeah, if you have any ideas on things we can do to improve the tooling, I would love to hear them, because making your life easier will make everyone in the ecosystem's life easier as well. And we're about to have a bunch of new notaries.
A: Okay, I think we should keep going through these stats. These stats are available on the Filecoin Plus dashboard that I shared, I think, a little while back. We're working on getting a better URL for it, but if you ever need the link, feel free to let me know; it's on Slack somewhere as well. We just added a stats page so that we're tracking some of this stuff.
A: Similarly, there are the statistics still coming from Andrew's tooling on the Textile side, where there's generally, you know, a downward trend.
A: One thing that seems pretty interesting is that if you look at the "all granted" number here, it was 101 the last time I took this screenshot two weeks ago, and now it's at 100. This is something we noticed when I was doing a bunch of the auditing for existing notaries as well: there are definitely some clients that delete their GitHub issue after the DataCap has been granted.
A: I don't really know what the incentive is, because there are already two dashboards out, and both of them track the fact that an allocation was made; we can go back and look at the name of the client, what address the allocation went to, and which notary made the allocation. So I don't really know what the incentive is for them to remove the issue.
A: But that was a really interesting takeaway, because there's a case, and I think Stephen is here on the call, where, without him being informed, one of the clients that he'd given DataCap to went and deleted the issue. And then I was like, wait, this is a problem, because in your bookkeeping plan you state that everything is going to be in GitHub, and you're one of the few notaries that doesn't keep a copy of it off of GitHub, and the client just deleted it.
A: We can track the flow of DataCap, but we don't actually know what the application looked like. So that's going to be, as I mentioned at the start of the call, one of the big takeaways from this round of notary elections, and it's probably something we want to think about as a community: do we want to take a snapshot of the issues, or a snapshot of the entire GitHub repo somewhere, or just snapshot new issues somewhere? Or do we even care, if we're getting the data that we need from the dashboards?
A: So yeah, just think about that, and see if you have an opinion to share. I'll probably kick off an issue to discuss it this week, and then it'll definitely be part of the next couple of governance calls. Yeah, as Alice shared, even in Slack, just somewhere. We could even have a bot that just says: an application was received, here's the client, here's the notary.
A: Here's, you know, the core information, so that if we ever need to get in touch with the client, that's possible. In this particular case, the client actually responded (this is Steven's application), identifying themselves as well, and that was helpful, because I basically just ended up asking the client: why did you delete the issue?
B: Yes, thank you, I'm typing something in the chat.

A: Oh, you could just share it on audio, if you don't mind.

B: Okay. I was just wondering: I recently applied as a notary, and I wanted to know what the status of the approval process is and how that's coming along.
A: Yeah, nice to meet you as well. Okay, awesome. So this will just be one of those weird things that I think we need to figure out as we learn more about client behavior, especially post-allocation. We're reaching a scale that's relatively large, and I think over time, as we move towards more and more automation and less and less work to do on behalf of clients and notaries...
A: ...having a better audit trail for this kind of subjective or qualitative information about the client will be valuable. So, just some thoughts for you folks, something to think about. It was definitely interesting to see this, and I did run into some explicit examples of it as well when checking some of the allocations that were happening. All right, on to the next thing: Large Dataset Notaries. For those of you that are about to become elected as a notary, a little bit of info here.
A: At the moment, a Large Dataset Notary is basically a faucet of DataCap for a singular client or singular project that requires a large amount of DataCap, and we've set that number right now to be between 500 tebibytes and 5 pebibytes. There's been feedback in this room in the past that maybe we need to reduce the lower bound to something like 100 tebibytes, so that's probably a conversation we should have. However, at this point we're getting a ton of applications, and so having more notaries that are monitoring them would help.
A: That would be great; it'd be awesome if you could help in the due diligence. So feel free to hop on there now, even, and ask any questions that you think you'd want answered about a client before giving them that much DataCap, and then, once you are signed on as a notary... Like, I know Emma, for example, you're most likely about to become a notary, but at the time you were not when you suggested that you'd be willing to support it, and that's great. If you have any other applications that you've seen that you'd like to support, go for it.
A: Yeah, of course, it's great. It's very nice to see this sort of active input contributing to the community already. So I guess the open call to action for everyone is: go and review those applications. Even if you're not a notary, you're welcome to participate in the due diligence process. We're trying to run this as an experiment, to ensure that we're able to provide value in the best way possible for clients that need access...
A: ...to a lot of DataCap. And so the path here, again, for the new notaries: seven notaries across at least three regions have to elect themselves to serve as multi-signers on a wallet that effectively serves as a DataCap faucet for this client. A client would basically come and say, hey, I want another, you know, 100 tebibytes or 50 tebibytes, and four out of the seven notaries would just say: yep, sounds good, you haven't violated any of the things that we heard from you in the initial application.
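The flow described here (seven notaries across at least three regions acting as multi-signers, with four of the seven enough to approve a request) can be sketched as a simple quorum check. This is an illustrative sketch only, not the actual on-chain multisig logic; the function names, region labels, and data shapes are assumptions:

```python
# Sketch of the Large Dataset Notary rule described above: the multisig must
# have 7 notary signers spanning at least 3 regions, and any 4 of the 7
# approving an allocation request is enough for it to go through.

REQUIRED_SIGNERS = 7
REQUIRED_REGIONS = 3
APPROVAL_THRESHOLD = 4

def multisig_is_valid(signers: dict) -> bool:
    """signers maps a notary name to its region."""
    return (len(signers) == REQUIRED_SIGNERS
            and len(set(signers.values())) >= REQUIRED_REGIONS)

def allocation_approved(approvals: set, signers: dict) -> bool:
    """An allocation goes through once 4 of the 7 signers approve."""
    return len(approvals & set(signers.keys())) >= APPROVAL_THRESHOLD

signers = {"n1": "NA", "n2": "NA", "n3": "EU", "n4": "EU",
           "n5": "Asia", "n6": "Asia", "n7": "Oceania"}
assert multisig_is_valid(signers)
assert not allocation_approved({"n1", "n2", "n3"}, signers)    # only 3 of 7
assert allocation_approved({"n1", "n2", "n3", "n7"}, signers)  # 4 of 7
```

The thresholds (7 signers, 3 regions, 4 approvals) come directly from the discussion; everything else is invented for the illustration.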
A: So it looks fine. The intention there is to provide a much higher amount of DataCap than would otherwise be available to them through individual notaries, and then also to start testing ways in which we can increase the leverage in the system, or ways in which we can improve how clients that are worthy of large amounts of DataCap are being served and their needs are being met.
A: One of the open, ongoing conversations has been Slingshot, and support for Slingshot teams in this. The quick summary of the last call's conversation was: Andrew requested that the Slingshot team come with an opinion, and we also had Fay, who had submitted the actual application for the 3,000 Rice Genomes project, which is the largest such application.
A: That's what's driving this conversation, for the most part. So the opinion that I had shared on behalf of Slingshot was basically that it doesn't really make sense to provide DataCap to teams that are already being incentivized with monetary rewards to perform the same function, which is onboarding data. However, some notaries definitely have still been doing that at a much smaller scale, and so we're also supportive of doing it at a larger scale, in a way that's safe and not being abused. So I think this is still a bit of an ongoing conversation, where it seems like about half the notaries believe that we shouldn't do this and the other half believes that we should. And I would love to hear from you, even if you're on this call and you're not a notary, or you're about to become a notary.
A: If you have an opinion, share it. I don't know if Fay is here today, or if Charles is here; those have been some of the vocal participants in this discussion so far. We'd love to hear from others as well, to continue pushing this conversation forward. So yeah, feel free to drop something in chat, or unmute and shout it out.
B: Yeah, this is Darnell again, since I'm jumping in with both feet. One of the things that I've seen is a lot of interest from people in doing large DataCap allocations, just because of how much storage a lot of companies are storing internally; they're storing between maybe 600 to 700 terabytes. But one of the things that I've been looking at is a lot of large DataCap for audit logs being written to the Filecoin blockchain.
A: Yeah, that makes a lot of sense. That's definitely something that I think the community would be up for supporting, especially at that scale; that seems like the right size for it as well, which is great.
A: That's great. I'd love to hear from some folks who've also participated in the Slingshot competition, if you have an opinion on how this would fit in. So, Darnell, this is specific to Slingshot, which is an incentivized competition where clients are effectively rewarded for onboarding large amounts of specific data sets, curated, public, open, scientific data sets. Those clients effectively compete to onboard data through phases and are rewarded with a Filecoin grant. And so the conversation around this particular topic has been:
A: Should we also give DataCap to those clients? They're already being paid, or rewarded, to make deals; should they also receive DataCap to make those deals easier and make it lucrative for other miners in the system to provide services for those clients? There's definitely a split here, but yeah, I would love to hear any opinions in the room, regardless of whether or not you're a notary.
A: It's been a pretty active conversation in the community, and I'd like to continue ensuring that we're having it. So feel free to unmute, feel free to drop something in chat, anybody that might have an opinion on this.
D: Well, like I said last time, when you're participating in Slingshot you're not 100% sure you'll get a reward. So when you're sending out a lot of data, which brings a lot of cost for bandwidth or for the data you use on your transit lines or whatever it is, the DataCap might help you cover some of that cost.
A: We would definitely like to push this towards closure and some sort of resolution, you know, this week or in the next one.
A: Yeah, Nelson shares in chat: to make sure Slingshot participants distribute their deals more evenly, there should be a clear due diligence process to make sure there's minimal self-dealing.
A: "I only have, you know, these two machines, and nothing will end up anywhere but these two machines." He was stressing the point that there is some way for him to prove that, but it's very difficult in an anonymous peer-to-peer network, where a new miner can show up, you know, without anyone knowing who they are.
A: Yeah, Nelson, this is part of why, I mean, even when we were doing Slingshot, we initially had a no-self-dealing rule. And, I mean, Peter is here, maybe he can share some of his thoughts from the context and prior art in Slingshot, but that rule was removed because it was just so difficult to gauge and so difficult to track. As a result, we just stopped; we just let people do the self-dealing, because it's very difficult to reliably make that call every time.
E: Yeah, just to clarify this a little bit: it's not that we entirely removed the rule; self-dealing is still a problem, you know, conceptually. What we did change is, instead of making it an enforcement issue, we tried to make it an incentive issue. So, instead of saying "you cannot store with yourself, and it's up to us to catch you," we said you have to store with N number of miners.
E: I believe it was four. And it is up to you to pay for all of them if you want to be the one maintaining them. So, by setting up more and more of these incentive-driven mechanisms that basically make it not great for you to store with yourself, this is how we can get out of this local maximum where we are right now.
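The incentive-driven rule Peter describes (requiring deals to be spread across a minimum number of distinct miners, rather than trying to police self-dealing directly) reduces to a simple eligibility check. A minimal sketch, assuming a threshold of four per the discussion; the function name and data shape are illustrative, not Slingshot's actual scoring code:

```python
# Sketch of the rule described above: instead of trying to detect
# self-dealing, require every participant's deals to span at least
# MIN_DISTINCT_MINERS distinct miners. Storing only with your own
# hardware then means paying to run at least four miners yourself.

MIN_DISTINCT_MINERS = 4

def deals_eligible(deals: list) -> bool:
    """deals is a list of (deal_id, miner_id) pairs for one participant."""
    distinct_miners = {miner for _, miner in deals}
    return len(distinct_miners) >= MIN_DISTINCT_MINERS

# Three deals with only two miners: not eligible.
assert not deals_eligible([("d1", "f01"), ("d2", "f01"), ("d3", "f02")])
# Deals spread across four miners: eligible.
assert deals_eligible([("d1", "f01"), ("d2", "f02"),
                       ("d3", "f03"), ("d4", "f04")])
```

The point of the design, as Peter notes, is that the check is cheap and objective, whereas proving who owns a given miner is not.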
A: Yeah, that's right, we ended up saying a minimum of four miners. I think this would be hard to track in this case too, right? The same thing would apply. If anything, what we're doing is taking what was an enforcement problem and making the incentives do the work instead.
E: And this is a relatively unexplored field for us, relatively new. So if anyone can come up with, you know, schemes for this that are a little more thought through and still seem to stand up, then by all means share them with us, in Slack, or at the next call, or on GitHub somewhere. Because, ideally, yes, we want to make sure that this system is used for good, to incentivize actually useful storage, and not be abused.
A: Are there any notaries today that feel strongly, at this very point in time, about supporting Slingshot or not supporting Slingshot?
A: Okay, that makes sense. And so, for those of you that are sort of waiting: what would be helpful in driving this to closure? Would it be useful to hear from the people that have stronger opinions? Maybe we have a more curated conversation at the next call, where we bring in the people that feel strongly; maybe we can request that they do another quick presentation, something like that, and come talk. Charles had a stronger opinion as well, and Weiner has an opinion.
A: "I have an opinion, as one of the people that are on Slingshot." Okay, so maybe we can do a dedicated session for that. Would that be helpful? Or would you rather we test this with constraints and see how it plays out, or would you rather we just make a decision?
A: Julian, hello? Yeah, we hear you.

F: Yes, so personally, actually, I'm a Slingshot participant, so I prefer to stay away.
A: Waiting to see if there are any responses in chat; I'm going to keep moving along, because we do have a few more topics. Oh, there's something in chat. Nelson asks: are many Slingshot clients finding it difficult to send deals and getting many errors, which causes them to do deals with third-party miners less often?
A: I can provide some insight from the judging perspective. I think the biggest issue for clients is not that they can't make the storage deal; it's that retrieval needs to be guaranteed by the miner, and it's hard to force a miner to ensure that they're going to serve every retrieval that comes in, and retrievability is a factor in how a participant is scored. And so there's a ton of deals happening; I mean, the last phase had something like 300,000 deals in a two-month period, which is absurdly high.
A: So anyway, I think we can continue to add any points in chat. If you have any questions that would help you clarify your opinion, or if you're a notary that's on the fence and would love to share what would help you pick a side, do share in chat, and we can make it happen at the next call.
A: With that, let's talk about... oh, I forgot to update this number; it's a lot more than nine. It's actually, I think, 13 now. One, two... yup, 13 total applications open so far. Where we're at is: Filecoin Discover actually had seven notaries self-elect. We actually had around ten, but three of the notaries were ineligible at the time they elected themselves, which means either you're...
A: ...not yet a notary, or you are a notary that finished your allocation. But there are seven notaries that have active, ongoing allocations and grants, and I think yesterday was when we got the seventh one signed up to provide support. So this will be really interesting. I'm excited, because this will be the first time we get to experiment with this kind of DataCap faucet.
A: It may be a little complicated in some cases. Like, Andrew, I know you're already using a multisig for your notary role; I don't know if you want a multisig that participates in a multisig, or if you just want a different address that's going to become a signer on this multisig. Feel free to respond in the GitHub thread, so that I have the actual addresses that we can craft as part of the message. Yeah, it looks like we'll be moving forward with that.
A: Also, if you're a notary on this, which I believe many of you on this call are, we would love to have somebody champion this: be the first lead notary that helps serve as the captain of the notaries on behalf of the client, and ensures that things are generally going well. If anybody here is willing to do that, I would love to hear it, now or on the GitHub issue.
A: Just because this is the first time we're going to get to do this, it'd be awesome to have somebody that's excited to take charge and ensure that this plays out well. I believe this is a 5 pebibyte allocation, which is absolutely massive.
A: You know, this goes back to the stats you saw on the first page, right, where less than one pebibyte has actually been allocated to clients so far, and 200 to 300 tebibytes have been used in deals. So if the onboarding rate that the Filecoin Discover folks shared in the application is accurate, it would be multiplying the efficacy of the program very, very quickly. Yeah, Julian, I saw that you proposed yourself as sort of the captain, or the champion notary, for Estuary.
A: I don't think that one has seven notaries yet; waiting to see if there are any others. So yeah, we will go over these, and this serves as a quick reminder for notaries that want to participate. For this specific one, Julian, I'm not saying you need to volunteer for this as well, but if you want to, the option is definitely there. Which one are we talking about? Filecoin Discover.
A: How much DataCap? It comes down to 5 pebibytes. Okay. And I think the application said they would probably do another round after this. Peter is here from the project, so we can just ask him. Peter, what is the total amount that you want to onboard over time, in an ideal world?
E: Oh, sorry, I was muted. I do not have the number in front of me right now. Actually, I believe it was... I'm sorry, I don't want to misspeak.
E: Yeah, but the amount of data that we're looking to onboard, without redundancy, is about 12 petabytes. But I don't have the numbers in front of me right now on how many copies we'll actually make, so I can't give you actual numbers.
F: So the 12 petabytes is more than what you already sent, I mean, the hard drives you shipped, sorry?
E: Yes.

A: I think this is the first of many for them, and also the first of many for us as a community, because it's the first time, I think, we'll actually have a multisig notary that's up and running and providing DataCap for clients. So yeah, excited.
A: Hopefully, you know, as soon as I have all seven of you confirming your addresses, and somebody who can serve as a DRI, we'll kick off the process to get the root key holders to sign this multisig.
A: So you don't have to say it now, but if you can go to the GitHub issue and self-elect, that'd be great, for those seven of you. I think, Neo, you're one of them; Andrew, you're one of them; the Fenbushi notary, you're one of them. I believe Julian was part of it, and I think Masaki might have been.
A: If I recall, that covers the requirements; I think maybe even Steve for the seventh. I can't remember now, I'm sorry, I apologize, but it definitely did cover the three regions. We just need somebody to, you know, elect themselves as the lead notary, and then we need all of you to confirm to me that I have your addresses correct, and then we'll kick off the process, which is great.
A: It looks like there's a question in chat for Peter, if you don't mind answering it from the Discover perspective. I wanted to make a call-out on the other applications: great that we've got seven notaries for Discover, but there are still 12 open applications, and we would absolutely love to see questions or support from other notaries, or from the rest of the community, for any of these projects that you think are worth supporting.
A: Of course, the 3,000 Rice Genomes application is an ongoing conversation as part of the Slingshot discussion, but there are also Estuary, Topi OSC, China GW Technology, Shanghai Sinchu, Contenders Foundation, Chengdu Digital Media Industry Base, NFT Storage, HNA Group, the Protocol Labs deal bots project, and the Filswan FS3 project.
A: It would be great to see if you have any questions, or if this is something you'd like to support as a notary, especially those of you that are about to get elected as notaries. This is a great way, especially if you received an allocation that was not as large as you'd hoped, to continue impacting the network at scale, so I would highly recommend checking these out.
A: With that, the topic that everybody's excited about: notary elections. Quick recap: we updated the target notary count to five for NA, EU, and China; the remaining regions stay at three. The current plan was effectively that we wanted to ensure we hit a minimum base DataCap allocation for each of these regions, and so what we've got so far is the following recommendation.
A: In Europe, we'll probably have the Filecoin Foundation, Julian Rears from TechHedge, and Speedium Crate. The China region is still sort of open; I'll just share some details on that. NA has five notaries: XnMatrix, Performance (sorry, Performance with a flag, which I'll discuss as well), Coda, Filswan, and Secure Experts. And then in Oceania we'll have our first notary outside EU, NA, or Asia, with Holon Innovations serving Oceania from Australia. So, pretty excited. A couple of flags that I wanted to share: in GCR, Simon...
A: I think we're waiting for just one fix to the application before he's considered good and electable. For Bytebase, I believe there's still an open conversation. I know Eric is here on the call; you sent me a message last night, and I still need to go through and read it. Sorry, I didn't have a chance to look through it yesterday, but effectively what we're doing is making sure we're scoring you accurately.
A: But if that does go through, then we end up with eight notaries in China, which is a lot, which is great. I know we set the max at seven previously, but we also had a rule where, if people were tied, we would continue adding everybody at that tied level, and so that's how we'd get to eight. And so, once we can validate that these are indeed all valid and correct applications, we can share and finalize this list.
A: The last thing I wanted to call out was on the Performance front. There's a sort of potential dispute conversation happening, so I would highly recommend, if you are a notary, checking out that particular application, number 132, because I'm having a bunch of back and forth there, trying to get some input on exactly what's happening.
A: This is a case in which the notary is in NA, but all the allocations were made in China. We've talked about serving applications from outside your region, but that was typically in cases where that region no longer had DataCap, or didn't have any active notaries providing DataCap. Like, you know, there was a case with Europe in the last cycle where it became very difficult for clients.
A: I think we heard an example where someone ended up needing to go to notaries in other regions, because there were just no active allocations happening in Europe anymore. But, you know, there were plenty of notaries in China and plenty of applications, and for whatever reason, in this case, this one NA notary served only applications in China. So my recommendation was that either they should apply to be a notary in China, or this should not be the case. If anybody has any opinions, definitely check out that issue.
A: Let me know; that's sort of the last one to finalize. There's a question in the chat from Darnell: what does the rounded score mean? The rounded score is how you qualify for certain levels of DataCap. If you're rounded to one, that means you're eligible for 10 tebibytes; if you're rounded to two, it's 100 tebibytes; if you're at three, it's one pebibyte. So, effectively, Darnell, that sets the cap on how much DataCap you'll receive in this particular round of elections.
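The mapping just described (rounded rubric score to DataCap eligibility) can be written down as a tiny lookup. The tier values are the ones stated on the call; the function name and clamping behavior are an illustrative assumption, not the actual election tooling:

```python
# Rounded score -> DataCap a notary is eligible for in this election round,
# per the tiers stated on the call: 1 -> 10 TiB, 2 -> 100 TiB, 3 -> 1 PiB.

TIB = 2**40  # tebibyte, in bytes
PIB = 2**50  # pebibyte, in bytes

SCORE_TO_DATACAP = {1: 10 * TIB, 2: 100 * TIB, 3: 1 * PIB}

def datacap_for_score(raw_score: float) -> int:
    """Round the raw rubric score to the nearest tier and look up the cap."""
    tier = min(max(round(raw_score), 1), 3)  # clamp to the defined tiers
    return SCORE_TO_DATACAP[tier]

assert datacap_for_score(1.2) == 10 * TIB
assert datacap_for_score(2.6) == 1 * PIB  # 2.6 rounds up to 3
```

This is consistent with the answer given later in the call that a 2.6 rounds up to three and is therefore eligible for one pebibyte.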
A: Most new notaries end up coming in at the 10 tebibyte level, which they run out of fairly quickly. But the intention is that you show you're competent as a notary, and you're rewarded for that in the next election cycle, because there's a row in the rubric that basically asks: did you successfully allocate some amount X of DataCap in the past? So that's what that number means.
A: Yeah, unless there are any specific questions on this, I would love to share some summary stats and next steps. Anybody have questions on this table, or anything else?
A: Cool. So where we're at right now is basically the following expected distribution: Asia minus GCR at three notaries with 300 tebibytes; EU with four notaries at just over one pebibyte; GCR (China), depending on whether it ends up at seven or eight notaries, at roughly four or four and a half pebibytes; NA with five notaries at over one pebibyte; and Oceania with one notary at 100 tebibytes. What I did want to call out is that there's a reason...
This
is
all
less
than,
and
that's
because
like
of
course
like
note,
3
still
have
like
data
cap
caps
or
data
cap
maximums.
So
if
you're,
a
notary
that
was
eligible
for
up
to,
I
don't
know,
let's
say
you're
eligible
for
100
bytes,
you
applied
again,
you
had
50
debit,
bytes
remaining
you
you're.
A: ...then if you qualified for 100, or requested another 100, you would basically end up getting 50 more, so you're topped up to your maximum. The point of that rubric is also to flag vulnerabilities in the system: notaries with certain inputs should max out at a certain output. So these numbers will look very different once we confirm the set and provide everybody with the amount of DataCap that was actually requested, because in some cases notaries also requested numbers that were in between ranges.
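The top-up behavior described here (a renewing notary only receives enough new DataCap to reach their eligible maximum) reduces to a one-line calculation. A minimal sketch, using the 100/50 example from the call; the function name and signature are illustrative assumptions:

```python
# Sketch of the top-up rule described above: a notary eligible for a
# 100 TiB maximum who still has 50 TiB remaining and requests another
# 100 TiB only receives 50 TiB, topping them up to their maximum.

TIB = 2**40  # tebibyte, in bytes

def topup_grant(eligible_max: int, remaining: int, requested: int) -> int:
    """New DataCap granted, capped so remaining + grant never exceeds the max."""
    return max(0, min(requested, eligible_max - remaining))

grant = topup_grant(eligible_max=100 * TIB, remaining=50 * TIB, requested=100 * TIB)
assert grant == 50 * TIB  # topped up to the 100 TiB maximum
```

This is why the headline totals on the slide are upper bounds: each notary's actual grant is clipped by what they already hold and by what they requested.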
A: So there were a few applications that came in at 50 tebibytes, or 500 tebibytes, as opposed to the top of the range, and once we adjust for all that with the final set, these numbers will definitely go down drastically. But yeah, just a flag that all of these are very, very high upper maxima. Still, it's nice to see that the total amount of DataCap in circulation is about to go up drastically.
A: We're also going to have a bunch of new notaries that will be participating, hopefully actively, in these calls, in furthering the community, and in conversations like the Large Dataset Notaries, bringing in more voices and more notaries, and showing that we're evolving as a community to be more efficient and more effective in serving real use cases on the network.
A: I'm really excited. Thank you to those of you that have stuck it out and applied for this. With that, next steps. The first thing is: there are two applications that actually require info to be unblocked.
A
The first is... so, Eric, the input that we need from you for ByteBase's application is on proving the participation in terms of... oh, sorry, I just got a question in the chat: are the five notaries for new applications, or are there five total new notaries? Five total new notaries will be serving as active notaries. Cool, yeah. So if you have thoughts on the Performance app or the ByteBase app, and can provide some input on how we should be supporting them, I would greatly appreciate it.
A
The app numbers are 169 and 132. So you can just go to the notary governance repo and look at issue number 169 or 132, or you can go to this table, which is on issue 118; that third column is just links to all the applications, so you can click through and check them out. So yeah. Another question in the chat: how much can 2.6 points get?
A
Five hundred tebibytes, or one pebibyte? So 2.6 gets rounded up to three, and based on the current scoring, that's actually one pebibyte. I think we'll have some adjustments we want to make to the rubric based on this round of scoring, but for now, XnMatrix, I believe you're at 2.6 as well. Yeah, so you'll probably get one pebibyte and can probably write a few requests against that, which is great. It of course depends on how much DataCap you request at the top of the application as well. Anyway.
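As a rough sketch of the scoring arithmetic discussed here: only the two score-to-allocation pairs mentioned on the call are filled in, so treat the table as illustrative rather than the full rubric, which lives in the notary governance repo.

```python
TIB = 1
PIB = 1024 * TIB

# Illustrative only: the two tiers mentioned on the call.
ROUNDED_SCORE_TO_DATACAP_TIB = {
    2: 100 * TIB,  # a raw score of 1.7 rounds to 2 -> 100 TiB
    3: 1 * PIB,    # a raw score of 2.6 rounds to 3 -> 1 PiB
}

def datacap_for_score(raw_score: float) -> int:
    """DataCap (in TiB) for a raw rubric score, rounded to the nearest integer."""
    return ROUNDED_SCORE_TO_DATACAP_TIB[round(raw_score)]

print(datacap_for_score(2.6))  # 1024 (one pebibyte)
print(datacap_for_score(1.7))  # 100
```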
A
So yeah, if you have any thoughts on those two open issues, please hop onto the issue and help us resolve them as soon as possible; it will help unblock the rest of the process. I'm about to go through the remaining regions. I think all the applications ended up getting through, so literally all notaries in the other regions:
A
I'm going to drop the disclosures on your app today, which you should read and confirm that you're willing to comply with, and once you're able to do that, we can effectively start the process with the root key holders. So that's all that's left: you need to sign the disclosures, which is awesome. The thing that comes after that, though, is non-trivial: you do need to get set up with the app. And the first question that you'll get from the general Filecoin Plus registry tooling team will be: do you have a Ledger?
A
So if you don't have a Ledger, you should probably get a Ledger; you're going to need a Ledger to log into this stuff. If anybody needs assistance getting a Ledger, please get in touch with me on Slack and we can have a conversation to ensure that you're set up. The expectation is basically that, once your allocation is given to the address that we confirm with you,
A
the Ledger is what you'll use to sign into the plus.fil.org website, which is the Filecoin Plus registry. Based on your GitHub login and your Ledger login, the app will help you automate a bunch of your interactions when you approve or decline allocations and so on. So that will be useful.
A
The other thing is there are a couple of resources that are going to help you get started. There will be a couple of docs on onboarding new notaries and such that I will send your way, and you'll be added to a Slack channel for Filecoin Plus notaries. That channel is mostly useful for communication with other notaries: in case you get a client in one region that's applying in a different one and you want to bounce that client back, or if you want to ask questions about a particular allocation.
A
That's the right place to do it. The last thing I've been sort of proposing is pairing up new notaries with existing notaries. So effectively,
A
if you are a notary that's not served as a notary before, and this is not a reallocation (of which there are many in this particular round), basically having a buddy system that says: hey, help me learn the processes, give me somebody to ramp up with, or to iterate on my rubric, my processes, my allocation methodology. So it would be great if there are any existing notaries, or notaries that are getting reallocations, that would like to volunteer and help make this happen.
A
Yeah, so we can walk through your example, if you don't mind, Julian. Yep? Okay, so I believe you requested a hundred tebibytes this time, correct? Yes? Okay, and so you qualified at 1.7; 1.7 rounds you up to two, which means you do qualify for 100. If I recall correctly, you have either two or four tebibytes left.
A
Yes. So you would get... sorry, I have a few... yeah, yeah. So let's say you have two, right? You qualified for 100, you requested 100, and you have two outstanding: you get an allocation of 98 tebibytes. Based on the rubric, you're capped at 100 at any given point in time. But is it going to be allocated to the same address or to a new one? Yeah, so right now, the FIP... the FIP was...
A
I don't think all the implementations have had a chance to yet, yeah. So I think for now, to reduce friction, we might request a different address from you. Oh okay, I see. Okay! Is that all right?
F
Whether it's going to be on the same address or not... because actually, if I just go back to the multisig address for the large deal: do we need DataCap on this address or not? So only at the time that you...
A
Yeah, you don't need that particular address to be a verifier, as far as I know; I will double-check. What we need is the manual check that, at that time, you are an active notary, which you are at the moment.
A
And yeah, I don't think we added validation that basically says that particular address needs to have verifier status, but I will double-check, because that was a conversation at some point. That probably should be a requirement once we fix the problem of receiving allocations on the same address, but until that happens, I think I had asked for that not to be a blocker.
F
Okay, and the last one is about Discover: they asked for five pebibytes, but with the seven different notaries that may sign the DataCap allocation, are we going to assign five pebibytes apiece, or are we going... no. But do we have a limit for each time we allocate? Do we set it to five tebibytes, or ten, or one hundred? I don't know. Yeah, so there
A
is a limit. The way it works is basically... this is documented in the repo, so if I misspeak, I would recommend that you just check it. But if you go to the README of the repo, it should say that the first allocation is basically capped at, I think, what the client claims they would want to use in half a week, or, I think, five percent of the total amount.
A
Then the second allocation is ten percent of the total amount, or what they claim they'd use in one week, and the third is twenty percent, or what they claim they'd use in two weeks. So the maximum first allocation for five pebibytes should be 250 tebibytes,
A
if I'm doing that math right (maybe I'm not), and then the second one would be 500 tebibytes, and the third would be one pebibyte, and every one after the third would be 20 percent at most. And there's an expectation that 90 percent of the previous allocation has been used before the next allocation goes out the door.
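The tranche schedule described here can be sketched as follows. This is one reading of the rules as stated on the call: only the percent-of-total caps are modeled, the projected-usage alternatives are omitted, and the README in the notary governance repo remains authoritative.

```python
PIB_IN_TIB = 1024

def tranche_cap_tib(total_request_tib: float, tranche_number: int) -> float:
    """Upper bound (in TiB) on the Nth allocation for a large-dataset client,
    using only the percent-of-total caps: 5%, 10%, 20%, then 20% thereafter.
    (Each tranche also has a projected-weekly-usage alternative, omitted here.)"""
    percent_caps = {1: 0.05, 2: 0.10, 3: 0.20}
    return total_request_tib * percent_caps.get(tranche_number, 0.20)

total = 5 * PIB_IN_TIB  # the 5 PiB request discussed on the call
print(tranche_cap_tib(total, 1))  # 256.0 TiB (5%)
print(tranche_cap_tib(total, 2))  # 512.0 TiB (10%)
print(tranche_cap_tib(total, 3))  # 1024.0 TiB, i.e. 1 PiB (20%)
```

Note the round figures quoted on the call (250 and 500) are decimal approximations; in binary units, 5% of 5 PiB is 256 TiB.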
A
Yeah, so, existing notaries: if you are interested in providing support for new notaries, that would be absolutely awesome. Please feel free to volunteer yourself on the thread that we have in the notary channel, or let me know, or share it right now. I would love to match the handful of new notaries that we have with somebody they can learn a little bit from and who can serve as a resource.
A
Yeah, with that, let's open things up for discussion. Julian, you still have a hand up; is there anything else you wanted to share?
A
Nope? Okay. There was a point earlier from Eric: "we found this, check it out; regarding data uploading, you can save time, no matter whether it's a data center or an LDN issue." I think this is a really, really interesting and compelling proposal. Eric, if you're on the call and you're comfortable sharing, I would love it if you could chat about this for a few minutes. If not, I can do a quick summary.
G
Sorry, I cannot scroll the screen, so could you please scroll down? Yeah.
G
Yeah, yeah, what's this one... So, the right side is the current procedure, based on the information we have from before. Let me say, just a second.
G
From the previous data, we can say the average time to DataCap is four days, six hours; the average time to first response from the notary is about ten days; and the time to DataCap allocated on-chain is fifteen days, seven hours. So for us, that's not efficient for the DataCap. We also made this big chart according to the suggestions we received, and some people have made more suggestions on it.
G
So even if they cannot get DataCap from any notary, at least for Protocol Labs, or for the Filecoin network, we have real data, real data to store on the Filecoin network, and the client can make an application to another notary. And for the miners,
G
if they can close deals with DataCap, that means they can save another 8 to 17 days, I mean, be faster to seal the data, because we found the sealing time of the real data, I mean the DataCap data, is around 18 to 20 hours for every 32 gigabytes. That's quite a lot of time. So we're just making these small changes, so with the proposed procedure we can save time and make the whole procedure more efficient.
A
Yeah, that's really great; thank you so much. This is a really, really interesting and very compelling proposal. I would highly recommend checking it out if you haven't. I know Eric already dropped the link, and I'll drop it in again. Please contribute your suggestions to this conversation. I know some of you have already seen it, which is great. Yeah, Eric, thank you so much for joining the call and sharing a little bit more info.
A
Yeah, definitely check out the issue and share any thoughts you have on it. Thanks again, Eric, for being here, sharing, and helping push this kind of really awesome thinking
A
along. Cool, I think with that we'll probably call it for today. Thank you, folks, for being here as usual, and thanks for participating in yet another iteration of the notary governance call. Check out the FIP, check out the open applications that have some kind of open work left, and make sure that if you're an existing notary on the Discover application, you confirm the address and elect yourself as a champion.
A
Please sign them as soon as you can, so we can proceed with the next steps of granting you notary status on-chain. And if you have any thoughts on supporting Slingshot, or any of the other large dataset notary applications,
A
please hop into that repo and share your thoughts and questions in the comments of those issues. Lots for you to do, lots for everyone to do, but awesome progress as usual. Thanks so much for being here; I really appreciate it. Let's continue the conversation in the Filecoin Plus channel on Slack for any open topics that you'd like to keep discussing. Thanks for being here, and have a good rest of your day.