From YouTube: Filecoin Plus - Oct 26 2021 Notary Governance Calls
Description
Want to get involved in Filecoin and Filecoin Plus? Check out our community Slack channels at https://filecoin.io/slack, and learn more about Filecoin Plus at https://plus.fil.org/
A: So thanks for joining. Looking through our agenda, there are some similar topics that we've covered before: we'll look at metrics, do some updates on our six-month program goals, talk about some of the LDN application tracking, have some space for other discussion issues, then cover some LDN process and expectation updates, and then open the floor for open discussion. So let's jump in. We have an exciting update on our time to DataCap.
A: We are much more confident in this number. It may still change. We've been digging into the specifics of how this time to DataCap is getting calculated, and that led us to find a couple of different outliers. For example, when a person first requested DataCap for the very first time and then made subsequent requests:
A
It
would
look
back
at
that
initial
submission
and
it
would
calculate
the
time
from
their
initial
request
to
their
most
recent
allocation,
and
so
that,
understandably,
would
be
a
much
longer
time
scale,
and
so
those
averages,
because
statistics
are
funny
this
way-
would
throw
up
the
entire
calculation.
So
we've
been
digging
into
that.
We've
done
some
things
to
exclude
those
outliers
and
recalculate
how
those
are
being
done.
There's
still
some
additional
kind
of
back
end
work
that
we're
doing
to
make
sure
that
this
is
as
as
accurate
and
clean
as
possible.
A: But all that said, we're seeing two days, which is phenomenal. To my knowledge, this is the average overall for the whole program; it is not an average of a particular time scale. Taking that into account, you can see next to it that the average time to DataCap in the past 30 days is sitting at one day, two hours, and 42 minutes, which, again, is really spectacular. Deep has a hand, so I'm going to pause here.
B: So that means, you know, if that includes automated verifiers, it's 30 seconds, and of course a disproportionate number of applications come in through that, and then LDN for recent top-ups has been pretty quick from active notaries, and so I think that's what's bringing the number down, which is fantastic. As you get to the other boxes, I think we'll still be able to make a couple of points that I think are important, though.
A: Yes, thank you for those clarifying details. As you can see, the next section is the average time to DataCap for direct notaries. This is looking at when issues come in through the Fil+ registry app and create a GitHub issue; notaries are then able to reply, that approves it, and the allocation lands on chain.
A
That's
for
all
time,
looking
at
10
days
and
then
for
the
past
days,
we're
seeing
three
days
so
really
happy
that
we're
looking
at
these
kind
of
both
overall
and
as
a
recency
bias,
because
it's
showing
sort
of
that
improvement
that
we're
hoping
for
we've
been
trying
to
push
a
lot
of
changes
with
this
intent,
and
so
I
think,
looking
accurately
at
you
know.
How
are
these
working
helps
us
understand.
Like?
Are
the
changes
effective
and
what
direction
do
we
need
to
keep
going
so
average
time
to
first
response
from
a
notary?
A: This is still looking at all time, and again this is from when the application GitHub issue comes in to when the notaries are replying to it: we're seeing eight days, and then the time to DataCap landing on chain is at 12. So I'm happy with these metrics; much happier than when we first started calculating them and the outliers were pushing us into the 20-day range. That was a sad moment for Deep and me, but this makes a lot more sense.
A: Continuing on, looking at that DataCap pipeline as well: we still have some work to clean up for the LDNs, because when we rolled the LDNs over (for example, the LDN for Discover had something close to five pebibytes remaining), we created the new multisig under the new process.
A
We
have
yet
to
go
back
and
change
their
allocation
amount
of
data
cap
to
zero,
and
so
some
of
these
numbers
are
skewed
because
there's
two
ldns
for,
for
example,
discover
and
that's
the
case
for
a
number
of
those
ldns
that
were
approved
before
we
switched
to
the
new
v2
multisync
process.
So
these
numbers
are
a
little
off
still
continuing
to
work
on
that.
It's
been
more
of
a
priority
to
get
the
data
cap
flowing
under
the
new
process
than
it
has
been
to
go
back
and
clean
all
the
old
process
up.
A
It's
still
on
our
to-do
list.
We've
just
been
prioritizing
not
blocking
people,
but
in
addition
to
that,
the
total
amount
of
data
allocated
clients
is
now
hitting
two
pebby
bytes
and
that's
27
percent
of
the
total
data
cap
kind
of
out
there,
which
is
great
and
flowing
down
into
deals,
we're
seeing
750,
and
so
that's
almost
35
of
the
amount
allocated
to
clients
is
getting
sealed
in
deals.
A
This
is
a
number
that
we
want
to
keep
seeing
go
up,
but
we
are
really
focusing
on
the
kind
of
increasing
the
top
of
the
funnel
and
then
using
other
mechanisms
like
some
solutions,
architects
and
some
other
ways
to
support
storage
provider,
biz
dev
to
help
get
this
metric
to
increase.
So
some
interesting
things
down
here
around
the
percent
of
successful
data
cap
applications,
being
you
know,
27
close
close
to
a
quarter
or
a
third
depending.
I
guess
I
split
the
number.
A
Some
of
this
is
again
coming
from
people
that
submit
duplicate
applications
and
how
those
duplicate
applications
get
handled
by
notaries
and
then
other
is
you
know,
people
submitting
applications
that
are
incomplete
and
then
maybe
deleting
that
application
and
resubmitting
so
it's
sort
of
this
is
again
a
difficult
statistic,
metric
to
look
at
to
get
a
whole
picture
of
how
many
of
these
are
actually
quote.
Unquote
bad
applications,
but
we
don't
have
necessarily
a
target
metric
in
mind
here.
It's
just
something
that
we've
been
thinking
about
and
then,
like.
A: Like I said, this 34.9 percent of DataCap is something that we do want to help see increase. Deep, anything to add on this slide? Okay, moving swiftly on, looking at the direct notary activity: when we pulled this snapshot, there were 29 open issues in the repo, about 18 in the past week. For new applications, looking across all time, we're seeing 210 with the label "granted". Granted, again, sometimes clients delete their issue once they've been granted, so that becomes a difficult thing to track.
A
We're
looking
at
ways
to
fix
that
and
in
the
past
14
days
we've
seen
we're
seeing
five
applications
that
are
closed
with
that
label
granted.
So
that's
interesting.
22.8
clients
are
receiving
data
cap
per
month
right.
A: Exciting. And we're seeing some of our updated notary table metrics that Deep is putting together, from a table that I think Andrew Hill started. Is that correct? Yes?
A: So we can see some stats that are broken out by notary, and we'll keep increasing that. We have some projects in the works as well for more transparency and kind of scorecard or leaderboard types of things for notaries, to showcase how each individual notary is doing. So we're excited for that.
B: Cool, thank you so much, Galen. So for those of you who have participated in the last couple of governance calls, or if you're watching this recording, you know that we started doing this thing where we spent some time talking about what reasonable targets were on a six-month horizon.
B: Actually, I made this joke last time, and we should remember to edit this: we have to reduce six to five now, because we're in October. And so maybe it might be better to say a target for March or something reasonable like that; maybe Q1 is something we can rally around as a community and work from there. But basically, yeah, we divided up these KPIs.
B: It goes into five different categories, and I've been wanting to add tracking for that on a per-governance-call basis. The first of which is volume, where there's a goal to get several hundred pebibytes of DataCap available. DataCap itself should be plentifully available and abundant, both for notaries to give directly to clients, but also for large-dataset-based support or any other mechanisms that we choose to design as a community, to support that much data being onboarded onto the network. And that's:
B: Driven by the fact that, in general, there are a lot of different efforts, like Filecoin Gravity, Slingshot, etc., that are generally working towards pushing 100-plus pebibytes' worth of content onto the network. The current status: last time we talked about this slide, we were in the 40s; now we're at 63 pebibytes. That's a measurement of about six pebibytes that were given to notaries across the combination of the two election cycles, and then 57 pebibytes coming from the sum, so far, of LDNs that are ready to sign.
B: The general expectation is that we're still, you know, working towards an order-of-magnitude improvement, and so I've been doing a little bit of thinking recently on this one and would love to share some thoughts on it, probably closer to the end of the call, and get people thinking about this a little bit more. The second of which is time to DataCap, and the goals here were to get a lot more aggressive in how quickly clients can become unblocked.
B: We've had a couple of discussions around how we can level up the automated verifier to support a larger type of DataCap allocation, and that discussion is on GitHub. We've had it a couple of times, and we should continue to have it today, because I'd love to see what new ideas have come up, so we can continue to iterate on this.
B: Mr. Galen, can I request the next slide? Thank you. Risk mitigation: so we currently have, you know, two different dashboards. There's filplus.info, which is being built by the Filedrive team (I think Laura is on the call today), and then there's the Fil+ dashboard that Galen just pulled up.
B: It's very, very important that we have one or more dashboards coming from different places that can provide an opinion. It's difficult, as a community, for us to take a dependency on a small set of providers. So while it's great that these two teams are spending time building (Galen's working with one team, and Filedrive has been working on the other dashboard), it's extremely important, one, that Filedrive is even able to make that dashboard.
B: But then, two, the fact that we should continue encouraging development of additional opinions in this space, because it will require multiple sources to be able to distill the truth, and we don't want to have single points of failure in the system, especially as we start to move towards more automation, and towards a world where we rely on data-driven decision-making.
B: When we look at actions on the network and use that to guide practices as notaries and practices as a community, it becomes paramount that the data we're looking at is actually data with integrity that can be verified. And so I would love to see another team here, or anybody listening:
To this call who is interested in building a dashboard, please reach out and let us know. We're happy to have a grant for that, or support you in any way that we can. We're working on adding more APIs, so that the actual data-pulling aspect becomes a lot easier, and so this just turns into: how do you want to design it? How do you want to verify your data? How do you want to update it, and what do you want it to look like? What are you trying to solve for? So it would be great.
B: That starts to become baseline information that a notary will want to at least take a glance at. Today, if a notary chooses to look at it, they have to go to one of the dashboards, click through a bunch of things, and find this. But if we were able to just automate flags or automated risk checks on a particular allocation, this would go down from being a five-to-ten-minute process that a notary has to be aware of doing, to a 30-second process, where a notary's like:
B: "Oh, no flags on these clients, happy to sign it." Sign it. And so I would encourage those of you who have, you know, an interest in fraud monitoring, fraud watching, or who in general have past experience in looking at basically this within a system, within this kind of design:
Where there are a lot of different entities, we don't always know who the entities are or how they're related, but figuring out whether or not there's disproportionate usage, in particular in ways that signal higher-risk activity versus not. It'd be really interesting to hear from you.
B: If you have ideas on that. Then there's the aspect of client onboarding UX needing to improve, to ensure that the ramp to Filecoin is low-friction, and then there's the Filecoin Plus governance side of things, where we want to ensure that we have more scalable governance practices. On both of these:
B
I
think
we
haven't
had
as
much
progress
in
the
last
month,
like
we've,
made
some
changes
to
falcon
plus
register,
yet,
but
not
enough
that
it's
worth
calling
out
as
a
substantive
step
in
improvement
here
other
than
for
the
notaries,
certainly
not
for
the
clients.
And
so
I
would
love
to
hear
if
you
have
any
ideas
on
this
or
how
you
want
to
contribute
to
this
again
the
happy
to
support,
either
with
time
or
with
grants
to
ensure
that
you
feel
empowered
to
make
changes
to
the
system
in
a
way
that
you
are
interested.
A: I think we can move on, yep. Yeah, sweet. So, LDN application tracking: this was something that I put some effort into. I need to go back and maintain it; I'm not sure how valuable it is. I thought it would be a lot more useful, because I thought there could be some automation to this based on labels.
A
Unfortunately,
in
github
right
now,
that's
not
the
case.
Unless
I
go
deep
into
writing
my
own
api
rules,
which
believe
it
or
not,
I
don't
want
to
do.
The
intent
here
was
to
basically
try
and
have
a
little
project
tracking
board
where
we
could
move
the
open
applications
back
and
forth.
A
To
say
this
one
is
ready
and
you,
the
notaries
or
various
community
members
could
land
on
this
project
board
and
then
get
a
snapshot
of
you
know
where
are
these
because
we're
now
at
something
like,
I
think,
30,
so
they
don't
fit
on
a
single
page
in
github
anymore
right,
so
just
trying
to
find
better
ways
to
stay
on
top
of
it.
I
may
end
up
deprecating
this
if
it
continues
to
be
clunky
or
require
more
manual
effort
to
be
accurate,
and
we
may
just
rely
on
labels.
A: If that's the case, it may be that we send out some URLs that are saved label searches for the notaries, to be able to say, you know, go search based on the label "ready to sign" and just look at those. So we'll see; to be determined. If anyone else wants to take on using this project board in GitHub and write those APIs, let me know; I'm happy to help support.
A
We
still
have
some
various
notary
governance
discussion
topics
that
have
been
open
that
we've
been
pushing
through.
I,
to
my
knowledge,
I
think
that
meg
and
the
market
assessment
is
still
moving
forward
with
some
support
from
other
members
in
the
foundation,
and
then
I
think
the
big
ones
that
kind
of
around
this.
This
idea,
like
increasing
the
scope
of
automated
verification,
sort
of
what?
A
What
could
that
look
like?
What
tools
do
we
feel
we
would
need
to
have
in
place
before
we
can
support
more
automation
and
then
also
fundamentally,
what
are
some
additional
ways?
We
could
automate
that
we
would
want
to
test
early
so.
B
Did
you
want
to
dive
into
any
of
these
on
this
call
yeah?
I
would
love
to
I'd
love
to
go
into
the
first
one
at
least,
and
I
think
the
prompt
for
the
conversation
that
I
want
to
use
is
today.
If
you
go
to
like
verify.glycerio
once
every
30
days,
you
can
go,
get
32
gb
of
data
cap.
B: Okay, looks like silence, which generally means people are shrugging, not necessarily super pro or against. Yeah, I think this is worth just keeping in the back of your mind as, you know, the first step we can take towards drastically improving the number of clients that are unblocked on the network, drastically improving the amount of data available from our clients on the network and how they're behaving. Oh, we have a hand. Eric, go for it.
C: We can hear you, yeah. Okay, I'm fine with this proposal, but obviously the Chinese miners do have the language barrier, so could there be a Chinese version for them? Yeah. So.
B: Yeah, that's a really interesting point, so I guess I have a question for you, Alex, and please pardon my ignorance. Basically, what I've seen is that browsers today do this attempt at translating, right? It doesn't actually matter where you are, but you can basically open a page in a browser, right-click on the page, and say "translate into" some language.
C: I think the key point is the channels, because only a few miners would like to log in to the GLIF website, and for Chinese people we do recommend WeChat. Maybe you can ask Karen for help; they post some news through the WeChat official pages, and then, let's say, everybody can know this news. Otherwise, nobody knows that we can get the DataCap here.
B: I see, okay. So the point there is basically: were you suggesting WeChat as a way in which we market this and talk to people about it, or were you suggesting WeChat as a form of KYC? So instead of logging in with GitHub, a client could log in with WeChat and use that as a mechanism to get DataCap.
C
Kyc
does
not
approved
for
for
for
the
wechat
login.
I
think
wechat
will
be
the
best
way
to
let
the
chinese
miner
know
what
happens
from
the
protocol
lab
and.
B: That's very helpful. There's also, generally, Eric, a question for you, I guess, related to this. There's generally been a decent amount of interest, after you filed that translation issue, from people who want to translate stuff, and so I get a few things in Slack nowadays from people who want to work on translating different parts of the Fil+ project into Chinese or other languages.
B
Is
it
worth
if
we
put
everybody
into
a
group
together
and
then
you
can
sort
of
leave
that
effort
and
coordinate
and
figure
out
the
best
way
to
approach
this?
Is
that
something
you'd
be
interested
in
doing
or
should
we
pull
in
some
support
from
the
foundation
or
something
like
that?
What
would
be
good.
C
Sure
sure
that
that
will
be
great.
So
if
you
can
pull
me
into
the
translate
team,
we
can
work
together.
B
So
so
two
weeks
ago,
for
those
of
you
that
were
here,
we
spent
some
time
talking
about
dao's
as
a
community
and
andrew
talked
a
little
bit
about
what
a
guild
was
and
sort
of
building.
B: Yeah, cool. Generally it seems like people are supportive of this, though, so that's interesting. Maybe we should think about doing this sooner rather than later. Again, I'd love to get a little bit more data, so for those of you who are in touch with clients or have worked with clients, you know, bring this up with them, see what they think, and then maybe for the next call:
B
I
will
petition
this
more
as
an
issue,
and
then
we
can
discuss
like
how
we
want
to
implement
it
like
whether
or
not
it's
just
verified
or
glyph.I
o
that
levels
up
or
if
we
want
to
introduce
a
new
verifier
like
some
other
notary
here,
wants
to
build
one
using
their
code
for
their
data
gap
allocations
to
the
interesting
ways
in
which
we
can
go
about
doing
this.
B
Sweet
yeah,
you
have
one
on
here
and
then
I
have
another
topic:
I'd
love
to
cover
actually
too,
but
I
can
hold.
If
you
want
to
talk
about
this
thing,
first,
okay,
I
will
go
for
it.
So
the
first
of
those
two
topics
is,
you
know
one
is
building
like
this,
like
you
know,
leaderboard
stationary
stats,
type
page
where
we
track
like
governance
as
a
metric
as-
and
we
talked
about
this
in
the
last
call
for
quite
a
while.
B: But basically, the conversation evolved around learnings from other DAOs and web3 communities that are starting to build pages where they can track steward contribution, participation, and performance. The idea was that, since all of you participate in this, it should be relatively transparent how you participate, in what ways you choose to participate, and how you prioritize your time within it. And Meg has been pushing this idea of having different levels of expectations for notaries, and how much time they should be spending, and being more transparent as a community about that.
B
So
that
is
pretty
cool,
and
so
you
sort
of
as
a
stepping
stone
to
that.
You
know
I
saw
this
page
and
got
a
little
inspired
with
it.
I
think
I
shared
this
on
the
call
last
week
or
sorry
two
weeks
ago,
and
basically,
I've
been
working
on
like
a
proposal
like
an
rfp
for
a
grant
for
like
what
a
notary
dashboard
could
look
like.
So
who
are
the
active
notaries?
How
how
are
they
active?
What
do
they
usually
participate
in?
Is
it
ldns
or
direct
applications?
B: When do they tend to be active? In what ways are they active? Et cetera. So that, one, we can better inform clients on how they should select a notary they want to work with, and what relationships they may want to build in the ecosystem. Two, it's interesting from the perspective of basically tracking metrics that could then result in us having the data to have conversations around things like service levels, where it's like:
We can start to create working groups, focus groups, task forces, et cetera, whatever we want to call them, within this ecosystem and this community, to become more efficient. And there's this third aspect of just: we have these stats on the dashboard that Galen and I work on, and we have them in the filplus.info dashboard as well, but we have no centralized space to look at our own performance as a community. And so this would be that. It would be the level-up, basically, of what we use today with Andrew's CSV that we paste in at the start of the deck, and effectively we'd have a better-formed leaderboard around it.
B: So I think it's a pretty compelling idea. Not all of you were there two weeks ago to talk about it; hopefully what I said made sense. If it didn't, please tell me; I'm happy to go deeper into it. If not, what I'll do is probably just publish the RFP, and then we can go hunting for a team that might be interested in implementing something like this.
B: Next topic I'd love to flag is around... what's the other one that I want to bring up?
B: Oh yeah, the general LDN application flow and how we want to scale that up a little bit.
B: So, some of you are aware that there were a couple of technical hitches last week in terms of making sure that the applications showed up correctly and that they're all assignable. We've tried to push a couple of changes that should help with the sign-out/sign-in flow, so that there are no accidental allocations coming from different multisigs, and we've tried to improve the UX a little bit for the notary experience in the last couple of weeks, so that it's a little bit smoother for you folks to figure out what you can do within the system and how you should be doing it. In general, there's still more:
B
That
needs
to
be
done,
so
first
thing
would
be.
If
any
of
you
are,
you
know
willing
to
sit
down
with
us
and
then
the
team
that's
working
on
it.
They
basically
say
you
know.
I
I
have
these
three
complaints
or
issues
or
ideas
for
improvement
and
give
us
that
kind
of
feedback,
even
in
like
a
five
minute
call
or
ten
minute
call.
That
would
be
super
appreciated.
B: Please reach out in the Filecoin Plus notaries channel to let us know that you'd be interested in giving feedback, because we'd love to talk about it. And then, second, we need to come up with some way for the tool itself to be more useful for you, in pushing a way to let you know that it's worth looking at, right? So, when you have applications come in, typically you get flagged on GitHub.
B: Maybe we use Slack, because that's really where we are most active as a community. And so Galen was toying around with the idea of having a Slack bot built that basically looks at the LDN repo, that same tracking table we have; we could pull from that and just do a bot-automated post, like every 24 hours.
B: Theoretically, we could have a visualization. I actually think somebody published this; let me just:
B: Jim Pick made one recently, I think, and if you look at it, he delineates what is a verified deal versus a non-verified deal, and it's actually pretty well distributed. I thought it looked good. Yeah, here it is. It looks like we've got a couple of big hubs in Europe.
B: I think there's a special yellow dot here for Ynn's team as well. Let me see if I can get you a link to this.
B: I think the problem with this, of course, is that it's probably looking at the load-balancer IP as opposed to the IP address on the DHT, and so people can pretend to be anywhere. But yeah, I'm not aware of a miner in Peru, for example, today, but maybe I'm wrong; maybe there's a storage provider's data center there.
F: I don't know; it's like their firewall, and they need a VPN sometimes, and:
B: Yeah, that's definitely what we've seen other projects do as well: just put everything on a hard disk and ship it. And if you do it at a certain scale, it's actually cheaper to do that, right, because if you're doing it at a high enough scale, you amortize the shipping costs, and hard disks are not expensive. It's actually just the logistics challenge of getting the data: how do you pay for shipping, how do you ensure it arrives?
F
We
weren't
sure
about
sending
hard
drives
with
data
on
it
internationally,
because
any
issue
in
the
customs.
B: Yeah, I think a CDN is one option, but I think a CDN will always be too expensive. I actually think what you need is point-to-point data transfer that's more efficient, right, because having a CDN means you have wide coverage all the time, and that's not really what we need here. What we need is to get from one place to another place very efficiently.
B
There
are
some
like
there
are
some
software
solutions
that
are
being
built.
There's
this
one
that
I
was
talking
to
last
month.
That
is
worth
checking
out.
Maybe
there's
also,
of
course,
the
tried
and
tested
upload
do
webdue
storage
provider,
which
I
don't
love.
I
think
we
should
come
up
with
a
better
way
and
then
there's
also
like
the
fact
that,
like
maybe,
we
just
need
to
get
more
effective
data
transfer
in
the
falcon
network.
B
So
I
know
there's
some
thinking
being
done
to
make
that
component
a
little
bit
more
modular
and
then
figuring
out
like
how
we
can
use
bitswap.
For
example,
there's
a
really
good
talk
from
hannah
howard
at
the
falcon
orbit
sessions
last
week
would
highly
recommend
you
check
it
out.
B: Awesome. That's pretty much the stuff that I wanted to flag today. Again, I'm trying to think if there's anything else. I think, in general, I stepped back and did a little bit of thinking last week on some ideas I had around the design of the system, and I had a chance to sync up again over the last few days as well about what we can more creatively do and propose, and so I think the main thing from that is:
B: Yes, definitely, we were fixing the allocation, and it actually should have been fixed yesterday, but I didn't get a chance to sync up with the team yesterday evening, because I got inundated with some other stuff. Do you know the latest on where we are with the issues that we had over the weekend? Well, I guess end of last week, earlier this week.
A: Yeah, I was typing it, but yes: the subsequent-allocation bot was not looking at the rule that says "calculate this value and this value and take the lower one." One of the bots had that rule; the allocation bot did not, and so that was where the two bots were in conflict. So they are updating that bot. They have made the first change, which should work, and it started to add the label correctly.
A
The
problem
is
that
they
want
to
make
sure
they've
got
all
of
the
subsequent
calculations
correct
before
we
start
the
first
one,
so
it
doesn't
run
into
a
problem
on
the
subsequent
allocation,
so
it
should
just
be
another
day
or
so
to
make
sure
that
all
the
bots
have
the
right
logic.
A
Yes,
eric
root.
Key
holders
were
busy
last
week
at
like
orbit
events
and
yeah
that
was
so
pushed
through.
I
think
there
was
another
I
was
going
to
touch
on
this,
which
kind
of
rolls
into
some
of
these
questions.
The
like
current
process
around
the
client
opens
the
issue,
the
bot
and
the
governance
team
audit
it
just
for
completion,
and
then
it
goes
to
the
key
holders
they
get
the
signature.
A
The
bot
will
then
start
the
first
allocation
request.
A
lot
of
this
is
working
pretty
smoothly.
You
know
we
had
that.
One
problem
say
with
yours,
where
the
sort
of
delta
between
the
weekly
versus
the
total
that
hasn't
happened
before
so
it's
just
finding
these
edge
cases.
Sorry
that
that
hit.
You
should
be
getting
corrected.
A
We
are
not
doing
the
decision
making
or
the
auditing
or
the
opinions
on
is
this
valid,
and
so
the
expectation
for
the
notaries
is
to
actually
review
the
evidence
in
the
application
and
say
yes,
this
person
put
in
you
know
they
are
a
real
client
with
valid
data.
They
we
are
confident
that
they
claim
to
be
this
company
with
this
kind
of
data
and
they
can
provide
a
sample
and
it
you
know,
meets
our
standards,
and
so
we
want
to
encourage
you
know
as
we're
speeding
up
this
process.
Please comment on the issue if you are concerned about it and want to hear the answer to a question before it gets approved, because if you don't ask that question, some other notary may read the application, think it's fine, and not have the same question that you do. That doesn't mean that your question is not valid. It could be that they have more context, but they could also answer that question for you.
A: So if you're concerned about one of the LDNs that you see, for any reason, please say so. Flag it and say, "Please don't approve this until we hear more," because that's part of the notary governance process, right? We are all responsible for looking at these. And if you don't raise your hand to say this is an issue, and two other people jump in and approve it, you can still flag it later on to pause the subsequent allocation. You could say: "Hey,
A
I
actually
had
a
question.
I
saw
two
people
sign
this
before
we
sign
the
next
one.
Let's
make
sure
we
get
an
answer
to
this
question,
so
that
was
the
the
last
thing
that
I
wanted
to
to
say
there
about
ldns,
there's
a
number
that
have
gone
through
as
of
yesterday.
That
should
be
pending
signatures
that
need
some
review.
So
that's
all
I'll
say
about
that.
E: Yeah, I was just wondering: I've submitted an LDN, and it's approved now by three notaries. Under the new process, is it okay and good to go now?
B
I
think
gil-
and
I
have
some
time
later
today
to
do
a
quick
audit
on
what
the
outstanding
things
are
for
the
root
key
holders,
because,
as
you
see
in
the
chat,
I
think
the
first
year
event
had
the
folks
that
actually
hold
the
keys
a
little
bit
busy
and
so
we'll
do
this
audit
today
ping
a
bunch
of
turkey
holders,
hopefully
get
some
good
responses
in
the
next
24
hours
and
unblock
applications.
Yeah.
A: That was the previous mechanism, where we were waiting for notaries to raise their hands and say, "Yes, I approve; yes, I approve." So the slowdown that you are seeing is just because Deep and I, and then the root key holders, over the past two weeks, have not gotten to those fast enough: to create the issue and send it to the root key holders for them to sign. So, apologies for that. But yes, as Steve said, we should get to it. Does that answer your question?
B
But
the
fact
that
you
have
already
had
notice
look
at
it
and
approve
of
it
means
that
as
soon
as
that
happens,
your
turnaround
time
should
be
extremely
quick.
Okay.
I
think
that
would
be
good,
there's
also
a
similar
thread,
actually
in
the
slack
about
how
we
can
track
a
root
key
holder
like
perf
and
stats
and
governance
stats,
and
so
that
was
really
interesting.
I'm
super
supportive
of
that.
B
So
maybe
that's
something
we
add
into
the
funnel
as
well,
which
is
before
it
even
gets
to
the
notary
right
like
we
do
this
today
for
manual
applications,
which
is
like
how
long
did
it
take
for
the
notary
to
respond?
How
long
till
they
got
an
approval
or
a
decline,
but
maybe
we
need
the
same
thing
for
ldns
as
well,
where
we
start
to
measure
the
gap
between
a
client
making
an
app
and
how
quickly
we
can.
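A minimal sketch of that gap measurement, assuming each application record carries an opened timestamp and an on-chain allocation timestamp (the field names below are illustrative, not the actual Fil+ registry schema):

```python
from datetime import datetime, timedelta

# Hypothetical application records: when the client opened the GitHub issue
# and when the DataCap allocation landed on chain. Field names are
# illustrative, not the actual Fil+ registry schema.
applications = [
    {"opened": "2021-10-01T00:00:00", "allocated": "2021-10-03T06:00:00"},
    {"opened": "2021-10-10T00:00:00", "allocated": "2021-10-11T02:42:00"},
]

ISO = "%Y-%m-%dT%H:%M:%S"

def time_to_datacap(app: dict) -> timedelta:
    """Gap between the application being opened and DataCap landing on chain."""
    return datetime.strptime(app["allocated"], ISO) - datetime.strptime(app["opened"], ISO)

gaps = [time_to_datacap(a) for a in applications]
average_gap = sum(gaps, timedelta()) / len(gaps)
print(average_gap)  # mean time to DataCap across the sample
```

In practice the timestamps would come from the GitHub issue events and chain data rather than hard-coded records, but the delta computation is the same.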
E
Okay, and I have another question. I submitted the dataset, and I know this call is not for authorization of DataCap, but I submitted the dataset and, well, I learned just a few moments ago that somebody else also did a similar application.
B
Yeah, I think at that point it's up to whether or not notaries want to support it. So if you can convince notaries to give you the DataCap at the start, then I think, yeah, cool. Someone asked me about the timing of the next round of notary elections: do you have a specific plan? Oh, that is a good topic.
B
I think we should bring that up in the next call, probably, because we probably want to do one around the end of the year. I was just looking at the stats, and maybe about 35 to 40 percent of the available DataCap for notaries has been manually allocated in general.
B
Something we talked about last cycle was how we increase the number of notaries and figure out what engagement looks like in a way that's sustainable for individual notaries. So, Bingham, I think it's a complex question with more nuance, and worth additional conversation. But if you want to give somebody an answer so they stop bothering you, you could say something like "in the next few months," or "around January." We'll probably have a better idea by then, but that's my guess.
B
Wow, we're going to end on time; I think we can do it. Oh, Anya, do you want to give us a quick 30-second thing on what that is?
D
Sure. I'm doing developer relations at Textile, so I'm trying to organize external communication. If you guys want to join our Discord, please do; you can vote and submit your favorite stickers in there as well. And actually we're dropping our layer-two solution in there, and there's a brainstorm going on. So if you want to join, awesome.
D
A
A
So
thanks
for
joining
looking
through
our
agenda,
some
similar
topics
that
we've
covered
before
we'll
look
at
metrics
we'll
do
some
updates
on
our
sort
of
six-month
program
goals,
talk
about
some
of
the
ldn
application
tracking
and
then
have
some
space
for
some
other
discussion
issues,
and
then
some
ldn
process
and
expectation
updates
and
then
open
the
floor
for
open
discussions.
So,
let's
jump
in,
we
have
an
exciting
update
on
our
time
to
data
cap.
A
A
We
are
much
more
confident
in
this
number.
It
may
still
change.
A
So
we've
been
digging
into
that.
We've
done
some
things
to
exclude
those
outliers
and
recalculate
how
those
are
being
done.
There's
still
some
additional
kind
of
back
end
work
that
we're
doing
to
make
sure
that
this
is
as
as
accurate
and
clean
as
possible.
But
all
that
said
we're
seeing
two
days
which
is
phenomenal.
A
To
my
knowledge,
this
is
the
average
overall
for
the
whole
program.
This
is
not
an
average
of
a
particular
time
scale
and
so
taking
that
into
account,
because
you
can
see
next
to
it
the
average
time
to
date,
a
cap
in
the
past
30
days
is
sitting
at
one
day
and
two
hours.
42
minutes,
which
is
again,
is
really
spectacular.
Deep
has
a
hand,
so
I'm
going
to
pause
here.
I.
B
So
that
means
you
know
if
that's
like
automated
verifiers
are
30
seconds
and,
of
course,
a
disproportionate
number
of
applications
come
into
that
and
then
ldn
for
recent,
like
top
ops,
has
been
pretty
quick
from
active
notice,
and
so
I
think
that's
what's
bringing
the
number
down,
which
is
fantastic
as
you
get
to
the
other
boxes,
I
think
we'll
still
be
able
to
make
a
couple
of
points
that
I
think
are
important,
though.
A
Yes,
thank
you
for
those
clarifying
details,
as
you
can
see,
the
click
next,
the
next
section,
the
average
time
to
date,
a
cap
for
direct
notaries.
This
is
looking
at
when
issues
come
in
through
the
field
plus
registry
app
and
create
a
github
issue,
and
then
notaries
are
able
to
reply
and
that
approved
and
that
allocation
lands
on
chain.
A
That's
for
all
time,
looking
at
10
days
and
then
for
the
past
days,
we're
seeing
three
days
so
really
happy
that
we're
looking
at
these
kind
of
both
overall
and
as
a
recency
bias,
because
it's
showing
sort
of
that
improvement
that
we're
hoping
for
we've
been
trying
to
push
a
lot
of
changes
with
this
intent,
and
so
I
think,
looking
accurately
at
you
know.
How
are
these
working
helps
us
understand.
Like?
Are
the
changes
effective
and
what
direction
do
we
need
to
keep
going
so
average
time
to
first
response
from
a
notary?
A
This
is
still
looking
at
all
time
and
again,
this
is
from
when
the
application
github
issue
comes
in.
Two
of
the
notaries
are
replying
to
that
eight
days
and
then
the
time
two
data
cap
allocate
on
chain
at
12.
so
happy
with
these
metrics
much
happier
than
you
know
when
we
first
started
calculating
them
and
the
outliers
were
pushing
us
into
the
20-day
range.
That
was
a
sad
moment
for
for
deepening,
but
this
makes
a
lot
more
sense.
A
So
continuing
on
looking
at
that
data
cap
pipeline
as
well,
we
still
have
some
work
to
clean
up
for
the
eldians,
because
the
when
we
rolled
aldians
over,
for
example,
the
ldn
for
discover,
had
some
something
close
to
five
heavy
bites
remaining.
So
when
we
rolled
them
to
the
new
process,
we
created
the
new
multi-sig.
A
We
have
yet
to
go
back
and
change
their
allocation
amount
of
data
cap
to
zero,
and
so
some
of
these
numbers
are
skewed
because
there's
you
know
two
ldns
for,
for
example,
discover
and
that's
the
case
for
a
number
of
those
ldns
that
were
proved
before
we
switched
to
the
new
v2
multisync
process.
So
these
numbers
are
a
little
off
still
continuing
to
work
on
that.
A
A
We've
just
been
prioritizing
not
blocking
people,
but
in
addition
to
that,
the
total
amount
of
data
allocated
clients
is
now
hitting
two
pebby
bytes
and
that's
27
percent
of
the
total
data
cap
kind
of
out
there,
which
is
great
and
flowing
down
into
deals,
we're
seeing
750,
and
so
that's
almost
35
of
the
amount
allocated
to
clients
is
getting
sealed
in
deals.
A
This
is
a
number
that
we
want
to
keep
seeing
go
up,
but
we
are
really
focusing
on
the
kind
of
increasing
the
top
of
the
funnel
and
then
using
other
mechanisms
like
some
solutions,
architects
and
some
other
ways
to
support
storage
provider,
biz
dev
to
help
get
this
metric
to
increase.
So
some
interesting
things
down
here
around
the
percent
of
successful
data
cap
applications,
being
you
know,
27
close
close
to
a
quarter
or
a
third
depending.
I
guess
I
split
the
number.
A
Some
of
this
is
again
coming
from
people
that
submit
duplicate
applications
and
how
those
duplicate
applications
get
handled
by
notaries,
and
then
other
is
you
know,
people
submitting
applications
that
are
incomplete
and
then
maybe
deleting
that
application
and
resubmitting
so
it's
sort
of
this
is
again
a
difficult
statistic,
metric
to
look
at
to
get
a
whole
picture
of
how
many
of
these
are
actually
quote.
Unquote
bad
applications,
but
we
don't
have
necessarily
a
target
metric
in
mind
here.
A
It's
just
something
that
we've
been
thinking
about
and
then,
like
I
said
this
is
you
know
this
34.9
percent
of
data
cap
is
something
that
we
do
want
to
help
see
increase
deep
anything
to
add
on
this
slide.
Okay,
moving
swiftly
on
looking
at
the
direct
notary
activity
there.
When
we
pulled
this
snapshot,
there
were
29
open
issues
in
the
repo
there
are
about
18
in
the
past
week,
new
applications
looking
across
time
we're
seeing
you
know,
210
with
that
label.
A
A
22.8
clients
are
receiving
data
cap
per
month
right
now,
it's
pretty
exciting
and
we're
seeing
some
of
our
updated
notary
table
metrics
that
deep
is
putting
together
from
a
table
that
I
think
andrew
hill
started.
Is
that
correct?
Yes?
A
So
we
can
see
some
stats
that
are
broken
out
by
notaries
and
we'll
keep
kind
of
increasing
that
we
have
some
projects
in
the
works
as
well
for
some
more
transparency
and
kind
of
like
scorecard
leaderboard
types
of
things
for
notaries
to
showcase
how
each
individual
notary
is
doing
so.
We're
excited
for
that
so
from
there
I'm
going
to
kick
it
over
to
deep
to
talk
through
some
of
the
six-month
goal
updates.
B
Cool,
thank
you
so
much
galen.
So
for
those
of
you
that
have
you
know,
participated
in
the
last
couple
of
noted
governance
calls
or,
if
you're,
watching
this
recording.
B
You
know
that
we
started
doing
this
thing
where,
after
we
spent
some
time
talking
about
what
reasonable
targets
were
on
a
six-month
rise,
and
actually
I
made
this
joke
last
time-
and
we
should
remember
to
edit
this
in
slides
that
we
have
to
reduce
six
to
five
now
because
we're
in
october,
and-
and
so
maybe
it
might
be
better
to
say,
like
a
target
for
march
or
something
reasonable
like
that,
like
maybe
q1,
is
something
we
can
rally
around
as
a
community
and
work
from
there,
but
basically
yeah.
B
We
divided
up
these
apis
or
goals
into
five
different
categories,
and
I've
been
wanting
to
add,
you
know,
sort
of
tracking
for
that
on
a
per
governance
called
basis,
so
the
first
of
which
is
volume
where
there
is
a
goal
to
get
several
hundreds
of
data
cap
available
and
data
cap
itself
should
be
plentifully
available
and
abundant
both
for
notaries
to
be
able
to
give
directly
to
clients,
but
also
for
large
data.
C
B
Driven
by
the
fact
that
in
general,
like
there's
a
lot
of
different
efforts
like
falcon
gravity,
slingshot
et
cetera,
that
are
generally
working
towards
pushing
100,
plus
everybody's
worth
of
content
under
the
network,
it's
a
current
status.
Like
last
time.
We
talked
about
the
slide.
We
were
in
the
40s
now
we're
at
63
periods,
that's
a
measurement
of
about
six
terabytes
that
were
given
to
militaries
in
the
combination
of
the
two
election
cycles
and
then
57
periods
coming
from
the
sum
of
so
far,
ldns
that
are
sort
of
ready
to
sign.
B
B
The
the
general
expectation
is
that
we're
still,
you
know,
working
towards
an
order
of
magnitude,
improvement
and
so
been
doing
a
little
bit
of
thinking
recently
on
this
one
and
would
love
to
share
some
thoughts
on
it,
probably
closer
to
the
end
of
the
call
and
get
get
people
thinking
about
this
a
little
bit
more,
the
second
of
which
is
time
to
data
cap
and
so
goals
here
were
to
get
a
lot
more
aggressive
and
how
quickly
their
clients
can
become
unblocked.
B
Do
operations
on
the
network
and
generally
that
it's
not
a
good
user
experience
as
a
client
to
be
like.
Oh
I'm,
gonna
go
upload
data
at
the
network
and
then
just
be
completely
blocked
waiting
for
10-11
days.
If
that's
how
long
it
takes,
and
so
the
goal
here
is
to
say
that
four
very
small
amounts
of
data
gap
like
what
today
we
treat
as
a
32
gb
tier.
B
Why
don't
we
look
at
bringing
that
up
to
one
tabby,
byte
and
and
letting
clients
get
that
more
frequently
if
needed,
and
then
support
notaries,
we're
doing
to
ensure
that
much
more
data
cap
gets
served
a
lot
quicker,
and
so
this
has
been
an
ongoing
topic
for
a
while.
I've
we've
had
a
couple
of
discussions
around
how
we
can
level
up
the
automated
verifier
to
support
like
a
one
type
of
data
cap
allocation,
and
so
that
discussion
is
on
on
github
we've
had
it
a
couple
of
times
it's
like.
B
B
Mr
galen,
can
I
request
the
next
slide?
Thank
you
risk
mitigation.
So
we
currently
have
you
know
two
different
dashboards.
There's
fill
plastic
info
that
is
being
run
by
the
file
drive
team
and
I
think
laura
is
in
the
call
today
and
then
there's
the
plus
dashboard
that
galen
just
pulled
up.
It's
very,
very
important
that
we
have
one
more
dashboards
that
are
coming
from
different
places
that
can
provide
an
opinion
like
it's
it's
difficult.
B
It's
extremely
important
that,
like
one
that
file
drive,
is
even
able
to
make
that
dashboard,
but
then
two
the
fact
that,
like
we,
should
continue
encouraging
the
development
of
additional
like
opinions
in
this
space,
because
it
will
require
multiple
sources
to
be
able
to
distill
the
truth,
and
we
don't
want
to
have
points
of
failure
in
the
system,
especially
as
we
start
to
move
towards
more
automation,
and
we
start
to
move
towards
a
world
where
we
rely
on
data
driven
decision
making.
B
B
This
call
interested
in
building
a
dashboard
reach
out
you
please
let
us
know
happy
to
have
a
grant
for
that
or
support
you
in
any
way
that
we
can
like
we're
working
on
adding
more
api,
so
that
the
actual
data
pulling
aspect
becomes
a
lot
easier,
and
so
this
just
turns
into
a.
How
do
you
want
to
design
it?
How
do
you
want
to
verify
your
data?
How
do
you
want
to
update
it,
and
what
do
you
want
to
look
like?
What
are
you
trying
to
solve
for,
and
so
it
would
be
great.
B
B
You
know
five
plus
providers
and
distribute
data
across
them
so
ways
in
which
that
we
can
flag
that
in
the
system
that's
important,
not
just
because
it
it
builds
a
higher
standard
of
quality,
of
like
clients
that
are
willing
to
apply
for
data
cap
and
how
they
use
it,
but
also
important
in
reducing
time
to
data
gap,
because
for
every
subsequent
allocation
that
a
client
comes
for.
B
That
starts
to
become
like
baseline
information,
that
a
notary
will
want
to
at
least
take
a
glance
at,
and
so
today,
if
a
notary
chooses
to
look
at
it,
they
have
to
go
to
one
of
the
dashboards
click
through
a
bunch
of
things
and
find
this.
But
if
we
were
able
to
just
automate
like
flags
or
automated
risks
on
like
a
particular
allocation,
this
would
go
down
from
being
like
a
five
to
ten
minute
process.
B
That
nurturing
has
to
be
aware
of
doing
to
like
a
30
second
process
where
nothing's,
like
oh
no
flags
on
these
clients,
happy
to
sign
it
sign
it,
and
so
I
would
encourage
those
of
you
that
have
a
you
know:
interest
in
like
fraud,
monitoring,
fraud
watching
or
in
general
have
past
experience
in
looking
at
like
basically
this
within
a
system
just
within
this
kind
of
design.
B
If
you
have
ideas
on
that,
then
there's
the
aspect
of
climb
onboarding
ux
needing
to
improve
to
ensure
that
the
ramp
to
falcon
is,
is
low,
friction
and
then
there's
this
coin,
plus
governance
side
of
things
where
we
want
to
ensure
that
we
have
more
scalable
governance
practices
on
both
of
these.
B
I
think
we
haven't
had
as
much
progress
in
the
last
month,
like
we've,
made
some
changes
to
falcon
plus
our
industry,
yet,
but
not
enough
that
it's
worth
calling
out
as
a
substantive
step
in
improvement
here
other
than
for
the
nurtures,
certainly
not
for
the
clients.
And
so
I
would
love
to
hear
if
you
have
any
ideas
on
this
or
how
you
want
to
contribute
to
this
again
happy
to
support
either
with
time
or
with
grants,
to
ensure
that
you
feel
empowered
to
make
changes
to
the
system
in
a
way
that
you
are
interested.
A
On
yep,
yeah
sweet
so
led
navigation
tracking.
This
was
something
that
I
put
some
effort
into.
I
need
to
go
back
and
maintain
it.
I'm
not
sure
how
valuable.
C
A
I
thought
it
would
be
a
lot
more
useful
because
I
thought
there
could
be
some
automation
to
this
based
on
labels.
Unfortunately,
in
github
right
now,
that's
not
the
case.
Unless
I
go
deep
into
writing
my
own
api
rules,
which
believe
it
or
not,
I
don't
want
to
do.
The
intent
here
was
to
basically
try
and
have
a
little
project
tracking
board
where
we
could
move
the
oakland
applications
back
and
forth.
A
To
say
this
one
is
ready
and
you,
the
notaries
or
various
community
members
could
land
on
this
project
board
and
then
get
a
snapshot
of
you
know
where
are
these
because
we're
now
at
something
like,
I
think,
30,
so
they
don't
fit
on
a
single
page
in
github,
anymore,
right
and
so
just
trying
to
find
better
ways
to
stay
on
top
of
it.
I
may
end
up
deprecating
this
if
it
continues
to
be
clunky
or
require
more
manual
effort
to
be
accurate,
and
we
may
just
rely
on
labels.
A
If
that's
the
case,
it
may
be
that
we
send
out
some
urls
that
are
saved
label
searches
for
the
notaries
to
be
able
to
say
you
know,
go
search
based
on
the
label,
ready
to
sign
and
just
look
at
those,
so
we'll
see
to
be
determined
if
anyone
else
wants
to
like
take
on
using
this
project
board
and
github
and
write
those
apis.
Let
me
know
happy
to
help
support.
A
We
still
have
some
various
notary
governance
discussion
topics
that
have
been
open
that
we've
been
pushing
through.
I,
to
my
knowledge,
I
think
that
meg
and
the
market
assessment
is
still
moving
forward
with
some
support
from
other
members
in
the
foundation,
and
then
I
think
the
big
ones
have
been
kind
of
around
this.
This
idea,
like
increasing
the
scope
of
automated
verification,
sort
of
what?
A
What
could
that
look
like
what
tools
do
we
feel
we
would
need
to
have
in
place
before
we
can
support
more
automation
and
then
also
fundamentally,
what
are
some
additional
ways?
We
could
automate
that
we
would
want
to
test
early.
So
did
you
want
to
dive
into
any
of
these
on?
This
call.
B
Yeah,
I
would
love
to
I'd
love
to
dive
into
the
first
one,
at
least,
and
I
think
the
prompt
for
the
conversation
that
I
want
to
use
is
today.
If
you
go
to
like
verified.glycerio
once
every
30
days,
you
can
go,
get
32
gb
of
data
cap.
B
E
B
B
B
Okay
looks
like
silence,
which
is
generally
means.
I
think
people
are
like
sure,
you're,
not
necessarily
super
pro
against
yeah.
I
think
this
is
worth
sort
of
just
keeping.
The
back
of
your
mind
is
like
you
know,
the
first
step
that
we
can
take
towards
drastically
improving
ddd,
drastically
improving
the
number
of
clients
that
are
unblocked
on
the
network,
drastically
improving
the
amount
of
data
available
about
clients
on
the
network
and
how
they're
behaving
oh,
we
have
a
hand
eric
go
for
it.
C
Yeah
we
can
hear
you
yeah,
okay,
I'm
fine
with
this
proposal,
but
obviously
for
the
chinese
miners.
They
do
have
the
language
barrier,
so
it
can
be
a
chinese
version
for
them
yeah.
So.
C
B
Yeah,
that's
a
really
interesting
point,
so
I
I
guess
I
have
a
question
for
you
alex,
so
please
pardon
my
ignorance.
Basically,
what
I've
seen
is
browsers
today.
Do
this
like
attempt
of
translating
right
like
if
you're
in
a
region
or
actually
it
doesn't
actually
matter
where
you
are,
but
you
can
basically
open
a
page
in
a
browser
like
hit
right
click
on
a
page
and
say
translate
into
like
some
language.
C
I
think
the
the
key
points
is
the
the
the
channels,
because
the
only
few
miners
they
would
like
to
log
in
with
gleep
website
and
for
chinese
people.
We
do
recommend
the
wechat.
Maybe
you
can
ask
karen
for
help
for
they
post
some
news
from
the
wechats
for
the
for
the
official
pages.
Then
let's
say
everybody
can
know
this
news.
Otherwise
nobody
knows
here
we
can
get
the
data
cap.
B
C
B
That's
very
helpful,
there's
also
generally
eric
a
question
for
you.
I
guess,
is
attention
to
this:
there's
generally
been
a
decent
amount
of
interest
after
you
filed
that
sort
of
translation
thing
on
people
who
want
to
translate
stuff,
and
so
I
I
get
a
few
things
in
slack
nowadays
on
people
who
want
to
work
on
translating
different
parts
of
the
plus
project
into
into
chinese
or
other
languages.
B
Is
it
worth
if
we
put
everybody
into
a
group
together
and
then
you
can
sort
of
lead
that
effort
and
coordinate
and
figure
out
the
best
way
to
approach
this?
Is
that
something
you'd
be
interested
in
doing
or
should
we
pull
in
some
support
from
the
foundation
or
something
like
that?
What
would
be
good.
C
Sure
sure
that
that
would
be
great.
So
if
you
can
put
me
into
the
translate
team,
we
can
work
together.
B
So
two
weeks
ago,
for
those
of
you
that
were
here,
we
spent
some
time
talking
about
daos
as
a
community
and
andrew
talked
a
little
bit
about
what
a
guild
was
and
sort
of
building.
B
Yeah
cool
generally,
it
seems
like
people
are
supportive
of
this,
though.
So
that's
that's
interesting.
Maybe
we
should
think
about
doing
this
sooner
rather
than
later.
Again,
I
would
love
to
I'd
love
to
I'd
love,
to
get
a
little
bit
more
data,
so
for
those
of
you
that
are
in
touch
with
clients
or
have
virtual
clients,
you
know
bring
this
up
with
them,
see
what
they
think
and
then
maybe
for
the
next
call.
B
I
will
petition
this
more
as
an
issue,
and
then
we
can
discuss
like
how
we
want
to
implement
it
like
whether
or
not
it's
just
verified
our
glyphs.I
o
that
levels
up
or
if
we
want
to
introduce
a
new
verifier
like
if
some
other
notary
here
wants
to
build
one
using
their
code
for
their
data
gap
allocations
to
the
interesting
ways
in
which
we
can
go
about
doing
this.
B
Sweet
yeah,
you
have
one
on
here
and
then
I
have
another
topic:
I'd
love
to
cover,
but
actually
too,
but
I
can
hold.
If
you
want
to
talk
about
this
thing,
first,
okay,
I
will
go
for
it.
So
the
first
of
those
two
topics
is,
you
know:
one
is
building
like
this,
like
no
leaderboard
stationary
sats
type
page
where
we
track
like
governance
as
a
metric
as-
and
we
talked
about
this
in
the
last
call
for
quite
a
while.
B
B
There
were
a
couple
of
takeaways,
but
basically
the
conversation
revolved
around
this
sort
of
learnings
from
other
dows
and
valve
like
communities
that
are
starting
to
build
like
pages
where
they
can
track
steward
contribution
participation,
performance,
and
that
was
more
an
idea
that
all
of
you
participate
in
now,
then
it
should
be
like
relatively
transparent
as
to
how
you
participate
and
in
what
ways
you
choose
to
participate
and
how
you
prioritize
your
your
time
within
it,
and
meg
has
been
pushing
this
idea
of
like
having
different
levels
of
expectations
for
notaries
and
and
how
much
time
they
should
be
spending
and
being
more
transparent
as
a
community
about
that.
B
So
that
is
pretty
cool,
and
so
you
sort
of
as
a
stepping
stone
to
that.
You
know
I
saw
this
page
and
got
a
little
inspired
with
it.
I
think
I
shared
this
on
the
call
last
week
or
sorry
two
weeks
ago,
and
basically,
I've
been
working
on
like
a
proposal
like
an
rfp
for
a
grant
for
like
what
a
notary
dashboard
could
look
like,
and
so
who
are
the
active
notaries?
How?
How
are
they
active?
What
do
they
usually
participate?
In
is
it
ldns
or
direct
applications?
B
When
do
they
tend
to
be
active
in
what
way
are
they
active,
etc?
So
that
one
we
can
better
inform
clients
of
you
know
how
they
should
select
an
area
who
they
want
to
work,
whether
relationships
they
may
want
to
build
in
the
ecosystem.
B
Two,
it's
interesting
from
the
perspective
of
basically
tracking
metrics.
That
could
then
result
in
us
having
the
data
to
have
conversations
around
like
things
like
service
levels,
where
it's
like,
oh,
like
there's,
a
clear
set
of
folks
that
are
interested
in
spending
more
time
in
like
community
governance
within
the
topics
or
there's
a
set
of
folks
who
are
more
interested
in
just
doing
due
diligence
all
the
time
for
manual
like
verification
of
applications,
and
so
based
on
those
things.
We
can
start
to
create
working
groups,
focus
groups,
you
know
task
forces,
etc
or
whatever.
B
We
want
to
call
them
within
this
ecosystem
within
this
community
to
become
more
efficient,
then
there's
this
third
aspect
of
just.
B
We
have
these
stats
on
the
data
like
on
the
on
the
dashboard
that,
like
gail
and
I
work
on,
we
have
them
in
the
cluster
infra
dashboard
as
well,
but
we
have
no
centralized
space
to
like
look
at
our
own
performance
as
a
community,
and
so
this
would
be
that,
and
so
it
would
replace
it
would
be
like
the
level
up,
basically
of
of
what
we
use
today
with
andrew's
like
csv
that
we
pasted
in
at
the
start
of
the
deck
and
effectively
like
we
have
a
better
form
like
leaderboard
around
it.
B
So
I
think
it's
a
pretty
compelling
idea,
not
all
of
you
were
there
two
weeks
ago
to
talk
about
it.
Hopefully
what
I
said
makes
sense.
If
it
didn't,
please
tell
me
I'm
happy
to
go
deeper
into
it.
If
not,
what
I'll
do
is
I'll,
probably
just
publish
like
the
rfp,
and
then
we
can
go
hunting
for
a
team
that
might
be
interested
in
implementing
something
like
this.
B
Oh
next
topic:
I
love
flag
is
around.
What's
the
other
one
that
I
want
to
bring
up.
B
Oh
yeah
general,
like
ldn,
like
application
flow
and
and
how
we
want
to
scale
that
up
a
little
bit.
B
So
what's
some
of
you
are
aware
that
you
know
there
are
a
couple
of
like
technical
hitches
last
week
in
terms
of
making
sure
that,
like
the
application
showed
up
correctly
and
they're,
all
assignable
we've
tried
to
push
a
couple
of
changes
that
should
help
with
like
the
sign
out
sign
in
flow,
so
that
there's
no
like
accidental
allocations
coming
from
different
multisigs
and
we've
tried
to
improve
the
ux
a
little
bit
for
the
notary
experience
in
the
last
couple
of
weeks,
so
that
it's
a
little
bit
smoother
for
you,
folks
to
figure
out
like
what
you
can
do
within
the
system
and
how
you
should
be
doing
it
in
general,
there's
still
more.
B
That
needs
to
be
done,
so
first
thing
would
be.
If
any
of
you
are,
you
know
willing
to
sit
down
with
us
and
then
the
team
that's
working
on
it.
They
basically
say
you
know.
I
I
have
these
three
complaints
or
issues
or
ideas
for
improvement
and
give
us
that
kind
of
feedback,
even
in
like
a
five
minute
call
or
ten
minute
call.
That
would
be
super
appreciated.
B
Please
reach
out
in,
like
the
falcon
boston
notary
channel,
to
let
us
know
that
you'd
be
interested
in
giving
feedback,
because
we'd
love
to
talk
about
it
and
then
second,
we
need
to
come
up
with
some
way
where
the
tool
itself
is
more
useful
for
you
in
in
like
pushing
like
a
way
to
let
you
know
that
it's
worth
looking
at
right
so
like
when
you
have
applications
come
in.
Typically,
you
get
flagged
on
github.
B
Maybe
we
use
slack
because
that's
really
where
we
are
most
active
as
a
community,
and
so
galen
was
toying
around
with
the
idea
of
having
like
a
slack
bot
built
that
basically
looks
at
the
ldn
repo
so
that
same
like
tracking
table.
We
have
basically
we
could
pull
from
that
and
we
could
just
do
a
like.
A
bot,
automated
post,
like
every
24
hours.
B
Right
saying
here,
are
the
outstanding
aliens
that
I
was
looking
at
for
you
kind
of
the
thing
or
we
come
up
with
something
on
github
or
we
come
up
with
something
on
email
just
want
to
figure
out
like
now
that
people
aren't
directly
being
assigned
it
looks
like
clients
are
having
to
go
to
specific
notaries
or
we
like
people,
wait
till
they
reach
out
to
us,
and
then
we
end
up
tagging
all
of
you
and
that's
not
a
good
like
mechanism.
B
B
B
Theoretically,
we
could
have
a
visualization,
I
actually
don't
think
that's
like.
I
actually
think
somebody
published
this.
Let
me
just.
B
Yeah
jim
jinpik
made
one
recently,
I
think
and
the
and
if
you
look
at
the,
if
you
look
at,
he
delineates
like
what
is
a
verified
deal
versus
a
not
verified
deal
and
it's
actually
pretty
well
distributed.
I
I
thought
it
looked
good
yeah
here.
Here's
here
is,
it
looks
like
we've
got
a
couple
of
big
hubs
in
in
europe.
B
I
think
there's
a
special
yellow
dot
here
for
wine
and
steam
as
well.
Let
me
see
if
I
can
get
you
a
link
to
this.
C
B
B
B
B
I
think
the
problem
with
this
is
also,
of
course,
it's
looking
at
probably
load
balance
id
as
opposed
to
the
ip
address
on
the
dht,
and
so
people
can
pretend
to
be
anywhere
but
yeah.
I
I'm
not
aware
of
a
minor
in
peru,
for
example
today,
but
maybe
I'm
wrong,
maybe
there's
a
sort
of
providing
data
center.
There.
B
F
I
don't
know
it's
like
peripheral
and
they
need
a
vpn
sometimes
and.
F
I
was
taught
talking
to
eric
last
night
and
he
said
we
probably
need
to
mail.
The
hardware
hard
drives.
B
Yeah,
that's
definitely
what
we've
seen
other
projects
do
as
well
like
just
put
everything
on
a
hard
disk
and
ship
it,
and
it's
actually,
if
you
do
it
at
a
certain
scale,
it's
cheaper
to
do
that
right,
because,
if
you're
doing
it
at
a
high
enough
scale,
then
you
amortize,
the
shipping
costs
and
the
hard
disks
are
actually
hard.
Disks
are
not
expensive.
It's
actually
just
the
logistics
challenge
of
getting
the
data.
F
B
F
B
B
Yeah,
I
think,
there's
cdn
is
one
option
I
think
cdn
will
always
be
too
expensive.
I
actually
think
what
you
need
is
a
point-to-point
data
transfer,
that's
more
efficient
right
because
having
a
cdn
means,
you
have
wide
coverage
all
the
time
and
that's
not
really
what
we
need
here.
What
we
need
is
one
place
to
another
place
very
efficiently.
B
There
are
some
like
there
are
some
software
solutions
that
are
being
built.
There's
this
one
that
I
was
talking
to
last
month.
That
is
worth
checking
out.
Maybe
there's
also,
of
course,
the
tried
and
tested
upload
to
webdue
search
provider,
which
I
don't
love.
I
think
we
should
come
up
with
a
better
way
and
then
there's
also
like
the
fact
that,
like
maybe,
we
just
need
to
get
more
effective
data
transfer
in
the
falcon
network.
B
So
I
know
there's
some
thinking
being
done
to
make
that
component
a
little
bit
more
modular
and
then
figuring
out
like
how
we
can
use
bitswap.
For
example,
there's
a
really
good
talk
from
hannah
howard
at
the
falcon
orbit
sessions
last
week,
but
highly
recommend
you
check
it
out.
D
B
Awesome,
that's
pretty
much
the
stuff
that
I
wanted
to
flag
today,
I'm
trying
to
think.
If
there's
anything
else,
I
think
in
general,
like
I
stepped
back
and
did
a
little
bit
of
thinking
last
week
on
some
ideas,
I
had
around
the
design
system
and
had
a
chance
to
sync
up
again
over
the
last
few
days
as
well
about
like
what
we
can
more
creatively
do
and
propose,
and
so
I
think
the
main
thing
from
that
is.
B
B
Yes,
definitely
we
were
fixing
the
allocation,
but
it
actually
should
have
been
fixed
yesterday,
but
I
don't
know
I
didn't
get
a
chance
to
sync
up
with
the
team
yesterday
in
the
evening,
because
I
got
inundated
with
some
other
stuff.
Do
you
know
the
latest
on
where
we
are
with
the
issues
that
we
had
over
the
weekend?
Well,
I
guess
end
of
last
week
earlier
this
week,
yeah.
A
I
was
typing
it,
but
yes,
the
it
was
not
looking
at
the
subsequent
allocation
bot
was
not
looking
at
the
rules
to
say,
calculate
this
value
and
this
value
and
take
the
lower
one.
So
one
of
the
bots
had
that
rule
the
allocation
bot
did
not,
and
so
that
was
where
the
two
bots
were
in
conflict.
So
they
are
updating
that
bot.
They
have
made
the
first
change
the
that
should
work
and
it
started
to
add
the
label
correctly.
A
Yes,
eric
root.
Key
holders
were
busy
last
week
at
like
orbit
events
and
yeah
that
was
so
pushed
through.
I
think
there
was
another
I
was
going
to
touch
on
this,
which
kind
of
rolls
into
some
of
these
questions.
The
current
process
around
the
client
opens
the
issue,
the
bot
and
the
governance
team
audit
it
just
for
completion,
and
then
it
goes
to
the
key
holders
they
get
the
signature.
A
The
bot
will
then
start
the
first
allocation
request.
A
lot
of
this
is
working
pretty
smoothly.
You
know
we
had
that.
One
problem
say
with
yours,
where
the
sort
of
delta
between
the
weekly
versus
the
total
that
hasn't
happened
before
so
it's
just
finding
these
edge
cases.
Sorry
that
that
hit.
You
should
be
getting
corrected.
A
We
are
not
doing
the
decision
making
or
the
auditing
or
the
opinions
on
is
this
valid,
and
so
the
expectation
for
the
notaries
is
to
actually
review
the
evidence
in
the
application
and
say
yes,
this
person
put
in
you
know
they
are
a
real
client
with
valid
data.
They
we
are
confident
that
they
claim
to
be
this
company
with
this
kind
of
data
and
they
can
provide
a
sample
and
it
you
know,
meets
our
standards,
and
so
we
want
to
encourage
you
know
as
we're
speeding
up
this
process.
A
We still want to encourage the notaries to be asking those questions. Specifically, because we have now created this faster process where the threshold is lower: if a notary gets to one of these applications, has a concern, and asks a question.
A
Please comment on the issue: say that you are concerned about this and want to hear the answer to this question before it gets approved. Because if you don't ask that question, some other notary may read it and think this is fine; they may not have the same question that you do. That doesn't mean that your question is not valid.
A
It could be that they have more context, but they could also answer that question for you. So if you're concerned about one of the LDNs that you see, for any reason, please say so. Flag it and say: please don't approve this until we hear more. That's part of the notary governance process, right? We are all responsible for looking at these.
A
If you don't do that, if you don't raise your hand and say this is an issue, and two other people jump in and approve it, you could still flag it later on to pause the subsequent allocation. You could say: hey, I actually had a question. I saw two people sign this; before we sign the next one,
A
Let's make sure we get an answer to this question. So that was the last thing that I wanted to say about LDNs: there are a number that have gone through as of yesterday that should be pending signatures and need some review. So that's all I'll say about that.
A
With that, we have come to the end. We have six minutes left for some open discussion issues. If other people have open topics or questions that they wanted to flag, go ahead and raise your hand and I'll pass it over to you.
E
Yeah, I was just wondering: I've submitted an LDN and it's approved now by three notaries. Under the new process, is it okay and good to go now?
B
I think Gil and I have some time later today to do a quick audit on what the outstanding things are for the root key holders, because, as you see in the chat, I think the first-year event had the folks that actually hold the keys a little bit busy. So we'll do this audit today, ping a bunch of root key holders, hopefully get some good responses in the next 24 hours, and unblock allocations. Yeah.
A
That was the previous mechanism, where we were waiting for notaries to raise their hand and say: yes, I approve; yes, I approve. So the slowdown that you are seeing is just because Deep and I, and then the key holders, over the past two weeks have not gotten fast enough to creating the issue and sending it to the root key holders for them to sign. So, apologies for that. But yes, as Steve said, we should get to it. Does that answer your question? Yep.
B
But the fact that you have already had notaries look at it and approve it means that as soon as that happens, your turnaround time should be extremely quick. Okay, I think that would be good. There's also a similar thread, actually, in the Slack about how we can track root key holder performance, stats, and governance steps, and so that was really interesting. I'm super supportive of that.
E
Can you enlighten me: is only one type of data set allowed in an LDN? Or can you tell me something about it, because I'm not sure.
B
Is this related to the 3K rice genome? Probably, is my guess, versus something else.
B
Yeah, I think at that point it's up to whether or not notaries want to support it. So if you can convince two notaries to give you the datacap at the start, then I think, yeah, cool, bring it. Someone asked me about the timing of the next round of notary elections. Do you have a specific plan? Oh, that is a good topic.
B
I think we should bring that up in the next call, probably, because we probably want to do one around the end of the year. I was just looking at the stats, and maybe about 35 to 40 percent of the datacap available to notaries has been manually allocated in general.
B
Something we talked about last cycle was: how do we also increase the number of notaries, and figure out what engagement looks like in a way that's sustainable for individual notaries? So, Bingo, I think it's a complex question with more nuance, and worth additional conversation. But if you want to give somebody an answer so they stop bothering you, you could say something like "in the next few months," or you could say "around January." We will probably have a better idea by then, but that's my guess.
B
Wow, we're going to end on time. I think we can do it. Oh, Anya, do you want to give us a quick 30-second thing on what that is?
D
Sure. I just do developer relations at Textile, so I'm trying to organize external communication. So if you guys want to join our Discord, please do; you can vote and submit your favorite stickers in there as well. And actually, we're dropping our layer 2 solution in there and there's a brainstorm going on, so if you want to join.