From YouTube: Filecoin Plus - Sept 28 2021 Notary Governance Calls
Description
Want to get involved in Filecoin and Filecoin Plus? Check out our community Slack channels at https://filecoin.io/slack, and learn more about Filecoin Plus at https://plus.fil.org/
Agenda topics included: program metrics and stats, updates to the stats Dashboard, work towards 6-month goals, notaries presenting on their open "Discussion Topics", LDN audit process & mindsets, and open discussion time.
A
This is part one of two, so this is the morning session for us and the evening for others. Looking forward to getting through a bunch of content today. As usual, we'll probably start out with some of the metrics and a conversation around where we're tracking in the program. Then we want to chat a little bit about changes that are coming to the dashboard, and about general tracking towards the goals for the roughly six-month horizon that we discussed over the last couple of notary governance calls. We also want to ensure that we provide enough time to discuss open Discussion topics, because I know there are a few that have been opened in the last couple of days, well, weeks.

A
I guess at this point there are also the LDN changes that should ship this week, and then, of course, any additional open discussion. I know Charles was mentioning that he may have a couple of topics, so that's in store, and we'll have a few minutes for that. Galen, you want to take it away?
B
Yeah, so jumping into a metrics snapshot from our dashboard around time to datacap. Our average time actually looks like it climbed up a little bit, but it's still in that seven-day window. One of the things we're working on, when we talk about some updates to this dashboard, is that these current averages are looking historically at all time, which, as the program changes, becomes less representative. We think this average time has really decreased over the past four weeks compared to what it was some number of months ago.

B
So we're looking at some easy UX modifications, and a simple first version may just be to change this to say "average time to datacap in the past four weeks," while keeping those historical snapshots and being able to ask how it would look across all time, or at a particular point in time. But we are seeing a decrease in response time from notaries and a decrease in the time to datacap allocated on chain, so both of those are still improvements.
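As a rough illustration of the dashboard change being described here, a trailing four-week average of time-to-datacap can be computed from per-application timestamps. This is a minimal sketch, not the dashboard's actual implementation; the field names and the `applications` structure are assumptions.

    from datetime import datetime, timedelta

    def average_ttd_days(applications, window_weeks=4, now=None):
        """Average time-to-datacap (in days) over a trailing window.

        `applications` is assumed to be an iterable of dicts with
        `opened_at` and `datacap_granted_at` datetimes; only applications
        granted inside the window are counted.
        """
        now = now or datetime.utcnow()
        cutoff = now - timedelta(weeks=window_weeks)
        durations = [
            (app["datacap_granted_at"] - app["opened_at"]).total_seconds() / 86400
            for app in applications
            if app.get("datacap_granted_at") and app["datacap_granted_at"] >= cutoff
        ]
        return sum(durations) / len(durations) if durations else None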
B
A couple of the other dashboard updates have already shipped, so you can go see these now. We are splitting out some elements from the large data sets. Specifically, right now I wanted to call out how we are separating out the datacap allocation, and part of this, like we talked about in the past...

B
...few governance calls, is that it really changes the way our funnels look when we think about these as the same, and we can look at our funnels in a way that kind of highlights that. So, our datacap allocation funnel for direct notary-to-client activity, not counting large data sets: we have 12.75 pebibytes that have been allocated out to notaries for them to start distributing. Of that, about 13 percent has gone to clients at this point. But what's really exciting here, in my opinion, is the transition from clients to deals.

B
It's about a third, which is pretty great. Obviously, the datacap held by notaries is just going to keep decreasing as more is given out to clients over time, until we get to another notary election cycle. Part of the reason we think of these things in this funnel is because we do this big allocation at an election point, and then the notaries are making distributions across a number of months from that pot.

B
But we want to keep seeing how much of that is getting to clients, how quickly, and then how quickly those clients are sealing it in deals. That highlights why it's useful to split these, and maybe not as useful to look at the LDNs on the same funnel, because of the process: the LDN entity, once it has stood up, has a very large allocation by itself, especially in comparison to the allocation of a direct notary.

B
We're talking 32.9 compared to the 12.75 across all twenty-some-odd direct notaries, and that's only for 12 LDN clients. We're about to probably double the number of LDN clients when we push through this new process flow, so this number is going to blow up significantly, which then changes the calculation of how quickly and at what pace that goes to the clients.

B
This program has only been doing allocations for three months, so it's very exciting to see that here. A couple of the usual statistics that we like to highlight: our direct notaries, you're doing great. There are currently only 38 open issues as of yesterday, which is fantastic.

B
Six of those are new and don't have any comments yet, but some number of those are duplicates, which we can also see down here in the percent of applications that get a first response.

B
A lot of the ones that are not getting a first response are probably ones that get closed without a response because they're duplicate entries and the clients themselves close them. We have 210 granted applications, with 26 in the last 14 days. This number continues to increase; every two weeks we're getting more and more in a two-week cycle, so per month we're seeing about 20 clients getting datacap. It's great. We're going to keep pushing these numbers higher, but right now we're very happy with that pace.
A
Yeah, totally happy with that too. So, just as a quick reminder on updates to the process, folks: I think we shared this at a governance call at least one ago, if not two, but one of the things that we wanted to start doing a little bit more was using GitHub's other features that are more appropriate for some of our processes as a community.

A
So that's where we're going to start discussing the Discussions as areas, or problem areas, or proposals that are still very early, or where a lot of input is expected from the community. Issues are more around the actual solutions, or more finalized, baked proposals that are on the path towards implementation, once the community has rallied around a particular approach for a specific problem.

A
So I opened up this discussion specifically on increasing the scope of automated verification. This is a pretty interesting challenge and something that's going to matter a lot in terms of helping the program scale, and ensuring that the Filecoin Plus system itself enables a lot of interesting clients to actually get on board as fast as they possibly can and prove, on the actual network, that they're behaving as we hope a good client would behave.

A
So, a couple of stats that I want to call out. Actually, maybe I can share my screen again, because I can just throw up the discussion; I dropped some of the stats down here and I think it might be interesting to share. Cool. Let me also pull up the dashboard at some point.

A
Cool, so a couple of things I want to call out. Hopefully this text is large enough, but basically we have one automated verifier today, verify.glif.io. Currently it hands out 32 GiB of datacap; it was originally at eight.

A
I don't know how many of you remember this, but at some point we decided that it wasn't enough, because of the feedback we were getting from clients. I think we all discussed it at a couple of notary governance calls and eventually asked the notary that runs it to pitch an amendment proposal to the original application to upgrade the amount of datacap they're handing out. Their mechanism for deciding whether or not somebody gets datacap is whether there's a GitHub account older than 180 days associated with the client address, and whether the holder of that GitHub account has not previously requested and received a datacap allocation in the last 30 days. So it's something you can go to once a month and get 32 GiB of datacap.
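To make the rule just described concrete, here is a minimal sketch of the eligibility check as stated on the call: a GitHub account older than 180 days associated with the client address, no grant in the last 30 days, and a 32 GiB grant per request. This is an illustration of the policy as described, not the verifier's actual code; the function and field names are assumptions.

    from datetime import datetime, timedelta

    GRANT_SIZE_GIB = 32          # current per-request grant described on the call
    MIN_ACCOUNT_AGE_DAYS = 180   # GitHub account must be older than this
    COOLDOWN_DAYS = 30           # one grant per account per 30 days

    def eligible_for_auto_datacap(github_account_created_at, last_grant_at, now=None):
        """Return True if the client qualifies for a 32 GiB automated grant."""
        now = now or datetime.utcnow()
        old_enough = now - github_account_created_at > timedelta(days=MIN_ACCOUNT_AGE_DAYS)
        outside_cooldown = (
            last_grant_at is None
            or now - last_grant_at > timedelta(days=COOLDOWN_DAYS)
        )
        return old_enough and outside_cooldown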
A
Right now, there are only two people that have maxed out, or are maxing out, at four months' worth. Of course, theoretically there's an even higher maximum, but nobody's really doing it. We can see that a lot of clients are just using it initially for the first set of deals they're making, and then there are a few that have actually transitioned into much bigger datacap allocations, either manually or via the LDN process.

A
You can see this in the statistics, where there's a bunch of clients that come in and the amount of datacap received from this specific notary is very small relative to the total amount of datacap they have in the deals they've made on chain. So it is serving as some kind of entry point and funnel, as a low-friction...

A
...low-barrier entry: get datacap, make deals on the network, that kind of thing. Some other interesting stats: there are only about 13 tebibytes of datacap that have been granted so far. This node was originally given 100 tebibytes of datacap, and those grants have been spread across 483 client addresses. Of course, I don't know that those are unique client organizations, but we do know they're unique GitHub IDs and unique wallets, and several of them have gone back for seconds.

A
Only some have gone back for thirds and fourths. What's interesting about this, though, is the proportion that engagement through this node represents in the Filecoin Plus ecosystem overall, where greater than sixty percent of all client addresses that have ever received datacap on chain have actually come through the automated notary at least at some point. Similarly, sixty percent of all storage providers on the network storing deals with datacap have stored deals with clients that have come through this...

A
...this entry point. That's pretty interesting, because I think it shows that it is indeed a lever for increasing the scope of the stakeholders and ensuring that we are continuing to broaden the usefulness and the impact of the Filecoin Plus program.

A
The other thing I want to call out is that we have this focus on speed and a desire to improve the client onboarding experience, make it as amazing as possible, and reduce the friction in onboarding clients to the network. Time to datacap is going to be a big part of how clients feel when they want to adopt Filecoin or onboard data onto the network, and the benefit of an automated notary is that that TTD is like 30 seconds or a minute, as opposed to where we currently are, which is several days.

A
The stats, even though they're spanning many more months, are still indicative of the fact that it takes multiple days to go back and forth with a notary even to receive a few tebibytes of datacap. And that's given what the average stats of a single interaction look like, where the average datacap allocation is roughly 1.4 tebibytes and we find that deals themselves are on average 16 GiB.
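A quick bit of arithmetic on the figures just quoted helps frame the "two deals" point in the next remark, with the stated averages taken at face value:

    32 GiB automated grant / 16 GiB per deal = 2 deals per automated grant
    1.4 TiB ≈ 1,434 GiB; 1,434 / 16 ≈ 90 deals per average manual allocation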
A
You want to make sure that the clients themselves get an opportunity to do more than just two deals and demonstrate the fact that they're using the network in a useful manner. And in this vein of enabling notaries to become more like watchpersons and guardians, we need to ensure that we're even receiving the data that's required to act on it, make assumptions, and come to conclusions about client behavior on the network.

A
I think it makes sense for us to start thinking about leveling up this kind of flow, and leveling up can be defined in a few different ways. So I put a couple of prompts here that I would love to discuss now or on the discussion thread, but this is also meant to enable or create the space to continue having this conversation, for you to share your ideas, proposals, and suggestions.

A
For example, scaling up to, say, one tebibyte of datacap automatically: what do we need to do to scale up the datacap going out automatically, so that we can reduce the burden on notaries for small allocations and increase the amount of data available on clients that are in the Filecoin Plus ecosystem, so that when manual decisions do need to be made on them, or when additional due diligence needs to happen, there is quite a fair amount of data actually available? Just to put that into context...

A
...that's about six deals' worth of content, like being able to upload 100 to 200 GB, which is often enough for many use cases, but is also a really good indicator of whether or not a client is going to be able to leverage this network in a useful manner, without necessarily putting a lot at risk, given the scale of where we are and where we're headed, but also given the scale of the Filecoin network at large. So yeah, the first prompt, generally: what are ways in which we can actually feel confident? What do we need to know?

A
What do we need to do? How do we need to change the way automated verification may work in order to scale up to that number? And then, transitioning from that as well: we've seen that a lot of clients get stuck on not having an old enough GitHub account. Or there are certain development environments in the world that just straight up don't use GitHub much, like, for example, in China.

A
It's not as well used there, and there are also clients on the network that aren't coming from GitHub, and some of them don't really have a GitHub account. And as the general path to making deals on Filecoin continues to evolve, whether that client is going to come directly to the network and identify storage providers, or whether that client is going to go through some sort of deal broker, like a bidbot, or Filswan, or Estuary, or something like that...

A
...that's going to reduce the expectations, or the amount of knowledge of the establishment of a client, required in the web-native and open source spaces. So I'm not even sure GitHub is the best way to do that sort of initial KYC.

A
So, thinking of ways in which we could either pick something else, experiment with something else, or, even better, try a combination of things that maybe enable clients to prove that they're worthy of receiving this datacap, that could be interesting. And in line with that, that doesn't necessarily mean we change the existing automated verifier; it just might be interesting to have more of them. It might be interesting, in the next election cycles, to put out some RFPs for organizations that are interested in building something like this...

A
...if we as a community can generate a strong enough prompt, or a scoped enough ask, for what we think would be interesting to experiment with. And then, similar to that, the third prompt is generally increasing the presence of this particular automated verifier, but also, in general, of automated verification.

A
So if that's improving the process of the existing ones, scaling that up, and then finding ways to ensure that it's accessible to the clients that are interested in using it; because so far, only 13 tebibytes have been used. That's not that much; it's less than 15 percent of the original allocation, whereas many of the notaries that are listening to this have already received top-ups and gone through elections more than once and received hundreds of tebibytes, or pebibytes, of datacap. So it would be great to scale that up.

A
That, and also RFPs, as I mentioned, to show that there is distribution and decentralization and diversity of ideas and perspectives as we approach this problem. Yeah, that is my sort of kickoff of the discussion. I would love to hear if there are any immediate reactions to that or any thoughts. I'll stop sharing my screen and just open the floor for a minute or two.
A
Interesting. So we were told that accessing GitHub is not straightforward: one, most people don't necessarily set up GitHub accounts to begin with in China, because they're using other tools, and then somebody else mentioned that they actually have to work around it by using a VPN to access it. So we weren't really sure if it's as well established there as it is, say, in the NA market, or maybe the EU market, where it's much more commonplace to come across people that have had GitHub accounts for several months or years.

A
Yeah, so I think it's a very relevant point, because there's probably a really high correlation right now, but there are two things that I would call out. One, a lot of the actual communication within the network seems to be happening through Slack, where there's just a lot of back and forth; the Filecoin deals market channel is where a lot of the interaction is happening. And then I would say that a lot of clients that are coming to the network aren't really necessarily running things from scratch...

A
...anymore. We have a bunch of solutions that are being built to abstract away direct interaction with the chain; a lot of different organizations in the ecosystem are basically building tools. We have at least two in this room itself, where Charles is working on a mechanism to abstract away manual deal making, and so is Andrew Hill. It's interesting because that effectively reduces the barrier to entry quite a bit for a client, because over time the flow goes from "hey...

A
...you need to understand how to run Filecoin, so you're going to a repo, you're cloning it, you're building it, you're running it, understanding how it works and hosting it" to instead "go to this HTTP URL, drag and drop a file, and press upload." And so we start moving towards the right of that spectrum, depending on whether there's support for clients still having a wallet address, which they can get through an exchange or anything else.

B
Yeah, adding something onto that: a thing that we have heard, and also seen in looking through some of the direct client applications...

B
...is that the person coming to GitHub is not necessarily someone on an engineering team or someone with experience in storage mining. The person showing up as the client may work in some other part of the operation, and so using GitHub, or any of those interfaces where there's kind of this developer-first tooling, is a barrier to entry for them. They're more used to something like a Google form where they could drop in their information and then get an email.

B
And so it's sort of this question of what that would look like, and I think, again, some of the other notaries have been experimenting with this: what would it look like to not require GitHub as a starting point for someone coming to the network, even for getting that first small allocation? But good points. Any other thoughts or questions?
C
Yeah, I do understand, but I do think most of the Chinese miners or clients who know Filecoin should have the ability to get a VPN, especially because we are in the industry, and everyone in this industry actually knows how to use a VPN. It's very necessary.

A
I think that's completely reasonable. I see the same thing in the chat as well from others. So yeah, maybe GitHub is a good indicator; maybe it is the right thing to use for now. But I still think it's worth thinking about ways in which we can scale it up.

A
So, even if we increase, say, the existing verifier, or we add a different one, improving its logic could be an interesting path, where it looks at, say, whether or not datacap has been used before handing out more. Even basic things like this don't really exist today in our path of automatic verification. As with many other things in the network...

A
...it was an initial idea with a basic implementation at the start, to which we only really did one iteration, prompted by Andrew several months ago, to increase the amount of datacap to 32 GiB.

A
So what I'm hearing is maybe GitHub is a really good indicator, and I think that's totally valid. One thing Galen and I have been thinking about as well, and would love to get your thoughts on, is related to that.

C
But I do think there exist some language barriers for some of the clients, because, you know, we do have a very clear PDF describing how to apply for everything, but I do think that during the application process, I know of some clients that actually didn't understand or didn't read our documents well, and in some cases it's due to the language barrier, not because they cannot use GitHub or other tools.

B
Yeah, I think Kaitlin is here, and I think there's a current FIP that has been proposed in the community specifically about having some more language translation.

E
Yeah, I noticed Eric wasn't able to join the call this morning. I actually wanted to flag that this is something that was put forth as a FIP, but which I don't actually believe we need to go through the process for, as this is just documentation...

E
...that he was able to create, so the community is able to adopt it if they'd like. It effectively maps the terminology from Chinese to English, with both a term and a definition, and I think that might be really helpful not only for translating documents in general, but also just for using it as a benchmark for understanding different things that may be translated sort of on the fly.
E
Yeah, I believe so. I asked Eric and didn't quite hear back yet. If we're proposing to actually change terminology that's presently in use, we'll need a FIP, because it's sort of an operational reorientation, yeah.

A
Awesome, thanks so much, Kevin. Okay, that's great. I think we should probably ensure that we can move forward with that soon too, because that is hopefully helpful. Unfortunately, I don't do Chinese very well, so I can't validate that myself, but if anybody in the community who's listening to this call is interested, please check out that particular proposal from Eric and see if you can thumbs that up, so we can get it integrated into our repo and ensure it's available in our documentation.

A
Thank you for sharing the link in the chat; that's FIP issue 159, for those of you that can't see the chat. Awesome. I think it might be worth hopping back over to the slides, and I know that we have a couple of discussion topics that were proposed by Meg, but I think Meg is hopefully fast asleep at this time.

B
Yeah, rather than sort of butcher her vision: these were her discussion topics and her ideas, and she put these slides together for us. I just get to put them up on the screen and read you what they say. But, as Deep said, I'm pretty sure Meg will be able to join the second time slot call for this and provide that narration, which will be recorded.

B
So if any questions come up as we run through these, go to the discussion, go to Slack, and check that recording when it's posted. Some of her discussion topics are around defining the DAO, something that has been brought up a number of times. This also kind of relates to what...

B
...Deep was just talking about with more of those auto verifiers: what's our stance and how do we make the rules? There's some information on the background of a DAO, which I think a lot of people on this call are currently familiar with, another around a human-centered governance journey map, and one of the other discussion topics that she's brought up is a market assessment. I know there's some conversation happening with some other members of the Filecoin Foundation on moving that market assessment forward.

B
And some examples of service design, so there's a lot of information here. She has four discussion topics posted, so just for the sake of going through her slides: service level agreements, SLAs, which, depending on what industry you're in, you may or may not have encountered at some level.

B
Like I said, there are some great ways to contact Meg and learn more. Jumping back to our slides, I also have a discussion topic in there. I'll be super brief.
B
I want to talk about increasing the ceiling on large dataset applications. It's currently set to five pebibytes, but we have already heard very specifically from a couple of our large dataset clients that five, when counting replicas, will not be sufficient, and that they are already anticipating and ready to do more. There are a couple of ways we could solve this problem.

B
The ceiling is not necessarily saying how much a client is going to get at a time, because that calculation is predominantly happening by the weekly allocation amount, not by the percent of the whole, even though both of those are available for the calculation. So rather than having one of these clients, as they start to get close to five, have to put in a whole new application for a new multisig notary and run through that process from start to finish, we want to have that continuity of behavior.
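As a rough sketch of the sizing logic being described, where each tranche is driven by the stated weekly allocation rate rather than by the total (ceiling) amount, something like the following captures the idea. This is an illustration only, with assumed names and a simplified rule (tranche = the smaller of the weekly rate and whatever remains of the total request), not the actual bot logic.

    def next_tranche_size(weekly_rate_tib, total_request_tib, already_allocated_tib):
        """Size the next allocation tranche for a large-dataset application.

        The per-tranche amount follows the client's stated weekly allocation
        rate; the total request (the "ceiling") only caps how much can be
        granted over the lifetime of the application.
        """
        remaining = total_request_tib - already_allocated_tib
        if remaining <= 0:
            return 0
        return min(weekly_rate_tib, remaining)

    # e.g. a 5 PiB (5120 TiB) request drawn down at 100 TiB per week:
    # next_tranche_size(100, 5120, 0)    -> 100  (weekly rate governs)
    # next_tranche_size(100, 5120, 5100) -> 20   (only the remainder is left)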
B
If we can, it seems like a better solution would be to just raise that ceiling amount. So I put in a proposal for how much I think could work, based on talking to some of those clients. I would love to hear from others; if anyone on the call right now has thoughts on raising that ceiling, raise your hand. Otherwise, you can go find that discussion issue and drop some comments there.
A
Yeah, so this is sort of in line with the six-month-ish horizon goals that we've been talking about for about the last two calls, but I just wanted to do a quick update and figure out what a good way is to continue discussing these, or at least, as a community, to start tracking towards these a little bit more deliberately. There are a couple of ways in which we can do it.

A
Before I dive into that, though, just looking at the chat, I see two people volunteered to help with the translation and to ensure that the integration of the docs makes sense. Thank you both so much, Alice and Laura. If you can take a look at that FIP and just drop a comment in it directly...

A
...that would be super helpful, and we can integrate it into our notary governance repo, the client onboarding repo, or both from there. So I really appreciate your offer, and hopefully we can get your feedback in there as soon as possible. Back to the slide. There are a couple of different focus areas that we've been talking about, and they're loosely broken up into five; at least, that's how we shared them in the past, and that's, I think, what made sense to people in conversation after that.

A
So let's talk through them sort of row by row, the first of which is volume. We've talked about getting a much larger magnitude of datacap available at the top of the funnel and available to clients that are looking to onboard onto the network, and we set this target at something like 500 pebibytes in the next six months or so. It's interesting how you do that math, because every time an LDN, at least the way it's set up today, is approved...

A
...it means that we've sort of set aside that much for that particular client, or that we as a community are confident enough that we can support that sort of path. So we're proposing that the way in which you calculate that sum is the amount granted to LDNs plus the amount directly granted to clients through the verification that each of you is doing individually when they apply to you.

A
The current number is roughly in the low 40s of pebibytes, so we're getting eight percent, maybe ten percent, of the way there. But there are also a bunch of things coming up that could be interesting, or different things that we could think about proposing, that would help unlock progress there. Okay, so a couple of things to highlight. One is issue number 217; I know Alex already asked about this in chat and Galen responded.
A
But if you didn't see that: basically, the proposal for having an LDN v2 flow should make large datacap applications and allocations, that whole process, a lot more approachable and a lot faster. As well, running the experiment with a new mechanism itself is generally going to increase the amount of datacap that we're allocating towards that as a community, so probably at least 50 pebibytes will come into that funnel in the next several months. So that's great.

A
We should also think about how we serve ecosystem efforts in general. I know that PL, for example, announced, or maybe it was FF, but somebody announced Filecoin Gravity.

A
It is probably useful for us to ensure that Filecoin Plus doesn't get in the way there, and does enable that sort of demand to actually land on the network.

A
Other things to think about are notary re-elections, and ensuring that the amount of datacap in circulation in general continues to go up, either through the LDN process, or through general verification, or adding more notaries, or increasing the datacap available to notaries. There are a ton of different ways that we can do this, but it would be great to ideate and start thinking about that as we look ahead, especially in the next couple of months: what we can do to set ourselves up for ensuring that we are building a system that scales to what this network will probably need of us in the very near future.

A
The second aspect is onboarding speed, or what we like to call TTD, which has turned into quite the frequently used acronym: time to datacap. But, oh, a question in the chat about Gravity: no, it's not part of Fil+; it's just an independent identification of data sets that could be onboarded onto Filecoin.

A
I think it's similar to the interaction that other programs in the ecosystem have, where oftentimes they will come to the Filecoin Plus system and then ask for datacap because they're surfacing legitimate client demand, but that's different than Fil+ doing it directly itself, right. Yeah.
B
Laura linked it there, thank you. Gravity is kind of a referral sales program to help bring in some additional clients. It is anticipated that all of those clients will go through Filecoin Plus and would be eligible, and, based on their size, would probably go through the LDN flow, but they may also just go directly through notaries. I don't anticipate Filecoin Plus being a barrier to any of those referral clients coming in. Charles, did you have a question on that before we move on?

D
No, I don't have a question. I just, for me, the most beautiful part of Gravity is that it allows you to have encrypted data, but the Fil+ part, for now, doesn't allow you to use that concretely. But yeah, Gravity is okay.

A
Yeah, that's a really good point, and I think that is something we should start to talk about too, as a community, sort of evolving with the type of demand that's coming in. Having the more difficult conversations on ways in which we can scale the definition of what can come through Filecoin Plus is probably something we should think about doing in the next few weeks, so yeah.

A
I know that Charles, you brought this up a couple of times, and some of the other notaries on this call have as well; thinking of ways in which we can start to support the path to trusting clients that do end up proving they're worthy of that trust, basically to bring on encrypted data, could be an interesting thing for us to think about in the coming weeks and months.

A
So yeah, I definitely encourage you to start a discussion, or create that space and platform for people to just share their opinions about it.

A
I know we're now a little short on time, so I'm going to get through this quickly, but the previous slide has the TTD stuff. The current status on that is that we're still orders of magnitude away, but hopefully changes to tooling, automation, and the scope for automated notaries, based on the discussion that we just talked about earlier today, will help; other ideas are more than welcome too. Let's hop to the next slide in the interest of time.

A
The current status of that is effectively that there are two dashboards that exist today. There's one that I think was built by Laura, actually, who's in the chat, but they're both relatively basic with regards to, I think, the problem space and how they could evolve. So I'm excited to find ways in which we can work together as a community to offer feedback and suggestions, and maybe even push these dashboards to the next level in terms of their usefulness.

A
Then there's client onboarding and just generally improving the user experience, where there's work that we need to do with documentation, discoverability, access to program tooling, and things like that. And then there's Filecoin Plus governance itself, which is finding ways in which we can do better and come up with governance practices that are much more scalable and mature...

A
...with regards to what we need to be able to pull off together as a group over the next several months and maybe even years. So I'm excited to get some more tactical feedback and discussions, perhaps led by some of the notaries on this call, to see how we can continue to improve the general process.
B
Sweet. I want to touch on this before we run out of time. Alice brought this up about LDN v2 in issue 217. Just a reminder of the quick flow: a client is going to open an application, and the bot and the governance team will audit it for basic completion.

B
The bot will then create the LDN issue for the root key holders, and the key holders can sign it. The bot updates the original issue, and the bot will kick off the first allocation request. There are also bots to kick off subsequent allocation requests by monitoring the amount of datacap remaining, so that amount of automation is very exciting.
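A minimal sketch of the last step described here, a bot that watches remaining datacap and opens the next allocation request once it runs low, might look like the following. The names, the threshold, and the `open_allocation_request` helper are assumptions for illustration, not the real bot's logic or parameters.

    LOW_WATERMARK = 0.25  # assumed: trigger when <= 25% of the last tranche remains

    def maybe_request_next_allocation(client, open_allocation_request):
        """Open a new allocation request when a client's remaining datacap runs low.

        `client` is assumed to expose `remaining_datacap`, `last_tranche_size`,
        and `has_pending_request`; `open_allocation_request` is a callback that
        files the request for notaries to review and sign.
        """
        if client.has_pending_request:
            return None  # don't stack requests while one is awaiting signatures
        if client.remaining_datacap <= LOW_WATERMARK * client.last_tranche_size:
            return open_allocation_request(client)
        return None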
B
What this means, though, is a little bit of a process and mindset shift for notaries. All of the notaries will be on all of these multisigs, with a threshold of two, and the notaries are responsible for the diligence on the application before approving any of these allocation requests that the bot starts. The bot is not signing anything or sending messages on chain for you.

B
It's just starting an allocation request so that it shows up in the app, and the goal here is that the notary will check the application for a sufficient amount of evidence that this is a real client with valid data. A number of the notaries have their own questions or rubrics of process for how they engage with their direct client allocations.

B
But the goal here is that the LDN application itself is asking enough of those questions that notaries feel comfortable making that first allocation, ideally based on just the content of the application. If the application is missing some key features, then maybe we need to add some additional questions or parameters to it, so we'd love to hear about anything we could just put in front of the client at the start, because the goal here is to minimize the amount of back and forth on the client. So they come to the table...

B
...they see what the questions are going to be, they can provide all the evidence, to the best of their ability, that they think the notaries are going to want to see, and then the notaries can make a decision. The other thing to note here is that, while we care about what they say they're going to do in the application, it is also a matter of: let's make an allocation...

B
...that is, bias towards that first allocation and watch their deal-making behavior, because then we can always flag it and say, you know, you said you were going to make these kinds of deals in this way, we're not seeing that, and you need to change your deal-making behavior before we're comfortable approving the next allocation. And what we want to do is, when a notary sees one of these applications and has questions about it, either questions for other notaries...

B
...for example, we've seen some great knowledge sharing back and forth across regions, where notaries in China are able to help provide some insight about applications that are coming out of there...

B
...we want that conversation to keep happening. That can happen in the GitHub issue and also in Slack, so you can talk directly to notaries in the issue, and we have our notary Slack channels, so we're trying to keep that moving forward. We hope to see this land very soon; we thought it was going to be ready for today. It's not.

B
We want to make sure that we ship it correctly, rather than putting a whole bunch of new LDNs on the board and then needing to go through and make some tweaks. We know there are going to be a couple of wrinkles even once we land it, but that's where we are. I know that was super fast. We have three minutes remaining in the hour.
A
Yeah, the conversation in the chat is just a continuation of evolving the LDN flow, or Fil+ in general, as Charles was saying before, to consider encrypted data, for example, especially from clients that we know are trustworthy, like governments, which do need to encrypt that data, and so...

D
On encrypted data: previously we didn't have the Gravity project, so without Fil+ support it was really hard to get the industry onboarded, but for now I think Gravity will be a good buffer before we have a good solution for how to examine encrypted data so it can be eligible for Fil+.

D
So I think, for now, we don't really have to rush to make encrypted data become Fil+ projects, because Fil+ is really a lot for them. For now, on the public data set side, we still have a lot of public data to onboard. As far as I know, we still have the Stanford one.

D
So at this time, I think we have more important things to focus on before encrypted data should be a priority for Fil+.

D
Yeah, another point: I really worry about what recently happened in China with the law changes on the cryptocurrency side, because we know that we have so many notaries, and so many data set locations, in the China region, and whether this affects their work. I also see that some data sets claim that, because they're related to some sensitive data, the data set won't go out of the Asian region.

D
This is a bigger question. I'm not sure how notaries are thinking about a data set only available inside one region.

A
Meg in the past just talked about Australia and GDPR, of course, and China has its own policies around this. But then there's the goal of leveraging Filecoin to be a useful, permanent, long-term store of, like, humanity's data, and to what extent we want to ensure we enable that. But I think legislation is something we probably still have to come up with a way to respect, so that we don't put any storage provider at risk; but that does increase the challenge of how we define distribution or decentralization, then, right.

D
I don't know, this is really something we need to dig into as a topic, because on one side, from the geolocation value and diversity perspective, we should have it located in different places. Otherwise, how can we show decentralization?

A
Yeah, I would love to hear the opinions of maybe the notaries and the community members in China who are much more familiar with the policy and the space. I don't have as much knowledge.

C
Mining is still not that risky, because, you know, for Filecoin there's no mining pool, unlike Bitcoin and Ethereum, so for the Chinese government it's technically very hard to track down these miners in the space, unless they are selling, you know, compute, cloud computing power, or some products like that. If they're just focusing on Filecoin mining and do not have normal customers buying their equipment, that will be fine, at least for now.

A
Yeah, so as long as you're actually just doing the job of being a storage provider, it doesn't... yeah, yeah, that makes sense. Okay, it's good to know. Yeah, I think, Charles, it's a very good prompt. We should probably continue to keep this top of mind for future calls and ensure that we're collecting info. We probably need to have some sort of guidance or recommendation or opinion, at least for both clients and storage providers that are interested in participating, but it seems like for now it's okay.

A
Good news for me in the chat, awesome. Okay, I think we're a couple of minutes over. Thank you all for participating in a lively discussion today. We're going to wrap up here, and then, for those that can make it, that second governance call will happen several hours from now. So, looking forward to continuing the conversation there with the set of folks that can make it from those time zones. Have a good rest of your day.
A
All right, welcome everybody to part two of the notary governance call for September 28, 2021, which is important because this program will be around for a long time, so you should remember what year it is. On that note, we're just letting you know, if you're watching this recording, that this is part two, because we had the same agenda this morning with people from different time zones. If this is the first time you're watching the recording, please feel free to watch both parts.

A
Of course, what I upload is the same YouTube video, so you should be able to see all of it, but also come join us at the next governance call. We have them every two weeks, one in the morning for us in Pacific time and one in the afternoon, and that tends to accommodate most of the world. We try to keep the agenda somewhat consistent, so that there aren't too many surprises across the different time zones.

B
Yeah, sounds great. So, a quick snapshot: we've been looking at our time to datacap metrics, and we are seeing that, just compared to two weeks ago...

B
...our average time to datacap has climbed back up. But this is also highlighting an upcoming improvement that we want to ship to this dashboard, where, probably just for now as a version 1.0, we're going to change this "average time to datacap" to be "average time to datacap in the past four weeks," because what it's looking at right now is across all time, and we think that we've made some really good improvements to the program.

B
Notaries have gotten more responsive, and the reality of how quickly people are getting datacap right now is really different from when the program first started. So that's something to stay tuned for. We are seeing decreases in time to first response and in time to datacap allocated on chain, so that's great; we want to keep driving these down. These are some of our key metrics that we're thinking about in terms of client satisfaction and client success.

B
We'll talk about that a little bit more when we talk about some of our six-month goals and how we're doing, but speaking of updates and changes to that Filecoin Plus dashboard, the interplanetary dashboard...

B
...we have already shipped a change where we break out the datacap in the LDNs, and, as we talked about in the past couple of weeks when we look at those funnels, when I show you the funnel now split out across those two, I think it'll really help highlight how useful it is to think about these as separate things, because of the way the process flows with that amount of datacap going to a notary...

B
...they really are very different situations. So, looking at the direct allocation funnel: this is from the notaries who've gone through that second round of elections. We sent out 12.75 pebibytes of datacap, and coming out of that allocation from the notaries, we've had 1.67 pebibytes go to clients.

B
We're anticipating having another round of notary elections in Q1, and then bringing on some more notaries and topping up existing notaries that want to reapply, so we'll continue to see ourselves getting closer and closer to that 12.75.

B
But we do think about this as a funnel; it's about how quickly we're working through that amount.

B
The more exciting metric here, in my opinion, is that from the amount that has gone to clients to what has been sealed in deals, across the history of the program so far, we're at about a third, which honestly is a pretty good amount. We want to see those numbers continue to get closer together, so that as clients are getting datacap, they're more quickly using all of it in those good deal-sealing behaviors, but for where we are right now, a third is a pretty good number.
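Reading the funnel figures just quoted as conversion rates, and treating the "about a third" as an approximation: 1.67 PiB granted to clients out of 12.75 PiB allocated to notaries is 1.67 / 12.75 ≈ 13% notary-to-client conversion, and a third of that 1.67 PiB sealed in deals implies roughly 0.55 PiB on chain so far. These are back-of-the-envelope readings of the numbers stated on the call, not separately verified figures.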
B
So looking at this funnel is not necessarily the most telling way to say how much has gone to the notaries and then out to the clients. But in the past three months since we started making allocations, and I know it feels like it's been longer, but it's only been that amount of time, for the 12 LDN clients we have, in total 39.8 percent of the datacap that has gone to those clients from those allocations has been sealed in deals, which is great. So again, we are getting that amount of datacap to a client address, and we're getting close to...

B
...half of that getting locked up quickly in deals, and so we want to keep seeing those numbers get closer and closer together. Going back to direct notary activity, as of yesterday, and again, these numbers get out of date really quickly, there were only 38 open issues, and that includes some duplicates. The new applications that hadn't yet gotten a comment from a notary...

B
...that's about six. Looking down at this next metric, the percent of applications that get a first response, we're seeing that at 62 percent, and I think a major contributing factor to this number is those duplicate applications that either the clients themselves close, or notaries close once they respond to the first one.

B
I think that's the main contributing factor here, but it's interesting to note just how many of those issues get opened without a first response for one reason or another; I think it's predominantly a result of duplicates, so maybe there should be some changes that might help address this. There are 210 closed issues with the "granted" state label in our repo, 26 of them in the last 14 days. So this number continues to increase, and we are getting more clients datacap faster through this direct notary activity.

B
So that's great. To that end, we're seeing about 20 new clients receiving datacap per month, or 20 clients total receiving datacap per month. From there, we're going to shift gears: I'm going to talk briefly through this discussion issue that I put in, and then I think what we'll do is turn it over to Meg and Jonathan. They have some things they want to talk about, and they put together some slides as well.

B
So I think we'll have them share their screen. In our call earlier today we kind of walked through the slides; I'm really excited to hear their narration on those slides, and then Deep has a discussion topic as well. So, just a reminder, we are using this new GitHub format for Discussions to try and drive some conversations. We'll have some conversation here, but also, as questions come up, please go ask those questions, or have that bird-walk conversation...

B
...asynchronously in GitHub. So the one that I pitched 14 days ago was around increasing the ceiling on large dataset applications. You can read more about it, but fundamentally, we have already heard from some of our LDNs that they're going to have more than five pebibytes of data when you count their replicas, and so this is something where we don't really want to run into a situation where we have some of these very highly motivated, high-interest...

B
...large dataset clients getting to the point where they're hitting a ceiling that we have kind of artificially set, and then having to either reapply with a whole new application and create a new notary, which then splits kind of all of our auditing and tracking yet again. So the idea here was just: what if we increase that ceiling? That doesn't mean that we set a new max requested amount for everyone; everyone that applies will still get to pick their range.

B
So some new clients may say, you know, we very confidently only anticipate wanting one pebibyte of datacap, but it would just let us have a higher amount for those big clients that already know, counting replication, that they'll need it. So, check out that issue. I'm going to stop sharing my screen, and then I think we'll switch it over to Meg and Jonathan. Let me give them some hosting access.
F
Sometimes the wheels fall off at this part. You have the presentation, Galen, so if I can't share, then you could.

F
So I'm talking through actually just two primary discussion points, ones that I think were raised a little while ago. The first one is defining the DAO. We want to automate what we do as much as possible and become these guardians as opposed to gatekeepers, and we have been looking at some of our own DAO structures at home.

F
So it's been sort of timely, because what I realized when we were looking at it is that before you can set up something that's automated, decentralized, and autonomous, you need to have some of these things in place. I think we already have a lot of them, but it's just a matter of taking enough further input from the community to fill out the pieces where there may be gaps.

F
There's another page here which is just around the definitions of a DAO, and I'm pretty sure the whole community is comfortable with this, but stop me if anyone wants to talk about it a little more.

F
Would that be a fair statement, Deep and Galen, that everyone's fairly comfortable with understanding this?

B
Yeah, I think a lot of people are. I also think this write-up is good. I'm going to ask you something now, since it's a good time: when we finish, would you be willing to export these as PDFs?

B
Then we can upload them, and maybe we break them out and put them on each discussion, and put them in the issue announcing this call, and then we've got these really good slides in multiple places.

F
Yeah, no problem, and there are some great references; I mean, the DAOstack Medium article is a pretty good one. It goes into detail, it goes into their tech solution, but it does explain some of this stuff. So look, the purpose of this, and the ask, was: what do we need to do to get to a place where we're clear about these things? And it is related to, and I'm just going to skip forward, it's related to a smaller one...

F
...the service levels. Notary service levels are part of this whole automating-our-DAO effort, and it's something that, you know, when I listen to you running through how we're performing, and we've got these large datacap targets that we've got to fulfill and enable, this was one thing that I've been raising for a while, and I think other notaries have as well.

F
How might we organize ourselves a little better so that working through these issues is frictionless? So they're both really tied together: if we're going to do defining the DAO, then part of this is the notary service levels.

F
And, as Paul said, this is a big one. I guess the question is what motivates us and what incentivizes us, and if you look at other DAOs, they do have an incentive mechanism within the program, right. So perhaps it's something that we could take in a survey, and it's probably one of the things that I think might speed up the feedback loop here: understanding what everyone thinks about applying service levels, but also about incentivizing them.

B
I've been thinking about this one and wondering if there's something that we might get through, or propose, or implement in time for the next election cycle round, so that, as part of both the existing notaries' re-applications and also new notaries, there are, as you're saying, clear guidelines of what the expectations are and also clear balances and incentives. Maybe a simple little thing is...

B
...you know, the rubric has amount calculations, and we just say: if you agree to a certain level, that lands you at a certain spot on the rubric. But I think a concern there is that we also want to encourage getting more notaries at more levels, and so we're balancing that against the idea that we don't want to force every notary to operate at the highest level; how do we account for that in a rubric if we're scoring it that way?

A
The first thing is that there's a bunch of organizations; like, today, when you apply to be a notary, you're applying as an organization, but you're also applying as an individual, and there's not a super clear line on which it is that's actually serving. We typically default into the organization serving, and then that organization may choose to elect an individual, and that individual may change over periods of time, for example.

A
But there's also this element of organizations themselves having an organizational footprint, or existing as different entities in different places, and at some point there was a relatively interesting conversation on how we ensure that there's proportionate presence of, basically, funding on the network relative to actual notary power, for example. Actually, "proportionate" is wrong; it's kind of the opposite. It's that just because you have a ton of FIL, for example, or you have an LLC set up on every continent...

A
...you know, we should design with that in mind, knowing that a lot of the people in our ecosystem are multinational, international organizations that have entities set up in different places, are well funded, and have good interests in making sure that this network succeeds. So ensuring that they get to participate fairly, I think, is an interesting challenge.

A
On the other hand, we can leverage the fact that we already have these groups to start creating structure. Maybe it's not the perfect layer, but that's what we have today, right: we have 25 notaries split up across regions. Maybe in the future we'll have, you know, 500 notaries, with five from each country actively participating, and that's a completely different world state. But what we're solving for today is something like 25 to 100 notaries, spread out relatively thinly across the world with some concentrated areas.

A
How do we leverage groups of notaries that share a background, share presence in a region, share knowledge, share language, share expertise, in a way that, if we're going to start pitching SLAs, ensures that instead of it just being "here's a set of notaries at a higher SLA for the entire pool of notaries," we have each region or each country...

A
...with somebody that's operating as a lead peer, doing that coordination effort for that group of notaries? That could be interesting, I think, in terms of leveling this up to scale better, because...
C
A
It's true that we want to increase this in general: we're working towards increasing the presence of datacap, the presence of notaries, and the ability of notaries to empower clients on the network, and that's probably going to come from leveling up automation and aligning on policy. So if we're going to do that, then having SLAs makes perfect sense, because you want somebody who's held accountable, but instead of making them the DRI for the entire notary pool, we should make them the DRI for a smaller group.
A
But thinking a little bit further: imagine a world where, say, we even went to 50 or a hundred notaries in the next election, and we had this concept of groups and SLAs. How could we work towards a much more effective and efficient community-organizing mechanism, basically, and leverage this idea, turning it into the stepping stone towards groups of people that can create policy together?
F
A
Yeah, I would love to do a survey, I think, and I would also love to see some other people, other notaries, engage on this and see what they have to say and ways in which we could do this. So if there are other notaries on the call who want to unmute and talk about this one, I'd love to hear it. If not, there's a link on the GitHub discussion.
A
For that reason, I would love to see some engagement. Galen and I have, in general, been doing a little bit of thinking around coming up with ways to survey, so maybe we can front-load some of that thinking this week so that we have a process we can pitch back to you, and then run it as an experiment with this one, to say: hey, let's actually poll community sentiment and ensure that we're hearing the audience.
A
And go from there. I see Megan in the chat said Fil Poll; yeah, I don't think that's appropriate yet. I think this is more like surveying what people think, ensuring that those that do have opinions are being given the chance to share them, and reducing the friction to sharing them.
B
I think one of the things that we've run into is understanding what survey technology is not going to be regionally constrained and is secure enough to give us the things that we want to get out of it. It may be just as simple as starting with a Google Form, going from there, shipping that to our notaries internationally, and getting feedback from them about whether or not they were able to easily access it. That's been a thing that has come up a couple of times.
F
Look, I think the only thing that will take it to the next level is putting a survey out and asking, if possible, for it to be mandatory for the notaries to complete it, because we want to get to a place where this is decentralized and autonomous, and we can't do that unless we all take part; you guys don't want to be telling people how to do it. That's not the point, right? The point is that we all decide. So that's probably my only input there.
F
Unless anyone else in the chat has a question, I'll move on to the next discussion, and these are all actually very related. Part of my observation was that the system and the process at the moment are really focused around, really geared to, Filecoin. So we serve Filecoin, but we may not be serving the clients, and I have a design thinking background.
F
So every time I'm interacting with someone, like with the Internet Archive, for example, I'm asking them if they need any help, how they onboard, what they need from us, and how we make sure this is successful. So I want to bring that lens. I just want to bring it; I think we need it overall.
F
How do we understand what problems we're solving for our customers so that we can meet those targets? The best way to do that is some form of journey map. Now, there are two ways you can do it, so I'm just going to run through these two things. There's an actual customer journey, which really puts a spotlight on the customer: it deeply understands them across their whole process, from when you recruit or acquire them to when they leave you, when they retire; it develops their value proposition; and it designs the experience around them. It's a real spotlight on the customer.
F
The thing that I've been training in over the last couple of years, and that I really do love, is service design. By comparison, it maps across the whole value chain, which I think adds a little more value. Uber is excellent at this, and, dare I say it, Amazon in their distribution followed this model: they met with everyone from the procurement center through to the distribution center, all the way through to the customer, understood what the systems and the touchpoints are, and they make every single layer frictionless. So what I'm maybe proposing, and why I'm putting these two examples up, is that it's a blend of both: we really deeply understand our customers and we understand what we need to do to service them. So it may be a blend of both, and it's actually also related to this market assessment piece.
F
So I'm having a separate conversation with the Foundation around supply and demand being a little out of balance, and when I heard the size of the datacap quotas that you would like us to work towards... We have been using our networks, and that has been quite organic, but there's a point at which, in any sales cycle, that runs out, right? You can only tap your network so much until you have a strategic way to commercialize it. So in all the new technology products I've commercialized, you need this piece. It tells you who's doing what, when, and how; it helps us understand our value proposition; it involves some prototyping and co-creation with clients; and the customer journey is part of it. I can't remember the reason why I split them out; there was a good reason, and when you asked me to put this together I wondered why I had split these into separate pieces, but maybe it was just to make it easier to make decisions on smaller chunks.
F
Now, for this one, I guess the ask is that I want to know where the community stands. Some of the notaries I've spoken to said that this is pretty important, but I would like to know if there's greater support from the notaries, and also support for our application to get this work done. I would lead it, but we would need a core group from different regions so we could take this forward.
F
Some of this will be universal, and then we'll hand it off to different regions, and we can request funding for qualitative research in each region so you can adapt it for cultural differences. So we would set up a baseline, and what's included in that baseline is this, you know, pretty fabulous kind of customer journey and service design.
F
A
Yeah, I have a quick process-related question that I'm hoping maybe Megan and Galen might have an idea on, which is: for this specific market research thing, not the service design necessarily, but for the market research thing, can that not just proceed as an RFP or a grant request outside of Fil+, even? Because it's still relevant.
F
Yeah, well, I think it's just been moving slowly, so any support I can get there will help, I see.
G
Yeah, Julie, I know, I thought that you and Clara were talking about this, yes. So where did it get hung up? How can I unblock it for you?
F
I met with Clara late last week, so it's still with her. I've put in the proposal and the funding, and, in a sort of startup style, I've suggested that we tranche the stages, so we only release funding when we've achieved the milestones before it. I think the biggest piece of funding is the qualitative research for each region.
F
So if we're doing the universal piece first and then drilling down on regions later, then it might even fall into what Deep was suggesting, which was: how do notaries act in groups? Some of that could be a little self-sourced as well. So we basically just need to start with: what is the universal baseline?
F
Do some of the qualitative research, and then find out who the core group is, or line up the core group, to understand whether we're doing qualitative research in other regions as well, to make it richer and, I suppose, representative of a global view. So yeah, I guess the next stage is to get an update on how we proceed.
G
Okay, cool. I suspect that this is a little outside of the current grants that are coming through, and so I think that Clara is probably, you know, handling it herself as opposed to putting it through Sonia, who usually does the grants for the Foundation. So I'll poke Clara and see what's going on, but I'm guessing that we should probably just kick it over to Sonia and stick it through the normal dev grant process.
F
Yeah, and I think maybe this is part of the point that I'm pioneering: we've got dev grants, but where is the CX? Let's make sure we don't miss that, because experience is everything, absolutely everything. It's what differentiates lots of products and services nowadays, and we're offering some great incentives, but if the experience is bad, then we only have one shot at that.
A
Yeah, I think that makes a lot of sense, Meg. The only thing I'd call out there is that there are basically two different approaches here. One is that you think of several services and products being built independently on top of the Filecoin protocol, with each one of these doing its own market analysis, UX work, and CX work; for example, someone in the chat just dropped that they're working on this.
C
A
Then there's the angle of trying to do this at a protocol-wide level, to increase the funnel, to increase the amount of traffic and transit that's coming through, and that kind of workstream also exists. I know the Foundation specifically has been hiring people that can work as client solution architects and things like that, who understand the needs and can think about designing solutions at scale. So I think you going down the grants route...
C
F
Absolutely, 100%. And you know what, the wonderful community of entrepreneurs, and we've been having a conversation with Andrew around it, but if they already have personas and journey maps they can bring to the table, that's something we can add in here. I can sort of imagine we're cataloging a demographic, a persona, a vertical; we understand their pains.
F
Imagine sharing that on a global scale, right? If Andrew wants to scale a workforce here in Australia and we want to sell his solution on his behalf, this could become really rich. I've not run a market assessment that can be crowdsourced before, so I'm kind of excited about it. It could be really, really powerful, and it becomes a living thing; it's not something that you do once and then it sits on the shelf and collects dust.
F
And the ask is: yeah, vote for it, provide some feedback, and, Megan, you're going to do some follow-up with me.
F
A
Cool. For the last sort of discussion, I might just briefly touch on it, since we're running a little short on time and I also got a lot of speaking time this morning, so people can watch the recording for that. But the TL;DR on that was basically pushing us to start thinking about how we can level up automated verification as the first big incremental step, both towards supporting larger volumes of datacap and towards much faster time to datacap and improving the client onboarding experience.
A
But the few salient points are basically that today we have one automated verifier that has not only brought in greater than 65% of all unique client addresses participating in Filecoin Plus, but has also resulted in deals being made with greater than 60% of all unique miner IDs making deals with datacap on the network. So literally a majority of the actual on-chain presence of our entire program is coming through this automated verifier, and all it's doing today is handing out batches of 32 GiB, right? At some point in time it was actually handing out 8 GiB, not even 32, and we as a community decided it was probably worth leveling that up to at least fill a sector. Some stats that caught my attention this past week, and I think Gillan generally agrees with this, are that the average deal size on the network today with datacap is about 16 GB, which means that if somebody's getting 32 GB of datacap (I'm oversimplifying, and I should be saying GiB), they're able to make about two average-size deals.
A
And if we're looking at unblocking clients, making two deals to prove their presence on the network is not only relatively trivial, but it's also kind of useless from the perspective of generating data about the client and the kind of actor that client might be in the Filecoin Plus ecosystem, if that's somebody we want to support in the future, either through automation or through manual verification, but enabled at a much larger scale.
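As a rough illustration of the arithmetic behind that point: assuming the 16 GiB average deal size and the 32 GiB automated-verifier batch quoted above, the number of average-size deals an allocation supports stays tiny until the batch size grows by an order of magnitude or more. The larger allocation sizes below are hypothetical, not committed numbers.

```python
# Minimal sketch (not from the call): deal-count math behind the
# "two average deals" observation above.
GIB = 1
TIB = 1024 * GIB
PIB = 1024 * TIB

AVG_DEAL_SIZE_GIB = 16  # average deal size with datacap quoted above

allocations_gib = {
    "current automated batch": 32 * GIB,
    "hypothetical 100 GiB": 100 * GIB,
    "hypothetical 1 TiB": 1 * TIB,
    "hypothetical 1 PiB": 1 * PIB,
}

for label, datacap in allocations_gib.items():
    deals = datacap // AVG_DEAL_SIZE_GIB
    print(f"{label}: {datacap} GiB -> ~{deals} average-size deals")
```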
A
Yeah, and that, combined with the fact that the average allocation of datacap today is only about 1.4 GiB, sort of prompted the thought: hey, what are the things we need to do to start thinking about automated verification at the next order of magnitude or more? So instead of 32 GiB, how do we get to hundreds of GiB, or even up to one PiB, via automated application? That could either mean scaling up the existing automated verifier, or it could mean adding more to the actual KYC elements that drive it.
A
But I feel like it's a really good opportunity as step one in the direction of serving clients better and faster, reducing the individual burden on a notary while increasing the burden on the group of notaries to start thinking together as a group and proposing changes that can then start to inform automation. That discussion was just kicked off as a prompt to start thinking about some of that, and we're happy to engage on it, probably on GitHub, because I know we have a couple more things we need to cover. On that note, I just wanted to start adding in a little section on where we're tracking towards the goals that we've been chatting about for the last month or so, for roughly six months from now.
A
So there are, of course, conversations around volume, which is: how do we get up to, say, 500 PiB of datacap across LDNs and clients, and how are we tracking towards that? Today we're at 41 PiB, as you saw at the start of this presentation.
A
What are the things that are going to massively impact that? For example, issue 217 is shipping soon, which is a new flow for LDNs; that's likely going to contribute, you know, 50-plus PiB of datacap. What about another notary election? What if we scale up the number of notaries, and what would that do in terms of supporting a much higher volume of circulating datacap?
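A back-of-the-envelope sketch of that goal tracking, using the 500 PiB target and the 41 PiB current figure mentioned above; apart from the rough 50 PiB quoted for the new LDN flow, the per-initiative estimates are placeholders, not figures from the call.

```python
# Rough gap-to-goal tally (illustrative only).
GOAL_PIB = 500
CURRENT_PIB = 41

expected_contributions_pib = {
    "LDN flow v2 (issue 217)": 50,  # rough estimate quoted above
    "next notary election": 0,      # placeholder, unknown
    "scaling up notaries": 0,       # placeholder, unknown
}

projected = CURRENT_PIB + sum(expected_contributions_pib.values())
print(f"projected: {projected} PiB, remaining gap: {GOAL_PIB - projected} PiB")
```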
A
There's also this Protocol Labs program called Filecoin Gravity, which is unrelated to Filecoin Plus but related to general business development for Filecoin; it's like a sales referral program to bring additional data sets onto the network. The point of this is: what are the goals, and where are we at today?
A
What are the big things in front of us, or prompts for discussion, or ideas that we should be thinking about? Moving on from volume, there's a set of goals around onboarding speed and time to datacap, where we're probably still quite far away. But conversations like what I just talked about, where we think about automation more, and Meg's prompts around how we can organize ourselves better as a community, are probably going to accrue well towards improving onboarding speed and experiences for clients. There's also just the aspect of improving the tooling, and so part of why we want to get better at surveys and collecting data is to start looking at ways in which we can also improve the tooling and improve the support of the software that's been built.
A
There's also a bunch of ecosystem development happening in this space. I don't know if you were there a few weeks ago, but we had Matrix Storage come and present their notary verification tool, and I know that 1475 has a tool, so there's definitely work that's been done.
A
Consolidating those efforts, making them more open to the community, and improving the general state of tooling will definitely go a long way in reducing onboarding latencies. Then there's the next category of things on the next slide: risk mitigation, ensuring that we're building dashboards and creating data that can be tracked and analyzed, so that the on-chain behavior of stakeholders is something we can follow and continue to build trust with. The status of that today is that we have two dashboards that are still relatively basic, though there is of course iteration happening on those dashboards.
A
There have been lots of nice upgrades there, not just a better UI, but also much better data processing and some interesting insights coming in. But again, it's a good prompt and a good opportunity for you all to think about ways in which we can improve transparency in the system, so that, as we continue to get more aggressive and experiment with things, we also have strategies for mitigating risk for Filecoin Plus and ensuring that the things we do are going to serve the network and, for example, the many active community members, as we level up the magnitude of impact of this program.
A
Sorry if I rushed through that a little bit; I expect that this little segment will probably become a bit more consistent in our governance calls as we track towards these goals. So, Galen, hopefully there's enough time left to cover the rest of this stuff.
B
You're muted. Lightning round? Yeah, lightning round here. So, the LDN process v2 that we've been talking about: we thought that it was going to be shipped and ready to go by today. That wasn't quite the case; there were still some tech issues.
B
We have tested it end to end in a testnet repo with the Keyko team, and it's looking good. We're still hoping to have it ship by the end of this week, which would mean rolling all of the existing LDNs that are already in flight over to a new multisig and some new processing. So I just wanted to take a second and point out some of that process and mindset for the notaries as they work on these LDNs.
B
The client fills out the application on that front end before it shows up in the app, and at that time the allocation request is already there, ready to go, for two of the notaries to jump in and do the diligence on the application. What we're really hoping for here is that the questions we ask in the application itself give the client sufficient opportunity to demonstrate who they are, that they are a real client with valid data, and that they are worthy of allocations of datacap.
B
And if we are feeling like we're having to do a lot of back and forth, then maybe we need to think about changing that application on the front end, because our goal here is that a client lands on that application, submits their information, and is quickly able to start making deals to then prove that they are going to operate in good faith, as they say. So, again, for the notaries, we're looking at both.
B
Let's audit that deal-making behavior, and that's going to be supported through those bots that post stats in the GitHub issue as they kick off subsequent allocations.
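As a purely illustrative sketch (not the actual Fil+ bot or its tooling), the kind of check such a bot might run on a client's deal-making stats before a subsequent allocation is proposed could look roughly like this; the field names and thresholds are assumptions:

```python
# Hypothetical sketch of a "ready for next tranche" check; the real bot,
# its data sources, and its thresholds may differ.
from dataclasses import dataclass

@dataclass
class AllocationStats:
    granted_datacap_tib: float   # datacap granted in the current tranche
    used_datacap_tib: float      # datacap already spent on deals
    unique_provider_count: int   # distinct storage providers dealt with

def ready_for_next_tranche(stats: AllocationStats,
                           min_used_fraction: float = 0.75,
                           min_providers: int = 3) -> bool:
    """Return True if usage looks healthy enough to propose the next tranche."""
    used_fraction = stats.used_datacap_tib / stats.granted_datacap_tib
    return used_fraction >= min_used_fraction and stats.unique_provider_count >= min_providers

# Example of the kind of summary a bot could post as a comment on the LDN issue:
stats = AllocationStats(granted_datacap_tib=100, used_datacap_tib=80, unique_provider_count=4)
print(f"used {stats.used_datacap_tib}/{stats.granted_datacap_tib} TiB across "
      f"{stats.unique_provider_count} providers -> next tranche: {ready_for_next_tranche(stats)}")
```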
B
So as we have questions, both for the clients and also for other notaries, we want to be asking those on the GitHub issue and also asking in Slack; we have a really good private notary channel. As we organize ourselves, we've seen great examples in this LDN process so far of those regional and subject matter experts, where a notary can lean on other community notary members for help, to say: you know, I don't really understand what this client is saying, or this type of data; can somebody else help me assess whether this seems valid? We want to keep encouraging that behavior. So that was also super fast. With our final five minutes, we just wanted to stop and see if there were other burning or pending questions or topics for discussion that we wanted to bring up.
B
The question is whether we support changing the client address where you will receive the datacap, is that right? Yeah, to our knowledge that is a change that can happen on the GitHub issue, so we can adjust it. Where it gets tricky is how we connect all of that behavior back for auditing. But since it is coming from that LDN notary, when we go to audit it, it will just say, you know, this notary has made allocations to, for example, two or three client addresses; how have those client addresses behaved, knowing that those are multiple client addresses for the same application or project? So there shouldn't be technical or programmatic limitations to changing the client address.
B
I think what we'll want to do, and maybe Deep and I can talk more about this, is that if we feel we need some additional diligence before we make a change, we could ask for something like a message, you know, a message receipt in the issue that someone could point to, just to help show ownership and the connection between those addresses. So stay tuned; as we need to make those changes, we can just comment in the GitHub issue and go from there.
B
As for discarding the remaining allocation: there is not right now a way to discard or roll an allocation from one client address over to another client address. So what we hope to do is just have the process working so that the flow from the notary to the client is sufficient to not lead to a client running out, but also to not end up with a client that got a huge amount that's now just sitting there because they lost access to that client address and can't use it.
B
A
Nice. How about changing the data set? That's tricky; I do think that's worthy of a new application, right, because the whole point is that this thing is specific to a client use case. Unless... well, I think so. Faye, your case is interesting, because you specifically applied for data for the three categorized genomes, right, but, for example, Estuary or FilSwan are applying for datacap that says, hey, we're doing...
A
So I think that's an interesting sort of dependency, where I don't think you have as much flexibility, because you're uploading data that's effectively from a subset of data sets being made available to you, where the decisions are being made by you and by other people. That's different than if you were building a service, or if you were your own scientific organization conducting its own research, or some of the other things that we're seeing in the LDN applications.