From YouTube: Filecoin Plus - Sept 14 Notary Governance Calls
Description
Want to get involved in Filecoin and Filecoin Plus? Check out our community Slack channels at https://filecoin.io/slack, and learn more about Filecoin Plus at https://plus.fil.org/
Filecoin Plus governance calls, same agenda in two timeslots (8am & 4pm PT) to allow for broader geographic attendance. On these calls, we discussed recent DataCap allocation and application metrics, heard from various clients applying through the Large Dataset process, discussed some highlights from the recent design summit hosted by Filecoin Foundation, shared 6-month goals, and allowed for open discussion from notaries on current ideas.
A
Great, happy Tuesday, September the 14th, eight a.m., the first one, over here in Pacific time. Thanks, everybody, for joining us. As we did two weeks ago, we're running the same agenda in two different time slots today, so this same agenda will happen again in a few hours; if you can't make both, that's totally fine. We'll jump through our notary governance meeting. We've adjusted the planned agenda just a little bit to bring client presentations up, since we have some large dataset clients here that want to share their applications.
A
So first we'll run through some metrics and touch on that, then talk about some "discussions" (I'll put air quotes around that, because we're using the actual Discussions feature in GitHub; we'll introduce that). Deep is then going to talk about some things from the summit we had last week, share out some six-month program goals, show some of the new LDN tools, and then we'll have time for an open discussion, at which point I think we'll also hear from Eric regarding a FIP that he has proposed.
A
So that's the agenda. Kicking it off: we learned just a flash 30 minutes ago that a number of web services are currently down, including the Interplanetary dashboard and Spacegap. We grabbed these screenshots from yesterday, but if you go check the dashboard right now it will not load, so we're working on getting that fixed.
A
We were able to update how the dashboard is scraping and calculating. As happy as we were to see that 18-minute average time to DataCap, we're back at what we think is a more accurate seven-day average for overall time to DataCap. We have seen a decrease in average time to first response from a notary, which is phenomenal, and time to DataCap allocated on chain changed slightly, by a few hours, but we're still looking at that 12-day mark. So the metrics are still continuing to go down.
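The seven-day average the speaker mentions can be sketched in a few lines. This is an illustrative reconstruction, not the dashboard's actual code; the `(requested_at, granted_at)` event shape is an assumption for the example.

```python
from datetime import datetime, timedelta

def avg_time_to_datacap(events, now, window_days=7):
    """Average (granted - requested) over grants in the trailing window.

    `events` is a list of (requested_at, granted_at) datetime pairs;
    the field shape is illustrative, not the dashboard's real schema.
    """
    cutoff = now - timedelta(days=window_days)
    durations = [g - r for r, g in events if g >= cutoff]
    if not durations:
        return None                       # no grants in the window
    return sum(durations, timedelta()) / len(durations)
```

A trailing window like this smooths out the kind of scraping glitch that briefly produced the 18-minute figure.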
A
This is one that we're happy to see decrease. We really want to keep driving these as far down as possible, and we have some ideas about that.
A
Looking at the allocation funnel again: this is still taking in Large Dataset notaries as well, which is part of the reason we're at 36 PiB, because it is counting each of those, for example 5 PiB to an LDN. Breaking that down to clients: 1.9 PiB has made it out the door to clients, which is phenomenal and an increase from where we were, and 0.75 PiB is sealed in deals.
A
Looking at some of our dataset metrics: we did get an update yesterday to change the naming for the LDNs, so that's exciting. Hopefully this helps make it a little bit more clear when you're looking at that notary dashboard.
A
So we've gone over to this format of "LDN #1" (because this was issue number one) plus the project and an affiliated organization; so, for example, LDN #22 with Internet Archive. At this point we're seeing 21 total client applications, with 54 PiB of DataCap requested in submission across all of those. Breaking down that funnel a little bit: of the client applications that have been approved and started to allocate, we have 742 TiB out to clients, specifically with large datasets, and from there 174 TiB sealed in deals.
B
So we are the company serving as the payment gateway and PSP service in Korea at the moment; we have the number-one market share among payment gateway services in Korea. We have been operating in Korea for 20 years now, and we are part of one of Korea's largest IT companies, called NHN Corporation, which now has more than 102 subsidiary companies, including us, spanning e-commerce, streaming services, IT and cloud services, data services, and all the fintech services offered in Korea.
B
So NHN is definitely one of the biggest IT corporations in Korea, with subsidiaries in Singapore, Japan, and other countries, including NHN KCP ourselves. We also have our own subsidiary company in Singapore, NHN KCP Pte., and we also run a joint venture with SKT from Korea and Thailand's national telecom company, True.
B
We have been operating that company for about seven years as well; I was in Thailand working for them for more than five years, and I just came back to Korea. Okay, so this is just a brief introduction of our customers: we provide payment services for the likes of Google, Amazon, Microsoft, Louis Vuitton, and other top companies in the world, providing local payment processing for them. So that's one of our key jobs.
B
We provide payment services for e-commerce across auto, retail, fashion, every kind of company. These three sections are our major business: the payment gateway service; the value-added network service, which provides the network between the issuers, acquirers, merchants, and card companies in Korea; and the fintech service, including things like a wallet service. Also, because our parent company runs one of the biggest e-wallet and data platforms, called PAYCO, we provide some data analysis services for them as well.
B
So this is just a brief explanation of our market share. We are definitely one of the top-market-share companies in Korea, which has around 200 registered PSP companies; the top three of us dominate at least about 70 percent of the market. So that's the PG/PSP market share in Korea, and our continued sales are growing quite rapidly.
B
When I first joined the company in 2014, our rank was about number four, but we have grown extremely fast over the last four years, starting with Apple. We have been acquiring solely for Apple for about three years now, and after we started with Apple, a lot of the top players came to us with that as a reference; we've been processing for Tesla, Google, Amazon, and all the big names, and that helped us grow quite fast in Korea. So now we have risen to the top, with the gap at around 10 percent or less.
B
Oh, so I'll just briefly explain what I'm going to do as well. That was just a brief introduction of our company. One of the things that I'm building and planning comes from the fact that big data, blockchain technology, machine learning, and data work are quite a big thing in Korea: the government is supporting a lot of companies to start those businesses within Korea and expand to the world.
B
There are a lot of government bodies providing all this data for free; they support you and even pay you so you can build a business yourself. So there are a lot of supportive programs through which I can get some decent data. And because we are a fintech company doing payment processing, I always thought it would be a lot of fun if we could utilize that data, gather some big data, and provide marketing and services to the customer.
B
So I was planning this because we have all the data processing ourselves; we have been operating in Korea for more than 20 years now, so we have all the customers' payment behaviors, geographical behaviors, and so on. We could utilize that together with the public data, like all the GPS information, and how dense the commerce population is, drawn from the images and videos the government provides. And not just the government, either.
B
So those are the things that I want to do. As an example, I've got the mockup for the system that I'm planning to build now. At the first stage we won't be able to have too much data ourselves, and we'll have to rely on data that governments and universities directly provide. So we want to build a platform where a potential customer can upload whatever they want to get data analysis on.
B
They can upload things themselves, like pictures, images, Excel data, or videos, and ask us to evaluate the data. We will evaluate it, compare it with our own database and with all the other regions, and report what those markets are like, how competitive the area is, and all of that. So this is the example: say I want to know about this section, so I upload some video or images of those subsections. That's the system I want to build.
B
So that's pretty much the big feature that I'm planning out and building. One of the key issues that I've been facing (I've actually been planning this for quite a long time, maybe since two years ago when I was in Thailand) is that if I want to store all this data in something like Amazon AWS, it's really expensive; it is almost impossible to provide this information to customers for free.
C
I definitely have some questions in terms of the utilization of Filecoin and your general strategies for onboarding business data. I see the whole failover thing, and I also see different types of data and different media probably coming in. What do you think is going to be the order in which you onboard your data, and how are you going to prioritize onboarding new data to Filecoin versus, for example, making more replicas to ensure long-term reliability and access?
B
Okay, I'm not sure if you can see this, but this is one of the examples that I have gathered from the official government. Sorry, this is in Korean, but this is just data that I've downloaded from the government website. This is the information the government provides: the establishment year, the business registration number and merchant name, the category of the business, what they are selling, their address, and the GPS information, all of that.
B
I have utilized this information to produce the graphs on the previous page. All this data is uploaded by the government for free; anyone can access and utilize it. There are tons of datasets; I just need to submit an application asking: can I use this data?
B
I want to make this a service as KCP and provide it to my customers, and that is okay; it is just an agreement that I need to receive. There's no payment involved, just an agreement, so it's actually quite useful. This is one of the key pushes from the Korean government to create a boom of first-generation businesses using big data, data mining, and so on. I'll show you one of the examples with video as well.
B
Okay, so this is another dataset that I've received, from Jeju Island (it's not from the central government but from the province). They provided drone images and also video, so I can put it into our system and record how many cars pass per day, how many people are counted per day, and all this data; then store this data publicly and make some useful information from it.
B
There are tons of datasets that I actually received for free, those images and also quite substantial image sets. These are the images that I have sliced for my analysis; it was about one frame per five seconds, so I captured those and can do the people counts and the car-pass counts and all this data. So there is a lot.
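The slicing step the speaker describes (one captured frame every five seconds) comes down to simple index math. This is a minimal sketch under that assumption; actually decoding the frames would use a video library, which is omitted here.

```python
def sample_frame_indices(fps, duration_s, interval_s=5.0):
    """Indices of the frames to capture, one every `interval_s` seconds.

    Pure arithmetic over an assumed constant-frame-rate clip.
    """
    step = max(1, round(fps * interval_s))   # frames between captures
    total = int(fps * duration_s)            # total frames in the clip
    return list(range(0, total, step))
```

For a 30 fps, 60-second drone clip this yields one index every 150 frames, i.e. the "about one per five seconds" sampling mentioned above.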
B
I've been looking into this quite a lot, and a lot of the government bodies and provinces are quite supportive of doing this, so they are more than willing to provide data to the public. As long as I can guarantee them that I will mosaic out the personal information, like people's faces or plates and so on, and as long as those security checks pass, they are willing to give us all the image and video data, as much as I want.
C
That's really cool; thank you so much for sharing, very insightful. I just want to give it another minute or two to see if there are any other questions in the room. Thanks so much for coming and sharing all this, I really appreciate it. To the notaries in the room: if there's anybody that would like to ask any questions, please feel free. Oh, we have a hand from Buchi; go for it.
D
Hi. So it seems like your platform is not ready yet, right; it's still being built right now. So I'm wondering how much DataCap you need in the first batch, to upload some data in the first place, and how much DataCap you need after that. We do want to know your plan for the application.
B
Actually, that's a really good question. It is not built yet. For the initial analysis we will need to store some data (I need some data to build my platform), so it won't be open to the public for at least three months. We calculated that it will require about 30 man-months to build, so I'm willing to put about six or seven developers onto this project and get it to a soft launch within about three months. But before we launch, I need to gather as much information as possible in the early stage.
B
Oh, maybe 50 TiB per week. I'll contact all those entities and see how much information I can get at the first stage, but I'm expecting about 50 TiB per week of uploads for our images and videos.
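As a back-of-envelope check on that 50 TiB/week figure, a first-tranche DataCap ask over the roughly three-month build could be sized as below. The headroom factor and the 13-week horizon are assumptions for illustration, not numbers from the call.

```python
def initial_datacap_estimate(tib_per_week, build_weeks, headroom=1.5):
    """Rough first-tranche DataCap ask, in TiB.

    Weekly ingest rate times the build window, padded with headroom so
    the client does not stall waiting on the next allocation.
    Illustrative arithmetic only.
    """
    return tib_per_week * build_weeks * headroom
```

At 50 TiB/week over 13 weeks with 1.5x headroom, the estimate lands at 975 TiB, i.e. on the order of 1 PiB for the first phase.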
C
Cool. That could be satisfied via the first allocation, or even via a notary directly, if that ends up being a better path. We have a hand from Eric; go for it.
E
Okay, good. So: can you accept a miner in China to store your data?
B
At the first stage I'm going to reach out to the first few, maybe three or four, miners in Korea, and see how it works. I'll probably speak to them about offline transfer at first, and I will talk with them about whether we can integrate for uploads, for example SFTP integration or CLI integration and all this stuff. I'll work that out, and as long as we make some progress with that, I'll try to reach out to other regions as well: China, America, anywhere.
E
Okay, that's cool. So, according to this data sample: is all the data video, or only the GIF pictures? Ah, actually, I get it.
B
I will need Excel data, text data, GIFs, images, and videos; those four types will be the major ones. I'll get as much information as I can and use it to start my platform, studying based on those numbers, videos, and images.
E
Okay, clear. So do you need to frequently retrieve that data? For example, some day's video which was taken by the drones, do you need to retrieve it back?
B
I don't think it will be needed at the first, early stage. Maybe after we run for a fair bit of time we may need to have some operational policy about retrieving data, but not for at least three to six months; I don't think retrieval will be required that often at the first stage.
C
Are there any other questions, or any notaries that want to explicitly voice support right now? We can document it. Of course, the LDN process itself is changing, but while we're in a transition phase I want to make sure that we go about things in the way we are familiar with.
C
I can't see any more hands; I'll give it 20 seconds and then we'll let them go. Unless there are any other questions, feel free to look at the GitHub link that I think Galen shared earlier; feel free to give it a thumbs up or approval there, or share it with me in chat and I can collect them at the end of the call. If not, or in addition to that: Johnson, thank you so much for showing up, really appreciate it.
F
Chloe from SINSO here; I am glad to have this opportunity to introduce SINSO to you. First I'll show you a video about SINSO, so you can watch.
F
Okay. We are focused on the patient's experience, enabled by a brand-new business logic layer. Our product enables patients to upload their medical images to the SINSO system; the system encrypts the uploaded image data, generates a private key, and provides it to the patient. SINSO turns the patient's images into digital assets: the patient data and doctor data become an NFT, which is not only used for doctor diagnosis on SINSO but can also be traded for medical research on the digital trading platform SINSO provides.
C
And just taking a look at the application, we already have a few notaries on it, for instance 1475.
C
It might be worth just readdressing it here: the data is okay to upload onto Filecoin, right? It's not private patient data?
G
Yeah, I have similar questions here as well. Medical data is very sensitive, and in Canada or Europe you have to be compatible with something like GDPR or HIPAA; even Amazon and Google do.
A
Yep, it looks like we may have dropped Chloe from SINSO. I know they have said in their application that the data is desensitized and public, and that they can provide a sample download. I don't know if that is different when they are storing it outside of China. Charles, I think that was part of your question, so it might be good to post that in their GitHub issue, just to confirm: is it different for storing with storage providers in regions outside of China?
A
We're back. SINSO, we had a question from Charles, one of our notaries, about storing this information outside of China. I know you've said in the issue that... oh.
A
Okay, let's jump through it; we have lots to cover and not a lot of time left, so we're going to do this part really fast: using GitHub Discussions.
A
We had been putting everything in Issues in the notary governance repo; we're going to try out this new surface that lives under Discussions instead of Issues. It could be a lower-lift way to start some conversations: you can upvote to increase the signal on those conversations and similarly add comments, and people can also reply inline to comments. So we can get some more discussion topics going before we translate them into an issue, and we'll use issues for the more specific, concrete proposals.
A
Every application is different, and process problems come up. So if you have an idea, something that you've been thinking about, or just a concern that maybe isn't a specific proposal yet, and you just want to generate some additional conversation async:
A
Let's try using that discussion forum and see how it goes. Quickly moving on: last week Protocol Labs and Filecoin Foundation hosted a design summit on Filecoin Plus, three days of intense, amazing strategic work. A couple of highlights. We're still working on a full summary of everything and we'll be able to share out more information async, but we wanted to give a teaser of some of the things that we were talking about and working towards, because we've heard from a lot of people that these are very top-of-mind questions for the program. One was around encryption: where we land on encrypted data and how we will be able to support encrypted data on Filecoin Plus. We talked a lot about supporting notaries doing business development for large datasets, and ways in which, for example, a notary working with a large dataset can store a replica themselves, and how we can track and encourage that process through tooling supporting the client onboarding user journey.
A
We really want to focus on consolidating some of our READMEs and documentation, and work on our tool surfaces, to make that very first step to getting DataCap even easier and clearer, focusing on the top of the funnel, and also to really minimize the overhead for notaries. What are the ways that we can make it much, much easier for notaries to scale how they're doing the diligence and the allocations, to decrease some of the tactical lift that a lot of the notaries are doing?
C
My mute button wasn't functioning; can you hear me? Yeah, awesome. So I think the remaining things on the slide that we want to touch on for sure are these two concepts that we've talked about a little bit in the last couple of weeks, one especially through the issue discussion that was happening in GitHub, but also in general, as you think about the role of notaries evolving to scale with the magnitude of operations that is likely to happen on the network in the medium term.
C
And in the long term. The first of these themes is ensuring that we as a community are biasing towards action and experimentation and, implied in this, iteration, where we get to do things, learn from them, and improve them. The LDN process, I think, is a very good example of this: we ran the first phase of the experiment, we got a bunch of feedback from clients, and now we're going to try another experiment; then we'll probably get a bunch of feedback, and then we'll change it again.
C
We have a lot of room to continue expanding, experimenting, and growing together to ensure that the program does indeed make the network more efficient and effective. The second theme is that we should transition away from where we are today, where the notary is a direct gatekeeping element in the flow of a client who's onboarding onto the network, something they need to go through in order to get the required resources and the KYC to get onboarded onto the network.
C
There are ways in which we can compensate, or also collect data over time, that should become part of how we decide, or how you decide.
C
Whether or not a client is doing the right thing or the fair thing on the Filecoin network when making deals, that is. So: switching over a little bit from this element of gatekeeping and more towards guardianship, where you continue to look at client behavior over time, and we as a community understand patterns and trends and set ourselves up for backing clients that should be verified. That, I think, is an important transition we should start to make, and it will be key to improving the client experience overall on Filecoin, because it's looking like a lot of the clients are coming through Filecoin Plus today, and it's probably going to continue being that way; so we should ensure that there is minimal friction both for you and for them.
C
With those two things in mind, I'd love to talk a little bit about the goals as well, some numbers that we were floating around that I think would be great to share with you and to get your thoughts and feedback on. We divided the goals into a few different categories, as follows. The first is volume, where we were looking at how much DataCap to make available.
C
This is a kind of difficult thing to calculate, because the LDN process tends to work more like a faucet, but roughly: how much DataCap do we think we want to target, as a community, as being made available to the clients on the network? The network today is around eight exabytes; it's well on its way to ten exabytes, and Fil Plus is operating within that.
C
It would be great to think at a magnitude higher than we're used to, and together start brainstorming at the level where, say, 500 terabytes needs to be shipped out to be shared with clients. It's a great thing to keep in the back of our minds as we think about the designs we come up with and how they should scale in terms of onboarding speed. You know that there's been an ongoing focus on TTD, which is our in-house metric of time to DataCap.
C
I think we talked about this a lot last governance call, but a client coming to the network and then having to wait a substantive period of time to be enabled leads to a very poor client experience. So the intent is to really focus on that, and find ways to enable clients today that are coming in at a small enough scale that we think it is safe to experiment with. So, specifically:
C
The data that we think we can collect over time will become an interesting conversation to have, to enable some of these more aggressive time-to-DataCap targets. Similarly, getting to a point where that first big tranche of an LDN application, say a 50-terabyte allocation, happens within a day as opposed to multiple days. I think some of these things will be directly impacted by process changes, and some will be impacted by behavior changes, on both those fronts.
C
I think we're looking at the notaries as the key driver of some of these active conversations and discussions as we continue to iterate on the different processes within Filecoin Plus. There's a specific goal and desire to also have some level of risk mitigation, with better dashboards and data and analytics; and then, generally, improvements to documentation and discoverability of the programs, ensuring that clients land here when they're coming to the network; general tooling as well; and then governance processes to ensure that everyone who's attempting to participate actively is given the opportunity to provide ideas and proposals that can improve the flow, both for stakeholders in the network such as yourselves and for clients that might be coming in.
G
Yeah, just a note: we've recently started approving Large Dataset applications, and we approved a couple of larger ones. This is different from a small application where I say I want to give you 10 terabytes. Because it's large, we'd really like to know, after they store the data, how the retrieval is, and whether the data is still qualified. It's a post-storage investigation into how everything is going, and it could even be routine work, like a recurring check.
G
We know that in Slingshot and other programs they have those spot checks, right, to keep people honest. We can see that this really differs from zero to 100; it's quite different across different teams. So in this case, when a dataset is saved to different storage providers, we want to know how it is going. It's also part of the guidance for the actions that we need in order to track that data.
G
So maybe it's too early; I know the ecosystem is still at an early stage, basically. But I know that some teams are working on this, and we just want to know which teams they are, to prevent them reinventing the wheel, so we can cooperate better on building a platform or sharing libraries, and we can also use them as a reference for how to apply them, and especially the timeline. We know that after one or two months...
C
Yeah, I think there are two excellent ideas there. On the retrieval issue: that is a very interesting concept. Like you said, we have the technology; we're using it for Slingshot, and I don't see a reason why we couldn't use something similar for Filecoin Plus. I think it would just be a matter of us deciding as a community that we want to use it. The code is open source, and we can definitely find a way to host it if needed, to have specific instances.
A
Should we? We have four minutes left; I think we should.
A
I think we should hit this, because we're going to be rolling these changes out this week. We've proposed these and ratified them, and I just wanted to reiterate this new process flow and then show some of the tooling. So, specifically, running through the process from the top:
A
A client submits an LDN in the same way, in the GitHub LDN repo. The governance team and a bot will audit it for basic completeness and flag any missing, incomplete, or invalid sections directly back to the client.
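The completeness audit the bot performs might look something like the sketch below. The field names here are hypothetical placeholders, not the real application template's fields, and this is an illustration of the idea rather than the actual bot's code.

```python
# Illustrative field names only; the real LDN template differs.
REQUIRED_FIELDS = [
    "organization",
    "data_owner_region",
    "total_datacap_requested",
    "weekly_allocation",
    "data_sample_link",
]

def audit_application(fields):
    """Return the sections that are missing or empty, to flag back to the client."""
    return [f for f in REQUIRED_FIELDS
            if not str(fields.get(f, "")).strip()]
```

An empty result means the application passes the basic completeness check; anything returned gets posted back on the GitHub issue.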
A
Notaries can view the application and perform their due diligence, and once those notaries have signed, the bot will post in the GitHub issue that the DataCap is granted. Then, for subsequent allocations, some additional improvements we've made: a bot will be monitoring the client's remaining DataCap, and once they have used 75% of their previous DataCap allocation, the bot will request additional DataCap automatically.
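The 75% trigger described above reduces to a simple check; this is a minimal sketch with assumed shapes (byte counts in, a plain dict out), not the actual bot's implementation.

```python
def should_request_more(allocated_bytes, used_bytes, threshold=0.75):
    """True once the client has used `threshold` of the previous allocation."""
    if allocated_bytes <= 0:
        return False
    return used_bytes / allocated_bytes >= threshold

def next_request(prev_allocation_bytes, used_bytes):
    """Sketch of the bot's loop body: trigger a new tranche at 75% usage.

    Re-requesting the same tranche size is an assumption for the example;
    the real sizing rules may differ.
    """
    if should_request_more(prev_allocation_bytes, used_bytes):
        return {"action": "request_datacap", "amount": prev_allocation_bytes}
    return None
```

Firing at 75% rather than at exhaustion gives the notaries time to review the posted deal stats before the client actually runs out.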
A
The goal here is for these large datasets not to run out: instead, as they get low, the next allocation request starts automatically, without having to wait for someone to make the announcement. The bot will also post the previous deal stats in comments, which will help with the audit diligence; notaries can then perform subsequent diligence from those stats, ask any follow-up questions, and sign the subsequent request.
A
So we have a screen-share video from the Keyko team, about one minute long I think, that walks through it again. The client will submit using a template, making sure everything is filled in (they can just copy and paste), and then the governance team will sign off that it is complete. It will automatically try to create that multisig, which creates another issue in the notary onboarding repo. Running through that application, it basically gets signed and marked completed, and then we're going to see a quick look at the back end.
A
This is a sample of another large dataset: it gets submitted, the notary is requested, an allocation is requested and approved, and then the subsequent allocation is automatically requested, and the bot posts a comment pulling data from the dashboards to show behavior. As you can see, it's maybe not the greatest example, because this is all simulated data, but we think these tools will really help the audit and speed up the process. Going from there: it's now 9:01, appreciate everybody.
E
Hi guys. To be honest, we haven't prepared the FIPs in English; we just translated them from English to Chinese, because when we meet the data providers and ask them to get help from a storage provider to store their data, they may get confused by some of the concepts. So that's why we wrote those proposals: to let people in the Chinese market understand what we are talking about.
E
This is our major purpose. Later we will work on it and translate it to English, but you can see it through the FIPs which we provided on GitHub.
E
So mostly we would like to keep it in Chinese, to let everybody know what we are doing, because people have wrong or differing understandings about junk data versus real data. The junk data, as we define it, is what we were generating using Lotus, where the program writes the data for us; the real data is actually real.
E
So later we will keep it updated in English on GitHub.
A
Okay, thank you. I think Kaitlin had to run to another meeting. I don't know if you have met Kaitlin Beegle; she is relatively new with the Filecoin Foundation, helping with governance as the TPM for governance, and she had some questions in the GitHub issue, so we can get some async conversation going back and forth there.
C
I just wanted to flag that I think we have SINSO back on the call. I know Charles had a question for them around data privacy, especially as it extended to storage abroad. Yeah, Charles, do you want to restate your question and see if we can chat about it now? If not, we can follow up on GitHub.
A
All right, thank you all. We'll stop this recording. It will be available, probably in the next few days, on YouTube, in conjunction with the one that we'll have this afternoon, so you can catch the recap.
A
All right, welcome to the notary governance meeting, September 14th, round two. These will both be posted together with the one we had this morning: same agenda, different audience. In the one this morning, we did have an opportunity to see and talk to a couple of different clients that have some large data set applications.
A
So if you were not able to attend the morning call, please check it out; and if you're watching this, then you probably just watched those, so that's redundant. Jumping right in, we have a jam-packed agenda. We'll, as we usually do, go over some metrics and statistics for the program. I don't know that we have any clients that will be able to join us in this time slot.
A
We are introducing a new mechanism using GitHub Discussions; we'll talk about that. Deep and I, along with some others from Protocol Labs and the Filecoin Foundation, had a three-day design summit last week. Some of you were able to join for some of those sessions and helped share some great insights. We had some announcements from that summit rolled into some of the six-month program goals, we'll show some of the updated tools that we'll be launching in support of the new LDN process, and then we'll have time for open discussion. So, action packed; we're gonna jump right in. Looking at our metrics: pretty sure that, as of an hour ago, our dashboard is down. It went down sometime this morning, along with a couple of other online tools, Spacegap, Fil Green, and some others.
A
So we pulled this screen grab yesterday for the current metrics, but I believe, as of right now, our interplanetary Filecoin Plus dashboard is still down, so we're troubleshooting that. We were able to update the APIs and the crawlers to get a more accurate average time to DataCap. As much as we were very excited to see this 18 minutes, that is, you know, a reach goal for us.
A
It
is
not
accurate
to
the
current
reality,
taking
into
account
the
different
mechanisms
that
people
have
to
get
to
datacap,
so
we're
seeing
a
more
realistic
seven
days.
We
are
seeing
a
drop
in
the
average
time.
First
response
from
a
notary,
it's
very
exciting
there.
It's
not
a
metric.
We
really
care
to
keep
driving
down.
So
thank
you,
notaries,
for
your
work
on
that
one
and
still
seem
about
a
similar
time
to
data
cap
allocated
on
chain.
A
So, looking at our funnel, this is again similar to two weeks ago. This funnel is taking into account the DataCap that has been granted to those large data set notaries, so we have some of those five-pebibyte multisig notary accounts in there. In total, we are looking at 36.18 pebibytes with our notaries, then a drop down to 1.9 pebibytes that has been allocated out to clients, and then 0.75 pebibytes stored in deals.
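As a quick check of those funnel numbers, the stage-to-stage conversion rates can be computed directly (a back-of-the-envelope sketch; the figures are as quoted on the call and purely illustrative):

```python
# Funnel figures quoted on the call, in pebibytes (PiB).
granted_to_notaries = 36.18   # DataCap granted to notaries (incl. LDN multisigs)
allocated_to_clients = 1.9    # DataCap allocated out to clients
stored_in_deals = 0.75        # DataCap actually sealed into deals

# Conversion rate at each stage of the funnel, as a percentage.
notary_to_client = 100 * allocated_to_clients / granted_to_notaries
client_to_deal = 100 * stored_in_deals / allocated_to_clients

print(f"notary -> client: {notary_to_client:.1f}%")  # ~5.3%
print(f"client -> deal:   {client_to_deal:.1f}%")    # ~39.5%
```

The roughly 5% notary-to-client rate is the "funnel drop" the speakers want to keep improving.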
A
So this is an increase over two weeks ago; we're happy to see that, and we want to keep improving these funnel drop-offs. Some of our direct notary activity statistics: as of yesterday afternoon, there were 57 open issues and 13 new applications. Some of these are duplicates, so just be aware that this number of 13 is just from a quick glance; we know that some of those are duplicate entries by clients. Checking the client onboarding repo for closed issues with the label "state: granted", we have 185 client applications.
A
We want to keep seeing that number go up over time, and our auto-verifier is still cranking away, doing its great job. As an outlier, 441 client approvals have come through the Glif auto-verifier; good job to the Infinite Scroll team, chugging away. Looking at some of our large data set metrics:
When that dashboard comes back online, you will see that we were able to make some nomenclature changes that I think will make this a little more readable, so you can see which of these notaries are representing those LDNs.
A
So we now have "LDN" plus the issue name and affiliated organization. That will help, because we anticipate that the number of verified clients on those LDNs is not going to increase very much: we'll have a few that have more than one, but a lot of these LDNs will potentially have only one client, whereas our other direct notary activities will have many more verified clients.
A
So at this time we have 21 client applications going through that LDN process, with a total of 54 pebibytes of DataCap requested. Checking the metrics on those LDNs really quickly yesterday afternoon: we are seeing 742 tebibytes allocated from the LDN notaries to clients, and 174 tebibytes from those clients sealed into deals. So, exciting metrics there. Pausing and checking our participants list, I don't see any clients, so we're going to jump ahead and talk briefly about using GitHub Discussions.
So this is something new. We had previously been using GitHub Issues for lots of things, and we are rolling out a new test that you can see in that top toolbar: GitHub Discussions. We think this will allow us to start more asynchronous conversations about things that may be concerns or ideas, or something that isn't necessarily quite ready to be a fully fledged issue or proposal yet. So we think, you know, orienting these around maybe a problem until it becomes a specific solution will allow people to reply and thread and kind of brainstorm together, and we'll use Issues for things like concrete proposals for ratification, notary applications as we continue, and different process flags; we post an issue when there's a governance call.
A
My call to action there is: try to create distinct discussion threads for your different topic ideas. We're going to try and bias towards as many distinct discussion threads as possible, rather than maybe one giant single discussion.
A
Okay. Last week, Deep had to put up with three days with me in person; it was great, and we covered a lot of ground. We have been working hard at consolidating and capturing all of that information and getting it into a more robust set of summary documentation; we're just not done with it yet.
A
So we wanted to share just some of the highlights, the highlight reel, from last week, to let you know that we're talking about a lot of things that have been brought up before. So, for example, we had a conversation around encrypted data and what it would look like for Filecoin Plus to be able to support encrypted data. Again, these are just things that we were discussing, and not necessarily things that we feel we have an answer on right now.
A
These are just topics that came up to generate more conversation: supporting notaries doing business development directly themselves, especially for large data set onboarding, and some of the ideas around storing a replica yourself if you are a notary bringing some of those large data sets to the table.
A
There are some additional tooling topics that we know we want, and we're really landing on focusing on the front end: how do we go from a client that discovers the Filecoin network to them getting DataCap, focusing on that onboarding top of the funnel at this point in time. And with that, how do we specifically minimize some of the overhead for notaries and the various tactical administrative work? How do we support making that easier to do?
C
Thanks, Galen. So, I mean, one of the things that has been a general topic of conversation in this community, specifically of course with you folks, but also in general on GitHub and other places, has been looking at ways in which we should redefine or evolve the ways that we're thinking about the program, and how we should think about next steps.
C
Evolution, changes, et cetera. There are a couple of themes that have started to emerge, and I think focusing on them, and actually making them a paradigm, or a primary thinking framework, that we use to approach general problem-solving in the community, would make a lot of sense. A lot of the conversation we were having last week sort of aligned with this as well, and two of those themes are pulled out as bullets in this slide. Firstly (and Danny, you specifically called this out as well, in the conversation that was happening on issue 217), this idea that we should bias more towards action and experimentation, and the unsaid third one here is iteration. We should operate knowing that we're so early, right, in the general time horizon of the network and in the general scope of the program, where the network today has already done exabytes, and Filecoin Plus is still at the single-digit pebibyte scale of deal-making and the double-digit pebibyte scale of DataCap.
C
We have a long way to go to enable demand that comes in at an order of magnitude higher than where it's at today, to be able to leverage the supply. But then also realizing that, you know, the network is only a year old, and Filecoin Plus is only a year old: we have made decisions in the past together as a group that I think we've revisited and changed, and it's almost always been for the better, and we've continued to learn and improve. So we should keep that sort of thing at the forefront.
C
As we think about, you know, reducing the overall perceived friction to experiment or test with something, just being a little deliberate about bounding tests, as opposed to necessarily shying away from doing them, might be a really interesting take, and an interesting thing to keep as a core tenet of designing solutions. So I know, for example, Meg's been proposing a bunch of different ways in which notaries could engage at different levels, and, you know, instead of us spending too much time, let's gate it, maybe based on time or, in some cases, the amount of DataCap that goes through, or something like that. And we can start evolving this more as an experiment-based or iteration-based approach, so that we can continue to evolve with the network, because it's also likely that the shape and the nature of Filecoin Plus is going to continue to change drastically, right, given that we're probably in decimal points of the time horizon of this program, or of the impact of this program.
C
We have a long way to go, and so we shouldn't fall into the trap, and I've made this mistake myself a lot, is what I'm trying to say, where I often think: if we've established something and it seems to be working, it could be better. So then I start thinking about ways in which the currently established process could be improved, as opposed to actually just stepping back and saying: what are we actually trying to solve for? We do have the buffer to experiment.
C
We do have the opportunity to do things completely differently and give them a shot. So I just want to encourage all of you as well, and let this serve as a reminder that, together as a community, we are capable of doing awesome things, and we have to actually do awesome things to ensure that we can enable client success and minimize friction to onboarding onto Filecoin in general. And then the second one, which I think you've heard from me and from others, and from Meg in her post, and from other people on GitHub as well:
C
This is the idea that we should think of the notary role itself as transitioning from being a gatekeeping role to being a guardian role. We've used a few different terms, I think: "watch persons", "watchdogs", sort of observing kinds of terms, but I think "guardian" sounds nice; it's got the alliteration. Galen in the chat likes to call this the "DataCapacitor" role, because you're giving DataCap and you're a definite DataCapacitor, which is pretty funny. But yeah, in general, instead of feeling like we need to put you in a position where you only get to do this upfront due diligence and upfront gatekeeping of access to the network:
C
We should think about balancing out what we can do upfront with regards to interaction with the client, while also enabling the client, with what we can do over time, and combining those two things to get a much more holistic picture of what is considered fair use of the network, or good behavior on the network, or reasonable usage of DataCap. All of these are different facets of the same question, which is really: is this client using Filecoin properly?
C
Maybe their plans are, but then also looking at the execution on the network, which of course is going to influence the way in which we think about tooling, about auditing, about how, you know, DataCap might move, whether or not DataCap is a reversible construct; I know there's a bunch of FIPs that have floated around. And so just thinking a little bit more about the longer-term implications of what it means to be put in a position where it's some amount of upfront work, then some amount of longer-term work, and then utilizing software to make that as easy as possible and give us as holistic a picture as possible, to serve effectively as guardians of the network, of the program, et cetera.
C
Bearing that in mind, we also tried to set some sort of targets, or metrics; I think metrics is probably a better word. These are things that we can work towards as a community in the near term, and so, probably on the next slide, we should discuss the different parameters.
C
So these are some that we decided we'd like to propose to the group, and I would love to hear your reactions. These are along a few different parameters. The first is volume, volume being the total amount of DataCap actually being made available to clients, and the number that we thought might be good in a six-month-ish range might be 500 pebibytes. To give you a scope of what that means: that actually means about 10x-ing the program in the next six months, which is an order of magnitude above where we are. But as we bear in mind the principles of the previous slide, it doesn't actually seem like that much, because from the lens of how much supply is available in the network, we should be striving towards ensuring clients can store a significant amount of data successfully on the network. The second is speed, which, at least personally, I think is the one that probably matters more in the near term. You're all familiar with TTD; for those of you that aren't familiar:
C
It stands for "time to DataCap". That's become a sort of internal metric in this program, where we think about how quickly a client actually receives DataCap and can be enabled to make deals on the network. Today, that tends to happen in three ways, right: you have this automated verification, which takes like 30 seconds, and you get 32 GiB; then you have this middle ground of being in touch with a notary, and then, depending on the responsiveness of that notary:
C
But we were just thinking in general, in terms of giving a client an opportunity to actually get moving as fast as possible, and then also to demonstrate to the group, whether that's the group of notaries or the community at large, that what they're doing is reasonable and fair use of DataCap: we should allow them to be enabled, perhaps with a lower bar up front, paired with more strict auditing and risk-mitigation practices.
C
And so we wanted to propose time to DataCap along two dimensions. You have this below-one-tebibyte range, which is potentially risk-safe, where we do enough work together on upfront KYC plus audit tooling, et cetera, such that we feel it's a safe enough amount for a client to go and get that kind of DataCap in a very short amount of time, say less than an hour. And then, for the next order of magnitude up:
C
We talked about getting people in in about a day, and that could mean improving processes in addition to also investing in tooling.
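Those tiered time-to-DataCap targets could be expressed roughly as follows (a minimal sketch; the exact tier boundaries, the function name, and the upper cutoff are illustrative assumptions, not a committed policy):

```python
# Illustrative sketch of the proposed time-to-DataCap (TTD) targets.
# Tiers paraphrase the discussion: tiny requests are auto-verified in
# seconds, sub-1-TiB requests should clear in under an hour, and the next
# order of magnitude up should clear in about a day. Boundaries are assumptions.

GIB = 2**30  # one gibibyte in bytes
TIB = 2**40  # one tebibyte in bytes

def ttd_target(requested_bytes: int) -> str:
    """Return the target turnaround for a DataCap request of the given size."""
    if requested_bytes <= 32 * GIB:   # auto-verifier territory (32 GiB)
        return "seconds (automated verification)"
    if requested_bytes < 1 * TIB:     # "risk-safe" tier: upfront KYC + audit tooling
        return "under an hour"
    if requested_bytes < 10 * TIB:    # next order of magnitude up
        return "about a day"
    return "notary / LDN due-diligence process"

print(ttd_target(32 * GIB))   # seconds (automated verification)
print(ttd_target(500 * GIB))  # under an hour
print(ttd_target(5 * TIB))    # about a day
```

The point of the sketch is that turnaround expectations scale with the risk of the request, not that these specific cutoffs were agreed on the call.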
Of course, we also tried to talk a little bit specifically about supporting aspects of the program, like improving documentation and improving the discoverability of the program, given that it seems most clients are now coming onto the network via Filecoin Plus, so ensuring that they're actually able to get here efficiently and effectively, understand the purpose of the program, and leverage it.
Tooling: the experience of going to the Filecoin Plus app to apply for DataCap, investing in that a little bit to improve the overall client onboarding experience. And then, of course, investing in governance processes to ensure that ideas are proposed, but also that individuals feel enabled and supported, with the ability to actually push things across the finish line: to ratify proposals and actually get them moving, as opposed to just constantly discussing for extended periods of time until we arrive at some sort of consensus. So, just learning how to work together more efficiently, and going back to these principles of wanting to experiment and iterate, enabling us to do that with less friction over time. Yeah, I wanted to pause and just hear reactions. I would love to hear whether you think these seem reasonable or not reasonable, ideas for other things we should be thinking about as a community, and any other thoughts you might have.
H
I think it's ambitious, Deep, but we should have a crack at it.
C
I think they're almost indicative of, again, design principles in some ways, where we should operate knowing that this might be an order of magnitude we want to reach, and if it happens in, like, seven months, eight months, that's less material; it's more that we built a system that scales to ensure that it could have happened, right? Yeah, one element of 500 pebibytes is that demand also needs to show up to ask for 500 pebibytes of DataCap, and that's a whole different problem, right?
C
I think that's what I see in the chat as well: how can we get enough applications? I think that's a great question.
C
I think, implicitly, what this means is that we're also making it a point to enable better business development for individuals in the ecosystem, whether that's miners or notaries, or even other stakeholders: reducing friction and barriers for them, so they feel like they can make it easy for their clients to come and get DataCap. And Galen, I think you touched on this briefly before, but I do think that becomes, like, an implied priority kind of thing. But yeah.
A
Yeah, I was also going to say, too, that the 500 pebibytes is a bold reach goal. A couple of other things to call out as reminders as we think about these: that's not necessarily 500 pebibytes of unique data, so think about how we can also, you know, support and encourage clients that are coming to us to have best data-replication practices. We've seen some LDNs come in that say, you know, "we want to store two to three copies of our data set", and maybe finding ways to help encourage them to have greater geographic distribution and more replicas can also help.
A
The same goes, you know, for some of those smaller and medium-sized clients doing the same thing. But yeah, we also think that this is a big, bold direction and a reach goal. Danny?
I
So, this is all great, and really it's kind of what I think would be a fantastic move for the whole process. I'm just sort of mulling over external groups that maybe would be able to help us with this. I feel like we're trying to solve a lot of very unique problems, but maybe they have sibling problems that other people are trying to solve, and I'm sure you've thought of this. The two that sort of spring to mind: first, KYC. A bunch of what we're trying to do is build a really fast and efficient and easy KYC process, particularly if we move to a guardian role, right? Because what we really want to do, if we're in a guardian role, is make sure those people don't just come back and do it again. And the other part of it that I was thinking about is what happens after we've onboarded a lot of these clients.
I
These clients, you know, are trying to problem-solve, and we should really have a way, maybe, of directing them to resources and third parties that can help them put up that kind of data. I'm sure they're working on it as it is, but I do feel that in some ways we kind of sign off on the process and then just leave them.
I
And I'm not saying that we should be, you know, following them through the whole life cycle of it, but I do think that maybe there's a way that we can say: your next step is to talk to these people, and these people, and these people, so they come away feeling like they know what to do next, which gives them a better feeling about the whole process.
A
Yeah, we had some rough ideas around some of that, like what are some other ways, and who could we be talking to around, you know, best auditing practices and, like, fraud-watch kinds of tooling and metrics. So we'll be putting out, probably using a discussion topic, a call to see who in our community has some good people to talk to first, but then also, what are other:
A
You know, ways that we can look for other examples of how people do KYC and that sort of diligence process. And then, also, a question that has come up a few times from storage providers is how we support storage providers doing more of their own outreach to verified clients. So a near-term proposal that we're thinking about is putting a little checkbox in the DataCap application where a client can say:
A
"I would like to opt in to communications from storage providers", and then basically putting that list in front of storage providers, to say: here are clients that are getting DataCap that would be willing to hear from you. Maybe that's just another way that we can help scale our impact on what you're talking about, Danny, those kinds of "sealing the deal" steps once they've got the DataCap in hand. I think, XnMatrix, you had a hand up.
J
Yeah, thanks, Galen. Can you hear me okay? The thing is, I think there aren't nearly enough applications for notaries like us, and I guess the things that we do to lower the barriers for the applications, and for notaries to deal with all these things, are very good, but still: how do we increase the application amounts?
J
Do these proposals directly lead to an increase in the amounts? I'm putting a doubt here. And as for the vertical-field things: most notaries, as far as I know, have some sort of basic knowledge of IPFS and Filecoin, and I'm wondering if it's possible for us to introduce and invite some enterprises in certain fields, and educate them, or bring out some benefits, to make them become one of the stakeholders in our network. Like, we do some active work to invite more people to join us, and at the same time we should do all this work to improve our network. That's the point.
C
I have some thoughts, if you don't mind, again. So I think your point on needing to scale the group of notaries itself is extremely accurate. For sure, we need to find a way to make that role both a little easier and better supported, and at a higher scale, so that is something we should actively think about. I don't think there's an easy solution to this, but maybe it means reducing the application difficulty, for example. Or, today:
C
...ended up being legit, properly filled out, all the info was there, and out of those, I think like 22 became net new notaries. So that's, you know, more than a 60% drop-off rate between who applied and who actually was accepted as a notary, and that was mostly because of limitations that we were putting in place as a community on how much DataCap we wanted to see circulated, and how notaries had been performing in the past.
C
The other thing I wanted to call out, though, is that we're also looking at the LDN process as, I think, an experiment in how a lot of Filecoin Plus could run, too, right? Because we have this mechanism whereby you now have a set of notaries that are responsible, that are acting as sort of watch-people, watch-keepers, guardians of a particular client who's onboarding onto the network.
C
That client is receiving DataCap at orders of magnitude beyond what we've ever seen before, right: five pebibytes plus. Sure, it's getting dispensed at a rate that's much lower and gated, but that's part of what makes it feasible for notaries to serve as guardians. They get their five-pebibyte allocation, and we have currently seven, soon to be like 23, notaries watching it and getting updated reports. What could we have done if we had 30 notaries, or 40 notaries, or even with 20 notaries?
C
What could we have done if the client got 20 pebibytes to begin with, because, say, they had five pebibytes of data but they needed to store four replicas, five replicas of it? So it actually is not that far away from where we are. I think it's just a matter of learning how we can scale what we have, plus redesigning some of these things in innovative ways to ensure that we reduce the burden on both notaries and clients.
C
I think Danny had this excellent point that one main thing we're effectively doing is KYC, and there are a lot of organizations out there that do KYC, even in our web3 world, even things like exchanges that have a KYC process that then limits how many transactions you can make, for example, or how much money you can move in and out of an exchange. It's not that different from us saying:
C
Oh, here's an onboarding flow for a client that we all sort of agree on as a group of notaries. Notaries set policies, and clients comply with those policies upfront with X amount of time invested, and then after that it just becomes a matter of watching how that client operates over time. So I think that's definitely part of it. Then there's this other side of scale, which is the clients that actually need DataCap.
C
Today, if you look at the DataCap that's available, not counting the LDNs, just looking at notaries, there's something like eight-ish pebibytes with individual notaries, and out of that, I think only about 30 percent has been given to clients, per Galen's slides at the start of this call. So bringing in more clients, in and of itself, I think becomes a pretty interesting problem, where not only do I think you're absolutely right that we should probably be doing some investment in education, bringing these clients to the network and showing them ways they can participate, but also, I think, there's this aspect of us providing feedback to the underlying implementations, and ensuring that the experience that we're building for clients is as low-friction as possible, so that those clients' stories and testimonies can compound on each other and increase:
C
The general network effect that exists towards moving to web3 technologies, or moving towards Filecoin as, like, your storage solution of choice. And so I think there are a few different parameters that we can use to optimize this, and then there's just the general BD stuff. I know Meg just proposed in the chat, right, having an actual market assessment that the notaries can work on together and get behind, and rally BD efforts behind, for people that are interested in engaging in that way.
C
That could lead to some really interesting insights on how we choose to prioritize our time as a community for the BD work that we would like to do, to bring net-new clients into the ecosystem, or net-new companies that could graduate to leverage this ecosystem that we're building together. And then, again, scaling these things down a little bit: 500 probably sounds like a massive number, but the moment you say, oh, a legitimate client probably needs five different replicas of something, now you're down to a hundred, and Filecoin as a network is already at, like, 20 to 30 pebibytes of data being stored. We're not that far away.
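The back-of-the-envelope math here is easy to check (a sketch using the round numbers mentioned in the discussion; all figures are illustrative):

```python
# Rough numbers from the discussion, in pebibytes (PiB).
volume_goal = 500        # proposed six-month DataCap volume goal
replicas_per_client = 5  # a legitimate client may want roughly 5 replicas

# Unique data implied by the goal once replication is accounted for.
unique_data_needed = volume_goal / replicas_per_client
print(unique_data_needed)  # 100.0 PiB of unique data

# The network already stores roughly 20-30 PiB, so the gap to the goal
# is smaller than the headline 500 PiB suggests.
```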
C
I think it's more a matter of reducing the friction, educating folks, ensuring that we do things in a deliberate and targeted manner, investing the time upfront to align ourselves in certain directions, and then actually getting to execute together. So I'm pretty excited for this next few-months period and how we'll be able to achieve these things together; I think there's already a bunch of ideas floating around.
H
I think, yeah, that all makes sense. We do need to uplift, like XnMatrix said, the demand side; so far we're building supply. There's one thing I suggested in the longer term with you guys last week; I didn't put it in the channel. I think we could really benefit from an end-to-end journey map that plots everyone in that journey.
C
I think there are a couple of other thoughts in the chat. Neo, you dropped a link to a discussion; if you want to chat about it or share any thoughts right now, the floor is yours. If not, we can probably proceed to the next topics of discussion on the agenda.
A
All right, kick this back over to me. Shifting gears from some of the high-level strategic conversations and pulling it back down to where we are tactically: some LDN tooling updates.
A
The bot will create the multisig with all of the active notaries as signers and the threshold set to two. From there, the bot can automatically, in that other repo, propose notary status for the multisig. The governance team can send a message, either a Slack DM or some alert message, to the key holders.
A
They
can
then
sign
and
grant
that
notary
status
once
that
happens,
the
bot
can
automatically
request
the
first
allocation.
This
is
another
thing
that
we've
we've
seen
some
of
that
slow
down
back
and
forth
of
not
knowing
when
the
root
key
holder
step
had
completed,
so
we
can
automate
that
process.
Bob
will
initiate
that
first
request.
The
two
notaries
at
that
time
can
view
and
perform
the
due
diligence
before
signing
and
get
a
data
cap
granted
message
in
the
github
issue
from
there
for
subsequent
allocations.
A
This
is
another
exciting
thing.
We
have
a
bot
that
can
monitor
data
cap
at
client
addresses
to
be
able
to
then
once
they
have
hit
a
threshold
amount
go
ahead
and
initiate
their
subsequent
application.
One
of
these
we
have
been
kind
of
made
aware
of.
We
don't
want
to
have
happen
where
a
client
is
chugging
along
through
their
data
cap
allocation
and
then
runs
out
due
to
it.
You
know
just
a
timing
issue
or
or
something
happening,
an
error
in
the
process.
A
So
if
we
can
kick
off
that
automated
subsequent
allocation,
when
they
are
at
a
certain
threshold
level,
we
think
this
will
help
the
bot
can
also
post
some
previous
deal
stats
as
a
comment.
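The threshold check that triggers a subsequent allocation could look roughly like this (a minimal sketch; the function name, the client address placeholder, and the 25% trigger level are illustrative assumptions, not the actual bot's API or policy):

```python
# Illustrative sketch of a threshold-triggered subsequent DataCap allocation.
# All names and the 25% trigger ratio are assumptions for illustration;
# the real bot's interfaces and policy may differ.

def check_and_refill(client_address: str,
                     last_allocation: int,
                     remaining: int,
                     trigger_ratio: float = 0.25) -> bool:
    """Decide whether to file the next allocation request for a client.

    Returns True once remaining DataCap drops to or below
    trigger_ratio * last_allocation, so the client never fully runs out
    while waiting for the next grant.
    """
    threshold = int(last_allocation * trigger_ratio)
    return remaining <= threshold

# Example: a client granted 100 TiB, now down to 20 TiB remaining.
TIB = 2**40
print(check_and_refill("f1clientaddress", 100 * TIB, 20 * TIB))  # True: file the next request
print(check_and_refill("f1clientaddress", 100 * TIB, 60 * TIB))  # False: still plenty left
```

When the check fires, the bot would open the subsequent allocation request and attach the client's prior deal stats as a comment for the signing notaries to review.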
So this can help with some of the auditing that we're doing right now somewhat more manually; you know, we can put it in there for the notaries to see at a glance in the GitHub issue.
A
Notaries can then perform diligence based on that comment, ask any questions, or pause for any reason, and then, again, two notaries can sign the subsequent allocation request. So, to see it in action in testing: we still start at the same place, in that GitHub repo, submitting using the template.
A
Once that goes through, we see that the request was approved and the label changes to granted. Process complete, smiley face. So I wanted to take a second and show you a sample as well; this is in the testing repo. Once that multisig is requested, it gets created and that amount of DataCap is requested.
A
We were going to take it to open discussion. We heard a lot about the different goals, but at this point let's see if there is any show of hands about other top-of-mind issues, questions about that LDN tooling update, or other questions about the summit or the program overall.
J
Hello everyone. Now that notaries are not full-time, clients can't know when they will be online. As I see it, reducing application time is a good way to attract clients. Maybe we need to change the way of submitting an issue on GitHub into a more real-time application method.
C
Yeah, I think, if I understood that correctly, that makes sense. I think having some sort of process by which the amount of effort that a client puts in is correlated with the amount of DataCap that they get, or with what they're able to actually do on chain, might be an interesting approach.
C
You know, sometimes clients don't actually need that much DataCap to get unblocked immediately, or don't actually need that many deals to do their PoC or their first test on the network. So finding ways in which, with very little effort from them and very little effort from a notary, we can get them unblocked is interesting. Our response to that today is the 32 GiB auto-verifier, but 32 GiB is just one deal.
C
That doesn't really let a client do anything, and it doesn't really allow us to gather data on what that client would have done if they had more DataCap. So maybe we look into more complex, but still automatic, verification at a slightly higher scale. That could be interesting: without specific human intervention, we do auto-verification at a higher degree where clients are immediately unblocked. Is that sort of in line with what you were talking about, or is there something else?
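One way to picture "auto-verification at a slightly higher scale" is a tiered ladder, where each completed round unlocks a larger automatic grant before a human notary is needed. This is purely a hypothetical illustration of the idea under discussion; the function name and tier sizes are made up:

```python
GIB = 1024 ** 3  # one gibibyte in bytes

# Hypothetical auto-verification ladder: each completed round unlocks
# a larger automatic grant, capped before human review kicks in.
AUTO_TIERS_GIB = [32, 64, 128, 256]  # made-up sizes for illustration

def next_auto_grant(completed_rounds: int):
    """Return the next automatic grant in bytes, or None once the
    client has exhausted the auto tiers and needs a human notary."""
    if completed_rounds < len(AUTO_TIERS_GIB):
        return AUTO_TIERS_GIB[completed_rounds] * GIB
    return None
```

A ladder like this would let a new client run more than one deal without human intervention, while still capping total exposure and producing on-chain history a notary can audit at the top tier.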
A
I think that was also what I was understanding as well.
A
I think we've also heard from a number of people that GitHub is not the easiest place to start. There are also geographic issues, certain areas where GitHub is blocked or not available, and so we're thinking about other ways that we can have tooling.
C
I have a separate hand. This is to directly respond to Meg's prompt in Slack, but also, in general, sort of rehashing your slide from earlier, Galen, on using GitHub Discussions. I just want to ask, not to put Meg on the spot: Meg, do you feel like, for those three topics that you proposed in Slack?
C
Are you comfortable starting each of those off as a separate GitHub discussion, because I think they're individually worthy of that? Or would you rather have the opportunity to hash that out further in Slack, or in a governance call, before taking it into a written format on GitHub?
H
Look, I think, to follow it through, GitHub is fine, and, you know, it's an experiment, so we can have a go and see how it works out. So why don't we start with that? You know, maybe this forum is a good place to present a one-pager, or a two-to-three-minute "this is the topic, let's vote, let's move forward" kind of thing, so that might be another way to accelerate making decisions. Yeah.
C
I think the mental model that I have in my mind is basically that Discussions serves as the asynchronous version of what this room could have been had, say, you liked us better and we all wanted to hang out a lot longer, which clearly I don't think is the case, especially for Galen. As a result, we end up using GitHub and writing these things down, and then I think Galen is pitching.
C
Discussions is an alternative to Issues because it allows for threaded responses, which I think makes a ton of sense. So we get to do all that stuff async, and then we have this slide that was just up that said "open discussion", which is maybe the point at which we say: oh, I wrote this discussion topic, here are the responses, or let's have a more synchronous conversation on that stuff. And then, once that.
C
That is more problem-oriented conversation: let's identify the different approaches to solving a problem, let's understand the problem, get on the same page about the direction. And then, once we actually start moving towards solutioning and specific proposals, that, to me, is when you have an issue, and then we should have a dedicated part of this gov call where we address those issues and do the mechanics of voting, agreeing, disagreeing, earlier on in the call.
C
Had there been any issues to address, they would have been addressed. So that's kind of what I'm imagining we're going to try doing. Does that sort of map right to you, Galen, and Meg and others? Does that sound about okay? Cool, sweet. Danny looks very happy with us.
H
I do, look, I actually have one operational one: I think it's the FilSwan one. I might just connect with you guys after about where we go with that next.
C
Is it, is this stuck waiting for root key holders?
H
It's been a long thread, yeah. I'm just not sure where to go with it next. Okay.
A
I think it was technically at six notaries signed on to FilSwan, but as we land issue 217 and implement it, it will roll into an LDN with all the notaries and a threshold of two, just like all of the existing LDNs: we will be rolling all of the existing LDNs that were previously approved or currently submitted into multisigs with a threshold of two.
H
I think it was more about the dialogue. I'm going to Slack it to you after this; just have a look.
C
Yeah, we'll review it in the notification.