From YouTube: Filecoin Plus - Aug 31 Notary Governance Calls
Description
Filecoin Plus governance calls, same agenda in two timeslots (8am & 4pm PT) to allow for broader geographic attendance. On these calls, we discussed recent DataCap allocation and application metrics, heard from various clients applying through the Large Dataset process, continued our discussion of Issue 217 in the Notary Governance Repo (proposed changes to the LDN process), and allowed for open discussion from notaries on current ideas.
Want to get involved in Filecoin and Filecoin Plus? Check out our community Slack channels at https://filecoin.io/slack, and learn more about Filecoin Plus at https://plus.fil.org/
A
Great, hello everybody, welcome to the Filecoin Plus notary governance meeting. We're recording this one. There will be another one of these calls later on, in a couple of hours, and we'll record that as well. Both recordings will get uploaded and posted. Hopefully we'll get some notaries that are able to attend one or the other. The goal there is to try and get more coverage, so we'll see how this goes.
A
We're gonna jump in; we've got some metrics to review. Then we're actually going to reorder things a little bit for this meeting. We have some client presentations that we'll do right after the metrics, then we'll go into the process discussion for the large dataset notary process, and then open discussion at the end.
A
So from two weeks ago, we were again seeing those times to DataCap and times to respond driving down. Currently, if you go to our dashboard, it's going to say that the average time to DataCap is about 17 or 18 minutes. While that's aspirational, and we like to see those numbers, it is a little bit of an error; we're changing some of the calculations and where we are pulling these things from.
A
Currently, the GitHub scraper is running into an error and timing out, so it is not correctly calculating some of this average time to DataCap. So while we'd love to see that 18-minute number, it may not be the most accurate right now. Stay tuned; we're hoping to get that updated by the end of the week, and we'll see some more metrics, as well as some updates to things like notary names getting filled into the dashboard, and the allocation funnel.
A
We looked at two of these breakout funnels at our last notary governance call, and like I mentioned then, it's becoming a little bit tricky to build some of these allocation funnels with the large dataset notaries being factored in. So we're working on breaking out those statistics so that we can more easily present the DataCap allocation for direct client activity and for large dataset activity. So this 26.3 PiB includes some of those 5 PiB large dataset notaries.
A
So that's where we're sitting: from 26.3 PiB, we drop down to six percent being sent out to clients, and then we drop down to about half a pebibyte getting sealed in deals. So that's where we want to keep driving some of that impact in direct notary activity. There was an issue that came up last week that should already be addressed; I'd heard from a couple of notaries, who brought it up in the channels, that they weren't seeing all of their direct requests.
A
We've identified the problem; it had to do with pagination on GitHub. Once we hit a certain number of open issues, we weren't grabbing all of them. That's been adjusted, and everyone should see their open issues now. This time we have about 59 open issues, which is good and bad: we're seeing very quick responses, we're just seeing a lot more issues getting created. So that's what we like to see. We just want to keep an eye on other problems that pop up with the app.
A
So let us know in Slack as things come up. From checking yesterday, there were 11 new applications that hadn't gotten comments yet, and in the last 14 days we have closed and granted 30 DataCap applications, for a total of 164. It's looking great. Taking out the auto verifier as an outlier, we're seeing about seven clients per notary, so that direct activity is looking wonderful.
A
At this point, we have a number of open large dataset applications. Some of them, as you can see, are pretty close to getting their seven notaries. We encourage you to go check those out. Some of these are re-submissions where something happened with their GitHub account, so some notaries have already signed on to those. I'm going to be working with them to see if we can recover some more of the comment thread to get posted back in there for the various audits, but that's just so that you're aware.
A
I think there was one other resubmission that had already been approved. So that's where we're sitting with that. I think I saw some members from KlayWatch on the call.
B
Right, so for KlayWatch, a disclaimer here: I have joined this call as a translator for the KlayWatch team. It's a team of Korean blockchain developers; they are not fluent in English and were not comfortable presenting their project in English. I've worked as community staff and translated for several meetups before in the local community, so I just got in here to help, just putting that out there. We have two other team members with us on the call.
B
Jace is our main developer, and Kim is helping us out with KlayWatch. Just briefly speaking, let me share the screen. Pull up the screen here.
B
Yeah, excuse me. Currently the language is only supported in Korean, but basically speaking, this is a DeFi dashboard for Klaytn, which is one of the biggest mainnet projects here in Korea. It was developed by Kakao, a mega enterprise here, and the activity is growing within Korea and within Asia. Right now there's one DeFi project, which is KLAYswap, within the Klaytn ecosystem, and KlayWatch.
B
We developed a DeFi dashboard where people can see what liquidity they have provided, their LP tokens, how much they have staked, and how much they have in their wallet.
B
So this is like a dashboard of their portfolio and their liquidity pools, and here you can also see the transaction history of your swaps, of your liquidity provision or your LPs, how much you have withdrawn, and how much you have been rewarded. KlayWatch is also providing a feature called swap premiums, which provides real-world data feeds and shows you how much premium there is within the KLAYswap ecosystem versus the exchanges.
B
It shows you the price difference for arbitrage opportunities, and there's also a swap alert where you can set an alarm: say, if there's more than five million dollars' worth of swaps happening within KLAYswap, you will be able to monitor these big whale alerts right here, like a whale alarm for KLAYswap.
B
How we are going to utilize Filecoin is that there is a lot of transaction data which we'll be indexing, and there will also be snapshots, images of people's accounts and their current deposits: current history, info, reward, swap, and withdraw information. Currently we are doing this on our own servers, and we want to put it on IPFS, on decentralized storage like Filecoin, so we have submitted our application for a large dataset.
B
The data samples are on the application; you can check them out on the Google Drive.
B
Currently, there are about 10,000 to 20,000 active Klaytn wallet users, and about 5,000 of them, which is about 25 to 50 percent, have interacted with the KLAYswap contract. Based on those 5,000 active users, they're providing about 50 gigabytes' worth of data per day and about 400 gigabytes per week. So that's the amount of data there will be.
B
We are projecting that as user traffic grows in the KLAYswap ecosystem, and as more DeFi services and NFT services come into the ecosystem, the number will grow. So we'll obviously be using a huge amount of storage. I guess that will be it for the introduction of KLAYswap and KlayWatch, and yeah, if you have any questions, please let us know.
A
Thank you, thanks Daniel. From the notaries on the call, checking the GitHub issue right now, I think there's one notary from the Filecoin Foundation; Danny has jumped in. Wondering if there are any other notaries on the call that have any questions for the team while we've got them.
A
We can also post them in the chat or post them directly in the issue. This is issue 29, so there's the link, and it looks like Deep has a hand up. Go for it.
C
Hey Daniel, thanks for presenting on behalf of the team. I'm not a notary, but I do have a question that I think might be interesting to hear the answer to, if possible. I was curious about the data residency and the storage provider selection for this data: is it going to be served by providers in specific regions of the world?
C
Are there any requirements in terms of access to the dataset, like the number of replicas that will be stored? And based on that, how is KlayWatch intending to do the actual selection of the storage providers? Is it picking specific storage providers with which contracts are being signed? Because it seems like it's a pretty big project and there are a lot of companies that have been looped in, so I'm curious how they're intending to ensure long-term quality storage, basically.
B
I see, I see. So let me relay this question to the team. Before that, let me check if I got the question right: it was about geographical dispersion, how many replicas and copies of the dataset will be necessary or required, and the last question was how we would be selecting the service providers and whether we have contacted any of them before.
B
So, based on the application here, the answer from the team is that first they will be focusing mainly on the Asia region, especially Korea, and next will be China. As Klaytn is bootstrapped to a more global community, this would be broadened and more broadly dispersed around the globe, but for the first phase this will be mostly focused on the Asia region. For the replicas, they are projecting to save at least three or four copies of the data, and in terms of the service providers, they have had meetings.
B
We have met with several service providers before, but have not confirmed any of them yet.
F
And I have a question: I noticed that KlayWatch will work with four or five miners, but for our large datasets...
F
Normally we will allocate less than five percent per miner. So I want to ask whether there are plans to work with more miners.
A
Yeah, that's a great question. So, Daniel, to check and make sure I'm understanding: typically we try to see, from the percent allocation of the data that we get...
A
You know, no more than five percent of these large datasets going to a particular storage provider. So doing four or five individual storage providers, four or five miners, is a pretty low number compared to what we like to see for these large dataset applications. So I think what some of the notaries want to know is, you know...
B
Right, so yes, of course. They said this was their first estimate, and obviously they would have to expand more and will be trying to reach out. The first estimate was that number because it was the only number they could reach so far. I guess once they participate in the community more and more, they have plans to meet with more service providers, contact them, and expand the numbers.
B
I guess that was answered in the issue, and to point it out, there were about 40 gigabytes, just a second.
B
There are about 40 gigabytes of data being produced every day, and weekly we estimate about 400 gigabytes. Yes, I hope that answers the question.
B
Yes, the team, the blockchain developers, have plans to develop a system that will be able to transfer the data online, automatically, on a regular basis.
B
And yes, the online transfer will be the primary method, and offline transfer will be a backup plan.
C
Cool, are there any other questions? Yeah, I want to make sure that we have enough time for the other clients as well, so maybe it's worth just asking the present notaries to see if anyone's interested in supporting this application, and I can take some notes and put them in a comment in the issue.
C
All right, we'll definitely be following up. Thank you so much to the KlayWatch team for coming to present; we really appreciate your time. Thank you.
A
Awesome. So we heard from KlayWatch, issue 29. We can jump in or ask some more follow-up questions there, and like I said, there are a couple of others; both the Shanghai application and this particular one are very close, with six notaries, and I think we may hear from USC Shoah joining the call in a little bit. But another quick update: root key holders.
A
We've mentioned this a few times, and this question has come up with a couple of pending large datasets. Our goal is to add four new root key holders, specifically to speed up this process of getting the notaries created. We are still processing those; those messages are pending, and we're working with the existing root key holders on getting that approved.
A
We're still hoping to get this accomplished as soon as possible. This is a priority with multiple people pushing on it, so we are aware that it's slowing down some of those clients that have already been approved. Some quick updates on bots.
A
These are to support the large dataset notary process update, issue 217, but even if we change that proposal, we could still utilize some of the features of these bots. Specifically, there's an onboarding bot, to try to improve automation for all of the client DataCap onboarding. It's analyzing applications for completeness (this is happening predominantly with the large datasets), and it can then propose a new multisig notary in the governance repo for the root key holders.
A
If we move forward with the issue, we would automatically know which signers to put on it, and then once that notary is created, we could propose the first DataCap allocation automatically and notify the client of that allocated DataCap. So some of these steps, like I said, could happen independent of the process proposal, but for example, this one...
A
We would need to know that we had that list for the multisig. So here are some samples of what the comments would look like in GitHub to let the client know that it's been completed. There's also a stats bot, which I think a number of people have been talking about wanting, to help improve some of our audit tools, so I'm very excited about this one. Specifically, it can monitor a client's remaining amount of DataCap, and when a particular threshold gets reached, it can update an issue, which could change the labels to start a subsequent allocation request.
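As a rough sketch of the threshold check such a stats bot could perform (the percentage threshold and function names here are illustrative assumptions, not the bot's actual implementation):

```python
def should_start_next_allocation(allocated_bytes: int, remaining_bytes: int,
                                 threshold_pct: float = 25.0) -> bool:
    """Return True when a client's remaining DataCap falls at or below the
    threshold, signalling that a subsequent allocation request should start.
    In the real bot, this is where the GitHub issue's labels would be updated."""
    remaining_pct = 100.0 * remaining_bytes / allocated_bytes
    return remaining_pct <= threshold_pct

# Example: a client granted 100 TiB of DataCap with 20 TiB left crosses
# a 25% threshold, so the next allocation request would be kicked off.
TiB = 2**40
needs_more = should_start_next_allocation(100 * TiB, 20 * TiB)
```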
A
So if a client gets to a certain threshold, we could then kick that off to help with the audit. We could link to the stats dashboard and post a table of the previous deal behavior. This is a sample, like I said, from that bot being tested, so we'd be able to see what the subsequent allocation would be and their previous behavior, so you could get that snapshot, like we were talking about, of what percentage. The Foundation is also submitting a microgrant for development of a Slack/GitHub bot that can basically post reminder updates. We could have multiple different work streams that this would utilize, but one example would be to look in the client onboarding repo and count the number of open issues that are labeled "bot: looking good" and "state: verifying".
A
So right now, in that previous slide, we saw there were 59 of those, and so one thing we could do is, every Wednesday in these various channels, post a message that says: there are 59 open issues for client applications. We could also watch how many closed and granted applications there are to keep track of that, or we could use it for reminders about large dataset applications.
C
Yeah, lots of general changes were proposed, but just to give you an overview, if you haven't already read through the issue that was proposed a couple of weeks ago: in general, the goal of Filecoin Plus is to reduce the friction of onboarding clients to the Filecoin network, ensuring that the productivity of the network grows and that clients have the ability to get from Filecoin the quality storage that they're looking for. We've been running this, so for those of you that have been around, especially from the first round of notary elections...
C
This was proposed several months ago. Having some sort of mechanism whereby extremely large amounts of DataCap would be made available for specific clients was an idea floating around in Q1, Q2, and it morphed into this experiment, the LDN mechanism, which we as a community had agreed to test out with an earmark of up to 50 PiB across 10-plus applications. What we're seeing so far is that the time to receiving DataCap for clients that apply through this mechanism is still extremely long: several weeks to go through the process of getting the, you know, seven notaries.
C
Then the root key holders actually turn around and grant the approval of a multisig as a notary for that client, and then subsequent allocations require a signature threshold. So we're doing a little bit of thinking, and collecting feedback from clients on what this experience felt like for them and ways in which we could improve it.
C
To reduce the friction, but also continue serving the actual purpose behind what it's there for. The details are definitely in the issue, but just to make you aware of the salient points:
C
The first is that we're thinking of removing the seven-notary self-election onto the multisig, and proposing that active notaries who have not opted out basically end up being on the multisig to begin with. In that way it functions a little more similarly to how each individual notary works: once you get notary status, you're a notary; while you have DataCap to grant, you are expected to do the due diligence on clients that come to apply to you, respond to those applications, identify who is worthy of receiving DataCap, and grant DataCap to that client. Similarly, the multisig could consist of a larger set, or the complete set, of active notaries, whereby a client wouldn't need to wait for a specific set of notaries to elect themselves.
C
We could just start out with those notaries being on the multisig, which would then result in, once the application actually comes in and is validated...
C
The multisig already being prepared and sent to the root key holders, and then during that time we can still carry out the due diligence, as notaries or involved community members have been doing for many of the existing client applications. And then, sorry, first and subsequent allocations would be at a decreased threshold: so instead of having four signers on each allocation, we were initially proposing having two, and doing as much of the client work as possible in an automated fashion.
C
So Galen talked about how some of those bots could be extended, basically based on the outcome of this conversation, to support actions on behalf of the client, like checking how much DataCap is left on a particular address and requesting additional DataCap. That is something that could take...
C
You know, a client a few hours or even days to notice that they're running low, but it's easily automatable; it reduces the friction and the turnaround time. So, generally speaking, we wanted to propose a bunch of these changes. I think Galen actually wrote all this up in detail in issue 217, so we can head to the next slide. We got a bunch of good feedback on this.
C
Oh, I should have probably had this one up while I was talking, sorry about that. So, to just run through; actually, I'm going to leave this up for 20 seconds and make sure, if you have any questions on the process, let me know.
C
Cool, yeah. So, as I mentioned, the main changes are basically: do as much in an automated fashion as possible and set up the multisig almost immediately. It's still fine for a client application not to go through; I would imagine it's still quite likely that not every client will get DataCap, because notaries may decide that they don't want to actually sign for it, the same way they would behave if the client had applied directly to them. And so, with the additional automation that we have...
C
That can help with auditing and with data gathering. This, at least in my opinion, ends up being a much more efficient format for us to start moving closer towards enabling clients, even if it's initially for smaller amounts of DataCap, to get started and test out the network, and for us as a community to start to become more like watch-persons of the network, where notaries take on this role, where they get data and feedback to actually make decisions on, as opposed to constantly having to do a bunch of KYC work up front. So it represents an interesting direction.
C
Where, as a program, we can shift much more towards operating on a set of principles that enable certain types of clients over time to be very productive on the network. But yeah, I would say definitely take a look at the issue; all of this is detailed there. If you have any immediate questions, feel free to raise your hand and let me know. If not, I did want to do a little bit of a deep dive into the discussion that was happening and the responses to this proposal.
C
So there were three different sorts of changes that I think people had really good feedback and suggestions for, so I wanted to talk through each of those. The first is the allocation calculation.
C
So for those of you that are aware, the current LDN format operates based on this client-submitted weekly onboarding rate, where the first allocation is the lesser of half of that weekly onboarding rate or five percent of the total applied-for DataCap.
C
The second allocation is the weekly onboarding rate or the equivalent of 10 percent, and then the subsequent allocations from there are 20 percent of the total allocated DataCap or whatever the client claims they'll use every two weeks. There's some really good analysis that was done by notaries and community members on how that could end up taking extended periods of time, and then there are also FIPs in the works, such as SnapDeals, that may impact the interaction between notaries and clients, to ensure clients don't get blocked waiting for DataCap to be able to make deals. So there are definitely a lot of great ideas on how we could scale this or change the way it works.
C
Some of the notable proposals were, you know, just increasing that rate substantively; for example, instead of it doubling, 10x-ing it for every subsequent allocation. There's another one which is really interesting, around having a target DataCap amount. I think that was suggested by Eric, but I might be misremembering.
C
But the idea is that, basically, you could look at the client's past DataCap allocation usage over a period of time, make assumptions about how much they're likely to use in the coming weeks, and use that to have a shifting target amount that can drop or go up over time depending on how productive a client is. So instead of us having to ask a client upfront, we would actually just learn from their patterns and practices on the network.
C
The original proposal didn't actually have anything on changing this, because we were specifically trying to address the problem of speeding up the existing process, as opposed to changing the rate of dispensing DataCap. But I think these are pretty compelling proposals, and I'd love to open the floor to anybody that wanted to share more of their ideas or feedback on that specific topic, and then we'll talk through the threshold. Or actually, I think some of you did address many of these in your comments.
C
The second topic is the threshold, this threshold being the actual number of signatures required from signers on the multisig to actually result in DataCap being granted. The current threshold is four signatures out of seven notaries across three regions, and the proposed threshold was having that number drop down to two. Things that we were considering, or that were called out as flags in the proposal, included:
C
Finding ways to ensure that maybe notaries don't sign multiple consecutive allocations, or restricting it so that each of the signatures comes from a specific region, with the signatures having to come from two unique regions, or something like that. There was some pretty good discussion; I think a lot of people felt that two might be a little risky.
C
Others felt that it might help speed things up quite a bit. So I think we'd love to hear if there are any strong opinions on what people think about that number, and ways in which we can make it safer or learn from it as a community, bearing in mind that this is continued iteration on an experiment and we're all working together to make this a more effective solution in the long term.
C
But if people think that this could be an interesting mechanism to mitigate risk, then it would be good to hear opinions on that as well. Okay, I've talked for several minutes; hopefully you got most of that. But yeah, the main thing I'd like to do is just open the floor up to see if anybody attending has any ideas on these three fronts, or on the proposal in general. We'd love to get feedback.
H
Hey, hi Deep. Yeah, I still want to follow up on the answer to what I asked two weeks ago, when you were on vacation, about the threshold and the two notaries. Can we have a region restriction on the notaries, so that we have to have one from the same region as the applicant, and also get one from other regions to monitor it?
C
Yeah, I think that one is a very reasonable sort of change. I think the main thing that we then also have to do is ensure that everybody uses the same method of sending a message. Right now, what we're seeing is that some notaries are comfortable just sending messages directly to the chain, while most notaries are using the Filecoin Plus registry app, and there are some interop issues between those.
C
In terms of which, I think there were some issues last week, which you might be aware of, with transaction IDs not being hashed or encoded and decoded correctly, resulting in multiple signatures coming in but the actual proposal not being approved. And so I think, if we could centralize on one process, if we were to say, let's take a dependency on one tooling right now, which is the Filecoin Plus app, then what we can do is add flags based on when you were elected as a notary.
C
You were picked for a particular region, and based on that, we could add a flag that says: approval from one notary from the region that the client is applying from, and one from a different region. Is that what you were proposing, or were you just thinking that each of the signatures should be from a unique region, regardless of the region where the client applied from?
C
Okay, so one from the region in which the client claims to be applying from, and then one from anywhere else.
H
Yes, or, I mean, if the applicants need to transfer their data to multiple regions, it should maybe be more than two, because if you want to save data in different regions, there will be different regulations there. So...
C
We should probably think about the medium- and longer-term implications, where the point of having region-specific notaries is also to ensure that we have region-specific expertise on regulation, on policy, on ways in which clients can actually flourish and be productive on the network; leveraging them for that purpose is also a very compelling point. Thanks for sharing that; that's really interesting. Galen, you have a hand.
A
Yeah, I was wondering, thinking about that point: do we think that every allocation would need a notary from the region of the client, or do we think that's most important for the first allocation? Or maybe it's important for the first and second, but not from the third on. I don't necessarily know, from a technical standpoint, what is possible there, like what we could put in as a flag.
A
But I just wonder: if, for example, an application comes in from Europe and we require a notary from Europe for every allocation, does that make it so that they then have a reliance on a subset of people again, which could slow us down? I also wonder if another way to think about that, because some other people have mentioned this idea, is: what if the notaries are responsible for stopping the allocations?
A
So one way to think of it is: any two notaries could sign the allocations, but then another notary from the region was responsible, or could somehow look at it and say, actually, I'm the expert on this type of data or this region, and I have a question or concern here.
A
So I think there could be some interesting ways to look at and solve that. But I wonder if you think that every allocation should have a representative from the region.
C
Yeah, it's pretty interesting to think about when you need that kind of response. What I'm wondering is, would that be like, you know, we've talked about denying DataCap before, where it's like: I don't think this client should get DataCap, or I don't want to give DataCap to this kind of client. I think the bar for that is relatively high. From an empirical standpoint, from my experience working with notaries over the last several months...
C
It feels to me like, instead of saying no, most notaries just choose not to react. So I'm curious if any notaries here have thoughts on, you know, stepping into a GitHub issue and saying: hey, I don't think that this is going to work, or I don't think this will actually pan out, based on the current client expectations. I'd love to hear some feedback on that as well. And then, Julie, I see your video popped up.
C
Cool, I don't want to take up too much time on this, because I know that Jonathan has also now joined the call.

C
I know he wanted to share some things about his Shoah application request, so maybe what's worth calling out is that today we're testing out the dual time zone calls. We'll have a similar discussion in the second call, in case you're awake or want to participate in that — feel free to think it over between now and then and come share your thoughts. If not, here's what we can do.

C
What we're planning to do is basically consolidate the feedback from the calls, plus what's been shared in the issues, and then update the proposal accordingly — and hopefully get some additional comments and feedback this week, so that we can figure out the right next steps and proceed with enabling the existing client pool, but also future clients that are excited to use Filecoin.
A
Awesome, yes, thank you, Deep. As he said, we'll be consolidating feedback and updating that proposal, so please go check out that issue and leave your comments and thoughts there. And turning it over to Jonathan — I'm going to stop sharing, and I think he can take the screen.
I
Great, well, it's great to be here, and in the interest of time I'll just speak very briefly; then, if there are questions, I can turn on some slides to clarify. I'm here from the Starling Lab, which is a joint initiative between the USC Shoah Foundation and the Department of Electrical Engineering.

I
I'm here to talk about our DataCap application, for those of you who may have seen it — thank you, Galen, for throwing it in the chat window. We're in the final stages of getting notaries behind this application, and I thought I would use this opportunity to briefly introduce myself and the project and make a very quick case for our application.

I
The DataCap that we're looking for is to store the world's most comprehensive and largest collection of video testimony from the survivors of genocide. It includes 10 different collections, which begin with the Armenian genocide at the beginning of the 20th century and run all the way up to contemporary conflicts in Myanmar, Syria, and Iraq.

I
On average, each testimony is around three hours in length, and in total there are about 240,000 files — videos that cover 55,000 individual testimonies from these survivors.

I
So it is a large dataset, but it is also a vulnerable dataset. About two and a half years ago, we began a process of working with Filecoin, even before it had launched, to start figuring out how to lead a way for our organization to actually put the entire collection onto the Filecoin network.
I
The first point is that we have been working at each step to help perfect the tools that could make large-scale data movement onto the Filecoin network possible. So in supporting our application, you're also supporting the path that others will follow to put their large datasets on. That means direct engagement with miners — storage providers, I should say — and also with folks who develop tools. So there's kind of a two-for-one: we get preservation as well as perfecting a process.

I
And secondly, one of the things that really commends this project is the fact that it is an intuitive and clear example of why distributed systems matter. When you think about vulnerable datasets like this one, it stands to reason that you'd want to distribute them as far and as wide as possible — more copies keep things safe. And here, now, with Filecoin, we have the ability to do exactly that: to distribute the DataCap to multiple storage providers that can help us secure this testimony. We will take the deals for as long as we possibly can.

I
We know deals are limited in length for now, but we absolutely intend to renew them, and we have exciting plans to keep renewing them for many years to come. With that, I'll turn things over in case anyone has questions — or, Galen, I'll let you continue on. Thanks again for having me.
A
Thank you, Jonathan. Yeah, the application link is there. If there are questions from notaries on this call, we'd love to see some hands, or you could type them in the chat as well.

C
Do you know what the current notary count is for this application? I believe it is currently at three.

C
Looking at the GitHub issue, it looks like it's Danny and Bushy. Really? Yeah.
I
I think that, for this initial application, there are definitely some datasets which are from mainland China, and so, if you feel that those are safe to store, we'd be very happy to provide that information. Whereas for the international material, I think just to be cautious, and to make sure this is the right thing for storage providers in China, we'd probably not store that information there.

I
But if we had the support of the DataCap, we could use it to support other storage providers in the ecosystem. And the Nanjing files, I think, are an important part of Chinese history and absolutely should not be forgotten, so I think we could definitely explore putting them over there.

I
Great, well, again, thanks to all for your time. We're really pleased to be working with Filecoin and this Filecoin Plus solution. We hope there are many more things we'll be able to do together, so hopefully I'll be back here in no time at all, telling you about the success of this project and bringing more fun and exciting things to work on together. Thanks to all.
A
Thanks, Samantha. And Deep, it looks like you're adding in those two other notaries, X8 Matrix and Sonic 42.

A
So with that, I don't know that we have other clients to present today, so that covers the client presentations. With the four minutes remaining, I wanted to take a second and see if there were other open-forum questions, concerns, or outstanding needs. And it looks like Deep is also —

C
Jumping in on the USC one. Also, the deal bots project — and the reputation system project in general — is at six signing notaries, and that's one of the things I spend my time on as well. So if there are any questions on that, I'd be happy to talk through our plan, our ideas and thoughts, and why I think it's a really important lever for Filecoin deal making.
K
Yeah, this is Charles. I saw the email, so I've joined. I made a short plan — a few slides — to better demonstrate what the FileSwan NFT IPFS storage project is going to do. So, if you're interested, I can give a presentation on that.

K
I think you can see my screen now. Good. So, basically, we're doing the FileSwan NFT IPFS storage project, which is designed for NFT storage. For now, we're seeing increasing demand from different blockchains, like Neo and Alexa and others. But beyond that, we also have other industry users — VR, AR, and AI — and they all want that storage to be saved on the Filecoin network. So we are working on this.

K
We've built a team with a smart contract developer, an Android developer, and a product manager, so they can do the work. There are currently four background projects we are working on to support: one group from the US doing AI and machine learning for Filecoin; another background project is for NFT storage, passed through Filecoin — that company has two branches, in both China and Canada.

K
Then another one is from the VR industry, which is converting Unity 3D models into NFT projects. There are lots of CryptoPunks, which are 2D images, and he strongly believes that the next trending category is 3D images, or 3D features, so he wants those products to be minted as NFTs and stored on Filecoin. So we are working to support those projects.

K
This is one example — a machine learning dataset we were working on last year, covering several different species of monkeys. Monkeys are very hot in crypto art now, so those monkeys can be labeled at different —
K
On the geolocation distribution: we are MinerX and MinerX2 members, so, with past experience, we have successfully stored datasets at those locations, and we are adding more geolocations. If people are interested, they can contact us and we can add them. For now, we have five miners in North America; the EU will have two storage providers, and in Asia one or two will be selected, depending on where they are located and their abilities.

K
We also have two new modules from FileSwan that we'll be using to support the storage and the payment. For storage, you can use any IPFS node as the gateway to fetch and upload, but we have also built our own FileSwan IPFS gateway, so now you can use the North America (Canada) node to do the transfer and distribute the data.

K
The frameworks for the offline deal sending components have been running for months, and we've added a new component called the payment gateway. The payment gateway will be able to support different currencies — USDT, and even the BSC chain and other chains. That part has recently almost finished development.

K
This is the structure: with FileSwan, we support two components, the FileSwan S3 integration module and the IPFS module. For this project, we don't use the S3 module, because it's more for industry usage; we're using the IPFS one. So after you place an order and transfer it to IPFS, you'll be able to send it to FileSwan and their processor, which runs either an auto market or a manual market — you can place bids and get subscriptions — and on the other side we can —
K
We are doing a Swan batch mixer, which means that if the NFT or other object is too small, we compose them, then send them together — either to a virtual pool storage provider or to a bidding provider. Of the two, the pool storage provider is more like some existing high-computation storage providers that data is sent to directly, as a geolocation-distributed data flow, as we have shown; the bidding provider is for people who are really interested in getting that data as a new kind of commerce.

K
We are going to send out those datasets in September, together, based on IPFS technology, and we already received the petabyte dataset transfer grant in March, which is designed for pushing down exactly this kind of dataset. So this is basically the background, technology, and distribution plan for the FileSwan NFT IPFS storage project. Thank you.
C
So, quickly looking at Charles's application: it looks like we have three notaries supporting it right now — Sonic, Danny O'Brien, and Irene Yung from the 12th Street Foundation. Are there any other notaries that have questions or want to support this? Let me know, and I will happily document it and update the issue.

A
Yeah, I think the only thing to say is that we're going to have this same call again in about eight hours. I don't know who can make it to that, but it'll be the same agenda, and then both of these will be posted on YouTube to watch again. Do check out the various open large dataset issues and the open Issue 217 in the Notary Governance repo, and for any burning questions we didn't get to, in general, please feel free to post those in our Slack channels.

A
We can drive some conversation there as well. So I don't have anything else from my side this morning — oh, and we ran a little bit over. Thank you all for being here, for making time, and for a great community governance conversation. Deep, anything else on your side? Thank you, thanks, everybody. All right, thanks, everyone.
A
Awesome
all
right
thanks.
Everybody
welcome
to
round
two
of
the
notary
governance
meeting
for
august,
the
31st
so
we'll
be
recording
this
and
combining
it
with
the
call
that
was
the
same
agenda
that
happened
about
eight
hours
prior
to
this
and
we'll
post
both
of
those
up
on
youtube
together.
So
you
can
find
everything
there
for
posterity.
We're
gonna
jump
in
some
more
people
may
be
joining
us
as
we
roll
through,
including.
A
A
couple
clients
that
make
it
ideally
later
on
in
this
hour
and
we
can
hear
from
them
about
some
of
their
large
data,
dataset
applications
so
similar
agenda,
as
we
normally
have
running
through
some
kind
of
current
metrics,
and
then
talking
about
the
proposal
to
the
ldm
process
and
then
some
time
for
some
open
discussion.
A
So, two weeks ago, the snapshot on our interplanetary dashboard was looking good — it has been improving every two weeks. If you open it right now, you may see something shocking, with the average time to DataCap sitting around 17 or 18 minutes. While this is a great aspirational goal, it is in fact, to our knowledge, a bit of a miscalculation — so sorry to burst everyone's bubble. There is currently an issue with the GitHub scraper, and it is miscalculating some of these metrics.

A
We hope to have it corrected and updated by the end of the week. So if you sign on and see that, while it is exciting, it may be currently inaccurate. I just wanted to call that out for everybody.
A
This is taking into account both large datasets and direct notary activities, so we're working on some easy ways to split out those metrics — because, as you can imagine, DataCap that gets assigned to an LDN in the five-pebibyte range will significantly alter all of our calculations of how that compares to the rest of the direct notaries. So we're working on increasing our audit streams, so we can look at those things separately as well as in conjunction.

A
We are seeing about half a pebibyte of DataCap sealed in deals at the bottom of this funnel. Speaking of direct notary activity: we had about 59 open issues when I updated these slides yesterday, and that includes 11 new applications that are waiting for a first action. We've had 30 issues closed and granted in the last 14 days — pretty good speed — for a total of 164 closed issues that have been granted.
A
So I appreciate everyone's patience as we resolve that. At this time, notaries should be seeing all of their requests through the app, and please continue to let us know as errors like that pop up. From this calculation, if we exclude our Glif auto-verifier — which currently has 428 clients, a bit of an outlier — the average number of clients per notary is about seven, which is an increase from two weeks ago. Fantastic.

A
We do have a number of open applications for large amounts of DataCap. Some of those are at six notaries, I believe, on this list as well. The USC Shoah Foundation is also now at six; we heard from them this morning.
A
We
also
heard
from
clay
watch
and
phil
swann
and
deep
talked
a
little
bit
about
the
deal
bot
project
we'll
give
him
some
time
at
the
end.
To
talk
about
that
again,
but
please
go
check
out
these
open
issues
and
ask
some
questions
of
these
clients
and
put
your
name
in
to
help
be
a
notary
for
that
process.
A
One
other
update,
we
are
still
working
on
getting
the
root
key
holders
changed.
Our
goal
is
to
add
four
new
keyholders
to
expedite,
especially
to
expedite
notary
creations.
It
has
been
pointed
out
as
a
slowdown.
I
know
that
there's
a
couple
of
large
data
sets
that
have
met
the
threshold
of
seven
and
are
waiting
to
be
approved
by
root
key
holders,
so
they
can
start
allocating.
A
We
are
still
processing
those
addresses
and
making
those
updates
to
the
f080
address.
So
that's
still
in
the
works.
There
have
been
some
slight
unexpected
technical
difficulties
that
we're
working
through
to
get
that
completed.
So
at
oh
right,
one
other
thing
before
we
turn
it
over
to
deep.
A
I wanted to give you some information about some other changes we are making with some more bots. We have two bots that are in testing, and one that is not yet in testing but is in development. The first two bots — onboarding and stats — we're building with the Keyko team.

A
We kicked off the development of these in connection with the proposed changes to the large dataset process, because we knew we could still utilize a lot of this tech investment even if the specific proposal doesn't get passed in its exact current state. So we wanted to go ahead and show some of these bots and some of the features we'd be able to use when we roll them out. The onboarding bot is designed around improving automation: it can analyze an application for completeness, look through it, and flag any missing sections — something we're doing a little bit of already, but we're going further with it. And then, also based on the new LDN process, this bot can propose a new multisig automatically by opening an issue for the root key holders.
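As a rough sketch, the completeness check described above could look like the following. The section headings here are illustrative placeholders, not the actual Fil+ application template, and the real bot would fetch the issue body through the GitHub API rather than receive a plain string.

```python
# Hedged sketch of an application-completeness check like the one the
# onboarding bot is described as doing. REQUIRED_SECTIONS is a made-up
# stand-in for the real application template's headings.
REQUIRED_SECTIONS = [
    "Data Owner Name",
    "Total amount of DataCap being requested",
    "Expected size of single dataset",
]

def missing_sections(issue_body: str) -> list:
    """Return the required headings that do not appear in the issue body."""
    return [s for s in REQUIRED_SECTIONS if s not in issue_body]

def review_comment(issue_body: str) -> str:
    """Build the bot's comment: an all-clear, or a list of flagged gaps."""
    missing = missing_sections(issue_body)
    if not missing:
        return "Application looks complete."
    return "Application is missing sections:\n" + "\n".join(
        "- " + s for s in missing
    )
```

A real deployment would post `review_comment(...)` back onto the application issue; keeping the check as a pure function makes it easy to test offline.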
A
We could automate that, and we can notify the client more directly of the allocated DataCap — there's a sample comment they're working on — so once that data allocation goes through, we'll have some more transparency and automation around it. The other bot is around stats.

A
The goal here is to monitor the remaining amount of DataCap that a client address has, and when that client reaches a certain threshold, the bot would update their application issue to change the labels, automatically start a subsequent allocation request, and then post a table with an audit of their previous deals and a link to the stats dashboard. The goal is to make the audit step for notaries a little more automated.
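The threshold logic behind that could be as simple as the sketch below. The 25% trigger level, the label name, and the doubling of the next request are all assumptions for illustration, not the values the deployed bot uses.

```python
PIB = 2 ** 50  # one pebibyte in bytes

def needs_refill(remaining: int, last_allocation: int,
                 trigger_fraction: float = 0.25) -> bool:
    """True once the client's remaining DataCap falls to a set fraction of
    its last allocation — the point at which the bot would relabel the
    issue and open the subsequent allocation request."""
    return remaining <= last_allocation * trigger_fraction

def subsequent_request(remaining: int, last_allocation: int):
    """Return the bot's follow-up action, or None if nothing to do yet."""
    if not needs_refill(remaining, last_allocation):
        return None
    return {
        "labels": ["bot:readyToSign"],            # hypothetical label name
        "requested_amount": 2 * last_allocation,  # illustrative doubling
    }
```

The returned dict stands in for the label change plus new allocation request the bot posts; the deal-audit table and dashboard link would be appended in the same comment.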
A
So this is a sample of what that comment could look like: the client address, whether it is connected to a large dataset multisig, the amount they would subsequently be requesting, and then a snapshot of their previous behavior. In this sample we're using incomplete data, so these deals don't look great, but in an ideal world you'd see the number of deals and the percent allocated to the top provider.

C
The idea is that you have confidence in the statistics being shown to you about how a client is using their DataCap, and whether or not they're doing the so-called right thing on the network. So if questions come to mind — oh, it'd be great to see an answer to this, or to see a particular kind of statistic or metric — do raise them.
A
Yeah, the third bot, which is currently still in development, is coming from the Foundation, and we're basically working with a developer there to make and deploy a Slack–GitHub bot that can tally open issues from certain repos with specific labels and then post in Slack channels.

A
A lot of the current Slack–GitHub integrations work by waiting for some kind of push notification on an action in GitHub and then pushing a single message to the Slack channel. Instead, we're trying to find a way to have a better cadence of reminders, but also to drive community engagement by showing how active our community is across these different repos. A sample behavior for this bot would be to look at the client onboarding repo, count the open issues with certain labels — the bot's "looking good" and verifying labels — and post a comment in the Filecoin community Slack channels, #fil-plus and #fil-plus-notaries. The thought there is that once a week this bot says: there are 59 open issues, go read about them here — and it helps remind the notaries of those open items and links them directly to them.
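A minimal sketch of the weekly tally such a bot might post is below. The repo and label names are illustrative, and the GitHub and Slack API calls are deliberately left out so the message-building logic stands alone.

```python
def tally_message(repo: str, label_counts: dict) -> str:
    """Summarize open-issue counts per label into one weekly Slack reminder.

    `label_counts` maps a GitHub label name to its open-issue count, as a
    scheduled job would collect from the issues API before posting.
    """
    total = sum(label_counts.values())
    lines = [f"{total} open issues in {repo} need notary attention:"]
    for label in sorted(label_counts):
        lines.append(f"- {label_counts[label]} labeled '{label}'")
    lines.append(f"Read about them at https://github.com/{repo}/issues")
    return "\n".join(lines)
```

Running this on a schedule, rather than reacting to individual GitHub webhooks, is what gives the steadier cadence of reminders described above.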
C
Thanks, folks. Some of you have probably already seen this — on this call, probably a majority of you actually have. Issue 217 was drawn up as a proposal on ways in which we could speed up the overall large dataset notary flow that exists today.

C
Just stepping back a little bit: the goal of Filecoin Plus is to increase the productivity of the network, and what that looks like in today's version of Filecoin is reducing the friction to onboard clients — because we are in a state where we're all working at serving organic demand as it shows up on the network, as well as finding ways to reduce the friction for clients that do turn up, are excited to use it, and want to leverage it for whatever services can be provided to them in the form of storage.
C
And specifically, the purpose of the LDN process, as it was originally proposed several months ago, was effectively this: there are clients with very legitimate use cases that require DataCap at a scale much larger than what an individual notary would be expected to administer effectively.

C
That includes not just having that much DataCap allocated to them, but also doing the due diligence, doing the appropriate legwork, working with the clients, and ensuring that they feel supported. Having that burden on an individual notary for something at pebibyte scale — especially when Filecoin Plus as a program is as young as it is today, with so much to learn — seemed a little unrealistic.

C
So, as we were looking at different ways we could do this, we built version one of the LDN multisig allocation mechanism, developed between April and May of this year. It was intended as an experiment — we gated it at 50 pebibytes to see what it would look like.
C
And what we're seeing is that the time to DataCap for those clients is in the weeks range, and that's almost unacceptable from the perspective of an actual service. Think about a client with a legitimate business need showing up to Filecoin as an alternative to cloud storage, or just storage as a service, and being told: hey, not only do you need to fill out all this KYC stuff, but now you need to wait four weeks or whatever while we do due diligence and get community buy-in. While it is reasonable to have that barrier of community buy-in and KYC, the amount of time it's taking is not. And not all of that is on specific individuals — it's on the process we designed, because we were approaching it from an angle of a little more caution, to ensure it wasn't easily abusable, while also looking at ways in which it would make sense to engage notaries.
C
I think one of the other things that has changed a lot in the last few months is that the total number of active notaries has gone up fairly substantively. We also happened to run another round of notary elections over that same duration, and we're seeing different patterns, practices, and data emerging on behavior as we get more and more notaries, many of whom have been much more active over the last several weeks.

C
So we're taking some inspiration from the individual notary interaction flow — a little more than we were before — to streamline things and reduce the friction both for notaries and for clients. And specifically, I guess, targeting clients.
C
Instead of waiting for that seven-notary barrier of due diligence, plus the same set of seven having to sign again, a bot could effectively validate the answers and ensure that enough information is there — that the application itself is not missing anything or incomplete in any way. This can, of course, get more sophisticated over time, but it could also immediately fire off the creation of a multisig with all notary addresses on it — all notaries that have opted in, or the set that haven't opted out, depending on how we want to frame it — so that as many notaries as possible could be on the multisig for this particular client. The approval threshold would be set there too, as opposed to what it is today, which is four. The driving mechanism behind the threshold of two is basically that we ensure there's a proposal and an approval message.
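Sketched as data, the proposed construction might look like the following. The signer addresses in the test are fabricated, and the default threshold of two reflects the proposal-plus-approval pairing described above; the field name mirrors the Filecoin multisig actor's approvals threshold, but this is an illustration, not the actual creation code.

```python
def multisig_params(opted_in_notaries: list, threshold: int = 2) -> dict:
    """Build creation parameters for a per-client allocation multisig.

    Every opted-in notary becomes a signer; `threshold` approvals (one
    proposal message plus one approval message) release an allocation.
    """
    signers = sorted(set(opted_in_notaries))  # dedupe, stable order
    if len(signers) < threshold:
        raise ValueError("need at least as many signers as the threshold")
    return {"signers": signers, "num_approvals_threshold": threshold}
```

Keeping the signer set wide while the threshold stays low is exactly the trade described: many eligible watchers, but only two signatures needed to move.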
C
We ensure that there's some amount of watching happening, so it's not just one individual notary — otherwise it's no different from the individual notary process. But it's also about evolving the role of the notary a little, to be more curious about how clients interact on the network.

C
It means being more focused on data-driven decision making over a period of time, as opposed to being too much of a gatekeeping role right at the start of a client's adoption of the network. So: reducing the burden on the notary up front, getting them more data over time, and letting them react and make decisions as a group or as individuals, but with better inputs into that decision-making process — and also reducing the time to DataCap for the client to get an initial allocation.
C
If there are active notaries willing to, you know, take a bet on this client — as they would have had that client applied directly to them — then the multisig gets created out of band, as long as the actual application is validated. And while we send it over to the root key holders and wait for them to approve the notary sign-offs, at that stage there's still due diligence happening: either from notaries that are excited to support this particular application, or from members of the community that are interested, want to ask questions, or even want to participate in the market making that might happen as a result of this project onboarding data onto Filecoin. And then, effectively, what would happen is that you now have a lower threshold.
C
Each subsequent allocation is also lower friction, but the expectation would be that we focus a little more on collecting the right data, improving our auditing practices, and improving the kinds of inputs we're getting to make decisions as notaries — arming you with tooling and with whatever is required for you to feel confident that a client is actually doing the right thing on the network. We'd index more on that, as opposed to up-front judgments of whether an application looks good, or whether a client is making the time to come to a governance call and talk through everything. So clients get moving faster, and notaries get data quicker, more data over time, and better decision-making practices — at the risk of initial allocations being perhaps looser.

C
That's as opposed to where we are today, where applications get stuck for a long period of time. There are more details, of course, written up in the issue. If you have any specific questions on the process changes being proposed, please let me know now — I'll give you guys 20 seconds while I take a sip of my coffee.
L
These tools in the proposal are great — I think they're definitely going to be helpful. But it still comes down to a human at the end of it. So how might we organize — I think I asked this in the last call — how might we do that better? And it's interesting, because I'm thinking through this as a DAO: how can it be more efficient? What are the motivators for the root key holders and the notaries? And we all have day jobs.

L
So how do we find the time, how do we stay motivated, and what's the incentive? I don't have answers to those questions, but they're probably going to be part of getting more engagement.
C
Yeah, I think that's an excellent point. Each stakeholder role — and there are several in this pipeline, right: there are the root key holders, like you called out; the notaries; the clients themselves, who have decided to apply for this thing and are willing to wait for it.

C
There are the storage providers, who might be shepherding clients and doing BD on their own behalf, because they're offering services on the network — and their clients. So, a bunch of different stakeholders, and each of them, I think, other than maybe the actual storage providers and the clients, is not necessarily doing this as a full-time job. Even in those cases, they might not be.
C
This is an interesting infrastructure decision being taken by a client — and maybe the business of a particular service provider — but the decision making is happening primarily on the notary front. So I think what it comes down to is a combination of ensuring that notaries are still empowered to make impactful decisions and have a valued, valuable role in the ecosystem, while also reducing that burden as much as possible, so as not to interfere with other responsibilities or create unnecessary friction — moving tasks to automation, et cetera. Galen, do you have any thoughts?

A
Yeah, I think that this —
A
One notary could lean more on this kind of application process, or this question, or this type of client, and another notary could choose to go in a different direction. I think being able to let both of those things happen — and then asking, what's working for you over here, or what are your hurdles with proposals like this — is valuable. Our hope is that we bias toward trying things that feel low enough risk to learn from, and then find ways to ask: what did we learn?
L
While Danny's finding his hand: you know, all the responsibility is often on you guys, and this might sound anti-DAO, but is there any way you can distribute it? Could this be a meritocracy?

L
I'm happy to put my hand up and say I'm going to spend an hour every day, or whatever it might be, and then I can herd the kittens — bring a cohort of notaries together. These tools are wonderful, and they're really going to help with alerting and the statistics, and even the calculation of what the right allocation is, and so on. But I think, at the end of the day, it's still going to come back to humans that need to do something.

L
So maybe that's an idea, and perhaps we could just test it. But we have to do something, and we have to help you with it — with more than just advice, I think.
C
Like
that's,
I
think,
that's
that
was
part
of
like
having
like
a
lead
notary
as
well
like
the
shepard,
like
a
set
of
numbers
for
like
the
multiseg
to
begin
with,
and
I
definitely
think
like,
I
think,
like
part
of
this,
like
we,
where,
as
a
program
like
falcon
plus,
is
like
fairly
young
and
as
a
network,
falcon
itself
is
also
fairly
yeah,
like
it's
not
even
a
year
right
that,
like
we've,
been
up
in
mainnet
and
experimenting,
and
already
we
have
like
such
an
insane
size
like
marketplace.
C
I think we have a pretty unique opportunity right now to test things at a scale far beyond where we are today. We're operating at small percentages of what the network is capable of holding, so treating this as a time when we can play around with these constructs is, I think, really good. So if you have ideas around how you might want to structure notary-set interactions, Meg, it's definitely worth proposing and talking about, and seeing how we could go about doing it. In general, moving towards more DAO-like behaviors is probably the direction most of us think this program is going to go, and so I view this proposal as a step in that direction: more reactive, more about looking at actual data and on-chain behavior as opposed to speculation, and more decentralized, distributed decision making, testing things at a scale that goes beyond the power of any one individual entity.
D
We're sort of compelled by the tooling to do this in a somewhat batched way, or at least I am. I have a day job as well, and so I tend to just do these in a compact set. So, just as a footnote, I think we should try to make the tooling a bit more asynchronous, so that we are notified either on the main Slack channel or individually, because I'm not sure the GitHub notifications are doing that.
C
Awesome, thank you. I did want to highlight a couple of the discussion points that have been raised in the issue for a few minutes, and then I know we have a couple of clients on the call, so I'd love to turn it over to them.
C
Just briefly, the first discussion point that came up a fair bit was around the calculation of the allocation limits that exist today. The initial LDN mechanism had this tiered structure: a first allocation maximum, a second allocation maximum, and then third-and-beyond subsequent allocation maximums for the client. There were some interesting proposals brought forth on how we might want to restructure that, like whether we increase the slope.
C
Basically,
if
that
function
substantively
or
change
it
to
basically
be
based
on
like
target
onboarding
rates
that
we
adjust
based
on
behavior
of
the
client
on
the
network,
as
opposed
to
you
know
what
they
wrote
in
an
application
at
the
start
of
their
project.
So
there
were
some
interesting
ideas
that
would
be
awesome
to
get
some
more
opinions
on,
as
we
continue
to
find
and
change
and
experiment
with
this
process.
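A minimal sketch of the two allocation models under discussion, in Python. The tier values, growth factor, and runway window are hypothetical placeholders for illustration, not the actual LDN parameters:

```python
# Hypothetical tier values (in TiB) -- illustrative only, not the real LDN limits.
TIERED_MAXIMUMS = [100, 200, 400]  # first, second, third-and-beyond allocations


def tiered_limit(allocation_number: int) -> int:
    """Current-style model: a fixed maximum per allocation round,
    flat after the third allocation."""
    index = min(allocation_number - 1, len(TIERED_MAXIMUMS) - 1)
    return TIERED_MAXIMUMS[index]


def rate_based_limit(weekly_onboarding_tib: float, weeks_of_runway: int = 2) -> float:
    """Proposed-style model: size the next allocation from the client's
    observed on-chain onboarding rate, rather than from the numbers
    written in the original application."""
    return weekly_onboarding_tib * weeks_of_runway
```

The contrast is the point: the first function is fixed at application time, while the second adjusts each round to measured behavior.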
C
How do we actually want to adjust this? I know Galen added some responses on why we initially thought it might be a good idea, so I would urge you again to take a look at the issue and see if you have any thoughts. Basically, the idea is that we can start to shift a bit more in the direction of notaries being watch-persons of the network, and push the notary role towards the enduring, longer-term responsibility of ensuring clients are successful on the network, as opposed to the short-term responsibility of ensuring that clients are initially trustworthy on the network, and use that as a motivation to adjust the structure of the interaction between a client and a notary over time.
C
There
were
also
points
raised
around
whether
or
not
we
want
to
ensure
like
notaries
themselves,
who
are
able
to
sign
things
like
lose
or
gain
the
ability
based
on
like
other
notaries,
that
may
be
interacting.
So
there
were
proposals
around
wanting
to
have
geographic
distribution
of
notary
signing
or
urged
like
notaries
like
ensuring
that,
for
example,
one
of
the
notaries
has
to
be
from
like
the
legion
that
the
client
is
applying
from
which
are
interesting.
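One of the proposals above, requiring at least one signing notary from the client's own region, can be sketched as a simple validation check. The threshold of seven matches the LDN process discussed later in the call, but both it and the region labels are assumptions here:

```python
def signatures_valid(client_region: str,
                     signer_regions: list[str],
                     threshold: int = 7) -> bool:
    """Check a proposed signing rule: enough distinct signatures overall,
    and at least one signer from the same region the client applies from."""
    if len(signer_regions) < threshold:
        return False  # not enough notaries have signed yet
    return client_region in signer_regions
```

A rule like this is easy to verify off-chain against the signature log, which is part of why region-based proposals came up.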
C
These were interesting proposals around where the individual expertise of a notary could be leveraged to ensure the client is being served accurately on the network. The flip side, though, is that if we are working towards a DAO-like structure where we operate more on actual data about on-chain interaction, that should apply to notaries as well. We have the benefit of the blockchain: it's a log where we can see every single signature from every single notary, and see all the outcomes of that DataCap allocation, the deals that were made from it, and where it ended up. So worrying a little bit less about the initial potential for extreme use cases, and focusing more on enabling the clients that are being blocked, the ones we all collectively believe are worthy of the DataCap, is probably a way to frame this discussion. But anyway.
C
What
I
I
do
want
to
wrap
up,
because
I've
taken
up
plenty
of
time
on
this
topic,
but
yeah.
Next
steps
to
this,
I
think,
are
you
know,
consolidating
any
feedback
that
you
have
that
you've
shared
here,
that
other
participants
in
the
call
in
the
morning
shared
and
then
making
any
adjustments
to
the
proposal
and
sharing
those
in
the
issue.
C
We'd
love
to
see
continued
engagement
there
because,
I
think,
ideally
like
we
can
arrive
on
some
sort
of
next
steps
or
like
some
sort
of
consensus,
at
least
around
like
what
we
think
might
be
like
the
next
iteration
of
this
or
the
v
1.1
or
v
1.5
effectively
of
the
ldn
process
fully.
You
know
going
in
eyes
wide
open
that
we're
likely
to
continue
change,
changing
things,
adjusting
things
based
on
feedback
from
from
you
from
clients
and
continuing
to
propose
ways
in
which
we
can
make
a
similar,
effective
mechanism
to
ensure
clients
have
minimized.
C
Friction
in
adopting
this
awesome
storage
technology
cool
with
that,
I'm
going
to
hand
it
back
to
you,
gail,
and
I
think
jonathan
doden's,
on
the
call
to
talk
about,
shows
application,
and
we
have
a
few
other
people
as
well.
A
Yes,
that
is
accurate,
so
I'm
going
to
stop
sharing
for
one
second
and
we're
gonna
hear
from
some
client
applications.
So
first
up
jonathan
dutton
look
to
hand
the
mic
over
to
you.
I
Good
afternoon,
can
everyone
hear
me:
okay,
all
right,
great,
well
lots
of
friendly
faces
here
so
hello
to
all,
and
I
I'm
here
to
just
raise
some
awareness
about
the
data
cap,
application
that
we've
put
forward
and
I
think
we're
we're
getting
close
to
getting
enough
notaries
to
join
us.
But
we
thought
yes,
divas
signaling,
six,
so
we'd
love
to
have
just
the
final
push
to
get
over
the
hump
here,
because
we've
got
lots
of
work
to
do
to
start
sealing.
I
Here
we
for
us,
it
would
be
a
tremendous
accomplishment
to
get
this
done.
We've
been
a
big
supporter
of
the
falcon
protocol
since
its
beginnings,
we've
been
at
this
for
three
years
and
figuring
out
how
best
to
onboard
the
data
to
filecoin,
and
I
think,
with
this
project.
The
easiest
way
to
summarize
the
benefits
are
that
there
are
two
things
you
get
for
for
essentially
one
data
cap
allocation.
I
The
first
is
that
we
are
smoothing
the
past
for
other
big
data
providers
to
actually
get
onto
the
network.
So
that
means
we're
examining
all
the
tooling
the
best
practices
we're
working
with
the
storage
providers
directly
to
figure
out
how
to
make
sure
that
we
can
start
sealing
at
scale,
and
that
includes
also
pioneering
some
of
the
first
offline
deals
that
are
at
scale
to
then
be
able
to
distribute
widely
across
the
falcon
ecosystem.
I
No
less
we're
big
supporters
of
different
types
of
auction
and
distribution
methods.
I
see
andrew
hill
here
and
we're
big
fans
of
textile
have
been
for
years
and
and
their
latest
work
in
the
auctioning
space
is
something
that
where
we
would
be
using
this
data
cap
as
well
to
help
advance
their
work
as
well.
So
one
is
that
you're
getting
technology
development
for
the
whole
ecosystem
and
then,
secondly,
of
course,
we're
getting
the
preservation
of
this
data.
I
So
we're
really
excited
to
be
working
with
the
community
on
this,
I'm
happy
to
pose
questions
to
and
wants
to
learn
a
little
bit
more
about
the
program.
But
I
guess
the
big
message
to
you
all
is
that
we're
really
close,
and
so
for
those
of
you
who
have
been
on
you
know.
Some
of
you
have
already
supported,
but
those
of
you
who
are
on
the
fence,
I'm
happy
to
answer
any
questions,
to
try
to
strengthen
the
application
and
potentially
work
closely
with
you
all
on
this
project.
D
Yeah,
so
the
thank
you
very
much
for
for
going
into
the
detail
about
this,
so
you
said
that
the
the
database
as
a
whole
is
four
petabytes
and
you're
asking
for
five,
and
I
was
wondering
if
you
had
any
plans.
D
So
one
of
the
things
that
we
tried
to
do
is
to
make
sure
that
multiple
miners,
the
load,
is
shared
between
multiple
miners.
Are
you
planning
to
distribute
the
four?
Are
you?
Are
you
thinking
about
having
duplicate
copies,
or
are
you
going
to
sort
of
distribute
the
the
four
petabytes
in
total
over
a
number
of
miners.
I
The
it's
the
latter
option.
The
reason
we
asked
for
a
little
bit
more
than
four
petabytes
is
because
we
have
some
experiments
that
we've
been
running
in
smoothing
the
path
for
the
big
lift
so
yeah.
I
I
think
the
the
the
answer
in
terms
of
the
distribution
pattern
is
that
we
have
several
minors
from
the
minor
x
program
or,
I
should
say,
storage
providers
from
the
original
minor
x
program
and
they
step
forward
to
help
us
with
some
of
the
tooling
and
some
of
the
processes
and
then
from
there.
We
have
a
plan
of
rapidly
scaling
to
go
beyond
others,
but
as
you're
pointing
out
danny
it's
it's
one
copy
that
we're
spreading
across
multiple
storage
providers
as
opposed
to
duplicate
copies.
C
Well, that could be something we should address in any additional changes we make as well. Part of that limit was a protection; it was a scale test, intended to help limit the downside of an experiment gone wrong. So if we're building confidence in our processes and adjusting things, then we should also adapt and maybe change those limits over time. Just something to think about. Yeah.
I
Some
of
the
things
that
we're
thinking
about
danny
that
are
on
the
horizon
and
we'll
we'll
pilot
it
with
at
least
some
of
the
data
cap
that
we've
got
is
looking
at
larger
file
sizes.
So,
specifically,
we
have
like
giant
photogrammetry
files
that
are
massive,
and
we
also
have.
I
This
really
cool
program
called
dimensions
and
testimony
which
is
an
ai
driven
engine
that
takes
survivors
of
genocide
into
that
engine.
Then
you
can
actually
ask
them
questions
in
an
interactive
format.
It's
pretty
cool.
It's
a
60
minutes
did
a
whole
piece
on
this.
It's
quite
extraordinary
and
those
file
sizes
are
giant,
and
so
the
idea
is
to
actually
break
them
apart
and
then
spread
them
over.
I
You
know
several
sectors,
none
of
that
we're
actually
working
on
right
now,
because
we're
looking
at
jpeg
2000
masters
that
are
in
the
you
know
several
gigabytes
in
size
for
each,
but
those
larger
virtual
reality
and
testimonies
are
are
certainly
really
exciting.
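The idea of breaking a giant master file apart to spread it over several sectors reduces to chunking it to a fixed per-sector payload size. The 32 GiB figure below is a common Filecoin sector size, but treat it and the function names as assumptions for this sketch:

```python
SECTOR_PAYLOAD_BYTES = 32 * 1024**3  # assume 32 GiB sectors


def plan_chunks(file_size_bytes: int, chunk_bytes: int = SECTOR_PAYLOAD_BYTES):
    """Return (offset, length) pairs covering the whole file, one per
    sector-sized chunk, so each piece can be sealed and stored
    independently and later reassembled from the offsets."""
    chunks = []
    offset = 0
    while offset < file_size_bytes:
        length = min(chunk_bytes, file_size_bytes - offset)
        chunks.append((offset, length))
        offset += length
    return chunks
```

For example, a 70 GiB file would plan out as two full 32 GiB chunks plus one 6 GiB remainder, each of which can go to a different storage provider.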
I
We
just
so,
I
think,
we'll
have
data
cap
to
be
able
to
test
out
how
that
might
work,
but
we're
the
the
main
goal
for
this
one
is
to
to
make
good
on
the
promise
that
we've
been
we've
had
for
quite
some
time,
which
is
to
get
the
entire
show
foundations
archive,
which
I
should
just
briefly
mention.
If
you
don't
know
those
of
you
here,
it
covers
10,
different
genocide
collections
from
the
armenian
genocide
at
the
beginning
of
the
20th
century.
A
Yes, thank you, Jonathan, thank you for your time. It looks like now, with those two additional notaries, we've hit threshold, so Deep and I will work on going through and kicking off that process. Exciting to hear, and we appreciate your time. Thank you. And I think you are also in the GitHub issue, so if there are any questions that come up, you can see those there. Were there other clients?
C
Sorry,
yeah,
sorry
to
end
up
going
you're,
just
gonna
open
the
floor
to
other
clients.
C
I
have
an
open
application
with
six
notaries
waiting
for
a
seventh
that
I'm
happy
to
talk
about,
which
is
viewbots,
so
in
general,
like
one
of
the
things
that
is
gonna,
be
gonna,
become
very
compelling
and
very
important.
Part
of
the
general
like
deal
making
mechanism
and
flywheel
for
falcon
is
reputation,
since
this
is
anonymized
peer-to-peer
transactions
happening.
C
Data
that
is
collected
on
chain
about
the
behavior
of
people
on
the
other
side
of
a
contract
becomes
extremely
valuable,
and
so
specifically,
as
we
look
to
enable
like
clients
on
the
network
on
a
journey
who,
who
are
finding
ways
to
identify
reliable
storage
providers
on
the
network,
ensuring
that
data
is
available
to
those
clients
about,
like
how
storage
providers
behave
in
the
short
term
or
at
least
from
what
we've
seen
so
far,
becomes
like
a
really
compelling
data
point
and
a
very
important
data
point
in
enabling
their
decision
making
to
identify
where
they'll
store
their
data.
C
How
they'll
store
that
data?
Of
course,
there's
a
bunch
of
clients
going
via
abstraction,
like
services
that
make
these
decisions
for
them.
For
example
like
what
andrew
and
his
team
are,
building
would
effectively
enable
those
decisions
to
be
made
on
behalf
of
a
client,
but
still
several
clients
are
still
coming
directly
to
the
network
and
end
up
leaning
on
resources
like
fieldwrap.io.
C
The
falcon
reputation
service
that
exists
today,
as
well
as
other
similar
data
that
exists
from
on
chain,
not
necessarily
scanners,
but
like
dashboards
and
analyzation
tools
to
make
those
decisions,
and
so
the
datacap
app
application
that
I
have
opened
right
now
is
for
having
deal.
C
Bots
run
that
effectively
simulate
client
behavior
on
the
network
by
making
deals
over
extended
periods
of
time,
with
as
many
miners
as
possible,
collect
that
data
and
make
that
available
via
a
sidechain
store
for
any
reputation
service
to
be
built
on
or
to
incorporate
in
the
methodology
or
the
calculation
that
a
reputation
service
may
be
using
to
identify
whether
or
not
a
service
provider
is
likely
to
actually
be
reliable
over
a
period
of
time,
and
the
reason
this
is
important
is
because,
just
doing
like
a
query
ask
is
like
very
difficult.
C
You
know
that
what
a
client
sees
when
they
just
want
to
see
whether
or
not
they
can
make
a
deal
with
you
is
very
different
than
when
they
actually
try
to
make
a
deal,
and
that's
because
there's
like
a
lot
of
complexity
in
the
system,
there's
a
lot
of
like
challenging
aspects
to
interacting
with
you
know
a
client
address
in
a
minor
advice,
and
so
because
of
that
complexity
actually
making
the
deal
is
the
best
way
to
know
if
a
deal
would
actually
go
through,
and
so
that's
that's
the
purpose
behind
this
one.
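The dealbot idea, actually attempting deals rather than trusting query-ask responses, reduces at its core to tracking per-provider outcomes over time and deriving a score from them. A minimal sketch, with hypothetical names, of what such a reputation input could look like:

```python
from collections import defaultdict


class DealOutcomeTracker:
    """Record real deal attempts per storage provider and derive a
    simple success-rate score from the observed outcomes."""

    def __init__(self):
        self.attempts = defaultdict(int)
        self.successes = defaultdict(int)

    def record(self, provider_id: str, succeeded: bool) -> None:
        """Log one attempted deal and whether it actually completed."""
        self.attempts[provider_id] += 1
        if succeeded:
            self.successes[provider_id] += 1

    def score(self, provider_id: str) -> float:
        """Fraction of attempted deals that completed; 0.0 if never seen."""
        if self.attempts[provider_id] == 0:
            return 0.0
        return self.successes[provider_id] / self.attempts[provider_id]
```

A real reputation service would weight recency, deal size, and retrieval checks too; this only illustrates why attempted-deal data is a stronger signal than a query-ask.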
A
Make
looking,
I
know,
I'm
looking
back
through,
and
so
I
think
that
I
think
that
we
have
sorry
starting
at
the
bottom.
I
think
we've
got
neo,
mag,
danny
and
then
another
meg
and
then
zht,
bite,
bass
and
bras
and
then
now
andrew
hill
as
well.
So
andrew
puts
us
at
seven.
Thank
you.
You
don't
need
to
get
to
it
until
the
morning,
because
the
message
is
next
going
to
get
kicked
over
to
the
root
key
holders.
You
don't.
A
Yeah,
yes,
you
don't
need
to
do
anything
right
now,
andrew,
but
I
appreciate
it
so
we've
got
an
andrew
commented
so
get
them
across
the
board.
C
Yeah
favorite
kind
of
dos,
the
one
that's
done
by
somebody
else,
accurate
awesome.
I
know
phase
phase
waiting
on
rookie
holders
no
actually
face
should
have
data
cap
allocation.
At
this
point
I
don't
know
laura
is
correct,
right,
laura
or
joe.
If
there's
anything
you
were
here
to
share
in
particular
as
well.
If
you
are
waiting
on
allocations
or
have
data
cap
applications,
you
might
be
following
in
the
future,
please
feel
free
to
here.
If
not,
I
think
dylan
is
there
anything
else
we
wanted
to
flag
or
talk
about?
C
It
might
be
worth
pulling
up
that
table
to
see
where
we
are
with
the
current
ldn
apps,
as
a
gentle
reminder
for
the
notaries
that
are
here
to
see,
if
there's
anything
that
catches
their
attention.
C
So
there'll
be
10,
open
apps.
Basically,
once
the
deobot
and
the
showa
foundations
are
closed,
there's
still
a
few
more
very
close
right.
There's
the
shanghai
information
technology
company
and
there's
several
of
them
at
three
and
four
as
well.
I
know
charles
was
here
this
morning
to
talk
about
the
phil
swann
project,
so
for
those
of
you
that
didn't
make
it
to
that
one,
I
would
recommend
checking
out
the
recording
as
soon
as
we
get
it
up
in
the
next
day
or
two
to
see.
A
Yeah
and
then
otherwise,
there
were
just
other
sort
of
open,
outstanding
questions.
D
So
I'm
not
even
putting
my
hand
up
anymore,
so
I'm
pretty
sure
that
all
of
us-
or
some
of
us
like
keeping
track
of
the
notary
signatures
in
these
github
issues
for
the
large
data-
and
I
was
wondering
for
those
of
you,
like
galen,
who
was
sort
of
doing
it
more
assiduously-
is
there
a
way
that
we
could
like
share
your
your
tally.
So
we're
not
duplicating
effort.
When
I
look
at
these
things
and
sort
of
mentally
calculate
yeah,.
A
So
so
one
of
the
things
that
I've
been
looking
at
github
has
the
ability
to
create
projects
and
project
boards,
and
then
you
can
kind
of
group
things
and
do
a
little
bit
more
of
like
project
tracking
as
to
like
what
things
are
open.
What
things
are
pending
and
that
yeah
similar
to
like
a
trello
board,
I've
been
kind
of
investigating
what
it
would
look
like
to
do
that
with
labels,
but
it
would
mean
introducing
a
lot
more
labels
to
basically
have
a
label.
A
That
says
you
know
one
notary
right
and
then
you'd
have
to
go
through
and
add
another
label.
That
says
you
know
notary
count
two.
So
it's
still
like
a
lot
of
manual
lift,
but
that
may
help
one
of
the
reasons
we've
been
sort
of.
I
I
have
been
waiting
to
do.
That
is
one
seeing
what
happens
with
this
process
proposal
to
see
if
the
threshold
stays
at
seven,
because
it'd
be
a
lot
of
lift.
A
If
we're
going
to
change
that
and
then
similarly
two
there's
we're
investigating
moving
some
of
our
dependencies
off
of
github
in
the
kind
of
next
year,
q1
q2
time
frame,
and
so
similarly
it
doesn't
make
the
most
sense
to
re-tool
the
way
that
we're
tracking
these.
But
yes
to
your
point,
like
as
it
is
right
now
using
an
issue
and
using
just
a
human
comment,
makes
it
challenging
to
to
stay
on
top
of
sort
of
what
the
count
is.
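Pending better tooling, the manual tally described here can be approximated by scanning an issue's comments for approval phrasing and counting distinct commenters. The marker string and the threshold of seven are assumptions; a real version would pull the comments from the GitHub issues API:

```python
def count_notary_approvals(comments, marker="request approved", threshold=7):
    """comments: iterable of (author, body) pairs from a GitHub issue.
    Counts distinct authors whose comment body contains the approval
    marker, and reports whether the signing threshold is met."""
    approvers = {author for author, body in comments if marker in body.lower()}
    return len(approvers), len(approvers) >= threshold
```

Duplicate comments from the same notary are deduplicated by author, which is exactly the part that is error-prone when tallied by eye.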
A
I
have
a
spreadsheet,
I
think
a
number
of
different
notaries
do
the
same
thing.
I'd
love
to
hear
if
somebody
has
great
github
tricks
or
experience,
I'd,
be
happy
to
hear
other
ideas.
C
Yeah,
I
think
the
shortcut
for
now
is
probably
just
maybe
a
public
spreadsheet
that
anybody
can
edit
or
share
it
with,
like,
I
said,
are
no
reason
interested
you
don't
like
the
idea
of
anybody.
What
like
anyone
can
go
update
the
note
of
the
account
it's
not
going
to
actually
influence
unchained
stuff
until
it's
verified.
L
Well, if we could use something like Trello, it probably wouldn't be as much work. If the lead were delegated the responsibility to manage, to herd the kittens, then they'd be taking each issue and moving it through. So maybe it's eight categories across the top, signed by one, two, three, four, and so on, and the lead is just moving them across.
A
Yeah, to your point, that's a program management, project management type of thing. There are tools for it; it's not impossible, and it is an amount of work that anyone in the group could take on. Another thing: the Foundation is working on increasing our staff and hiring some additional people, including community managers for a variety of governance, FIPs, and Filecoin Plus related things.
A
Getting subsequent allocations passed may be the higher-leverage set of work, where a lead notary could really be shepherding subsequent allocations and audits, rather than the lead notary, or, to your point, a cat herder, focusing on the front end of the funnel, which is new applications coming in and getting notary traction around them. And it may be that those are just two separate people. As we get close to our time here, I really appreciate this conversation.
A
Great
great
points
all
around
fantastic.
Thank
you
guys
for
joining.
This
recording
will
go
up
along
with
one
from
this
morning.
Please
go
check
out
issue
217,
see
if
there
are
some
additional
comments
or
questions
that
you
have
there
there's
a
lot
more
details
and
a
lot
of
good
discussion
happening.