From YouTube: Filecoin Plus - Oct 12 2021 Notary Governance Calls
Description
Want to get involved in Filecoin and Filecoin Plus? Check out our community Slack channels at https://filecoin.io/slack, and learn more about Filecoin Plus at https://plus.fil.org/
A: Awesome. Hey everybody! Welcome to the notary governance call, the early governance call for today, Tuesday, October 12th. Unfortunately, Galen is not able to make it today, so I'll be running this on my own, but I'm happy to have all of you here and excited to get started. I'm just going to share my screen and we'll take it from there.
A: Cool, can somebody give me a thumbs up that they can see the agenda? Awesome, thank you very much. So, as usual, we will start out with the metrics. I bumped up the six-month program goal updates that we started doing two weeks ago, to cover that as part of the metrics and where we are in the progress of the program. Then we'll talk about some improvements to LDN application tracking, and then have some time for discussions and questions.
A: The agenda for this call is a little front-loaded, so it shouldn't end up being that packed, and hopefully we'll have good time for some lively conversation, because there are a couple of topics that at least I know I'd like to propose, and I know some of you are also thinking about some. So with that, let's get started. Of course, if you have any questions, feel free to raise your hands. All right, on to the metrics front.
A: Looking at changes to time to DataCap and responsiveness in general, our metrics are roughly the same, plus or minus within an error difference. As you can see, the numbers are still within about 12 hours of each other for the most part, so no substantive change here. Sorry, did somebody say something?
A: Okay. I do want to add that this is again aggregating from several months of data. We are trying to get just the last four weeks' or last month's worth of data published as well, but for that I think a better indicator is actually this table here, so we'll get to that in a second. But yeah, in general.
A
You
know,
study
stated
on
this
front
in
terms
of
the
actual
like
funnel
and
the
utilization
of
the
data
cap
that's
being
handed
out
today.
It's
roughly
the
same
for
both
ldn
and
non,
which
is
at
about
30
to
35
percent
of
the
data
cap
being
given
to
clients
is
being
used
in
deals.
What
that
does
mean?
Is
you
know
between
the
last
couple
of
weeks
or
so?
A: We finally crossed one pebibyte of data stored through Filecoin Plus on the network, which is awesome, a great milestone for the community to hit. But we are continuing onwards and upwards: as you all know, we're trying to work together to get this to the next magnitude of operation, whether that's increasing the availability of DataCap or increasing the amount of DataCap used in making deals, so that clients that deserve a frictionless or friction-free experience onboarding onto Filecoin have that opportunity today.
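The 30-to-35-percent figure above is just the ratio of DataCap spent in deals to DataCap granted. As a hedged illustration only (the function name and the input numbers below are made up for this sketch, not taken from the real dashboards):

```python
PIB = 2 ** 50  # one pebibyte (PiB) in bytes

def datacap_utilization(granted_bytes: int, used_in_deals_bytes: int) -> float:
    """Fraction of granted DataCap that has actually been spent in deals."""
    if granted_bytes <= 0:
        raise ValueError("granted DataCap must be positive")
    return used_in_deals_bytes / granted_bytes

# Hypothetical figures in the spirit of the call: ~3 PiB granted, ~1 PiB used in deals.
print(f"{datacap_utilization(3 * PIB, 1 * PIB):.0%}")  # prints 33%
```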
A: I also wanted to call out some of the stats in this bottom row here. Currently, if a client applies via the manual verification pathway instead of the other one (so they go to the app and ask for an allocation, or they go straight to the client onboarding GitHub repo), about 63% of those applications get a response from a notary. That's typically the response where the notary says: hey, here's my due diligence process, here are additional questions I need you to answer first.
A: I do think that the first box should be a much higher number, but I think it's also hard to track, right, because we've seen that clients will sometimes go and delete GitHub applications. That's one of the topics I want to bring up later on, but it makes it sort of hard to track how frequently a notary responds, or to ensure that notaries are actually responding to requests that are coming in. As for the success rate of those applications, I don't think that number is too low or too high per se.
A
I
would
actually
look
to
you
folks
to
see
if
you
think
that's
reasonable
for
the
nerd
reason
and
the
call
or
the
notaries
that
are
watching
this
recorded,
because
it's
very
likely
that,
like
not
100,
should
be
getting
the
data
cap.
So
I
understand
that
there
should
be
some
filter
and
I'm
glad
that
we're
able
to
administer
like
a
due
diligence
process
and
make
decisions
based
on
that.
A
That's
awesome
on
that
note:
let's
take
a
little
deeper
look
into
individual
military
activity
and
and
how
that's
been
going
in
the
client
onboarding
repo.
So
as
of
the
last
two
weeks
we've
seen
about.
A: It's something like in the low 30s for the last two weeks, and the amount that have the total-granted label right now is 202, which is interesting, because two weeks ago, when we pulled the same data, that number was 210. That definitely means that either labels are being changed even after the DataCap is granted, or, the more likely outcome, people are actually removing the GitHub issues.
A: I'm not entirely sure why this is happening, so I would love to get some insight if people know, but it does look like clients, after applying for DataCap and even successfully receiving it, will sometimes go and delete their application.
A
This
is
interesting
because,
like
all
that
info
is
still
tracked
right,
like
we
still
have
for
those
of
us
that
are
subscribed
with
email,
notifications
still
have
all
that
content
and
email,
and
we
have
bots
and
dashboards
tracking
the
stuff,
so
we're
still
aware
of
who
the
clients
are
and
what
they're
doing
with
the
data
cap.
What
the
addresses
are
that
received
it?
A
It's
just
interesting
that
they
would
want
to
remove
the
actual
issue
so
yeah,
if
you
folks,
have
any
ideas
on
that
one,
and
I
have
a
sense
of
why
that
might
be
happening.
I
think
it
would
be
great
if
you
could
share
on
the
right.
We
pulled
the
same
table
that
we
used
to
from
textiles
notary
stats
repo.
So
this
is
more
like
individual
statistics
on
where
we
are
with
each
of
the
individual
notaries.
The
amount
of
allocations
they've
grounded
how
long
it
typically
takes
for
them
to
grant
data
cap.
A
How
many
are
currently
open
and
then
the
amount
of
time
it
typically
takes
for
a
notary
to
make
a
decision,
which
is
the
one
right
over
here
and
then
there's
the
sort
of
the
amount
that
don't
get
data
cap,
which
is,
I
think,
fine
to
look
at
as
like
an
indicator
of
selectiveness.
I
think
these
numbers
are
pretty
high
across
the
board
and
I
think
that's
great.
A
We
also
went
through
a
phase
where
I
think
clients
were
spamming,
a
bunch
of
notaries
and
getting
a
lot
of
duplicate
applications
out
there,
and
so
a
lot
of
that
I
think,
gets
attributed
here.
But
you
know
we'd
love
to
see
steps
being
taken
to
sort
of
improve.
Some
of
these.
A: A lot of folks here are in the high tens, or at least low tens, but we often talk about wanting to do this in a couple of days. So instead of it being a 'hey, why are you taking so long?', I think the conversation should be: what are the things that could be done...
A
That
would
help
you
take
less
time
like
what
do
you
think
you're
blocked
on
as
a
notary
or
where
would
you
want
more
information
for
you
to
feel
like
you
can
comfortably
make
a
decision
like
regardless
of
whether
or
not
it's
giving
somebody
data
cap
like
sure
it
might
be
that
you
don't
want
to
give
them
data
cap
or
it
might
be,
applications
that
come
across
more
like
suspicious,
requiring
additional
due
diligence
that
end
up
like
requiring
more
time,
but
something
should
happen
from
a
tooling
perspective
or
general
process
perspective,
or
maybe
even
like
client
education
perspective
that
we
can
do
together
as
a
community
to
improve
the
quality
of
the
incoming
applications,
as
well
as
the
the
interaction
points
between
the
youtube.
A
Cool
with
that
diving
into
sort
of
where
we're
headed
for
the
six
month
goals,
we've
talked
about
these
for
the
last.
You
know
two
or
three
calls
now
just
updating
each
of
the
current
statuses
and
and
chatting
through
sort
of
what
we
should
be
looking
forward
to
so
for
the
first
focus
area
of
like
getting
more
volume
of
data
onboarded
through
the
falcon
plus
program,
we're
targeting
500
pebby
bytes,
that,
of
course,
being
across
ldns
and
clients.
A
I
think
we're
quite
a
ways
off
still:
it's
certainly
a
slightly
lofty
goal,
but
I
do
think
it's
a
really
good
thought
exercise
at
the
very
least
in
terms
of
thinking
about
what
needs
to
change
in
order
for
our
systems
and
our
processes
to
actually
scale
to
support
this
kind
of
load-
and
I
know
I'm
saying
six
months
here,
but
at
this
point
it's
really
five
months,
given
that
we've
been
talking
about
this
for
nearly
a
month
and
so
we're
continuing
to
edge
closer
to
I
don't
know,
maybe
we
should
set
a
date
like
maybe
it's
march
or
next
year,
or
something
but
yeah
we're
continuing
to
do
the
edge
closer
towards
it,
and
we
should
practically
start
thinking
I'd
love
to
see
if
there
are
any
ideas,
either
in
the
call
right
now
or
in
github
discussions
about
ways
in
which
we
can
level
this
up.
A
The
next
focus
area,
which
is,
I
think,
my
personal
most
interesting
one
at
the
moment,
which
is
time
to
data
cap,
curious
to
see
ways
in
which
we
can
get
time
to
data
cap
down
substantively.
So
just
to
be
clear.
This
goes
back
to
sort
of
the
actual
client
onboarding
experience
so
like.
A
Ultimately,
the
point
of
filecoin
plus
is
to
ensure
that
that
clients
that
should
be
receiving
data
capper,
should
have
access
to
making
views
on
the
network
are
able
to
get
that
as
easily
as
possible,
and
I
often
draw
this
comparison
where
like.
If
you
go
to
like
a
cloud
offering
today
like
a
web
2
offering,
and
you
want
to
store
data,
it's
like
you
just
able
to
do
that.
A
You
create
an
account
and
you
can
store
data
and
then
once
you
hit
certain
limits,
you
get
in
touch
with
the
support
team
and
you
talk
to
them
for
like
a
couple
of
hours
and
then
within,
like
usually
12
hours,
sometimes
24.
If
additional
kyc
is
required,
you
then
get
like
maximal
quarter,
requests,
etc.
So
that
you
can,
you
know,
leverage
that
offering
as
best
as
possible,
and
so
when
a
client
comes
to
falcon,
and
it's
like.
A
Oh,
you
can't
really
do
anything
or
you
can
sort
of
get
started,
but
it's
not
going
to
be
as
good
as
you
you'd
wanted
it
to
be,
or,
like
the
ideal
part
that
you
sort
of
had
was
you
know,
to
go,
get
data
cap,
and
do
this
then
sort
of
expecting
that
it
would
be
an
hour
or
a
less
than
a
day,
at
least
right
first
for
like
a
higher
magnitude,
and
so
like
orienting
around
like
how
do
we
actually
align
on
an
improved
client
user
experience?
A
How
do
we
align
around
a
high
quality,
client,
onboarding
story
to
enable
the
growth
of
the
of
the
network
and
the
productivity
of
the
network?
I
think,
is
a
key
sort
of
focus
area
for
us,
and
so
the
the
sort
of
metrics
that
we
we
talked
about
several
weeks
ago
were
getting
this
zero
to
one
time
you
by
down
to
like
less
than
an
hour
and
the
prompt.
A
There
was
like
the
discussion
topic
that
was
started
on
like
how
do
we
actually
scale,
auto
verifiers
or
automated
notaries,
up
to
being
able
to
do
that
and
like
what
needs
to
go
into
that
from
a
due
diligence
perspective
that
we
all
feel
comfortable
with
the
risk
that
comes
with
giving
somebody
one
like
heavyweight
data
cap
and
then
there's
the
scale
beyond
that
which
is
like
up
to
like
a
hundred
terabytes
of
data
cap,
which
today
is
typically
when
somebody
will
either
apply
it
through
the
ldn
path
or
through,
like
a
directly
to
a
notary.
A
But
waiting
like
12
days,
for
that,
I
think
is,
is
close
to
unacceptable.
And
so
then
it
comes
down
to
what
do
we
actually
want
to
do
from
a
tooling
perspective
to
support
those
decisions,
or
how
do
we
want
to
like
change
the
the
interaction
flow
of
the
system?
So
not
only
can
we
start
leveling
up
like
the
role
of
the
notary,
as
we've
talked
about
to
be.
A
You
know
more
of
like
a
guardian
than
a
gatekeeper,
but
then
also
enable
the
notary
to
be
able
to
make
decisions
quicker
or
more
at
scale,
and
one
of
the
conversations
that
came
up
at
the
governance
call
two
weeks
ago
was
around
the
idea
of
having
higher
quality
applications
that
are
coming
in,
that
are
more
region
specific
or
even
more
country
specific.
A
I
think
it
was
a
later
call
from
two
weeks
ago,
if
you
haven't
seen
the
recording,
but
it
would
be
great
to
see
if
there's
other
ideas
on
how
we
could
improve
the
actual
tooling
that
exists
to
enable
militaries
there's
a
some
of
you
in
the
call
right
now.
Some
of
you
watching
this
recording
definitely
would
appreciate
some
thoughts,
ideally
in
a
github
discussion
topic.
A
The
other
sort
of
areas
that
we've
identified
is
one
is,
you
know,
risk
mitigation,
which
is
we
ensure
that
we
have
some
sort
of
on-chain,
tracking
and
analytics
and
auditing
of
actions
that
are
taking
place
both
of
clients
and
and
miners,
but
also
the
rest
of
the
falcon
plus
system
or
economy,
which
goes
all
the
way
from
rookie
holders
to
the
actual
storage
providers
at
the
end
of
it
and
so
having
like
basic
dashboards,
which
we
do
have
today.
A: I have a proposal that I'd like to share on this one, but I'm going to save that for the discussion later on. But yeah, in general, for Filecoin Plus itself: we're looking at 40 pebibytes of DataCap available for allocation, single digits of pebibytes have actually been handed out to clients, and of that, roughly a single pebibyte has actually been used to make deals.
A
The
network
is
well
on
track
to
like
close
to
10
exabytes
of
available
supply
from
for
storage
from
search
providers.
So
it'd
be
great
to
see
one
you
know
like,
as
we
continue
to
level
up
the
utilization
like
what
do
we
need
to
track
to
ensure
that
we're
doing
it
safely,
but
then
also
to
just
in
general?
A: I just think that's happening as a result of us together spending more time on some of these other topics that are a little bit more hot at the moment. But again, ultimately, the success of the Filecoin Plus program, I think, relies substantively on the success of clients that are coming through the pipeline into the system. And so improving the discoverability of the different moving parts, the understanding of tooling, and the understanding of how to actually navigate the system will all, I think, contribute towards improving the user experience.
A
One
of
the
challenges
that
we
have
here
right
now
is
like
reliably
getting
like
client
feedback.
I
think
the
the
places
we
typically
get
it
are
in
slack
people.
Sometimes
you
know
talk
about
issues
that
they're
having
or
ask
questions
or
in
governance
calls
like
this,
where
sometimes
we'll
have
clients
that
will
join
us
and
talk
through
their
experience
and
their
path,
but
in
general
we
we
should
come
up
with
a
more
systematic
approach.
A
I
think
where,
like
whereby
clients
feel
like
they
have
a
platform
to
share
their
opinions,
their
feedback,
their
takeaways
from
the
experience
of
interacting
with
the
falcon
5
system.
So
I've
been
trying
to
do
a
little
bit
of
thinking
on
this
one.
I
don't
necessarily
have
a
great
idea,
but
I
would
love
to
know
if
anybody
in
the
call
does.
A
I
especially
am
looking
towards
notice,
because
you
are
currently
at
least
the
primary
touch
point
that
clients
have
with
the
system.
So
if
you
have
any
thoughts
on
this
would
love
to
hear
them.
A
And
then
last
is
on
the
governance
run,
which
is
we
start
to?
Actually
it's
not
start.
We
have
started
we'd
like
to
continue
scaling
and
sharing
the
processes
that
we
have
in
the
community
today.
A
So
I
think
that
the
biggest
recent
change
here
has
been
that
we
have
started
to
use
both
github
discussions
and
issues
where
discussions
are
where
a
lot
of
the
conversations
happening
on
problems
that
we
care
about
or
like
food
for
thought,
as
opposed
to
like
specific
solutions
to
those
problems
where
we,
I
think
we
believe,
generally,
that
discussions
are
the
place
where
we
can
have
threatened
conversations,
ideas
and
ways
in
which
we
can
discuss
like
creative
solutions
to
problems
that
we
align
on
and
then
issues
turn
into
the
implementation
mechanism.
A
For
like
one
of
those
solutions
where,
if
we're
aligned
on
a
solution,
turns
into
an
issue
and
that's
the
thing
that
we
use
to
track
the
actual
implementation
deployment
or
changes
in
process
to
that
cool
talk
for
a
bunch
of
time,
I'm
gonna
give
folks
a
minute
to
see
if
there's
any
reactions
to
this
table
or
in
general
to
the
metrics
before
I
know,
I
threw
out
a
bunch
of
prompts
for
conversation,
but
not
necessarily
expecting
you
to
react
to
all
of
that
right
now,
just
welcome
them
all
over
it
and
then
talk
about
it
in
you
know
in
the
future.
A
Hey
yeah
great
question
is
the
35
periods
from
ld
and
grantor
applied,
so
that's
technically
like
granted
via
the
old
process
right,
which
is
when
they,
when
a
client
would
show
up
and
say
hey.
I
want
at
least
like
a
maximum
of
x,
pebbly
bytes,
and
then
you
had
the
seven
members
say
yes,
so
it's
technically
approved,
but
it's
not
actually
granted
in
that
it's
still
going
to
go
in
like
stepwise
functions
or
in
like
tranches
to
the
client.
A
So
it's
like
a
bit
of
a
weird
measurement
where
it's
not
like,
necessarily
just
like
how
much
data
gap
is
available
to
clients.
It's
more
that
it's
like
the
clients.
Are.
You
know
asking
for
that.
Much
data
cap
are
in
the
pipeline
and
generally
like
the
process,
will
support.
C
Them
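The "stepwise functions or tranches" mentioned above mean an approved total is released in growing chunks rather than all at once. The call doesn't spell out the exact schedule, so the sketch below assumes a simple doubling pattern with the last tranche clipped to the approved total; the function name and the starting size are illustrative, not the actual LDN rules:

```python
def tranche_schedule(approved_total: int, first_tranche: int) -> list:
    """Split an approved DataCap amount into doubling tranches (an assumed schedule)."""
    if approved_total <= 0 or first_tranche <= 0:
        raise ValueError("amounts must be positive")
    tranches = []
    remaining = approved_total
    size = first_tranche
    while remaining > 0:
        grant = min(size, remaining)  # never release more than what is left
        tranches.append(grant)
        remaining -= grant
        size *= 2  # each tranche doubles the previous one
    return tranches

# e.g. 100 units approved, starting at 10: released as 10, 20, 40, then the final 30
print(tranche_schedule(100, 10))  # prints [10, 20, 40, 30]
```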
yeah,
I
got
a
small
question,
or
maybe
it's
not
a
question.
So
after
the
we
open
the
largely
the
notary
applications,
the
number
of
application
goes
through
our
own
notary
like
decreased
significantly,
because
maybe
because
our
like
kyc
requirement
is
relatively
high,
so
somehow
it's
probably
higher
than
the
the
ldn
so
yeah.
So
so
like
not
many
like
clients
are
applying
through
our
applications.
Right
now,.
C: Yes, so they have no motivation to go through our application. Only if their DataCap requirement is relatively small might they go through us. So the amount of DataCap that we granted over the past few months decreased significantly.
A: Clients seem to prefer the LDN path, because that is technically a little more efficient: you now have a group, right, so instead of applying to an individual and going back and forth with an individual, you have a group that you're working with. It's usually more efficient. But if the standard is dropping, if the bar that we hold clients to for the same grant is much lower, then I think that is a problem, and what we should instead work on...
A
In
my
opinion,
is
you
know
for
the
higher
threshold
or
higher
risk
client
applications?
A
I
think
those
two
things
should
be
relatively
correlated
yeah,
yeah
and
so
that
I
think
that
would
be
the
prompt
for
you.
I
guess
would
be.
What
do
you
think
we
should
do
to
the
application
process
for
that
or
like
how?
How
should
we
change
expectations
for
clients
where,
like
I
don't
know,
maybe
they
show
up
for
depending
on
how
much
data
cap
they
show
up
for,
like?
I
don't
know
how
much
it
like
it.
A: So maybe that application needs to be more robust, or needs to expect more from a client. And that, I think, would not accrue towards longer time, because that's preparation work that we can expect from a client; as long as we educate them and we're upfront about it and public about it, they shouldn't be caught that much by surprise. And then there's the follow-up part of it, which is that we may want to do additional back and forth, which is costly.
A
I
think
there
are
cases
in
which
the
back
and
forth
is
required
right.
If
we
see
an
application
that
is
definitely
like
suspicious,
so
could
be
considered
like
abuse
of
the
system,
then
I
think
it's
worth
tracking
that
and
like
spending
the
time
on
that.
A
But
my
gut
sense
is
that
a
lot
of
the
ones
that
are
coming
in
today,
even
if
they
are
like,
let's
say
fraudulent
we'd,
probably
be
able
to
identify
that
anyway
from
like
unchained
behavior,
for
how
they'd
like
handle
the
next
allocation
and
because
the
first
allocation
is
generally
like
pretty
small
anyway
in,
like
the
grand
scheme
of
things,
I
think
for
me
that
the
focus
turns
instead
to
how
do
we
want
to
audit
the
interactions
like?
A
What
do
we
think
is
considered
good
and
bad
like
client
behavior,
based
on
which
we
can
start
to
say,
hey,
like
you
know,
write
some
basic
automation
that
describes
chain
data
for
that
client
address
and
then
says
you
know,
put
a
flag
and
says
hey.
This
looks
highly
risky,
or
this
client
looks
like
they
might
be
for
art
event.
A: Please do additional investigation before proceeding with more DataCap. To me, that is a better mechanism than going back and forth a lot up front, because going back and forth a lot up front, one, could lead to bias and subjective influences on how decisions happen, and it could create more lag simply because of people being busy with things. And I think it's very hard for somebody who is trying to defraud the system to hide that long-term, at least. The example behaviors we've seen in the past, when this has been attempted, have typically been where clients will show up and then all of a sudden use all the DataCap...
A
They
were
given
in
like
one
day
with
only
one
like
storage
provider
or
they'll.
Do
it
with
like
a
bunch
of
storage
providers
that
didn't
exist.
You
know
24
hours
before
the
deals
started
being
made,
so
new
storage
provider
id
shows
up
all
of
a
sudden,
all
the
deals
and
data
gap
are
attributed
to
these
brand
new
storage
providers.
So
we
we
have
like
ideas
of
like
how,
like
it's
likely
that
the
system
could
be
abused.
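The two abuse patterns described above (an allocation burned in a single day with one storage provider, or spent with provider IDs that appeared right before the deals) can be written down as a simple flagging heuristic. This is a sketch of the idea only, not the actual bot; the record fields and the one-day threshold are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Deal:
    provider_id: str          # storage provider the deal was made with
    day: int                  # day the deal landed (e.g. days since epoch)
    provider_first_seen: int  # day this provider ID first appeared on chain

def flag_suspicious(deals: list, min_provider_age_days: int = 1) -> list:
    """Return human-readable flags for the abuse patterns discussed in the call."""
    flags = []
    # Pattern 1: everything used in one day with a single storage provider.
    if deals and len({d.day for d in deals}) == 1 and len({d.provider_id for d in deals}) == 1:
        flags.append("all DataCap used in one day with a single storage provider")
    # Pattern 2: deals made with brand-new storage provider IDs.
    new_providers = sorted({d.provider_id for d in deals
                            if d.day - d.provider_first_seen < min_provider_age_days})
    for p in new_providers:
        flags.append(f"provider {p} appeared right before the deals")
    return flags

deals = [Deal("f01234", day=100, provider_first_seen=100),
         Deal("f01234", day=100, provider_first_seen=100)]
print(flag_suspicious(deals))
```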
A
I
think
it's
worth
spending
some
time
like
investing
on
that,
and
I
think
one
one
thing
that
you've
been
bringing
up
in
the
past
as
well
has
been
like.
What
do
we
do
with
regards
to
people
that
claim
to
be
storing
a
certain
kind
of
data,
but
we
don't
actually
know
what
they're
storing
right.
So
I
know
you
talked
about
having
them,
send
you
samples
of
data,
for
example.
A
Maybe
we
do
need
to
do
that
in
the
case
that
we
started
to
support
like
encrypted
data
at
scale
and
that
just
becomes
part
of
like
the
expectation
for
client,
which
is,
you
know,
send
your
application
and
then
mail.
This
to
this
address
or
something
like
that
or
you
know
we
come
up
with
with
tooling
that
can
use.
I
don't
know.
A
Theoretically
you
we
could
even
invent
like
a
proof
right
that
looks
at
encrypted
data
and
tries
to
identify
whether
or
not
it
aligns
with
the
expectations
by
looking
at
the
the
binaries
of
the
bits
within
it.
So.
A
But
I
think
part
of
this
is
about
not
being
as
reactive,
upfront
but
being
more
responsive
and
reactive
to
like
data
that
we
gather
about
a
client
over
time,
because
that
also
gives
us
more
information
in
the
rest
of
the
network
and
like
who's,
interacting
with
whom,
like,
ultimately,
this
program
is
about
establishing
trust
so
having
ideas
on
the
different
actors
that
exist
and
how
they're
related
to
each
other
and
then,
as
a
result
of
that,
how
we
track
trust
across
those
things
will
become
like.
I
think,
pretty
paramount.
For.
A: Yeah, but in general, the thing that you're calling out: I agree with you that it's a bit of an anti-pattern. And so I think the takeaway here should be that we should figure out how to make that application process more robust, and how to ensure that we can start building audit tracking, or automation for on-chain behavior tracking, much better and much faster. And then, you know...
A
Maybe
there
are
cases
in
which
we
have
to
have
more
time
and
more
manual
verification
and
that's
okay,
and
we
should
set
that
expectation
both
within
ourselves
as
a
community,
but
then
with
clients
as
well
right
like
if
a
client
is
going
to
show
up
and
ask
for
50
wiser
data
gap.
Maybe
it's
not
so
bad,
but
if
they
show
up
and
ask
for
50
peppy
bites,
then
they
should
expect
that
it
will
take
some
back
and
forth
for
this
community
to
feel
like
they're,
that
trustworthy.
A
Okay,
I
guess
we
keep
going
so
just
wanted
to
do
a
quick
call
out
there's
at
this
link.
So
this
is
completely
public.
Anybody
can
go
here.
The
you
can
also
navigate
here
by
going
to
the.
Hopefully,
you
can
still
see
my
github,
but
you
can.
Click
into
the
large
data
sets
repo
where
all
the
applications
are
coming
in
at
the
top
there's
projects,
and
if
you
look
at
the
projects,
there's
actually
something
that
galen
created.
A
Thank
you
galen
for
when
you're
watching
this
recording,
but
it
basically
tracks
like
all
the
incoming
applications
and
where
they
are
in
different
stages.
So
you
can
see
what's
ready
for
additional
diligence
what's
been
signed
once
what's
been
approved
and
what
we
want
to
be
tracking
on
chain
for,
like
client,
behavior
and
client
address
sort
of
general
auditing
for
deal
making
and
data
cap
utilization.
A
So
this
should
hopefully
help
in
just
like
tracking
the
flow
and
and
generally,
if
you
ever
want
to
get
a
picture
of
where
the
processes
are
and
how,
like
the
new
flow,
is
working
for
clients
where
they're
getting
stuck.
Then
I
think
this
is
a
pretty
good
sort
of
indicator
of
how
things
are
going
before
we
jump
into
this.
I
do
want
to
spend
a
second
on
the
general
changes
to
the
process
and
the
expectations
here.
A
I
got
a
couple
of
questions
from
notaries
in
the
last
week
around
like
how
the
new
process
should
influence
their
actions,
and
so
I
just
want
to
spend
a
second
on
this
one
and
and
see
if
other
notaries
all
start
ideas,
but
in
general
the
the
new
process
is
more
aimed
towards,
like
the
same
sort
of
like
you
know,
evolving
notaries.
A
I
like
this
line
a
lot
like
evolving
them
from
being
gatekeepers
to
guardians
or
like
changing
our
patterns,
a
little
bit
from
being
less
of
like
gatekeeping
up
front
to
instead
being
oriented
around
having
tracking
of
actions
that
are
happening
on
chain
after
data
cases
are
granted
after
deals
have
been
made,
and
so
that's,
I
think,
the
pattern
that
we
want
to
move
towards,
and
so
what
that
means
for
notice
that
are
participating
in
this
process
is
that
one
there's
still
due
diligence
like
there's
still
an
application
process.
A
Clients
are
still
applying,
even
if
a
multi-stick
is
created
for
a
client
application,
and
data
cap
is
being
proposed
to
be
given
to
that
client.
That
doesn't
mean
they
should
receive
that
data
cap
right.
A
There
is
still
an
expectation
that,
before
you
sign
that
thing
you're
at
least
looking
at
the
application,
because
nobody
else
is
because
that's
still
the
responsibility
of
a
note
3
today,
which
is
really
the
application,
or
unless
the
community
members
looked
and
already
flagged
something
it's
probably
worth
taking
a
quick
scan,
at
least
making
sure
it
makes
sense
to
you
ensuring
that
it
doesn't
seem
like
immediate
cause
for
fraud
or
concern,
and
if
it
does
then
call
it
out
in
the
github
issue.
A
I
think
once
you
send
that
allocation,
that's
where
then
it's
more
about
following
up
to
see
how
that
client
is
doing
and
typically
the
touch
point
will
probably
be
when
that
client
comes
back
for
more
data
cap.
So
I
don't
know
if
you
guys
noticed
this,
but
as
per
the
plan
that
we'd
shared
you
know,
I
think,
a
month
ago,
we're
trying
to
add
some
automation
that
can,
on
behalf
of
the
client,
like
request
additional
data
caps.
A
So
when
they
get
down
to
about
25
data
cap
left
it'll
post
a
comment
that
just
says:
hey,
you
know
requesting
the
next,
their
cap
allocation
quantity
and
that's
cool
because
it
reduces
the
amount
of
like
thinking
that
a
client
has
to
do.
It
reduces
the
amount
of
touch
points
that
they
have.
It
improves
their
experience,
but
because
the
flow
still
has
notaries
in
the
loop,
we
don't
really
have
that
much
of
a
cost
of
doing
that.
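The auto-request flow above reduces to a single threshold check: once a client's remaining DataCap drops to roughly 25% of the allocation, the bot posts a comment asking for the next tranche. A minimal sketch (the 25% figure is from the call; the function names and the comment text are illustrative, not the actual bot's):

```python
from typing import Optional

def should_request_more(allocated: int, remaining: int, threshold: float = 0.25) -> bool:
    """True once remaining DataCap is at or below the threshold share of the allocation."""
    if allocated <= 0:
        raise ValueError("allocation must be positive")
    return remaining <= allocated * threshold

def bot_comment(allocated: int, remaining: int, next_amount: str) -> Optional[str]:
    """The comment the bot would post on the GitHub issue, or None if no action is needed."""
    if should_request_more(allocated, remaining):
        return f"Requesting the next DataCap allocation: {next_amount}"
    return None

print(bot_comment(allocated=100, remaining=20, next_amount="500 TiB"))
```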
A
As
long
as
when
that
notification
comes
in
and
that
process
is
kicked
off,
there's
still
some
diligence
happening,
and
so
this
is
where
I
think
it's
our
first
sort
of
actual
tactical
opportunity
to
influence
the
way
in
which
we
do
audit
tracking
and
the
way
we
like
watch
for
fraud
in
the
system.
Whereby
the
I
mean,
I
think,
we've
shown
you
screenshots
of
what
this
thing
is
capable
for.
But
it's
it's
not
that
sophisticated
right.
What
it
was
doing
was
just
here
are
the
stats
of
like
how
the
client
made
deals.
A
Here's
who
signed
the
previous
data
cap
allocation
here's
how
much
of
it
ended
up
with
certain
search
provider
ids,
but
beyond
that
it's
still
pretty
like
primitive,
and
so
it
would
be
cool
to
see
this
work
and
then,
as
a
community
start
to
think
about
how
we
need
to
beef.
That
thing
up
to
like,
say:
hey
like
there's
a
problem
or
there's
no
problem,
so
that
the
notary
load
also
decreases
over
time,
because
all
of
you
also
have
day
jobs,
and
this
should
be
sustainable.
A: You know, we're already looking at, I don't know, something like 600 to 700 clients that have come through the pipeline, and from that, three or four pebibytes of DataCap have gone out. Say that was five pebibytes; with the LDNs it's closer to 50 pebibytes. But we want to multiply that by 10, right, we want to go from 50 to 500, and I don't think it's sustainable to imagine that that means the time that you spend should multiply by 10.
A: So instead, I think it's a combination: the tooling needs to get better, the process needs to get easier, and the automation that exists needs to be more useful. And then maybe notary responsibilities evolve accordingly, or we get more notaries, or we change up the expectations in the flow. But that's the kind of thinking I would like to push each of you on. Cool.
E: Yeah, so first the client sends the application, and then the bot and the governance team, like the notary teams, audit the application. Then the bot will create the issue on GitHub, and then the notaries sign the multisig. So, according to the process, we need two notaries on the multisig. Am I right?
A: Yes, yeah.

E: And after that, for example, once the client seals their data, the bot will continue and send the second allocation to the client, and then we need another two notaries to sign the multisig, and so on, step by step, until the project is finished, right?

A: That's right.

E: Okay, clear. But one more question: will the two notaries be the first two notaries, or will that change later to any notary who has joined the project?
A: So when the bot requests additional DataCap, it also says who signed the previous one, so that in the future the next signatures should come from a different set of notaries, and that generally ensures that additional due diligence is happening. I don't know if that's the right way to encourage that, but it's certainly one idea, and I think that's where we're initially starting.

E: Okay, clear. Thank you.

A: And then the other thing I wanted to call out is this whole part, right.
A
The
first,
the
first
set
where
it's
like.
Oh
the
bot
and
the
team
like
audit.
That
is
not
like
audit
on
the
actual
application.
That's
audited
on
the
completion
of
the
application.
So
that's
is
the
application
filled
out
like?
Is
it
legible?
Is
it
understandable?
Are
there?
Is
there
anything
that
is
missing
from
the
client?
And
so
that's
mostly
because
the
bot
checks
a
bunch
of
fields
to
ensure
that
they've
been
filled
with
some
text
and
then,
like
somebody
from
the
government's
team,
can
look
and
say
that
oh,
it's
actually
a
sentence.
A: Cool, thank you. All right, so these are the open discussions that there are today. There are the several ideas that were thrown out by Meg, which are pretty awesome. I think most of that conversation happened in the second governance call last week, which started two weeks ago, because that's the one that she's able to make, coming from Australia. And then I started this one on how we increase the scope for automated verification, so yeah.
A
If you haven't already watched the recording or read through these discussion topics, I highly suggest that you take a look. What I would like to do at this point, instead of diving into these, is talk about a couple of other ideas I had, and I wanted to spend more time today on open discussion, to see if folks have other ideas, especially around how we level up the system to meet the six-month goals.
A
So I'm going to stop sharing my screen, and it looks like Faye has a question in the chat: "This is my suggestion. I think the 100 terabytes per week allocation could use an increase." What do you mean by this?
B
I think right now the documentation says your weekly allocation is 100 terabytes per week, but when I actually use...
B
I mean, the template says a hundred terabytes, yeah. I thought that was a ceiling.
A
Okay, yeah, I think that's fair: sort of encouraging clients to work harder, stronger, faster on the network. I think that's good, fair enough. Cool. So I wanted to actually jump back to this slide here first, as one of the prompts for conversation, specifically this part: applications get deleted, and I don't really understand why. As I mentioned when we were going through the content, the data is still there. It's still being tracked, and it's available to everybody.
A
For anybody who's subscribed to the thread, it's available; it's also available through the dashboards, of course not in the same robust manner, but it is. And so I did want to ask, if there are any clients in the chat, or any notaries: there's a handful of applications that just get deleted every week. I don't know if this is a GitHub-related issue, where people don't like the tooling or are not comfortable using the tooling, or if it's something else. We would love to get some insights on that.
A
So, even if you look at client onboarding today, if you look at the repo, let's see if I just filter by "granted".
A
Interesting. Maybe I was missing something in the browser, but anyway, I have definitely seen that happen. And then the other indicator of that is probably that, if you just look at the total number of issues that exist, we're looking at 769, but we're at number 970, and these only increment by one, so theoretically that means close to 200 have been removed. And of course, I think a lot of these can be mistakes and duplicates.
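The back-of-the-envelope estimate above can be written down directly. One caveat worth noting: GitHub issue numbers share a counter with pull requests in the same repository, so this only bounds the count:

```python
def estimate_removed_issues(newest_issue_number, visible_issue_count):
    """Issue numbers increment by one per issue, so the gap between the
    newest number and the count of issues still visible is an upper bound
    on how many were deleted (PRs share the counter, so it is a bound,
    not an exact figure)."""
    return newest_issue_number - visible_issue_count

print(estimate_removed_issues(970, 769))  # 201
```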
A
But it's weird that the granted number would not be going up, right? Even if it was 211, we know that new ones have been approved in the last week, so it's still net down. And yeah, like I mentioned, I think it's confusing, because all that info is still tracked, and so I was wondering if anybody has any insights. I know there are certain countries and regions that have restrictions on GitHub usage; I don't know if that is part of it. I don't know if it's because people feel like, oh, once the process is done.
A
They can just delete the issue. But the standard process in GitHub is typically that the issue just gets closed and that's it; we can still leave the issue up. So maybe it's partly an education thing, but yeah, if you have any ideas, let me know on this one, because it's always better to have the audit trail in the open. Of course, we can find different ways to publish it, because all of that is still documented and stored in databases. Cool.
A
I don't know if Andrew is here at the moment; looks like he actually is, which is great. This one is around tracking notary statistics, not necessarily because it's meant to have people compete with each other, but more because it was a really good indicator of reliability, and it also influenced client decisions in the past, where it was like:
A
Which notary is most responsive. And I think that doesn't necessarily create the right incentives, because there are still differences in how we do due diligence across notaries, but I do think it pushes the conversation of how the process can become a little more efficient and effective and also, in general, a little more standardized. So, to the point made earlier: how do we ensure that the quality stays high and the standard is high, so that applications coming into the system from clients are useful and interesting and trustworthy?
A
I think it would be great to see this leveraged a little further and used as insights into our system and our process, to figure out what we're doing. So, in general, there have been some interesting advancements in the DAO world. I don't know how much of this you guys are following, but there is this pretty cool thing. So Gitcoin, I'm sure you guys have heard of, but JB, who many of you know, has been spending a lot of time.
A
He's been thinking about that space, and he shared this website with me, which is pretty cool: stuart.ethe.limo. Let me see if I can pull that up for you. Not necessarily saying that this is what we need, but I do think this is pretty cool. Okay, so the site doesn't seem to be loading for me at the moment, so I'm going to drop the link in the chat, but it's this pretty neat-looking dashboard of sorts, more along the lines of: who are the people that are participating in this now?
A
How do they participate? How do they allocate their time? What does it look like? And then Meg, one of the discussion topics she's been pushing is around notary SLAs: how do we have levels for how much time somebody has to actually commit? What are the expectations that are fair for their commitment, and how much progress can a client expect to make?
A
And so I've been thinking about ways in which we can level up some of what Andrew has already built out, in terms of tracking notary performance and notary statistics, into something that might look a little more interactive and closer to this. So, yeah.
A
If this is something you're interested in either influencing or working on, I would love to hear it. I'll probably share a more concrete proposal in the coming day or so, in GitHub itself, to get your thoughts on it, and then maybe we can put out a grant and find a team that would be interested in working on this.
A
I can't even open it right now, so I think there's just a lot of traffic maybe going to the site, and it's having issues processing that. Yeah, I might have saved a screenshot somewhere when I was thinking about it. Let me see if I can pull it up in some way for you folks.
A
Okay, nice, I think I'm able to do that. So that means, if I share the screen... yeah, let me show you a screenshot. There you go. So this is what it sort of looks like. Obviously it's got a bunch of styling at the top and stuff, but it's basically tiles, where it's like: oh, here's this person, they've been a steward in this DAO since a certain date, this is how engaged they are, how much they vote, what their participation looks like.
A
So I think this is a cool stepping stone, both to forcing the conversation of how we expect the notary role to change, but also as a more interactive and interesting way of exploring, in general, TTD, governance participation, increasing presence, and increasing, I think, participation in general in the system. And then it's about getting clearer on the expectations that people are setting for themselves, so that we can then set them at a community-wide level as well.
C
Yeah, okay. So actually, I think just two weeks ago I had a talk with a project called Skill Worth, which was also invested in by Protocol Labs. They were from Tachyon, the acceleration program, and they're doing some DAO tooling: they're doing some kind of non-transferable NFT, and they can support some on-chain data analytics to help the DAOs make decisions, something like that.
A
Awesome, yeah. Definitely, we're taking a look. Has anybody else heard of this?
A
What do you guys think of that? Is a dashboard-type thing like that worth making?
F
Well, it actually leads me to another question. So, when we applied to be a notary, we applied as Textile, but, just like with many DAOs, that dashboard is individual-driven, yeah.
F
So I wanted to introduce Anya from my team, who's going to help me as a notary. So welcome, Anya. But it actually leads me to the question: how are we thinking about notaries having multiple people?
A
Yeah
as
like
entities
that
are
like
it's
like
at
the
organization
level,
but
then
you
have
more
than
one
person
in
there
right
yeah.
So
I
think,
there's
there's
there's
a
couple
of
ways.
We
can
do
this
so
one
is.
We
do
the
model
that
you've
sort
of
been
following
a
textile
which
is
notaries,
are
actually
multi-sticks
in
and
of
themselves
for
like
an
organization,
and
then
you
have
multiple
signers
on
that
with
a
threshold
of
one.
A
So
you
have
somebody
in
the
organization
that's
worthy
or
is
aware
enough
of
the
process
to
be
able
to
make
that
signature.
There's
a
second
approach
where
you
know
there's
a
single
single
like
it
doesn't
have
to
be
multi-stake.
It
could
just
be
an
f1
address,
but
that's
attributed
to
an
organization
and
then
it's
the
organization's
responsibilities
to
care
about
like
key
sharing,
etc.
A
But
that's
like
the
presence
on
chain
there's
also
a
third
way,
which
is
that,
instead
of
having
like
organizations
b,
notaries,
we
have
individuals
be
noticed,
and
I
think
we've
kind
of
strayed
away
from
that.
Third,
one-
and
I
think
for
good
reasons,
which
is
generally
that,
like
we
like
that,
the
ecosystem
is
pretty
early.
A
We
have
like
a
couple
of
like
very
major
players
from
like
an
organization
standpoint,
either
because
they
have
high
presence
in
the
ecosystem,
from
the
the
work
they're
doing
or
because
of
the
money
they've
invested
or
their
presence
on
the
mining
front,
and
then
general,
like
provisioning
of
data
center
front
and
investing,
and
so
like.
Disproportionate
leverage
provided
to
early
players
is
probably
was
like
something
that
we
thought
would
not
be
like
a
great
outcome,
and
I
think
that's
why.
A
The
original,
like
design
that
zx
proposed
in
february,
is
like,
oh,
like
and
like
an
organization,
should
have
one,
but
then,
based
on
that.
We've
also
had
interesting
conversations
on
like
organizations
and
they're
like
wholly
owned
subsidiaries
applying
to
be
notaries,
and
that's
led
to
some
interesting
edge
cases
as
well.
Right
like
where,
like
theoretically,
the
same
parent
organization,
has
like
three
or
four
notaries
that
are
like
interacting.
But
each
of
those
actually
are
beholden
to
a
different
like
registered
company,
and
each
of
them
will
have
their
own.
F
This is really interesting, because you actually lead to another question I have. But a quick TLDR on that one: I'm going to be onboarding Anya more and more to our thing, so, like, getting her to be the person tagged in our issues, for example, so that she can review applications.
F
Yeah, that will be cool. Anyway, I wanted to just introduce her, so people get used to seeing her. Thanks.
F
And then, you actually lead me to another thing that's on my mind, which is: it's cool that people are thinking about this being a DAO. I'm a big advocate of that, and I know that I've spoken to you and JV about that a few times. But I have a little DAO drum that I like to beat, which basically says: if you look at the way that DAOs operate from the outside, if you're trying to evaluate how buttoned-up a DAO is and how effective they're going to be, a really great tool is to look at how their documentation is.
F
So
if
you
go
my
my
like
case
example
of
that
is,
if
you
go
and
look
at
the
bank
list
dao,
if
you
look
at
their
wiki,
it's
very
easy
to
jump
into
that
and
figure
out
what
you
want
to
figure
out
and
learn
how
that
organization
is
organized
like
who's
who's,
taking
on
what
roles
who's
doing.
What
and
what
you
were
just
saying
about
this,
this
kind
of
three
forks
or
four
forks
in
individuals
versus
organizations-
I'm
not
certain.
F
I'm not certain it's documented anywhere for us. And part of that makes me think that currently Fil+ is a little hierarchical, coming from Protocol Labs. Not in a bad way; you had to bootstrap it. But, like, when you come to these meetings, you have to run them every time, you know? I don't think you necessarily like that either, but there's a little bit of that.
F
I wonder if... so we do this at Textile already. We're not a DAO, but we started to experiment with how we organize our system as a DAO internally, and I wonder if Fil+ could be doing more of that. Possibly it starts with documentation and the adoption of guilds, so that not every Fil+ notary needs to be involved in every decision.
F
But when we decide that we want to figure out, say, your open discussion topic, possibly there's a guild that's responsible for figuring out how we make onboarding simpler and faster, and there's a smaller group that's just going to go work together every week to come up with a plan, then come back to the bigger group to propose it, and everybody kind of signs off or votes on it and makes it happen.
F
Whatever you want, either way. So anyway, I'd love to keep this discussion going if you're interested in it, and I can share some of what we're up to. But just trying to get this moving towards the DAO idea, I think, would be really cool.
A
Yeah, yeah, I think that makes a lot of sense. I think, in general, we're also transitioning some of this stuff. Galen isn't here today, but for the last three months or so he's been running things as more of the DRI for this stuff, because I think it's better for that to lie with the Foundation; I'm just stepping in for today. But everything else you said I completely agree with. I think it's definitely...
A
Yeah, yeah. I definitely think we should continue having that conversation; thanks for sharing. I think it might be worth it if you could share a little more about how you're structuring some of your organization and what we could learn from that and apply here. But also, as was pointed out earlier, it's not like we're going to have our own token or an obvious incentivization mechanism, so we're probably going to need to think a little more about that front as well.
A
So
I
think
those
are
definitely
like
good
conversations
to
keep
going
andrew
if
you'd
be
down
to
share
that
stuff
either
and
I
could
have
discussion
or
maybe
we
can
have.
You
share
that
more
as
like
a
dedicated
topic
of
conversation
at
the
the
next
call
or
something
that'd
be
awesome.
F
Yeah, happy to. I want to check the next call timing-wise, but yeah, happy to. Actually, one thing that really pops into my mind that we might be missing out on is the way the DAOs organize their planning over the medium term.
F
Most DAOs seem to be aligning on the language of seasons. So a season is two or three months long and has KPIs and objectives that you're trying to innovate towards, yeah. And having that kind of alignment as a group, like: hey, what are we focused on for the next three months? And then letting people go off and work on different projects to move that needle might help us. But at the same time, maybe Fil+ is pretty static.
A
Cool. All right, thank you so much; I think we're a couple of minutes over. Thank you all for sticking with it today; appreciate you making the time. Of course, part two of the same agenda and a similar conversation will probably happen several hours from now, and then both of these will be shared online on YouTube. So if you're watching the recording, great; if you're here live, thank you for making it, and hopefully we'll see you at the next one.
A
All right, everybody, welcome to part two of the notary governance call for October 12th; for some of us on this call it's October 13th, which is great. On the agenda today we've got a couple of items.
A
The first thing you'll notice is that I'm the one saying this stuff and not Galen. He wasn't able to make it today, unfortunately, so I'm here instead. Hopefully we'll have him back for the next one, because he's certainly a lot more interesting and makes a more compelling argument than I do for a lot of these things.
A
But here we go. On the metrics front, things have generally flatlined. It's not really surprising; it's kind of what we're seeing because we're aggregating data over such an extensive period of time, and lately client applications have been coming in more through the LDN flow, as opposed to directly to notaries. A lot of the TTD metrics are sort of stable, and so, as you can see, there's no substantive change over the last two weeks between a lot of these time-to-datacap metrics.
A
I did want to call out that we now have a view where we separate the funnel for datacap that's going directly to clients via manual verification from a notary versus to a client via an LDN. That's on the stats page, the dashboard that I typically pull these screenshots from. The salient takeaway here is that, in general, pretty comprehensively, about 30 to 35 percent of the datacap that's been given to clients is being used in deals.
A
We did, I think I called this out last time, but we've crossed one pebibyte of data onboarded onto the network through datacap and through Filecoin Plus. So that's pretty awesome; excited to hit many more milestones of this sort, and at higher magnitudes as well.
A
The
general
amount
of
data
cap
out
there
is
roughly
around
three
plus
pebby
bytes
measuring
that's
a
little
tricky,
as
we've
mentioned
in
the
past,
because
the
the
ldn
flow
now
follows
like
a
different
mechanism
for
how
data
cap
could
be
considered
given
to
a
client,
because
that
number
is
otherwise
like
closer
to
40
45
per
bytes
in
total
meg
is
the
question
in
the
chat:
is
there
a
stat
for
total
active
falcon
clients?
There,
probably
is
it
is
not
on
our
dashboard.
A
The
place
I
typically
look
at
for
stats
is
there's
like
file.app.
Maybe
that
has
something
I
don't
know.
If
you
can
see
my
screen.
D
Yeah, I think, looking at that... it's just a question. We're raising more capital, and it's good to know how many verified Fil+ clients there are. I just wondered if we could size the whole market, and I'm starting a piece with IDC soon, so I just wanted to know if you could direct me.
A
Yeah,
that's
a
good
good
question.
I
think
it's
not
that
difficult
to
figure
out
to
be
honest,
file.app
seems
not
to
be
loading
for
me
at
the
moment.
Oh
there
we
go
clients
there.
You
go.
A
Okay,
okay,
cool
yeah,
so
that's
the
number
that
I
see
bear
in
mind,
though
I
mean,
as
you
continue
to
pitch
this
to
people.
A
I
don't
think
that
the
client
number
itself
is
such
an
excellent
indicator
of
the
usage
of
the
network,
because
I
think
we're
starting
to
see
some
consolidation
around
deal
brokers
and
deal
intermediaries
like
a
lot
of
clients
are
now
coming
in
and
using
the
same,
like
deal
broker
to
do
deals
on
the
network,
so
they'll
use
something
like
estuary
or
like
the
textile
bit
bot,
and
so
the
way
that
shows
up
on
the
chain
may
just
be
as
like.
A
There's an additional several hundred that are making deals that are not. So, if you want a specific number of active deal-making clients, I think this is a better indicator.
A
The rest is indicative of, you know, maybe latent demand that will hopefully soon activate. Yeah, this is storage.com.
A
With
that
continuing
on
with
the
stats
direct
notary
activity,
the
client
onboarding
repo,
as
of
last
night,
I
believe,
has
46
open
issues.
26
new
applications
came
in
in
the
last
week,
roughly
low
30s.
In
the
last
two
weeks,
the
state
grounded
is
202,
so
this
is
actually
wrong.
I
fact
checked
myself
in
the
previous
call
and
it
was
at
211,
but
in
an
effort
to
keep
the
content
the
same
across
the
calls.
A
The
content
and
the
back
and
forth
is
available
both
via
email
to
the
people
who
subscribe
to
the
topic,
but
also
in
general,
like
because
our
dashboards
track
all
the
activity,
there's
no
real
good
incentive
for
a
client
to
like
delete
an
application,
and
so,
if
you
have
any
ideas
on
why
you
think
clients
might
be
deleting
the
applications.
A
I'd love to hear them. We're trying to figure out what the pattern is and what we might be missing in terms of how we're educating clients in the system about their use of GitHub. Two other percentages worth calling out: about 62 percent of applications coming into the repo actually get a response from a notary, and then about 20 out of 73 applications receive datacap.
A
So we're decently selective, like 25 percent. It's good to see that it's not zero, and it's great to see that it's not 100 percent as well, which means some aspect of due diligence is definitely still working, even though we may not have standardized those processes across different notaries. On the right is a table that we haven't looked at in a little bit, but I wanted to start bringing it back. This is the original analysis that Andrew Hill did, where he looked at everybody that's ever been assigned, so some of the handles here are not actually notaries.
A
For example, I am "super mario", and I'm not a notary, but I have closed two issues, which I used when we did the first notary office hours after the elections, to show some of you how to use the tooling with GitHub and the bot. But the rest of it is still pretty useful; it's a good indicator.
A
You
know:
data
cap,
allocating
notaries
in
the
way
that
they
are
providing
data
cap,
how
responsive
they
are,
how
long
it
often
takes
for
them
to
make
a
decision
and
roughly
how
their
decisions
are
made
as
a
percentage.
A
Generally speaking, there are a lot of numbers between 8 and 15, which, as we've talked about in the past, we'd love to motivate and see go down, especially as we talk about the program itself becoming a more mature and approachable client onboarding experience for the users that want to use it, who might be comparing their experiences to the web2 world.
A
This
gives
us
an
opportunity
to
really
level
up,
and
this
is
probably
the
place
where
a
lot
of
their
client
experience
will
be
impacted
and
so
working
on
ways
in
which
we
can
improve
the
time-to-date
cap
metric
is
definitely
a
priority
for
us.
I
think,
on
that
note
pivoting
into
some
of
the
goals
that
we
set
ourselves.
So
we
talked
about
the
six
month
goals
almost
two
governance
calls
ago.
So
that's
four
weeks
ago,
which
truly
means
at
this
point
there
might
be
five
month
goals.
A
We
might
wanna
give
ourselves
like
a
a
date
instead
of
calling
them
the
six
month
bills.
I
was
considering
just
starting
to
call
them
like
q1
or
like
much.
If
that
sounds
good
to
you,
folks
or
if
you
have
any
better
suggestions,
I'd
love
to
hear
them,
but
yeah
focus
area
based
on
a
focus
area
by
focus
daily
basis,
the
first
one
being
volume.
A
You know, we're still in that 40-to-50-pebibyte range. We're expecting that we get a lot more applications in the rest of the year with the new flow, but also, as we sort of mentioned at the last call, we would love to get more conversation, maybe in the later segment of this call, around how we can improve datacap availability and datacap abundance.
A
So Filecoin Gravity is one such program, but I know that several of you are also involved in leading business development for the network and onboarding clients that are bringing on projects at a much larger scale in different regions around the world; so we want to ensure that they have a viable path via Filecoin Plus, and a friction-minimized path via Filecoin Plus. And I know Jonathan has a topic on this one, so I'll ask him to hold that until we get to that section.
A
Next
up
is
speed
or
ttd,
so
ensuring
that,
like
clients,
that
we
do
decide
to
support
via
a
falcon
bus
do
receive
data
cap
like
quickly
so
we've
talked
about
setting
goals
where
in
the
zero
to
one
heavyweight
range,
that's
like
almost
instantaneous
or
at
least
less
than
an
hour
and
then
in
the
100
televised
range.
A
So
it
was
a
good
amount
of
conversation
on
that
last
time,
I'm
happy
to
continue
that
as
well,
but
also
in
general,
like
thinking
of
ways
in
which
tooling
can
be
created
for
you
as
notaries
to
feel
like
one
up
front,
you
know
you're
getting
the
information
that
you
need
to
decide
whether
or
not
a
client
is
legit
or
two
in
your
due
diligence
process.
A
There
are
ways
in
which
we
can
like
augment
or
supplement
the
content
that
you're
using
to
make
a
decision
to
make
that
even
more
efficient
for
a
client
and
then
third
is,
of
course,
this
aspect
that,
in
general,
we're
motivating
notaries
to
be
more
like
guardians
of
the
network
as
opposed
to
gatekeepers
and
with
that
comes
the
watch.
What
happens
once
a
client
on
boards,
as
opposed
to
needing
to
make
all
these
decisions
up
front
and
like
front
loading?
A
The
friction
for
a
client
onboarding
onto
the
network,
the
other
three
areas
that
we've
identified
as
risk
mitigation,
which
is
we
need
to
have
you,
know,
good
audit
tracing
as
a
community,
especially
as
it
pertains
to
the
previous
point.
If
we
get
a
little
faster
and
looser
with
data
cap
than
ensuring
that
we've
got
checks
and
balances
in
places
paramount
ensuring
that
the
system
stays
stable
and
the
network
stays
safe,
so
we're
at
two
dashboards.
A
A bit of a gap here that I wanted to call out is that we don't really have a super structured way of getting feedback from clients today. Most of the feedback that we get is ad hoc, either in Slack, where people will share their thoughts in a channel, or through calls like this, where we often have clients who come and are open to actually conversing with us or sharing the content in chat.
A
So
I
guess
another
prompt
for
the
notaries
is.
You
are
some
of
the
you
know,
stakeholders
in
the
system
that
interact
the
most
with
clients?
If
you
have
ideas
on
how
we
could
do
reach
out
for
the
entire
community
or
lessons
that
we
can
get
from
the
clients
you've
worked
with.
That
would
be
useful
to
share
with
the
community
I'd
love
to
have
that
conversation.
A
One
of
the
things
we've
we've
thought
about
proposing
in
the
past
has
been
like
the
more
traditional
like
ux
researcher,
based
approach,
where
we
could
like
provide
a
grant
to
a
team
who,
like
interviews
a
bunch
of
clients,
watches
them
on
board
onto
the
network
and
like
records
that
and
and
gets
insights
from,
like
a
product
design
perspective
and
a
ux
perspective
on
like
what
their
takeaways
are
or
how
it
feels.
A
Even
or
ways
that
we
you're
already
doing
and
other
notaries
or
other
folks
in
the
system
are
not,
as
aware
of
that
might
be
worth
sharing
and
last
and
certainly
not
least,
is
sort
of
maturing
and
scaling
our
governance
processes,
so
not
only
in
terms
of
taking
in
into
account
opinions,
but
also,
just
in
general,
scaling
to
support
more
decision
makers,
more
more
community
engagement
and
sort
of
pushing
ourselves
in
this,
like
dow,
like
direction
that
we
often
talk
about
we're
at
the
point
where
we're
now
using
github
a
little
bit
better.
A
Yeah, and so I'd love to share some of the takeaways from that conversation, part of which was: I've been thinking about proposing the creation of some kind of notary leaderboard or stats board, similar to what a lot of DAOs do when they look at members that participate. So I'll show you an example of this, and we can chat about it in the open-discussion part of the call.
A
Part
of
the
call
I
did
want
to
call
out
one
awesome
thing
that
galen
added,
which
is
at
this
link
you
can
actually
get
here
by
just
going
to
the
large
data,
sets
repo
and
up
here.
You've
got
projects
if
you
click
into
this
there's
something
called
ldn
tracking
and
this
sort
of
tracks
all
the
incoming
applications
and
where
they
are
so,
if
they're
ready
for
additional
diligence
waiting
for
a
secondary
assign
or
if
the
application
has
been
approved.
It's
like
a
one-spot
view.
A
I
do
think
it
currently
requires
manual
updating,
so
we
might
want
to
come
up
with
a
mechanism
where,
via
labels,
we
can
track
these
things,
I'm
not
sure
how
feasible
that
is,
but
it's
a
nice
snapshot
into
like
where
we
are
with
the
with
like
people
that
are
on
boarding
via
this
path
and
the
progress
that
we're
making
as
a
community.
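A label-driven version of the tracking board might look something like the sketch below. The label and column names here are hypothetical, since the actual board is maintained by hand today:

```python
# Map a GitHub label on an LDN application issue to a tracking-board column.
LABEL_TO_COLUMN = {
    "ready-for-diligence": "Ready for additional diligence",
    "waiting-for-signer": "Waiting for a secondary assignee",
    "granted": "Application approved",
}

def board_column(issue_labels):
    """Derive the board column from an issue's labels, so the board could
    be rebuilt automatically instead of being updated manually."""
    for label in issue_labels:
        if label in LABEL_TO_COLUMN:
            return LABEL_TO_COLUMN[label]
    return "Untriaged"

print(board_column(["client", "granted"]))  # Application approved
```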
A
Thank
you,
galen
shout
out
for
doing
that,
yeah
awesome
and
then
I
think
the
the
last
thing
I
wanted
to
touch
on
briefly
was
on
expectations
around
the
ldn
process.
A
So
all
the
the
I
think
the
people
got
confused
by
this
sentence
up
here,
which
is
like
bought
and
governance
team
audit
for
completion,
but
literally
all
we're
doing
is
we
have
a
bot
that
says
have
all
the
fields
on
the
form
being
filled
out,
and
then
you
know
somebody
takes
it
past
and
says:
oh,
are
there
actual
like
legible
english
responses
to
those
questions,
and
that's
it
there's
no
like
opinionation
happening
other
than
when
a
nurturing
comes
in
and
then
expresses
an
opinion
on
the
legitimacy
of
an
application,
and
so,
as
a
notary.
A
I
think
this
is
like
a
like,
certainly
like
a
like
you're,
even
using
like
this
tracking
thing.
To
look
at
this
column
is
a
good
use
of
your
time.
If
you
have
an
extra
few
minutes
on
a
friday
evening,
I
don't
know
sitting
at
your
favorite
micro,
brewery,
eight
minutes
away
from
house,
or
something
like
that.
You
always
have
the
opportunity
to
pop
in
and
take
a
look
at
client
applications
that
exist
and
seeing
if
any
of
them
are
worthy
of
getting
any
data.
Cap.
D
Just a question, Deep: we have moved to two notaries, right? And you don't really need a lead anymore, because there's just two.
A
So
this
one,
I
think,
is
a
bit
of
an
open
question.
The
original
proposal
that
galen
had
in
that
issue
was
that
we
didn't
actually
need
a
lead,
but
part
of
that
proposal
was
also
that
the
two
would
always
cycle,
so
it
wouldn't
be
that
the
two
sign
every
allocation,
but
that
two
can
only
sign
one
consecutive
allocation
at
a
time
so
suppose,
like
notary,
a
and
notary
b
sign
the
first
allocation
for
client
one,
the
next
time.
A
Client
one
needs
data
cap
like
we
have
a
bot
that
will
say:
hey,
client,
one's
running
out
of
data
cap.
Here's
the
next
allocation,
node
g,
a
and
b,
can't
actually
sign
that,
and
so
the
idea
is
that,
because
we
can
cycle
people,
we
also
cycle
diligence
and
get
an
additional
perspective
from
the
community,
and
that
way
each
consecutive
app
at
least
is
bringing
in
some
fresh
eyes.
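Putting the two pieces together, the low-datacap trigger and the signer rotation, the bot behavior described here might be sketched as below. The threshold, the doubling, and the field names are illustrative assumptions, not the actual bot's parameters:

```python
def next_allocation_request(remaining, last_tranche, previous_signers,
                            threshold=0.25):
    """When a client's remaining datacap falls below a fraction of the
    last tranche, request the next tranche and record which notaries
    are excluded from signing it (those who signed the previous one)."""
    if remaining > threshold * last_tranche:
        return None  # plenty of datacap left; nothing to do yet
    return {"amount": last_tranche * 2,          # illustrative doubling policy
            "cannot_sign": set(previous_signers)}

req = next_allocation_request(remaining=10, last_tranche=100,
                              previous_signers={"notary_a", "notary_b"})
print(req["amount"])  # 200
```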
A
I do think that we may want to still consider having a lead notary, because I do think it had upsides. So, Meg, I was actually thinking that, depending on some of the topics I wanted to propose based on the conversation this morning, plus the direction that we might want to take your notary leveling idea in, we might still want to consider having somebody from a higher tier of engagement, for example, still be responsible for shepherding one of these clients. I'm not entirely sure how that would play out.
A
Cool. Lots of stuff; I sort of talked straight for like 20 minutes, so hopefully a lot of it landed. With that, I wanted to open it up for open discussion. There are a lot of topics that I think we do need to discuss, but I was going to first give the mic to Jonathan, because I know he had something he wanted to share, and then we can take it from there.
G
Thanks very much, Deep. So this kind of question is coming from the Holon storage-provider side of the business, so this is really a question for yourself and Danny. We've started chatting to clients about how we can store their data. Now, the key thing that our clients are looking for is for their data to be more reliable, more redundant, and more robust. Those are the three things that we need to deliver over Amazon for them to take their cold-storage backups and use us.
G
Interestingly, it doesn't need to be that much cheaper than Amazon, because it's robust and redundant, and they get this feeling that, yeah, you know, if Amazon turned them off, then they wouldn't know what to do, right? It's just Amazon storing it. But if it's across four or six providers, they get more reassurance as a business that they'll be able to access their data. Now, getting data into Filecoin,
G
moving it from them to us, is, you know, another problem, but the way I've kind of been looking at this problem is: we'll create these IPFS nodes across the world; we'll probably have four or six. Our providers will upload the data to this IPFS node. Think of this like a monthly backup service. The data that we'll get there will be unencrypted, so Holon, and somebody else or a notary, will check that that data is fine and that there's nothing untoward in that data.
G
We will then encrypt that data and put it into two pieces, and then we will give those two pieces to four or six storage providers across the world.
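A minimal, stdlib-only sketch of the encrypt-then-split step described here. The XOR keystream is a toy stand-in for real encryption (a production pipeline would use an authenticated cipher such as AES-GCM), and the two-piece split is simply cutting the ciphertext in half; none of these details are specified on the call, so treat every name here as hypothetical.

```python
import hashlib


def _keystream(key: bytes, length: int) -> bytes:
    """Toy SHA-256 counter-mode keystream; NOT real-world crypto."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR with the keystream: applying it twice round-trips the data."""
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))


def encrypt_and_split(data: bytes, key: bytes):
    """Encrypt the backup, then cut the ciphertext into two pieces
    that can be handed to different storage providers."""
    ciphertext = xor_cipher(data, key)
    mid = len(ciphertext) // 2
    return ciphertext[:mid], ciphertext[mid:]


def join_and_decrypt(piece_a: bytes, piece_b: bytes, key: bytes) -> bytes:
    """Reassembling both pieces and knowing the key recovers the data."""
    return xor_cipher(piece_a + piece_b, key)


key = b"client-held secret key"
backup = b"monthly client backup contents"
piece_a, piece_b = encrypt_and_split(backup, key)
```

The property this relies on is that the pieces without the key stay opaque to the providers holding them, while the client (or Holon) can always reassemble and decrypt.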
G
Now, that is the simplest process that I can see to sell data storage, and I'm currently looking for partners that are enterprise providers across the world to help facilitate these deals. Now, what I'm really coming here today and asking is: are there any blockers that you guys can see for me, for this process not to happen? You know, encryption, picking my own storage providers, making sure they're enterprise.
A
Yeah, I think that's really interesting. You know, in general, in the community we have this challenge of figuring out how we want to level up and build standards around checking data that will eventually be encrypted. I'd love to hear from the notaries; I know Meg is here, who might be slightly biased, but at least Danny and Bing are here as well, and we'd love to hear what they have to say about that being a viable process.
A
My initial opinion is, I think that's fairly reasonable. The only thing I would ask is: once it's added to IPFS, is it bad that it was added to IPFS? Because if somebody gets a hold of those CIDs, then it is visible to anybody that goes to a gateway and tries to pull that content. Like, what does the risk actually look like? Is that something a client would say yes to? Sorry, can you say that again? Sorry, yeah.
A
So when you upload something to IPFS, you get a CID, right? And technically, nothing on IPFS is actually private if you know what the CID is. So if you uploaded unencrypted data there, yeah, theoretically, that is then releasing it to the public. Agreed that it won't survive long term, and agreed that people probably won't find it and specifically go and chase it down.
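The risk being described follows from content addressing: an IPFS CID is derived from the data itself, so anyone who learns the CID can ask any public gateway for the bytes. A rough stdlib-only illustration of that idea; real CIDs are multihash/CIDv1 encodings of a hash, not the bare SHA-256 hex digests used here.

```python
import hashlib


def content_id(data: bytes) -> str:
    """Toy content identifier: the address is just a hash of the bytes.
    As with a real CID, the identifier carries no access control; any
    node holding the bytes will serve them to whoever presents it."""
    return hashlib.sha256(data).hexdigest()


# Identical bytes always map to the same identifier, regardless of who
# uploaded them or where, so sharing the identifier shares the data.
cid_a = content_id(b"unencrypted client backup")
cid_b = content_id(b"unencrypted client backup")
cid_c = content_id(b"different data")
```

This is why an unencrypted upload is effectively public: deleting your own copy doesn't revoke anyone else's ability to fetch and re-serve the same content under the same identifier.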
A
But it is technically a violation for somebody who might have PII data or private data, and so I just wanted to check that your client would be okay with that. Because it's effectively the same, almost, other than the permanence aspect of it; it's very similar to just making a deal with that data directly, right? Because you have put it on the internet and it's not encrypted.
G
H
So I want to just clarify something. With this, when you're describing this process, are you envisaging it interacting with Filecoin Plus?
G
H
So I think something that might be useful: I don't think (and others can jump in here) that the notaries really have to see the data itself. The thing that we're trying to establish is the bona fides of the person providing the data, and pretty much explicitly we're trying to make sure that one storage provider isn't benefiting by virtue of creating a fake agreement, right, where there isn't any real data at the bottom of it.
H
It's just there purely to benefit the storage provider. So as far as that goes, really the stuff that the notaries do is more due diligence on the organizations themselves. So if you provided us with that information, right, the thing that would speed this up the most is if you, as part of your contract, just wrote up a description of the company and a description of the data, very similar to the data cap requests
H
we have here. Then, and this is probably something we can discuss, whether we should be giving the data cap to you or we should be giving the data cap to them. I think it's probably you, because you'll be the person who's actually arranging the deals, and that's the path that we've taken with other similar projects like Estuary and NFT.Storage. But we don't need to see the data; we don't need to check the data itself.
H
The two things we need to check, I guess, are the bona fides of the person, the customer, and some insight into what the contracts are with multiple storage providers, because one of the clues that we look for is whether there is one deal exclusively going to one provider or a cohort of providers.
G
So: why should I use Filecoin? And I'm like, you know, XYZ, it's decentralized, it's more robust. But then I get to the CTO and they go, yeah, great, but I just do one click and it gets stored in Glacier and it's cheap. So I'm competing with a one-click cheap service. So I need to create a one- or two-click cheap service, but it has to be more robust, it has to be more redundant and it has to be more reliable, right? So yeah, that sounds better to me.
G
So that's one of the key things I think we should allow, and you're right: if we could get the data cap through the Holon storage provider business, that's going to be a lot easier for our clients. And also the clients we're going after, like education companies that are universities, and, you know, media companies, right? They've got lots of data and they're a 100-million-dollar business, right? So it's easy for us to just point at them and go, yeah, there they are.
H
So I would say, I mean, one of the things... and if somebody else wants to speak, please put up your hands and stop me, because I'll talk forever.
H
One of the things: I really do understand what you're saying here, that there's a degree to which price isn't the most compelling part of this alternative. The Filecoin Plus process, for the client right now, its primary incentive is around price, right? But there is this interesting discontinuity, right, where the price for many storage providers right now, if you come through Filecoin Plus, is free, is zero price, and we don't know how long that will continue.
H
It has to do with the crypto economics of the whole system. But one of the things in conversations I've had with large-scale academic or educational providers is that it is not just a difference in kind or degree, in quantity; it's actually a really big difference in the total system, in that, perhaps, they can use the system.
H
Like, if we streamline the process so you don't have to do another click to plead that you are a real organization, you can still store on the network relatively cheaply, but we also have this thing which lets you store
H
a very large amount of data for free. And so those are two different offerings in some ways, right? Like, here's an offering where you can store this data in this robust and reliable way, and here's this offer where you can store this in a reliable, robust way, but also you can decide how much you want to store, right? Like, for 18 months, right, you don't have any limits on that,
H
apart from how quickly you can get the data up. So it might be better, for your considerations, to think about it in terms of quickness: we can get you on the network super quickly, but with this little bit of paperwork, we can even offer this part of it for free. Again, the thing there is, nothing's for free, right (you have to store, you have to distribute, you have to send the data), but it really changes the marginal cost of storing more data.
G
Yeah, yeah, we're never going to provide our clients free data storage, because at some point we have to charge them, and going from zero to one is hard work, whereas going from one to two is easy, right? And also, you know, you start talking about free data storage...
G
One of the questions was: who do I call up when I want my data and something's not working, right? And that has to be Holon, and Holon engineers, right, and that costs me a lot of money, because, you know, manpower is very expensive in Australia. So yeah, we won't sell it for free; we're gonna double down on robustness, reliability and redundancy, and we're just going to do it with enterprise miners, yeah.
G
We just need to make the process as seamless as possible, and that's kind of the first way. You know, I was talking about the encryption because I know about six months ago encryption was a bit of a big question, but we're just happy to open up that data so people can check it if they need to; that is the key here, I think. And yeah, just wanting to get your thoughts on this as well.
G
Okay, is that okay with you, Deep? So what I'll do is, I'm going to go away and I'm going to keep trying to sell data storage as much as possible.
G
I think I'm going to have about three clients by the end of the month, which puts me in a predicament where I need to find my enterprise storage providers, because at the moment I've only got two (well, us and somebody else), so I need two more. Once I've got them, then I'm gonna chat to Meg, and I'll get her support to help me write everything up and then feed it out into the community for the next steps, yeah.
A
So I think that's definitely fine with me; I don't actually really have a say in this, so if notaries like Danny are supporting you, then that's all that actually matters. The only thing that I would say is: it's likely that we should probably push the entire community to have some sort of standard consensus on this topic as well.
A
So you may want to just set expectations for yourself that you may have to pivot based on the needs of specific notaries, as the ecosystem itself is a little young and we haven't yet standardized these processes. But I do think that in the near term, hopefully the next few months, we will have a much more standardized process by which you have consistency in how you need to keep proving the same thing. So that's one flag, and then the second flag is: for your miner needs, that channel
A
that you're asking me about, feel free to abuse that, to be like, hey, I need reliable people here. Because that is where I've added all the people I've worked with in the past, for the exact same reason, and others are doing the same. So feel free to leverage that working group for a base pool of geo-distributed, reliable storage providers across the world.
A
Thank you so much, thanks Jonathan. Cool, so I wanted to pivot into some of the other open discussion topics. One of the ones that I wanted to talk through was: as we start to, you know, move towards DAO-ness and DAO-hood,
A
we also look at aspects of how the community actually interacts and what we get done as a group. And there was actually this Twitter thread from Gitcoin DAO, which is one of the more, you know, popular, prevalent ones, where they talked about how they've been tracking steward participation, and so I want to show you this thing.
A
It's kind of cool. The idea here is basically that everybody that participates in that DAO is on this dashboard, where you can see, you know, who they are, since when they've participated, what that engagement typically looks like, whether they're committed to specific work streams. And this part, I guess, is not as relevant (the voting weight aspect), as we don't have our own currency that we use.
A
But it was interesting to look at this kind of mechanism, whereby I think the first thing it would do is provide a lot of insight into how we're proceeding on some of our metrics, such as time to data cap and things like that, and how individual notaries are performing with regards to that stuff.
A
And the second thing I think it would do is start to provide the specific data that we would need to action some of the proposals, like Meg's idea on having different commitment levels or service levels for notaries. Because once you have that data in front of you and you can observe it, it's also much easier to start to figure out what the natural patterns are, or where the lines are that might need to be drawn.
A
And
I
think
it
also
helps
set
both
the
standard,
but
also
the
expectation
for
people
that
may
want
to
be
notaries
in
the
future.
And
you
know
just
in
general,
leveling
up
the
presence
that
we
have
for
for
the
nurturing
role
in
the
system.
A
Yeah, I think it's not dissimilar to the screenshot that I pulled from Andrew Hill's Textile nerdy-stats repo; it's just putting it in a place that looks a little nicer, maybe, and has some of the stats that we think are important. So I'll probably work on getting an RFP out for that at some point, and seeing if we can get a grant for someone to develop it. But if anybody wants to specifically be involved in that, or be part of influencing what that looks like, please let me know.
A
Definitely all this stuff will happen on GitHub anyway, so you'll be able to track the conversations. Cool, the next thing I wanted to flag today was around the deletion of applications. I mentioned this earlier when we were in the slide deck.
A
All the applications that are getting actual data cap are still tracked, right? Like, all the messages are in an email inbox for the bot, and all the data cap allocations made are trackable through the dashboards. But people still, for some reason, feel like they should delete their application. I'm kind of confused about that.
D
I think it comes back to your question before, around: do we have any insights? But the best insights we'll have will come from asking them, right? Like, why do they think they need to do that? We've been playing around with an interface that is not GitHub that will hook into GitHub. GitHub's a dev tool, right? Unless you're familiar with it, it's not the best UX. So, you know, who knows why they're doing it, but let's ask them.
A
Yeah, that's the right way to go if we don't already know; absolutely agree. What is the tool that you're using? Would love to hear more.
D
Oh, remember, I shared it with you and Galen a while back. It was early stages, but we're trying to onboard some resellers, and they look at GitHub, and they're techy guys, and they're like, oh, you know, there's got to be a better...
D
There's got to be a better way. So I'm just mucking around in a Squarespace site at the moment, which is a proto for: how do we ask them the same questions that allow them to apply for data cap, but then, once they're a client, they move into a logged-in state with a dashboard, and, you know, it could become their portal, right? So that's something that we need to build at some point.
A
Yeah, that would make sense. Do you think that's something that would be applicable to other notaries as well, and something that they could tap into using? Or is that mostly just for the clients that you're specifically working with?
D
Well, we'll work it through; we're happy to share that. And we're running a hackathon in November, right, and so we've decided to split it into two streams, including a design stream, so that we can get some design thinkers to have a look at UX and CX, which is, I think, new. Like, we're working with little guys who are big sponsors of it. So I think it's about the right time to start to test some MVPs.
A
Nice, awesome, makes sense. Okay, switching gears a little bit. Meg, from the conversations we had two weeks ago: first, on your market assessment grant, did that make any progress? I did a little bit of pushing off the call, and it seemed like there were some tactical next steps that we could benefit from, yeah.
D
Yeah, it was awesome. Actually, action happened the very next day, which was great. So the Filecoin Foundation sees that there's a lot of value in doing a co-authored report with IDC that does a really broad survey, which allows us to then focus, in the second round, on some qualitative stuff. So the questions you're asking are all very real, right? Like, if we had a database of people that we could go to, we could start to ask some of these questions.
D
But the first part is that we are going to start this. The outcomes of this report, or this work that we do with IDC, will only be as good as the question set. So my ask to the notary community is: who wants to have a look at what we're crafting with them? And if, for your specific region or vertical, you have something that you want to know, then now is the time, like in the next few weeks, to take that on, so that it can be useful for everyone.
D
So I'm having the kickoff with them tomorrow, okay, and then, yeah, between now and the next governance call, it would be great to do something where we can get the whole community to nominate. Maybe it's a survey, I don't know; maybe we have a look at what we're proposing and then we ask for other ideas.
F
D
A
Cool, and then the last thing I wanted to touch on was around your proposal on service levels, and in general sort of pushing commitments and expectations on notaries. So this morning, when we were talking about DAOs, Andrew Hill had this really interesting idea, which was around how in many DAOs there's this concept of a guild, where a guild, or, like in the dashboard I just showed you,
A
and then that becomes like an active brainstorm community, basically, or the set of people that align on the problems and propose solutions that the rest of the community can then much more quickly action or decide on, instead of always needing every single person to hold the same context and moving with every single notary at the same time. So that was a pretty interesting idea. I don't think we're necessarily ready for SLAs yet, but I view these all as stepping stones towards that.
A
Where, like, we can get a leaderboard up, where we can start to see the natural fits, then we can start to do guilds, or working groups within Filecoin Plus, that are dedicated to, you know, something like a working group dedicated to data cap abundance and ensuring that we can get to that 500, so people feel like
A
we can support that scale. Or a working group dedicated to how we solve the encryption problem, or dedicated to TTD and the tooling that might need to exist to decrease that. And so that could be an interesting and compelling way to go in the immediate term.
A
Even, like, that's not a difficult thing for us to do, and that's something we could do as a structure for how we interact in the next couple of months, even as an experiment. But in general, it's a stepping stone toward being more of a DAO and less of a group of people that are just randomly pinged by clients that want verification.
D
Yeah, look, I think you guys were talking about a survey, right? So asking the notaries what they want and why they're here. Like, notaries in Australia are actually quite a revered role.
D
Right, like, you have a seat at the table and you've got to fulfill that, and if you don't, then your rights are rescinded. So I think it would be good to frame: why did you do it in the first place, and are you still connected to that, and do you have time for it? And if not, then, you know, these dashboards are great for actually saying, well, you know, maybe this is not the right time for you, if you can't contribute to it.
A
Awesome. I think this might be the first, or one of the first (I think we've had this once before, so this might be the second) governance call we actually end on or before time, so that's kind of awesome, yeah. I think, if we're good, then I'm happy to wrap up. Danny, anything else you want to flag, or...
H
A
Cool, and then I know the notary Bing is here as well; all good?
A
I guess I'll take that as a yes. Okay, I'm gonna go ahead and hit the recording stop button shortly here. Thank you all for making it, and for those of you watching the recording, thanks for making the time. Hopefully we'll catch you at the next governance call.