From YouTube: Filecoin Plus - May 25 Notary Governance
Description
Filecoin Plus governance call in which we discuss recent DataCap allocation trends, client applications for Large Dataset Notaries, and the upcoming notary elections.
A
Okay, welcome to the notary governance call for May 25th. As usual, there's lots to talk about, so I'm glad that you all could make it today.
A
We will be focusing a little bit on the Large Dataset Notaries and the set of applications coming in there, as well as updates around the notary elections. There's not that much on the open-issues front; as you can imagine, most of the open issues in the repo right now are just notary applications. Actually, I might swap the order and do the update on the notary elections first, depending on how people are coming in and whether there's stuff you want to touch on first.
A
I also want to call out, for those of you that are here for the first time: thank you for joining us. This is an open-discussion sort of format. If at any point you want to interrupt me, please feel free to unmute yourself or drop something in the chat. The slides are just a guide to ensure that we cover the topics that we want to talk about, so this is interactive; everyone here should feel free to share their thoughts.
A
We'll have a dedicated amount of time at the end, specifically if you want to file new issues or bring up new conversation topics, but let's get through the set that we have up front, and if you'd like to share anything on those, feel free. Sounds good?
A
Let's do it. Okay, so DataCap movement in the last month: some of you have seen that I've been tracking this over the last couple of calls.
A
We had no allocations in the GitHub repo in the last week, and 14 in the week before that, which is a total of 14 applications in the two weeks since the last governance call. Interestingly, that was actually quite a bit: usually we're at around 40 to 50 PiB per week, and this was 110 PiB over two weeks. So our average rate is about the same, but there were no applications that were reviewed this week.
A
My guess is this is because of the notary elections, and existing notaries were just focused on writing their applications and such. We should chat about that, but we can do that a little bit later. There were five new applications that came in this week, and then I went through and trimmed some applications for notaries that are no longer doing allocations, so the number of open applications at the moment is now 17, which is quite good.
A
It's a lot lower than it used to be; we were typically floating in the 30s, so cleaning up that repo is good, because it shows that there's a lot of activity. In terms of where we are with the total amount of DataCap, we're up to about 700 PiB that has now been granted, which is great: lots of good progress on this front. I think last time we checked it was in the 500s, and again it's been going up in 50 to 100 PiB increments.
A
So this is great. In terms of what's being used in deals, that number is around 215 PiB; that's up a tiny bit, like 10 PiB. So that may also be a conversation worth having at some point: how do we help activate the clients that are receiving DataCap, and what can we do to support them?
A
The average time to DataCap right now, which is measured as a combination of how long the GitHub issue is open plus, for the verifier web app, a flat 30 seconds, because that's roughly how long until the next epoch would come in, is down three hours since our last conversation. So some progress is being made, which is great. Any questions on this one?
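The metric described above can be sketched as a small calculation. This is a minimal illustration, not the actual dashboard code; the sample timestamps are invented, and the only details taken from the call are the two components: time the GitHub issue stays open, plus a flat 30 seconds for the on-chain grant to land.

```python
from datetime import datetime, timedelta

# Hypothetical issue records: (opened, closed) timestamps for DataCap
# application issues. Values are purely illustrative.
issues = [
    (datetime(2021, 5, 1, 9, 0), datetime(2021, 5, 3, 9, 0)),   # open for 48h
    (datetime(2021, 5, 2, 12, 0), datetime(2021, 5, 2, 18, 0)),  # open for 6h
]

# Per the call: add a flat 30 seconds for the grant to land on chain,
# roughly one epoch after the notary signs.
EPOCH_DELAY = timedelta(seconds=30)

def avg_time_to_datacap(records):
    """Average (issue-open duration + epoch delay) across applications."""
    durations = [(closed - opened) + EPOCH_DELAY for opened, closed in records]
    return sum(durations, timedelta()) / len(durations)

print(avg_time_to_datacap(issues))  # mean over the two sample issues
```

The point of the flat 30-second term is that once a notary signs, the grant appears at the next epoch, so the GitHub review time dominates the metric.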
A
Let's keep going. Grabbing the stats that are published thanks to Textile and Andrew Hill, who should be here: I think we have seen a relatively good trending direction of things in general, ever since this was published.
A
Things have been improving in terms of the speed at which we tend to react and do allocations, which is awesome. Of course the data is changing, but in the last week or so it hasn't changed as much, just because we didn't really have any allocations go through. If anybody here did do allocations, not in the last 10 hours but within the last week, and I missed it, please let me know, because that means my method of checking is off.
A
We would love to hear if anybody can share whether this is due to the competing interest of wanting to spend time on your notary application for the next round of elections, or whether folks just haven't had time and this is a one-off thing, because I don't think we have had a week without an allocation happening in a pretty long time. So I'm just curious, and I hope there's nothing technical blocking as well that we can help sort out.
B
Sorry, can I just say something on that?

A
Yeah, absolutely.

B
So I'm talking to a few clients that could be interested in DataCap. They said they are busy because of Slingshot, so they have no time to apply.
A
Yeah, that's a really good point too; I forgot about that. That makes sense, because all of them had their Slingshot rewards submissions last week, and so they were all putting together their materials on deals and things like that. Okay, that makes sense. So probably that, in combination with notaries spending time writing up and editing their current applications, is what did it.
A
Cool, I think it's safe to move on from this. It was just worth calling out, because I was wondering if it is because of the elections; if that's the case, then we should find a way to make that a little bit more sustainable too. It is important that there's continued traction and liquidity of DataCap, and there are things we can do to make this process better.
A
Some of you are quite aware of this, because you've been participating in calls or discussing it actively with us, or you've seen issue 94, which is about onboarding projects with large DataCap requirements. But for those of you that are new to it, I'm just going to spend a quick minute talking about the latest updates on this, and then we can talk through some of the applications that have come in.
A
Because these are pretty massive allocations, sorry, between 500 TiB and 5 PiB, the way in which that would happen is by a multisig being created specifically for that particular data set or that particular project. Each project writes an application in this repo, which is the Filecoin Plus large-datasets repo, and then there's due diligence.
A
That's group due diligence, which is meant for both the notaries who want to get more information to build their confidence in signing onto this multisig, as well as community members that might have questions or want to ensure that the program is actually delivering the value that it should: they can raise questions, ask questions, or flag the client.
A
But that enables the ability to go and get DataCap at a faster rate and at a much higher scale, to unblock projects that we, as a community, believe are worth supporting. Right now we've got four applications open but no traction on them, and so the first item is a sort of call for engagement.
A
I have a slide for each of these, where I just pulled out one or two of the key questions and some background, and then we can, as a group, talk about the key questions that we want to ask them, and then post a message in the issue saying: hey, we need this information as a community.
A
Before we can proceed from here, I also dropped a quick note on the applications themselves, saying that this governance call is happening at this time, so perhaps some of the people that have applied could also be here and live-review some of the content, which would be fantastic. I'm not sure to what extent we'll get that attendance, but in the future I think that might be a good model, depending on how today's call goes, since this is the first time that we're doing this together as a group.
A
Yeah, it would be similar to that. So basically, we have a version of the Filecoin Plus bot, built out by one of the teams that's helped with supporting some of the Filecoin Plus work in the ecosystem in general. They took the thing that exists for notaries today: today, when somebody posts a message that says, hey, give this person notary status, which is something that any of the governance folks like me have done in the past,
A
that goes, on the back end, to the set of root key holders that are signed on to the root key holder address for Filecoin Plus. Effectively, we need to hit a threshold of signatures, so they're just using their Ledger to sign that message, and the moment half the number of root key holders have signed the message, it's considered signed on chain.
A
So you'll see the proposed message from the bot, sorry, the proposed message from the bot to the set of root key holders. The first root key holder drops the proposed message on chain, and then it goes approve, approve, approve, and then it's done. This would work similarly, where for each of these, once it is set up, we go through the root key holder flow, and the notary would be established through the root key holders, because that would be a dedicated amount of DataCap.
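The propose-then-approve flow described above can be sketched as a threshold check. This is a minimal illustration only; the real logic lives in the Filecoin multisig actor on chain, and all names here (holder IDs, the message text) are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Proposal:
    """A pending on-chain message with the holders who have signed it."""
    message: str
    approvals: set = field(default_factory=set)

class RootKeyMultisig:
    def __init__(self, holders, threshold):
        self.holders = set(holders)
        self.threshold = threshold   # e.g. half the root key holders
        self.proposal = None

    def propose(self, holder, message):
        # The first root key holder drops the proposed message on chain;
        # proposing counts as the first approval.
        assert holder in self.holders
        self.proposal = Proposal(message, {holder})

    def approve(self, holder):
        # Subsequent holders approve the already-proposed message.
        assert holder in self.holders and self.proposal is not None
        self.proposal.approvals.add(holder)

    @property
    def executed(self):
        # Once the approval count reaches the threshold, the message
        # is considered signed on chain.
        return (self.proposal is not None
                and len(self.proposal.approvals) >= self.threshold)

# Illustrative run: four root key holders, threshold of two signatures.
msig = RootKeyMultisig(["rkh1", "rkh2", "rkh3", "rkh4"], threshold=2)
msig.propose("rkh1", "grant notary status")   # hypothetical message
print(msig.executed)   # one signature, still below threshold
msig.approve("rkh3")
print(msig.executed)   # threshold reached, considered signed
```

The same pattern would apply to the per-dataset multisigs: the notaries on that multisig play the signer role, and the threshold is set when the multisig is created.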
A
Cool, any other questions? So this is a scoped experiment, like what we talked about two weeks ago: we're going to do this for up to 50 PiB to see how it goes, what the experience is like, and what we can do to improve the process. So please feel free to suggest changes and feedback. We already made some modifications to the application, thanks to your recommendation, based on the Finbushi sort of application for clients, so that was great.
D
Yeah, I have a question about the Slingshot data sets. Sorry, I missed the last notary meetings, so I don't know what's happening, but I remembered that we were talking about Slingshot data not being qualified for verified deals. Has that rule changed now?
A
Actually, it's kind of been decided on a notary-by-notary basis, and so from what I'm seeing, some notaries are granting DataCap for Slingshot and some are not. I think, when we discussed this early on, there was a recommendation that you don't need to, because it's already incentivized data storage: Slingshot participants are receiving incentives to participate in the network.
A
But if you as a notary believe that it is a good use of the Filecoin network and want to support bringing it onto the network, then notaries do have the free rein to make that decision. So, Charles, we need to have that conversation. I don't know how we feel about doing it at scale; I know individual notaries have been supporting Slingshot data, but many have not, so I'd be curious to have that conversation once we look at the 3,000 Rice Genomes application as well.
D
Yeah, because the question now is that the data set is assigned to one address. So, for example, the person who submitted a request for Slingshot 2.4: he will be granted the DataCap for his address, so then he will have the right to send out all that data. But this data set actually is not owned by anybody; it's a public data set. So is it reasonable to assign it to this person, or should it be assigned to a group, a person, or a company?
A
Yeah, that makes sense. I think the thing is, a lot of the data sets that we want to support through this are going to be public and open data sets. But yes, it is true that a lot of them, like the projects in at least the other three applications, are much closer to the ownership, or are a bigger group that is well funded to pull off ensuring that the data gets onboarded.
A
I think this is definitely up to us to decide as a group. In general, the guiding principle should be the same, right, which is: let's onboard data that we think makes use of Filecoin and through which Filecoin is able to give value to the community. Beyond that, the only scope that we set for this is that we want to ensure that the data is actually public and open, which means anybody can retrieve it without any specific permissions and it's accessible to everybody on the internet. And so, within those bands,
A
It's
kind
of
up
to
us
to
discuss
what
we
believe
the
right
limiting
factors
are
but
yeah,
let's,
let's
definitely
save
some
time
to
have
that
conversation.
When
we
look
at
the
the
the
slingshot
based
application,
the
third
one
thanks
charles
any
other
thoughts,
comments
or
questions
on
the
overall
setup
before
we
dive
into
some
of
the
applications.
A
Cool, let's do the first one. I think JB actually popped into the call, which would be awesome: he could just introduce himself and talk us through his use case. I'm going to give him a second to see if he's here.
E
Okay, yeah, so Filecoin Discover. Man, it's been a long time coming; this is something that we actually announced before mainnet launch. This is a project that we started basically to store a bunch of large open-source data sets to help seed the network, and then hopefully provide an on-ramp for folks who are trying to develop things on top. We've gotten data sets; some of them were just open-source ones that we prioritized and grabbed.
E
Some of them came from partners: the Internet Archive shared a couple of their collections with us. Really this is both an experiment, where there were a lot of challenges in figuring out how to store some of these data sets in a way that makes them indexable and then retrievable, and some of it was to sort out things that needed to be developed inside of Lotus as well.
E
So yeah, where we are today: there are still some issues on the Lotus side, which are getting prioritized and hopefully will be landing in the next couple of weeks, but we're basically getting close to actually being able to start the onboarding process. Five PiB is probably not the totality of what we'll need to onboard the data, so we'll probably be back for more, but I think, given the process, it's the right amount to start with, and we have it in the application.
E
We talk a little bit more about how we're planning on doing our allocation. I think this is going to be kind of different from how most folks will be doing things. The way that the program was structured: Protocol Labs acquired a bunch of data sets and did a bunch of pre-processing to basically get the data sets into one-gigabyte file formats, and then those were packaged up onto eight-terabyte drives.
E
Each drive contains about seven thousand one-gigabyte files, maybe a little bit more than that depending on the capacity of the drive. Then we set up a store at store.filecoin.discover.com where folks could basically go and choose to participate in the network.
E
So people would buy a drive, which is literally just a hard drive for the transport of the data, so it's just the hardware cost, and we would ship the drive to wherever they are. We would do some work to basically make sure that the right data sets are going out: people can store a specific category of data, like space data or science data, and then we allocate the data sets so we can get coverage across different data sets.
E
Our plan is to basically rate-limit folks so that we can check a couple of things as we watch them onboard their drives. The plan is to have an onboarding bot that will test to make sure that miners are actually serving up the data, so it's not just them storing it and then not making it accessible. And then, as we're doing the onboarding, we're also going to be monitoring their increase in power.
E
So if we're concerned that someone is trying to over-store, or has acquired too many copies of the data, we're at least able to identify that and slow the onboarding, so that someone doesn't get too much power. Yeah, maybe I'll pause there. I don't know if, Julian, your hand is for me?
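The rate-limiting idea in the last two turns can be sketched as a simple share check: pause further deals for any miner whose share of the program's verified data grows past a cap. This is purely illustrative; the 10% cap, the miner names, and the sizes are all assumptions, not values from the call.

```python
def allowed_next_deal(onboarded_by_miner, miner, cap_fraction=0.10):
    """Return True if `miner` may receive another deal.

    `onboarded_by_miner` maps miner IDs to verified data onboarded so far.
    A miner at or above `cap_fraction` of the program total is paused,
    which is the "slow the onboarding" behavior described above.
    """
    total = sum(onboarded_by_miner.values())
    if total == 0:
        return True  # nothing onboarded yet, nobody is over-concentrated
    share = onboarded_by_miner.get(miner, 0) / total
    return share < cap_fraction

# Illustrative program state (sizes in TiB, names hypothetical).
onboarded = {"minerA": 50, "minerB": 30, "minerC": 920}
print(allowed_next_deal(onboarded, "minerA"))  # small share: allowed
print(allowed_next_deal(onboarded, "minerC"))  # dominant share: paused
```

A real implementation would combine this with the retrieval-test results from the onboarding bot, so a miner's allowance depends on both concentration and whether it actually serves the data.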
E
Yeah, one of the challenges is obviously that per geography there are some copyright rules, and so we did a lot of work to double-check, based on where people had the drives shipped to. For example, Germany doesn't allow Project Gutenberg to be stored, so we tried to bias towards caution.
E
Yeah, so we have one that's in the works; we have basically paused it for right now until we actually have the deals live, but the goal is that you should be able to explore the data sets that are being stored, and hopefully be able to pull back specific miner information too. It will also have, for anyone who wants to download the data themselves, the instructions they would need to run to grab that data.
A
So, JB, I think, okay, you got a plus-one in the chat. I have a question around your retrieval-checking plan: is the expectation that you'll be testing retrievals along the entire duration of the deal, or is it more at the front, to ensure that the rate at which you're giving deals to these miners is in line with the likelihood that they're going to serve their retrievals correctly? Just trying to understand how that plays a role in the long term.
E
Yeah, our plan is to do this over time, and I think, especially for miners who intend to get renewals or anything like that, that would be part of the factor. Of course, there are other reputation systems being built, and so it's also an opportunity for us to help provide information to those systems. So yeah, our plan is to do that sort of testing, but most of it, hopefully, is just going to be automated.
A
Any other thoughts or questions, especially from notaries? Because, obviously, in order for something like this to go through, seven of you have to take on the desire to want to support it. So I'm curious whether this is the kind of information you're looking for, or if there are additional questions you'd like to ask.
B
Do we plan to have another batch of Filecoin Discover later on?
E
So I know there are a lot of efforts to do large-scale data storage. I don't know if we'll do it specifically through Filecoin Discover; I think there may be more effective ways, especially given some of the logistical challenges with trying to get physical drives to folks, so we may be rethinking components of it. I think that's a little bit TBD. There is definitely a push, though, to get more large-scale data sets onto the network; it just may not be in this exact format.
A
Cool, so the next steps for this are: notaries should go to the actual link on the issue and, if you're willing to serve on this particular application and be a signer on the multisig for it, it would be great if you could just drop a message and say that you're willing to do that.
A
We also need somebody to pick themselves as the DRI for this, so if you're willing to be the notary that is the main sort of captain of the notaries for it, that would also be very helpful. The responsibilities are just slightly more than those of the other notaries: you want to ensure that the client is actually meeting the thresholds of usage of DataCap before applying for more.
A
If there are no questions, then the big ask is for notaries to go see if this is something you want to support, or to go take a look at the application if you think of other questions. Especially if you're watching the recording, check out the link, see if any conversation has already happened, or feel free to drop any comments with thoughts that you might have. Thanks, Jonathan.
A
Cool, next up is Estuary. So Estuary is being shipped by the Application Research Group; the website is estuary.tech. They're also requesting five PiB of DataCap. I skimmed through the application; from my understanding, it's basically users with big data that they want to put onto the network through Estuary.
A
A copy of that data is also made available through IPFS, so it is always open and retrievable; or sorry, that is their way of saying that it's open data that anybody can also access. And then deals are made with at least six miners per piece of data that needs to be onboarded onto the network.
A
So this seems to be closer to something like an Imgur-style app, or some sort of data-uploading service, where anybody can go to the site, become a user and, if they're comfortable with their data being open, just upload it. One example of data that they provided was Wikibooks, but it seems like there might be a lot of flexibility in what they're supporting. So I'd be interested to hear: I do think it falls fairly within what we consider useful bands, and if they're putting a copy in IPFS then for sure it's part of their principles that it's open. But at the same time, it's not a single data set, or a set of data sets that we know about.
A
It's a great question. Yeah, I don't know; we should ask if they're doing a check on that.
G
I was going to say, questions around illegal data aside, I really like this. I really like the ease, for clients and people who just want to store things, of being able to come and put it up and have open, public data be easily uploadable, with the incentive of it being verified.
A
I would imagine that if they're uploading a public version to IPFS as well and providing CIDs, then they're doing some kind of checking. But there's that, and then there's also ensuring that they don't violate the terms of service for a miner: you don't want to accidentally have a situation where the miner is unable to accept data because of their geopolitical boundaries or regulation. So yeah, I think it's a really good point.
C
Yeah, especially when you upload some data, for example porn: this kind of data might be legal in some regions, but it's definitely illegal in China. What if a miner in China accepts this DataCap and stores the data? Their miners will definitely be monitored by the government, and something bad is going to happen.
A
Okay, so it seems like that's a pretty open question, and so, if you want to post that in the application yourself, you totally can; if not, I can do it as "here's what we discussed in today's call", and then we can see what they come back with and continue the conversation in the issue. That sounds good.
B
We were talking about this with Jonathan, you said?
G
Exactly, yeah, okay, cool. Yeah, I was talking to the Murmuration team recently, and I'm not sure where they're at with their build on that, but that's definitely what their roadmap is: to be able to filter content and have that shared amongst the miners. So we can check to see where they're at with that and where it is on their roadmap. Maybe this would wait until later, if those features aren't there yet.
B
Okay, again, yeah, so Julian speaking: I'm also developing a solution for this content filtering, so I'm happy to be in the loop on this, so we go in a common direction on that. Great, there is a lot of effort to build this type of service, so I think it's good if we can share our thoughts on which direction we move and how we activate this filtering.
A
All right, moving on to the 3,000 Rice Genomes project. Actually, I don't know if Feiyan is in the call; I do not see them. Cool, so I'm going to just talk through this one like we did for the prior one. The application came in for 500 TiB.
A
The description of the data itself is that it's a rice genome sequencing project, where they're obviously trying to get the genome of different rice varieties. The allocation plan is that they have a set of small miners that are all below 500 TiB, as well as people that participate in the Miner X fellowship, and they plan to evenly distribute data, and therefore DataCap, across that set of miners.
A
But the interesting part of the conversation here is Slingshot: do we actually want to support Slingshot at scale through this process? So yeah, Charles, do you want to kick off that conversation? Then we can hear a couple of opinions.
D
Yes, actually, I was surprised by this, because you need to be a Slingshot participant to be familiar with it. I am myself also a Slingshot participant; I participated in the last Slingshot 2.3, where we were using Common Crawl. Now that was banned in the next Slingshot round, but actually this project, the genome project, was also on our target list in the beginning.
D
It's high value, and the interesting part is that it can be another source to support verified data storage. Also, because this data set was studied from Slingshot, we know how to split the data into different chunks, or merge it into a big data set, to fit the 32 or 64 gigabyte deals. And the person who's in charge of it has a lot of experience sending online and offline data.
D
So from a technical point of view, I don't think he has any problem. The only question now is from the legal side: because those are all open-source data sets, it doesn't have any legal issue. So from that point of view, it is a perfect data set on the legal side. The only question is whether it conflicts with the goal of Slingshot if this one is moved to the verified data sets for this purpose.
D
Should we remove it from the data sets of Slingshot? This is the only concern I have: should it be part of Slingshot?
A
Yeah, so just to echo back: is that because of the nature of Slingshot, where it already provides an incentive to onboard data? Okay, so for those of you that don't have context, Slingshot is a competition that is still ongoing in a new phase, phase 2.4, where clients are given rewards basically to onboard open-source data onto the network, so they compete to bring on as much data as possible of a particular type.
A
One of the data sets that is eligible for that is this data set, and so the intent here is effectively to facilitate deal-making for the client as easily as possible. But Slingshot in and of itself is also a program that rewards the clients that have competed in the competition, and so the expectation there is, I believe, that they can offset some of the costs required to make deals by receiving rewards.
A
The rewards can help pay for the deals that they're making, and so the conflict of interest here is: if we were to give DataCap to one Slingshot project, for example, deal-making for them might become a little bit easier, they would be able to onboard more data, and then they're likely to win a disproportionate amount of the Slingshot rewards. Is that it?
D
Well, the problem is that, if you make this data set verified, the person who is using this data set with a verified address already has far more advantages than other Slingshot participants. How can we deal with this issue? Another one is that, if we give everybody who is using this data set the verified priority, then it will become useless in 2.5; yeah, in 2.5 we are going to ban this data set.
D
So with each round you will ban one more data set. And there's another reason: this is a big data set, so even if you don't give it verified status, people are still going to use it, because this is the next biggest data set after Common Crawl. So people will do it anyway, even if you don't give them the verified deals.
A
That makes sense, yeah. I think the only benefit that I can see to giving DataCap to Slingshot at scale is that, in this particular case, there's a very clear allocation plan where multiple miners are being pulled into the loop, and we get to test onboarding legitimate data and deal-making at scale with a lot of miners.
A
Whereas a lot of the Slingshot participants today tend to have a very limited set of miners, often within their operational sort of world, so that they have some amount of control. Not all do, but some definitely do, and this sort of pushes diversification of the deals. But I don't think that's the principle of Fil+, necessarily; it's more about ensuring the onboarding of useful data.
A
I sort of agree with you, Charles. I think Slingshot is an incentivized program specifically, and so further or disproportionately rewarding one team or one project that's onboarding the same data as what other projects could also be onboarding doesn't seem the most defensible. But that opinion is not super strongly held; I'd love to hear other people's opinions as well, and Charles, if there's anything you want to add, feel free to continue chatting. We have a few more minutes we can spend on this for sure.
D
Just to give my suggestion about this one: if we want to add this data set to the large-dataset process, I suggest moving it out of Slingshot, so it can be cleanly handled as a large data application and there's no conflict. That's my suggestion.
C
Yeah, so to me, I think at least for this kind of situation, for the allocation plan, we have to make sure that this kind of data is transferred to these kinds of small miners and the Miner X fellows, and we should make sure there's no exchange between the clients and the miners.
C
I mean money, you know, to buy the DataCap; at least we should make sure of that. For the other deals, based on our previous discussion, we just ignore this kind of situation, because we think it will just happen, but for this case we should make sure it won't.
A
That makes sense. I think that becomes even more difficult in this case, but yes, I actually think that's a very appropriate point to bring up for all of these applications: we should ensure that there's a clear follow-through on the allocation plan, and that it's rational, reasonable and defensible.
H
Well, I guess, if Feiyan is sending it out for the benefit of others, incurring bandwidth costs and also taking risk in Slingshot, because not everything is retrievable, it should be okay for them to get rewards from both.
E
I would throw out that you can think of either of the incentives as being the de-risking force. So Slingshot generally has a de-risking mechanism, which is these incentives that already exist, and DataCap in general is a de-risking thing too. So is there additional risk here? Is it the size or the scale that somehow warrants it? If this wasn't a large DataCap allocation, would we normally give DataCap to that sort of thing? I think that's another question.
A
This question was raised at the start of the call as well, I think by JV: is this something that's acceptable per se? It is to some notaries, and so some notaries have been giving DataCap to Slingshot projects, but I don't think it's necessarily consistent.
C
Because, based on my understanding, so far a Slingshot project would probably get the benefit from that program itself. But on the Fil+ side, if they are giving out their DataCap and letting other miners store the data, they don't directly benefit from these projects, I mean from the FIL side; the major group of people that benefits from this application is these small miners, right?
A
Yeah, but then we should be consistent across every Slingshot project; I think that's what Charles is saying, right? Which is: how do we differentiate? Or is what you're saying that, in the case where they're not storing the data with themselves, they are more eligible than not?
D
So that's another policy change: all Slingshot data could be granted verified deals, even by those that are not in the eligible set storing the small dataset. Then it becomes another scenario.
D
Then another question comes out: how do you define self-dealing? We go back to the previous discussion in 2.2. Say we have lots of miners, but I'm just a hosting company; they host their servers on my nodes, so I store it on their nodes. It shouldn't be considered as storing for myself, so I'm not storing for myself, so I'm eligible. Then nobody is really doing self-dealing, because we're all storing for others: I'm a hosting company.
A
Yeah, interesting. What are your thoughts on that, those of you here? If you have any thoughts, we'd love to hear them.
H
Andrew, well, yeah, it's more like this: if you store the data for Slingshot on your own miner, then you're riskless. If you store it on somebody else's miner, then you risk the retrieval.
D
Yeah, it's just my opinion. I know that's why in 2.3 the Slingshot rules removed the self-dealing clause and say that self-dealing is okay: because we cannot solve this issue of discriminating who is really the owner of a miner. We cannot solve this issue. For example, me: I have eight miners under my control, but they belong to different companies. So from the legal point of view, I don't own any of them; at least, they belong to different entities.
H
Right, you can prevent it by posting up front the miners where you're going to store your verified data. If you put up front the miner numbers where you're going to store it, like miner X, miner Y or whatever, then a notary can decide if this DataCap will be valid or not. If you put it on unidentified miners, then it's something quite different.
H
Why don't we give it to some new miners? No? Well, only known miners? Well, you ask for DataCap to make verified deals, so the notary can ask you where you're going to store it, and if it's verified miners, and it's not your miner, and the notary can verify that, then the DataCap will be granted. Otherwise it won't.
H
Well, that's not in line with, how do you say it, the contract you made, though.
H
No, no, I'm saying that the notary can ask, and you should give up front a list of the miners where you're going to store your data. That prevents a lot of fraud and other things.
A
But that's only because it takes away the problem of knowing who the miner actually is, correct?
A
Yes, okay. So the only fear I have with that, as somebody that's involved in the MinerX program, is that I also don't want to over-index on it too much. Well, I guess the original cohort is a fair one: if you look at the way the program is trending now, it's much more focused on testing and deal-making for specific sets of clients. So maybe the original set of fellows, who were very engaged in the community, maybe that is a good indicator of miners that are verified. But then, effectively, you're trusting me, or you're trusting PL as the entity that verified those miners, and that's an okay way to go. But that doesn't mean self-dealing may not happen, right? Because Fiona is a MinerX fellow, and so they could also theoretically make, you know, five new miners that we don't know about, necessarily. But your point is that those weren't on the original list.
A
Yeah, I don't know. I think we need to continue this discussion. So, Charles, if you don't mind, maybe we can start a thread in the Fil+ channel and share some thoughts there, or I can start that off; I know that you have to leave for a meeting, and then we can continue there. But I think, at a more meta level, this is a conversation we should have about to what extent we want to give DataCap to Slingshot participants. Andrew also made the ask that the Slingshot organizers should propose an opinion here, so I can also go push some of that thinking and come back, if there's something useful to be added from that angle. Cool. The last application is Taupipy, which is a photography site. Actually, I don't know if the person that applied for this is here; just checking.
A
If not, the brief that I understood from it is this: the site is taupipy.com, they're requesting 1.5 PiB of DataCap, and they're basically planning to take photography that is already publicly available and archive it to the Filecoin network. So, specifically, their focus is on storage for archival.
A
So the questions are around retrievability: there's probably not that much retrieval, but they want to have a backup on the network. And they're getting the explicit permission of the photographers that, you know, have taken wedding photos and various celebration photos and things like that to upload them. I just saw somebody raise a hand; Eric, do you want to share any info on this?
A
Oh, it's not working. Okay, yeah, we can't hear you. Cool. So Charles has flagged that we should check on the legal front for this, because client photos are sensitive information. Yeah, I think they're getting permission, which is what they claimed, from both sides of the marketplace, to ensure that the photos can be uploaded, but it is probably worth double-checking on this.
F
Okay, sorry. I would like to give a little bit of explanation about Taupipy. Taupipy would like to upload their pictures and videos to the Filecoin network. Before this, they signed an agreement with each of those wedding clients, I mean the people whose photos were taken by Taupipy. So from the customer side, they are allowed to let Taupipy use their data and upload it to Filecoin.
A
If we can handle that kind of situation, that'd be great. And please feel free to, you know, post updates in the issue itself and we can check there. But yeah, from what I've heard, it's a great use case, and if it is something we can indeed support on Filecoin, I think it would make sense to find a way to do that.
F
Yes, I would like to explain a little bit about this, because Taupipy would like to upload their data through our database platform. What we will do is help our customers, like Taupipy, upload their data; Taupipy never uploads the data themselves. All the data is stored on the Filecoin network, and they can find which customer would like to delete their files, and then Taupipy will download them and delete all of those files themselves.
A
Thanks, thanks for sharing that. In the interest of time, let's continue that conversation in the issue, but it's good that we probably have some mechanism for this; that would be great to see. Can I request you to follow up, as you suggested, and post an update on the issue, and we can continue the discussion there?
C
So, but do we want them to store their data just in China, or, you know, all over the world? That's the question I want to ask you guys.
A
Just a quick flag on notary applications; sorry, I should have mentioned this. We're sort of past the clock and I don't want to keep people too late, but all right, just to wrap up: a quick update on the notary elections front. The applications will close tonight at midnight Pacific time, which is in roughly 12 or 13 hours. So yeah, if you haven't applied already and you're interested, or you're an existing notary that wants more DataCap, then please do.
A
The first round of scoring will happen until June 1st, I believe (sorry, not June 7th), where we'll post updates and feedback in the issue where you applied, and then we'll do another round to update the rubrics and share initial results on the next notary governance call, which is June 7th.
A
If you have any questions, reach out to me on Slack; I'm generally orchestrating the scoring process right now to make sure that we have hands on every application. Every time I open the repo there are many, many more applications, and so it's great to see. Yeah, on that note, I think we should call it. Thank you so much for being here today. We will continue the conversation on Slack, on GitHub and, of course, in the next notary governance call. Thank you.