From YouTube: Filecoin Plus - December 21 Notary Governance Calls
Description
Topics Covered
1. Year end metrics
2. LDN Allocation Step Increase
3. Slingshot Restore
4. Confidential Data
Helpful Links
Slide Deck - https://docs.google.com/presentation/d/1dofSU-cWRvB8ooNBf7VnM1bMu_z_YvyhLRO8zlRXvB8/edit?usp=sharing
Github Issues: https://github.com/filecoin-project/notary-governance/issues?q=is%3Aissue+is%3Aclosed+117
Slingshot Restore: https://docs.google.com/spreadsheets/d/1LWVndxGyegTdz5cPU86UZ5Y9vqN2n-VlK1kC0OeJHC8/edit#gid=1655540378
Storage Provider: https://docs.google.com/spreadsheets/d/1LWVndxGyegTdz5cPU86UZ5Y9vqN2n-VlK1kC0OeJHC8/edit#gid=1655540378
A
Hey, well, good morning, good afternoon, good evening, depending on where you are in the world. Today is the last notary governance call of 2021, so cue either the nostalgia or the excitement about what's to come, and we have a pretty packed day for you. This is being recorded, so whether you're watching later or with us live now, let's take a look at what the agenda is. For what we're hoping to cover, I'll make a pass through it and then pass it over to Galen.
A
We have Deep here to talk about that with us. We have some community flags that came through over the GitHub repo, so we'll allocate some time for that, and, as always, we'll save time at the end. So if there's anything you wish to cover that didn't get set up, we'll have plenty of time to go into it. Galen, is there anything else that we should have on this agenda, or are we good to rock and roll? Then, Galen, do you want to speak to this one? Yeah.
B
I'm just kind of looking back. A reminder for those that are with us: FIP-0003 was created October 2020, so the overall Filecoin Plus program is just a little over a year old. I've been here less than that time, but looking back at some of the stats of what we've accomplished this year: at the first round of notary elections, we had 1.8 pebibytes of DataCap and 12 notaries.
B
Where, I mean, it is safe to assume that those numbers are going to increase even more, that we will have an order of magnitude more notaries and more DataCap going out. Looking at how our DataCap was getting to our clients: in December of 2020, notaries allocated 78 tebibytes to clients; in November of 2021, notaries allocated 1.6 pebibytes to clients. So, you know, I was going to pull December stats, but that wouldn't be fair, and looking at November 2020 stats isn't fair either.
B
So, yes, these are a month and a year apart, but that's a 2,000% increase just in November alone. This year we worked with 103 clients in one month, versus 19 in December last year, so notaries are significantly ramping up the impact that we are having. Looking at our LDN program that launched in June of this year, it's really only been going for a handful of months, and we now have 85 open issues; just in the past two weeks I think we have had something like 30 more applications come through. So we're seeing some significant spikes in that, some of which relate to the Slingshot program, which Deep will talk about later. But with those 85 open issues, 214 pebibytes have been allocated to the multisigs, and 2.6 pebibytes have already been sealed in deals. So, looking again at the impact that our programs are having on the network: massive. Great job, thanks everybody.
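As a quick sanity check on that growth figure, a minimal sketch (assuming 1 PiB = 1024 TiB; the two inputs are just the numbers quoted above):

```python
# Quick check of the year-over-year growth figure quoted above,
# assuming 1 PiB = 1024 TiB.
dec_2020_tib = 78            # TiB allocated to clients in December 2020
nov_2021_tib = 1.6 * 1024    # 1.6 PiB allocated in November 2021

increase_pct = (nov_2021_tib - dec_2020_tib) / dec_2020_tib * 100
print(f"{increase_pct:.0f}% increase")  # ~2000%
```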
B
I can do this one too, yeah. So that was kind of the then-and-now; looking at our metrics specifically, just to remind everybody where we are: we've made it under that two-day threshold for time to DataCap, and our allocations over the past 30 days are getting close to one day, which is phenomenal. Looking at the direct notary path, it's still at 11 days. That's about the same as it has been, I think, so we want to keep thinking about that, and response time from notaries is at six days on average. We still see something similar there, with a slight difference in the breakdown.
B
As we start getting into direct notary activity versus LDN activity: three pebibytes to clients from direct notaries versus 6.7 to clients from LDNs. But what we've been seeing that is interesting is the split from client to deals. As you can see here, in direct notary activity that's getting closer to 50% of the DataCap; that is, clients have used in deals 50% of the DataCap they've been given. In LDNs right now that is closer to 40%. Those have been really close together, and it'll be interesting to see if one really starts to pull ahead, where, when a client is getting the DataCap, they're using it in deals faster and we're seeing more of it go out. So we'll just see. But again: lots of DataCap at the top of the funnel, working its way from notaries and multisigs to clients, and then from clients to deals. Very happy about these stats. Next slide.
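To make that split concrete, here is a minimal sketch of the utilization math described above, using the approximate figures quoted on the call; the function name and exact values are illustrative, not live program data:

```python
# Sketch of the client-to-deals utilization split described above.
# Inputs are the approximate figures quoted on the call, not live data.

def utilization_pct(allocated_pib: float, sealed_pib: float) -> float:
    """Share of DataCap given to clients that has been used in deals."""
    return sealed_pib / allocated_pib * 100

direct = utilization_pct(allocated_pib=3.0, sealed_pib=1.5)  # ~50%
ldn = utilization_pct(allocated_pib=6.7, sealed_pib=2.5)     # ~37%

print(f"direct notary path: {direct:.0f}% of DataCap used in deals")
print(f"LDN path:           {ldn:.0f}% of DataCap used in deals")
```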
B
This slide has direct notary activity: 26 open issues in direct client onboarding, 51 just in the last week, and 11 that have been granted in the last seven days. I think these are mostly about average for what they have been: about 64% are getting a response, and about 23 clients out of the 79 so far this month are getting DataCap. So those, I think, are about where we've been seeing those numbers. And we're still pulling the snapshot of impact and hoping to translate some of that into the leaderboard that I think Kerry will be talking about soon.
C
Hello, everybody, hope you're all doing well. I wanted to bring this back for today's governance call to chat through our different goal areas for Q1. You're all quite familiar with this at this stage, but for those of you that are seeing this for the first time, these are the five main focus areas we have for the different work efforts within the Filecoin Plus ecosystem. The first of which is volume, where the goal is ensuring that DataCap itself is plentifully available and abundant for the trustworthy clients that should be receiving it.
C
The current status of this, actually, as of late last night in my time zone, is 127 pebibytes. Our goal is to get to about 500. The way I do this math is basically: I take that 7.7 number that Galen called out earlier, which is the second number in the parentheses, and then 119.5 is the sum of all the DataCap that has been given to LDNs that have already received at least one signature.
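As a worked version of that arithmetic, a small sketch; the two inputs are just the numbers quoted on the call:

```python
# Sketch of the "available DataCap" figure as described above: DataCap held
# by notaries plus DataCap committed to LDNs with at least one signature.
notary_datacap_pib = 7.7     # called out earlier in the call
ldn_committed_pib = 119.5    # sum over LDNs that received >= 1 signature

available_pib = notary_datacap_pib + ldn_committed_pib
print(f"available DataCap: {available_pib:.1f} PiB")  # ~127 PiB; goal ~500
```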
C
So even if an LDN has been created but that DataCap is never given to the client, it shouldn't show up here, and so it doesn't. We're hoping that this number continues to go up. I think a good way to think about this is: the amount of DataCap that has been given to notaries that will be given to clients, plus the amount of DataCap with LDNs that has been committed to clients. Because we changed the flow so LDNs get stood up quicker, there's no longer a due diligence threshold, necessarily, before the LDN gets stood up, and so this is just a flag: if you go to the dashboard, you might see a number higher than 200 pebibytes, for example, for the amount of DataCap that's sitting with the LDNs. The goals here are basically improving the flow even further, looking at ways in which we can unlock greater volume, understanding responsiveness from notaries in terms of what they need to do, and then, I think the big one here, streamlining client applications: standardizing a little bit on the expected quality and the expected amount of data required from a client. Having a uniform application process is the first step in that direction, but there's probably a lot more.
C
Next up, we've got time to DataCap, which is our leading metric in the Filecoin Plus ecosystem. On this one we've seen tremendous gains in the last several weeks and months. Our goal is, of course, still tremendous gains away, but we're making great progress. There was a time where getting DataCap was a multi-week affair, even at smaller amounts of DataCap. I know we were averaging even above 10 days for the longest time, and that number is starting to drop pretty drastically. I'm sure that's some combination of greater volume coming in through our automated verifiers, because we also recently hit, you know, a thousand unique client addresses on chain that have received DataCap. But of course a lot of it is also because I think notaries are just getting faster and better and more efficient at doing what they were doing before: getting used to what they need to do and, in general, understanding what they need to gather to have the confidence to proceed. Our LDN process still has a very high variance; it's hard to measure this one. We need to get better metrics in place, but this is anywhere from one day to four or five weeks, and so generally bringing this down will probably be great too.
C
This is sort of a proxy for transparency in the system: wanting to generally have more people participate in and understand what's actually happening in the Filecoin Plus ecosystem, but then also to write interesting analysis and tooling on top of it that can aid us as a community in better decision making, better process development, and better program management. And on this one we've had pretty good progress, actually, not because there was any intentional effort necessarily, but rather because it's an interesting topic for a lot of people. We started this year with zero actual published dashboards.
C
We got to mid-year with about two-ish dashboards: one thanks to the FileDrive team that built one, and the other where we've been partnering with DigitalMOB to get one out there. So we've got two dashboards that have gone through several months of vetting and feedback with the community. But as of recently, like last month, two more dashboards have popped up, and one more is in development that tracks fraud and audits more closely. So I think we're doing really well on this front, and I'm excited for more collaboration and more discussion on this, so that we can make all of these more effective and efficient. Just a quick flag:
C
I think I mentioned this at a previous notary governance call, but we now have a channel on Slack specific to Fil+ dashboards. In case you're interested, it's #fil-plus-dashboards. It's a little quiet in there right now, but I'm hoping that it's going to become a hub at some point in the near future for conversation around analytics, data transparency, and data collection. So, excited to see you there if you've got questions or feedback or want to work on something in this space.
C
Next up, client onboarding. I think this is the one where we probably have the least amount of progress so far between when we set these goals, at the end of... well, at the start of, I guess, Q4, and now, but it was also probably a lighter-weight one initially. There have definitely been ideas circulating in the community, especially around just improving documentation, so I'm really hoping that we can get to it as a group early in the new year and make sure we get a higher-quality set of things that ensures both clients and notaries fall into the pit of success with the various mechanisms that we have built out. Last, and absolutely not least, is Filecoin Plus governance.
C
Exciting news on this front is that I think Kerry just facilitated the public posting of the RFP for the notary leaderboard, which is something we've talked about for several weeks as a community, and that I've been excited to see get done. So if you are aware of any teams that are interested (I think there might be more info on this later on), you should feel free to follow up with Kerry in the chat. Definitely take a look at that and see if it's something that you'd be willing to partner on or are excited to help out with in general. I think we're also getting a little bit more used to this combination of discussions, and discussions with issues, which I think has been good, and engagement on that front has been great. And then I'm really excited for the surveys that Kerry has already started doing with some of you, and will continue to do, to ensure that we're able to engage folks in a way that makes sense for them, and to gather that feedback and make things more interesting, engaging, efficient, and effective. So yeah, good progress on that one for sure. Really thankful that Kerry is here to spearhead many of these efforts.
D
Please go ahead, though. Yeah, I just have one question. So, many people that are doing this Slingshot repair program are asking for 100 terabytes of DataCap. What are we supposed to do with that? Because, as an example, I received two requests from Chinese miners. So, I mean, they're our clients, let's say; what am I supposed to do? Am I supposed to support them? One hundred terabytes is too much for me anyway; that's my total DataCap allocation. So how should we manage that? Is it supposed to be covered by the local notaries? Are we going to have more DataCap to distribute to support them? What's the plan, and what's your idea on that?
C
Let's talk about that further down; you missed the start. I think Galen has given us time to talk specifically about Slingshot Restore, so we'll get to it in a bit and we'll discuss it then. A great question, though, and I think very, very topical.
A
All right, everyone, you are going to see your calendar update tomorrow. We are going to be resetting the cadence for this, so this is just one of those things we're talking about before we do it. You'll also notice that next week is the holidays here in the States, celebrating Christmas, so we're going to have reduced response times. What that means is that if you write into Slack or there's a big issue... I don't think we can ever get away from checking our phones; welcome to 2021. I check mine before I get out of bed and right when I crawl into it. But if there's anything major, let us know and we'll obviously triage that with firefighting. Otherwise you'll see reduced response times for anything in the repo or anything on Slack, as we take some time to celebrate with our families and recharge; hope you do the same. When we come back, the first sync of 2022 is going to take place on January 4th, so you'll see a brand new invite come through; I'll be messaging you about it.
A
What we do is we have the notary governance call that you can add on the Google calendar; we'll still do that, and I'll also be sending pings. Sometimes it's helpful to have that calendar manually added to your own, so if you want to be on that invite list, I'll send you a ping, and that makes it nice if you want to have it be part of your own calendar without having to follow it. So we'll follow up with that, if that's helpful for you. And then, taking place going forward: January 11th is when the new schedule starts, every other Tuesday. We're still getting feedback on the times to make sure that this time works for all of you in all the time zones; incorporating the entire globe has been a little bit of a challenge, so we're constantly tweaking the format to make it work for everybody, and we'll still collect your feedback. If you have thoughts, please let us know. So, with that.
A
So what we have in mind, and what we've submitted for the RFP, is essentially a public display of awesomeness. And just because a lot of us are nerds, we have it laid out in a video game format where you can actually see how we stack rank. What we're doing is taking some of the fields from the work that's being done, so: how long has someone been a notary? What's their time to DataCap? You can see the board that we have set up. So that RFP is out and we're looking for bids on it. If you know somebody in the field that wants to submit for this, we're hoping to get started ASAP, so we're blasting this wide and proud, and hopefully this will get launched in early January or February.
A
We'll have this up where you can start to see some of these metrics and have some comparison charts as we go. So, really excited about this, and also really keen on feedback, so you'll be seeing some more polls come out. The goal of those is just really quick check-ins on what your thoughts are and what we can do better. It's designed to be two minutes of a survey or less, right there in Slack; obviously it's completely optional, but that feedback gets taken into account.
B
Sweet, thank you, yeah. So if you want that money, or if you have friends that are great developers and they want that money, jump on that. The Filecoin Foundation is trying to get grants out the door; that's our job, or one of our jobs. Okay, so, shifting gears here: we had a discussion topic for a while that I've now made into a proposal, to add two more steps to the LDN allocations. Okay, Kerry, would you mind clicking on that governance issue?
B
Specifically, I'm going to open it in another tab. I don't think this is going to be super controversial; I'm just going to talk through it really quick, and then we're going to pause for a few seconds and take questions. The current calculation has three steps, right, and we're familiar with those: it's either 5% of total or 50% of weekly; 10% of total or 100% of weekly; 20% of total or 200% of weekly. And that's where it has capped out. I think we've had three clients get to this third allocation at this point, and it creates a bit of a slowdown for them to only ever have a max of 200% of their weekly. So it seems very safe, and it is a very low technological lift, to just add two more allocation steps, so that the fourth allocation would be 40% of total or 400% of weekly, and the fifth allocation 80% of total or 800% of weekly.
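A minimal sketch of that five-step ladder. Each tranche is taken here as the lesser of the two figures, which matches the worked example given later in the call (a 50 TiB first tranche for a 10 PiB request at 100 TiB per week); treat that choice, and the names, as illustrative assumptions rather than the bot's actual code.

```python
# Sketch of the LDN allocation ladder: the three existing steps plus the
# two proposed ones. Each step is "X% of total or Y% of weekly"; the lesser
# of the two is assumed here, matching the 50 TiB first-tranche example
# quoted later in the call (10 PiB total, 100 TiB/week).
STEPS = [
    (0.05, 0.5),  # 1st: 5% of total or 50% of weekly
    (0.10, 1.0),  # 2nd: 10% or 100%
    (0.20, 2.0),  # 3rd: 20% or 200% (current cap)
    (0.40, 4.0),  # 4th (proposed): 40% or 400%
    (0.80, 8.0),  # 5th (proposed): 80% or 800%
]

def tranche_tib(step: int, total_tib: float, weekly_tib: float) -> float:
    pct_total, pct_weekly = STEPS[min(step, len(STEPS) - 1)]
    return min(pct_total * total_tib, pct_weekly * weekly_tib)

# Example: a 10 PiB (10240 TiB) request with a 100 TiB weekly rate.
for step in range(5):
    print(f"allocation {step + 1}: {tranche_tib(step, 10240, 100):.0f} TiB")
```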
B
This still gets us that scale: if we have an LDN where we don't really know the client and we don't really know the project, they're still starting with a small amount of DataCap up front while they start making deals, and it lets us see. But they know that they can get to an eight-week runway of DataCap. This is not necessarily saying we won't still investigate other mechanisms; this is just a proposal to add two more steps. So I want to pause here and see if there are specific questions, comments, or concerns about this, and you can also post them in the issue itself.
B
All right, I take that as you all love it and think it's the best idea you've heard of since eggnog. If that's the case, we're gonna discuss it again; I may want to push this forward so that it's ready in January to be implemented. We've already talked to the Keyko team, and it's not incredibly difficult to add those extra calculations to the code, so we will keep you posted, and by the next notary governance call, on the 11th, we may just do one more round on that discussion topic. I think that's it; back to the slides. The next thing up is the Slingshot restoration initiative. Over to Deep. All right.
C
Cool, let's talk about this. I'm sure a lot of you have already seen stuff on this in Slack or heard about it in general before this, because even if you hadn't, you heard Julien talk about it precisely four minutes ago, so you've definitely heard about it now. To give you a little bit of context: there was an incident that resulted in a tremendous data loss from the Filecoin network approximately two weeks ago.
C
Less than that, actually; it was the prior weekend, so about 10 days ago or so. It resulted in the loss of several sectors, some with nothing in them and some with something in them, but the biggest losses were on the Slingshot front, where close to 13 pebibytes of content was lost over the course of the weekend because of a fire in a data center. So, in general, we kicked off recovery efforts in Slingshot using two parallel programs: one called Restore, the other called Repair. You've heard less about Repair because it has a much smaller scope and is flowing through Estuary today. So from a "what do we care about in Filecoin Plus" perspective, that's already sort of accounted for; there's a bunch of vetting happening to ensure that Estuary is continuing to do the same level of account management, storage provider diversification, and all the other things that resulted in them getting DataCap as a deal broker for the network in the first place.
C
Data loss of this nature is not common, but it definitely happens, even at the scale of web 2.0 cloud providers. It's just that end users typically have no idea that it happened, because those providers are so good at rebuilding replicas, flipping over traffic, and ensuring that there's always somewhere for the request to go, that it is very, very difficult, as a user of a web 2.0 cloud, to realize whether a storage service has actually experienced any sort of interruption.
C
So what we're trying to do with Slingshot Restore is set the precedent for the first time this actually happens at scale on the Filecoin network, and we're kind of excited to see what we can do in terms of testing the limits of getting data at this scale back onboarded onto the Filecoin network. Specifically, we're talking about approximately 2.5 unique pebibytes of data sets that were identified as part of the Slingshot program, which are then getting replicated at least five times, resulting in 12.5 pebibytes of Slingshot data being re-onboarded to Filecoin.
C
A lot of this is more painful work in terms of needing to go back to the source of the data: the original bucket on S3 in the best-case scenario, but otherwise going to the research program's website to download the data set and store it. So clients that are participating in this program are applying to participate and being manually selected based on when they applied, their experience, location, and their ability to process data quickly, and then being given access to a list of miners that are also applying to participate and being vetted based on experience, location, ability to seal quickly, and also access to capital.
C
Frankly, we're asking them how much Filecoin they have, because it's going to limit their ability to onboard verified versus unverified content, in terms of the collateral that they will need to front as storage providers. So we're facilitating this as a much more curated marketplace for deals, where a specific set of clients, with a specific data set assigned to them, works with a specific set of storage providers, and where there are requirements to decentralize: across those five replicas there must be coverage of at least three different countries and two different continents.
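A small sketch of that decentralization check (at least five replicas spanning at least three countries and two continents); the record layout and example values are hypothetical, not the program's actual tooling:

```python
# Sketch of the Slingshot Restore decentralization requirement described
# above: at least 5 replicas covering >= 3 countries and >= 2 continents.
# The Replica record and the example plan are hypothetical.
from dataclasses import dataclass

@dataclass
class Replica:
    provider: str   # storage provider ID
    country: str
    continent: str

def meets_requirements(replicas: list[Replica]) -> bool:
    countries = {r.country for r in replicas}
    continents = {r.continent for r in replicas}
    return len(replicas) >= 5 and len(countries) >= 3 and len(continents) >= 2

plan = [
    Replica("f01001", "US", "North America"),
    Replica("f01002", "CA", "North America"),
    Replica("f01003", "DE", "Europe"),
    Replica("f01004", "NL", "Europe"),
    Replica("f01005", "US", "North America"),
]
print(meets_requirements(plan))  # True: 5 replicas, 3 countries, 2 continents
```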
C
There's also work planned early in Q1, for example around indexing, where we want to ensure that the data we're collecting leverages what's available on the network in a way that's truly useful. We're going as far as actually syncing with clients on a weekly basis to ensure that they actually understand the requirements, are complying with them, and are able to make progress, and, if not, that they choose to drop out as soon as possible so they can pass that torch on to the next person. It also includes them publicly stating the exact storage providers they're working with and specifically assigning them to data sets. So it's a very auditable process, and everything is available for anybody to look at. With that, maybe hop to the next slide and I can walk you through some of the content that's available, because what's happening today is that a lot of these folks are asking to get DataCap, especially through the LDN process.
C
Oh, it's Kerry who's driving; my bad, sorry, thanks Kerry. Julien, you mentioned, you know: do you want to give your own DataCap for this? And I actually think that's probably not a good idea, mostly because, firstly, each of these clients is going to require 500 tebibytes of DataCap at minimum, right, because each of them is doing at least a 100-tebibyte set of data and going to replicate it five times. No single notary here, other than maybe four, has access to more than about a pebibyte of DataCap that could be usable here.
C
Also, because this is something where somebody might not fully go all the way through the program, having gates that give them tranches makes a lot more sense, where we can see the progress they're making and continue to give them DataCap. And this program kicked off last week and is probably going to run non-stop at full velocity through the next four or five weeks.
C
So doing this as an individual notary is probably not the move. I would say it's much better to go via the LDN process, and to recommend that the clients reaching out to you do the same, just because it's so transparently auditable that you don't need to build that individual trust with the client as much; you can just do it in the open. And so, some links probably worth clicking on, Kerry, those three, if you don't mind: where it says data set assignments, address info, and approved SPs. Yeah.
C
If you scroll to the right, this includes how much they're able to store, where they're located, and so on, so we can ensure SP diversity. If you want to hit the next link in the slide: boom, these are their actual addresses registered on chain and the storage providers they're committed to currently working with. This is compelling because it should map to exactly where DataCap is being requested.
C
So when an application comes in for an LDN, and you're a notary, and the application claims to be coming in for DataCap for this program, here's where you can actually look at the client name, the Slack handle, and the address that they've registered with the program. And for the actual storage provider deal distribution plan, this is literally it: we have all the data, and so we can validate that the deals are going where required. And the last link in the slide is helpful.
C
Oh wait, did I copy the same link twice? The first link is probably incorrect on the slide, then, because that should have been data set assignments. That's okay. If you go back to the third tab that you have in your browser... two over... one to the left of that... that one! Yes, if you go to the next tab in the spreadsheet, where you see data sets. Cool, yeah. So this is the actual list of data sets; I'll fix the link in the slide.
C
So you have the mapping between which clients are working on which data set as well. This is basically taking the data that was lost from the network, in that the CIDs are not available anywhere on the Filecoin network; knowing that, we basically have to regenerate it by reprocessing the data. That resulted in these data sets being identified as the ones that need to be re-stored onto the Filecoin network. We're trying to be conservative here, in that we're definitely picking more unique data than before, because in Slingshot the average replication factor was closer to eight, so less unique data was probably lost; it was just replicated more times.
C
But this is more interesting because, this time through the process, we're getting a much higher quality of compliance, storage, and partnership with clients and storage providers, to ensure things are happening correctly. And this particular page is where you can actually check, so I'll update the link, but basically it gives you a mapping that says: for this client name, this is the set of data sets they're actually working on.
C
So then, when you go to that tab that Kerry has open, on the issue list for the large data set applications that are coming in (the last one in your browser, sir; over to the right; that one, yes), okay, these you can actually go and validate. You can see, if you click into an application... I don't know, let's click the first one; I haven't looked at this set, so I'm not entirely sure. Okay, so this one looks to be from flyworker, who is, oh, it's Charles Cao, and he specifically is working on storing the fly brain anatomy data set and is requesting DataCap for it. And so what you can do is actually go and look at the list of clients that are participating.
B
Yeah, jumping in here, I was just gonna say: Kerry, if you could hit back one and just go back to that overall view. At this point we have nine open LDNs on the Slingshot Restore, and a number of them have already been granted; some of them are still waiting for signatures. Please go sign the ones that are waiting for signatures.
B
As
deep
is
saying,
there
is
a
lot
of
auditing
and
diligence
happening
and
he
does
bring
up
a
point
of
the
question
of
like.
Should
we
be
adjusting
requested
amounts
before
approving
them,
which
we
can
discuss,
but
a
number
of
the
applications
themselves
were
a
little
bit
light
because
all
of
this
information
about
what
they
were
assigned
wasn't
determined
when
they
put
the
application
in.
So
we
can
kind
of
push
back
on
these
clients
to
go
edit
their
applications
or
add
some
more
details
as
well.
C
Yeah
so
the
last
thing
I
said
in
the
chat
I'll
added
that
on
the
slide,
so
you
have
it
be
looking
at
the
deck
later,
but
that's
the
list,
mapping
the
so
yeah
on
the
you
have
the
person's
name,
the
slack
handle
and
the
actual
data
set
assignment,
and
the
next
tab
tells
you
what
data
sets
are
comprised
in
each
of
the
assignment
like
groupings
and
so
there's
21.
B
An
ask
on
that
on
that
other
client
assignment
sheet,
if
we
could
get
more
like
if
we
could
get
their
address,
I
saw
their
address
in
another
place,
but
on
that
one
where
it
just
has
name
and
slack,
because
not
everyone's
github
lines
up
to
that.
So
there
were
a
couple
questions
so
deep.
I
don't
know
that
was
posted
in
the
notaries
channel
yesterday
I
don't
know
if
you've
seen
it,
but
that
client
assignment
list
where
it
just
has
name
slack
assignment
number.
C
I
think
address
is
probably
the
right
way
to
go,
because
that's
what
we're
tracking
is
all,
and
so
I
could
just
make
another
sheet
that
combines
those
things
into.
Maybe
one
view
and
share
that
with
the
notaries
easy
gary.
Do
you
mind
switching
back
to
the
slides
cool,
so
the
the
deal
asked
like
here
actually
is
that
the
restorer
is.
C
It
is
the
first
time
we're
actually
end-to-end
testing
high
quality
restoration
of
data
loss
on
the
network,
setting
the
tone
for
how
we
view
it
from
an
urgency
and
a
prioritization
perspective
is
going
to
be
paramount
to
ensuring
long-term
success
in
the
network
and
ensuring
that
the
network
is
built
to
handle
this
kind
of
loss
and
failure,
and
so
having
you
folks,
engaged
and
checking
this
stuff.
C
C
You
know
either
here
or
in
slack
or
anywhere,
but
with
the
idea
being
that,
basically,
we
are
probably
running
the
the
largest
effort
on
the
network
so
far
to
get
this
amount
of
data
onboard
in
such
a
short
amount
of
time
and
like
really
testing
the
limits
on
like
a
physical
scale
even
with
when
it
comes
to
networking
hardware
compute
things
like
that,
and
so,
given
that
there's,
like
significant
amounts
of
wedding
happening
tracking
happening,
audible
content
put
out
there,
I'm
hoping
that
for
the
projects
that
are
looking
for
the
support
of
falcon
plus,
because
not
every
everybody
is
actually
because
of
collateral
requirements
from
storage
providers,
but
for
the
ones
that
are
we're
able
to
funnel
them
through
as
required,
and
that
also
then
means
that
some
nurturings
are
probably
going
to
get
pinged
later
this
week
and
next
and
so
in
case
you
are
around
and
you
happen
to
not
be
off,
then
would
really
request
your
time
in
helping
process
some
of
this
stuff.
C
I
also
noticed
that
in
attendance
we
have
a
couple
of
slingshot
just
to
all
clients.
I
see
shinon
faye
a
couple
others
in
here
as
well.
If
you
guys
want
to
add
anything,
please
feel
free
to.
C
F
Hey, we can hear you, yeah. Okay, so I'm going to ask two questions about the Slingshot Restore. Actually, a lot of the storage providers come to us asking about joining this program. So is this program open to Chinese storage providers?
C
So
I
think
at
this
point,
the
majority
of
the
search
providers
that
were
put
through
were
actually
in
n
a
and
europe.
My
understanding,
like
some
of
the
storage
providers.
I've
spoken
to
in
china
are
a
little
bit
concerned
with
onboarding
at
this
rate,
especially
with
like
regulation
stuff
and
so
like.
I
think
the
focus
right
now
is
still
like
nau
replicas
of
the
data
eric.
If
you,
if
you
have
specific
thoughts
on
that,
I
would
love
to
like
follow
up.
C
We
can
do
that
in
a
dm
or
chat
about
it
and
see
how
we
can
get
more
validated
search
providers
in
the
china
region
also,
but
a
lot
of
a
lot
of
folks,
I
talked
to
flagged
that
it
was
a
risk
from
a
regulation
standpoint
and
I
don't
know
if
we
should
be
pushing
people
to
over
commit,
because
this
is
like
a
very
sort
of
fast-paced
thing.
That's
going
to
probably
take
a
lot
of
bandwidth,
take
a
lot
of
time
and
effort,
and
so
just
want
to
make
sure
that
we're
doing
things
correctly.
F
Understood. So that means case by case, right?
C
Probably-
and
I
would
love
your
advice
because
I
mean
I
I
don't
under-
I
don't
know
the
the
regulatory
environment
as
well
as
you
do
probably
and
so
happy
to
take
suggestions
and
advice
and
feedback
on
that.
F
Okay,
that's
cool,
okay,
question
number
two,
so
I
say
so
we
will
go
through
with
all
the
ldn
and
we
will
use
our
ledger
to
approve
it,
but
actually
our
laser
gets
something
wrong
again
like
last
time.
So
when
we
approved
the
data
sets
sorry
the
ldn,
then
we
found
the
the
transaction
called
propose
again.
So
when
I
help
yanfei
to
approve
his
data
set,
but
actually
they
show
me
proposed,
so
they
got
stuck
so
we
don't
know
why.
So
maybe.
F
B
So, three or four days ago, one of the bots shut down again. When a message would go through the front end, it would post in the GitHub comment as if the message had worked successfully, but it hadn't: because one of the bots was off, it didn't change the label, so the app didn't record an approval count of one. So then, when a notary went to sign again, even though there was a pending transaction, because the app doesn't catch that approval count of one, it still says approvals: zero. When another notary goes to sign, it starts another transaction ID. So, Fay, this is what's happening with yours. This happened for a number of different notary LDNs this weekend: they would send a message, but because it didn't change the labels to "start sign" and didn't change the approval count, the next notary to send a message started another transaction. So now there are multiple LDNs where there are two transaction IDs. I think I have caught and edited all of those labels, the bots are back online, and we're seeing more success.
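To illustrate the failure mode described here, a toy sketch of the signing flow: if the app keys off a GitHub label rather than the on-chain pending proposal, a missed label update makes the second notary open a fresh proposal instead of approving the pending one. All names and state here are illustrative, not the actual Fil+ registry code.

```python
# Toy model of the double-proposal bug described above: the app tracked
# approvals via a GitHub label, so while the label bot was down, the first
# signature landed on chain but the app still saw zero approvals, and the
# next notary's signature opened a second multisig proposal.
pending_txs: list[dict] = []   # stand-in for on-chain multisig proposals
app_approval_count = 0         # what the app sees (label-driven)
label_bot_online = False       # the bot was down for a few days

def notary_signs(notary: str) -> None:
    global app_approval_count
    if app_approval_count == 0:
        # App believes nothing is pending, so it proposes a new transaction,
        # even though a proposal may already exist on chain.
        pending_txs.append({"tx_id": len(pending_txs), "approvals": [notary]})
    else:
        pending_txs[-1]["approvals"].append(notary)
    if label_bot_online:
        app_approval_count += 1   # normally keeps the app in sync

notary_signs("notary-1")   # proposes tx 0; label never updates (bot down)
notary_signs("notary-2")   # app still sees 0 approvals, so it proposes tx 1
print(len(pending_txs))    # 2 pending transaction IDs for one LDN
```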
B
Also, we are working on, and are very close to having, a log explorer that will be able to post all of the bot logs publicly for the app, so that in a situation like this ("I posted this message, the bot didn't recognize it, what's going on?") we can see what happened and troubleshoot and debug faster. So that's the situation.
F
Okay, okay. So, according to the schedule of the community's holidays: in China we will keep working until the end of January, while that's more than two weeks of holiday on the U.S. side, right? So we hope we can fix the problem, and we will keep working on all the LDNs; otherwise we will stop our work. Okay, thank you.
C
Cool,
I
think,
unless
there's
any
other
questions
on
slingshot
restore
side,
I'm
going
to
turn
it
back
to
any
additional
discussion
topics
that
you
want
to
flag
gary
and
gillen
and
others
as
well.
Of
course,.
D
No,
please
go
ahead.
I
just
wanted
to
know
like
do
we
have
any
ideas
when
is
going
to
be
the
the
next
action
notary
election.
C
I don't know if we have a specific date yet, but realistically, typically we want to announce these things and then keep applications open for two to three weeks, and then it takes about two to three weeks to do all the stuff after that. So the earliest would probably be finishing the entire round by the end of February, and so probably a start somewhere between, I'd say, mid-January and mid-February. So yeah. Okay. Oh yeah, I forgot to call that out.
C
Actually
we
had
our
first
ldn
finish
like
actually
give
out
all
the
data
cap
that
it
had
successfully
meeting
the
compliance
set
out
by
the
client
who's
in
the
call
right
now
in
terms
of
the
search
writers
they
wanted
to
use.
So
that
is
pretty
epic.
Congratulations
like
great
work
as
a
client
like
keeping
up
the
code,
keeping
up
the
compliance
and
keeping
the
entire
community
aware
of
like
the
work
that
you're
doing.
G
Yeah,
I
just
want
to
add
something
smaller
like
a
request,
as
mentioned
just
now
so
because,
like
in
china,
we
are
going
to
have
like
national
holiday
from
the
start
of
the
february,
so
we
are
probably
not
gonna
work
from
the
end
of
the
january.
So
yes,
I'm
so
like
maybe
doing
elections,
we
might
encounter
some
problem
at
that
time.
C
Oh, my god. I also noticed that a couple of things I sent in chat went to the waiting room; they didn't switch to the lobby after I'd messaged, so I'm gonna re-send a few things in the chat shortly. Thank you for letting me know.
I
Yeah, sure. Really appreciate getting on the agenda, and thank you all very much for your attention, and happy holidays, all that good stuff. So I'm brand spanking new to Filecoin and to Seal Storage, which is a data storage provider on the network. We just added about 20 petabytes, or we're in the process of it right now, so we're very excited about that. But I come from the world of scientific instruments and R&D; I've built stuff for NASA and NIST and DOE and things like that. I'm a mechanical engineer, so I'm not a blockchain person at all; I'm coming at this from a business standpoint and from a "where is there really exciting data" standpoint. And so I've been reaching out to a lot of my contacts in interesting scientific places and finding a lot of innovators in the space, even, you know, kind of where we are, like before early adopters: people that have been dealing with petabytes for decades, who probably tried the data center thing and are now fighting with Amazon, and then they talk to us and go, "wow, there's this kind of awesome decentralized thing out there; let's give it a shot." And because these are large scientific organizations, sometimes there's some politicking that has to happen internally before a major decision can be made to try some new flavor of cloud storage; that's how this is viewed.
I
So, anyway, we have a very exciting customer. The data itself is not confidential; it's just that the customer wants to try a 10 petabyte pilot, which is a small amount of data for them. This is particle accelerator data; it's compressed, not encrypted, and they just want to keep the pilot confidential, to prove it out internally and get the influence that they need inside the org, so they can go to management and say, "hey, look, this is awesome; we should really do this in a big way." So that's where this comes from. We have maybe five or six other clients like this; we're not sure if they're all going to want the pilot to be confidential, but we're kind of stacking up some exciting scientific customers: dark matter, climate data, genome data, some really cool stuff. So this is unique to this first customer; others may have this requirement, I'm not really sure yet. I'm learning in real time.
B
I mean, I have some questions. So it sounds like the data itself is not encrypted; the data could be publicly available and downloaded by other members of the community. Is that correct? ("That's correct.") And so, in that situation, what you're asking is to keep the source, the client, confidential for now, but to still go through a Filecoin Plus verification process, to then make these deals as verified deals with DataCap. Correct? ("Correct so far, 100%, yep.") Okay. And so then the last question that I have is: you're representing a storage provider.
B
Is
the
intent
here
that
for
these
10
petty
bites,
you
would
be
storing
10,
petty
bytes,
which
is
one
copy
or
one
copy
which
is
10,
petty
bytes
with
yourself.
Would
you
also
be
storing
multiple
other
copies
that
are
all
also
10,
petabytes
total,
with
other
decentralized
non-local
storage
providers
that
you
are
not
in
ownership
of.
B
You know, I think that's the stickiest wicket for me, because part of the large data set process, specifically, is that these deals should be made across multiple storage providers to decentralize them, and, you know, to spread out that large data set and that DataCap, both in different geographic regions and also across different scales of storage providers. And so one of the things that we look for as we are doing diligence is the percent allocation to a single storage provider. If a large data set gets approved and then turns around and is only working with one storage provider, that is typically deemed suspicious, as not following the current state of the program. I think the aspect of keeping the client themselves confidential is easier to get around, in terms of how we can become comfortable with that: how much diligence we can do, and being able to provide samples of the data, you know, for the notaries to look at.
B
The current ceiling is five pebibytes for a single application, and typically we see about a hundred tebibytes a week. And so, as we talked about earlier in the call (I don't know if you were there for that), the allocation scales: your first allocation is either 5% of total or 50% of weekly, so your first allocation would probably be 50 tebibytes of DataCap, and once you have used 75% of that, you could get the next allocation, which would probably be 100 tebibytes.
B
It
would
also
be
highly
unlikely
for
us
to
give
a
client
a
single
client
address,
10
pebby
bytes
of
data
cap
like
straight
out
the
door
so
again
in
thinking
about
how
to
make
this
kind
of
map
a
little
bit
more
aligned
to
like
things
that
we've
been
doing.
Oh
sorry,
yeah
we're
at
time.
It
may
be
that
we
need
to
push
back
and
say
what,
if
we
did
five
copies
of
one
petabyte
across
some
decentralized
people
and
started
there,
and
would
that
be
a
good
enough
test.
I
Yeah, this is all good advice, and we'll have to... I guess this is brand new, so we'll have to take it in and develop a plan. On the other side, you know, we have this customer and we're looking to prove out a concept for them, so we're walking a line on the scope that they want to embrace. And a lot of this, as I'm sure you know, is building trust, right? And storing data right; this is, you know, the ultimate test of that, and so we've been working hard on those relationships. I guess I would be curious to hear a response on this: if we're going to chop this up, then we're now highly reliant upon brand new relationships to help us with this very long and existing relationship, and you can see how that kind of makes it, I'll say, challenging. So I guess I would love to hear a comment on that. Like, if we're gonna adopt three or four or five partners, you know: are they all enterprise level? Those kinds of things.
B
Right
that
is,
that
is
sort
of
the
the
question
with
how
do
you
pick
a
storage
provider,
and
I
think
this
goes
to.
We
have
some
people
that
are
acting
as
deal
brokers
where
they
are
sitting
closer
to
the
front
working
directly
with
clients
and
they
have
relationships
with
a
group
of
storage
providers.
B
Textile
and
estuary
are
two
examples
where
they
are
working,
as
you
know,
deal
brokers
to
connect
to
different
storage
providers.
It
sounds
like
you
are
fitting
in
on
a
different
part
of
this,
like
marketplace
where
you're
a
storage
provider,
you
know
going
out
and
doing
business
development
with
clients
to
work
directly
with
you.
So
to
your
point
of
like
well,
how
do
I,
how
do
I?
How
do
we
find
other
storage
providers
that
we
have
as
much
confidence
in
them
as
we
do
with
ourselves,
because
our
reputation
is
going
to
be
on
the
line?
B
It
could
be
a
question
of,
should
you
coordinate
with
one
of
these
brokers,
and
maybe
that
would
be
a
starting
point?
Anya.
I
don't
know
if
you
want
to
jump
in
here.
J
I was already chatting with him in private during the whole call, inviting him to Discord, and so, yeah, I mean, we're working on this particular issue you're addressing. So it would be really cool if you could find time and we could hang out outside of the governance call; that would be really nice.
B
Awesome. So yeah, I understand that doing this kind of very high-level, enterprise-level business development requires a lot of relationship building and a lot of trust. I think that we don't have a simple, flat answer for you right now, today. My two big flags: we've already talked about maybe getting in front of some of these other people in the community, who can also help, and then maybe the next step is finding other storage providers using, you know, filrep.io, a couple of different reputation sites, and various Slack channels for deal making. Also, talk to Fay, who was on the call earlier: he acts as a client to do some of these Slingshot programs, and he has storage providers that he works with that are in his network.
B
That
could
potentially
be
another
person,
and
maybe
it's
a
situation
where
you
know
you
need
to
onboard
somebody
as
a
consultant
to
find
another
way
to
like
kind
of
decentralize
and
I
think
to
deep
point.
B
The
the
pushback
here
to
your
client,
as
you
say
we
are
advocating
for
this,
because
this
is
why
this
is
best
practice
in
this
network.
You,
the
client,
are
going
to
get
the
choice.
We're
going
to
present
to
you.
You
know:
here's
our
level
of
storage
provider,
service,
here's
six
other
options.
We
encourage
you
to
pick
three
because
part
of
doing
this
is
the
act
of
decentralizing.
B
That's
why
you
choose
this
marketplace
so
that
you're
not
beholden
to
some
other
entity,
telling
you
and
being
in
total
control
of
your
data
set
where
it's
located,
what
it's
stored
on.
Is
it
at
risk
to
catastrophic
failure?
That's
like
that's
the
value
prop
of
the
file
coin
network
and
so
being
able
to
kind
of
push
that
back
to
them
and
say
this
isn't
a
challenge
this.
This
is
the
whole
point
of
doing
it
this
way.
But
yeah
you
know
I
don't
do
I
don't
do
biz
dev.
I
I
just
so,
and-
and
I
guess
maybe
I
would
talk
with
anya-
about
what
a
kind
of
data
storage
plan
would
look
like.
Is
that
what
I'm
saying
yes,
yeah,
yeah
yeah,
oh
okay,.
B
Yeah
me
with
anya,
and
you
guys
like
get
an
understanding
like
how
textile
functions
and
like
how
those
deal
brokers,
kind
of
manage
their
relationship
with
storage
providers,
and
they
can
give
you
know
they
can
give
their
two
cents.
A
No problem; I was just gonna say the same thing. Gregory, thanks for coming on, thanks for the questions. What we'll do is migrate that conversation over to Slack and make sure that we've connected you guys. I shared the link to the deck, and we'll put the recording on YouTube as always. If there's anything else you want to flag, please let us know. And the big takeaway, again: with the holiday coming up, you might see response times delayed.
A
If you send us multiple things, though, like "this is on fire," you will obviously get our full attention on it, so just keep that in mind. Otherwise, the warmest of merry Christmases and happy holidays to you all, and we'll see you on the keys. Deep, Galen, any closing thoughts?
A
Yeah
all
right,
well,
hello,
hello,
hello,
as
we
fist
pump
and
smile
into
the
last
notary
governance.
Thinking,
2021
just
wanted
to
start
with
a
very
friendly
hello
as
we
kick
off.
We
went
right
up
to
the
the
last
minute
and
we
went
seven
minutes
over
on
the
morning
session
of
this.
So
we'll
try
to
be
conscious
of
time
and
we
might
go
a
little
bit
faster
because
we
had
some
questions
that
came
up
as
a
reminder,
we'll
always
post
a
video
to
youtube
and
have
that
in
slack.
A
If
you
curious
to
watch
the
morning
session
or
have
any
questions,
I
put
the
deck
inside
of
the
slack.
If
you
have
any
questions
too,
as
we
would
follow
along.
So
with
that,
let's
take
a
look
at
what
we're
going
to
talk
about
today
on
the
agenda
galen's
going
to
walk
us
through
like
a
year
in
review
and
look
at
some
of
the
milestones
we
hit
in
2021.
A
It's
kind
of
a
look
back
and
growth
a
little
bit
of
back
padding,
which
is
well
in
order
we're
going
to
do
just
a
quick
refresh
from
deep
on
what
are
the
goals
that
we're
looking
at
for
2022
and
what
we're
prioritizing
making
sure
that
tracks
in
line
with
your
expectations
from
the
program
we're
going
to
spend
a
moment
or
two
on
just
kind
of
like.
What's
on
the
schedule
for
us
in
the
community?
What
days
are
we
meeting?
What
days?
A
Are
we
not
really
just
making
sure
that
we
don't
leave
anybody
high
and
dry
on
that
also
check
in
on
holiday
planning?
After
that,
we're
going
to
spend
a
little
bit
of
time
on
galen's,
going
to
walk
us
through
the
ldn
setup
and
some
issues
going
on
then
we're
going
to
dedicate
a
lot
of
time
for
a
walk
through
from
deep
again
on
the
slingshot
issue.
There's
probably
been
a
lot
of
questions
and
he's
gonna
walk
us
through
a
high
level
detail
some
of
the
the
decks,
the
slides
and
the
databases
that
we're
tracking.
A
With
that,
we
had
a
question
come
up
in
the
repo
and
gregory
joined
us
in
the
morning.
So
we'll
just
kind
of
brief
you
on
what
that
question
was
making
sure
that
you're
right
on
and
then
lots
of
time
for
any
questions
that
you
may
have
so
with
that.
If
you
have
anything,
you
want
to
add,
as
always
just
a
mute
post
in
chat
and
we'll
jump
through
it.
So
with
that,
let's,
let's
take
a
look
back
at
2021.
B
Yes: the year was 2020, the month was October, Galen was just a twinkle in Filecoin's eye, and FIP-0003 was pushed and ratified, basically kicking off the Filecoin Plus program. So, just as a reminder as we talk through these metrics: Filecoin Plus, and the Filecoin network in general, is still pretty young. The first round of notary elections happened, you know, late last year into the beginning of this year, with 1.8 pebibytes of DataCap going out to 12 notaries. Where we are now: we have more than double those notaries, and 7.7 pebibytes of DataCap has gone out to notaries to then allocate.
B
Looking at the client impact: in December of 2020, in that month alone, 78 tebibytes went to 19 clients; in November of 2021, 1.6 pebibytes went to 103 clients in that month. So that's a 2,000% increase in our client impact; very exciting, in my opinion. The LDN program launched, kicked off officially, in June of this year, and since June we are now looking at 85 open issues, with 214 pebibytes at the top, in the multisigs (so not necessarily out to clients yet, but sort of locked up at the top), and then 2.6 pebibytes have made it to deals.
B
In
this
past
year
the
network
has
has
grown
in
a
lot
of
different
directions,
but
specifically
the
filecoin
plus
program
I'd
say:
has
accelerated
a
lot
of
the
network
growth
both
in
notary
engagement,
client
engagement
and
deal
flow,
so
very
exciting
stats
there,
let's
jump
into
kind
of
our
typical
stats
view.
Looking
at
our
metrics,
our
average
time
to
data
cap
is
still
under
two
days.
We're
excited
about
that.
We're
getting
that
average
in
the
past
30
days,
even
lower
we're
getting
close
to
that
one
day.
B
Time,
which
is
great.
We
want
to
keep
driving
that
down.
I'm
still
seeing
about
11
days
for
direct
notary
path.
That's
been
pretty
similar
to
what
it
was.
11
to
12
days
about
six
days
to
get
a
response
from
a
notary
through
the
direct
path.
That
number
again
is
about
the
same,
as
has
been
for
a
few
weeks.
B
Looking,
you
know
more
closely
at
this
funnel.
We
have
you
know
three
pebby
bytes
that
has
gone
to
clients
in
the
direct
notary
process
of
that
three
one
and
a
half
pepe
bytes
is
being
used
in
deals.
So
that's
getting
closer
to
50
percent
of
that
flow
down
versus
in
ldns.
B
We
are
now
at
6.7,
pebby,
bytes
out
to
clients
with
2.5
in
deals.
So
that's
about
37,
so
those
had
been
closer
in
percentage
change
to
one
another
at
about
30
to
40
percent
you'll
be
interested
to
see
if
one
of
those
starts
to
pull
away,
we
may
be
replenishing
ldns
faster
because
we're
getting
more
ldns
in
which
is
going
to
be
changing
how
that
percentage
is
calculated
clicking
through
to
the
next
looking
at
some
direct
notary
activity.
B
We've got about 26 open issues currently in the repo, about 51 in the last week, and, you know, 275 with that label as granted, 11 of those in the past seven days. These are pretty consistent; I don't think we've seen a significant increase or decrease in this activity. We're still seeing about 64% of them get a response, and about 23 of the 79 clients getting DataCap per month, so we haven't seen a significant change there.
B
We have seen a pretty big increase in the number of LDNs and the frequency of LDN submissions, so that's something to track. We are still working on some tracking and analytics for time to DataCap for LDNs and getting that reported out. And just to flag on some of these metrics: we'll be curious to see if there is a slowdown in the next two weeks, across Christmas and New Year's, and then another flag will be the end of January going into February.
C
Goal updates! Hello from the studio; I'm here to talk to you about Q1 goal update progress. You are mostly familiar with this at this stage: if you're watching the recording, you just heard me talk about this about 35 minutes ago, so you know exactly what's going on, and if you're here live, you're definitely familiar with it. So, just to get into it, across the different focus areas. The first of these is volume. Galen just dropped a really nice number in the 200s, but how I like to track this is specifically by looking at the amount of LDNs where DataCap has actually been given to a client. So it's either cases in which they're not actively waiting, so they have a state of granted, or, if they're waiting, it's because it's a subsequent allocation, like a second or third or fourth allocation, as opposed to a first allocation.
C
And so, if you look at the number of LDNs that have reached that point and are mostly just going through recurring DataCap tranches, and the due diligence that comes with that, that number is closer to 120 pebibytes, and so we're looking at about 127-ish pebibytes in circulation. With DataCap abundance and availability being the key goal here, the hope is to continue improving this rapidly, getting it up closer to 500 before the end of Q1.
C
We want to show that deal making on Filecoin is enabled by Fil+, as opposed to blocked by Fil+, and so we're excited to keep leveling up and increasing the flow here. Some things to look forward to here are probably better LDN process clarification, improvements to the overall client UX (which is its own goal, but will have influence on this), improvements to TTD, likewise, but then also, in general, improving the role of a notary.
C
So interesting discussions have been happening for the last couple of months — like having notaries that, you know, have different levels of expectations; having notaries that can take more of an automated approach, or can use better automated tooling, to reduce the amount of manual load required, while still improving the overall due diligence process by increasing the amount of consistency and getting more data from clients up front.
C
So we can reduce the amount of back and forth, et cetera. There's lots of opportunity for growth and lots of already-active conversations on that front, so I'm pretty excited to see how the next several months evolve this path for a client. In terms of TTD, we've seen a drastic improvement. At the start of the year we weren't measuring it, but around Q2 it was, you know, two weeks-ish, plus or minus, and even up to a couple of months ago
C
this number was closer to 11 days. But if you really look at the dashboards nowadays and look at where we're at, especially the number for the last 30 days, it's very, very close to one day — just slightly more than one day — and so it's awesome to see the progress there. Granted, some of that is definitely coming from the continued attention that our automated verifier is receiving, which currently grants out DataCap in, like, 30 seconds or less.
C
Of course, that's if you are eligible for it. And we now have about a thousand unique addresses on chain that have received DataCap, so the number is definitely being brought down a little bit by that. But that's okay; that's part of the client experience of being unblocked on Filecoin, and the metric is supposed to account for that.
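To make the metric concrete, here is a minimal sketch of how a trailing 30-day TTD number like the one quoted above could be computed. The record shape is a hypothetical illustration, not the actual dashboard schema:

```python
from datetime import datetime, timedelta
from statistics import median

def ttd_last_30_days(grants, now=None):
    """Median time to DataCap (in days) over a trailing 30-day window.

    `grants` is a list of (requested_at, granted_at) datetime pairs --
    an assumed shape for illustration, not the real dashboard schema.
    """
    now = now or datetime.utcnow()
    recent = [(req, got) for req, got in grants if got >= now - timedelta(days=30)]
    return median((got - req).total_seconds() / 86400 for req, got in recent)

# Example: two grants landing inside the window
now = datetime(2021, 12, 21)
grants = [(now - timedelta(days=3), now - timedelta(days=2)),               # 1.0 day
          (now - timedelta(days=10), now - timedelta(days=8, hours=12))]    # 1.5 days
print(ttd_last_30_days(grants, now=now))  # -> 1.25
```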
C
So it's really great to see the progress on that front. In terms of larger amounts of DataCap, it's kind of hard to say — the variance is still pretty high. I'm trying to calculate TTD for LDNs, which I think will be a useful addition to this conversation, but until then we're still continuing to improve on tooling and automation, and we've already had a discussion on improving this, or increasing the scope of automated verification, so we need to get that implemented. Next slide, sir — thank you. From the risk mitigation standpoint:
C
also tremendous progress in the last quarter. We started this year with zero dashboards, we were halfway through the year with two dashboards, and we're ending the year with four — and we have a fifth one with a grant out already, and a team actively working on fraud and audit tracking for Filecoin Plus.
C
So we're excited to share some updates on that early in the new year, and, yeah, we're continuing to facilitate better data transparency and availability for different stakeholders in the ecosystem that are excited to explore and share more about their takeaways, and generally improve the efficacy and efficiency of the program.
C
I think one callout here is: if you're actively working in this space, or actively want to be working in this space, there's a channel, #fil-plus-dashboards, in the Filecoin Slack. It's a little quiet right now, but we'd love to see some of you chatting there, asking questions that you have for dashboard owners, or asking questions in general about consistency of data across the different tools that exist, so that we can have multiple, independently verifiable sources of truth for the Filecoin Plus system.
C
On the client onboarding UX front, this one hasn't seen as much progress this year, but we are working on drafting up an RFP to improve the UX side of things for clients that are looking to get DataCap, so I'm excited to share more in the new year around that. And then, last and absolutely not least, Filecoin Plus governance.
C
Lots of recent progress here, primarily thanks to Kerry getting ramped up and being able to support just general notary operations and community operations, so I'm really excited to have him on board on the governance team. But specific things to call out: we have a live RFP for the notary leaderboard that we talked about several weeks ago, in case that's something you're interested in building or helping build — definitely check that out.
C
I'm sure there'll be a link somewhere useful. And then we've, you know, started getting more used to using Discussions, and in discussions about the issues we're going through FIPs — there's at least one active FIP.
A
Hey, thanks, Deep — nicely said. So this is just a quick refresh: you're going to see the calendar invites come through tomorrow. We didn't want to change anything before this call happened; that way, if something broke, we didn't have it happen today. So coming up first is the holiday schedule. Next week there are going to be limited people in home offices, so you might see a delayed response. If it's an emergency, please tag us and just write: hey,
A
this is an emergency — and, of course, we'll be there to help. But otherwise you might see a little bit of delay from December 24th up until January 2nd. As a quick reminder again, the January 4th call is on break, so hug your loved ones, get an extra hour of work in, whatever you want to do with that time, and we're going to resume the calls on January 11th. You'll see a new invite come through, and I'll also be DMing you to see if you're keen to have a specific invite sent to your calendar. Right now,
A
we have you added through the notary governance shared calendar on Google. I know sometimes it's helpful to have it automatically populate on your calendar going forward, so if you're keen on that, I'll be messaging you asking if you want it. We'll just put you into the automatic invite, so it'll automatically refresh and we'll get to see your face as you come through.
A
The second thing, just at a high level, is the notary leaderboard. The RFP is now published; I'll share the link in chat right now so you can take a look at it. We've already gotten a little bit of interest on it, and so if you are keen to put in a proposal, or you have friends who might be a great addition to this community or could do the work, we'd love to hear from them. At a high level, the goal is that you can look right at a dashboard,
A
almost like a scoreboard layout, see all the notaries, and just have a quick check-in: who has been with the program the longest, how much DataCap has been allocated, where are you from. The goal of this is to continue to foster that community of backing up data in a safe and responsible way, and then have a metric to tie to it. So again, this RFP was just posted — I put it in there, feel free to take a look.
B
That's me, so let me actually post this in the chat as well. We have had multiple conversations about this. K-Ray, if you wouldn't mind clicking on that link and opening it so we can see it here. Just a reminder for everybody:
B
the LDN process right now has three calculation steps to determine how much DataCap a client will get. For their first allocation, it's the lesser of 5 percent of the total or 50 percent of the weekly rate, and then it moves up from there. So a lot of our clients are, you know, requesting 5 pebibytes total with an estimated weekly usage of a hundred tebibytes, so their first allocation is 50
B
tebibytes, the second is 100 tebibytes, the third is 200 tebibytes — and that's kind of the cap right now: most of our clients can get allocations of 200 tebibytes at a time. So all that we are doing here is proposing adding two more calculation steps, not changing anything in the first three, but just saying the fourth allocation would be 40 percent of the total or 400 percent of the weekly rate, and the fifth allocation is 80 percent of the total or 800 percent of the weekly rate.
B
We wanted to honor what we've heard from some specific clients and notaries in various places, and also, for now, keep that kind of onboarding scale, so that a brand-new client who we maybe don't have a lot of information about is still working through that allocation process. So I wanted to bring this up. I'm going to pause here and see if there are any questions or comments on this specific proposal.
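As a minimal sketch of the schedule just described: only the first, fourth, and fifth total-side percentages (5, 40, 80) were stated on the call, so the intermediate values of 10 and 20 below are an assumption inferred from the doubling pattern.

```python
def tranche_size(round_num, total_requested, weekly_rate):
    """Lesser-of rule per allocation round: a percentage of the total request
    versus a percentage of the client's estimated weekly usage."""
    total_pct  = [0.05, 0.10, 0.20, 0.40, 0.80]  # 10% and 20% assumed, not stated
    weekly_pct = [0.50, 1.00, 2.00, 4.00, 8.00]
    i = min(round_num, len(total_pct)) - 1       # later rounds repeat the final step
    return min(total_pct[i] * total_requested, weekly_pct[i] * weekly_rate)

# Worked example from the call: a 5 PiB total request at 100 TiB/week.
PIB_IN_TIB = 1024
for r in range(1, 6):
    print(r, tranche_size(r, 5 * PIB_IN_TIB, 100), "TiB")
# -> 50, 100, 200, 400, 800 TiB
```

Note that the final step, 800 percent of weekly, is what gives clients the eight weeks of runway mentioned a little later in the call.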
B
Okay, so with that, there were no other questions or comments — on the first one either. As I've said, I think we've talked about this a few times, and I think it's not incredibly controversial. I'm going to leave that proposal open until the next governance call that we have in January.
B
At that time we'll do one last check to see if there are any notary questions; otherwise we'll probably ratify it at that January call and implement it then. It's not a huge technical lift to change, and it doesn't really change much of, you know, the notary experience at all. It will just help give clients the ability to get to eight weeks of runway of DataCap, which I think will help our enterprise-level clients. So thank you for that time.
C
pebibytes of unique data, and the data sets are the same sort of public, open data sets that were selected for Slingshot — you know, publicly verifiable, publicly obtainable, ideally for free — and stored on Slingshot with the intent of having a distributed storage mirror for them in the long term.
C
The program is like Slingshot on steroids in some ways: there's very, very tight coordination and communication across the different entities involved in the program, to ensure that the data is being stored to very, very high standards, with extremely high quality and retrievability in the long term. Specifically, the clients and storage providers applying to participate are being manually selected based on their experience, their location, their ability to process data, seal data, and make deals with it, and their access to capital.
C
The last one is important because one of the big blockers we're seeing with deal making with Filecoin Plus, especially today, is that collateral requirements are pretty high, and so, in general, getting storage providers access to lots of Filecoin to seal verified deals is not a trivial thing to do. And Slingshot Restore certainly has an aspect of urgency to it, in that
C
it's the first time that the network is going through this size of data loss, and it's not uncommon to need to have best practices in place to guard against data loss. Lots of traditional cloud providers have this issue, but have extremely well-oiled processes in place where they are able to handle their loss incidents without an end user even experiencing a difference in their service.
C
So we're specifically doing a mapping of the clients that are applying to the specific data sets, and parts of data sets, that they'll be onboarding. They're expected to get us a list of the individual CIDs that are mapped to those data sets and then store at least five geo-distributed replicas of those CIDs — and we're defining geo-distribution as crossing three countries and crossing two continents, with each of those replicas in a different city — so definitely ensuring that there's high availability in the network.
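A minimal sketch of that replica rule as a check, assuming replica locations are already known (the `Replica` shape here is illustrative, not part of any Fil+ tooling):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Replica:
    city: str
    country: str
    continent: str

def meets_geo_distribution(replicas: list[Replica]) -> bool:
    """Slingshot Restore rule as described on the call: at least five replicas,
    each in a different city, spanning >= 3 countries and >= 2 continents."""
    cities = {r.city for r in replicas}
    return (len(replicas) >= 5
            and len(cities) == len(replicas)                 # all cities distinct
            and len({r.country for r in replicas}) >= 3
            and len({r.continent for r in replicas}) >= 2)
```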
C
And so it'll be good to see, for the long term, what the bandwidth limitations might be there — in addition to just networking, how long does it actually take to process the data and make it ready for deal making, like getting it into a CAR file and doing all the CommP generation stuff, but also then making deals?
C
Even if it's with very minimal data transfer and networking latency. I am personally syncing up with all the participating clients this week, and other people on the Slingshot admin team will continue to uphold that every single week until the program runs out in February — other than next week,
C
which I think we're taking off. We're also working to manually approve each of the storage providers that they're storing that data with, and so this is a very tightly coordinated and controlled loop that should hopefully ensure that the program falls into the pit of success, and that the data being onboarded is long-term available and reliable on the network. Cool — just to talk a little bit more from, sorry,
C
your perspective as a notary: the main ask is that Restore is the first sort of program of its kind — and hopefully there won't be too many more — but it is important in that it's the first time we're dealing with the network's ability to repair lost data and recover data that used to be on the network and no longer is. So setting a tone for the urgency and the prioritization of this is very important to the long-term success of the network.
C
I understand that it's also holiday season and stuff, so there are a few compounding factors here, but it's still something that I want to flag as top of mind. For those of you that are around and available over the next few weeks, we would really appreciate it if you keep an eye out for this and ensure that we're able to enable the clients that are participating in this program. And of course, even though the data is publicly accessible and we're doing all this coordination,
C
it is also an exercise in due diligence: ensuring that notaries have an ability to understand what's actually happening, and setting a standard for community-based programs that onboard data at scale onto the network by, you know, ensuring a high level of data transparency, coordination, and communication about how the program is actually conducting itself, and also the success it's having over a period of time. I see a hand from Mr. O'Brien — what's up?
H
Hello, I just wanted to clarify a few things in my head. So the data that we're restoring is data that was lost and was part of the original Slingshot public data storage? Or is this a new set of data? Okay, yeah.
C
So Slingshot itself has been running for basically a year at this point, and so a lot of it is earlier data sets that were stored in the previous phases. But then what we did is we actually ran a bunch of analysis on the deals that were lost — the CIDs, specifically, that were lost — and mapped those CIDs to specific data sets, because we had that metadata from participating teams.
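That analysis amounts to a join between the lost deals and the team-supplied metadata. A minimal sketch, with the input shapes assumed purely for illustration:

```python
def group_lost_cids_by_dataset(lost_deals, cid_to_dataset):
    """Group the piece CIDs from lost deals by the data set they belong to.

    `lost_deals` is a list of dicts with a "piece_cid" key, and `cid_to_dataset`
    a CID -> dataset-name mapping -- both assumed shapes for illustration.
    """
    by_dataset: dict[str, list[str]] = {}
    for deal in lost_deals:
        name = cid_to_dataset.get(deal["piece_cid"], "unknown")
        by_dataset.setdefault(name, []).append(deal["piece_cid"])
    return by_dataset
```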
H
So, okay, that makes sense to me. So we're talking about several sets here: we've got the sets where perhaps there were, like, 10 copies made, and now there are nine copies on the network, so you're trying to get back that tenth one. And this is ones where, probably just as a result of how Slingshot was organized, there was just one provider, or one copy that was stored with one storage—
C
We had basically identification purely based on an entity, and so a lot of folks who were being storage providers for Slingshot clients happened to colocate in one massive data center. So the original premise of wanting to be distributed across fault domains, or infrastructure failure domains, was kind of lost in that, and there was no verification happening at that level to be able to prove otherwise — so even though they may have stored them with individual organizations.
H
This really reminds me of the early days of the internet, where people would think that they had multiple co-located — that they had multiple backups, but in fact they all ended up in the same place. Yeah, yeah.
C
I mean, it's a good lesson for us too, in terms of getting better at tracking replica distribution, and I think that's also going to be a really interesting step for two reasons. One is there's a lot of work happening around indexing, right — the development of index nodes in the network to help with retrieval. But even discounting, or irrespective of, that,
C
you can't optimize away the physics of what it means to go retrieve that. So I think this is going to lead to some really good and interesting outcomes in terms of what can be parallelized in the network, what can be sped up, what the main bandwidth-limiting factors are, and what the interesting ways to solve them might be.
C
So, for example, we did a bunch of analysis this past week and learned that shipping hard disks, in almost every single case, would beat a network-based transfer. But shipping hard disks is then dependent on an actual shipper — so think about the holiday season and, like, DHL, UPS, or whoever needing to actually ship hard disks, or hard disk supply itself being constrained, like suppliers needing to acquire the hard disks to sell them.
C
There are a lot of opportunities here for partnering with suppliers, partnering even with delivery systems and logistics, to ensure that these things happen a lot quicker. So there's a lot of room for growth, but we haven't gotten to double-click on a lot of parts of this process. I think this program will let us do that at a level of granularity and depth that we haven't had before in the Filecoin network.
C
So I'm pretty optimistic about some of the takeaways here, because I do think they're going to influence a lot of the thinking, either in the core development of the protocol or, more interestingly, I think, in the development of different tools in the ecosystem, or companies in the ecosystem, that can speed up the efficiency of the network.
H
Yeah, this points to — I mean, the thing that we're seeing in a lot of places that's really useful for us as notaries too, which is just having ways to label storage providers with useful metadata, right? Like the Filecoin Green project and others. Right now we have some very broad information about the rough regional locations of storage providers, but the more information we can give on that,
H
the more that, as notaries, we can make sure that there's also a diversity of storage when we're approving verified data, which would be useful.
C
Are they being held to the right standard? I'm trying to make that as easy as possible — not just specifically for you, but also for just generally the admin and operations of the program. But why not knock out two birds with one stone? So, for those of you that are looking at this deck later, because I think we might be able to give access to it:
C
these are the links that you should care about, the first of which is this thing under client address info, which is on K-Ray's third tab of his browser — he's going to pop it up in a sec, and then I'll be able to show you guys. Oh, not this one, the one before. Awesome, thank you so much.
C
Once we publicize that, or the actual name of the data set — this is compelling because, as a notary, you're now able to see exactly what the address is that they're applying with, cross-check that against their name, and cross-check that against a data set. So that effectively should reduce the time for your due diligence specifically. And assuming that you support our general distribution requirement, you'll see that I've had them list out each of their storage providers.
C
So we know exactly where they're going, and we're vetting these in terms of region and things like that. And if you want to double-click into the storage providers as well, the second link in the deck points to the list of storage providers. In that list you can actually see each of the storage providers that are currently registered, their Slack handles in case you want to ask them any questions,
C
the roles they're playing in Slingshot Restore, the IDs that they have on chain, the amount of data they're willing to onboard in the next two months, whether or not they'll be making verified or unverified deals, and what their stated location in the world is. And in every case that we can, we're actually validating the geographic location based on the peer IDs on the network, because that's going to be useful for us from a Slingshot geo-distribution perspective, but it should be useful to you as well — yes, indeed.
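A minimal sketch of that validation, assuming a GeoIP-style lookup is available (the `geoip_country` helper is hypothetical, not a specific library or Fil+ tool):

```python
def stated_location_consistent(stated_country: str,
                               multiaddrs: list[str],
                               geoip_country) -> bool:
    """Compare an SP's self-reported country against GeoIP lookups of the
    public multiaddrs reachable behind its on-chain peer ID.

    `geoip_country` is an assumed callable: multiaddr -> ISO country code.
    """
    observed = {geoip_country(addr) for addr in multiaddrs}
    return stated_country in observed
```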
C
because they're going to be built in a way that they're geographically distributed and available. And one thing to stress as well is that this is also a good example of what a notary process could look like in the future, right, where we want to have clients be more upfront with the data they're providing. I'm attempting to do that with this set of clients, where it's like:
C
oh, here's all the info that theoretically you'd want to know, here's the standard that they're being held to, here's where the storage providers are and the ways in which they're deciding what to store and how to store it, and here's how you can also access the data they're storing. And so, to answer the last question: if you go back to the client spreadsheet, you have a second tab in here called data sets, and if you check that list, it actually has enumerated which the data sets are, how big each one is, and whether there are any specific directories within them that we're asking people to prioritize. And for each of these data sets, clients are being asked explicitly not to compress them, which then yields much better retrieval in the future.
C
The data center is cheapest and not specifically optimized for this. One really interesting thing that I learned today was that one of the data sets we lost a little bit of data from, Landsat 8, actually had to change its downloading policy to not be free anymore, because of changes in policy with some of the organizations that were supporting the egress of the data from an S3 bucket on AWS.
C
And so that's been really interesting, because it's like: oh, wouldn't it have been great if we had a replica of this on Filecoin, available for people to retrieve at an extremely low cost? Because right now, pulling that off of AWS will cost you approximately fifteen thousand dollars for that data set, which is quite substantial. And so this is a pretty cool use case for exactly this work.
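For a rough sense of scale: at a typical S3 egress rate of about \$0.09 per GB (an assumption — actual AWS pricing is tiered and varies by region), a \$15,000 transfer bill corresponds to roughly

$$\frac{\$15{,}000}{\$0.09/\text{GB}} \approx 167{,}000\ \text{GB} \approx 167\ \text{TB}$$

of data egress, on the order of a large satellite-imagery archive.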
C
Unfortunately, it now means it's not so trivial for us to get that initial data, so we've reached out to the geological survey — sort of the government arm, I guess, the USGS — and we'll try to work with them to figure out how we can get that data in a cheaper way and then get a mirror of it up on Filecoin. A good use case, hopefully. And I'm pretty excited to see, with the sense of urgency and the coordination across the entire system, between clients, storage providers, and the notaries that are available,
C
how quickly we can actually bring back data onto the network. What does a week in the network mean in terms of onboarding stuff that's actually lost? And is it tenable for us to defensibly say that, like, five nines is in the network? How far are we from that as a standard, and what do we want to do in 2022 to ensure that we can say that with confidence? So yeah, I'm looking forward to having your support for these clients.
C
Please let me know if you have any questions, now or later. I know a couple of you have already looked at some of these applications and have questions. Some of the applications are coming in preemptively — clients are claiming to have data sets when they haven't even been picked. So please view this link as the definitive source of which clients are currently engaged and active. I do expect that we'll have quite a bit of churn as clients realize whether or not they can participate in this program.
B
Thanks, Deep. Neo, I think that answers some of your questions from yesterday that we were working on. And then I just wanted to jump in here from the LDN side of things. K-Ray, if you wouldn't mind clicking to — I think it's your second tab. We should just be able to, like, remote-control your screen, right?
B
Oh no, I lied — maybe it's the last one. I think there are three LDNs right now. Yep, here we go: there are nine LDNs open for this program, three of which need signatures. So if you are on this call, when you finish this call, jumping in, checking those LDNs, and looking at them would be great. In addition to these nine open LDNs, three of which need signatures, I think there are like 27 other large data sets that are ready for signatures too.
B
So while signing these Slingshot ones, you can sign some other ones in there as well. I forgot one other announcement in our annual yearly recap: just wanted to throw out some kudos to the notaries. We did have an LDN fully complete and close, working through its full requested allocation, and that was Phase. This is a major success
B
in our opinion. I think he went through two pebibytes of DataCap working with Slingshot, so great job. Let's jump back over to the slides and go back one slide, please. I'll just present a little bit of this, and then the recording, when it goes live, will have our guest speaker from the earlier call, Greg Sharpstein.
B
You know, internal politicking. While they do this test, the data itself would still be public and publicly downloadable. So the question here is really not one of encrypted data; the question here is one of a confidential client going through the Filecoin Plus process, and how do we feel about that?
B
How would we do the diligence on it? And I think one of the flags that came up in that conversation is: how will they find and work with other storage providers? Because at this point, one storage provider will be standing themselves up to kind of represent this client's interest, and so they may not necessarily know what other storage providers they would be working with, and the client may not want, for this pilot test, to store multiple copies in a decentralized way. So that's kind of an ongoing conversation that's happening right now.
B
This is sort of just a first introductory investigation into how Filecoin Plus would think about approving a situation like this. So, I think — did they open a discussion topic, K-Ray, as well?
B
No? Okay, so maybe a next step would be for us to also just open a discussion topic, to get an asynchronous place for that conversation to happen. I'm pausing here, seeing if there are any questions on that, knowing that I am neither the client nor the storage provider who brought this — I'm just a guy in a hat who was on a call this morning.
C
Deep — you were on that call this morning too? Yeah, I know. Just one thing that I don't think we got as much of a chance to talk about this morning, that I do think would be interesting to talk about here, especially because Meg is on the line and has good experience with this kind of client, is just, in general, how we want to support, you know, encrypted data, or more like private-ish data, in the medium to long term.
C
This was a very hot topic of conversation a few months ago in our community, but I think lately the focus has been more on, like, general TTD and LDNs and onboarding notaries and tooling and stuff. But I do think it's a very pertinent topic, and there were some pretty cool ideas suggested at the time. I know that Fenbushi and IPFSUnion and a couple of other notaries, especially those based in China, were working on an offline verification method where they would have clients —
C
Meg's teammate over at Holon had a really cool idea of basically storing some of the data, already encrypted, on IPFS, and then sharing the decryption keys for some of it with a subset of the notaries that could approve the LDN. Given that, you've basically proved that it's the same content being stored onto the network, just more of it — and probabilistically, at least, if you've established trust with a client at that point, it's the same type of data, just at scale. And that could be a really cool model as well.
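A minimal sketch of that spot-check idea, purely illustrative: the `decrypt` and `looks_legitimate` callables, and the chunked shape of the data, are assumptions, not any defined Fil+ API or ratified process.

```python
import random
from typing import Callable

def pick_sample(n_chunks: int, k: int = 5) -> list[int]:
    """Indices of chunks whose decryption keys the client will reveal."""
    return random.sample(range(n_chunks), k)

def notary_spot_check(encrypted_chunks: list[bytes],
                      revealed_keys: dict[int, bytes],
                      decrypt: Callable[[bytes, bytes], bytes],
                      looks_legitimate: Callable[[bytes], bool]) -> bool:
    """A notary decrypts only the revealed chunks and checks the plaintext
    matches what the client claimed. Passing gives probabilistic, not
    absolute, confidence that the rest of the encrypted data is the same
    type of content."""
    return all(looks_legitimate(decrypt(encrypted_chunks[i], key))
               for i, key in revealed_keys.items())
```

The design choice here is that confidence scales with the sample: the more randomly chosen chunks the client is willing to open, the harder it becomes to hide unrelated content among the unopened ones.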
C
But I do think we should continue to push on this one, especially early in the new year, and look at the ways in which we want to do this, because enabling this, I think, will be incredibly important early on to unblock these kinds of POCs and projects, which I know are coming down
C
the pike in Q1, Q2 of next year. It'll be paramount in enabling the success of Web2 companies, or Web2 projects, that will finally get to transition to the Web3 world, and so I'd love to see us evolve and become a huge lever for that kind of success, for that kind of profile of customer. Danny Keaney?
H
I'm just going to act like I came up with what Meg and Stefan are just saying in chat, which is: we should definitely try and make this work. I mean, you know, my feeling on these things is always that the thing we're trying to avoid is self-dealing by storage providers. So as long as there is the commitment to store this data with more than one storage provider — and I don't see why that would be an issue.
H
I think that anybody wanting to build this out would probably be sensitive to wanting to store — at least test out having — multiple redundant copies, and they'll be able to store it for free. I don't think that's a bad condition to put on this, and I think if we have that, then we don't have to worry too much about some of the other aspects of confidentiality. And I think it's fine if one storage provider is representing a confidential client.
H
You know, I think it's just that the condition of getting Filecoin Plus verification is going to be a distribution of that data, that trial data, across the whole network.
B
Yeah, so again, I will only repeat what I heard this morning, as I remember it, from Greg. His response there, in terms of finding multiple other storage providers, was from a business development, relationship-building standpoint. He's like: I'm a storage provider, I know the service that we can provide and the level of service that we can provide. Our storage provider's reputation is on the line in this pilot with this big client.
B
That's the flip side of the pro of the Filecoin network, right? The whole point here is that you, the client, have this choice to pick who you want to work with, and have this decentralization, and have the ability to, you know, flex this market against itself for competition — to say, I want to get the best price for the best service by having choice. So I agree with all these things, that we should make it work.
H
So I think that there's sufficient incentive within the system for them to have some leeway to go and work with other storage providers, to provide the kind of redundancy that I think any big customer is going to want anyway. But like I said, you know, I don't want to use you as the proxy; I'm just happy to have that conversation with them. So thanks.
B
Yeah, amazing. Like I said, I think we'll work with Greg Sharpstein and try and get a discussion topic going; we'll probably post that in the Slack channel once we have an asynchronous place for this conversation to happen. And as far as how we make this work, it'd be great to have some asynchronous back-and-forth on ideas, so that at the next governance call in January we have an actual proposal that we can weigh in on and see how we feel, so that we can start a pilot
B
you know, as soon as possible, and not lose some of this momentum. So with that, we've got six minutes according to my clock. If there are some other open discussion issues — any hands?
H
B
Oh sorry, yeah, you're right — we'll speed up. Well, if that's the case, I think we can go ahead and cut off the recording. Thanks, everybody, and hopefully you have…