From YouTube: Slingshot - Phase 1 Closing Ceremony
Description
Join us as we celebrate the end of Slingshot Phase 1! We’ll recap the event, have some teams present in a live Showcase, and present the Community Choice Awards and top teams in the storage competition.
A
Yes, sir. Awesome. Welcome, everyone out there watching over YouTube or Zoom or wherever you are, to the Slingshot Phase 1 closing ceremony. I'm super excited to be here today after our four-week competition, Slingshot.
A
For those of you who might not know, it's a competition to store real and useful data on the Filecoin network, and competitors have been working hard to do so, with the competition ending last Wednesday, final submissions due Friday, community review taking place over the last four days, and everything culminating now in our closing ceremony.
A
After that, we'll have our award ceremony, where we'll present our community awards, show the top teams who won the most awards in the storage phase of the competition, give more details on what to expect for your Phase 1 awards, and give a brief preview of Phase 2. At the end, Juan will join us for some closing remarks.
B
Hello, everyone! This is so, so exciting: we're finally here at this moment, with Phase 1, the onboarding phase of Slingshot, complete, and we've already officially entered Phase 2, which is the scale phase of the Slingshot competition. As David mentioned, just a brief reminder for everyone: the Slingshot competition was a community competition for storage clients and developers that rewards the storage of real, valuable, and usable data onto the Filecoin network.
B
So
over
the
last
several
weeks,
folks
have
been
storing
massive
amounts
of
data
to
the
file
coin
network
and
also
building
these
really
incredible
applications
and
uis
on
top
of
the
data
as
well.
In
order
to
make
that
data
explorable
to
people
in
the
broader
community,
the
we're
going
to
hear
a
lot
more
about
rewards
and
the
way
that
the
teams
that
won,
but
the
primary
structure
of
this
competition
was.
B
Can we go to the next slide? So yeah, it's only been five weeks since Slingshot started, which is crazy. I personally feel like it's been many months; there's just been so much that has happened since then. We actually started this competition before the Filecoin mainnet launch, so for the first three weeks of the Slingshot competition, participants were storing data to the Space Race network, which was the testnet.
B
Then, a couple of weeks ago, we launched the Filecoin mainnet, which was just an amazing moment for this community and, I think, an extremely successful launch so far. Last week we had Liftoff Week: five days of a jam-packed schedule with members of the Filecoin ecosystem and community who've been contributing in all sorts of ways to this project over the last several years, from miners, to the many, many folks who actually participated in the Slingshot competition, to developers who are building developer tools, and also panelists who talked about the future of the Filecoin network, the Filecoin Foundation, Murmuration Labs, and so many other groups that have corralled together to make this network and ecosystem an amazing community over the last several years. So that was an amazing moment. The full playlist of Liftoff Week talks, events, and workshops is on YouTube, so if you missed any of the events during Liftoff Week, I highly encourage you to go check out the Filecoin Project YouTube channel and see some of what you missed, because it was really just wonderful.
B
There are a number of FIPs that have been accepted by the community, and all the different protocol implementations that are building Filecoin. And also, as I'm sure many folks in this competition noticed, there's still a long way to go in improving the UX of what it means to store data on the Filecoin network and retrieve that data back out. A lot of the feedback that you as participants surfaced throughout this competition was passed back to the core development teams, and things have since been improved in numerous ways. We still have a long way to go, but we're continually going to be improving Filecoin, and we're really excited that you're part of this journey with us as we build a truly phenomenal decentralized storage network. And of course the network, especially on the storage side, now has more than 675 pebibytes of storage capacity, which is also just really amazing.
B
We've been saying, since hundreds of pebibytes ago, that this is the largest decentralized storage network that has ever existed, and that only becomes more and more true every single day. So thank you so much to the community that has been tirelessly working to grow capacity on the network as well, and so much more.
B
So yes, what happened on the Slingshot side? As we mentioned, there's just been a huge amount of hands-on learning throughout this competition. All of you have helped to stress-test and debug the network, surfacing a number of issues in various core technologies that are necessary parts of the Filecoin storage experience, including Lotus, Powergate, Textile Buckets, and much more, and there have been a number of ideas for future improvements.
B
I've used the acronym before, but FIPs stands for Filecoin Improvement Proposals, and a number of ideas for future FIPs have been surfaced throughout this competition.
B
There have been many, many issues and so much valuable learning. As you remember, we started this competition pre-mainnet, and we're still in the very early days of the network, so this sort of feedback is extremely useful, and it will certainly lead to a much smoother experience down the road for everyone who wants to use Filecoin for storage.
B
In addition, I think we're starting to see a really wonderful community develop around Slingshot and all of the related events, and around just being storage clients and developers in the early Filecoin ecosystem. Thank you so much for supporting each other. Thank you for your patience. Thank you for being creative, taking wild shots, trusting Filecoin with your precious data, and building wonderful applications. Thank you for attending all of the events that we've hosted over the last several weeks.
B
The tireless Slingshot and events team has put on more than 16 events in just the four weeks of the Slingshot competition alone, in addition to the Liftoff Week events, so there's just been a lot going on, and thank you so much for being part of all of that. And then, of course, one of the primary purposes of this competition was to store a large volume of data, and we totally have. How much, exactly? More than 650 tebibytes of data. So we use this comparison.
B
It's more than 3.2 billion Sony floppy disks from 1981, which is just a staggering number; it's hard to even conceptualize what that really means. But this is a huge amount of data. It was stored through more than 520,000 storage deals by more than 100 projects across 19 countries, and in total, throughout this first phase of the competition, we will have awarded 59,850 Filecoin tokens. You'll see where that's going in a few minutes.
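As a quick sanity check on the figures quoted above, the arithmetic can be sketched as follows; the per-disk capacity is derived from the two quoted numbers, not stated in the talk:

```python
# Back-of-envelope check of the Slingshot Phase 1 storage statistics.
TIB = 2 ** 40                      # one tebibyte in bytes

stored_bytes = 650 * TIB           # "more than 650 tebibytes of data"
floppies = 3_200_000_000           # "more than 3.2 billion floppy disks"

bytes_per_floppy = stored_bytes / floppies
print(f"{bytes_per_floppy / 1024:.0f} KiB per disk")  # roughly 218 KiB,
# consistent with early-1980s floppy-disk capacities
```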
B
So yeah, before I hand this back to David, I just wanted to say: truly, thank you so much for being the best community. We're really excited to have had you participate in Phase 1 of this competition, and there's going to be a lot more down the road, so I'm really excited to see what we do in Phase 2. Thanks, and back to you, David.
A
Awesome. It's always crazy to see the actual numbers side by side, so thanks, Pooja, for sharing all that.
A
So
you
know
that
that
covers
at
a
high
level,
where
we've
been
and
we'd
love
to
transition
over
to
the
slingshot
showcase,
where
our
teams,
some
of
the
teams,
will
present
their
projects-
and
you
know
the
slingshot
team
and
the
community
reviewers
have
had
a
chance
to
dive
in
pretty
in
depth
to
all
of
your
projects
and
we
we've
been
lucky
enough
to
play
around
with
them.
So
hopefully
this
gives
you
all
a
taste
of
the
experience
that
we've
been
able
to
to
go
through.
So
we
have
10
teams
presenting
today.
A
So
here
they
are
up
here.
I
won't
introduce
all
of
them
now,
so
filezoo
will
be
presenting
first
and
I'll
hand
it
off
to
them
in
a
second.
So
filezoo
is
a
public
data
sharing
platform
and
they
stored
data
from
various
public
data
sources
like
the
kobit
19,
open
research,
data
source
stock
data,
open
images
and
and
more
so
without
further
ado
falzou.
Would
you
please.
C
Oh, thanks, thanks. Let me introduce my applications, FileZoo and FileWarehouse. But first I want to say something about the Slingshot competition, because, for me, the Slingshot competition is not just a competition.
C
Are
you
seeing
my
powerpoint,
the
hydro
in
my
teachers,
and
he
said
to
me
the
fair
coin.
Network
storage
is
a
big
change
compared
to
the
traditional
cloud
storage.
She
wanted
to
have
one
more
try
and
test
her.
So
this
competition
competition
for
me
and
for
for
my
team,
it's
more
a
test
or
a
try,
so
we
wanted
to
do
more
things
in
the
faircoin
network.
C
So
thanks
thanks
the
vehicle
in
faircoin
network.
We
are
a
loyal
coin.
Community
members
and
we
are,
we
are
focusing.
We
are
working
a
fair
coin
network
and
mining.
So
next
let
me
start.
I
have
two
application,
fair
rule
and
a
fair
warehouse.
C
They seem similar, but they have different aims for me. First, let me say something about the two applications. The first one is FileZoo.
C
My teacher told me we want to build it as an interesting piece of software, like a zoo of storage for all kinds of files, especially for elephant-sized files and documents. As for the target audience, FileZoo welcomes everyone to upload publicly useful data to the Filecoin network.
C
We wanted to build a Filecoin network storage platform, and so far we have stored a lot of real data on the Filecoin network, including image data, the COVID-19 Open Research Dataset, and stock data. And as a special topic, we will store a lot of Filecoin-related data, such as the parameters and the trusted-setup data. Next, the second application: FileWarehouse.
C
It's different from FileZoo. Considering the target audience, FileWarehouse hopes to help students, teachers, researchers, and engineers working on AI to build a great and stable study environment, because I have an AI community, and many students and researchers there are doing AI work and research, so I wanted to help them have a stable platform.
C
For this special topic, we will also store the Filecoin-related data, because the platform is based on the Filecoin network, so we want to give Filecoin a special place.
C
Talking about the technical architecture: for now we are using Lotus and a MySQL database. With Lotus, we just send the data to the miners using the Lotus command line, and MySQL is used to record the file information and the deal states. Powergate would also be very useful to us for the next step.
C
For the next step, we want to convert the project to standard data sending, storage, and retrieval, because we think IPFS is a hot data storage and Lotus, or the Filecoin network, is a cold data storage. If we want to be a powerful platform, we must put the hot data and the cold data together, so in the next step, or in the future, we will use IPFS and the Filecoin network together.
C
So first, let me give a FileZoo demo. For now, FileZoo has three functions. The first is uploading your useful files: everyone can upload their files to my application, and in my deal management system we will send your data to the different miners.
C
The second is search: you can retrieve data by its data ID or by name. And the third is download: you can download the data you are interested in directly by clicking the file name.
C
Talking about the FileWarehouse demo: we have designed three shelves and stored some AI training data. The first shelf is the training-dataset shelf; in this part, we will store all kinds of AI training data. The second shelf is the application shelf; we will start to build a training-model shelf, recorded in the application shelf. The third part is the Filecoin shelf. This part is a special part.
C
We will store the Filecoin data, the Filecoin documents, and other project links there. As other work, we have developed a deal management system. It includes two parts. The first part is the deal CID list: we first import all the deal CID data into my system, and this list shows the file name, the file size, the CID, and the number of times it was successfully sealed, and I can add some of them to a task.
C
I
can
some
date
to
the
task.
This
task
user
will
be
send
to
send
to
different
manners.
C
You
can
see
this
this.
This
window
is
used
to
add
tasks.
The
second
part
is
to
task
list
it.
This
this
part
will
show
all
the
deals
is
included
with
team
queen
and
the
complete
has
has
been
sent
to
the
miners,
and,
and
this
energy
will
show
the
field
names,
field,
size
and
main
id
price
priority
create
time
and
release
your
times.
The
priority
level
will
will
allow
you
to
arrange
your
deals
out
data
to
send
the
of
manus
and
different
a
shot.
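The priority ordering described above can be sketched with a small priority queue; the field names are illustrative, not FileZoo's actual schema:

```python
import heapq

# Queued deals carry a priority level; higher-priority tasks are
# proposed to miners first, equal priorities in FIFO order.

class TaskQueue:
    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker so equal priorities stay FIFO

    def add(self, file_name, size, miner, priority):
        # heapq is a min-heap, so negate priority to pop highest first.
        heapq.heappush(self._heap, (-priority, self._seq,
                                    {"file": file_name, "size": size,
                                     "miner": miner}))
        self._seq += 1

    def next_task(self):
        return heapq.heappop(self._heap)[2]

q = TaskQueue()
q.add("dataset-a.tar", 2 ** 30, "f01111", priority=1)
q.add("params.zip", 2 ** 20, "f02222", priority=5)
print(q.next_task()["file"])  # params.zip (higher priority goes first)
```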
A
Leo, we're over time here. I would love to hear more about this deal management system, which is awesome, but unfortunately we have a lot of presentations to go through, so I might wrap it up here, if you don't mind giving me back the screen share.
C
The last function is updating the task parameters: for the waiting queue, we can update task parameters such as the miner ID, priority, and price. Next is the plan for our future. In short, in the future we will merge FileZoo into FileWarehouse as a data zoo.
C
FileZoo will become a function of FileWarehouse; FileWarehouse will be our main application, and in the future we will develop a validation function to support users' standard data, and develop a miner quality ranking system to identify stable miners, because I have no miners of my own, so I have to pick trustworthy miners to send my data to.
A
It'd be great if you had a way to choose some good miners out there, so I'm really looking forward to seeing what happens. So yeah, I guess next up we have Dragonfly. Dragonfly is a Docker image hub. So, Dragonfly, do you mind taking over the screen share and going ahead.
F
In this version, the user actually cannot pull images from cold storage, because in the early stage, due to the unstable transmission across the network and the amount of manual work involved, we decided to make offline deals directly with miners. So most of the data is currently only in cold storage, but retrieval of data stored through offline deals isn't supported yet. We tried to import that information by modifying the code and the metadata, but both methods failed to make the retrievals succeed; we're still investigating to find a solution.
A
That's so cool. Thanks for the live demo there; it's awesome to see it working like that, so give snaps to Dragonfly for sharing. Next up, we have Starry Sky and Yunnan.
A
They are storing satellite image data from Yunnan University and also some meteorological data, and I know they're super excited about the project and have a tight relationship with the university. So please take it away.
I
First, the program uploads the data files to IPFS and Filecoin. The system is designed to process the data, retrieve it, and display its information after the data files are uploaded and stored. Each data file is located in its own folder on our website, with a download link, and you can directly retrieve the files stored on IPFS.
A
I had the pleasure of retrieving some of your data from the Filecoin network, and the data that I got was a .fits file. I installed a program from NASA to view the satellite images, which were up there, and it was awesome to see such cool data being stored; I really appreciate it. So next up, we've got D-Platformer. So, Peter, please take it away. You're storing Open Images, and you can search for them and have them show up on a map. I'm really excited to see this presentation.
J
D-Platformer is working on tools to help you move your data from platforms like Facebook and Google Photos to web3 networks like Filecoin storage. The problem that we're addressing is the loss of online privacy, the sale of internet-activity information to advertisers, and the lock-in of personally curated content in walled gardens.
J
Our target market is people who have become fed up with social media platforms, sometimes referred to as the delete-Facebook movement. In addition to liberating their curated content from walled gardens, we will enable our users to generate their own machine learning models against their personal content collections, to add new value to, and ownership of, this data. The Google Open Images dataset is a rich resource for training these image classification models.
J
It consists of over 9 million photographs that Google has used to train its image classification algorithms. The photos are all Creative Commons licensed, and Google has put an open license on the machine learning annotations that were generated from training, validating, and testing their AI models. This is the largest public resource of its kind in the world. It can be used by any machine learning project to train, validate, and test their own image classification models.
J
I am a professional digital archivist by trade, and I have some serious concerns about the long-term digital preservation of the Open Images dataset. From our own download and verification statistics, at least seven percent of the photographs on Flickr are no longer available for download. To make sure we weren't missing anything, we contacted a Google representative for the Open Images project.
J
He told us that they expected this number to be even higher, and that it will grow over the years, as they have no control over users taking down their photographs on Flickr over time, despite the Creative Commons licensing. Furthermore, the photographs are not inextricably linked to their content and context metadata, an important consideration for maintaining long-term accessibility. Therefore, the D-Platformer Open Images project created a digital preservation pipeline to archive the Open Images dataset using industry best practices, in combination with Filecoin storage.
J
We verified the MD5 checksum of every image downloaded from the Flickr servers. We retrieved the PNG overlay files for segmentations that are associated with many of the images. We combined the annotations metadata from all the disparate CSV sources into one manifest per image. We then extracted EXIF metadata from the JPEG files and captured all this metadata in a JSON-LD schema format that we developed from the schema.org ImageObject definition.
J
We store this metadata file next to each source image inside a Library of Congress-standard BagIt container for long-term archival storage. These containers include SHA-512 checksums for all files. We bundled the images and metadata files into 800-megabyte tar packages, which become exactly one-gigabyte files on the Filecoin network due to the next-power-of-two factor that is part of the Filecoin hashing and chunking algorithm.
J
We found that 3.5 percent of the Open Images contain embedded geolocation data. In the current version of the application, we have over 6,000 maps that are each based on one machine learning annotation, like "swimming" in this example. Clicking on one of the map pins shows a preview of the photograph, as well as the city and country where it was taken.
J
If you click the "view metadata" button, or the pin on the close-up view, you will see the image metadata below the map. Besides latitude and longitude, this contains information we have extracted from the photograph's embedded metadata. It includes creator name, image creation date, width and height, file size, and resolution, along with any title or description information that the original creator might have entered into the embedded image metadata, such as "California coast" and "kick in the ball" for this photograph.
J
Each of the tar packages that we have uploaded to Filecoin can be downloaded from the photograph's metadata view in the web application. You also have direct download access to copies of the Open Images master file and the JSON-LD manifest file here. The Filecoin content identifier, or CID, is provided so that you can download a copy from the Filecoin network using the CID.
J
We believe that the metadata we've extracted will be useful for comparing and training new machine learning models on this dataset. The working hypothesis is that new information will emerge about an image and its geolocation when we combine extracted photograph metadata with machine learning annotations.
J
So, in addition to us using the backed-up Open Images dataset for our own AI training, we believe our approach may be of interest to machine learning researchers who are looking for a new way to work with the Open Images dataset. We will explore these connections in more depth for the Slingshot 2 version of the application. For now, it is kind of fun to click on one of the hyperlinked annotation labels.
J
Please visit the Slingshot app site and have a go at it for yourself. If you're interested in learning more about the D-Platformer project as a whole, subscribe to our Substack newsletter. We are currently working on a roadmap to develop the push of D-Platformer data stored on Filecoin to a mobile application, for re-sharing with user contacts. Thank you.
A
Awesome, Peter, thanks so much. There are a lot of boxes checked there, with the mission of the project, the sweet UI, and storing data onto Filecoin. So find Peter on Filecoin Slack if you want to check out the UI; it's a pleasure to click through. Next up, we have Create a File Cloud. They're a personal storage app that's currently focusing on storing public Unsplash image data. So: Create a File Cloud.
E
Lastly, files will be stored on Filecoin as cold storage, after interacting with miners. How do we retrieve files? For the hottest files, users retrieve the file from our cache server directly. For hot files, users retrieve from IPFS, if the file exists there. For cold files, the file will be retrieved from Filecoin to our cache server when needed.
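The three-tier retrieval logic just described can be sketched as a simple routing function; the tier names and return values here are illustrative, not the app's actual API:

```python
# Tiered retrieval: hottest files from the cache server, hot files from
# IPFS, cold files pulled back from Filecoin into the cache on demand.

def retrieval_source(tier: str, in_ipfs: bool) -> str:
    if tier == "hottest":
        return "cache-server"
    if tier == "hot" and in_ipfs:
        return "ipfs"
    # Cold (or evicted) files are first retrieved from Filecoin into
    # the cache server, then served from there.
    return "filecoin->cache-server"

print(retrieval_source("hot", in_ipfs=True))    # ipfs
print(retrieval_source("cold", in_ipfs=False))  # filecoin->cache-server
```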
A
Awesome, thank you so much. It's a really ambitious project, with a lot of things there to help make the ecosystem better, so I really appreciate the presentation. Next up, we have IPFS Ubun. They are storing public VR video, and they allow you to play it in the browser through their UI. So please go ahead, IPFS Ubun.
L
Okay, so we are a video site, and in the beginning we were planning to build an application where users can upload videos, but we later switched to a curated-dataset application. So for the Slingshot competition, we were uploading our self-filmed raw materials and produced VR videos to the Filecoin network.
L
So
we
have
a
company
making
vr
videos
and-
and
we
own
more
than
800
terabytes
of
raw
materials
and
produced
videos.
L
So now we're looking to store our data to the Filecoin network as a backup. So, how did we transfer the data? At the beginning of the Slingshot competition, we were thinking about how we could transfer the 800 terabytes of data as soon as possible.
L
So we deployed two servers: one running in the IDC data center of a cooperating miner in a nearby city, and another server in our local office. We calculated that if we used the one-gigabit network, it would take us around 10 days, so we upgraded to fiber, and we were able to copy all that data in one day. We also found that scp becomes a bottleneck, and we investigated; I think there are many solutions.
L
We stored around 19, or 13, terabytes, but most of that is with the cooperating miner. So after that, we started to send all the data over the network to the remote miners, and I think that, in the end, we have stored around eight terabytes of data that way.
L
So, about how we store the data: at first we investigated Powergate, and it's very well structured. However, we found it's not very flexible, and it wasn't stable during SR1 and SR2, as Lotus and Powergate were always upgrading. So we developed a command-line tool, named "machine gun," to interact with Lotus directly.
L
We were able to initiate Lotus commands in parallel. We have a list of miners, and the tool rotates through that list, and we record all the metadata to InfluxDB. It makes sure we store no more than 10 copies of our data, and the throughput is very impressive, because we can always use all of our bandwidth.
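The rotation-plus-replication-cap scheme described above can be sketched as follows; the names and the record layout are illustrative (the talk mentions recording such metadata to InfluxDB):

```python
import itertools

# Rotate deals across a miner list while capping replication at
# 10 copies per file, recording each proposed deal for monitoring.

MAX_COPIES = 10

def assign_deals(file_id, miners, copies_done, records):
    rotation = itertools.cycle(miners)
    while copies_done[file_id] < MAX_COPIES:
        miner = next(rotation)
        records.append({"file": file_id, "miner": miner})
        copies_done[file_id] += 1

records = []
copies = {"clip-001.mp4": 7}  # 7 replicas already stored
assign_deals("clip-001.mp4", ["f01111", "f02222", "f03333"],
             copies, records)
print(len(records))  # 3 more deals bring the file up to 10 copies
```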
L
For the long-term plan, we aim to build a video platform where users can publish videos, and at a later stage we want to make it all decentralized on the blockchain, so users are able to turn their videos into NFT tokens, with the video stored on the Filecoin network.
L
Okay, so for now we have only stored a fragment of our data, so we're still looking forward to cooperating with other miners who can help to store our data. We can send a hard drive to you, and in the Slingshot phase you can share in the income of our project. So, if you're interested, you can contact me.
A
Awesome, thanks so much, Gavin. This was another project that I had the pleasure of retrieving data from the Filecoin network from, and when I opened the file I was like, "Oh, what's wrong with this video?" But this is VR video; it's more and more common these days, so it's great to see. And just a quick reminder to everyone: please refrain from advertising paid services and that sort of thing. But yeah, moving on.
A
Let's,
let's
go
ahead
and
introduce
file,
drive
file
drive,
is
a
public
data
sharing
platform
and
have
a
pretty
snazzy
ui.
So
please
go.
D
Okay, hello, everyone. This is Laura from the File Drive team. Thanks for the invitation from the Slingshot team, and it's my honor to have this opportunity to introduce our project, File Drive. In my presentation, I will show you its technical architecture, a demo of its full functionality, and our plan for the future.
D
In
fact,
the
idea
of
building
a
public
data
science
has
been
there
for
a
long
time.
The
slingshot
communication
provided
a
platform
to
make
it
real.
During
these
few
weeks,
we
stored
over
6
000
files
and
it
raised
more
than
23
terabyte
storage
space.
I
think
we
make
a
good
achievement
and
speaking
of
the
target
audience
since
the
core
idea
of
a
field
job
is
to
make
a
public
data
side
for
all
human
beings.
So
everyone
can
be
our
target
audience.
D
Some
of
the
data
we
started
on
failure
was
query.
The
data
side
for
sling
for
slingshot,
computation
and
others
were
provided
by
filicon
community
members.
We
are
so
glad
to
have
their
support.
Thank
you
guys
and
let's
move
to
the
technical
architecture
according
to
our
technical
design.
If
a
user
wants
to
storage
data
on
fail
drive,
the
only
thing
he
needs
to
do
is
to
choose
specific
files
and
upload
them.
D
D
So when you enter the File Drive site, you can see a very clear home page with all the function keys, and, like I said, File Drive is based on IPFS and Filecoin, supporting retrieval by file name and by content hash.
D
These are all the functions of File Drive; it's running smoothly and is user-friendly. I'm not sure if you guys encountered any issues when using File Drive, especially during the review session; those were mainly due to network configuration issues, which we have now fully solved. And what's next? In the future, we want to optimize the method of retrieval and the display of data catalogs, in order to make things easier and easier for users, and we will also focus on solving the problem of dataset expansion: finding more and more available public data and ingesting it in a universal way. We also want to look for data providers and make the database diverse. And last, we will keep working together with Filecoin storage miners: we provide real data and, at the same time, miners provide storage service, making a win-win cooperation.
A
Thanks so much, File Drive; I really appreciate the presentation there. So, moving on, next we have Orion Deep Learning Cloud. You might have heard from Charles during the Slingshot town hall we had halfway through the competition, and he's excited to share some updates on his project. So, Charles, please take it away.
G
Hi everyone, this is Charles from the Orion Deep Learning Cloud platform team. Yes, we are Nebula AI, a cloud solution provider based in Montreal. Orion Cloud is a development platform we developed, which is a GPU container platform. It has a general storage service to support different logs, files, videos, and backups on a container-based system. Previously we were integrated with AWS and Azure, but with Filecoin we'd like to get much more economic efficiency for cold storage, as well as a performance improvement.
G
So
this
is
the
high
level
of
a
platform
design.
So
we
have
a
infrastructure
place,
which
means
that
it
can
be
gpu
farm.
It
can
be
fire
coin,
it
can
be
cpu
farm
and
we
build
a
path
platform
here
to
allow
people
access
the
platform
to
build
applications
on
top
of
it,
and
we
want
people
also
be
able
to
using
developers
smart
contracts
using
distributed
different
ai
tasks
across
the
platform,
and
this
is
a
system
we
developed
before
about
three
years
ago.
G
So
we
already
have
a
smart
contract
that
can
distribute
the
different
ai
jobs
in
a
task
pool
and
from
the
task
4.
It
will
need
to
download
the
data
from
storage
previously
three
years
ago.
Well,
faircoin
is
not
fully
directly
yet
at
that
time
we
don't
have
a
choice,
so
we
use
aws
as
a
storage
solution.
So
we
don't.
G
We
embedded
the
instance
to
communicate
with
aws
and
get
data
from
there,
but
the
cost
is
expensive
and
it's
so
centralized,
which
means
that
when
people
start
using
it,
they
have
to
trust
us
with
the
data
and
we
have
to
upload
this
expensive
data,
put
it
on
aws
cloud
and
we
pay
the
vr
every
month.
Now
we
have
buy
coin,
which
is
especially
with
the
screenshots.
We
have
so
many
teams
upload
so
many
files,
especially
those
ei
files
on
the
system.
So
now
we
don't
need
to
really
mind
by
ourselves.
G
We
can
get
those
easy
to
use
resource
and
globally
yeah
after
we
get
those
ai
storage
data
set,
we
can
download
it
to
our
gpu
local
system
and
start
using
it
we'll
give
a
demo
later,
and
this
is
what
we
support
now,
so
we
integrated
notebook,
so
people
can
direct
using
pygate.
We
added
a
packet
as
dependency
in
the
system
and
we
enable
people
to
use
in
a
container
based
db,
so
you
can
use
database
inside
the
system.
G
We support GPU acceleration, which is very important for deep learning. Previously the choice was either going to AWS GPU resources, which cost $3.60 per hour, or using the GPU acceleration we provide — because we are also a Filecoin miner, the MBFS team. We have about 40 GPU machines in Canada, which spend most of their time doing sealing jobs, sealing the data uploaded to the system.
G
But while we are not so busy, we can release some GPUs for AI deep learning as well, which means we can switch and schedule between different jobs. We also have multi-data-center support, backed by the Canadian government: we have three data centers across Canada, and we can redirect different services among them.
G
But this capability currently only works in Canada, and we also support cryptocurrency payments — for example, Ethereum. We plan to add Filecoin payment support later, but not soon, because we're really too busy. And this is some of the interface we already have: you can deposit by QR code for payment, and choose a payment method using different credit cards — it's up to you. And if you, as a team, want more free working hours, you can just send us your information.
G
We can add free hours for you. And here we have the Filecoin data we're working with in Slingshot. We have a curated data set — Common Crawl, from the open data sets we have uploaded — to help let people work with different data, and users can also apply their own data copies. You can find a lot of local data, meaning the data you uploaded, and we provide AWS synchronization of files from local storage to AWS.
G
If you input a token and a bucket, that enables the synchronization. And the last part is the GPU support for when you're doing AI training: you can directly click the button here to open a GPU. Currently we're using 2080 Ti cards, and we have some other GPU cards on the road. Here is a video that quickly goes through how people upload a file — you can go to the uploader and upload your files.
G
And for the pre-uploaded curated data set, it's Common Crawl. We want to support more data sets in the future. I was thinking about uploading them myself, but now that I've seen the lots of cool projects presented before me, I think maybe I don't need to do this work — we can just integrate their data sets directly, put them on our task list, and provide them to all the deep learning users instead of uploading them myself.
G
So you can either browse the different data sets or even do a search. Now, this is the deep learning part: you can see that the GPU is already turned on and we have Pygate integrated already. When you download the data from the notebook, you can go to the sync folder. The sync folder means that if you put your data in it, when you log in next time and we switch you to a different machine, the synced data won't be lost — it will synchronize across the different machines.
G
So next time we open it, if you want a GPU card and we don't have a GPU free on the same machine, the data will automatically transfer to another machine, and you get a new machine with the GPU, and you can unzip the data there using a command. This is a monkey data set uploaded by a user. Then you can start the training here, and you can do it just like a notebook, step by step.
G
Yeah, this is a whole bunch of images of different kinds of monkeys, and it's doing image recognition to find which species each monkey belongs to. You can see those different cute monkeys, and below that you can do more analysis of the data — the loss and the accuracy on the different data sets — which I can just print out. So this is one workbook provided by our AI developers; I asked them to record it last Friday, and it's quite interesting. But please do remember to turn off the GPU.
G
Okay, now let's talk about the development process we experienced building the application. We have Powergate installed on our server, so we can closely watch the logs in the background, and when people upload a file, it actually goes through Powergate and IPFS and then on to the Filecoin network. We have a chunked-file process running at the back end doing the uploads.
G
We also chunk the different data sets and compare the MD5 digests to make sure the data is not broken, and we have different folders to save things locally. We do the comparison every day, so that when some data turns out not to be retrievable on the network, we resubmit that data to the miners to make sure the data is still valid. And this is our future plan: I would like to add Powergate FFS Docker image support.
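The daily integrity check Charles describes — hashing each local chunk, comparing it against the recorded digest, and flagging anything that no longer matches for resubmission to miners — can be sketched roughly like this. The file layout and function names are illustrative assumptions, not Iron Cloud's actual code:

```python
import hashlib
from pathlib import Path

def md5_of_file(path: Path, block_size: int = 1 << 20) -> str:
    """Stream the file in blocks so large chunks never need to fit in memory."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(block_size), b""):
            digest.update(block)
    return digest.hexdigest()

def verify_chunks(expected: dict[str, str], root: Path) -> list[str]:
    """Return names of chunks whose current MD5 no longer matches the recorded one.

    `expected` maps chunk file name -> recorded hex digest; anything returned
    here is a candidate for resubmission to miners.
    """
    broken = []
    for name, digest in expected.items():
        path = root / name
        if not path.exists() or md5_of_file(path) != digest:
            broken.append(name)
    return broken
```

Run daily (e.g. from cron), the returned list is exactly the set of chunks to push back to the network.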
G
So people can directly use the Powergate Docker image and upload with their own token instead of using the platform token. We would also like to support offline data transfer: we get requests from some community members who want 100 terabytes of data sent directly to the data center, and we want to add those services to our platform.
G
Data encryption is another issue. We want to integrate with a key management system to make sure that when users upload their data it's well protected, because they don't want it open to the world. And we want to build an embedded scheduling service to switch between AI training and sealing: currently we'd like to slice the GPU cores so that we still have the capability to do WinningPoSt while doing AI training. We also want to do dynamic price adjustment.
G
For now we are using a fixed price, changed every day, but we want to make it more dynamic according to price changes. I know there's an API currently being developed by the Filecoin team, so we would like to catch up once it becomes more mature. And we want to add credit card and cryptocurrency support for the U.S. region, not just Canada. So there's lots of work to be done, but I'm so excited to be part of the Filecoin ecosystem.
A
Awesome, thanks so much Charles. You know, decentralized storage protocols and AI training really should be best friends, so I really appreciate the work that you're doing. Cool, so we have two demos left. Next we have GeoGuesser — I'm really excited for this one. It was one of the few game submission projects: it's a game that uses satellite image data stored on Filecoin and has you guess where that image is from. So please, GeoGuesser, take it away.
M
Hey everyone, I'll be talking about GeoGuesser — I'm going to share how we store Landsat 8 on Filecoin. Is my screen showing?
M
Cool. So this guy on the left is Jose, my colleague; I'm the guy on the right, Eli. We both work at a studio — a development, data science and design studio that works closely with clients.
M
So when you land on the home page, you really have two paths. The first is kind of boring: it just shows you a list of files that we've uploaded, with some metadata, and you get a download link. The real fun in the game comes after you press start.
M
I guess, to put it into perspective: Jose and I were put on this project two weeks before the deadline, and we really had zero knowledge about Filecoin, which meant we had ten working days to learn enough about Filecoin to build something, come up with an idea, and also build it. So we worked quickly and broke things really fast, and we also wouldn't have gotten very far without the help of the Textile team.
M
So massive shout-outs to Aaron and Ignacio for helping us out. Our first iteration was really simple: we uploaded things manually onto a hosted Powergate instance that the Textile team graciously gave to us, which was very helpful — because when I say we had zero knowledge, we came onto the project really not knowing anything.
M
Furthermore, we had problems with the Node Powergate client. It was probably our fault, but we had some really slow downloads, and we didn't really have time to investigate further due to coming on so late. Around this time, Peter shared his Pygate project in the Filecoin Slack channel, which was incredible timing. Neither of us knew Go, and given the two weeks we had, we didn't really want to put more on our plates.
M
It was also at this time that we started developing our upload pipeline, where we downloaded a directory off of the Landsat 8 S3 buckets and pushed it up, and we also saved it to a local database — which was actually a CSV. We hacked a lot of things together, and this was one of them. And because we started late — and because we're also very competitive — we fired up two of our own instances.
M
They were beefy 64-core, 300-gigabyte-memory machines, which meant we had triple the upload power we initially had. So a day before the final due date, we finally got a chance to fix the images taking far too long to load, and decided to save images to disk after the first read.
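A minimal version of the "database that was actually a CSV" step — recording each uploaded Landsat scene alongside the CID returned by Powergate — might look like the following. The field names and the idea of appending after each push are illustrative assumptions, not the team's real pipeline:

```python
import csv
from pathlib import Path

FIELDS = ["scene_id", "size_bytes", "cid"]

def record_upload(db: Path, scene_id: str, size_bytes: int, cid: str) -> None:
    """Append one uploaded scene to the CSV 'database', creating it on first use."""
    is_new = not db.exists()
    with db.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({"scene_id": scene_id, "size_bytes": size_bytes, "cid": cid})

def load_uploads(db: Path) -> list[dict]:
    """Read back everything pushed so far, e.g. to serve the file list on the home page."""
    if not db.exists():
        return []
    with db.open(newline="") as f:
        return list(csv.DictReader(f))
```

Crude, but enough to drive both the download-link listing and the game itself.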
M
We also had to constantly monitor our Lotus nodes because they were running out of memory, so in the meantime we're rebuilding our upload pipeline to be pull-based as opposed to push-based, in hopes that it'll be easier on Lotus.
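The push-to-pull rewrite they mention — letting the uploader take work only when the node has capacity, rather than pushing everything at it — can be sketched as a simple polling loop. The queue shape and the `store` callback are illustrative stand-ins, not their actual code:

```python
from collections import deque
from typing import Callable

def run_pull_pipeline(pending: deque, store: Callable[[str], bool],
                      max_in_flight: int = 2, max_rounds: int = 100) -> list[str]:
    """Pull at most `max_in_flight` items per round, so the node sets the pace.

    `store` hands one item to the node and reports success; failures are
    requeued instead of piling more load onto a struggling node.
    """
    completed = []
    for _ in range(max_rounds):
        if not pending:
            break
        batch = [pending.popleft() for _ in range(min(max_in_flight, len(pending)))]
        for item in batch:
            if store(item):
                completed.append(item)
            else:
                pending.append(item)  # retry in a later round
    return completed
```

The key difference from a push design: backpressure is built in, because nothing leaves the queue until the previous batch has been handled.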
M
The two takeaways we got from doing this project: first, the Filecoin community is amazing. The Slack channel is very lively and extremely helpful, and the community is very welcoming and obviously wants people to be involved.
M
And second: after two weeks we were so impressed with the ecosystem, because we were able to get to where we were so quickly. So we're excited to keep contributing to Filecoin and to see the future of Filecoin. Wow, I zoomed through that. Thank you for listening to our experiences building GeoGuesser, and thank you to the Slingshot team for making this all possible. We're always looking for new clients to partner up with and build cool products that increase human agency.
A
Awesome. So we did have one last presentation from DaiSolo, but actually we're not sure if they're in the panel right now. If any of the panelists are DaiSolo, could you please say so?
A
Cool, all right, that's all right. We've had a ton of really great presentations, and I think it'd be awesome to switch over then to the awards ceremony. So let me just share my screen again.
A
Great, so we're switching over to the award ceremony, and this is what many of you have been waiting for. I'm sure, as you see this slide, your eyes creep over to the bottom right and see the 59,000-plus FIL in the rewards box there, as Pooja mentioned. But before getting into that, I just wanted to really quickly give an overview of how the community review and rewards process was structured — and I'm not going to understate it.
A
It was a pretty arduous process over the last three days for the Slingshot team — and a huge shout-out to the community review team — and it resulted in figuring out the final qualifying storage amounts for each team and the final reward amounts, taking the other reward types into account. This included reviewing all the UIs, retrieving and reviewing data stored on the Filecoin network, making sure it was retrievable, checking what kind of data we were getting back, and much more.
A
So, if you look over to the right-hand side, there's a calculation — just a quick sum of the different categories of reward pools. At the top we have storage rewards: you all did a great job and stored over 500 — kibibytes? no, pebibytes rather, I'm sorry, I was a magnitude off —
A
Rather
I'm
sorry
one
magnitude
off
of
data,
unlocking
a
50,
000
storage
reward
pool,
and
this
is
split
across
all
teams
with
qualifying
storage,
there's
also
community
awards,
and
there
were
seven
categories
of
community
awards,
plus
community
call
out
awards
for
for
honorable
mention
teams
and
then
also
booster
awards,
and
these
booster
awards
were
50
fill
awards
for
every
team
who
stored
at
least
one
gibby
byte
of
data,
and
that
gets
us
to
our
almost
60
000
in
infill
rewards.
A
So I'd love to kick off presenting the awards by announcing the top 20 projects for the storage awards. So drum roll, please.
A
So here we have the top 20 projects — I'll call out the top five. First we have Dragonfly; as you know, they created kind of a Docker Hub on Filecoin. Next, File Cloud, a personal storage app that's currently storing Unsplash image data, as you saw. And the next three — File Drive, Filecoin Net Disk, and Mateen Cloudpan — are all personal storage applications. And if you look over to the right side, there are the following 15 to round out the top 20. So congratulations to all these teams. But again, even if you don't see your name up here, as long as you have qualifying data, you will get rewards out of this 50k FIL pool.
A
And I would love to hand it off and introduce Greg Markou from ChainSafe to present the community awards. He was one of our community panelists and is a huge part of the Filecoin ecosystem: he's the co-founder and CTO of ChainSafe, and ChainSafe is currently building Forest, which is the Rust implementation of Filecoin. I know he's really excited to be here today, so I'm going to hand it off to Greg.
N
Hey everybody, thanks a lot, and great job so far — it was really great to review everybody's work. For the community awards: there are seven categories within the Community Choice Awards. The personal user storage category has prizes for the top three teams, with an overall prize pool of three thousand FIL — so there's a whole lot there, and it's pretty awesome.
N
The other six categories have prize pools of 250 FIL each, for one project apiece. And we also have community call-outs, which are essentially special honorable mentions for projects that we thought did a really great job — they each get 100 FIL for being recognized.
N
There were so many great projects, and one of the conditions — just so everybody knows — is that projects can't win multiple community awards. So if you got one, congratulations; we did not allocate another one after that. First up, we'll present the personal user storage category award. Drumroll... awesome: the first place award goes to FilBox. Congratulations — you're taking home a big prize of 1,500 FIL. FilBox had a super clean user interface, and it was well thought out.
N
We were able to go to the website, create an account, upload and store some data, and also retrieve it. It was great — really easy to use — and we're super excited for you all. One of the great parts is also the retrieval side: as I said before, retrieval is a bit of a challenging process sometimes right now, and we were able to retrieve the data and see it really well.
N
So congratulations to FilBox for taking first place in the personal user storage category. Second and third place go to Pufferfish Cloud Storage Drive and File Drive — congratulations! Both projects are very similar: elegant, great designs, and we were able to use the full feature suite of uploading, retrieving and viewing all your stuff. One of the nice parts, too, is the ability to search, which is great.
N
Next up we'll present the video and music categories, each for 250 FIL, and we've got two winners here. I'm super excited to announce that Movies.ax and WeRave are our winners for the video and music categories respectively. With Movies.ax, the judges enjoyed the fact that you could browse and search through a bunch of different movies from the public domain and view them in your browser. There's a lot of content there, and the fact that it was seamless to actually go through was really awesome. Similarly with WeRave — the same kind of user experience: we were able to browse through, actually view all the music and select what we wanted. That's great. So congratulations once again to Movies.ax and WeRave on that awesome 250 FIL prize. Next up we're doing the scientific data set and machine learning categories.
N
Oh yeah — the scientific data set and machine learning categories. Congratulations to Depliform and YOLO — sorry if I mispronounced that first one — for winning your respective categories; again, a big 250 FIL prize each. Depliform's UI was excellent and scored very well: the image search was clean, and the way pinning onto the actual map worked — you're able to pin photos and things to the map — was super elegant, really simple and fluid. Congratulations, Depliform, for winning that. And YOLO:
N
similarly, YOLO was a project that allowed you to view a gallery of Google's Open Images data set. It had a nice, clean UI and did facial recognition, so you can see all the different things going on in that data set. So congratulations to you two for winning the scientific data and machine learning prize pools.
N
Next up we've got technical complexity and outstanding creativity, and our two winners here are Decentralized Docker for technical complexity and IPFS NewBund for outstanding creativity. The really awesome part about Decentralized Docker, honestly, was that it offered a very similar experience to what we were used to — very easy to use, very clean and simple: just pull down those images and get them up and running — and honestly the UI was great.
N
It was just easy to use, and that's great, given what the challenge actually is behind the scenes. Similarly with IPFS NewBund: they stored VR video on the network and allowed you to go and play those videos in the browser, which is really cool considering retrieval constraints and whatnot. So really awesome to see that, and congratulations to you both. Next up we have the most dazzling user interface — and congratulations to GeoGuesser!
N
Really cool — kind of funky, with a bit of an old-school vibe going. They created a game that lets users guess which country a random image is from, all based on satellite images that were uploaded to Filecoin. So go check them out: they're live, with a really awesome UI, and very fun to play with. Congratulations, GeoGuesser.
N
And finally, last but not least, we'll present the community call-out awards — there's a couple here, so just bear with me. Big congratulations to Smart City IPFS, IPFS Found, Free Music Archive Browser, WinWeb3 Content Community, DaiSolo, ChainsDB, Video over Filecoin, and Cadbury Meet. The judges all really liked your projects.
N
As I said before, the call-out is there because we saw something really great in your project and really saw what you were working toward. Big congratulations, because each of you is coming home with 100 FIL today. And that's all for the community awards. Thanks a lot to everybody who participated — it was really great to actually judge all of these projects. I had a lot of fun, and I know my colleagues did as well. Keep building!
A
Awesome — and I will hijack the microphone here. Oh my goodness, there are some loose images here; I was having disappearing images on my Google Slides, if that's happened to any of you out there. But yeah, huge thanks, more importantly, to all our community reviewers. I alluded to this earlier: it was a very involved process, with some 24-hour rolling shifts going on across time zones. Thanks to the Slingshot team members who moderated the community review sessions as well, and for the quick turnaround.
A
None
of
this
would
have
been
possible
without
all
of
you,
and
I
think
it
just
speaks
to
the
great
community
that
we
have
both
the
foundation
and
are
continuing
to
develop
so
snaps
and
collapse
to
everyone
involved.
There
I'll
hand
it
off
to
pooja
really
quickly.
Who
will
give
you
a
preview
of
phase
two?
I
know
you're
hungry
for
details.
B
Yes — so we still have a little while before we publish the really specific rules and rewards for Slingshot phase two. We're taking this opportunity to amass the learnings from Slingshot phase one and create more opportunities with this next phase of the competition. We'll have more details for you soon, but in the meantime we wanted to give you some guidance that we know for sure is not going to change.
B
So if you are looking to participate in Slingshot phase 2, you can still get started during this period over the next few weeks, while we're finalizing some of the details — if you follow these rules, your efforts will still count toward Slingshot phase two. Just a bit of information on some of the relevant dates to keep in mind: officially, Slingshot phase two started last week, on Wednesday, October 21st, 2020 at 18:00 UTC. So as soon as phase one ended, we rolled straight into Slingshot phase two.
B
We'll be opening official registration for the next phase of the competition on November 11th — that's a little less than two weeks from now — and we'll be publishing the final rules and rewards by November 9th at the latest, a couple of days before official registration opens.
B
The first point: one of the main goals of the Filecoin project, and definitely of the Slingshot competition, is to preserve humanity's most important data sets on the Filecoin network. As a result — and we've already communicated this in some of the Slack channels — for Slingshot phase two we are only going to reward the storage of curated data sets. We had some description of what this means in phase one.
B
Basically, we have a list on the Slingshot repo with a number of different data sets — scientific data sets, cultural data sets, data sets that are in the public interest and for the public good. Anyone can download them freely on the internet today, and for this phase of the competition we'd like to encourage folks to use either the data sets we've already identified, or a data set you're really excited about that's also publicly accessible.
B
If there's an open-licensed data set that you think should be on this list, feel free to submit a PR to the repo to get it added, so that you can store it for the purposes of the phase two competition.
B
Another goal of the Slingshot competition is to make these important data sets accessible and explorable by anyone in the world. What this means for phase two is that we will require everyone participating in the competition to provide public-facing documentation for exactly which chunks of data they are storing on the Filecoin network and how those chunks have been structured — that includes information about the file names and formats, and the sizes of the pieces.
B
What's
the
payload
cids
that
it
can
be
retrieved
back
from
the
network
easily
and
so
on,
so
that
we'll
have
more
specific
rules
on
this,
but
we
basically
want
this.
This
data
you're,
you
know
we're
all
doing
the
really
hard
work
of
storing
it
into
the
falcon
network
and
preparing
it
so
that
it
can
be
used
and
so
for
phase
two
we're
going
to
we're
going
to
enforce
that.
B
You
know
this
is
like
data
that
can
really
be
used
by
anyone
and
then
another
corollary
of
this
is
that
we'd
love
to
see
just
greater
investment
in
uis
for
for
this
phase
of
the
competition.
So
we're
working
out
the
details
here,
but
this
is
probably
gonna
result
in
some
sort
of
ui
or
application
focused
track,
which
is
separate
from
the
data
storage
track.
B
And
so,
if
you
really,
you
know,
are
an
application
developer
and
you're
really
excited
about
just
building
beautiful
user
interfaces
that
are
just
pleasurable
experiences
for
people
to
interact
with.
This
is
your
opportunity
to
to
shine-
and
you
know
just
really
build
some
interesting
projects
on
data
that's
being
stored
on
the
file
point
network.
B
The
third
major
goal
for
this
competition
is
to
strengthen
the
file
coin,
storage,
client
and
developer
experience.
So
translation
of
this
is,
we
have
been
so
appreciative
of
all
the
stress
testing
that
has
happened
on
the
network
and
the
client
and
developer
sides
over
the
last
few
weeks
and
we'd
love
to
continue
this
with
you.
So
definitely
don't
be
shy
on
all
of
the
various
you
know,
slack
channels
and
the
various
venues
where
we're
chatting
about
you
know
how
do
you
interact
with
filecoin?
B
What's
the
best
practice
for
how
to
do
x,
y
and
z?
Please
continue
to
do
that.
It's
extremely
valuable
and
and
will
continue
to
be
so
in
phase
two
and
the
fourth
goal
is
definitely
to
have
fun
and
to
build
our
awesome,
developer
and
client
communities,
even
even
more
so
we're
currently
in
the
process
of
planning
some
really
awesome.
B
Community
events
for
phase
two,
the
velocity
of
the
events,
will
be
a
little
bit
slower
than
than
phase
one
just
so
that
we
can
have,
like
you
know,
richer
bonding
experiences
with
each
other
and
just
really
get
to
know
each
other's
projects
and
ideas,
and
hopefully
new
partnerships
will
form
out
of
that
and
so
on.
But
we're
gonna
have
some
really
awesome.
B
So we're going to have some really awesome events for phase two — definitely keep an eye out for that. And then some concrete rules. Again, there will be more specifics coming soon, but we want to give you some guidance during this period while we're still finalizing the structure. A few rules to keep in mind: in phase one of the competition, we said that participants couldn't store data with just one miner —
B
They
had
to
be
storing
with
three
or
more
minors
so
for
phase
two
we're
actually
making
this
a
bit
of
a
stronger
requirement,
so
we're
participants
will
not
be
able
to
store
more
than
30
of
their
total
data
with
any
individual
miners,
so
just
making
sure
that
your
data
is
more
decentralized.
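The 30%-per-miner cap can be checked mechanically from your own deal list. A small sketch — the `(miner_id, bytes)` deal shape here is an assumption for illustration, not an official Slingshot format:

```python
def max_miner_share(deals: list[tuple[str, int]]) -> float:
    """Given (miner_id, bytes_stored) deals, return the largest fraction of the
    total data held by any single miner."""
    totals: dict[str, int] = {}
    for miner, size in deals:
        totals[miner] = totals.get(miner, 0) + size
    grand_total = sum(totals.values())
    return max(totals.values()) / grand_total if grand_total else 0.0

def satisfies_phase2_cap(deals: list[tuple[str, int]], cap: float = 0.30) -> bool:
    """True if no single miner holds more than `cap` of the total stored data."""
    return max_miner_share(deals) <= cap
```

Running this before submission tells you immediately whether you need to spread deals across more miners.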
B
This is also really important for having redundant copies on the network that are spread more broadly — copies in different geographies, with different entities, are an important feature of any storage system — and this is a way to enforce that through the competition: to make sure that your data is copied many times on the network and is available when you need it.
B
Another rule: we've seen a lot of folks storing data with their own miners, and in phase two of the competition this won't be eligible for rewards. We're really looking for storage clients and developers who are storing data with other miners on the network — miners who are not their own. And then the third rule:
B
at the end of phase one, we asked for a list of all of the deals that participants had made throughout the course of the competition, and some folks ran into a little bit of trouble with this. It was a bit of a last-minute requirement, and maybe folks had lost this information at some point in the competition, or just didn't have a way to get this data back out.
B
So we're letting you know up front that this applies for the entirety of phase two. It's also simply necessary for you to be able to get data back out of the network. So it's a requirement now: we'd like all projects to start creating an index and storing information about the data they're storing on the network — also because we're going to have a data-focused track and a UI-focused track for phase two.
B
Having these indexes is going to be really important so that others in the competition, and in the Filecoin ecosystem in general, can see what data is being stored on the network and then build their own applications and UIs on top of it. There are some specifics: it's the same set of requirements as for phase one in terms of which fields we're looking for folks to record.
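An index like the one described — one public record per stored piece, carrying the file name, format, size, and payload CID — could be as simple as a JSON-lines file. The field names below are illustrative; use whatever the official phase-two rules specify:

```python
import json
from pathlib import Path

def append_index_entry(index: Path, file_name: str, file_format: str,
                       size_bytes: int, payload_cid: str) -> None:
    """Append one record describing a piece of data stored on the network."""
    entry = {
        "file_name": file_name,
        "format": file_format,
        "size_bytes": size_bytes,
        "payload_cid": payload_cid,
    }
    with index.open("a") as f:
        f.write(json.dumps(entry) + "\n")

def read_index(index: Path) -> list[dict]:
    """Load every record, so others can discover (and retrieve) the data."""
    if not index.exists():
        return []
    return [json.loads(line) for line in index.read_text().splitlines() if line]
```

Publishing this file alongside your project is what makes the stored data explorable by other teams building UIs on top of it.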
B
Awesome — and I just wanted to touch on this briefly. This doesn't tell you anything about the specifics of how these prizes will be awarded, but for phase two we will have even more categories of awards than we did for phase one. We're still going to have the data awards — the storage awards — we will still have community choice awards, and we will still have the booster awards.
B
David mentioned this on one of the slides, but every team that stored more than one gibibyte of data in this competition and met all of the rules for phase one will be awarded a 50 Filecoin boost. For phase two the amount of the boost will be different, but
B
we're definitely providing booster rewards as well, so that you can continue to store data throughout the course of the phase two competition. We're also going to start seeing some prizes from community organizations. So I think phase two is going to be really, really exciting — different from phase one, incorporating a lot of our learnings — definitely an exciting competition, and there's a lot ahead. I'm really excited that you are with us on this journey, and we'll have more details for you very soon.
A
Awesome, yeah. And in case you were scribbling down notes as quickly as you could to get these details — no sweat, we will post this in the Slingshot announcements channel following the presentation. I know it's late for some of you and early for others, but please sit tight: we have one more special speaker here tonight. Juan is going to recap a little bit about mainnet launch, but also look forward into the future and get you excited for phase two in the way only Juan can. So please, Juan.
H
Yeah, all right, awesome — thank you very much. I will keep it very short, because it's already been a really amazing ceremony and we are probably all dazed by all of the awesome things that have been going on, so I'll keep it brief. When we opened the Slingshot competition, I talked a bit about some suggestions for Slingshot folks — what clients could do and what miners could do — and it's really awesome to see all of those things come true.
H
I gave some suggestions for data, and a ton of those things made it into these applications, so it's been really awesome to see all the different directions people took things. A really massive congratulations to all the participants and all of the winners — the applications you made, and the way you used this, are really, really great. A number of these are super exciting to me personally.
H
To
me
personally,
things
like
being
able
to
see
the
video
distribution
stuff
already
working,
like
movies.ax,
are
really
cool,
because
loading
all
of
those
public
movies
really
quickly
is
fantastic,
and
it
shows
a
lot
of
promise
for
Filecoin
in
general.
I'm
also
really
excited
about
all
of
the
machine
learning
and
scientific
dataset
use
cases.
H
Those
are
also
really
interesting
and
really
promising,
as
is
the
developer
tooling
stuff:
being
able
to
see
Docker
containers
and
so
on,
and
have
that
entire
flow
working,
is
super
promising.
It's
been
awesome
to
see
what
you've
done
with
Filecoin
in
a
super
short
time,
and
I
can't
wait
to
see
what
you
do
next.
H
I
just
grabbed
a
few
screenshots
of
these
applications
as
we
went
through;
all
of
them
are
super
cool,
and
I
can't
wait
to
see
what
you
build
for
phase
two.
During
the
competition
we
launched
the
network,
and
that
was
a
pretty
amazing
set
of
events
in
itself.
If
you
haven't
gotten
a
chance
to
see
the
conference
from
Liftoff,
I
highly
recommend
checking
out
all
of
the
videos:
we've
got
a
ton
of
video
captured
from
all
of
the
different
events
last
week,
and
there's
a
large
YouTube
page
with
all
of
those
talks.
Definitely
a
lot
of
really
great
ideas
for
the
future
and
really
awesome
stuff
happening
across
the
ecosystem,
so
get
pumped,
because
I
think
the
next
few
months
are
going
to
be
pretty
amazing
for
the
whole
network.
H
With
that,
I
want
to
share
some
thoughts
for
the
future.
As
people
participating
in
the
Slingshot
competition,
you
are
at
the
forefront
of
where
clients
will
come
in,
so
a
few
thoughts
there.
First,
a
lot
of
you
are
already
doing
this,
but
think
about
ways
of
transferring
everything
you've
learned
over
to
large-scale
applications
that
you
may
be
working
on.
H
Maybe
you
worked
on
a
hack
or
a
side
project
for
Slingshot;
consider
all
of
this
a
good
trial
run
for
doing
it
at
a
larger
scale.
And
for
developers,
a
ton
of
the
stack
that
you're
playing
around
with
now,
and
starting
to
use
in
your
applications,
can
really
power
pretty
large,
fully
web3-native
applications,
so
it's
pretty
exciting
to
see
what
you
will
go
on
and
build.
Right
now
you're
probably
ahead
of
the
curve,
because
you
spent
the
last
few
weeks
learning
a
lot
of
this
new
stack
and
there
are
not
that
many
people
in
the
world
who
know
it.
You
have
a
huge
advantage
and
an
edge
on
a
lot
of
other
folks.
H
I
can't
wait
to
see
what
you
end
up
making.
With
that,
I'll
repeat
some
of
the
words
of
advice
that
I
gave
at
Liftoff
Week:
focusing
on
bringing
deeply
valuable
data
to
the
network
is
the
next
phase
for
Filecoin.
We
have
an
enormous
amount
of
capacity,
680
petabytes,
which
is
truly
a
staggering
level
of
scale.
H
We
need
to
start
filling
that
massive
amount
of
capacity
with
really
useful,
valuable
data,
so
I
can't
wait
to
see
what
happens
with
Slingshot
phase
two.
But
even
beyond
the
competition,
think
about
the
kinds
of
things
that
you
believe
should
be
stored
in
the
network,
that
should
be
stored
for
the
long
term.
We
think
there's
a
fantastic
opportunity
to
build
probably
the
world's
best
repository
of
public
data.
H
That's
a
pretty
amazing
thing
we
could
get
out
of
building
Filecoin.
Second,
think
of
new
kinds
of
applications
to
build.
A
lot
of
you
already
did
this
for
this
competition.
H
Now
consider,
even
beyond
Slingshot,
what
kinds
of
apps
you
might
want
to
make,
and
stay
on
the
lookout
for
things
like
hackathons.
Also,
if
you
end
up
making
something
in
your
free
time,
or
you
start
building
a
new
product
or
a
new
company
or
something
like
that,
definitely
talk
about
it
on
Slack.
A
lot
of
people
are
super
excited
to
try
out
new
things,
to
help
out,
and
to
build
new
things,
so
really
lean
on
the
community
with
anything
you
start
making;
a
ton
of
amazing
things
are
built
just
by
people
playing
around
and
hacking
on
things.
In
fact,
IPFS
started
that
way.
The
last
bit
of
advice
is
to
really
think
long
term:
Filecoin
is
really
about
long-term
value
creation.
H
So
as
you
go
and
make
applications,
think
of
what's
now
possible
with
Filecoin
and
how
you
can
make
something
that
lasts,
something
that
is
either
fundamentally
better
than
what's
out
there,
or
similar
but
dramatically
cheaper
or
faster,
and
think
of
ways
of
providing
value
in
that
way.
Great,
I
think
with
that
I
will
pause
here.
H
Maybe
I'll
just
show
you
these
links
for
those
of
you
who
stayed
through
the
event.
H
Here's
the
site
for
the
whole
conference;
there
are
a
lot
of
amazing
talks,
and
you
can
find
all
of
them
here,
so
a
really
awesome
amount
of
content.
And,
just
to
be
recursive
for
a
moment...
all
right.
A
Awesome,
thanks
so
much,
Juan.
It's
super
exciting
to
know
that,
even
though
we're
standing
on
the
backs
of
so
much
progress
by
humanity,
we've
only
scratched
the
surface
of
the
next
chapter,
so
I'm
really
excited
to
see
what's
next.
We
even
saw,
just
in
the
demos
today,
that
three
out
of
the
ten
projects
in
a
sample
were
using
kai
gates.
So
please
go
out
and
build.
With
that,
we
appreciate
everyone
sticking
around
with
us
for
these
two
hours;
it
was
a
great
time,
great
to
see
a
lot
of
friendly
faces
and
connect
Slack
names
to
presentations,
and
we're
really
stoked
for
what
phase
two
has
to
bring.