From YouTube: DataFund Vision
Description
In this presentation from Day 3 of the #SwarmOrangeSummit, Gregor Žavcer from Datafund presents their vision. In this video he explains the concept of data as labour: something that can bring revenue to others but is not their property. This concept is important because our digital and physical selves are starting to merge, so it matters who owns the digital side of us, i.e. our data.
A
We maintain this mental dualism between the digital and the physical, but in essence we can see that today's actions are often guided from the virtual into the physical. A simple example would be people going to a concert to take a selfie and collect likes rather than to listen to the music. At the same time, the problem is this: if we accept the notion that these two worlds have merged, then I guess we should also expect that norms of behavior and values are synchronized across both domains. But this is not the case.
A
So when we speak about data, about personal data, quite often behavior that happens in the virtual world would be totally unacceptable in the physical one. Just imagine you're on a date, and a guy in a Facebook t-shirt sits down next to you and says: "Don't mind me, you just keep on talking. I'll make some notes so I know what to sell you later."
A
You probably wouldn't be thrilled. And we could go on and on with examples like this: a postman opening your mail and saying, "I see you're planning a trip to the Bahamas; here's a special offer on shorts and a kitesurfing course." Why is this important? It's important for how we design our systems, how we think about them and how we approach these topics. Now the second concept. The second concept is that your personal data is actually your digital self.
A
This can be quite a meta concept. I won't go into depth here; we already discussed the notion of selves and so on. Now imagine: we have all our personal data collected in one place. What happens when we run scripts over it? That's the first moment we give our digital self a little bit of autonomy. And then we run some more scripts on it, and then what?
A
What does the future bring when we can have a fully autonomous digital self? What is our relation to this digital self? Are we growing dependent on it? Are we basically relinquishing our free will? These are again some questions to ponder. The next concept, which has lately been gaining quite a lot of traction, is that data should be considered as labor. Currently, the most common idea is that data is capital.
A
The more data I have, the more capital I have. This results in companies holding immense amounts of data, not really knowing what to do with it; but it's capital, so it's good to just have it stored, and one day we will figure out how to use it. And again, why is this important? It's important because of what follows if we accept that data is labour.
A
It changes a lot of other dynamics, for example the question of who owns the data. This goes back to slavery law: in times of slavery, the law was essentially "do not kill your slave unless it really makes trouble." Then, when slavery was abolished, social law came into place, which basically addressed the idea of employment. Employment means: I can work for you, but you don't own me.
A
So if we accept data as labor, we can basically deduce the same notion: you can use my personal data, but you don't own my personal data, because my personal data is actually me. Because of all this, and for other reasons as well, we at Datafund deeply believe that in the digital age, freedom begins with truly owning your data. Whoever owns your data will own your future, and because of that, privacy matters: among other things, it protects your digital self.
A
But here at the summit we speak a lot about technical solutions, and yes, there are technical challenges. At the same time, we need to be aware that privacy is as much a social challenge as it is a technological one. We really see this when we speak with people and work with them, as we at Datafund are developing a privacy solution.
A
Things are changing, especially in Europe, and especially on the front of awareness; we will see how much on the legal front. In Europe there is a new regulation coming into force at the end of the month called the GDPR. How many of you have heard about the GDPR? Cool, so I can just skip this. What is important for us at Datafund, and this is why we also focus first on the European Union, is that the GDPR introduces much higher penalties, and for us this changes the game-theoretic setting.
A
So some assumptions in how we look at the protocol hold a lot better in Europe than in the rest of the world. Nonetheless, we feel that the rest of the world will follow. At this point I'll just take the opportunity to say that we're very open and can help you too: if you feel that you touch upon the GDPR, or are not sure, you can just contact us, and our in-house lawyer can give you a first rundown of how compliant you are, and then we'll see if we can help.
A
So all this leads to another thing that we are quite missing in this space, and that is the talk about ethics. Ethics, in essence, is about defining good and bad, and we feel that the decentralized systems we build are systems that touch directly upon the human. In our case, if somebody decides which data to share and what to do with that data, we need ethics, because ethics, in our opinion, is a social consensus.
A
It's important because with ethics we address how incentives drive behavior. This is one of the biggest issues, and I think in the future we will speak about it even more, because currently most of the systems we have in place (actually, I don't know of any that don't) address only so-called extrinsic motivation, and the main value driver of the network is actually greed.
A
There is already some research coming out on how purely monetary incentives in a network can make that network toxic over time; you know this if you follow a little of what's going on in the Bitcoin community. Moreover, ethics awareness eliminates ignorance and minimizes unintended side effects. After all, we are supposed to build unstoppable apps.
A
So basically, these are the concepts that guide the development of our protocol and our dapp. In essence, Datafund is a protocol that guards personal data, provides safe storage and enables provable, fair and ethical data exchange. At least we hope that's going to happen. One of the main guidelines we have in the protocol design is our so-called min-max sharing principle, which essentially says: minimize sharing of your personal data.
A
Do it on a need-to-know basis; know why you give out your personal data. On the other side, maximize sharing of anonymized data. Anonymized data doesn't expose us, but it can contribute to big datasets in all kinds of areas: smart cities, research, health and so forth.
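The min-max sharing principle above can be sketched in a few lines; the function and field names here are hypothetical illustrations, not Datafund's actual API:

```python
# Toy sketch of the min-max sharing principle: minimize sharing of
# personal data (need-to-know basis), maximize sharing of anonymized data.
# All names here are hypothetical, not Datafund's actual API.

def decide_sharing(requested_fields, needed_fields, is_anonymized):
    """Return the subset of requested fields that may be shared."""
    if is_anonymized:
        # Anonymized data doesn't expose the person: share it all.
        return set(requested_fields)
    # Personal data: share only what is strictly needed.
    return set(requested_fields) & set(needed_fields)

decide_sharing({"age", "location", "name"}, {"age"}, is_anonymized=False)
```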
Now, the protocol, in essence, has three layers; the first is consent management.
A
It always starts with consent, our permission, a mutual agreement so to say. This is also where the GDPR part hides: we are not specifically talking about the GDPR on the level of protocol design, but it is maybe even stricter. Consent management is then followed by personal data exchange and anonymized data exchange.
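The consent layer described above could be sketched as a simple record; every field name here, and the time-bounded expiry, is an illustrative assumption rather than the actual Datafund schema:

```python
# Hypothetical sketch of a consent record as the first protocol layer:
# every exchange starts with an explicit, mutual agreement.
# Fields are illustrative, not the actual Datafund schema.

from dataclasses import dataclass
import time

@dataclass
class Consent:
    subject: str       # whose data this covers
    recipient: str     # who may use it
    fields: tuple      # which data fields are covered
    purpose: str       # why (need-to-know basis)
    expires_at: float  # consent is time-bounded

    def allows(self, recipient, field_name, now=None):
        """Check whether a given access is covered by this consent."""
        now = time.time() if now is None else now
        return (recipient == self.recipient
                and field_name in self.fields
                and now < self.expires_at)

c = Consent("alice", "acme", ("age",), "age verification", time.time() + 3600)
```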
A
I will now just touch upon the anonymized data exchange, because it's kind of the fun part to present how we see this working. Maybe before I start explaining this diagram: Datafund is also envisioned as a model, a sort of concept, the way we talk about curation markets or prediction markets and so forth. Datafund is also a model for how things should be done in terms of exchanging data.
A
It draws inspiration from financial funds, and that's the fun part. If you buy into the notion that personal data has value, then we can also agree that different categories of personal data have value. These categories can be considered as assets in your personal data fund, which you manage: you are basically the one who decides what kind of data you are going to collect.
A
You decide what kind of data you want to share and under which conditions, and this in the end determines your monetization options, if you're interested in monetizing. Everybody can have one or several data funds. The general idea is like a financial fund: we have investors, so-called LPs, who invest the money; the fund manager then decides which assets to hold and basically manages the fund; and the profits are split after the fund manager's fees are taken. Datafund is actually doing something quite similar.
A
So we incentivize companies to give the data back. It's a little bit of upside-down thinking: why do we need to incentivize, and actually buy back, our data if it's ours? But let's call it the power of incentives. How does it happen? Datafund itself takes a little inspiration from curation markets, using a bonding curve. In essence, company A comes to me and says: "Gregor, we've been collecting data about you for five years. It's nicely structured."
A
"We did the work, and we feel it would enrich your data set." I look at it, I evaluate it, I agree, I accept it, and they are issued my personal tokens from the data fund. The same happens with company B. Now, if they are supplying the same kind of data, the same type of data, its value goes down; it's basically first come, first served. This already creates the incentive.
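The first-come-first-served dynamic above can be illustrated with a toy bonding-curve-style reward; the decay shape and all names are assumptions made for illustration, not the actual token mechanics:

```python
# Toy bonding-curve-style reward illustrating the first-come-first-served
# dynamic: the more units of the same data type the fund already holds,
# the fewer tokens a new supplier receives. The 1/(n+1) decay and all
# names are assumptions for illustration, not Datafund's token mechanics.

def token_reward(existing_units, new_units, base_reward=100.0):
    """Tokens issued for supplying `new_units` of a data type of which
    the fund already holds `existing_units`."""
    return sum(base_reward / (existing_units + i + 1)
               for i in range(new_units))

first = token_reward(existing_units=0, new_units=1)   # company A: new data type
second = token_reward(existing_units=1, new_units=1)  # company B: duplicate type
assert second < first  # duplicates are worth less
```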
A
Then I, as the data fund manager, or in this case Alice, decide what I want to share and how I want to share it. With our token mechanics we actually incentivize anonymized data sharing, which I think is of utmost importance, and then we try to push it to as many anonymized data marketplaces as possible.
A
This monetization creates revenues; the revenues go back into the data fund and are split according to the shares of the tokens. So the token holders who got issued these personal tokens are entitled to a share.
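The pro-rata split described above could look roughly like this; the names and numbers are purely illustrative:

```python
# Minimal sketch of the revenue split: marketplace revenues flow back into
# the data fund and are divided pro rata among personal-token holders.
# Names and numbers are purely illustrative.

def split_revenue(revenue, holdings):
    """Split `revenue` among holders according to their share of tokens."""
    total = sum(holdings.values())
    return {holder: revenue * amount / total
            for holder, amount in holdings.items()}

split_revenue(90.0, {"alice": 60, "company_a": 30, "company_b": 10})
```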
So why do we do this? At the moment, this is actually the only solution we see for how fair and ethical data exchange could be done, and the thing is, with this kind of setup, companies are actually incentivized to give data back.
A
In essence, data that wants to be shared should be shared, and through that we also decrease information asymmetry. I haven't actually mentioned Swarm, but I think it's kind of self-explanatory that this vision of ours is basically impossible without Swarm. We plan to use the SWAP, SWEAR and SWINDLE framework, and of course our data should be stored there, where we define how we access it and with whom we share it. With that, I thank you, and if there are any questions, feel free.
B
A
This is a good question, and it's actually something I maybe should have made clearer throughout the presentation: it's about the process. Personally, I don't believe a solution can be developed today that will just magically solve everything. Part of this process, in Europe at least, started with the GDPR, where we are now getting a lot of emails in which companies ask us if we agree. Then there are these kinds of Black Swan events, like Facebook and Cambridge Analytica, and all of this drives awareness.
A
A good thing in the GDPR is the notion that privacy policies should be made in a readable format. When we ask for consent, it should be presented in a couple of sentences, not something you have to read twenty pages of; it feels like every privacy policy on every site is a whitepaper to read. So, yeah. That's how it's going.