Description
Berlin Ethereum Meetup - 2018.08.28
Slides: coming soon
This is all not that far off. We have a proof of concept that's currently running on Ethereum, but we are aiming for a more scalable approach that's coming later this year. To do this, and to allow video platforms to do their daily job, we are creating a new type of mining: video mining. It's aimed at cryptocurrency miners who use GPUs for cryptocurrency hashing, and we want to create an additional income stream for them.
So if you are a developer who wants to add video to your application, you face a lot of pain points. It's really expensive to go with the services of the big five like Amazon or Google; maybe not to start with, but once you scale your video infrastructure up, it becomes really expensive. On the other hand, you could self-host: get a bunch of root servers and manage them on your own.
But that is hard to maintain and requires a DevOps layer that is most likely human-resource intensive to keep a system running stably at scale. And if you go with the cloud providers instead, you're also agreeing to rent your infrastructure from your competitor. If you want to create the next version of YouTube, it's kind of weird to go to Google and rent infrastructure from them, and there are no real alternatives available, so you couldn't just go somewhere else.
So with Livepeer we are trying to create a marketplace, a trustless marketplace, so that anybody who has resources can contribute them to a video network. We are building orchestration services that basically allow you to share idle capacity in GPU mining rigs and other resources, and to share or sell that capacity on a marketplace.
People have previously been building these mining rigs; I guess some of you have been in touch with them already. They consist of graphics cards plus a dumbed-down motherboard and CPU that just controls the hashing on those cards. But what most people do not know is that only parts of the chips on these cards are actually used for the hashing. Others are there for other purposes.
For example, on Nvidia cards the NVDEC decoder and the NVENC encoder are separate ASIC blocks, apart from the CUDA cores. So in the lower part of the die you basically have the CUDA cores, programmable compute units that are being used for hashing, and then you have the Nvidia decoder and encoder. Those are ASICs: circuits that can't do anything other than video decompression and compression.
So there's a big price difference if you compare encoding video on a graphics card with rendering it on, say, AWS. On AWS, rendering a live stream like the one we are streaming from here on Livepeer would cost you about $3.00 per hour, but the same graphics card, or a similar one, in a mining rig would only produce an income of about $1.50 in a whole day, while it could render 48 hours of video.
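The gap can be sketched with back-of-the-envelope arithmetic, using only the figures quoted above (the $3/hour AWS rate and the $1.50/day mining income; everything else is plain division):

```python
# Back-of-the-envelope comparison of the two revenue models for one GPU,
# using the figures quoted in the talk.
aws_price_per_hour = 3.00      # what AWS charges to render one live-stream hour
mining_income_per_day = 1.50   # what the same card earns per day in a mining rig
hours_rendered_per_day = 48    # hours of video the card could render in one day

# Revenue if that rendering capacity were sold at the AWS rate:
potential_per_day = aws_price_per_hour * hours_rendered_per_day

ratio = potential_per_day / mining_income_per_day
print(f"Transcoding at AWS rates is ~{ratio:.0f}x the mining income")
```

That works out to roughly a hundredfold difference for the same service, which is the comparison the next paragraph draws.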
So, as you see, the difference is huge: about a hundred times, for the same service. The miners, on the other hand, have very low entry costs to get into that game. Once our software is completely finished, they can run it side by side with their usual hashing in their cryptocurrency mining rigs. They don't have to stop their usual operation; they can use the idle chips next to it, which gives them a second income stream. And the resources that are available are massive.
As you can see in that picture, that's just one of the mining farms that Genesis Mining, a big cloud-hashing contract provider, operates; there are a lot of them worldwide. But this does not apply only to the big mining farms: everybody could use it. In fact, the big mining farms will have less of an opportunity here, as they would have to provision dedicated resources, in the form of bandwidth, for this specific service, while a home miner would basically be able to use what they already have.
Of course, if we give you a video and you encode it for us, we have to make sure it's properly done; if you don't do it properly, we couldn't really use it. So we had to design a passive validation mechanism to make this work, and that's basically a combination of checks, starting with the encoding settings.
We use open-source technology from the ffmpeg family to look into the video elementary streams and see whether certain aspects of the encoding job have been done properly: is the codec right, the bitrate, the frame size, things like that. But we also probe the video quality. For that there is a tool released by Netflix called VMAF, which uses perceptual video metrics to produce a score for the video quality.
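A minimal sketch of the settings check described above. It validates an `ffprobe -print_format json -show_streams` report against the agreed job parameters; the JSON field names are real ffprobe keys, but the `check_encoding` helper and its tolerance value are illustrative, not Livepeer's actual validation code:

```python
import json

def check_encoding(probe_json: str, expected: dict, bitrate_tolerance: float = 0.10) -> list:
    """Compare an ffprobe JSON report against the agreed encoding settings.

    Returns a list of human-readable violations (empty list = job looks fine).
    """
    video = next(s for s in json.loads(probe_json)["streams"]
                 if s["codec_type"] == "video")
    violations = []
    if video["codec_name"] != expected["codec"]:
        violations.append(f"codec {video['codec_name']} != {expected['codec']}")
    if (video["width"], video["height"]) != expected["resolution"]:
        violations.append(f"frame size {video['width']}x{video['height']} is wrong")
    # Bitrate never matches exactly; allow a relative tolerance.
    actual_rate = int(video["bit_rate"])
    if abs(actual_rate - expected["bitrate"]) > bitrate_tolerance * expected["bitrate"]:
        violations.append(f"bitrate {actual_rate} outside tolerance")
    return violations

# Example: a 720p H.264 rendition probed after transcoding.
report = json.dumps({"streams": [{
    "codec_type": "video", "codec_name": "h264",
    "width": 1280, "height": 720, "bit_rate": "2100000"}]})
print(check_encoding(report, {"codec": "h264",
                              "resolution": (1280, 720),
                              "bitrate": 2_000_000}))
```

Because the check only reads the probe report, it stays passive: the validator never needs to run or trust the encoder's own software.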
It allows us to see how well an encoder performed, so how good the quality was that it produced, and it also allows us to see whether somebody cheated and tried to simplify the encoding process. We also have to see whether the content was tampered with, that is, whether the content of the video itself was changed. We use perceptual video hashing for that as well, with a Pearson correlation algorithm that tells us if parts of the video, or single pictures in the video, have been changed.
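The tamper check can be sketched like this: reduce each frame to a coarse perceptual signature, then Pearson-correlate the signatures. A re-encode with normal compression noise stays highly correlated; different content does not. This is a toy illustration of the idea, not Livepeer's actual hashing code, and the grid size and threshold are made-up values:

```python
import numpy as np

def perceptual_signature(frame: np.ndarray, grid: int = 8) -> np.ndarray:
    """Down-sample a grayscale frame to a grid x grid block-mean signature.
    (Assumes the frame dimensions are divisible by `grid`, to keep this short.)"""
    h, w = frame.shape
    return frame.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))

def frames_match(frame_a: np.ndarray, frame_b: np.ndarray, threshold: float = 0.9) -> bool:
    """Pearson-correlate the two signatures; low correlation = content changed."""
    r = np.corrcoef(perceptual_signature(frame_a).ravel(),
                    perceptual_signature(frame_b).ravel())[0, 1]
    return bool(r >= threshold)

# A frame, a mildly noisy re-encode of it, and an unrelated frame.
rng = np.random.default_rng(0)
original = rng.uniform(0, 255, (64, 64))
reencoded = original + rng.normal(0, 5, original.shape)  # encoding noise survives
tampered = rng.uniform(0, 255, (64, 64))                 # different content fails
print(frames_match(original, reencoded), frames_match(original, tampered))
```

The coarse signature is what makes the check robust: ordinary lossy compression barely moves the block means, while replaced pictures destroy the correlation.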
That was an open research topic for the last couple of months, but we luckily got into contact with researchers and developers from the ffmpeg community who volunteered to help us get this built. So this is now available with us in an alpha. It's not available for you yet, but we are testing it internally, and hopefully soon it will be available for you.
These validation methods now allow us to distribute video encoding jobs to any type of software; the encoding software does not have to come from us, and the same goes for chips. If somebody out there creates a new video encoding chip which is way more efficient, they can just contribute to our network without us ever touching their software. It's completely passive validation. Before, we were using Truebit: all the encoding we did was CPU based, and people were using Truebit along with ffmpeg to really be able to validate
whether somebody had changed the encoding software. But that's not really helping with driving competition up, and that is what we do: we try to create a scenario where transcoders compete against each other for the available work in the network. We want to make them compete on price, quality and performance. So, first price.
We are creating a marketplace (it's not written in stone, we're just creating it) where there's a secret auction: workers bid to get the work done, and if you as a transcoder submit a good price, one that the other party, the broadcaster, is willing to pay, your likelihood of getting the job is higher. We're going to live in a world where there are so many more workers than available work that we think only the most competitive bids will stand a chance.
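The selection step can be sketched as a sealed-bid auction: bids above the broadcaster's ceiling are dropped, and cheaper bids get a proportionally higher chance of winning. The weighting scheme and all names below are illustrative assumptions, not the actual on-chain mechanism:

```python
import random
from typing import Optional

def pick_transcoder(bids: dict, max_price: float, seed: Optional[int] = None) -> Optional[str]:
    """Sealed-bid job assignment sketch.

    `bids` maps transcoder id -> price per unit of work, submitted without
    seeing the other bids. Bids above the broadcaster's maximum are excluded;
    among the rest, a lower price earns a proportionally larger winning chance.
    """
    eligible = {t: p for t, p in bids.items() if p <= max_price}
    if not eligible:
        return None  # nobody undercut the broadcaster's ceiling
    # Weight each transcoder by how far below the ceiling it bid.
    weights = {t: (max_price - p) + 1e-9 for t, p in eligible.items()}
    rng = random.Random(seed)
    return rng.choices(list(weights), weights=list(weights.values()))[0]

bids = {"rig-berlin": 0.05, "rig-tokyo": 0.20, "farm-iceland": 0.50}
print(pick_transcoder(bids, max_price=0.25, seed=42))
```

Keeping the winner probabilistic rather than always picking the minimum is one way to leave room for the quality and performance dimensions discussed next.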
We tried to do this and solve this a couple of years ago at a US company, and I know that there is always going to be a threshold that none of the companies wants to step over in terms of price, as the onboarding costs for their customers and their support costs are much higher than the actual infrastructure or operational cost. So they will never allow certain price thresholds to be broken, and there has to be automatic software doing the price negotiation so that there is not so much room for price cartels building up in that space. Then on quality:
we want to allow transcoders to broadcast a higher price if they deliver better quality, but we also want to allow older equipment to do the video processing for networks that are not able to sustain a higher price. Not all video use cases are the same, right? There are use cases with a real business model behind them, where the video developer implementing it can pick the more expensive but better-looking or more performant technology.
But we also have markets where there's no business model and it's subsidized anyway, so the more money they save along the way, the longer they can run. Having a marketplace that allows people to bid on both price and quality will allow these different markets to access the same marketplace.
Amazon sells encoding for the same price to everybody, but there are businesses out there, say in India or somewhere else, where the business model behind these video use cases might never reach the equilibrium that we see here in the Western world. And performance, as the last dimension: we create race conditions among transcoders to make sure that there is at least one really well-performing transcoder who can then win the job. We do this so that, even though it's a distributed network,
we can give you some confidence that your encoding or your live stream will always be maintained at a professional level. In the end there is not a single point of failure, but at least one transcoder still carrying your stream. The result is a dramatic cost reduction: we say we can get to about 5% of what the market price is today, from a $3-per-hour price.
We think we can drop to something like 6 cents, and that's not our cut, of course; that's the net income of a transcoder. Livepeer does not plan to put a fee-taking market on top of that. It's just going to be an open-source marketplace that anybody can use without a middleman collecting fees on top of it, other than the actual transcoders and infrastructure providers.
We want to make sure that the overall quality gets higher and higher over time, so the competition itself should also incentivize transcoders to offer higher-quality encoding jobs for a smaller price, and higher reliability too. Currently we say reliability is going to come, in our world, through redundancy: we're going to have multiple production lines working simultaneously,
so one can fall away without harming the overall result. Now, the market size for transcoding is something we can roughly calculate. YouTube doesn't publish numbers, but in some of their press releases you can hear them say they ingest about 300 hours of video every minute, and that adds up to about 18,000 concurrent live streams. With each Nvidia graphics card being able to render two streams, you would only need 9,000 of these cards to render all of YouTube's incoming video in real time.
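The estimate follows directly from the quoted figures: 300 ingest-hours per wall-clock minute means 300 × 60 streams running concurrently in real time, and the same per-card figure scales up to the Ethereum GPU fleet mentioned later in the talk:

```python
# YouTube's stated ingest rate, converted to concurrent real-time streams.
ingest_hours_per_minute = 300
concurrent_streams = ingest_hours_per_minute * 60   # hours/min x 60 min/hour

# One Nvidia card renders ~2 streams in real time (the talk's assumption).
streams_per_gpu = 2
gpus_needed = concurrent_streams // streams_per_gpu

# The same assumption applied to the ~6 million GPUs estimated to be
# mining Ethereum gives the idle encoding capacity quoted below.
ethereum_gpus = 6_000_000
idle_encode_capacity = ethereum_gpus * streams_per_gpu

print(concurrent_streams, gpus_needed, idle_encode_capacity)
```

So 9,000 cards would cover YouTube's ingest, while the idle encoder ASICs next to Ethereum's mining GPUs could carry about 12 million concurrent streams.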
We're not saying that we aim to do this; it's just to give you a perspective on how few resources are actually needed to do the job. It's not that much. With Twitch, for example, you see they peak at 75,000 concurrent channels, and not even all Twitch channels get a transcoder, only partner-program members or channels with above 50 concurrent viewers. But still, about 35,000 cards would be enough to render all of this in real time.
Now take the Ethereum network. If you take the hash rate and divide it by what a single card can do, you end up with about 6 million Nvidia graphics cards doing that job. So alongside the Ethereum proof-of-work hash power, the idle ASIC chips sitting next to those graphics cards could render about 12 million concurrent live streams, all the time. That's way more than the worldwide demand, so it sounds like a business threat, right? But it's really not.
It's not a business threat for the enterprise vendors. Encoding, and video in general, is a very oversaturated market, and these enterprises have specialized in niches, selling into certain verticals. We are not addressing that at all, and from a cost perspective they are also not at the point where they desire a dramatic cost decrease.
So it's not for the enterprise, and it's also not for premium. Netflix encodes every video a few hundred times, literally brute-forcing the encoding to get the best possible outcome, so for premium content providers it's not going to be interesting either. Even the traditional broadcasters, like German television, won't be interested;
they have their encoding solutions. What this really does is create a new market: a market for decentralized applications that first need a decentralized infrastructure to stay true to their nature, and that also need a dramatic cost decrease. We have all heard about startups who were lucky and got some venture capital or other funding, then started to build a video product and drained through their resources really, really quickly.
I've seen that happen myself a couple of times. People had good ideas and even good funding, they launched, and then they had sudden success: a lot of people were consuming their content or using their app, and they were not able to build up a business model before they ran out of funds. That's a huge risk, and lowering the cost of the operation simply means giving them a longer runway.
If we are at five percent of the infrastructure cost for running a video platform, it also means that somebody implementing it, putting the same capital against that service, is able to run 20 times longer. That gives them a lot more time to find a real business model and then, hopefully, sustain themselves and compete against those who already have the infrastructure.
So this is targeting consumer products, most likely consumer products that don't yet have a business model, that are trying to find one over time and need a long runway to do so. In a sense, the competitive market is going to create a different niche. And it's not really a vertical we target; it's really a horizontal, for anybody to build on. So maybe over time.
Yeah, correct. On Twitch, if you are a nobody, if you're not yet in their partner program and you have few viewers, your highest quality is the quality of your broadcast: you only send them one stream, and that is what everybody has to consume before a transcoder kicks in. So anybody who is in a situation where they currently cannot sustain the bandwidth of your source stream would not be able to properly watch that stream.
Bandwidth is a limiting factor, and similar to how it worked with proof-of-work mining, where you as a miner had to find a reasonably priced source of electricity, miners now have to find affordable bandwidth as well. We estimate about 10 megabits of upload and download capacity per encoding job, and that's including headroom.
Yeah, the question was whether we charge fees. No, we will not charge fees. The aim for Livepeer is really to build an open-source community around this that can sustain itself. We have created a token that's being used as a utility token to secure our network from Sybil attacks, but it's not going to be used to pay for the service internally. It's ETH that you use to pay for the service, and the Livepeer token really only determines who is allowed to do work in the network.
The Livepeer project has been funded with a private token sale. That gives us enough headroom, about 4 million, with our small team to build the product, get it out the door and get it running, and it is most likely also enough to sustain this for one or two years, to maintain it and to keep going.
But the aim is to become a more decentralized organization in the end, one that is not depending on direct financing but would basically ask those who use the network to also contribute to it. The Livepeer token itself will definitely have some value; in my view, it also acts as a type of software license. But that's not really the point.
There are existing projects out there that are trying to get a proof of replication done, and the Swarm system, for example, uses that methodology, with SWAP, SWEAR and SWINDLE, to mitigate the risk that somebody is not able to pay you for the traffic that you served. But all these concepts are highly experimental, and none of them is yet proven to work or to be easy to integrate. And with video infrastructure it's an on-or-off game: either this works, or it doesn't.
With video you have very little room for users having a bad experience and still accepting the service, especially if you look at what YouTube, Facebook, Twitter and also Netflix do to improve the viewer experience today. Anybody who wants to compete with that has to do at least something similar. So we are currently exploring ways of doing the delivery that do not rely on a financial layer but are more of a free service.
We are currently experimenting with WebTorrent, using HTTP traffic next to it, to sustain a live use case. You would basically have a traditional CDN on one side and a WebTorrent stream on the other, and clients would be allowed to get partial data from the CDN and partial data from other users who are currently consuming the same content. For us that's a very good situation, because a live stream is pretty much the only use case where really everybody gets the same content.
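The hybrid delivery described above can be sketched as a per-segment scheduling decision: pull a segment from the peer swarm when someone announces it, and fall back to the CDN otherwise, with a cap so the CDN always anchors the stream. The cap, names and policy here are illustrative assumptions, not Livepeer's actual delivery code:

```python
def plan_fetch(chunks, peer_availability, max_peer_share=0.7):
    """Decide, per video segment, whether to pull from peers or from the CDN.

    `peer_availability` maps segment id -> list of peers announcing it.
    The peer share is capped so the CDN always carries a baseline of traffic.
    """
    plan = {}
    peers_used = 0
    budget = int(len(chunks) * max_peer_share)  # max segments served by peers
    for chunk in chunks:
        peers = peer_availability.get(chunk, [])
        if peers and peers_used < budget:
            plan[chunk] = ("peer", peers[0])  # cheap: another viewer has it
            peers_used += 1
        else:
            plan[chunk] = ("cdn", None)       # fallback: guaranteed delivery
    return plan

segments = ["seg-001", "seg-002", "seg-003", "seg-004"]
swarm = {"seg-001": ["peer-a"], "seg-002": ["peer-b", "peer-c"]}
print(plan_fetch(segments, swarm))
```

Live streaming favors this scheme because every viewer wants the same recent segments, so swarm availability is high exactly when it helps most.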
At the same time, for us it's working out pretty well. It's different for other use cases; long-tail use cases especially can't really use this, but we are looking into how to make that work as well. Our goal is not only to drive down the price of transcoding; our goal with Livepeer is really to drive down the price of overall video infrastructure.
So currently the whole job negotiation and everything else works on-chain, and that doesn't give us a lot of room for many transcoders. We currently have only 15 transcoders that fight for the jobs in the network, but that was just a proof-of-concept design. Later on, we are thinking about having a limited number of orchestrators or validators that basically sit between the transcoders and those who buy the transcoding resources. That would mean the validators might still be limited, but they don't do the hard work.
They only do partial work, the validation work; the real work happens with the transcoders, and that is then all off-chain and unlimited in number. It is limited, of course, by the physical resources behind it, but it is not limited in the sense that it requires us to go through Ethereum every time we do one of these jobs.
Yeah, you're hitting a point there. So yes, you're right: to make video ingest a success, you have to make sure that you're not too far away from the origin, and one of the new network layers includes a region, so that you would broadcast the availability of your work in a certain region and would not then get a job from Tokyo or from China.
So currently it's x264 and software encoding, so really only CPUs. Later, with the GPU-based verification, we are going to do AMD and Intel, which is coming through VAAPI, and Nvidia, and that basically covers every graphics card vendor out there, and even the graphics chips that are in the Intel CPUs. We are discussing
whether it's also needed that we go to a more ARM kind of infrastructure and see if we can even get Pi boards in. I mean, there are so few devices out there that do not have a video encoding chip these days. It's crazy; most likely even your television has one, even though it's not used.
How do I put this without stepping on somebody's toes? Building apps on Ethereum is really a bad idea right now. What I would need to get the original design to scale, without us doing a huge protocol redesign and without us pushing everything off-chain, would be a fixed transaction cost, but I think that's never going to happen, and for known reasons. So why did we choose Ethereum? Because of the smart contract platform; that's still unmatched.
Is it the best place for us to be? For me, as the product guy, I'm really not convinced; I see so many reasons why we should not. But our smart contract developers are very confident that we can push everything that's bloat right now off-chain and make sure that these use cases then only need Ethereum as the financial settlement layer, basically to do the proofs and challenges that would reveal a cheater in our system.