Description
The folks behind Koda, the world’s first decentralized dog, explain how IPFS + AI is a match made in heaven for robotics.
For more information on IPFS
- visit the project website: https://ipfs.io
- or follow IPFS on Twitter: https://twitter.com/IPFS
Sign up to get IPFS news, including releases, ecosystem updates, and community announcements in your inbox, each Tuesday: http://eepurl.com/gL2Pi5
Hi guys, my name's John Suit. I'm the CTO of Koda USA and I'm on the board of advisors. Essentially, when I was exposed to this technology called Koda, my gut reaction was: this is a really super cool decentralized computing platform in the shape of a dog, which is entirely accurate.
My background has been in cybersecurity for the last 25 years, specifically innovating in decentralized computing: how to secure decentralized computing from metadata attack vectors, and how to get the maximum efficiency out of something like this while preserving anonymity and securing the environments.
So I was particularly interested in Koda. Koda utilizes go-ipfs, with the DHT at the core, and is a pretty amazing robotic dog.
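Since go-ipfs sits at Koda's core, its data is addressed by the hash of the content itself rather than by which machine stores it. Here is a minimal sketch of that idea using a bare SHA-256 digest; real IPFS CIDs layer multihash and multibase encodings on top of this, so treat it as an illustration only:

```python
import hashlib

def content_address(data: bytes) -> str:
    """Return a content-derived address: the SHA-256 hex digest of the bytes.

    Identical content always yields the same address, no matter which node
    stores it, which is what lets any peer verify what it fetched.
    """
    return hashlib.sha256(data).hexdigest()

a = content_address(b"stair-climbing model v1")
b = content_address(b"stair-climbing model v1")
c = content_address(b"stair-climbing model v2")
# a == b (same content, same address), while c differs
```

The DHT then maps each such address to the peers currently holding that content.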
So let's talk about it. Way back in 2018, a group of engineers and designers at Koda really wanted to build a decentralized computing and AI platform. They had experience with different form factors, and what really came to light was: what if you could come up with something that was entirely useful?
Something with that kind of computing and processing power, actually put to use, would be pretty amazing, and this is where Koda comes from. Koda One is yet to be released, guys; this is a very late-stage prototype that will be announced soon for productization. Essentially, what it will allow is for its users to volunteer the computing time on a Koda, which has a multitude of processors, obviously runs go-ipfs with the DHT, and will certainly take advantage of
A
You
know
things
that
are
coming
out
in
six
and
seven
joining
five
right
now,
but
it's
really
set
up
so
that
it
can
allow
a
user
to
select
what
projects,
for
example,
for
decentralized
computing,
but
also
even
in
the
default
state,
allow
it
to
be
a
node
and
that's
that's.
The
idea
to
you
know
proliferate
this
idea
of
decentralized
computing
without
the
hindrances
and
restraints
of
you
know,
using
you
know,
kind
of
the
traditional
traditional
vendors,
let's
say.
The idea here is that there is going to be a Koda store. The only reason Apple's on here is just to give you an analogy of what we're talking about.
Not only will there be decentralized apps that can run on your Koda, or on a Koda whose volunteer time you're using, but there is also a full SDK and API available for access to Koda's vision systems and audio systems and, most importantly, its array of computing systems.
A little bit of an overview, and again, this is really just meant to be an intro; I'm not going too deep into this. I'll go a little bit into the architecture for those who are interested, but essentially: it's very cute. That was one of the first things that came to mind. Like I'm sure a lot of you have, I've seen on YouTube what robotic dogs can look like, and there are certainly very capable things out there. This one has a head, which I thought was a great start.
Having this kind of decentralized computing platform for AI on a dog means all the dog-like behaviors are replicated, not because they're contrived by the creators and the manufacturing, but because of how dogs actually work. For example, there are 14 motors on this dog, and there's a gimbal system for the head. And because of the way the microphones are set up in the head (I just thought this was particularly interesting, so I'm sharing it), like any real dog trying to identify a sound source and receive the most sound waves from whomever or whatever is making that sound,
A
It
will
arc
its
head,
so
kind
of
does
a
cant
you
know
in
order
to,
in
order
to
to
hear
you
know
better
and
that's
exactly
how
a
real
dog
works
and
because
of
the
placement
of
these
sensors
and
how
the
motor
you
know,
they've
gone
through
a
lot
of
effort
and
putting
the
motors
kind
of
how
a
real
dog
works
with
the
you
know,
obviously
the
gate,
but
also
how
it
utilizes
the
sensors.
It's
got
a
very
cool
effect
in
the
system.
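Canting the head changes the arrival-time difference of a sound between the two microphones, which is the standard cue for locating a source. The talk doesn't describe Koda's actual audio pipeline, but a toy version of the underlying geometry looks like this:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def bearing_from_delay(delay_s: float, mic_spacing_m: float) -> float:
    """Estimate a sound source's angle off-center, in degrees, from the
    inter-microphone arrival-time difference (far-field assumption)."""
    ratio = (SPEED_OF_SOUND * delay_s) / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp numerical noise into asin's domain
    return math.degrees(math.asin(ratio))

# A source directly ahead arrives at both mics at the same time (zero delay,
# zero bearing); turning the head until the delay vanishes points at the source.
centered = bearing_from_delay(0.0, 0.2)
```

Turning (or canting) the head is exactly the act of driving that delay toward zero, which is what the motorized head enables.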
For my part of this, given my background in cybersecurity, I'm particularly interested in the protection of not just private and personal data, but also the data involved when a Koda's time is being volunteered. We're particularly sensitive to that; I just thought I'd mention it.
A
So
this
this
I'm
going
to
go
into
much
detail,
but
it
does
use
a
a
fusion
of
cpus.
There
are
several,
as
you
might
imagine
it
has
14
motors.
It
has
actually
five
3d
cameras
this.
This
is
a
little
dated,
sorry
guys.
This
uses
v
slam
some
of
your
newer.
You
know
rumba
vacuum
cleaners,
roomba
vacuum
cleaners
uses
v
slam
to
navigate
and
to
judge
distance
so
does
tesla.
I
believe
mercedes
does
as
well,
I'm
not
totally
sure
on
that
guy.
A
So
please
don't
quote
me,
but
it
is
amazing.
Technology.
A
So
he
uses
v-slam
and
obviously
needs
a
visual
recognition
and
tracking
system
and
intelligent
hearing
technology,
and
this
is
used
for
identification,
not
just
because
you
know
it
needs
to
be
able
to
receive
commands
or
instructions.
A
Audibly,
one
of
the
first
one
of
the
first
ideas
for
using
coders
in
general
are
for
visually
impaired
people,
and
so,
if
you
have
someone
who
would
benefit
from
a
you
know,
living
and
breathing
seeing
eye
dog
you
know,
coda
can
be
particularly
advantageous
as
a
supplement
and
and
sometimes
a
replacement
depending
on,
what's
needed.
But
it
needs
to
be
able
to
act
and
understand
its
user,
just
like
a
living
and
breathing
animal
would
and
to
that
end
it
needs
to
understand,
for
example,
if
it's
owners
under
duress.
So imagine you're at a traffic intersection and the light turns green, but the owner is having an issue and doesn't necessarily want to cross the street at that point. Koda can recognize this as a duress event, depending on the tone, act accordingly, and provide additional information, perhaps adjusting the route via whatever navigation is built into the system. The idea here is not just to use the traditional sensors in the traditional ways.
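The crossing decision described above boils down to combining an environmental cue with the owner's state. This is a deliberately tiny sketch of that logic; the function name and inputs are invented, and the hard part in practice is detecting duress from tone of voice, which is not shown here:

```python
def should_cross(light_is_green: bool, owner_in_duress: bool) -> bool:
    """Cross only when the light is green AND no duress event is active.

    A green light alone is not sufficient: a duress signal from the owner
    (e.g. detected in their tone of voice) holds the dog in place.
    """
    return light_is_green and not owner_in_duress

should_cross(True, False)   # green light, owner fine: cross
should_cross(True, True)    # green light, but owner in duress: hold position
```

The point of the example is the override: the owner's state always takes priority over the traffic signal.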
So this is just a quick summary: smarter, safer, faster. There have been other things out on the market, but this one's core was really built to be a decentralized computing platform. Kodas are destined to be able to teach each other, with their owners' permission, what they learn. If one Koda learns how to navigate a particularly difficult type of stairs, it can share that information with other Kodas, so that when they encounter similar geometry, they can navigate it obviously much better. That's just a very small example, but it really translates across everything these devices are able to learn and share with each other. There is full control over what is shared and received in terms of instruction, but the idea is that Koda is not just going to be a platform for decentralized computing and AI; it's going to use that platform itself in order to learn, and learn quickly.
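The stair-navigation example can be sketched as a small publish/lookup scheme keyed by the geometry a Koda encountered. The talk doesn't describe the actual sharing protocol, so everything below is invented for illustration; keys are content hashes in the spirit of IPFS content addressing, and the owner-permission gate reflects the "with their owners' permission" point above:

```python
import hashlib

# Shared store of learned skills, keyed by a content hash of the geometry
# they apply to. Purely illustrative: a stand-in for whatever distributed
# store the real system would use.
skill_store = {}

def geometry_key(description):
    """Derive a lookup key from a description of the terrain geometry."""
    return hashlib.sha256(description.encode()).hexdigest()

def share_skill(geometry, policy, owner_permits):
    """Publish a learned navigation policy, but only with the owner's permission."""
    if owner_permits:
        skill_store[geometry_key(geometry)] = policy

def lookup_skill(geometry):
    """A Koda meeting similar geometry fetches what another one learned."""
    return skill_store.get(geometry_key(geometry))

share_skill("spiral stairs, 20 cm risers", "climb-policy-v1", owner_permits=True)
```

Matching "similar" geometry rather than identical descriptions would of course need fuzzier keys than an exact hash; the sketch only shows the share-and-fetch shape.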
A little bit about the company: it's headquartered in Mountain View, California. It is a robotics innovation company, and the whole idea is that we would like to harness decentralized computing.
A little bit on the timeline, since a lot of people ask about it whenever we talk about this. We haven't really launched yet, so for the few that have seen what you're seeing today, a little background on us: we began in 2018; the first lab-based prototype was in January, and the headquarters here in Silicon Valley opened up in February.
I'm personally in Maryland, though. July brought the second generation of the prototype, and pre-orders start in September 2020. The fourth-generation prototype will be in October, and then the fifth will be launched in January at the Consumer Electronics Show, where hopefully we'll all be able to see it in person, with delivery in June.