From YouTube: Principles Seminar Session 7: Privacy
Description
In the seventh session of this 12-part series, join Status's core contributors as they discuss and debate to what degree we uphold our principles, how we can improve our performance, and what we're adding to our Wall of Shame.
B: Additionally, we strive to provide the right of total anonymity. Before going further: we've changed things up a bit, since we're doing this all together, collectively, and right now live. Essentially, I'm going to ask each one of you to say one thing, and then we'll add it to the wall, a Wall of Shame. If you can come up with more things, that's fine, and if they've already been said, that's also okay. And if you really can't come up with something, then come with a question or something.
C: Got it. Yeah, I don't have any ideas there. I would say that, especially when it comes to Studio, I don't know if this is a Wall of Shame item or something to aspire to. One of the things in designing Studio going forward that'll be important is to think about, for example, how much information you want to collect from users, and making sure that it's minimal.
C: So, for example, we've been talking about different versions of Studio, a basic or an advanced one, and let's say a user picks their preferences and says, you know, I'm basic or I'm advanced. How do you retain those settings for them across multiple logins down the line, without necessarily having to collect their email address or their identity?
D: You know, especially if we're talking about making our compensation schemes open to the world, and the identities of the individual contributors open to the world, there are legitimate reasons for individuals to want to stay private within that context. I don't necessarily have something specific, but I think it's something.
F: Yeah, so in terms of the application, I think we've done great stuff with perfect forward secrecy, as well as decoupling the wallet from the Whisper key, which is a great stride towards making the application a lot more private. Honestly, my main bugbear, Oscar, for example, is that if I'm connected with you, you can see my identity inside a public chat, whereas to everyone else I'd be anonymous. That's a privacy concern for me, because it means that I can't act in a public group with the assumption that I'm anonymous. I don't know if that's more a matter of anonymity than of privacy.
I: So for me, it's like we already talked about today with TestFairy and actually collecting any information. In other words, I should be able to control it and understand the consequences, like what actually is collected. And ideally I'd like to have some paranoid mode where I can switch off all the things that somehow influence my privacy, like collecting data. Right now we don't have that; ideally I would like to have it, and then I would be more comfortable.
H: Me and Corey were talking about something like this yesterday, about the privacy of the identity, and there's this problem that hasn't been mentioned yet: in the future we're going to have a solution to spam. If we do everything so privately, then you cannot filter out the spam, or we don't know yet how to do it. And in practice, right now we can have multiple accounts, so that's not really private; so this is not anonymous.
G: Similar to what Sergey said: we tie everything to a single public key. So if you do anything with anybody, they understand everything associated with that public key, whether it be your contact code or all the contents of your wallet, etc.
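The linkability G describes can be sketched concretely: if both your chat identity and your wallet address are deterministic functions of the same public key, anyone who learns one can recompute the other. The encodings below (a hex contact code, a hash-based address) are illustrative assumptions, not Status's actual formats.

```python
import hashlib
import secrets

# One secret backs everything (illustrative; a real client uses an ECDSA keypair).
master_secret = secrets.token_bytes(32)
public_key = hashlib.sha256(master_secret).hexdigest()  # stand-in for an EC public key

# Hypothetical encodings: both are deterministic functions of the SAME public key.
contact_code = "0x04" + public_key                                             # shared to chat
wallet_address = "0x" + hashlib.sha256(public_key.encode()).hexdigest()[-40:]  # holds funds

# Anyone who sees your contact code can recompute your wallet address:
recomputed = "0x" + hashlib.sha256(contact_code[4:].encode()).hexdigest()[-40:]
assert recomputed == wallet_address  # chat identity and funds are linkable
```

The point is not the particular hash used, but that a single root key makes every derived artifact correlatable for free.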
M: Yeah, Oscar, so I think you may have called on me when I was stepping away, I don't know. In terms of that, it could be a matter of perspective. What we'd like to do is open up the entire program to the wider community as much as possible, but we have to strike a balance, especially around the application process: that's where we want to get the wider community involved, but we also need to be respectful.
A: I wanted to add, actually: we had a pretty serious security breach that came up, which is that we published all of our signatures on the principles, and then we basically did not want to publicize that. So it's similar to the public key being known: everybody who signed the principles is pretty much exposed now, if they're continuing to use those keys.
N: Yeah, I think that's a really important point, because naturally, especially as people use third-party dapps, users are going to interact with things and leak data, and we may not have control, or even an understanding, of that. So having some sort of mechanism to say, "FYI, you just exposed this, this, and this; you may or may not have known that," I think is very important.
H: What about privacy for, for example, myself? I know, sorry, not the consequences: I know what I'm exposing, but I don't know the consequences of that exposure. So I know what I'm making public; I know that my random animal name is associated with my name, but I don't know what the consequences of this are.
B: So yeah, there's this quote, "Privacy is dead, get over it," which is something that's been said by people like Scott McNealy of Sun Microsystems. And if you look at the first part of our principles, they say this thing about privacy: the power to selectively reveal ourselves to the world. Going back to it, that specific phrasing comes from A Cypherpunk's Manifesto.
B: You can read it here. The cypherpunk mailing list was this kind of small little mailing list back in the 90s, with maybe, I don't know, not that many hundreds of people participating, which had a sort of disproportionate effect on the internet as we see it today. A lot of core ideas and people and movements and ways of thinking about privacy came from this list: BitTorrent and WikiLeaks, and remailers and PGP, Bitcoin (Bitcoin is mentioned twice), Zcash, and smart contracts; a lot of them.
B: So we discussed this mailing list, and I think there's something interesting here, in that if you look back further, before we were all connected with computers and everything, if you just think about the offline world, we have a sort of natural human intuition for what privacy means. It's your stuff.
B: Encryption is a kind of asymmetrical defense, in that it's very hard to actually crack strong encryption, which leads to the ability to have private communication, end-to-end encryption and so on. But it sort of goes against the grain of Facebook and Google and Instagram and so on, so it's been blunted. This is a natural intuition that humans have had for millennia, but it's sort of missing as we move into the online world, for example.
B: Really, just what I mentioned with Vitalik, in terms of this not actually being obvious at all: he recently changed his stance to be more pro-privacy, largely due to thinking about what it means to not have privacy from a signaling point of view, and the whole Sybil-resistance question and so on. So this is sort of not obvious, I think, and why it matters and so on. In terms of specific guarantees, I think we've talked about it a bit, but there are things like end-to-end encryption.
B: And then you have things like total anonymity, which is sort of where you don't have a specific identity and every message only has the information that's in it. That's an idea that's not necessarily compatible with the way we're structuring things right now, but it's something we can provide to various extents, potentially. Just in terms of understanding this metadata leakage and so on, I want to show you this thing.
B: If you haven't seen it: this is a product by the EFF. I use uBlock Origin and all these things, but I haven't gone all the way; I still use JavaScript and stuff, and I have a bunch of add-ons like MetaMask and so on. What this does is essentially look at fingerprints, specific things like what language you're using, and different cookies, and the fonts you have installed, and so on. And we can see here it said that out of the two million or so browsers that have tested this...
B: ...the fingerprint of my browser is unique, and that means that you can potentially identify my browser session uniquely to me. And you can make an information-theoretical argument here about how many bits of information you're leaking. This is why this metadata is so important: there's so much information we're leaking, and we really don't know how much of it we are leaking. The answer isn't that this is some sterile thing about metadata that nobody cares about. I'll share the same link we've already been through.
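The information-theoretic argument B gestures at can be made concrete: if a browser trait is shared by a fraction p of users, observing it leaks -log2(p) bits of identifying information, and independent traits add up. A minimal sketch, where the trait frequencies are made-up illustrative numbers, not EFF data:

```python
import math

def surprisal_bits(p: float) -> float:
    """Bits of identifying information leaked by a trait seen in fraction p of users."""
    return -math.log2(p)

# Hypothetical trait frequencies (illustrative only).
traits = {
    "timezone UTC+1":   0.20,
    "language en-US":   0.30,
    "this font set":    0.01,
    "this screen size": 0.05,
}

total = sum(surprisal_bits(p) for p in traits.values())
print(f"total leaked: {total:.1f} bits")

# About 21 bits suffices to single out one browser among two million:
print(f"bits to single out 1 of 2,000,000: {math.log2(2_000_000):.1f}")
```

Four mundane traits already leak around 15 bits here, which is why "unique among two million tested browsers" is such an easy bar to clear.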
A: I'll paste the slide as well, because that's a pretty good resource. So, two things I'd like to start this off with. One, I take issue with privacy being something that we've had for millennia. Actually, I think it's a very modern thing that we have just from the last century; there are still a lot of people around the world who don't live with any sort of privacy, in communal groups.
A: And if you look at architecture, particularly in Europe, you know, the rooms have all of these doors; you go through sequential rooms. The sort of classic house layout that we have in America, where people have their own private rooms and their own private bathrooms, is something that's really quite modern.
A: So I'd like to ask in the round, actually: how do each of you see privacy being defined for yourself? What does it mean for you specifically? I'd like to go through the list again in the same way that Oscar did, and I'll start with Hoxie: what does privacy mean for you personally?
F: Yeah, pretty much in line with the others: privacy is the amount of control I have over revealing information.
A: So are we literally at a point in time where privacy is dead, digitally? And how can we actually... well, I don't want to say guarantee that, but how do you guys think that we can actually make that happen with our application?
D: Well, I think, actually, what's going on right now: you know, people have been super excited for the past few years about blockchain, but really, I think what is exciting about this is that it's the largest public deployment of cryptography in history. And so, for the first time, the masses are starting to understand how cryptography works and how we can leverage it, and this gives us the opportunity to take privacy back.
G: I think blockchain was a wonderful application and exposure of what asymmetric cryptography can do. Not only that, it incentivized a lot of research in the area, so we're pushing forward the boundaries of what we know applied cryptography can do as it pertains to privacy and secrecy.
P: I think what you're referring to is the whole issue about the right to be forgotten, which is now in the EU: anyone can request Google to delete data in the search results about themselves. And there's been some controversy about it, because some people, even politicians, are using this to sort of erase...
P: ...you know, news about corruption in the past and things like that. Although it does make sense for some people that maybe did something in their youth that they repent now, and so I think for those cases it kind of makes sense. But yeah, there is this whole issue that once information is public, or there's a trail, so to speak, on the web, it can be really hard to take it back, and there's a whole question there.
H: On encryption, we should remember that the algorithms we are using now are only safe for now. So maybe I can just collect all the messages in the network, and when these algorithms are not safe anymore, because there is new hardware, or I don't know what, then I can start decrypting everyone's messages. That would be something for us to think about as well. That would be a concern regarding "no one can hear what we are talking about", because right now we rely on this.
H: Especially if we're having this line of mail servers: someone could potentially become a mail server to learn who is receiving a message. Not who is sending, because it's not guaranteed that whoever relayed the message is the actual sender, because of the Whisper protocol; but they can actually detect who received the message, and in the future maybe decrypt it.
H: But when I say "the future", this is really uncertain; this is an unknown, okay? Maybe they would never break it. Maybe decrypting the messages would be possible, but would take so much energy that it wouldn't be worth the cost. Maybe, even if it's possible in the future, you'd need to run a quantum computer that costs millions of dollars per hour to do it. So it's just an unknown, and maybe we should try to do something about it.
D: There's also the possibility that some actor out there, like a nation-state, already has the technology to decrypt some of this encryption pretty easily. What makes you think they would disclose to the public, or make open source, the fact that they have this? It would make sense for them to keep it private and then leverage that to have an information asymmetry compared with other nations, right?
B: I think it's important, I mean, it's important to be skeptical, but it's also important to not be overly paranoid and to be realistic, first of all. What I mean by that is that if we can say that, given what we currently know about cutting-edge research on cryptographic algorithms, it seems like this implementation, or this reference, right now, to the best of our knowledge, is safe, then that's as far as you can go.
B: And of course, if a nation-state wants to do targeted surveillance against you, you have a problem, and you're probably not going to be able to secure yourself against that. But dragnet surveillance, or less sophisticated actors, is a completely different picture, and I think it's important to be able to separate those two things out, in terms of guarding against reasonable attacks. Do you have any thoughts on that?
H: That's really important, because that differentiates between the criminals that are trying to just get messages to scam people, attack them, and steal money, and the security programs from governments that have a lot of resources. So this is a good distinction; I agree with you. It's very unlikely that a small individual hacker would be able to do anything with just their computer.
G: I mean, if someone wants to attack you in today's technology landscape, they can. It's more about motivation and resources: if someone has enough of both, then there's nothing you can really do. So a lot of what you're trying to do, I guess, in trying to increase your privacy and security, is just to make it as difficult as possible for someone to do something, so that they give up; you want to drain their motivation and resources.
A: I just posted a link into the chat. "Blind in the Panopticon" is a really interesting piece that Bennett Gupta wrote for the Cyber Institute in England, and his argument is that there are diminishing returns on having this much data, and on the degree to which the state can actually, you know, go after people for being homosexual or being Muslim or whatever. It's a very, very interesting paper that I think a lot of you guys would be interested in reading.
A: What do you guys think we can actually guarantee, or actually be pretty secure about, with our application? What are the things we could list that we're very, very strongly opinionated about as being a type of guarantee? And could we have just really short statements, rather than...
B: It's a good question. I think there's literature on it, but I don't know it in detail. You can look at how many bits of information you're leaking, and at anonymity sets, and so on. I think there's been a bunch of work on Tor and mix networks and so on. I don't know if someone else has more knowledge about this, but I believe there should be metrics for it.
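One common way to make "anonymity sets" measurable: an observer who narrows you down from N candidates to a set of k has learned log2(N/k) bits, and intersecting independent observations shrinks the set quickly. A toy sketch of that intersection attack, with invented user IDs and observations:

```python
import math

def bits_learned(population: int, anonymity_set: int) -> float:
    """Bits an observer gains by narrowing `population` candidates to `anonymity_set`."""
    return math.log2(population / anonymity_set)

population = set(range(1000))  # 1000 users

# Each observation is the set of users consistent with what the observer saw.
obs_online_at_9am  = set(range(0, 500))     # half the users
obs_uses_language  = set(range(250, 750))   # a different half
obs_posted_in_chat = set(range(400, 420))   # a small group

candidates = population & obs_online_at_9am & obs_uses_language & obs_posted_in_chat
print(len(candidates), "candidates left")                       # prints 20
print(f"{bits_learned(len(population), len(candidates)):.2f} bits learned")
```

Each observation is weak on its own; it is the intersection that does the de-anonymizing, which is exactly why fingerprint-style metadata adds up so fast.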
B: On the privacy concerns: you're directly trusting the other party with a direct peer-to-peer connection. So if A is using a mail server to relay messages, the server can still see the envelope you're getting from it, and you get it sent straight to your IP, as opposed to through the general network. So it's a trusted form of relationship.
B: It depends on where the data lives, because under the current model you have data that is specific to a mail server node, and right now the mail server nodes we host have all the data; but in the future that might be sharded differently, and the basic picture could look different. What we're trying to work towards is removing the high-availability requirement, which would mean that the data is spread out over swarms, and then it wouldn't matter which mail server you connect to. So it depends on the specifics of how we implement it.
F: I mean, I'm just thinking off the top of my head: probably the clearest one is going to be the amount of data that's used to send a message. My understanding is that the more private you want a message to be, the more you've got to send it bouncing around as many nodes as possible, to defeat traffic analysis; and the more of that is done, the more resources it takes. So the trade-off is the amount of resources being used to send a message.
G: So some of the work packages on identity we're currently working on allow you to spin up multiple identities within the Status app. You have a public identity, which would basically be your default, where you can choose settings on what you expose to people. And then you can also spin up a number of pseudonymous identities that are completely separate from that, including a separate Whisper key and so on and so forth.
G: And whenever you enter a chat, whether it be one-to-one, group chat, or public, you join as one of those identities, and then you interact with that chat as that person. This should allow you to have better options and settings and better knowledge about who you are in that chat, or what part of you you're exposing to the chat that you're interacting with.
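A common way to implement such pseudonymous identities is to derive an independent keypair per identity from one master seed, so a single backup restores all of them while the public halves share nothing observable. This sketch uses stdlib HMAC as the derivation step; the labels and scheme are illustrative assumptions, not the actual Status derivation (which in practice would follow a hierarchical wallet standard):

```python
import hashlib
import hmac
import secrets

MASTER_SEED = secrets.token_bytes(32)  # backed up once

def identity_key(seed: bytes, label: str) -> bytes:
    """Derive an independent 32-byte secret for one pseudonymous identity.

    Without the seed, derived keys are computationally unlinkable to each other.
    """
    return hmac.new(seed, b"identity/" + label.encode(), hashlib.sha256).digest()

def public_handle(secret: bytes) -> str:
    # Stand-in for deriving a public key / chat handle from the secret.
    return hashlib.sha256(b"pub/" + secret).hexdigest()[:16]

default_id = public_handle(identity_key(MASTER_SEED, "default"))
throwaway = public_handle(identity_key(MASTER_SEED, "public-chat-42"))

# Distinct handles, nothing shared on the wire; the same seed restores both.
print(default_id, throwaway, default_id != throwaway)
```

The design choice here is the usual trade-off: per-identity keys give unlinkability between chats, while the shared seed keeps backup and recovery as simple as a single-identity account.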
A: Let me... there's a topic that's been going on now, as a result of missing messages in the client, where we're looking at increasing the amount of history that we retrieve. Can we talk about maybe making this also a user setting, so that a user could choose himself how long his data is retrievable? Is that doable? So if I say, you know, okay, these messages that I've sent, I'm putting a limit on them; they are, you know, in the like...
H: Yes, we could change the mail server scheme. Maybe, I think, the way to do it would be that you become the mail server for your own messages. For example, you send the encrypted message to a public chat, and whoever wants to see that message needs to request from you the decryption of the field. I think that would be a way to do it.
A: So I'm a large fan of data ephemerality, for two reasons: one, for the security and privacy concerns; and secondly, in terms of perpetuating data on distributed, decentralized networks, nobody's figured out a cost model yet. We have some models out there that are supposed to be working.
A: Can we talk about maybe making this a feature, that the data disappears as a default setting, and then we can look towards having incentive models for how people perpetuate their data? Because I think that's going to be a large issue as we move forward into this new technology. How do you guys feel about that?
B: So Signal has this model, which I like; I was just checking it out today. Disappearing messages, in one-to-one chats; and in group chats you can sort of negotiate after how long messages self-destruct. Unfortunately it doesn't work for me, because it says one week but I can see all the messages. But the idea is nice, in that it's sort of a social negotiation, and then you both honor it, presumably, hopefully.
A: So for me, there are two issues here: there are the privacy concerns, and then there's actually the economic concern. How much of this data do we need to be perpetuating, in this sense? And there's a privacy trade-off there about how long it is that you perpetuate data as well.
A: So it's sort of like paying it forward; or, not paying just means that your data disappears after a really short period of time. There's just a huge amount of flotsam and jetsam and cat pictures and individual, you know, personal "okay"s and "lol"s and whatever that don't really need to be perpetuated in the network. So maybe we could even have a setting where you can say, hey, this specific message I want to perpetuate into the future, for whatever reason you choose to do that, whether it's a privacy concern or an economic one.
H: I think it's great that we can start that just as a trust model, and later, if possible, we introduce crypto-economics into it to ensure that this will happen. Because right now you can say "I want this message to last just five minutes", but once you have a forked client that doesn't respect that, it doesn't work as intended anymore. But I think basing it on a trust model is fine to start.
A: Cool. So we're here at seven minutes before the top of the hour. I'd like to go back up to the top, to all the items that we started with on our Wall of Shame. If you guys want to take a review of that and mention anything else, any additional comments about this long Wall of Shame, which you guys all contributed to, then we can close up this session and come back for the next one.