From YouTube: IETF103-PEARG-20181107-1540
Description
PEARG meeting session at IETF103
2018/11/07 1540
https://datatracker.ietf.org/meeting/103/proceedings/
C
So we have four presentations today, and we tried to get a good mix of open-source, academic, and civil-society folks who work on privacy. The idea is that by the next meeting or so we would also have work items to be looking at.

This is the Note Well. You've probably seen this about a dozen times by now; the idea is that you should all take a look at it. It has patent information as well as the code of conduct and a bunch of really useful links.
C
Do we have a Jabber scribe? Thank you — and Kareem offered to take notes. Thank you, Kareem. That was easy. So, we had our side meeting in Montreal, where we heard a lot of opinions and thoughts on where we should be taking the research group. This is the first meeting, and we've already had a lot of discussion on the charter on the mailing list.
E
Thank you. Delaney Elkins — I'm confused: are you asking for comments now, or are you going to ask for comments later?
E
Yeah, I'm a little unclear on what you're looking for, I guess. My comment is more generic: how are you defining privacy? Is it just privacy of protocols, or privacy in platforms, or privacy in a more general sense? There are different implications depending on how you define privacy, right? So that's just a question for you guys, or maybe for the group.
C
If there are no further comments, we can move on to the first presentation. Oh — there's a comment from Nik. Sorry, the comment from Nik: "I thought it was notable that privacy beyond just confidentiality was highlighted, but no particular research items or topics seem to be suggested." Right — I mean, for the first meeting we tried to do some outreach to groups that we thought were doing interesting research on privacy and work around privacy, but like you said, that's definitely in scope, and we would be interested in looking at that.
F
Hi everyone, my name is Bennett Cyphers and I'm a new staff technologist at the Electronic Frontier Foundation. I'll start off with a quick intro in case you've never heard of EFF: we are a member-funded nonprofit, we fight for privacy, free expression, and innovation, and we have over 40,000 dues-paying members worldwide who support us.
F
So I'm going to start with this presentation. This is going to be a presentation about Privacy Badger, which is our browser-extension privacy-protection tool, and I'm going to give a little context first. This is about how we came to build the tool that we built, and it's also a story about protocols and how sometimes they're not enough. Some of you may have been involved with this fight — and please feel free to interrupt with questions or anything.
F
I want to keep this relatively informal, because it's going to cover a lot of ground and I'd like to see what kind of discussions can come out of it. So anyway, back in 2008 and 2009 some people came up with this idea. Corporate surveillance on the internet was coming more into the popular consciousness, and people were getting upset about it for the first time.
F
Well, a larger group of people was getting upset about it for the first time, and so there was this idea that maybe there could be a way for consumers to actually opt out of tracking — the internet didn't have to work that way; not everyone had to be tracked all the time. And some ad companies actually thought this might be a good idea at the very beginning, because they thought they might be able to head off some of the consumer backlash against them by agreeing not to track the small group of people who would opt into this Do Not Track idea.
F
So in 2009 a group of researchers proposed the first version of the DNT signal. This is a header that gets sent with every request for a resource over HTTP. It was implemented as a Firefox extension and gained momentum really quickly, and in 2010 the FTC made a call for a universal DNT system that tech companies and browsers could adopt. It got adopted very quickly: a working group was chartered at the W3C by the end of 2011.
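The DNT signal described here is nothing more than an HTTP request header. A minimal sketch of the browser side (the request construction is simplified; the header name and value are per the DNT drafts):

```python
# Minimal sketch of the DNT opt-out signal: a browser with the
# preference enabled adds "DNT: 1" to every outgoing HTTP request.
def build_request_headers(dnt_enabled: bool) -> dict:
    headers = {"User-Agent": "example-browser/1.0"}
    if dnt_enabled:
        headers["DNT"] = "1"  # "1" means: do not track this user
    return headers

print(build_request_headers(True))
```

Because the signal is a plain header, honoring it is entirely up to the server — which is exactly the weakness the rest of this story turns on.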
F
Pretty much every major browser had support for this opt-in DNT signal — you could go into the settings and say, "I want to send DNT with every request that I send." And in 2012 the Digital Advertising Alliance, which is this giant industry group that represents many of the biggest digital advertisers, said, "We are going to respect the DNT signal, because this is a way that we can avoid further regulation." Everything looked great around this time, around the beginning of 2012 — and then some things changed.
F
Some people got a little zealous. Internet Explorer 10 on Windows 8 launched with DNT turned on by default — that was in the express settings — and IE back in 2012 was still, I don't know if it was still the biggest browser, but they were massive; Chrome hadn't come to dominate the entire market yet. So this was huge: all of a sudden, the number of people who had opted into DNT went from single digits to a low-double-digits percentage.
F
To a very significant chunk of the web — many people who never would have thought about tracking at all otherwise. And the Digital Advertising Alliance immediately backed out. Google, Yahoo, and a bunch of other companies started putting in their policies: "We don't know how to honor DNT, we don't know what it means, we're not going to respect it at all." And that's when we at the EFF — not me; other people at the EFF who had been working on this — were like: things are breaking down, this is not going anywhere.
F
We needed another way to try and allow people to opt out of tracking, and that's when we started developing Privacy Badger. Then, just as a sort of epilogue: about a month ago the final charter for the W3C DNT working group expired, and there was absolutely nothing to show for it. So — Privacy Badger. What was the idea of it?
F
It had to be built by a very small group of people. It also had to be easy to use and cross-platform — we really wanted this to be a high-impact tool — and the final constraint was that we didn't want to use a block list. We didn't want to do the same thing that every other tracker blocker out there does, which is just have a curated list of "these are all the things that we know do tracking and we're going to stop them." And that's for a couple of reasons.
F
Mainly, we didn't want to have that sort of editorial responsibility. But we were also thinking that, in the wrong hands, a block list is a censorship tool: if anyone were to ever compromise our motivations, or surreptitiously force us to do something, it could effectively be used to block big parts of the internet for all of the people who trusted us and used Privacy Badger.
F
So what did Privacy Badger become, and what is it now? Basic mechanics: Privacy Badger is a non-consensual tracker blocker. The first thing it does is turn on DNT in your browser — it sends a DNT signal to every site and every resource that you request — and it looks for a DNT policy posted at /.well-known/dnt-policy.txt on every domain that your browser visits. So any time you fetch a resource —
F
It'll first check whether the domain that hosts that resource has a DNT policy posted, and if it does — and if it's one of the policies that we acknowledge as being a respectful policy — we will allow resources from that domain to do whatever they want. They can perform actions that might otherwise look like tracking actions, because we trust that they will follow this legally binding policy and not use the data that they collect to track.
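The policy check just described can be sketched roughly as follows. The fetch of `https://<domain>/.well-known/dnt-policy.txt` is omitted, and recognizing an approved policy by its hash is an illustrative choice, not necessarily how the extension compares texts; the policy string is a placeholder, not EFF's actual policy:

```python
import hashlib

# Sketch: the extension fetches a domain's well-known DNT policy file
# and treats the domain as respectful only if the text matches one of
# a small set of approved policies. Matching by SHA-256 digest here is
# an assumed detail for illustration.
APPROVED_POLICY_HASHES = {
    hashlib.sha256(b"EXAMPLE DNT POLICY v1\n").hexdigest(),
}

def policy_is_approved(policy_text: str) -> bool:
    digest = hashlib.sha256(policy_text.encode("utf-8")).hexdigest()
    return digest in APPROVED_POLICY_HASHES

print(policy_is_approved("EXAMPLE DNT POLICY v1\n"))  # → True
```

Exact-match comparison is what makes the policy meaningful: a domain can't get the allowance by posting a watered-down variant of the text.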
F
So what are the heuristics? We kind of went for the low-hanging fruit — luckily, there is a lot of very low-hanging fruit in the tracking world. The first one is just third-party cookies. Third-party cookies are still used everywhere, even though there are a lot of tools to block them and they're relatively easy to opt out of, so we consider any third-party cookie with sufficient entropy to be a tracking cookie.
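A "sufficient entropy" test like the one described can be sketched as below. The threshold and the Shannon-entropy estimator are assumptions for illustration; the extension's actual estimator differs. The intuition is that short values like `lang=en` can't identify a user, while long random-looking IDs can:

```python
import math
from collections import Counter

ENTROPY_THRESHOLD_BITS = 32.0  # illustrative cutoff, not the real one

def estimated_entropy_bits(value: str) -> float:
    """Shannon entropy per character times length — a rough upper bound."""
    if not value:
        return 0.0
    counts = Counter(value)
    n = len(value)
    per_char = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return per_char * n

def looks_like_tracking_cookie(value: str) -> bool:
    return estimated_entropy_bits(value) >= ENTROPY_THRESHOLD_BITS

print(looks_like_tracking_cookie("en"))                    # → False
print(looks_like_tracking_cookie("a9f3c2d8e17b4f6a0c5d"))  # → True
```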
F
We also look for local-storage reads and writes with sufficient entropy — those are basically the same as cookies — and canvas fingerprinting: we have a couple of different heuristics to identify the more common types of canvas fingerprinting, and we detect and block that as well. We had to make some compromises to keep the scope relatively contained. The first one is that we assume each domain — each top-level domain plus one, so that's google.com, and not mail.google.com or anything underneath google.com — is either a tracker or not.
F
So once you identify some subdomain's TLD+1 as tracking, that also identifies everything under the TLD+1 as tracking, unless the company has posted a Do Not Track policy on some particular subdomain. We're also assuming that outright blocking resources from domains that perform tracking usually won't break first-party sites, and that's just a usability thing.
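The TLD+1 grouping can be sketched like this. Real implementations use the Public Suffix List to find the registrable domain; this naive last-two-labels version mishandles suffixes like `.co.uk` and is for illustration only:

```python
# Group tracker verdicts by "TLD+1" (base domain): mail.google.com and
# www.google.com share the verdict recorded for google.com.
def base_domain(host: str) -> str:
    parts = host.lower().rstrip(".").split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

tracker_verdicts: dict[str, bool] = {}

def mark_tracking(host: str) -> None:
    tracker_verdicts[base_domain(host)] = True

def is_tracking(host: str) -> bool:
    return tracker_verdicts.get(base_domain(host), False)

mark_tracking("analytics.example.com")
print(is_tracking("cdn.example.com"))  # → True, shares the example.com verdict
```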
F
We have to go on this assumption; otherwise the tool is kind of dead on arrival. And for when these assumptions break down, we have a few different workarounds. The first one is cookie-blocked domains — that's sort of a middle ground between outright blocking resources and outright allowing them.
F
The second is a feature called widget replacement, which I'll talk about later. And finally, we just have user control: every action that Privacy Badger learns to take against every domain can be configured and changed by the user if they want, and that gives them the power to debug issues on sites and to block trackers that Privacy Badger might not otherwise catch.
F
First I'll talk about cookie-blocked domains. These are domains that Privacy Badger has decided are trackers, but that we have also determined are, unfortunately, essential for the operation of a significant number of sites on the web. These domains are not allowed to perform local-storage reads or writes, and they're not allowed to set or get cookies.
F
So someone says, "Everyone who's using Privacy Badger is breaking my site," and then we can allow them to make their case to get on the yellow list. This rests on the assumption that blocking access to these functions almost never breaks a site: if you have something that relies on google.com, and you block these certain functions for scripts that come from google.com, most of those sites should still work — and that turns out to be pretty reliable in practice.
F
Next, widget replacement. This is a really cool feature, and it comes from some research that happened at the University of Washington back in 2011-2012. We identify certain widgets and add-ins from third parties that are common across the web and that also happen to perform tracking — the Facebook Like button is the most common one.
F
Every time you see a Facebook Like button, it has sent Facebook your login cookies — they know that you visited that site — but it's also a useful feature for a lot of media sites. So we have this feature where we identify that a Facebook Like button is going to be loaded, and we instead insert our own resource that looks like the Facebook Like button, but we don't allow the actual Facebook Like button to load.
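The replacement decision can be sketched as follows. The URL patterns and return convention are illustrative, not Privacy Badger's actual widget list or extension API; the idea is that a matching request is cancelled and a local placeholder is shown, which only loads the real widget after an explicit click:

```python
# Sketch: requests matching a known tracking-widget pattern are replaced
# with a local placeholder unless the user has clicked through.
KNOWN_WIDGETS = {
    "facebook.com/plugins/like": "Facebook Like button",
    "platform.twitter.com/widgets": "Twitter embed",
}

def handle_request(url: str, user_clicked: bool = False):
    for pattern, label in KNOWN_WIDGETS.items():
        if pattern in url and not user_clicked:
            return ("replace", label)  # cancel request, show placeholder
    return ("allow", None)

print(handle_request("https://www.facebook.com/plugins/like.php?href=x"))
# → ('replace', 'Facebook Like button')
```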
F
Yeah — certain sites also need cookies, so putting them on the allow list isn't enough to get them to work; other sites need referrers, that kind of thing. This is a bit labor-intensive — we have to do a good deal of work to build each one of these widget replacements — but it is really effective, and it's a great way to deal with things like Like buttons and embedded media.
F
Finally, the ultimate fallback is user controls. If you open up Privacy Badger in your browser — the little tooltip — you will see a list of all the third-party domains that have been loaded on the site that you're on: the ones Privacy Badger has identified as tracking, the actions it has taken (whether a domain is being blocked outright or cookie-blocked), as well as all the domains that it doesn't think are tracking. And you can manually set what actions you want Privacy Badger to take for those domains in the future.
F
The first one — and this is an ongoing effort; it's kind of been around from the start — is limiting first-party and software-specific features. The big one is outgoing link tracking. If you've ever used social media — Twitter or Facebook — on Twitter you'll notice that every link that you tweet is wrapped in a t.co link. That has the dual function of shortening the URL — Twitter says it adds security features — but, more importantly, it allows Twitter to see every time you leave their site.
F
And where you're going. Facebook does the same thing; theirs is a little less transparent. We have some pretty cool blog posts about how those features work and how Privacy Badger's counter-features actually work to stop them. This is something that — well, it's not quite third-party tracking in the original sense that Privacy Badger was designed to block, but it is third-party tracking in the sense that users understand it.
F
You're taking an action to leave a site, and that site really doesn't — shouldn't — have anything to do with where you're going afterwards, and so users feel that this is something Privacy Badger should block. Another one is that we are trying to build heuristics to identify what we call first-to-third-party tracking. These are trackers like Google Analytics. The way Google Analytics works, they don't use any third-party cookies.
F
Every site that uses Google Analytics sets its own first-party Google Analytics cookies, and then has JavaScript which makes requests to Google Analytics with the values of those first-party cookies in the URL parameters. So it's not quite as easy for Google to link one user who visits two different sites with Google Analytics enabled — to link those two visits together — because they don't have the same persistent cookie ID. But obviously there are a lot of other ways for Google to correlate those two visits.
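The core of the "cookie sharing" heuristic described here can be sketched as: flag a third-party request whose URL parameters carry the value of one of the page's first-party cookies. The minimum-length filter and exact-value matching are assumed simplifications of whatever normalization the real heuristic does:

```python
from urllib.parse import urlparse, parse_qs
from http.cookies import SimpleCookie

def shares_first_party_cookie(request_url: str, cookie_header: str) -> bool:
    """True if a first-party cookie value reappears in the request's URL params."""
    cookies = SimpleCookie()
    cookies.load(cookie_header)
    # Ignore short values ("en", "1") that would match by coincidence.
    cookie_values = {m.value for m in cookies.values() if len(m.value) >= 8}
    params = parse_qs(urlparse(request_url).query)
    url_values = {v for vals in params.values() for v in vals}
    return bool(cookie_values & url_values)

print(shares_first_party_cookie(
    "https://www.google-analytics.com/collect?cid=GA1.2.12345678",
    "_ga=GA1.2.12345678; lang=en",
))  # → True
```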
F
Once you've made two requests from two different sites — if your IP doesn't change, which it usually doesn't, they can link those visits; or they can use things like TLS session resumption, or just TLS sessions in general. So we're working on features to identify and block those. Web beacon requests are another thing — that's a relatively new feature in HTML.
F
So we just have a cloud compute instance running Selenium — a headless browser — with Privacy Badger installed on it, and we browse around the web and measure all the trackers Privacy Badger sees and identifies on its trip around the web; we'll visit a few thousand to around ten thousand sites. The other thing this lets us do is ship Privacy Badger not with a curated block list, but with a pre-trained list — so the first time a new user installs Privacy Badger —
F
Nowadays it will block most common trackers right away, because it's as if the user had already been browsing through all these common sites. It's all heuristic-based, but it does ameliorate the problem we used to have, which was that people would install Privacy Badger, go to a web page, say "oh, it's not blocking anything — it doesn't work," and immediately uninstall it. It also allows us to test out new heuristics and measure their impact on the web.
F
So this is just the result of a recent scan we did on the top 10,000 sites of the Majestic Million, and these are the most common trackers we identified. facebook.com was by far the most common: they use third-party cookies everywhere, and they have the Facebook pixel on, as you can see, around a third of the popular web.
F
Google was the second most common, then Twitter — it's all of the social media companies these days, but it's really mostly a two-horse race, or sort of a three-horse race if you count Twitter; DoubleClick, of course, is also owned by Google. And this is the result of a test of our new cookie-sharing-pixel heuristic, which is designed to identify that first-to-third-party tracking case and to catch, specifically, Google Analytics. As you can see, we identified seventy-something instances of Google Analytics tracking in the scan.
F
The only way you can really do it is with a VPN, using the VPN API: you can have a VPN that just grabs every request before it leaves your phone, and you can filter things there. But thanks to the rise of other great privacy features like HTTPS and, in the future, DNS over HTTPS and things like that, it's going to be really difficult to filter traffic even in a VPN environment. Obviously we're not advocating for people to roll back this stuff —
F
The progress we've made with HTTPS — but it does present these unique challenges. It's great that the NSA can't surveil people, but it's going to be a lot harder to stop corporations from surveilling people in environments like this. Anyway, I think I might have used up a little too much time — that's fine — but if anyone has any questions...
G
So this is one of the issues with HSTS, for example: HSTS creates the ability to set a kind of meta-cookie, by serving someone a bunch of domains, giving some of them HSTS and some of them not, and then observing later whether they come back over HTTPS or not. So by shipping Badger's pre-trained set, what you're actually doing is giving people a baseline that protects them from actually having that same pattern used against them — at least for the things that you've pre-browsed.
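The HSTS "meta-cookie" just described can be sketched as an N-bit identifier: the tracker serves N probe domains, enables HSTS on the subset corresponding to the set bits, and later reconstructs the identifier from which probes come back over HTTPS. The domain names are illustrative:

```python
# Encode a user id into which probe domains get HSTS; decode it from
# which probes later return over HTTPS. 8 probes = 8 identifier bits.
PROBE_DOMAINS = [f"p{i}.tracker.example" for i in range(8)]

def encode_id(user_id: int) -> set[str]:
    """Domains to send the HSTS header for: the set bits of user_id."""
    return {d for i, d in enumerate(PROBE_DOMAINS) if (user_id >> i) & 1}

def decode_id(https_probes: set[str]) -> int:
    """Reconstruct the id from which probes arrived over HTTPS."""
    return sum(1 << i for i, d in enumerate(PROBE_DOMAINS) if d in https_probes)

assert decode_id(encode_id(0b10110101)) == 0b10110101
```

Shipping a pre-trained set means many users arrive with the same HSTS baseline for the pre-browsed domains, which blunts exactly this per-user bit pattern.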
G
F
H
F
F
F
So we have to request the most permissive ones — but yes: we instrument some of the things, like local storage and some of the canvas functions, in order to detect tracking mechanisms. The important thing to note, though, is that no data ever leaves your browser and goes to EFF or anywhere else.
F
Unless
you
send
an
error
report,
the
only
time
your
browser
makes
your
browser
makes
two
kinds
of
requests:
uff
one
is
once
a
day
it
makes
it
get
requests
you
get
the
new
version
of
the
yellow
list
and
the
other
thing
is:
if
you
ever
decide
to
send
an
error
report,
it
will
include,
like
the
information
that
you
tell
us
an
error
board,
but
never
otherwise
does
any
information
lead.
Your
pastor.
Thank.
C
F
I
Mallory Knodel from ARTICLE 19. You may not know this because you weren't around then, but I was just wondering whether there was a discussion about potentially developing Privacy Badger as part of the W3C group that was working on DNT — and, if there was, why you decided not to do that.
J
D
F
Yes — so there are, unfortunately, not a lot of ways within the WebExtensions API that we can either detect that you're using other extensions or interact with them at all. So there's a problem where sometimes — like, I personally use uBlock Origin, and I think it's great alongside Privacy Badger — but sometimes, especially with surrogate scripts (another feature we have, which I didn't mention), it will —
F
F
K
Thanks for developing this — I think it's great. Just a comment here: it was not explicit, but I like the fact that you don't have any kind of metric data going back from the users to you. In a project I did, we had the browser running somewhere else, in a cloud, and did all the analysis and all the measurements there, collecting all that data.
K
L
F
That's a good question. So, fortunately and unfortunately, Privacy Badger is not that big in the scheme of things — we have a few million users, single digits — but there are much bigger ad blockers, and most of the tracking companies operate at scales where they really just don't care about the revenue they lose from our users.
F
On the one hand, that means we can't apply as much economic pressure as we would like against these companies; on the other hand, it means it's usually too much effort for them — it's not worth it for them to go out of their way to evade Privacy Badger. Furthermore, it's pretty easy to identify new tracking methods in the wild. Fingerprinting, for instance, is basically only used to circumvent anti-tracking tools: when people decide to block cookies —
F
Fingerprinting is kind of a backup that companies can use. But it took a long time to develop a lot of these fingerprinting scripts, and once you know what to look for, it is extremely easy to find and block them. So I think, in the end, if there is an arms race, tracker blockers and users have the advantage — we're ahead for now.
C
F
M
F
Yeah — so we have a DNT policy that we have written — our lawyers have written — that's available at our /.well-known/dnt-policy.txt, and we encourage everyone to read it and use it as guidance for how to respect users' data. Beyond that, generally, as a site owner, the more you can limit the access you give third parties to users on your site, the better. But no, we don't have a specific guide — I don't think — though that might be useful to write up.
N
[Name inaudible.] I love the tool — thank you; I've been using it for years. But I also have an ad blocker in the same browser, and sometimes it's a little bit unclear to me who is doing what. Suppose I removed the ad blocker: would I get ads then, even if I use Privacy Badger?
F
You'll get some ads. Most ads perform tracking as well — we don't block ads; our policy is not to block ads — but because most ads use tracking, we end up blocking most ads just kind of coincidentally. If you did remove your ad blocker, you would probably start to see some ads in some places.
F
And some more content. The other thing is: if you open up the Privacy Badger widget at the top right of your browser, you can see all the domains Privacy Badger has identified and decided to take action on. So it's kind of an easy experiment: disable your ad blocker, look at the domains Privacy Badger sees, then enable it again — the difference will be what was getting through your ad blocker to Privacy Badger. Thanks.
P
Awesome, so we will start. Okay, so hi — my name is Sofía Celi, this is Jurre van Bergen, and we work in São Paulo, Brazil, for Centro de Autonomía Digital. Today we are going to present this talk, called "No evidence of communication: OTRv4," which is basically a presentation to introduce you to version 4 of the Off-the-Record Messaging protocol.
D
All right, so what is OTR? OTR stands for Off-the-Record Messaging. It's an academic paper that was first published in 2004 by Nikita Borisov, Ian Goldberg, and Eric Brewer. Version 3 provides encryption with forward secrecy for all your messages; authentication, so you can rest assured you're talking to the right person; and, most importantly, deniability: anyone can forge messages after a conversation, but during the conversation you're assured the messages you see are authenticated and unmodified.
D
Before version 3 we had version 2, which was released in 2005; version 3 was published in 2012. More implementations followed in different languages, and it has seen clients on most popular operating systems. A specification was released and has been implemented in a library called libotr, and in clients such as Pidgin OTR. Moxie Marlinspike and Trevor Perrin, among others, drew inspiration from the OTR protocol for their Signal protocol, which has been widely used by lots of people around the world. Now — next slide. Okay.
P
So basically, that was a very brief introduction to what OTR is, and the consequent question is: why do we need a version 4? We already have different versions of OTR. To answer this question we mainly have to look at the main purpose of OTR — and the main purpose of OTR was to give deniability.
P
So in OTR version 4 we wanted to improve the definitions of deniability. The current academic cryptographic literature has already established good definitions and types of deniability: we now have online and offline participation deniability and message deniability, and we wanted to include all of these types in OTRv4. What do online and offline deniability mean? Deniability in OTR's terms is defined not as denying a conversation, but rather as saying that anyone could have participated in that conversation.
P
So "online" basically means that, during the conversation, no one can prove that you participated or that you sent a specific message; "offline" means that, after the fact of the conversation, no one can say that you participated or that you sent a specific message in it. Together this basically means that you can deny having participated in a conversation, or having sent a specific message in the conversation.
D
All right. So currently there's the specification, which is on github.com/otrv4/otrv4 — at least the full specification and some of the architectural decisions that have been made along the way. We also have an implementation called libotr-ng, which is written in the C language at this point in time. Next to the library, we have a client implementation that's underway, called pidgin-otrng.
P
Okay — I'm sorry, my audio stopped for some reason — so yeah, as I was saying: we have different deniability properties in OTRv4, and we also have different security properties that we wanted OTRv4 to have. First, we wanted to raise the security level of the overall protocol to 224 bits by using elliptic-curve cryptography. We also wanted clear notions of forward and backward secrecy.
P
Beyond the messages an attacker was able to decrypt, we also wanted to implement a new network model, in the sense that OTRv3 had limitations: it was not possible to do OTRv3 with offline messaging or on an out-of-order network, whereas OTRv4 actually supports offline messages and out-of-order networks. We also wanted to improve the cryptographic primitives.
P
That's because the current academic literature has a lot of great cryptographic primitives that are not used in practice, and we wanted to use these new primitives, which are secure enough to be used. So, basically, how OTRv4 looks in practice: you always have two participants — Alice and Bob — and each of them advertises that they support OTRv4; then they go ahead with the rest of the protocol.
D
So let's talk about the double ratchet. Because of the way online communication works today, we can achieve asynchronous messaging — like the Signal protocol does — through the double ratchet. Even if the other participant is offline and you still want to send them a message, there are prekeys at a prekey server, which allow a user to receive messages that he or she is able to decrypt even when not online; when you come online, these prekeys let you decrypt the messages.
D
This also comes back to the deniability part. We have two ways of doing verification: one of them is the SMP — the Socialist Millionaires' Protocol — and the other one is fingerprint verification. Through the SMP you can ask the other party a question with a shared answer that you have discussed before, and thereby authenticate the person in question; or, if you have already met and you know they're not going to change fingerprints anytime soon, you can do verification through fingerprints.
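The prekey flow described above can be sketched as follows. The class and method names are illustrative, not the OTRv4 wire format: Bob uploads one-time prekeys while online; while he is offline, Alice fetches one and encrypts her first message to it; each prekey is handed out only once:

```python
import secrets

class PrekeyServer:
    """Toy prekey store: upload bundles, hand each one out exactly once."""
    def __init__(self):
        self.store: dict[str, list[bytes]] = {}

    def upload(self, user: str, prekeys: list[bytes]) -> None:
        self.store.setdefault(user, []).extend(prekeys)

    def fetch_one(self, user: str) -> bytes:
        return self.store[user].pop(0)  # consumed on fetch, never reused

server = PrekeyServer()
bob_prekeys = [secrets.token_bytes(32) for _ in range(3)]
server.upload("bob", bob_prekeys)
pk = server.fetch_one("bob")          # Alice gets Bob's first prekey
assert pk == bob_prekeys[0] and len(server.store["bob"]) == 2
```

One-time use is the important design point: reusing a prekey across senders would undercut the forward-secrecy and deniability goals the talk describes.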
D
We actually have an implementation currently — libotr-ng — which is written in the C language. We also have a plug-in for Pidgin, called pidgin-otrng; I think some of it is actually already packaged for the Arch Linux distribution. Our plan is to also package it for Debian and Ubuntu, and hopefully we can find some people to package it for operating systems like Fedora and others. There's a prekey-server implementation, so you can do asynchronous encrypted communication — the double-ratchet protocol — and there are two more implementations underway.
D
At the moment there's one in Java, called otr4j, by Danny van Heumen, and one in Golang, by a developer working at Centro de Autonomía Digital. Hopefully we will see more implementations in the future that expand into the mobile ecosystem as well. The great thing, of course, is that OTRv4 is not just theory or academic literature — quite the opposite: it's an applied privacy-enhancing technology that is free and open source, which connects theory and practice. Next slide, please.
D
So these are some of our git repositories that you can find online. The first one, otrv4/otrv4, is the full specification, which can be checked out. otrv4-prekey-server is our prekey implementation, based on the free and open-source Prosody XMPP server. The library can be found there as well, along with the plugin for the Pidgin IM client. Next slide, please.
D
So here you will see the otrng prekey server and the other components that are part of the prekey implementation, as well as the toolkit. The toolkit is interesting because it allows you to actually forge part of the transcripts after you've ended the conversation. All right — next slide. That's our talk, thanks. I hope it was actually followable, even though we had some technical issues.
D
Oh — I'll answer the first question first. So MLS is a whole messaging protocol, where you have lots of entities and different parties in the system that all have to be set up to run the protocol, while OTR is more of an add-on: you run it on top of any existing message channel you have, and then you can send your messages encrypted over it.
L
P
I didn't hear the answer of the person who just spoke, but MLS, from what I have actually been reading, does not provide deniability, and one of the main purposes of OTR is to provide deniability. OTRv4 is actually also a big protocol, because it also tries to achieve security, privacy, and encryption of all the messages, as well as forward and backward — post-compromise — security, which was also one of the aims of MLS; but OTRv4 also has deniability, in all of its different types.
P
Q
Stephen again — sorry, just to clarify my question: I understand the difference between OTR and kind of large-group messaging with MLS. What I don't understand is — you know, one of the efforts in MLS is to try and improve the cryptographic primitives and get them widely studied by the community, by cryptographers.
Q
P
I completely agree with you. One of the aims of OTRv4 was basically that it also becomes an inspiration protocol. One of the reasons we have OTR is that it was an inspiration for other kinds of protocols — Signal took inspiration from it, as did other protocols — so one of OTRv4's aims is also to be an inspiration to other protocols. If OTR updates its cryptographic primitives, other protocols may update as well.
R
P
Sure, yeah, sorry, I'm breaking up, but yeah, okay. Participation deniability means that no one in a conversation can actually say that you participated in that conversation; no one can actually prove that. Per-message deniability means that no one can prove that you sent a specific message during that conversation.
S
Ekr. I have a perhaps different take on this. So I think, to Paul's point: MLS and other messaging systems and OTR obviously have a lot of things in common, like the very security properties that you were indicating. I think that certainly, if people from OTR wish to be involved in MLS, that would be extremely welcomed; I think I speak for everybody involved in that one.
S
On the sort of question of heavyweight: I mean, obviously a group messaging protocol in the asynchronous setting is fairly heavyweight, but when you scale it down to one-on-one, MLS is much less heavy. To the point about deniability, which is kind of interesting: as you say, there are two kinds of deniability. MLS makes no attempt to have deniability for the setup of the communication at all, and I'm not sure, actually, that that's even possible in a multi-party setting, or how to think about that.
S
On the other hand, MLS is just agnostic on the question of deniability for messages; the protocol isn't fleshed out enough to specify whether that's going to be the case or not, but there have been real discussions about making that work. So again, you know, we would certainly love to have your participation if you're interested in working on MLS as well, yeah.
P
That is awesome. We have been looking at some of the specification of MLS, so yeah, it would be nice to participate, yes, of course, great. Look, deniability is still an unsolved problem right now; there are some cryptographers that have actually been looking into this and generating some papers around it, and this is something with respect to previous versions of OTR.
A
Very good. Well, thanks for the invitation. So I decided that maybe the best way is to do a summary of everything here at the start, because I'm not sure how much time we'll have to get into some of the detailed things, so this tries to go from a high level to a little bit lower level.
A
The idea of pluggable is that it's written to a specification that's being worked on actively by a small group of people interested in this area, and we need to be pluggable because the half-life of an obfuscation technology is pretty short, and the developers of those technologies and the developers of software that use them need a way to get those technologies into the field. I'm going to mention here, first of all, that obfuscation requires a partnership between a client and a server: software on the device and software in the network.
A
Just a quick point about why we should care about surveillance. You know, over time we've started to learn that surveillance runs sort of at cross-purposes to individual privacy, and privacy is among the core concepts we're interested in in human rights. I just wanted to note here that Guardian Project, with whom I'm involved, works mostly in mobile and mostly on mobile networks; so although there's a wider space out there, our expertise tends to be in mobile and the opportunities and problems of the mobile operating systems and mobile networks.
A
D
A
This kind of data that's produced by routers started to get used for purposes other than routing. As the hardware and software technology in router boxes got more and more powerful, it became possible to look at every aspect of every packet, not just the IP headers.
A
This has been rolled out over some relatively small number of years in a relatively small number of places, but it's a growing threat, because of course you can start looking at the contents of every packet in your network; and even though we have encryption, there are still recognizable patterns in the packets produced by that software that allow you to know, or suspect, that the packets are of a certain type, and then start taking action on that. So you know, we've heard the analogy of the postman
A
reading your mail, and even regular citizens get worried about, wow, what would happen if the post office read all of my mail. So now we're kind of in that space on the Internet, and some people would like to have more privacy than that, so we are working on this area. So can you defend yourself? Yes, through obfuscation. There are two ways, basically, to obfuscate. One (I'll look at number B here first) is that you obfuscate the intended direct service address; the other is to obfuscate what's actually in the traffic.
A
But this idea of obfuscating, first of all, requires that the traffic gets un-obfuscated before it reaches the actual service destination; it doesn't make sense to deploy obfuscation technologies that every website or every application service has to understand. So there's a transforming stage and an untransforming stage. It's also important to recognize that the kinds of techniques we use tend to be unsuspicious at one point in time and then suspicious at another point, and so we need pluggable obfuscation.
A
A
These technologies act at the network transport layer, so applications don't see them having their effect; applications still just continue to write and read bytes from what they think is their communication endpoint, and the transforms happen independently of that. There are three techniques. The first here, fronting, is to obscure the address of the service; the others, scrambling and shape-shifting, actually obfuscate the data that's in your communication. The latter two are sort of demarcated in that scrambling just changes the stream of bytes
A
somehow; there are a variety of ways that have been looked at to do this. Shape-shifting is sort of a special case of that, where you are making your byte stream look like somebody else's byte stream. In that case, maybe music streaming is deemed to be benign in today's network landscape, and so people don't surveil music streaming, whereas they might surveil something else. So shape-shifting is a subset of scrambling, but it serves a very specific purpose.
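The scrambling idea described above, turning a recognizable byte stream into one that looks uniformly random to a DPI box, can be illustrated with a toy XOR stream cipher. This is a minimal sketch for intuition only, not any real pluggable transport; real designs such as obfs4 also obfuscate handshakes, packet lengths, and timing:

```python
import hashlib

def keystream(key: bytes):
    """Infinite pseudorandom byte stream derived from a shared key (toy PRF)."""
    counter = 0
    while True:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        yield from block
        counter += 1

def scramble(key: bytes, data: bytes) -> bytes:
    """XOR data with the keystream; applying it twice restores the original."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))
```

Because XOR is its own inverse, the same function serves as the untransform at the bridge, while an on-path observer sees only bytes with no recognizable plaintext structure.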
A
So, just to be clear, we do have a two-party system here: a client, software that's built on the device, and then software that exists in the network to do the obfuscation, the transform and untransform. That's important because the pluggable transport has to know where its partner endpoints are; we'll discuss this a little bit further.
A
This idea of a bridge is common to what goes on with Tor and what goes on with VPNs, and it should be noted that these service endpoints, these service intermediaries in the network, pardon me, are often the targets for censors and surveillance engines, and therefore their addresses tend to be "mercurial", is a nice way to put it. Managing the configuration of these service intermediaries,
A
these bridges, is a big problem for VPNs and Tor, and it increases with pluggable transports, since each kind of transport has to have its special kind of bridge. There's also work underway to look at crowdsourcing bridges, that is, making the number of bridges available impossible to block; but of course that can mean impossible to find as well. Since we're client-side people, and specifically mobile, I can really talk mostly about this: there are two ways to implement this technology on the device.
A
One is as a VPN, or virtual private network, and that means that we can use an operating system interface to intercept the traffic from every application and work our magic on it. This is a very narrowly defined interface in both iOS and Android, for a very specific purpose: specifically, the VPN-style purpose that allows companies to tunnel their private traffic through the internet.
A
So this is just a very rough, very simple diagram. The black line inside the blue square there means that there's a nice standoff between an application and how it's written, and how the traffic is intercepted by the transports. You'll notice that each transport has a bridge or multiple bridges, and then the application services are essentially beyond the bridges, outside the domain of the monitoring entities. So currently we have several types of obfuscation relatively easily available.
A
Here we have ScrambleSuit and obfs4, the obfuscating proxy; those are the first two, and those are scrambling traffic. Meek is a transport that hides the address to which you're going, and format-transforming encryption is the start of something to actually shape-shift traffic. These are a little bit farther out; we're working on the latter one here, and Snowflake, which, you know, is
D
A
systematizing, or making the technology we have fit better into the operating system so that it's available for applications in a cleaner way, as well as, hopefully, promoting a more standardized ability for the software to get out into the field. We're also going to look at this bridge crowdsourcing idea, as well as, you know, improving bridge configuration.
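The fronting technique mentioned above, as used by meek, can be sketched as follows: the censor-visible layers (DNS, TLS SNI) carry only a popular front domain, while the true destination travels in the encrypted Host header for the CDN to route on. This is a hedged sketch that only builds the request rather than sending it, and all host names here are hypothetical:

```python
def fronted_request(front_domain: str, hidden_host: str, path: str):
    """Build (outer_domain, raw_request) for a domain-fronted fetch.

    The network observer sees a connection to front_domain; the Host
    header naming the hidden service is only visible inside the TLS
    tunnel, after which the CDN routes the request to hidden_host.
    """
    raw = "\r\n".join([
        f"GET {path} HTTP/1.1",
        f"Host: {hidden_host}",  # hidden from the on-path observer
        "Connection: close",
        "",
        "",
    ])
    return front_domain, raw
```

The design choice is that blocking the transport now requires blocking the entire front domain, which censors are often unwilling to do.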
A
This work has kind of evolved from being a Tor-only thing, where pluggability was not part of what they were trying to do for anybody besides themselves, to something where pluggability has, you know, started to happen, especially once there were providers other than Tor interested in this kind of idea. There is a 2.0 specification for this, but we have to be aware that there are some pretty heavy incompatibilities within the mobile programming models that make this entire thing a challenge. So the opportunities for standardization are to look at,
A
you know, mechanisms for delivering new obfuscation technologies at a lower level of the stack. There's also a possibility to look at ways that you could negotiate an obfuscation technology; this is again trying to reduce the problem of many bridges of many types. And then the last is simply to inform allied standards people working in this area about the work that we're doing.
A
I'll leave this slide up just to advise that it's Internews, the organization out of Washington DC, that's been funding a lot of this work. The folks listed here are all active in either promulgating or doing initial early development work in this area. It's pretty simple: go to pluggabletransports.info to get virtually everything being done currently in this community. So I think that's it.
Q
A
Q
T
Two points. One, and I don't know how relevant this is, but there's actually an IETF working group meeting in the same session right now, the transport services working group, that deals with pluggable APIs for transports. I actually have no idea how relevant it is to what you're doing, because I'm not an expert in either.
M
T
You know, it's sort of interesting with deep packet inspection; sometimes that's just realizing the internet security model that we've been trying to use for a long time, assuming that the network is untrusted. So I think there's, to some extent, an intention to do what Stephen was proposing, of trying to make the legitimate traffic look entirely uniform. Certainly in the QUIC working group that's been coming up a lot: how can we make the bulk data transport look basically random? Yeah, thanks.
A
V
Kathleen Moriarty. So this sounds interesting, but have you thought of the case where this is being used and both endpoints have become compromised? How do you detect this? Because what you're left with is your endpoint that's been compromised to reveal data. So if you bring it back to your mail analogy: how do you detect anthrax? How do you detect bombs? Yeah.
V
So, you know, I get the concern of protecting data, and there are lots of ways to do that on the wire. I just want to make sure we're also considering security concerns and not just privacy, right? So how does that fit in, how can that factor in? I'm not trying to say don't do any of this; I'm just saying think about security too.
A
So I would have a long answer, in one way, about that, so we'd better talk about that later. But primarily, with the earlier versions of obfs, the obfuscating proxy, for example, the net effect was "we figured it out, we're blocking you anyway". So it stops working, as opposed to uncovering something; do you see the difference? We're not exposing anything that wasn't previously exposed; it's just that the secret got learned, and it stopped working.
V
E
Nalini Elkins. I just wanted to wonder about another possible tension, too, which is that sometimes one wishes to inspect traffic for potentially benign ends, like maybe wanting to detect child pornography, something like that. I just wonder if that's something that you guys are taking into consideration.
A
Yeah, I can't speak for the actual implementers of this, but yeah, as I mentioned in one of the early slides, there is a tension and a problem between, you know, benign monitoring, hardcore surveillance, and actual censorship, and in trying to develop a technique that is sensitive to the ones that we want to block.
D
A
C
W
So this paper was with Lukasz, Steve and Arvind, and it's a paper about a certain web standard called the Battery Status API, assessing the privacy of web standards and the standards-making process using this API as a use case. One correction: Steve is now at Mozilla, although he wrote this paper in a Princeton University capacity, let's say. And the next slide, please.
W
W
Okay, so the WebVR API allows you to browse the web using your VR headset, and one of the things you need to do to optimize the experience (next slide) is to adjust your interpupillary distance; this is the distance between your eye pupils. Next, yeah. So this API exposes the distance between your eyes to the website, and I think, as you can see, this is a privacy issue.
W
So W3C has some mechanisms in place to curb this, or to address privacy issues with web standards, and one is a self-review questionnaire. Next slide. This includes questions that standards editors need to address during the standardization phase, for instance whether the new API that's to be introduced involves personally identifying information or high-value data. Interestingly, one of the mitigation strategies outlined in this questionnaire, if you look at 4.3, is "drop the feature", that is, remove the feature. So, next.
W
There's also an interest group called PING, the Privacy Interest Group, which, among other things, provides guidance and advice during standards development. Next. So let me briefly talk about the Battery Status API. It's a JavaScript standard, a Recommendation by W3C, and it's just a way to expose battery information to the website via JavaScript. It's a very simple API.
W
The surface is: the website can read the charge level, whether your battery is charging or not, and the estimated time to charge or discharge the battery in seconds. Next. So the API was first introduced in 2011 and came a long way, and after some research, research upon research, basically the API was removed, unshipped from browsers including Firefox and Safari.
W
W
W
In 2015, researchers including myself and Lukasz published a paper showing that actually a website can do a lot using this API. One is that you can simply use the battery charge level to distinguish, for example, similar computers behind a NAT; for example in, say, an enterprise setting where you have very similar computers whose fingerprints would otherwise be the same, you can distinguish them using their battery level.
W
But, more importantly, we found that Firefox on Linux was exposing the battery level with very high precision: it's a double floating-point number. Using that, and some floating-point arithmetic, we could detect the capacity of the battery, and we proposed that this capacity is a long-term identifier.
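The capacity-recovery idea can be sketched as follows. Under the simplifying assumption (roughly the model in the paper) that the high-precision double is an exact ratio of an integer charge to an integer battery capacity, a script can enumerate the capacities consistent with an observed reading; the numbers here are purely illustrative:

```python
def candidate_capacities(level: float, max_capacity: int = 20000):
    """Capacities (in arbitrary integer charge units) for which the observed
    high-precision battery level could be an exact charge/capacity ratio."""
    matches = []
    for capacity in range(1000, max_capacity):
        charge = round(level * capacity)
        if charge / capacity == level:  # bit-exact double match
            matches.append(capacity)
    return matches
```

Intersecting the candidate sets over a few successive readings narrows the capacity quickly, which is why rounding the reported level to a coarse value, as browsers later did, defeats this use as a long-term identifier.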
W
Next, please. Then a privacy measurement of 1 million websites was published, and in their measurement (next) they found that fingerprinting scripts were actually using this API to collect the battery charge level as part of the fingerprint; so this API was found to be used in the wild by fingerprinting scripts to collect a fingerprint. This fed back, and the Recommendation was updated to address these concerns.
W
W
In late 2016, which was somehow surprising, Mozilla actually started to discuss removing this API. Next, please. So the question they asked, basically, was: are real websites using the Battery API for its intended purpose? And the outcome was that, actually, although you can do good things with it, they didn't see a lot of uses matching the use cases that the original standard had envisioned. Next. And in early 2017, several vendors actually removed or restricted this support, citing the privacy concerns and papers and the lack of use. Next.
W
So what happened is: this API came out, then several research papers showed that there are some privacy problems and that these problems were being abused in the wild, and then some vendors basically unshipped the API. So what did we learn from all this? This paper has some recommendations, which I'll be going through in the next four slides. Next, please.
W
First, the specification process should also include review of the implementations, because a lot of things may go wrong during implementation. Lukasz, one of the authors of this paper, actually reviewed the privacy of the Ambient Light Sensor API, and found similar problems with implementations, where the browsers expose very high-precision readings unnecessarily. Next. So implementation is one thing, but then there's another problem; another question is how the API is used.
W
For instance, the AudioContext API can be used by a website to synthesize audio, and this was found to be revealing the user's operating system and browser. In the normal, standard browser case this is something you don't care about, because it's already included in the User-Agent header; however, Tor Browser tries to hide the operating system of the user, and this is a problem for them.
W
So yeah, basically the paper says things like: try to incentivize and assign resources to look into these standards and to break their privacy assumptions, and maybe organize workshops or forums inviting researchers and academics to look into that. So these are the four recommendations; the paper has other recommendations as well. So next slide, please; just the final slide. So yeah, thank you. You can check the paper for the other recommendations as well, and thanks for the invitation; happy to take questions.
X
That was good; you made it with 15 seconds to spare, I guess. Thank you for this talk; I think the recommendations about measurement are pretty interesting. I participated in this work when it was very first starting, like eight years ago, and was a person who went there and said: do not standardize this, because of these problems. There were several of us.
X
I know that Nick is here; Nick probably remembers this maybe a little bit better than I do. But that was the best recommendation that we could give, because there wasn't really an obvious way to represent this information in the API without having these privacy leakage problems. And essentially, the drive to do all of these device APIs in the W3C was so strong that those concerns were not really listened to; and then I stopped participating in the W3C, mostly for other reasons. So anyway.
X
I just came up here to give that little bit of history, because a lot of the things that you're proposing have been part of the conversation since this API was very first proposed, and many of the other device APIs as well. I think we have a better understanding of a lot of the threat models now, and there's a greater appreciation for these kinds of threats than there was eight years ago. But it's still, you know, a back and forth.
O
X
Y
Christine Runnegar, W3C Privacy Interest Group co-chair. Thank you very much to the Princeton team; the PING group is very grateful for that research, which gave us the opportunity to provide feedback to the working group. One thing to mention to everyone is that the Security and Privacy questionnaire is currently being updated.
Y
Y
S
S
So it's part of why we hired him. But there's an implicit assumption in this work that APIs which enable this kind of fingerprinting are bad, and that doesn't really reflect, I think, the consensus of the browser vendors, who have basically largely given up on preventing fingerprinting, and I think this is something that...
S
W
Yeah, and a second response: I definitely agree. If it comes across like this, I think I should apologize; it's not really that these APIs are bad. But, for instance, in this case, the exposure of this high-precision level was totally unnecessary, whereas in other cases you have legitimate uses of, say, a fingerprintable surface, sure.
S
I think that perhaps the lesson I would take home, and this goes back to what I was saying earlier, is that for a while there was really a rush to take every possible API surface that any phone programming environment had and put it on the web somewhere, and that was done without a really good cost-benefit analysis of whether these things were useful. I mean, you'll notice that when we decided that this was bad, we didn't, like, reduce,
S
you know, the precision to five levels; we just removed this API entirely. The verdict on the Battery Status API is that it was useless, and that it was a bad idea all along. So I think, you know, I guess I'd be interested in your thoughts on whether there's any way we could have made that cost-benefit analysis better earlier, and realized that there was no point, that we shouldn't do that.