From YouTube: Data Privacy Virtual Forum - January 27, 2022
Description
Data Privacy Virtual Forum hosted by the Information Technology Advisory Commission and Emergency Preparedness Advisory Commission.
A
Welcome, everyone. I am Mary Cornell, chair of the Tech Commission, and I want to welcome you to the Data Privacy Forum, sponsored by the Information Technology Advisory Commission and the Emergency Preparedness Advisory Commission, chaired by Sharon Valencia. Sharon is with us as well. We're delighted to have this distinguished panel joining us this evening to discuss data privacy, and we welcome all of you. I'm so glad that Arlingtonians have joined us to discuss data privacy. A few logistics before we meet the panelists.

If you have a question for one of the panelists, please write it in the chat box, and Dr. Michael Cornfield, one of our Tech Commission members, will take note of it and relay it to the presenter. So make sure to put your questions in the chat box; you can do that as we hear the presenters. We encourage you to ask questions throughout the program. Well, that didn't take long.
A
All right, so the panelists: we have Jack Belcher, the CIO of Arlington County, the Chief Information Officer; Dr. Costas Torregas, director of the Cyber Security and Privacy Research Institute at George Washington University; Dr. Dan Kiefer, computer science professor at Penn State University; and the technologist for the public realm at the Mayor's Office of New Urban Mechanics, City of Boston, Massachusetts. We're delighted to have all of you here. This is such a wonderful panel, with so many different perspectives.
C
All right, let me get into PowerPoint mode here and get going. Thank you so much. It's an honor to be here today and have this discussion. It's an important discussion to have about privacy. I have to start right off the bat by saying I'm the technologist: this is a bigger issue than just technology. Technology plays a critical role in defining privacy issues, but it's not all there is, and it really requires a comprehensive effort to look at it from all sectors.
C
That said, I'm going to focus on what Mary asked me to do: just give you a brief understanding of what we've been doing in the county and what's driving us. Hopefully I can keep it within 10 minutes, and we can move on to the other presenters and get some questions from you. So with that, let me start off. One of the first things I think we have here is that this is not new.
C
This is that persistent discussion about individual rights versus the good of the community. You know, I was trained as a historian at my alma mater, Georgetown, before I got into technology, and I'm very familiar with all the works that have been written about this. This started way before today. I listed here John Locke, with the whole idea of the social contract: states' rights versus government mandates. We see it today.
C
We saw it last Sunday, when a huge rally took place on the Mall, with people saying "my individual rights are being compromised, and what are you going to do about that?" So we're very concerned about that, and very interested in seeing what it means. And we see a dichotomy here: there's a right to individual privacy, but government needs data and information to be able to serve constituents in a good-faith manner.
C
So the question was: what have we done here in the county since COVID? Frankly, that's where my attention has been for the last 22 months: making sure we have virtual access not only to education but to courtrooms, being able to do contact tracing, getting a workforce that's mobile, and being able to continue to provide services. Well, we've done a lot. In summer 2020 we put up a new open data portal, making it simpler and easier to get the information that we have in the county.
C
In fall 2021, we put up a brand-new website. The idea behind it was to refresh it, make it easy to use, and make it easier to get information. Also in fall 2021 we published privacy principles and laid those out, and just this last week we issued an inventory and mapping of our county privacy policies, and we have a number of them. If you go to the Arlington County website, arlingtonva.us, and put in the word "privacy," it'll give you a listing, by department, of what we've done in terms of privacy policies.
C
It's all publicly available data. As I said before, we've come up with privacy principles, and you can see them listed here; they're very simple: minimal collection of only necessary data, ensuring quality, and transparency. It's very important that we're transparent about what we have and what we're offering. And there's the whole question of stewardship: making sure that what we put up is secured in a manner that can be protected. Here again is the map of our current privacy policies, and you see they're listed by county policies and department policies, and if you scroll down on the site, you see state and federal policy listed as well.
C
So what are we doing? How are we moving forward? Well, we're enamored of the approach that the National Institute of Standards and Technology has taken; they call it the NIST Privacy Framework. Now, that's a big name, but what it is, really, is a tool. It's a voluntary tool, not a compliance tool that you have to follow, and it doesn't prescribe how to do things. It just says: these are issues you've got to discuss.
C
You've got to address them, whether it be identifying what that information is, protecting it, or responding to any threats that may be aimed at it. So we've adopted the privacy framework. It's rather new; I think this January it became two years old. That's how long it's been around, so it's not something that's been around for a while, but it's a good tool, we're using it, and we'll talk a little bit about it.
C
Now, as a county, we're faced with the challenge of balancing cybersecurity, securing our assets and making sure access to our data is protected, against privacy. I have to be honest with you: when asked where I was going to make the investment, the manager and I decided to make it in security, and we've done a great job in terms of securing our enterprise. It used to be that we'd have staff sitting in front of screens looking at attacks.
C
We've now got partnerships that are looking worldwide at any incursion that takes place. They alert us that this is happening, and they also give us guidance on what to do to prevent our data center and our data from being compromised. So think of it as: instead of thinking at human speed, we're thinking at internet speed.
C
It's been a great resource for us, but I think now it's time for us to take a look at the privacy side, and we're gearing up and doing it as fast as we can. Here I have an example that came from the NIST model, and you can see that privacy and cybersecurity overlap; those overlaps are where we've been focusing.
C
What keeps me up at night is getting a call saying that we've just been compromised, that the county's data is now on the dark web, and that if I make a payment of so much money I can get it back. That could happen. And now we're finding out that it's not just us that's under threat: it's our citizens, our residents who use our services. We want to make sure that our employees are well aware of the threats we face.
C
And how can we address it? So, talking about the NIST privacy framework: it has three major components. One is the Core; in other words, you identify your objectives. What are you trying to achieve? What are the outcomes you hope for that will make it work for you? Then there's developing Profiles.
C
That's how active we want to be in terms of how we collect data and use it, and how that is covered by the privacy plans we have. And then there are the Implementation Tiers. One of the most important components, of course, is having a privacy-aware workforce, so that they know these are the privacy policies we have, this is how they're working, and here's what we need to do about them going forward: areas of development, alignment, and collaboration.
C
Our threats: here's where technology really plays a role. Emerging technology is hitting us every day and it's growing; what we thought we had done well could change tomorrow. We talked about de-identification: re-identification opportunities may exist, so we make sure that information is not being compromised. It could be something as simple as a video camera.
C
And lastly, as I said before, we want a privacy-enabled workforce: a workforce that understands the privacy concerns we have and those of the community. How do we get there? Well, it's a journey, and there's no easy way to it. I'd like to say that at the end of this meeting we will announce a policy; it won't happen, and it can't happen just from a technology perspective. This has got to be a community engagement, to ask: what do we want? And you might say, well, everybody's done it; Boston's done it.
C
Seattle's done it. Why don't we just take it, do a word search, and replace "Boston" or "Seattle" with "Arlington"? Because Arlington's different. We have different values that we're trying to live by, and so we have to find out what those are. And it's got to be approved not only by us, the people on this call, but by our elected officials and the community in general.
C
What motivates us, I think, is on this last slide: only 31% of Americans are confident that their personal information is used in ways they're comfortable with. If you think you're not being monitored, go to a website where you don't have to pay for access and that isn't secured: you can guarantee your data is being taken and used, your behaviors, whether you're searching for this or that. That is happening. And so what we're trying to achieve is a holistic approach, and that spans compliance and strategy.
C
What are the ethics we're applying to this data governance? And then we're making sure that we're doing everything we can to create this robust, protected environment. So with that, Mary, I hope I've kept it within 10 minutes, and thank you.
A
Perfect. Thank you so much, Jack. I'm sure there'll be many, many questions, and that was a great framing, so thank you so much. If you have questions, please put them in the chat box and we will get to them. We'd love to hear from our next presenter, Dr. Costas Torregas. Thank you so much, Costas; we look forward to hearing your perspective and your experiences.
E
I'm sure you're aware that this week is Data Privacy Week, and what better way to celebrate Data Privacy Week than by engaging citizens with some presentations and, hopefully, a lot of questions and answers. Also checking: did my slide change?
E
Well, I'm sure you planned it years in advance. So I wanted to pile on to what my friend Jack Belcher said. I haven't worked with him for, oh, more than 15 years now, but we had some interesting times together, pioneering some ideas for local governments and technology back in the '90s and 2000s. Privacy is not a technology issue. The technology, the CIO, the technician can only "help implement," and I put that in quotes because it's a hard job, something decided elsewhere.
E
I would be aghast if you listened to Jack tell you what the privacy policy should be. Privacy is not something for the CIO; privacy is something for new groups of people to engage with. So I'm going to try to help you, at least with my perspective, on how you enter and how you create a framework for privacy.
E
I have three words, and I put them in a natural triangle here. One is privacy, the other is security, and Jack has already shown you how it's a balance between privacy and security. But I'm going to put the third one up there, which is called empowerment, over and above privacy and security.
E
There comes this idea that data and information, and the applications that process the information, allow people to do things. And by the way, how I feel about what data, information, and applications allow me to do is very different from my 35-year-old son. He feels much differently. If I asked him to put a pin next to the part of the triangle that he feels most comfortable with, he would go to empowerment. I'd probably go somewhere closer to security, maybe with some empowerment and maybe with some privacy.
E
But my point is that that space is different for every single one of us, and that makes it very difficult to ask what we shall have as a policy. I'm going to use a popular example. From time to time we watch Midsomer Murders, we watch Vera. For those that are not aficionados of murder mysteries on TV, these are police detective stories, and the most vital phrase we hear is: "Is there CCTV?"
E
Did we get the CCTV? And all of a sudden we're at the edge of our seats, hoping that the inspector from Midsomer Murders, or Vera from her show, has found CCTV that would pinpoint the murder, that would show that someone indeed left their house at 2:30 in the morning. Boy, oh boy, were we happy that there was CCTV footage.
E
Well, the reality is that we have these things not just on TV but every day. We have ankle bracelets, we have drones, we have AI-based applications, we have CCTV, closed-circuit TV. And the question is not "do we have it or not?"; it's all over, the technology exists. The question is: what is the framework that makes us more comfortable?
E
So here I come to a point where your elected officials, when they hear this video, might wince: I think the elected officials have a huge, huge role to play. In this little diagram you see thousands of people, and they're all queuing up through a single portal, and that portal we call elected officials. In Arlington County, I don't know how many tens of thousands, if not hundreds of thousands, of people are represented by a single board member.
E
That awesome responsibility, to empower the voices of so many, with different opinions and different perspectives, to come through to a reasoned discussion of a board, is very, very important for the privacy discussion. In my mind, the most important thing to start with is civics: civics courses in K-12 education. Why?
E
Because we need to make sure that the new Arlingtonians are, in fact, men and women who understand privacy, understand security, understand empowerment. I don't like to have lots of words on a slide, but I thought I'd copy one part of this privacy and security curriculum for K-12 schools. It was done by a professor at George Washington University, Professor Sullivan of the law school, in coordination with other institutions. But listen to it: "Privacy issues often don't have clear right and wrong answers."
E
"It is best to provide students with age-appropriate information and then allow them to think for themselves and reach their own conclusions about what best suits their values and comfort levels." Wow. Privacy issues don't have clear right and wrong answers. Okay, if that's true for students, it's true for you and me as well. There is no magic answer; we have to kind of slog through, and this discussion may be painful. So we have legislation throughout.
E
I pulled out the GDPR, the General Data Protection Regulation in Europe; the CCPA in California; the CDPA in Virginia. My own academy: I'm a fellow of the National Academy of Public Administration, and I chaired a committee that looked at data privacy and security, trying to develop an agenda for 2021. The ITIF, the Information Technology and Innovation Foundation, just issued a big report urging the federal government to take on the issue of privacy, because they're afraid that too many individual state or county privacy laws may in fact slow down innovation and increase the cost of innovation to industry.
E
So what am I suggesting? And I'm coming to the end of my presentation: it's a structured dialogue. We need to start discussing things with the community at large, just as you're doing today. We need to discuss things with elected officials and all the key stakeholders that we have, because in a sense we have a very, very unusual journey. And as my last slide, I chose to show you a photograph from the Mars rover.
E
Why? Because when we landed that rover on Mars, we didn't know what we were going to find. The rover had to have some intelligence of its own to figure things out as it took the first few steps, or as it rolled the first few feet, on that unknown ground on Mars. It's the process, therefore, not the answer, that we should look for. And as a consequence, I think you've been very brave in starting this process, and I applaud you, and I look forward to the discussion and our debate. Thanks very much.
A
Thank you so much, Costas; that was wonderful. Thank you. Dan, Dr. Kiefer, would you like to give us your perspective, about not only now but some things you see in the future as well?
F
Yeah, so you can see my slides? Okay, great. Thank you for inviting me and letting me share some thoughts. I don't control the slides; Angela has graciously helped me with my technology issues.
F
Let's please go to the next slide. So, a quick intro: I'm a professor of computer science at Penn State University. I work on formal privacy, which is a field that studies unexpected ways that data can leak private information about you. I have ongoing collaborations with the Census Bureau, Facebook, and BLS, and this means I always have to add these kinds of disclaimers in my talks: these opinions are my own; I don't represent any agency, organization, or company.
F
Okay, next slide, please. So let's start with the question of why people treat digital privacy differently from physical privacy. Suppose this guy was following you around, taking pictures, recording everything you're doing, peering at your phone. You might have a pretty unhappy reaction to that, to say the least. So, next slide. But this is basically what all these online companies are doing. The only difference, really, is that they're really good at hiding what they're doing.
F
So you might not be aware of when they're collecting your information, what they're doing with it, and what exactly they have. Next slide, please. And because they're good at keeping this hidden, a lot of people don't appreciate privacy as much. If they actually knew, or were aware, when they're on a web page: hey, this is what is being collected about me, this is what it's being added to, this is what you can do with all of that information, then people would get more concerned. So the key is knowing what is happening with your data, rather than the current state of affairs. So, next.
F
So the main thing to do, and I think Jack had a good start in describing this, is that you need to treat data like an asset, like money, and so you have to have an accounting system to keep track of it. If I could have a wish list for Arlington County, one thing is you'd want them to be very good at keeping track of what is collected across departments.
F
So, instead of having each department make its own decisions, have some centralized place where you know what is being collected, what services it is being used for, and how long it is being retained. You don't need to keep all the data forever; this is good for cybersecurity as well as for privacy.
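The centralized accounting Dan describes is, at bottom, a registry of data assets with an owner, a purpose, and a retention clock. A minimal sketch in Python follows; every field name and sample record here is illustrative, not an actual Arlington or Boston schema:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# One registry entry per dataset a department collects.
# All fields are hypothetical examples of what a county inventory might track.
@dataclass
class DataAsset:
    department: str
    description: str       # what is collected, in plain English
    purpose: str           # which service the data supports
    collected_on: date
    retention_days: int    # how long it may be kept
    shared_with: list = field(default_factory=list)  # third parties, if any

def overdue_for_deletion(assets, today):
    """Return assets held past their stated retention period."""
    return [a for a in assets
            if today > a.collected_on + timedelta(days=a.retention_days)]

registry = [
    DataAsset("Parks", "program sign-up emails", "event notices",
              date(2021, 1, 15), retention_days=365),
    DataAsset("Transportation", "parking permit plates", "permit enforcement",
              date(2022, 1, 10), retention_days=730),
]

print([a.department for a in overdue_for_deletion(registry, date(2022, 1, 27))])
# prints ['Parks']: only the Parks record is past its retention window
```

The point of the sketch is the retention check: once every dataset carries an explicit expiry, "you don't need to keep all the data forever" becomes a query rather than a policy aspiration.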
F
An important question is: when does your information leave government hands? This could be when you're paying for a service; of course, your credit card goes to a vendor to process the payment. But there are also other cases where data leaves government hands, and then you need to know why this is happening.
F
So it's not enough to just have a policy; it should be in one very public, easily accessible place, where you can keep track of exactly what data is being collected. And this shouldn't be legalese; this should be plain, understandable English. Okay, next slide, please. So there are many tools that are available; Jack did a very good job of describing the NIST privacy framework.
F
A bunch of states sell data. This was an excerpt from an article in Vice, and you can see that selling personal data is very lucrative. For example, Florida made 77 million dollars just by selling data from the DMV, and that's just in one year. And this data is not very well regulated: in Texas, for example, pretty much anyone could get access to it. Given their policy of selling it to, let's say, private investigators, you can have felons accessing your data.
F
They don't have a lot of regulations there for this kind of thing. The other problem is: even if there's a legitimate use, like selling data to, say, car manufacturers so they can process recalls, what happens when the data is out of their hands? What's going to stop them from reselling the data? And once it's out there, you have no more control over it.
F
So, long story short: in my opinion, it's not right for an agency to just sell the data, to monetize the people. The purpose of an agency is to provide services to the people.
F
You know, collect revenue to provide those services, but don't think of people as walking dollar signs. Once data leaves an agency, you have no control over it, so agencies have to be very, very careful with the data: what they do with it and who they give it to.
F
There is a technical branch which is not very well known, called formal privacy, and the idea is that sometimes you do want a government agency to release data, for statistical purposes. If you want to do planning, you want to figure out: should I build a headquarters here, or should I build the manufacturing plant? So for this kind of data, you want to make sure it's only useful for statistical purposes.
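The best-known formal privacy technique is differential privacy, which Dan's collaborators at the Census Bureau use: publish statistics with carefully calibrated noise, so aggregates stay useful while no individual's presence can be inferred. A minimal sketch of a differentially private count, with illustrative parameters (the speaker does not specify any particular mechanism; this is the standard Laplace approach):

```python
import random

def dp_count(records, predicate, epsilon=0.5, rng=None):
    """Differentially private count: true count plus Laplace(1/epsilon) noise.

    A counting query has sensitivity 1 (adding or removing one person changes
    the count by at most 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy. Smaller epsilon = more noise = more privacy.
    """
    rng = rng or random.Random()
    true_count = sum(1 for r in records if predicate(r))
    # A Laplace sample is the difference of two exponential samples.
    noise = rng.expovariate(epsilon) - rng.expovariate(epsilon)
    return true_count + noise

ages = [34, 29, 71, 65, 80]
print(dp_count(ages, lambda a: a >= 65, epsilon=0.5))  # noisy value near the true count of 3
```

The released number is close enough to 3 for planning purposes, but anyone seeing it cannot tell whether any specific resident was in the data, which is exactly the "only useful for statistical purposes" property Dan describes.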
A
Great, thank you so much. I think people are pausing over some of those numbers that you shared, Dan, so I imagine there'll be more questions when we arrive at that part of the program. And Yo has slides; I believe Sharon sent them to you, Angela, so if you could check, then we'll have his slides as well and we'll be able to go forward. So, Yo.
A
Do you want to introduce yourself? And then Angela will bring up your slides.
G
Perfect, thank you. Hey everyone. I'm still taking in bits from everything that Jack said, Costas said, and Dan said; it feels like a lot of great connections to what we've been exploring in Boston. My name is Yo Deshpande; I'm the technologist for the public realm in the Mayor's Office of New Urban Mechanics in the City of Boston.
G
Our team works a little differently than the Department of Innovation and Technology, which oversees a lot of the data privacy and cybersecurity efforts going on in Boston. Our team is more of a civic R&D lab, and I want to talk a bit about what we've been exploring in those efforts.
G
Perfect, thank you. So, to get us started: I feel like we're all thinking about how crucially important data privacy is. We know that it's important from our own lives, and we know that it's personal for us. What I wanted to talk about, that we've been doing in Boston, is exploring how, because data privacy is that personal for everyone, it's crucial to bring everyone into the process. There are kind of two cases I want to make tonight: one, that developing a data privacy policy should be an open and democratic process, and second, that any conversation about data privacy is also a conversation about digital trust. So over the next few minutes I'll talk about what my office has been up to and try to make those cases. Go on, next slide.
G
So my team is the Mayor's Office of New Urban Mechanics; we're a civic R&D team, as I talked a bit about. We can go to the next slide. The kind of projects that we work on have a wide range: we've worked on 311, the infrastructure issue-reporting app that Boston has; Project Oscar, which is a community composting program; creating free Wi-Fi zones outside public libraries; and testing autonomous vehicle operations in the city. So, a wide range of projects, not necessarily ones that scream "data privacy."
G
My role is specifically technology in the public realm, and what we found is that the Department of Innovation and Technology is handling digital and technological infrastructure issues, doing a thousand things every day and doing them excellently, and it's helpful to have some people who can step back and see the wide angle, or pop up to 30,000 feet and look at what the city is doing overall. That's where we try to live in terms of technology. And I want to shout out my teammates, Naily Rodriguez and Chris Carter at New Urban Mechanics.
G
A lot of the thought, discovery, and practical projects I'll be talking about happened under their tenure, so it might start sounding like some obscure gospel or something, with how much I say "Naily once said" or "as Chris proclaimed," that sort of thing. We can go on to the next slide.
G
One thing we've been seeing, from this outside vantage point, is that the data privacy conversation is becoming more and more widespread, more urgent. But because we're a city, and we're an institution providing public services, we're also seeing that it's hard to extricate it from the conversation about digital trust; there's overlap, and they're different processes.
G
So the data that we produce just as citizens, not by engaging with a clear technological service, is being collected and used. That's something that is beginning to exist in people's understanding, but it's outside the mental model of data privacy for a lot of people. So you see how it starts.
G
One central methodology of our team is to partner with universities, to explore things that we're both interested in and to encourage more civic engagement. The public data principles that we came up with were a project like this, in collaboration with the Harvard Law School Cyberlaw Clinic. We wanted to state where Boston's current thinking is, to represent the actual conditions, but also to say: this is what we believe about data.
G
Amsterdam was the first to do this, mapping their data collection devices in their downtown. This is a proof of concept, because we don't yet have an existing inventory of all sensors in the public realm, but it's an arrow for us of where we want to move toward. If we can establish that there's data privacy, based on the principles we were just talking about, how can we move to create greater trust through transparency?
G
One of the exciting ongoing projects that we've been working on is a partnership with the organization Helpful Places, which has created DTPR. DTPR is an open-source communication standard for all kinds of technology in the public realm, and people can interact with it to learn how data is being collected by technology around them. We've piloted this DTPR signage, the stickers you see on light poles, on a few different kinds of street infrastructure already, and what it does is make it possible for people to see the taxonomy.
G
People know what kind of data, broadly, is being collected, then scan a QR code on their phone and learn all about how it's being used, what for, and how it's protected. We're excited about expanding this out, because it feels to us like opening up the conversation about what's actually happening with data collection.
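Behind a QR code like the ones Yo describes, a disclosure of this kind is essentially a small machine-readable record that can also be rendered as plain English on the sign itself. A hypothetical sketch follows; the field names and the sample sensor are illustrative, not the actual DTPR schema or a real Boston deployment:

```python
# Hypothetical disclosure record of the kind a signage QR code could resolve to.
# Fields are illustrative; the real DTPR open-source standard defines its own taxonomy.
disclosure = {
    "technology": "pedestrian counter",
    "purpose": "sidewalk planning",
    "data_collected": "anonymized counts, no images stored",
    "identifiable": False,
    "retention": "90 days",
    "accountable_party": "City of Boston",
    "more_info": "https://example.org/sensor/123",  # placeholder URL
}

def signage_summary(d):
    """One-line plain-English summary suitable for a sign or sticker."""
    return (f"{d['accountable_party']} collects {d['data_collected']} here "
            f"for {d['purpose']} (kept {d['retention']}).")

print(signage_summary(disclosure))
```

Keeping the record structured means the same source can feed the sticker text, the web page the QR code opens, and a citywide sensor inventory, which is exactly the transparency-through-one-source idea the pilot is testing.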
G
Next slide. We're also moving more and more towards community co-creation: basically, how can we get these conversations about data privacy and digital trust out of a single lone governmental office of technology experts, and have them happening around classroom desks and around kitchen tables? I think the baseline feels like, ultimately, we're public servants, and as much as we think that we're experts on the issues, if we're making it so that people feel uninvolved and that there's a lack of information, we're failing to serve.
G
So, to summarize some of the lessons that we've been learning: data privacy can't be separated from digital trust, because as soon as we move into the public realm, it becomes more and more a question of trust wherever there isn't already knowledge and recognition of how data is collected. And trust is a process; it's a relationship; it's reliability and transparency and goodwill over time.
G
So even when you come with something good, people naturally wouldn't believe that it's good, so we're trying to be as transparent and open as possible. We're trying to find good partners, meaning external partners, people who care about these issues; internal partners, people who care about the issues in city departments; thought partners, who are exploring theory around them; and community partners, who can lead in determining what decisions are actually going to best serve people. And the last part is redefining public safety.
G
When we talk about surveillance technology, oftentimes the argument for a new technology is public safety, and what I really want to emphasize is: if we only think about public safety through the lens of lives saved, lives lost, and property maintained or damaged, we're missing a huge part of it, the part that data privacy really gets at, which is the mental and communal health parts of public safety: how people trust their government and trust each other.
G
How safe people feel going around the city, exploring a new part of the city, engaging with a public service: that's ultimately what makes people happy, what makes a city good to live in, and what makes the government work. So that's what we've been up to. I'll stop there. Thank you all for this time.
A
I encourage you, and I know some of you have already done that, to put your questions in the chat box. Michael, Michael Cornfield, the Tech Commission member, I will turn it over to you to start asking. I'd like to go to the chat box first, and then we have some other questions as well, so go ahead.
B
I apologize; I think there's great interest so far, in the questions I've read, in just what Arlington County has, what it shares with third parties, whether free or sold, and in particular, among those third parties, what it has shared with the Department of Defense, the DEA, and other federal agencies located in Arlington that have surveillance and intelligence as their special interest. So, Jack, you spoke of a map and an inventory, and I don't know exactly what that was referring to.
B
But how far away are we, as a county, from having a place where we could go? I'd love to have QR codes on kiosks like they do in Boston, but anyway, if we just had to go to the website to get answers to those basic questions: who knows what about us? What is the current state of affairs?
C
We don't know, to be honest with you. I mean, that's a good question, an excellent question. We're learning. The work Boston is doing is fantastic, and I just see the processes they're going through, all of which sound great. I would love to be able to say that we know where all the information we have is and how it's being disseminated.
C
As
I
was
talking
to
yo's
colleague
in
boston,
the
cio
there
I
said
everything
it
says
to
me:
you
you
know
exactly
what
information
is
being
shared
and
no
one's,
doing
anything
that
you
don't
know
about
and
when
she
stopped
laughing.
H
C
We're still growing, and I think that's where we are right now. You asked what the mapping was all about, the identifying and mapping that I talked about: it lists those departments that have privacy policies available, which talk about what they're doing with the data they have. And some of the things you talked about involve public safety.
C
Again, some things they have the right to share for public safety reasons, and we do the best we can to identify what they are sharing. But I couldn't give you an answer today, right now, without one repository. We're beginning the journey, and I think that's the key thing: trying to find out exactly what we have and how that data is being used.
B
Anyone can take that, and if any other cities have better ideas of what is currently held in their jurisdictions, that's great to hear too. Go ahead, Jack.
C
Yeah, well, let me finish that off, because you had that follow-up question. We're one of the largest deployments of 5G technology in the United States, and it's going to grow more and more. There's a process whereby somebody who wants to put up a 5G pole and collect data goes through an approval and puts it up; we don't have a process to ask what they are doing with the data they're collecting. If you have a cell phone and you're near one of those towers, say an AT&T, Verizon, or T-Mobile tower, they're most likely collecting data on where you're going and what you're doing. So I'd like to say that we've got a hold of that, but we don't, and we're limited in what we can do. In fact, the FCC has made it quite clear to us that the only thing a local jurisdiction can do is expedite the process by which a pole can be put up. We do not have the authority to ask what they are doing inside that pole, or what type of technology they're putting inside it.
B
C
The county government itself does not have that authority. In the case of the 5G spectrum and what they put up, it's really between the FCC and the parties putting the pole up.
B
There's a question for Yo: expand on your concept of public safety, that it's not just about lives and injuries.
G
Thank you, yeah. When we think about what public safety could be, there's a nice way of imagining the ideal: the best experience for a citizen of Arlington, the best experience for a citizen of Boston, is a sense of harmony and peace. This all sounds very flowery, but I think the way it can actually play out is in expanding the definition of public safety to include things that maybe traditionally haven't been included. The area that I think is really interesting here is mental health, and the mental health experience of people moving through a city. Particularly for people from groups who are historically over-surveilled, and who suffer more than other groups from things like algorithmic bias in surveillance technology, there's a felt decline in public safety when just being out in public. I've heard it in conversations with people: moving through a city and knowing that there's some sort of surveillance going on, you feel unsafe just as a citizen, even if the point of that surveillance technology, or other monitoring or data collection, is to increase safety. So I think that expanding the definition is useful, because then we can weigh different things against each other: how do we want to set up bike counters in order to reduce fatalities?
A
What do you think is posing the biggest challenge, then, for policy makers? You've already talked about how some government entities are selling data; you mentioned the 77 million dollars per year from Florida. As you look ahead five, ten years, you really have a sense of where the technology is moving.
F
I think it's keeping up with the technology. For example, a lot of regulations are based on personally identifiable information, or PII, and that's really an outdated concept. Look at where technology is going: we have AI, digital assistants, and possibly robots and robotic drivers. How do they train them? They collect massive amounts of data: emails, what you type on your phone, videos. They train a big deep-learning model on it, and after you train this model, well, what does PII mean? The model really is just a bunch of numbers. None of those numbers are your phone number, Social Security number, or credit card number. None of them are pictures of you. But you can take this model and trick it into spitting out the data it was trained on. So if there were credit card numbers, phone numbers, PII in there, you can get it out; it somehow memorized it, and no one knows how. Explaining this to regulators and policy makers is not always an easy task, and how do you even write laws covering that? PII was a nice concept: you knew what was PII and what was not. But when it's just a bunch of numbers that somehow could, if you do things in the right way, reveal secret information, how would you regulate that? One of the key concepts that has grown from the technical community is that it's not the thing you share, it's how you produce it: how you train the model, or how you aggregate statistics. The code that you run the data through in order to produce the output is the important thing that needs to be regulated. And to understand that, our policymakers need to be trained in much more than just laws, in different kinds of technologies as well.
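To make that concrete with a toy sketch of my own (not anything from the panel or any real system): a model that interpolates its training points exactly is "just a bunch of numbers", yet querying it at the right input returns the training data, including any secret that was in it.

```python
# Toy illustration: a trained model's weights are opaque numbers, none of them
# the secret itself, yet the model can reproduce its training data exactly.
# Here the "model" is a Lagrange interpolating polynomial that memorizes every
# (x, y) training pair. Real extraction attacks on deep networks are far
# messier, but the principle is the same.

def fit(points):
    """Return a function that passes through every training point exactly."""
    def model(x):
        total = 0.0
        for i, (xi, yi) in enumerate(points):
            term = float(yi)
            for j, (xj, _) in enumerate(points):
                if i != j:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return model

SECRET = 7341  # stand-in for a sensitive value that slipped into training data
train = [(0, 12), (1, 5), (2, SECRET), (3, 9)]
model = fit(train)

# Query the model at the right input and the secret comes back out:
print(model(2))  # 7341.0
```

Nothing in `fit`'s output looks like 7341, yet the query recovers it, which is why "does the shared artifact contain PII?" is the wrong question to ask of a trained model.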
A
F
It depends on who "they" is, but some companies, like Apple and Google, and Facebook in certain cases, have started it. For example, you may have noticed that when you type on your iPhone, you're getting some pretty good suggestions for what to type next, or some emoji recommendations. The way they do that is by collecting what you type. Now, that's really creepy, right? So instead of collecting literally what you type, they add noise to it. When the data hits their servers, they have no idea what you typed. But when you collect this over many, many people, the noise sort of cancels out, and you can start to say: okay, this is what people are searching for; when people type "cat", this is the kind of emoji they're interested in.
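The noise-then-aggregate pattern Dan describes is the idea behind local differential privacy. A minimal sketch, using the classic "randomized response" scheme rather than any company's actual algorithm: each device reports a noisy bit the server cannot trust individually, yet the population-level rate is recoverable.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    # With probability p, report the truth; otherwise report a fair coin flip.
    # The server cannot tell which case produced any individual report.
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_rate(reports, p: float = 0.75) -> float:
    # Invert the known noise: observed = p * true_rate + (1 - p) * 0.5
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

random.seed(0)
# Suppose 30% of 100,000 users "typed the emoji"; each sends only a noisy bit.
population = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(x) for x in population]
print(round(estimate_rate(reports), 2))  # close to 0.30
```

Production deployments use more elaborate mechanisms, but the trade-off is the same: more noise per report means more privacy for the individual and more people needed before the aggregate becomes accurate.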
F
So yes, a lot of companies, especially the big ones, are doing it, and they have active research teams that are improving the technology. But this is something the big players are doing; for small startups and small companies that don't have the resources to invest in that, that's where the problem could lie.
B
All right, thanks, Mike. We have in Arlington County two proposals that impinge on privacy that have already been put into at least partial effect. One is deploying sensors in the Clarendon public space, which Jack mentioned, and another is leasing county dark fiber to our largest commercial landlord.
B
What sort of risk analysis or cost-benefit analysis was done, and if none was done, what could be reasonably expected? And more generally, if Costas and some of the others on the panel could speak to the concept of consent: what rights do people have to control their own private data? Do they have the right to opt out? Do they have the right to see data that is personally identifiable, even though that's an anachronistic concept, as I've just learned?
B
So the first question is the process of giving away or selling information, and the second is the principle of consent.
E
Well, first of all, this has been a great discussion, and Yo and Dan, thanks for adding to the mix the reality that Jack has given us. You mentioned dark fiber. If you offer dark fiber to someone, you're not offering data; you're offering dark fiber, and the other party runs data through it. So we have to be slow and careful about what it is we're concerned with.
E
If you told me that the county was sharing data streams with someone, that's very different from leasing dark fiber. The thing that happens with leasing dark fiber is that someone's getting perhaps a good deal, because they may not pay as much as they would to dig their own fiber ducts. But if they're just leasing fiber, it's a straight-up economic discussion. It doesn't matter that we don't know what's going through it; the lessee is going to decide what to push through.
E
So unless I've misunderstood the question, I think that's kind of a non-question. But more important than that is, I think, what Yo was trying to say, which is that it's time now to redefine things: redefine public safety, redefine trust, and not look at any one thing. I've been looking at the chat, and there seems to be real passion around a small number of issues that perhaps, Jack, you can take up.
E
But the most important thing that I'd like to push us on is elected officials, and how they're going to have to learn much more than they have ever been asked to. It's not that we expect our elected officials, be they board members or congresspeople or senators, to become experts in technology, but we do expect them to understand the impact of their choices. So when that comes up, I vote for some focused and targeted discussions with elected officials before they're asked to vote on legislation. Because legislation is the law: if I don't follow it, you take me to prison. That's an awesome power that we give our government. So before the law is passed, we have to make sure that the right conditions and the right knowledge were shared. And one slight example:
E
I wonder if people know the settings on their own iPhones, or on other vendors' devices: whether they know how to set the various privacy and security preferences inside their phone, so that when you're walking by an antenna, the antenna doesn't suck up everything, because you've said, don't allow yourself to be sucked up.
E
B
Well,
what
about
the
the
principle
of
consent?
How
do
how
do
we
redefine
these
issues
and
and
and
get
elected
officials
to
understand
the
principle
of
consent.
B
You want me to explain it? It's not really my question; I'm trying to distill from the chat, and I think consent means the capacity to opt out of the collection and sale of personally identifiable information.
I
This is Joshua Fair; I don't know if I'm unmuted, but that was my question, and I want to clarify what I mean. I work in the federal space and I live here in Arlington, and I recognize a number of uses for our data that maybe I would actually allow, if I had the ability to provide any type of consent over them. I recognize there's kind of a difference here with law enforcement use cases, where people would want to opt out, but there are legitimate use cases that I would actually really like to opt into. So I guess I'm mostly curious about your thoughts on consent management: consenting to the use of data for things that maybe I actually really want.
G
I could say one bit on this. I think it's such a great, open field to play in, and part of what feels really exciting, and I'm thinking of the cassette tech project and all the organizations like it that are exploring this more, is that there's so much positive opportunity, as opposed to negative. I mean that in the sense of space: there's a lot of opportunity for an increased sense of civic engagement, an increased sense of satisfaction for people, and an increased sense of a positive relationship between an individual and the governmental authority that's serving them, through working with consent more. One thing we had played with once, going back to a piece of a project a little while ago, was the idea of having "data donors", which is basically an idea where people would be able to select some of the ways that they wanted the data they're producing to be used. So I feel like that's getting a bit at your question, Joshua.
F
Yeah, I'd like to add one other point. I don't have a solution to it, but one issue with consent is that you get very biased data. Suppose you consent because you want to be part of a study that treats a disease you're very interested in; the people who would volunteer for that study are a very biased sample. They might not accurately reflect the kind of people, with the kind of data, that would be needed for statistically meaningful analysis. So consent is good, but if you consent based on the contents of your data, it creates a kind of data skew that could actually cause the data to be worse. I don't have a solution; I'm just pointing out that it's a complex issue.
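The skew Dan describes can be seen in a few lines. A toy simulation with hypothetical numbers, assuming the trait being measured also drives the decision to opt in:

```python
import random

random.seed(1)

# Toy population of 10,000 people with a "severity" trait in [0, 1).
# We want the population's mean severity; suppose severity also makes
# a person more likely to consent to sharing their data.
population = [random.random() for _ in range(10_000)]

full_mean = sum(population) / len(population)   # ~0.5, the true answer

# Opt-in probability equals severity: sicker people volunteer more often.
consented = [s for s in population if random.random() < s]
consent_mean = sum(consented) / len(consented)  # ~0.67, biased high

print(round(full_mean, 2), round(consent_mean, 2))
```

The consented sample overstates the true mean by about a third, and no amount of extra volunteers fixes it, since the bias is in who opts in, not in how many.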
C
Take my medical record: if someone came to me and said, hey, Jack, we have this program where we're going to work with the fire department's emergency rescue, and if you sign up for this, we'll get access to your health record, so that when we show up because you called 9-1-1 we'll know exactly what your conditions are and we can serve you better, then personally I would say: hey, where do I sign up?
B
Joshua, thank you for speaking up. And Elizabeth, you're next. Joshua, do you have any follow-ups you'd like to comment on, or ask the panel?
D
Michael, I'm sorry, I just want to point out that we do have a moderated discussion tonight, and we've asked people to put their questions in the chat rather than speak directly.
B
I'm doing my best to synthesize, and I'm getting a lot of pushback from some people that I'm not doing an adequate job, so I thought I would allow that. But we can move on, then. Thank you. Sure.
B
I apologize. I'm going to ask a question about the idea of having a chief privacy officer as an elected official; that person would have, and I'm getting nos, but that's why I'm asking, that person would have under his or her purview training, handling complaints, and issuing guidance on new tech. The reason I said elected is just because that assures some independent accountability, apart from the board. Dan, you shook your head.
F
H
B
Okay, that's a fair comment. What if it was just a civil servant? Is there value in having a single person with that kind of dedicated responsibility?
E
And authority. Yes, Michael. For me, I agree with the proposal, and I agree totally with Daniel's rejoinder. There are lots of chief somethings in government these days: a chief information officer, chief information security officer, chief data officer, chief privacy officer, chief empowerment officer, a chief innovation officer. We've got lots of chiefs.
E
So if you venture into the direction of privacy and engagement, it's not a bureaucratic job, as I think you would agree immediately. It's engaging the community, finding new ways to discuss with the community: what are our metrics for privacy? Where are we comfortable?
B
Okay, I'm getting nods from the other two, so I can move on. Go ahead; somebody was about to speak? No?
B
Mary, do you have anything?
A
I was just looking at some of the questions in the chat about third-party data, yeah.
A
C
We specify what type of data it is we're sharing, and quite honestly, it's not anything that goes out before we review the data and make sure there are protections in place. Most recently, the board approved a template memorandum of understanding under which we can share data with certain universities that want to participate. They sign an agreement, which our attorneys looked at, to make sure that any data we give them is not sensitive, and that the data is protected and returned in the format the county asked for.
C
We've got three universities, the University of Virginia, George Mason University, and American University, participating in that memorandum of understanding. The way it works is they'll come to us and say, look, we've got this idea. It might be a better way of doing fire management, or whatever. They come up with the statement of work that they want to do, and we take that to the department that's going to be sharing its data with them, and they evaluate it, again from the perspective of whether this is data we should be sharing with them.
C
It was a memorandum of understanding with those universities, and then the detail is in the statement of work. For each of these statements of work, and we've had very few of them so far because we just passed this, we do a privacy risk assessment: what data is involved, and are we comfortable with that? And again, that's done by, obviously, the chief data officer and the parties attached to this in the department that's going to be sharing its data with them. Hope that helps.
A
Yeah, and is there a framework now for us to decide what the pieces of data are, and how we treat the different pieces of data? Is there a framework where we look at whether one is more sensitive than another?
C
Well, as I said earlier, what we've done is identify and map where the data is. The next step is to do that taxonomy that says, let's dive down and find out what that data actually is. And Dan brought up the whole question of personally identifiable information; that's a concept that was developed years ago, and it probably needs to be refreshed and looked at. Those are the types of things we'll be looking at going forward.
F
Jack, can I ask you a question? You said you review the data and send it to universities after making sure there's no sensitive information in there. So what is to stop you from just making that public, instead of only a certain group of researchers having access to it? Make it widely available, so you don't have to be, let's say, affiliated with a university, or with a team of lawyers, to see it.
C
We do that as well. We put it into this data portal we have, which we just refreshed, which allows people to go in; if the data is not sensitive and it's publicly available, they can go and look at it and do an analysis on it. Sometimes universities would like bulk data because they're doing a research capstone project, and they relate it to the department they're working with in the county. They may have an interest in seeing further thought on that, and so it involves not only data but the participation of subject matter experts in the department they work with. But most certainly, if we're giving it to a university, we're making it available through our data portal as well.
B
I see that Takis has joined us. Thank you. I can't summarize everything that we've talked about in the last 70 minutes, but there is a common theme that I would like to ask you to address: there is a hunger, at least among the 40 of us in attendance, for more knowledge from the county on what it's doing, on the process by which it has signed and will sign contracts, on consent, on opting out, on national security provisions. What's in place to make things more transparent?
H
And I guess I'm the one being asked, correct? So sorry for joining late; there was a conflict with the independent auditors committee, which, among other things, is also looking into how we contract things, et cetera. I think that we are starting a very long conversation here.
H
You
know
what
exactly
the
the
what
transparency
will
be
up
about,
how
how
best
we
can
assure
citizens
that
that,
first
of
all
that
we
we
collect
responsibly
the
data
we
handle
responsibly
their
data,
we
make
sure
that
there
are
no
breaches
that
make
sure
that
we
are
not
vulnerable
as
much
as
possible
to
cyber
attacks
etc.
H
There
are
many
layers
to
this,
and
I
think
that
the
that
the
commission
has
been
trying
to
think
through
this,
and
you
will
all
understand
that,
one,
that
there
are
parts
that
definitely
our
our
staff
is
responsible
for
the
board
is
looking
forward
to
you
know
to
adapt
policies
to
sufficiently
serve
the
idea
of
of
of
transparency
without,
though
making
it
impossible
for
stuff
to
work
without
making
it
impossible
for
jack.
Belcher's
department
to
to
do
their
work
efficiently
and
correctly
now.
H
I
definitely
subscribe
the
idea
that
we
need
to
elevate
our
our
account,
our
digital
accountability,
and
that
applies
really
to
me
and
to
my
colleagues-
and
I
think
this
is
a
great
discussion
to
have
on
how
we
deal
with
digital
assets
as
part
of
our
governance.
As
we,
you
know,
we
changed
a
lot
of
things
in
the
in
the
in
a
few
years.
We
literally
now
use
cloud
services
all
over
the
place
for
for
us
for
you
for
everybody,
we
are
now
telecommuting,
etc.
H
We
have
new
devices,
we
I
think
I
I
I
stopped
in,
or
I
I
I
I
came
in
right
when
you
were
talking
about
you,
know
innovative
contracts
and
smart
city
applications,
etc.
H
So
this
is
the
start
of
a
conversation
of
how
best
to
you
know,
adapt
our
policies
so
that
the
residents
of
arlington
are
satisfied
that
they
have
enough
transparency,
that
they
have
that
they
can
give
useful
and
appropriate
input
to
that
and
that
your
elected
officials
can
respond
and-
and
you
know
be
at
the
level
of
accountability
that
you
want
now.
This
is
a
very
political
answer.
H
I
know
that
the
the
way
to
that
is
not
trivial
is
is
not
easy,
but
we
are
committed
to
to
improve
as
we
go,
but
one
thing
I
want
to
say
is
and
to
to
be
very
clear
about
that.
H
As
somebody
you
know,
for
myself,
I've
been
critical
for
some
of
you
know
the
for
some
on
some
initiatives,
including
the
clarendon
initiative.
One
thing
I'm
really
assured.
H
I
know
that
our
other
staff,
the
county
managers
office,
understands
the
challenge,
and
I
know
that
they
are
really
responding
to
the
idea
that
privacy
matters
that
we
have
to
improve,
that
we
have
to
think
ahead,
etc.
We
they
understand
that
transparency
matters
as
well,
so
I
am
literally
you
know,
I'm
I'm
committed
to
to
make
progress
on
that.
I
I
grew
up
in
mostly
before
I
came
to
the
united
states,
the
society
that
has
a
very
long.
H
I
was
in
germany
right,
so
it
has
a
very
long
tradition
of
how
this
right
to
informational
self-determination,
which,
by
the
way
it's
a
it's
a
is-
is
a
legal
asset
defined
by
their
supreme
court.
So
how
this
was
implemented,
it
took
literally
decades
to
get
there,
but
they
got
so.
I
think
that
we
are
in
the
same
same
boat
right
now,
and
I
think
we
will
see
good
results
out
of
that.
B
A
Right, and we did have a couple of questions. One for you, Yo, about us moving into the future; obviously it is a journey and we've got a lot of steps to take. What surprised you most as you were engaging in building that digital trust you've talked about? What advice do you have for us? Because obviously we are behind you in that journey, so we'd love to hear what advice you have.
G
Oh, you know, I was thinking as Jack was speaking, too: I feel so much that we in Boston are constantly learning and growing in this space, and I think we're probably in a lot of similar places on this. I think, and Takis just mentioned this too, that moving towards transparency for transparency's own sake is, can I say, a huge charge of energy, because it can create more and more community support for initiatives. I'm thinking, as I talk about this, about all the places in Boston where it feels like we're sometimes growing less transparent; but the more we partner with different community organizations, and the more work we've done with the Massachusetts ACLU, which has been a real leader in this space, the more it feels like there are new pathways to transparency. We're trying to do that in the surveillance ordinance and other places, and the next step is to explore consent more. But I think it's always re-evaluating: as Dan said earlier, the technology is changing, and we're trying to stay on top of it and figure it out. So just opening up the conversation is one of the biggest things. And I would just say, going back to what Jack and Takis said, we're always trying to figure out how we can be more thoughtful and ethical in our processes without gumming up the work so much that government can't function. That's a really important part too, right? Public services need to happen; otherwise people are really going to suffer. It's a conversation we've had with other cities too, and it would be good to be continually in conversation on this: what sacrifices are we willing to make, in terms of slowing things down, in order to make sure that we're getting it right, and how can we open up the conversation with the community?
A
Thanks
joe
thanks
and
dan,
I
have
a
final
question
for
you.
A
It
looks
like
in
this
journey
it's
really
important
for
individuals
to
really
look
for
reliable
information
right
and
remain
informed,
particularly
since
technology
is
moving
so
fast,
and
you
mentioned
one
of
the
one
of
the
tools
and
I'd
love
for
you
to
to
say
that
again,
but
what's
your
best
advice
about
how
people
stay
on
top
of
what's
impacting
their
daily
lives,
because
clearly
in
and
as
I've
looked
at
the
chat,
you
know
the
the
pace
at
which
municipalities
may
go
will
be
slower
than
than
the
expectations
of
the
residents.
F
You
won't
like
my
answer.
It's
really.
I
don't
know
I
mean
if
you
come
up
with
the
answer
to
that.
I
I'd
love
to
hear
it.
Everything
is
changing
so
fast,
there's.
So
many
more
organizations
right.
You
have
your
cable
provider.
You
have
your
city,
your
county,
everything
it's
so
hard
to
keep
track
of
everything
right.
So
all
these
different
businesses
that
you're
involved
with
so
I
I
don't
know
it's
really
a
matter
of
education
and
yeah.
F
It's
really
a
community
effort
to
educate,
you
know
civics
and
I
guess
maybe
regulations
so
that
let's
say
privacy
policies
are
actually
easy
to
read
and
they're,
not
don't
take
up
pages
right
who
actually
reads
them
yeah.
So
it's
a
mess.
I
don't
think
there's
any
easy
solution
to
this.
F
Yeah,
well
I
mean
one
thing:
is
you
can
follow
some
of
the
major
companies
that
actually
do
have
privacy
research
teams
like
what?
What
is
google
doing?
What
is
apple,
doing
apple,
put
out
a
press
release
saying
that
they've
adopted
a
while
ago,
differential
privacy
and
here's
what
it
allows
them
to
collect
and
here's
how
it
impacts
you.
F
A
A
All right, well, we are almost at the end, and I'll turn it over to the vice chair, Frank Yazzo.

E
Mary, can I make a couple of quick comments?

A
Go ahead.

E
Thank you. Just two comments, especially because Takis has joined us. I wanted to enrich a little bit the description that we heard about what tonight was. I'm a little bit more of an optimist: I thought there was a great debate, and I thought I heard Arlingtonians speak about an open way to both explore and, for sure, process what's going on.
E
What's
going
on,
there
are
some
very
strong
voices
in
the
group
that
I
heard
that
have
some
very
particular
concerns,
and
I
would
make
sure
that
I
address
those
concerns
head
on,
but
I
think
in
general,
at
least
from
the
expert
panel
you
gathered,
the
notion
is
innovation
is
good.
Trying
something
new
is
good.
You
have
to
have
some
strict
framework
around
it.
E
The
second
thing
that
I
wanted
to
say
is
that,
if
you
think
government
has
problems
in
terms
of
what
to
do
with
the
data
look
at
industry
that
you
have
10
to
50
times
the
amount
of
processing
of
data,
so
us
in
government
we're
very
small
we're
a
small
fry,
and
this
is
why
I
think
daniel's
suggestion
of
looking
at
the
big
boys
might
be
a
useful
one.
I
don't
want
to
forget
that
schools
is
where
it
starts.
E
You
have
to
look
at
k-12
curricula
that
take
on
privacy
and
security
for
the
kids,
because
you
know
I
know
it's
not
for
tomorrow,
but
unless
we
start
now,
it'll
never
happen
and
then
finally,
I
I
just
wanted
to
re-emphasize
the
word.
Not
you
should
not
be
looking
for
an
answer.
You
should
be
setting
down
a
process.
If
this
happens.
What
should
we
do,
since
these
voices
continue
to
speak?
How
do
we
deal
with
them?
You
need
processes
more
than
you
need
strategies
or
written
laws,
and
I
think
for
tiki's
sake.
E
I
I
absolutely
endorse
the
idea
that
elected
officials
need
to
first
of
all
understand
more
deeply
the
impact
of
laws
that
are
being
discussed,
and
I
wouldn't
rush
into
a
privacy
law
right
now,
because
there's
so
many
things
that
are
changing
just
try
to
find
procedures
that
will
help
you
along
the
way.
Thank
you
very
much.
I
didn't
mean
to
interrupt.
A
All right, Frank will wrap us up, and we really do appreciate everyone coming.
J
Very
good
discussion
tonight
and
want
to
thank
tacos
carantonas
for
joining
us
tonight,
our
board
liaison
to
the
information
technology
advisory
commission.
I
also
want
to
thank
angela
easterwood
from
the
county
government
she's,
the
wizard
behind
the
screens,
keeping
this
whole
whole
operation
going.
So
we
want
to
thank
angela
again.
I
also
want
to
thank
all
of
our
speakers
tonight,
jack
belcher,
who,
with
the
county
dr
costas
torregas,
dr
dan
keifer
and
yo
deshpande,
for
helping
educate
us
all
on
this
very
important
issue.
Tonight.
J
It
was
a
joint
effort
of
both
epac
and
the
tech
commission
and
finally,
we
want
to
thank
the
members
of
the
of
both
e-pac
and
itaf
or
tech
commission
who
joined
in
in
this
effort.
And
finally,
I
want
to
thank
the
members
of
the
public
who
joined
us
tonight.
This
is
the
beginning
of
an
important
conversation
and
we
hope
to
organize
future
programs
on
this
topic,
as
we
continue
the
journey
into
arlington's
digital
future.
So
thanks
again,
stay
stay
tuned.
J
A
C
H
Last question before I leave: I'm now going to look for the recorded part of all this, so where do I find it?