Description
Facial recognition and other biometric technologies are rapidly proliferating in consumer and government applications, presenting new and challenging privacy issues. State legislatures are responding with calls for regulation, including moratoria on the use of facial recognition. This session will discuss the many ways biometrics are being deployed and proposals for regulating them.
A
Okay, let's go ahead and get started. Welcome, everybody; hello and thank you for joining us for today's webinar on facial recognition, location data and biometrics. My name is Abby Gruwell. I work in NCSL's D.C. office on federal privacy and technology issues. This session is part of Privacy Week, the first event of the new NCSL privacy working group, which is a subcommittee of NCSL's Cybersecurity Task Force.

A
The working group was recently formed to examine policy issues related to consumer data privacy, algorithms and artificial intelligence, government data usage, transparency and other issues, as well as to consider the intersection between data privacy and cybersecurity. We are grateful to our privacy working group sponsors for their support of these activities. Over the next 60 minutes we encourage your participation, so please feel free to type your questions into the chat box.
A
This session will discuss the many ways biometrics are being deployed and proposals for regulating them. We are pleased to have with us today two distinguished speakers: Clare Garvie, from the Center on Privacy and Technology at Georgetown University Law Center, and Chris Koopman, from the Center for Growth and Opportunity at Utah State University.

A
We will hear first from Clare Garvie. Clare is a senior associate with the Center on Privacy and Technology at Georgetown Law. She was the lead author of "The Perpetual Line-Up: Unregulated Police Face Recognition in America" in 2016 and of two follow-up reports in 2019. She testified before the House Oversight Committee about police use of facial recognition. Her commentary has appeared in The New York Times, The Washington Post and The Wall Street Journal, and she serves as an expert resource to both Democrats and Republicans in Congress and in state legislatures. She received her JD from Georgetown Law and her BA from Barnard College. Clare, I'll turn it over to you.
B
Great, thanks so much. I'm very much looking forward to the questions, so I'm going to keep this pretty short. Basically, I want to give a very brief overview of the state of play of face recognition use in law enforcement, which is the area that I focus on in my research and, I think, probably the area that's most focused on in state legislatures today. Face recognition use by law enforcement was already incredibly prevalent back in 2016.
B
In addition, thanks to ever-increasing access to driver's license databases, over half of all American adults are in a face recognition database that's used for criminal investigations. And, the recent efforts over the last year and a half or so by a number of state and local jurisdictions notwithstanding, it is still a largely unregulated field.
B
This field is governed by policy, sometimes; other agencies don't even have policies governing how and when they can use this technology, and transparency is still a huge problem when it comes to understanding the scope and the types of uses that we see this technology in. There are three things that I want to just quickly focus on, in terms of the concerns that face recognition raises.
B
We have not seen real use cases in the United States of face recognition being used to track individuals, but we have seen face recognition used on public protests, and there is a growing concern that it is in fact chilling people's willingness and ability to participate in particularly contentious or law-enforcement-focused protests.
B
Photo databases can essentially be converted to biometric templates very quickly, and most, if not all, of us have face photographs on file with one or more government agencies; think passports, visas, driver's licenses, mug shots, etc. The other issue is legacy camera systems. CCTV systems can, to a greater or lesser degree, be mobilized to facilitate biometric identification, either by connecting face recognition directly to live video streams or by using static, one-off images from CCTV camera feeds to conduct that identification.
B
Its use as a forensic identification technique is somewhat akin to latent fingerprint identification: a latent face print, if you will, is left at the scene or on a camera and can be run against the face database on file to generate a possible match. This raises some questions about misidentification.
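[Editor's illustration] The matching step described here — a probe "face print" run against an enrolled database to generate ranked possible matches — can be sketched as a nearest-neighbor search over feature vectors. This is a toy sketch, not any agency's or vendor's actual system: the vectors, names and threshold are invented stand-ins for real face embeddings produced by a recognition model.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def rank_candidates(probe, gallery, threshold=0.9):
    """Run a probe vector against an enrolled gallery and return the
    candidates above a similarity threshold, best match first. A real
    system would use embeddings from a face-recognition model; these
    short toy vectors merely stand in for them."""
    scored = [(name, cosine(probe, vec)) for name, vec in gallery.items()]
    scored.sort(key=lambda t: t[1], reverse=True)
    return [(name, s) for name, s in scored if s >= threshold]

# Toy gallery of enrolled templates (e.g. driver's-license photos).
gallery = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
    "person_c": [0.4, 0.4, 0.4],
}
probe = [0.85, 0.15, 0.35]  # latent "face print" from a scene camera

print(rank_candidates(probe, gallery))
```

Note that the system only returns candidates above a tunable threshold; that threshold is exactly where the misidentification risk mentioned here lives, since a lenient setting returns more, and weaker, candidate matches.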
B
There is a lot of variability in how these systems are run, and a lot of this variability leads to a risk of misidentification as it stands now.
B
In terms of legislative approaches, over the last two years we've seen a lot of efforts, predominantly at the state and local level but also a few proposals at the federal level, to rein in the use of the technology, either through bans, moratoria or regulations. One of the resources you should have available to you is a taxonomy of the legislative approaches that we've seen in the U.S. to date, written by me and a colleague of mine, Jameson Spivack. We identified four different approaches so far.
B
All-out bans: basically saying face recognition cannot be used, either at all in a given city, or by law enforcement, or by city officials, that type of thing. Time-bound moratoria: basically saying we're going to press pause on the use of this technology for a certain period of time, after which it will either continue as it did beforehand or, during that interim period, we will have set rules in place to constrain or regulate the technology.
B
Directive moratoria, which basically say face recognition cannot be used unless certain conditions are put in place, such as regulation or audits, that type of thing. This sounds a lot like a ban because, let's be honest, a ban is a time-bound or directive moratorium that can be repealed at any point in time.
B
It would take affirmative action on the part of the legislature. And then, finally, the most common approach is regulation, anywhere on the spectrum from very light-touch regulation, maybe just requiring transparency and reports to Congress, to more heavy-handed regulation requiring, let's say, a court order prior to the use of the technology. This is a very, very broad-strokes outline of the available approaches within regulation.
B
The reason for this is that face recognition is very in vogue right now. Gait recognition might be a thing that we have in five years, or face recognition may morph, with really high-quality cameras, into iris detection. So the narrower and more focused we are on the particular technology, the less future-proof the law is and, I think, the less it actually gets at the real concerns, which are the impacts on rights and liberties.
A
Thank you so much, Clare. Up next we have Chris Koopman. Chris is the executive director at the Center for Growth and Opportunity at Utah State University. He specializes in regulation, competition and innovation. His research and commentary have appeared in The Wall Street Journal, The New York Times, The Washington Post, USA Today, Bloomberg and NPR.

A
He is also a contributor at The Hill and was named to the Forbes 30 Under 30 for 2016 for law and policy. Prior to joining the Center for Growth and Opportunity, he was a senior research fellow and director of the technology policy program at the Mercatus Center at George Mason University. He is currently a senior affiliated scholar with the Mercatus Center and a member of the IT and emerging technology working group at the Federalist Society's Regulatory Transparency Project. I'll turn it over to you, Chris.
C
Awesome, thank you so much, Abby, and thank you, Clare, for outlining a lot of the concerns that I think many of us have around the law enforcement use of biometric data and facial recognition technology. I'll go through three parts here. One of the things I want to highlight, following up on some of the law enforcement concerns that Clare raised, is to dive in a little bit into the current state of the law.
C
What do the laws look like in the states that have passed them? Currently there are only about a half dozen laws passed in the United States that regulate biometric information. And then I'll jump in and talk a little bit about the impact that these laws are having.
C
But I'm also going to bring up some commercial applications as well, and I want to really bring to life something Clare had mentioned, and that is the rights-violating abuses and risks that can come along with the misuse of biometric data. I want to talk specifically about a young man named Michael Usry. Michael was, and is, a filmmaker in Louisiana, and somehow got himself tied up in a nearly 25-year-old cold case: a murder in Idaho Falls, Idaho. This happened because his father had uploaded his biometric data, in the form of his DNA, to Ancestry.com; not technically Ancestry.com, but to another project that was later purchased by Ancestry.com. When law enforcement realized that they had wrongly convicted someone of the murder some 20 years earlier, they reopened the case, and there was increased attention on these familial searches: using DNA to find the killer or the rapist or the criminal based on familial DNA. You map the family until you find the suspect, and that's exactly what happened in this case. Michael Usry Jr. was identified because his father uploaded the data. They mapped five generations of the Usry family and then found Michael Usry Jr., who had actually been in Idaho Falls at some point in his life, and whose travels had lined up to a degree with the events that had transpired some 20-odd years earlier. Long story short:
C
They ended up finding the correct suspect later, but Michael Usry had to spend months of his life defending himself and protecting his name and his freedom because of sloppy police work on the part of Idaho investigators as they were trying to tie up a cold case. That is a tragedy; using this type of data in that way is a tragedy, and we should limit it. But, for all of us on the call:
C
We also need to differentiate the law enforcement misuse of this technology from the promising commercial applications. I think that many of the laws in the states, as written, focus almost exclusively on the commercial applications and do not really limit the law enforcement applications in many respects. The Biometric Information Privacy Act in Illinois, which was the first one passed and has been a model for the half dozen states that have since passed these laws, does exactly that: it limits the commercial application and does nothing about the types of rights-violating uses that Clare has mentioned.

C
An interesting thing, just to dive into Illinois: Illinois, Texas, Washington, New York, Arkansas and California, through the CCPA, all have biometric laws on the books now, but Illinois has been the model for the most part. It was passed in 2008, and there's something interesting that Clare mentioned that is worth noting about the broad scope these laws take: they're regulating technology that hasn't even come into existence yet.

C
Clare mentioned that maybe in five years we'll have gait recognition technology in broad use. Illinois included that in its biometric privacy law passed 12 years ago, so they were regulating technology one to almost two decades before that technology is even being used. To give you context for Illinois's timing on this: Illinois passed the Biometric Information Privacy Act in 2008, about eight months after the first iPhone came out. So think of how far technology has come.
C
The fines, the fees, the requirements, the requiring of clear and written policies and advance disclosure and written releases: all of this has remained the same, but our technology has changed, and that's a really important point for people on the call to take into account. So why should the law take into account changes in technology and ensure that we aren't stifling these extremely promising technologies from coming to fruition? Let's dive into some of the technology here.
C
How are biometrics and facial recognition being used commercially? Think of Microsoft's Seeing AI as an example: Microsoft is literally giving sight to the blind through facial recognition technology. There are auto manufacturers who are developing biometric sensors to identify drowsy and distracted drivers, and these can also detect mood; they can detect scowls on your face.
C
They can see if you're getting increasingly unhappy sitting in traffic, and they can do things like change the music, adjust the lighting or avoid annoying bells, to maybe not send someone into a road rage incident. And then, finally, there is Aibo from Sony, which is a robot pet that can provide companionship without care. Why is this important? There are actually studies that show the benefits of this.
C
All three of these technologies cannot be deployed in Illinois, and are not being deployed in Illinois. Think about that: Sony does not sell its robot dog, Aibo, in Illinois, because the law requires that advance disclosure be given, and written releases collected, from every single person whose biometric data is collected and processed.
C
So let's say you have a person who is in dementia care and has a robot dog. You'd have to have an advance disclosure or written release from every person with whom that dog interacts. Every person that walks into that person's home, their apartment, whatever, would need to have signed a written release that says, "I allow myself to be processed by Sony's technology," or it's a violation. And this isn't just hypothetical; this is really what's happening. Earlier this year, Facebook settled a lawsuit for 650 million dollars.
C
The claim was actually in the billions; they settled for 650 million dollars based on the facial recognition technology within their app. Basically, a class-action lawsuit transpired in which users said that they didn't actively consent, were not given advance disclosure, and did not submit written releases for Facebook to process their biometric data. And this is just the facial recognition technology in the app that says, "Hey, here's this picture that you just uploaded." That feature is now an opt-out for everyone in Illinois, and actually the company is working on geofencing the state, in the same way that many other apps have been geofenced where they're illegal: if you're driving through Illinois, they're actually turning features off when you enter the state. The same thing is going on with Apple and its Photos app. And this has even gone so far that time clock vendors in Illinois and in Texas have been sued based on the thumbprint time clocks that they've sold to employers.
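[Editor's illustration] The state-by-state geofencing described here can be sketched roughly as follows. This is a toy sketch, not any vendor's actual implementation: the bounding box only crudely approximates Illinois (real products test against precise state-boundary polygons), and the feature names are invented.

```python
# Toy sketch of geofencing a feature off inside one state.
# The bounding box below only roughly approximates Illinois; a real
# implementation would test against actual state-boundary polygons.
ILLINOIS_BBOX = {"lat": (36.97, 42.51), "lon": (-91.51, -87.02)}

def in_illinois(lat, lon):
    # True if the coordinate falls inside the crude bounding box.
    lat_lo, lat_hi = ILLINOIS_BBOX["lat"]
    lon_lo, lon_hi = ILLINOIS_BBOX["lon"]
    return lat_lo <= lat <= lat_hi and lon_lo <= lon <= lon_hi

def enabled_features(lat, lon):
    # Hypothetical feature flags: the biometric feature is switched
    # off when the device's location falls inside the geofenced state.
    features = {"photo_tagging": True, "face_grouping": True}
    if in_illinois(lat, lon):
        features["face_grouping"] = False
    return features

print(enabled_features(41.88, -87.63))  # Chicago: face grouping off
print(enabled_features(39.77, -86.16))  # Indianapolis: all features on
```

The design choice the transcript describes is exactly this: rather than obtaining BIPA-compliant written releases, the vendor simply disables the regulated feature whenever the device reports a location inside the state.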
C
The classic punch clock takes a card into the clock, and it times you in and out. The technology now exists to use your thumbprint instead, so you can sign in and out, and your employer can ensure that no one else is punching you in and out, that it is in fact you punching in and out.
C
These companies are now being sued because they provided the technology; the vendor, not the person deploying the technology, but the vendor. These are broad, sweeping regulations and limitations that are effectively prohibiting the advancement of this technology. And the second part, and I think this plays off of what Clare was talking about, is that in many states there is no clear guidance, regulation or restriction for the law enforcement use of this technology.
C
So in instances where you've banned the commercial use and let the law enforcement use go unfettered, you've created what economists would call a monopsony. We all hear these days in technology about monopolies, but on the other side there's something called a monopsony, which is a market with a single buyer who has the power to shape the market based on what they're purchasing. In some instances, this is exactly what these laws are doing, unintentionally. The legislation was passed for fear of the abuse, but it has created a situation in which the only institutions that can use the technology are law enforcement institutions. So you will only see law enforcement applications of this technology. And that's not to say that there aren't rights-enhancing uses of this technology as well.
C
Think of the Innocence Project, which for a very long time has been using almost exclusively biometric data, in the form of DNA evidence, to clear the wrongly convicted. And I would also say that even at the border we've seen instances: DHS, earlier this month, just announced regulations to update its own biometric rules.
C
Those would limit the need for family separation. I think we've heard a lot about family separation at the border, which happens because they cannot identify whether an adult bringing a child over the southern border is in fact the parent, or a family member, of that child. DHS right now, under its regulations, is limited in the way that it can collect biometric data from minors at the border, and so it almost creates this situation in which the only option DHS seems to think it has is family separation. And yet, with just one instance of collecting biometrics, you could quickly identify parents and children, or at least familial connections, which would allow you to protect the rights of those parents, keep nuclear families together and really enhance the rights of people at our southern border.
C
I would say, for those of you in the room, there are just three things to consider when contemplating these policies: identify the nature of the problem very, very clearly; narrowly tailor your regulations or your legislation to fit that problem; and then be willing to admit when the law doesn't work. When the world changes, the technology changes, or there are unintended consequences, be willing to very quickly and humbly change those things, to continue to balance the really promising technologies that we see with the fears that we do have, legitimate fears, about the rights-violating uses they may have for individuals and communities.

A
Right, thank you very much, Chris. We appreciate your presentation, and we might need to come back to robot pets, because that sounds adorable, but we're going to move into the question-and-answer part of our program. Today we have some questions prepared, but please enter your questions into the chat box.
B
I'm happy to start on that. I mean, we're seeing, with Facebook again, and I highlighted this idea, that this is the first sort of instance of biometric surveillance, and it raises a lot of opportunity and risks on the part of various users, both commercial and law enforcement. And Chris and I might have a little bit of disagreement, or maybe a difference in optimism, on the commercial applications of face recognition.
B
I focus a lot on its use to detect known or suspected shoplifters, for example, where there's actually no process by which you can challenge being kicked out of a store on the basis of being enrolled, or misidentified as being enrolled, in a face recognition watch list. Or other applications of the technology, like the use by Clearview AI of essentially scraping the internet and enrolling three billion images into a face recognition system that was initially available to places like Macy's and JPMorgan and is now available only to law enforcement. To Chris's point, we're narrowing the field based on this risk, or this perceived risk, on the commercial side as opposed to the law enforcement side.
B
When I look at face recognition, the few areas for growth are, basically, first, a scale concept: however many cameras there are, we could just see an increase in application there. I also think we're going to see new modalities of biometrics come online. Voice recognition is probably the next one, based on the prevalence of legacy databases from departments of corrections in various places that record and tag the phone calls that inmates receive, based on the identity of the caller and then the inmate. With gait recognition there aren't the legacy databases, so there are a few more hurdles for widespread iris, gait or, let's say, ear pattern analysis to come on deck. Another related field, and something that Chris touched on, is this idea of sentiment analysis or face analysis: trying to determine somebody's emotional state based on their facial expression.
B
There are a number of challenges with this, and the science is pretty shaky at this point on whether our faces accurately portray our emotional state, such as anger, surprise, happiness or intent to deceive.
B
There is a company out there called Faception, which purports to be able to identify the propensity of somebody to commit a certain type of crime based on the structure of their face. The CEO has acknowledged that there's no science behind that particular application. That has not stopped the technology from being sold and deployed in at least two countries in the world.
B
So I do think emotion detection and sentiment analysis are incredibly appealing, and we're going to see a push towards that. Think of the idea of remote lie detection: the TSA is interested in frictionless security, and they would have to build some sort of lie detection, sentiment analysis or emotion analysis into that; they've talked at length about it. And then, yeah, I think I'll leave it there.
B
It's often the case that deployment of the technology happens before we've actually thought through the science: for example, the forensic science behind the use of face recognition in criminal investigations, or the science behind being able to detect someone's propensity to be a pedophile or a terrorist based on the structure of their face. The science isn't there yet, but the technology is. So we do see that very strange, pretty concerning pattern happen in the biometrics field, and I think we'll continue to see it.
C
It's good for this conversation to focus on that, and I'll work backwards. I think what you're talking about, in terms of the science following deployment, the deploy, then learn, then iterate, that's the process of innovation. And I think it's one thing if my car thinks that I'm frustrated and changes the mood lighting and plays Kenny G, and I don't like it, and then I take it to the dealer and say, "Turn it off," versus using that same technology to try to identify someone who may or may not be lying, when they might just have a, you know...
C
It would be like trying to determine whether I'm unsure or not while I'm talking, and then saying, "Well, he's lying because he looks unsure," or, "A quick twitch in his left eye means he lied," and the computer says so. It's one thing when it's used in a commercial context, where I'm able to have a relationship with that company or choose not to use that company.
C
It's another thing when it's used in a law enforcement context, where I have no choice of provider of law enforcement, and I do think that is a really, really important differentiation to make in this space. For example, my Google Photos is convinced that my six-month-old and I are the same person when I'm cleanly shaven, and it's not wrong; we look somewhat alike. It's funny, and you just correct it.
C
You can tell it, and it does something different. To me, that is how innovation works. That's how this process should work; that's how we improve our technology. And it's also why government should be a last mover, not a first mover, in this space. You can have bad facial recognition that doesn't hurt anyone, and in fact I would even say the shoplifting example that you give, Clare, is a clear one: if there are particular people, or even, let's say, communities, that are disproportionately affected by technology that says they are shoplifters, or that confuses them with one another, it's very easy for those people to exercise voice and to tell the grocery store, the gas station, any of these others, "We are very unhappy with the way that you are deploying this technology in our community." They do have some say in this.
C
You see commercial boycotts work all the time in shaping how technology is used by corporations, and I would say that that isn't really a rights-violating use of the technology. It's bad business, but it's not putting someone in jail or threatening their life and their liberty. I would say keep those things, the commercial and the government use, very separate, and let's focus on building the protections and the barriers around the law enforcement and government use of this technology.
C
I mean, Illinois focuses solely on commercial use. You have states like Maryland that over the past several years have proposed bills that would focus exclusively on the law enforcement use. I would encourage you to look at the proposals in states like Maryland, where you've had bills die in session that do provide those protections against rights-violating uses: in instances where this technology may in fact put someone's life or liberty in jeopardy, there are clear use limitations and time limitations; you cannot just scan everyone all the time.
C
Those are the things that should be looked at in terms of law enforcement uses. But I would say that conflating commercial and government uses of this technology does what I said before: it limits the commercial use, and, as you say, Clare, you need to use the technology to make it better.
C
Limiting the use of the technology puts us in a stagnation where it won't get better; exactly what we have now is what we will have forever, and we can't use the technology in the way that DNA has been used to actually get people their liberty back when they've been wrongly convicted or accused.
B
Just a quick response, and it ties in the comment from Joseph Jerome about how commercial boycotts would work in light of how little transparency there is around face recognition. I think that is one of the problems: the idea that there is agency holds only to the extent that individuals know that they're being kicked out of the store because of a face recognition misidentification, and that's not necessarily the case. There aren't requirements of transparency. Maybe a sign says, "Smile, you're on camera," but you might not know why.
B
If there's a security guard shadowing you around, you may just assume that's because you're a young Black person in this country, and not because you've been misidentified as a potential shoplifter. The other point I would make: I wholeheartedly agree that law enforcement and commercial applications are very distinct and that there is every reason to treat them differently. I do think it's not always the case that a commercial use of the technology is consumer-facing.
B
One of the areas where we see sentiment analysis come into play right now is hiring applications. Maybe I go through a first round of screening for hiring, and the algorithm identifies that I raised my eyebrows a lot, and so it thinks that I am not confident, or that I am a psychopath; who knows, I don't know what raising your eyebrows means, but it flags me. I may, again, not know that that was used, but I also may not have any recourse.
B
So I think it's not quite black and white on that, but fair point. It is interesting that we've seen this focus on commercial uses and less on law enforcement, even though the law enforcement applications are incredibly prevalent and directly speak to our constitutional rights.
C
And I would just add to that: you could come up with two hypotheticals in a 7-Eleven; let's just use a convenience store. Someone walks in, and the mode of identification shouldn't necessarily matter. The person walks in, and the camera says, "This looks like someone that has shoplifted in here before," and tells the clerk, and the clerk says, "Oh my gosh, I recognize him, get him out," versus the clerk saying, "You look like someone that walked in here before; I'm kicking you out." In both instances, the mode of identification is essentially the same: I looked at your face. The former, I would say, has the tendency to probably be more accurate. You and I both know, and most people on the call would probably know, that human memory is really, really bad; identifying the faces of strangers is bad.
C
There's tons of evidence on this, in terms of not just constructed memories but completely bad memories. In some ways this could limit those instances, which are already happening, where individuals are kicking people out of their stores, saying, "Don't come in here. I know you; I know your friends; get out of here," versus an instance of a person stopping and thinking before they make the decision, because the person didn't get flagged by the facial recognition technology.
C
So again, I think there are things to balance here. And again, if a store clerk is kicking members of the local community out of the neighborhood store, whether he used a computer or his own eyeballs, there's going to be, I think, a very clear backlash in the community with regard to the decision being made by that clerk, be it by the owner, the user, whoever. I think most people in the community would be frustrated, and that's what I mean when I say commercial boycotts.
C
Even if there's no transparency about how the decision was made, everybody would know the decision was made: you know, Jimmy was kicked out of the neighborhood store, and now we're all mad about it. Whether it was technology-based or human-based probably shouldn't matter.
A
All right, well, we have a question from the audience: do you have thoughts on the technology known as deepfakes, which gives the ability to modify video or faces to put words in someone's mouth? How should legislators think about limiting its misuse, given that it's currently used often for fun?
C
I can do a bit on that. I do not know that there is a clear reason why something like a deepfake needs to be legislated. I think that there are plenty of torts that would cover how one person may use a deepfake to affect the personal or commercial standing of another person, and that could be remedied very easily in the courts.
C
I think this is a perfect example of how legislators should stop, wait, and evaluate the situation before jumping in, because you may end up regulating away something that didn't really need it while opening the doors for something else. A perfect example of this is BIPA, the Biometric Information Privacy Act in Illinois: over a 24-month span, from 2018 to 2019, you had hundreds of cases.
C
It was something like 200 cases filed in Illinois just on claimed violations of the BIPA act. Because it was written ten years prior and the technology has dramatically changed, there are now all these applications of the law to uses of technology that were never intended or thought through, and that has created all kinds of trouble in the courts. So that would be my take on deepfakes.
B
This is honestly not my area of focus, so I don't want to speak too much to it, but I do know from my work with defense attorneys that one of the concerns the idea of deepfakes raises is in the court system.
B
Video-based evidence is considered sort of the gold standard, pretty much unimpeachable. Deepfakes raise a question about whether we can continue to assume that, or whether video evidence introduced in civil and criminal cases may start being challenged on the basis of deepfakes. I'm not saying it may not need a legislative solution; it's just going to need something very careful and very intentional.
A
So one of the interesting things about biometric technology is that it can be used in a lot of different ways. Can you talk about how legislators should think differently about aggregate or population-level data collection versus targeted collection that goes after one specific person? Are there differences in how they should consider policymaking?
B
I'll take a stab at this from a law enforcement perspective. I think we're talking about the collection of what would be called probe photos or probe video: the image of the unknown individual that law enforcement is seeking to identify, where the existence of the database itself is already established.
B
So the distinction I would make is that with individual collection, when you're looking at a specific individual, you can establish reasonable suspicion or a nexus between that individual and the crime you're investigating. Right now a few jurisdictions, though very few, have restrictions around who can be searched: a law enforcement official cannot just search his ex-girlfriend's new boyfriend but, hopefully, needs a criminal nexus, and he might actually be limited to searching for the perpetrator of a crime, maybe the victim, and possibly a witness as well.
B
Although some jurisdictions have limited the witness component, that individuality and nexus to the crime cannot be established with mass collection; with, let's say, real-time face surveillance off of video feeds, you don't have that same nexus. You can narrow down who you're searching for, and narrow down the repercussions of misidentifications, by specifying who's on the watch list and who is able to be identified, but the risk of misidentification, and of adverse action taken against individuals on the basis of a missed ID, becomes pretty high.
B
There's a high risk here because mass biometric surveillance is going to be the least accurate form of face recognition. The cameras are not well placed: they're usually above our heads, we're not facing them, and it's not intentional collection, so there are low-quality image inputs and so on. That's all going to degrade the accuracy of these systems, and the consequences of a very high-risk watch list are severe; let's say you're looking for people with outstanding arrest warrants for violent felonies.
B
The risk here is a missed ID, which is increasingly likely because of that variability and inaccuracy, with the consequence of somebody maybe being forcibly arrested or held at gunpoint on the basis of that missed ID.
B
And then mass collection quite simply enables, or will lead to, chilling effects that individual collection may not. So when considering which controls to put in place, I would be very, very skeptical, from both an accuracy and a rights-intrusion perspective, of mass collection, more so than individual collection.
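A quick way to see why mass screening carries more misidentification risk than individualized search is the base-rate effect: when true matches are rare, even a fairly accurate matcher produces mostly false alarms. The sketch below is illustrative only; the crowd size, watch-list count, and error rates are assumptions, not figures from this discussion.

```python
def expected_alerts(crowd_size, watchlist_hits, tpr, fpr):
    """Expected (true, false) alerts for one day of watch-list screening.

    crowd_size     -- people passing the camera
    watchlist_hits -- how many of them really are on the watch list
    tpr            -- the matcher's true positive rate
    fpr            -- the matcher's false positive rate
    """
    true_alerts = watchlist_hits * tpr
    false_alerts = (crowd_size - watchlist_hits) * fpr
    return true_alerts, false_alerts

# 10,000 passers-by, 5 of whom are genuinely wanted; a matcher that
# catches 90% of true matches and misfires on 1% of everyone else:
tp, fp = expected_alerts(10_000, 5, 0.90, 0.01)
print(tp, fp)  # about 4.5 true alerts against roughly 100 false alerts
```

Under these assumed numbers the overwhelming majority of alerts would be misidentifications, which is the accuracy concern behind treating mass collection differently from individual search.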
C
Yeah, I would add a couple of things to this. Some of this has to do with thinking through how warrants are issued in the United States. Right now, law enforcement can get a warrant that goes to Google and says: give us all of the people that you have Google Maps information on who were in this particular area at this particular time. I would say that is a strange mass collection of information.
C
Another instance is trolling photo databases. One of the states pulled into this was my home state, Utah, in the way that it was allowing out-of-state and even federal agencies such as ICE to use the driver's license database: they have a picture of a person they're looking for, and the state was essentially letting law enforcement agencies, out-of-state or federal, have complete access to the database and run facial recognition algorithms against it.
C
I also think there's an important piece here that we need to talk about, especially in this instance: the response is not a perfect match. Very rarely are you getting a hundred percent perfect match; you're getting a result that says, basically, we know within a certain probability that this may or may not be a match. And what you have found in law enforcement instances is that they're actually going with very low probability matches, so you get several candidates.
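That point, that a recognition system returns ranked similarity scores rather than a yes-or-no identification, can be sketched in a few lines. The candidate names and scores below are made-up illustrative values; the reporting threshold is the operator's policy choice.

```python
def candidate_matches(scores, threshold):
    """Return (candidate, score) pairs at or above the threshold,
    strongest first. `scores` maps candidate id -> similarity in [0, 1]."""
    hits = [(name, s) for name, s in scores.items() if s >= threshold]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

gallery = {"cand_a": 0.62, "cand_b": 0.41, "cand_c": 0.87, "cand_d": 0.55}

# A strict threshold yields one strong candidate:
print(candidate_matches(gallery, 0.8))  # [('cand_c', 0.87)]

# A permissive threshold returns several weak candidates, which is the
# "going with very low probability matches" concern described above:
print(candidate_matches(gallery, 0.4))
```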
C
From a regulatory point of view, in terms of this mass versus individualized surveillance, if you want to call it that, anything that could threaten someone's life or liberty needs to be taken very, very seriously. I think that oftentimes law enforcement is playing a little fast and loose with these technologies, and it is having a general negative effect, not just on the individual people, but on our ability to effectively use these technologies in the future.
A
We have another audience question. She says: all I hear from industry is that we should do nothing to address this and let them continue to use it, and at the same time, some privacy advocates want a complete ban. Is there any organization working on a middle ground? Are there any advocacy groups willing to have this conversation, or are we simply in a place where we can't find any common ground? Not an easy question.
C
I think Claire and I have common ground; I don't think we're really disagreeing that much. I don't want to paraphrase her point, because she can speak for herself, but it seems to me, at least in this conversation, that we agree there should be strong protections when it comes to law enforcement and rights-violating uses of this technology.
C
To me that seems to be very common ground. I don't think there's anyone out there who would say we need to just allow nascent, unproven technology to be used in a law enforcement and criminal capacity, even in the courtroom. Everyone I talk to seems to be in agreement on that.
B
I would agree, and I think the question is probably looking more at commercial applications, but I will say: yes, there's a lot of agreement on the law enforcement side of things, which is interesting to me. It's also a very bipartisan issue. At the federal level, our largest allies were the late Representative Cummings and Representative Jim Jordan, two individuals who could not be further from each other on a lot of issues, but they were very much in agreement on law enforcement use of face recognition.
B
On the commercial side, this is again less my area of expertise, but I think the challenge is that people's talking points are further apart: industry groups' and advocates' talking points might sound like they're not in alignment, whereas there probably is common ground.
B
If we look less at the technology itself and more at, say, the industry in which it's being deployed and the risk of bias, I think there's general agreement even in the commercial space that we have to be pretty careful about deploying technology that quite simply performs differently depending on who you are: on your race, sex, and age. Instead of the technology itself, look at the impacts. I'm going to go back to the hiring example just for a second.
B
I think this is one of the places where there is a lot of concern: what are the hiring algorithms basing decisions on, and do the inputs they're analyzing and the outputs they produce have a racial difference? I think there's a lot of agreement here, and it's a little hard to argue that, even though humans are flawed, flawed algorithms should replace them.
B
So yeah, it's a tough question, and I think it's really hard to cut through the public positioning and posturing of a lot of advocacy groups and industry groups to get at common ground.
C
To build on Claire's example: if I'm an employer and I use a technology that may have a racially disparate outcome, and I use that technology to make a hiring decision, so that I make a hire-or-no-hire decision based on race, either intentionally or by default by letting the algorithm do it, that's already illegal. You cannot make an employment decision based on race, sex, or country of origin.
C
It's already illegal. So I think if someone is using an algorithm or a facial recognition technology that has identifiable and provable racial disparities, they are violating the law. We already have laws in place, and to me that's an important part of the debates around these technologies: oftentimes we want to regulate the technology specifically for an instance that a generalized law already covers.
C
I don't disagree with Claire: no one should make those decisions, and no one should be using technology like that. Look, I'm the executive director of a nonprofit; if someone brought me a technology and said it may or may not do this, as an employer I would throw them out of my office.
A
All right, we have about six minutes left, so I'll just give both of you a few minutes to talk, especially about how legislators can deal with a field that is evolving so quickly. I know Claire mentioned technology-neutral approaches earlier, but what do both of you think about trying to regulate something that is constantly evolving?
B
Yeah, just very briefly on this: I have mentioned focusing more on the rights that we're seeking to protect, as opposed to the technology itself. I think that's a really important angle for future-proofing this legislation, particularly in light of how rapidly the field is developing. I also think this is sort of a unique benefit of a state and local approach.
B
States and localities can simply act a lot more rapidly and effectively than the federal government, and they can take a more tailored approach to the needs and interests of your constituents and their realities on the ground, so that is a benefit of focusing on state and local action. The last thing I will say is that I work for an organization that does advocate for a moratorium on the use of face recognition in a law enforcement context, particularly in light of two things.
B
One is the disproportionate impact and the racial bias that still happens in a lot of these systems and that we don't fully understand. The other reason is that in the cases in which this technology is used, there may be a question of whether it has violated the due process rights of those individuals, and we have no idea how many wrongful convictions or wrongful plea bargains have occurred over the last 20 years because of this technology.
B
So for all those reasons, there is value in taking a step back and saying: we need some time to figure out what the rules of the road are. We need to take some time and understand that this is a forensic science, and therefore we need to treat it as a science, not as magic, not as a technology that any officer can use; we should maybe have trained individuals using it. I think there is a benefit to taking that pause and putting specific next steps in place, whereby we continue to investigate misidentification rates.
B
We conduct audits, and we increase transparency around the use of the technology, so that we have a more comprehensive understanding of how it's used and how it should be used.
C
Yeah, I've already said this probably 30 times, so I'll say it a 31st: for legislators, focus on the true nature of the problems here, and I think Claire hits the nail right on the head. It's the law enforcement and criminal instances that most, if not all, of us are most acutely concerned about, and beginning there is a great place for legislators to start.
C
From there, I would say again: if you're going to make a move in any sort of emerging technology space through regulation, you need a very clear statement of what the problem is and a very narrowly tailored response to that particular problem. Look to an instance like BIPA, which is very well intentioned and is imposed in the name of privacy, but there are clear trade-offs.
C
Nest's smart camera features, which many of us may have in our homes or on our doorbells, are turned off in Illinois because of BIPA. The Sony Aibo, the robot dog, which could help millions of people suffering from dementia as a complement to their current care, is unavailable to people in Illinois and in the other states that have followed this model. So there are clear trade-offs in this space, and neither of those technologies existed in 2008 when these laws were passed.
C
So in some ways there was no cost, because there was nothing lost: we were passing laws on technologies that were decades away. As Claire mentioned, gait-recognition technology isn't here for another five years, and yet we regulated it 20 years ago in Illinois. I would say, working backward from there, as the nature of the market and the technology changes, be willing as legislators to constantly and continually renew these things.
C
So I would say any technology-specific policy that is passed in a state should sunset: put it on a 24-month shot clock so it expires in 24 months and forces the legislature to reconsider the policies it has instituted. That's a really simple way to ensure that the policies you're passing can be refreshed, renewed, and focused on the real problems in the evolution of the technology and the markets, as opposed to taking a snapshot in time and forcing the future to be dictated by decisions we made decades ago.
A
Well, thank you both so much. I really appreciate your expertise, and this has been a really interesting discussion. Luke put a link to our Privacy Week agenda in the chat at the beginning, so you can go ahead and look at what else we've had this week and watch the recording of our consumer data privacy webinar from earlier this week. Again, we really appreciate both of your participation and all the questions from the audience.
A
We will have this recording on our website within the week, and we'll send that out to you. So again, thank you so much for joining us, and if you have any questions, just reach out to any of the NCSL staff. Thank you for joining.