From YouTube: Committee on Government Operations on June 9, 2020
Description
Docket #0683 - Ordinance banning facial recognition technology in Boston
A
Council TV, and it will also be rebroadcast at a later date on Comcast 8, RCN 8, and Verizon 1964. For public testimony, written comments may be sent to the committee email at ccc.go@boston.gov, and they will be made part of the official record. This is an ordinance that would ban the use of facial surveillance by the City of Boston or any official in the City of Boston.
A
The proposal would also prohibit entering into agreements or contracts with third parties to obtain face surveillance. I'm going to turn it now over to the lead sponsors, Councilor Wu and Councilor Arroyo, and then I will turn it to my colleagues in order of arrival for opening remarks. Just to acknowledge my colleagues: I believe the order that I have so far, again, will be Councilor Arroyo, Councilor Wu, Councilor Breadon, Councilor Bok, excuse me, Councilor Mejia, Councilor Campbell, and then I had Councilor O'Malley, Councilor Essaibi-George, and then... I am sorry.
D
Start with me? Sure. Thank you, Madam Chair. I think we have excellent panelists to go into the intricacies of the facial recognition ban, but just to summarize it: this facial recognition ban creates a community process that makes our systems more inclusive of community consent, while adding democratic oversight where there currently is none. We will be joining our neighbors in Cambridge and Somerville. And specifically, when it comes to facial recognition tech: it doesn't work. It's not good.
D
It's been proven through the data to be less accurate for people with darker skin. A study by one of our panelists, Joy Buolamwini, a researcher at MIT, found that Black women were 35% more likely than white men to be misclassified by face surveillance technology. The reality is that face surveillance isn't very effective in its current form.
D
According to records obtained by the ACLU, one manufacturer promoting this technology in Massachusetts admitted that it might only work about 30% of the time. Currently, my understanding is that the Boston Police Department does not use this. This isn't a ban on surveillance. This is simply a ban on facial recognition surveillance, which I think the data has shown doesn't work.
D
BPD doesn't use it because, and I'll allow Commissioner Gross to speak on it, but my understanding from what I've heard in past comments from Commissioner Gross is that they don't want to use technology that doesn't work well. And so this ban isn't necessarily even a permanent ban. It just creates a process where we are banning something that doesn't work, replacing a process that doesn't have any consent, and requiring some democratic oversight if they ever do want to implement it.
F
We are thankful that Boston police agree with that assessment and do not use facial recognition, facial surveillance, today in Boston, and we are looking to codify that understanding. So that with the various mechanisms for technology and upgrades, and, you know, the changing circumstances of different administrations and personalities, we just have it on the books that protections come first, because we truly believe that public health and public safety should be grounded in trust. So, you know, the chair gave a wonderful description of this ordinance.
F
I am looking forward to the discussion, but most of all I'm grateful to colleagues and everyone who has contributed to this conversation, as well as the larger conversation about community-centered oversight of surveillance, which will be in a separate but related docket that will be moving forward in the council. Thank you very much.
G
Thank you. Thank you, Madam Chair. Thank you to the makers of this ordinance. I really look forward to the conversation this afternoon and am very proud to participate in this discussion. I think it's the proper and right thing to do to really consider deeply all the implications of new technology that hasn't been proven to be effective before introducing it in any city in Massachusetts. So I really look forward to learning more and hearing from the panelists this afternoon. Thank you.
H
Yes, thank you, Madam Chair, and everyone; sorry for being outside for a minute. I want to just say I'm excited about this conversation today. I think it's a piece of action we should absolutely take. I just want to say that, you know, I think it's really important that this technology has major shortcomings and that it also has, you know, a racial bias built into it.
H
Those are really good reasons not to use it. But honestly, even if those did not hold, I think that in this country we run the risk sometimes of just starting to do things that new technology can do without asking ourselves, as a democratic populace, as a community: is this something that we should do? And I think in the process,
H
we as Americans have given up a lot of our rights of privacy, frankly, as often as not to private corporations as to government. But I think that it's really important to just remember that there's a true public interest in a society that is built more on trust than on surveillance. And so I think it's really important for us to draw this line in the sand, so that, you know, if the technology improves, and some of those arguments that are based on its functionality start to slip away,
H
we're still going to have a real conversation together about how we want to live and what kind of a society we want to be. So to me, putting a moratorium ban like this into place makes a ton of sense, and I'm excited for the conversation and grateful to all the advocates. Thank you so much, Madam Chair.
I
Yes, thank you. Sorry about my audio. Thank you, Chairwoman, I'll do it. And thank you to the makers, Councilors Wu and Arroyo, for your leadership on this issue. For me, this issue is personal and professional. As a Dominican woman who claims her Black roots, facial recognition technology misidentifies people like me at a rate of 35%. We're in a time where our technology is outpacing our morals.
I
We've got 2020 technology with a 1620 state of mind. That said, as a city, we need to practice extreme caution with facial recognition technology, and we should be concerned not only when this technology goes wrong, but also when it goes right. We have a lot of work to do when it comes to building trust in our government. Working against this facial recognition
I
technology is a good first step, but I'm happy to hear there is pushback against this kind of surveillance on all sides of the issue, like the makers of the ordinance mentioned. Right now there is not a desire to put facial recognition technology in place, and this ordinance is an opportunity to put that into law. I look forward to the discussion, and I hope to continue to push on this issue alongside the council. Thank you so much.
B
Thank you, Madam Chair, and thank you to the sponsors for this hearing. As I've stated in the past, technology updates and advances have provided us with many advantages, but at the same time, they've also proven to be deeply flawed, unreliable, and disproportionately impactful on communities of color. This technology, just for everyone listening, for full disclosure, is currently not being used by the BPD.
B
I want to make sure that you know that I'm going to continue to remain supportive of setting transparent limits and creating a public process by which we assess the use of these technologies. I know that we've been able to solve a lot of crimes through video surveillance, quite frankly, and they've led to justice for families. As someone who has lost a cousin to murder: there was surveillance, but it was DNA that led to my family getting justice.
B
I want the tools that make the most sense for Boston. I'll get real parochial here: quite frankly, I'm not really concerned about Cambridge and Somerville. I want to make sure that whatever we're doing in Boston is impacting, you know, all of our neighborhoods, the neighborhoods that I represent as a citywide city councilor, and making sure that we have community input. Most importantly, we work for the people of Boston as councilors; obviously, as does the Commissioner, who has police officers. We work for the residents and the taxpayers of this city, so they need a seat at the table, and I want to hear from them. If this technology enhanced and helped our public safety, that would be great, but at this time it's still unreliable and there are flaws in it. So, as a result of that, that's where I'm going to be on this. Thank you, Madam Chair.
C
Thank you, Madam Chair, and thank you. I know you're going to run a tight ship given all that's happening today, including, of course, the funeral for George Floyd. So thank you very much. Thank you also, of course, to the Commissioner for being here given all that's going on. I know you've been on the record in the past with respect to this issue, and that not only does the department not have it, but you understand the flaws of it. So thank you for being here.
K
Thank you, Madam Chair. I just very briefly want to thank the makers, Councilors Wu and Arroyo, for their leadership here. Obviously, I support the facial recognition ban. I'm also grateful and want to echo my thanks to the Commissioner for his comments and his opposition, and the fact that this isn't used; so it's important that we codify that in such a way, which again is the purpose of this hearing and this ordinance. I stand ready, willing, and able to get to work.
L
I think that our colleagues have brought to the forefront over the last few years, as have the advocates, the concerns regarding this kind of technology. So I, you know, appreciate the real concerns around accuracy and biases against people of color, women, children, and the elderly, certainly reflective of our city, our city's residents. So we need to make sure that we're proactive in protecting our residents, having a deeper understanding and appreciation of this technology, and understanding what this ordinance would mean around that ban and following through with it.
A
technology in Boston. I will continue to work on the technical challenges on my end and will join the hearing at a later time if I am able; if I'm unable to join, I will view the recording at a later time. I want to thank the makers, Councilors Wu and Arroyo, for their advocacy and continued leadership on this critical issue. I especially want to thank all the advocates who are participating for their tireless advocacy on this issue. I look forward to working with advocates to incorporate their ideas on how we can push this issue moving forward.
A
It is clear we need policies in place to protect communities that are vulnerable to technologies that are biased against them. I look forward to working with my colleagues on this critical issue and will continue to fight for an agenda that promotes and protects Black lives. Please read this letter into the record. Thank you. Sincerely, Kim Janey. I'm not sure if we've been joined by Councilor Flynn; I do know he was trying to get in. In the meantime, I did want to just do some quick housekeeping as part of my introductory remarks.
A
Just to let people know again: today's hearing is specifically about this proposed ordinance that would ban facial recognition technology. We had a robust conversation about defunding the police, discussing all aspects of the budget; all of those different things just happened, and that was chaired by Councilor Bok. And while those conversations are related, I want to make sure that we do not take this precious moment on this legislation and have it covered up by the other conversation.
A
I just want you to know that, and again, I will move the conversation towards this ordinance. There should be a copy of it there; people should have the ability to discuss the language in this ordinance. Questions I have: I do want to talk about whether time is of the essence. There seemed to be some issue and concern about whether we were entering a contract that allowed for this.
A
So when the Commissioner speaks, I'd like for him to speak on that; I'm not sure if there's a contract pending or what's going on. I also want to make sure that folks understand, due to the fact that I think we have possibly a hundred people signed up to speak, and we have a lot of folks who are going to be speaking on the panel, I'm going to limit my colleagues to three minutes in their first round, and I'm going to keep the public to no more than two, and I'm going to...
M
Thank you. Thank you, Councilor Edwards, and thank you to the sponsors. I'm looking forward to this hearing, to learning more about the proposal, and I'm also interested in hearing from Police Commissioner Gross and from the public as well. Again, this is an issue that we all can learn from. We can listen to each other.
M
We can be respectful and hear each other's point of view, and that's what this city is all about: coming together and treating each other, you know, with respect, especially during difficult days and on difficult issues. I also know that our Police Commissioner is someone who listens to people; he's in the neighborhoods, and he talks to and listens to a lot of the concerns of the residents.
N
You have it on record right here and now: I'm willing to work with you, Councilors, and the Mayor in our great city. So thank you. So I will start, and thank you to the Chair, the committee members, and the makers, Councilors Wu and Arroyo. I really thank you for inviting me to participate in today's hearing regarding the ordinance banning facial recognition technology in Boston. As you can imagine, the Boston Police Department has been extremely busy meeting the current demands of public safety from the COVID-19 pandemic and the public rallies.
N
However, the Department and I understand the importance of the given topic. I would like to request that this hearing remain on the given topic, and that we work together to find other times to discuss the important issues regarding police relations, national issues on police reform, and the ways that we can improve our relationships here in Boston.
N
As has been practiced in the past, we welcome a future opportunity to discuss the ordinance language and City Council concerns in a working session regarding technology policy and potential privacy concerns. My testimony today is meant to serve as a background to BPD's current practices and to identify potential technological needs.
N
Video has proven to be one of the most effective tools for collecting evidence of criminal offenses, for solving crimes, and for locating missing and exploited individuals. Any prohibitions on these investigative tools, without a full understanding of potential uses under strict protocols, could be harmful and impede our ability to protect the public.
N
The proposed ordinance defines facial surveillance to mean an automated or semi-automated process that assists in identifying or verifying an individual, or in capturing information about an individual, based on the physical characteristics of an individual's face. To be clear, the Department has no desire to employ a facial surveillance system to generally surveil the citizens of Boston. The Department, however, notes a distinction between facial surveillance systems and facial recognition technology.
N
The Department believes that the term facial recognition technology better describes the investigative nature of the technology we believe would be useful to the city, with the right safeguards and community input. The Department would, as technology advances and becomes more reliable, like the opportunity to discuss the utilization of facial recognition technology to respond to specific crimes and emergency situations. And I would like to revisit everyone's thoughts and remembrances of the Boston Marathon and the most recent kidnappings.
N
Facial recognition technology, to the best of our knowledge, would have greatly reduced the hours necessary to review video evidence. This would also allow investigators to move more quickly in identifying victims, missing persons, or suspects of crime through citywide camera recordings. This technology, if acquired, would also allow the compilation and condensing of video to be done more efficiently.
N
Facial recognition technology, used with well-established guidelines and under strict review, can also provide a safer environment for all in the City of Boston. Creating a system in which evidence is reviewed in a timely and efficient manner, allowing police to intervene and reduce levels of crime, would be beneficial.
N
It is important to note the Department shares our community's concerns around privacy and intrusive surveillance. As is the current practice with existing technology, our intention is to implement any potential future advances in facial recognition technology through well-defined protocols and procedures. Some areas where we consider, excuse me, some areas where we consider this technology may be beneficial are in identifying the routes and locations of missing persons, including those suffering from Alzheimer's and dementia, as well as the missing.
N
It is important to note, we would like to work with the Council and our partners to clearly articulate the language that could best describe these uses. Section B(2) of the proposed ordinance indicates there is some intention to account for the use of this technology to advance investigations, but the Department believes some additional language referencing its permitted uses is necessary.
N
That's my official statement that I read in, and I'm telling you right now, as an African-American male: the technology that is in place today does not meet the standards of the Boston Police Department, nor does it meet my standards. And until technology advances to a point where it is more reliable, again, we will need your input and your guidance. We will work together to pick technology that is more conducive to our privacy and our rights as citizens of Boston. Thank you.
A
Thank you very much, Commissioner Gross. Just before I go on: we're going to go now to Joy Buolamwini, and I'm so sorry for that, Joy; I apologize profusely for the butchering of your last name. Then Michael McMillan, Karina Hamm, Kade Crockford, Erik Berg, and Joshua Barocas. But I wanted to ask the Commissioner just very quickly: you said you have suggested language, or that you would want to work on suggested language, for Section B? No? I...
N
A
Thank you. So now I'm going to turn it over to some of the folks who have been called forward to speak. I hope we're going to try and get through as many of them as possible. Commissioner, these are the folks who helped to push and move this ordinance. We really hope you can stay as long as you can to hear them.
O
Sounds good. So thank you so much, Madam Chair Edwards, members of the Committee on Government Operations, and members of the Boston City Council for the opportunity to testify today. I am the founder of the Algorithmic Justice League and an algorithmic bias researcher. I've conducted MIT research showing some of the largest recorded gender and racial biases in AI systems sold by companies including IBM, Microsoft, and Amazon. As you've heard, the deployment of facial recognition and related technologies has major civil rights implications.
O
These tools also have technical limitations that further amplify harms for Black people, indigenous people, other communities of color, women, the elderly, those with dementia and Parkinson's, youth, and trans and gender non-conforming individuals. In one test I ran, Amazon's AI even failed on the face of Oprah Winfrey, labeling her male. Personally, I've had to resort to wearing a white mask to have my dark skin detected by some of this technology. But given mass surveillance applications, not having my face detected can be a benefit.
O
We do not need to look to China to see this technology being used for surveillance of protesters, with little to no accountability and too often in violation of our civil and human rights, including First Amendment rights of freedom of expression, association, and assembly. When the tech works, we can't forget about the cost of surveillance; but in other contexts it can fail, and the failures can be harmful. Misidentifications can lead to false arrests and accusations. In April 2019, a Brown University senior was misidentified as a terrorist suspect in the Sri Lanka Easter bombings.
O
The police eventually corrected the mistake, but she still received death threats. Mistaken identity is more than an inconvenience, and she's not alone. In the UK, the faces of over 2,400 innocent people were stored by a police department without their consent. The department reported a false positive identification rate of over 90 percent. In the U.S., there are no reporting requirements, generally speaking, so we're operating in the dark. Further, these tools do not have to identify unique faces to be harmful.
O
An investigation reported that IBM equipped the NYPD with tools to search for people in video by facial hair and skin tone. In short, these tools can be used to automate racial profiling. The company recently came out to denounce the use of these tools for mass surveillance and profiling. IBM's move to stop selling facial recognition technology underscores its dangers, due to the consequences of failure.
O
I focused my MIT research on the performance of facial analysis systems. I found that for the task of gender classification, IBM, Microsoft, and Amazon had error rates of no more than 1% for lighter-skinned men. In the worst case, these rates rose to over 30 percent for darker-skinned women. Subsequent government studies complement this work; they show continued error disparities in facial analysis and other tasks, including face recognition.
O
The latest government study of 189 algorithms revealed consequential racial, gender, and age bias in many of the algorithms tested. Still, even if error rates improved, the capacity for abuse, lack of oversight, and deployment limitations pose too great a risk. Given known harms, the City of Boston should ban government use of face surveillance. I look forward to answering your questions. Thank you for the opportunity to testify.
P
Do I... yeah, do I really... do I agree? Yes. Police [inaudible] for black and brown folks. I was talking about three years ago, roughly around January 2017: there were residents in my neighborhood who came up to me asking about a drone flying in the area. They asked numerous times, and nobody could figure out who was flying it,
P
what was going on, who was controlling it. And so, upon receiving this information... I live across the street from a soda company, and later on an officer was flying a drone there, and I ended up taking photos of the drone. It kind of jogged my memory that this was the drone that one of the residents in my neighborhood had talked about, and
P
who was flying the drone. It came to my knowledge it was the police. So we ended up doing some research, reaching out to friends [inaudible], and we tried to file a public records request, which was denied the first time, and eventually we got it to work: Boston Police had spent $17,500 on drones, and nobody, nobody from city council to the public, knew. It was kind of a less-than-kept secret, and there was no community oversight. These drones were flying illegally on the side of a neighborhood; just a breach of privacy.
P
The unknown underneath there, not knowing what information was being collected, what was being stored, how it was being stored: that really scared a lot of folks in my neighborhood, to the point that a lot of them didn't want to speak up because of past experiences with local police and being harassed. Nobody wanted that backlash. At the same time, we as residents have to empower one another.
P
There has to be... there has to be somebody, regardless of whether it's myself or another individual; you have to hold people accountable. And accountability has no colors: because whether you're white or black in a blue uniform, you still have to be held accountable. And then just to see that no one was held accountable for this very thing, it kind of just reinforces their control.
A
Thank you so much; perfect testimony, and I really appreciate you speaking from the heart, from your own experience, and also demonstrating how you've actually pushed the conversation for accountability through your leadership. So I want to thank you. Up next we have Karina from the Student Immigrant Movement. Three minutes. Karina, hi.
R
Hi everyone. Thank you, everyone, for joining us today at this hearing. My name is Karina Hamm, and I'm the field organizer for the Student Immigrant Movement. I would like to also thank our partners who've been working on the facial recognition ban: City Councilors Michelle Wu, Ricardo Arroyo, Andrea Campbell, Kim Janey, and Annissa Essaibi-George, along with the ACLU of Massachusetts, the Muslim Justice League, and Unafraid Educators. SIM is a statewide grassroots organization by and for undocumented youth. We work closely with our members, our young people, to create trusting and empowering relationships at SIM.
R
We fight for the permanent protection of our undocumented youth and their families through collective action, and one of the major roles of the work and the organizing that we do is to challenge the systems that continue to exploit and disenfranchise the undocumented and immigrant communities. So I am here on behalf of them to support the ban on facial recognition, because it's another system, another tool, that will intentionally be used against marginalized communities. And the undocumented immigrant community is not mutually exclusive to one particular group.
R
A face surveillance system has been adopted in another school district, and they've already spent over a million dollars on the system. And so you ask yourself: well, what exactly is the intent of having this surveillance? Youth still have the right to their privacy, and have the right to live without constant infringement and interference from law enforcement. If another city allows facial surveillance in their schools to happen, the chance of that occurring here is high as well.
R
We know that black and brown youth experience this oppression every day, and they already are deemed suspicious by law enforcement based on their skin color, the clothes they wear, the people they interact with, and more. And so adding facial recognition, which is not even accurate, to surveil our youth and families will just be used to justify their arrests, detention, or deportation. And through the policing system, we're already asking police officers to engage with the youth in ways that are just harmful to youth development.
R
The City of Boston needs to stop criminalizing young folks, whether they engage in criminal activity or not; arresting them will not get to the root of the problem. There are so many options that are much safer and more secure for our youth. So, for the families and their futures: banning facial recognition would mean law enforcement has one less tool to criminalize our youth. Thanks so much, everyone.
A
I know that there are some councilors that have some clarifying questions, and we have three more people to speak. And so I am begging you: please, if you can, at least get to the lead sponsors, Councilor Wu and Councilor Arroyo. That'll be three, six... we'll probably be 20 more minutes, until a little after 4:00. Okay.
T
as our health care providers lack the PPE they need to protect themselves and us. But our government cannot continue to act as if it is at war with its residents, waging counterinsurgency surveillance and control operations, particularly in black and brown neighborhoods and against black and brown people. We must act, not merely speak, to change our laws to ensure we are building a free, just, and equitable future for all people in Boston. So, that said, banning face surveillance in Boston is a no-brainer.
T
It is one small but vital piece of this larger, necessary transformation. Face surveillance is dangerous when it works and when it doesn't. Banning this technology in Boston now is not an academic issue. Indeed, the city already uses technology that, with one mere software upgrade, could blanket Boston in the kind of dystopian surveillance currently practiced by authoritarian regimes in China and Russia and, as Joy said, even some cities right here in the United States. Boston has to strike a different path by protecting privacy, racial justice, and First Amendment rights.
T
We must ban this technology now, before it creeps into our government in the shadows, with no democratic control or oversight, as it has in countless other cities across the country and across the world. So today you already heard from pretty much the world-renowned expert on racial and gender bias in facial recognition. Thank you so much, Joy, for being with us today. Your work blazed a trail and showed the world that, yes, algorithms can in fact be racist.
T
You also heard from Michael McMillan, a young Boston resident who has experienced firsthand what truly draconian surveillance with no accountability looks like. Karina Hamm from the Student Immigrant Movement spoke to the need to keep face surveillance out of our public schools, where students deserve to be able to learn without fear.
T
You're going to hear a similar set of sentiments from Erik Berg of the Boston Teachers Union, and shortly, medical doctor and infectious disease expert Joshua Barocas will testify to how this technology in Boston would inevitably undermine trust among the most vulnerable people seeking medical care and help. Obviously, that's the last thing that we need to do, especially during a pandemic. For too long in Boston, city agencies have acquired and deployed invasive surveillance technologies with no public debate, transparency, accountability, or oversight.
T
In most cases, these technologies (which, as Michael said, include drones, but also license plate readers, social media surveillance, video analytics software, and military-style cell phone spying devices, among many others) were purchased and deployed without the City Council's knowledge, let alone approval. We have seen this routine play out over and over and over for years. This is a problem with all surveillance technologies, but it is especially unacceptable where the technology is as dangerous and dystopian as face surveillance.
T
It is not an exaggeration to say that face surveillance would, if we allow it, destroy privacy and anonymity in public space. Face surveillance technology, paired with the thousands of networked surveillance cameras already installed throughout the city, would enable any official with access to the system to automatically catalog the movements, habits, and associations of all people at all times, merely with the push of a button.
T
Take images from demonstrations, run those images through a computer program, and automatically populate a list of each person who attended to express their First Amendment rights to demand racial justice. The police could also ask a computer program to automatically populate a list of every person who walked down a specific street on any given morning, and share that information with ICE. It could also be used to automatically alert law enforcement whenever a specific person passes by a specific surveillance camera anywhere in the city. And again...
A
T
All right, I'm almost done. Thank you, sorry, chair, so I'll just skip to the end and say we are at a fork in the road right now. As Joy said, you know, major corporations, including IBM and Google, are now declining to sell this technology, and that's for good reason: because of the real potential for grave human and civil rights abuses.
T
V
Thank you, and I'll get right to the point in the interest of time. Thank you to the council for taking this up and for allowing this time today. I'm here to speak today on behalf of the Boston Teachers Union and our over 10,000 members in support of the ordinance banning facial recognition technology in Boston that was presented by Councillors Wu and Arroyo. We strongly oppose the use of this technology in our public schools.
V
Boston's public schools should be safe environments for students to learn, explore their identities and intellects, and play. Face surveillance technology threatens that environment. The technology also threatens the rights of our BTU members, who must be able to go to work without fearing that their every movement, habit and association will be tracked and catalogued. To our knowledge, this technology is not in use in our schools, but we've already witnessed some experimenting with it.
V
Face surveillance in schools transforms all students and their family members, as well as employees, into perpetual suspects whose each and every movement can be automatically monitored. The use of this technology in public schools will negatively impact students' ability to explore new ideas, express their creativity and engage in student dissent, an especially disturbing prospect given the current youth-led protests against police violence. Even worse, as we've heard, the technology is frequently biased and inaccurate, which raises concerns about its use to police students of color. Academic, peer-reviewed studies show face surveillance
V
algorithms are too often racially biased, particularly against black women, with inaccuracy rates of up to 35% for that demographic. We know black and brown students are more likely to be punished for perceived misbehavior; face surveillance will only perpetuate and reproduce this situation when used to monitor children. This technology also fails in an essential sense because it has difficulty accurately identifying young people as their faces change.
V
Research that tested five top-performing commercial off-the-shelf face recognition systems shows there's a negative bias when they're used on children: they perform more poorly on children than on adults. That's because these systems are modeled through the use of adult faces, and children look different from adults in such a way that they cannot be considered simply scaled-down versions of adults.
V
On top of this, face surveillance technology regularly misgenders transgender people and will have a harmful impact on transgender young people in our schools. Research shows that automatic gender recognition consistently views gender in a trans-exclusive way and consequently carries disproportionate risk for trans people subject to it. At a time when transgender children are being stripped of their rights at a national level, Boston must protect transgender kids in our schools. Moreover, face surveillance in schools will contribute to the school-to-prison pipeline, threatening children's welfare, educational opportunities and life trajectories.
V
I'll get right to the end. Finally, face surveillance technology will harm immigrant families. In this political climate, immigrants are already fearful of engagement with public institutions, and face surveillance systems would further chill student and parent participation in immigrant communities in our schools. Boston schools must be welcoming and safe spaces for all families. The City of Boston must take action now to ensure children and BTU workers are not subject to this unfair, biased and chilling scrutiny. In order to protect young people in our educational community, we must stop face surveillance in schools before it begins.
A
X
I'll be brief; I will read quickly. First of all, thanks for allowing me to testify today regarding this important issue. I should say that the views that I'm expressing today are my own and don't necessarily represent those of my employer. That said, I'm an infectious disease physician and an addictions researcher here in Boston. As such, I spend a large majority of my time working on solutions to improve the health and well-being of vulnerable populations, including people who are experiencing homelessness and people with substance use disorders.
X
One thing that we know is that stigma and bias are pervasive throughout our community and lead to disparities in care for these populations. These disparities were laid bare and exacerbated by the ongoing COVID-19 pandemic. While we're all searching for answers on how to best protect ourselves from this virus, I truly fear that the use of facial recognition technology will only serve to exacerbate existing disparities, including racial, gender and socioeconomic ones.
X
Additionally, Boston provides expansive services for substance use disorders, including methadone treatment, and it's imperative that we ban facial recognition software to ensure that people seeking substance use disorder treatment and mental health care, among other stigmatized health services, may do so privately and without fear. If we allow this on many of our public cameras, it would enable the government to automatically compile lists of people seeking treatment for substance use, mental health and, as I said, other stigmatized health conditions. This would have a chilling effect and disproportionately affect our most vulnerable populations.
A
D
F
Thank you, madam chair, and thank you so much, Commissioner, for making the time to be here. We all know how many demands are on your time, and now we're asking for even more. We're very, very grateful; it means a lot that you took the time and are the one at this hearing. I also want to note that you were the one at the hearing in June 2018 as well, so we know that you've been part of this conversation and supportive of this.
N
F
I'm getting through these quickly so I can get you out of here as quickly as possible. So, secondly: you drew a distinction between face surveillance systems and facial recognition technology, so just to clarify. We know BPD doesn't use that right now, because you haven't done that upgrade, but does BPD use or work with other state or federal entities that have face surveillance or facial recognition technology? Do you ever use the results of that technology in any searches?
N
F
N
If you're a victim of a crime and we don't have that person under arrest, what do we do? If we have a potential suspect, you are allowed the opportunity to try to identify the suspect of the crime. And so, to be fair, to ensure that no innocent person is selected, you put the suspect's picture along with several other pictures, up to seven or more, and then you see if the victim of the crime can pick the suspect out of that photo array.
N
F
N
F
N
F
And so, given the absence of guidelines right now at the federal, state or city level, and, you know, the need to upgrade, and the use by State Police, for example: are you supportive? We're hearing you loud and clear that you'd like to have further conversation, but just to get your opinion now: are you supportive of a city-level ban until there are policies in place, potentially at other levels or specifically at the city level, to rein in surveillance overall? Yes.
N
I am. I've been clear for four years: we need your input and your guidance, and I thank everyone who testified today. I have a team taking notes, because I didn't forget that I'm African-American and that I could be misidentified as well. I believe that we have one of the most diversified populations in the history of our city, and we do have to be fair to everyone that is a citizen of Boston. We don't want anyone misidentified, and again, we will work with everyone here, because we'll be educated. Yes.
D
All right. So, one, I just want to thank you, Commissioner, for supporting the facial recognition and surveillance ban and for taking the time to be here for the questions; Councillor Wu asked many of them. I have one specific one, which is: has the BPD used facial recognition in the past, in any capacity? No, not...
N
D
N
A
N
N
A
Before you go, I'm gonna do this one call-out to the panelists that just spoke, or to any city councillors: one question that you may have for the Commissioner that you need to ask. Do any of you have that, either Joy, Mikkel, Carina, Kade, Erik, Joshua, or the other city councillors? I'm trying to be mindful of his time, but I also understand you guys also gave time to be here today as well, if you wanted to ask him, representing BPD, a question. I see Julian.
A
Y
I just wanted to quickly thank Commissioner Gross for your time and your steadfast leadership, but I do have just a quick question before you go; I just need some clarity. You mentioned that you hope to keep this hearing on the topic of the ordinance and find other ways to address issues of police relations. A lot of people draw a direct link between facial recognition and relationships with the community. Do you see that link, and can you talk about how the police department sees that link? I'm just really curious about it.
N
N
You have to listen to the people that you serve, because, you know, everyone throws around the moniker of community policing; I believe that's become jaded, and so I've created a Bureau of Community Engagement so that we can have these discussions. And again, I don't forget my history; I haven't forgotten my history. I came on in 1983, I've gone through a lot of racism, I was in a tough neighborhood growing up, and I didn't forget any of that. And, as some of the city councilors alluded to earlier, I'm always in the street.
N
Talking to people, I had a lot of criticism of law enforcement in general, but I believe, if you want change, be the change; you can only change with the people. So I am looking forward to further conversation and to listening to testimony on how we can improve our community relations, especially now: not only are we in the middle of civil unrest, but we're in the middle of a pandemic as well, and so police departments should not be separated from the community.
N
You have to have empathy, sympathy, care, respect. A lot of people have lost jobs, and we must keep in mind socioeconomics and fairness in employment; a lot of those things directly affect the neighborhoods of color. I'm glad that we are increasing our diversity in the communities, and we currently have organizations in Boston that speak directly to the people: the Latino law enforcement organization, the Cabo Verde Police Association, the Benevolent Asian Jade Society.
N
Our department is increasing in diversity, and the benefit of that, the benefit of having representation from every neighborhood we serve, is that it will improve relationships. So again, that's why I'm looking forward to a working session, and my team is taking notes, because everybody's testimony is important. Trust me: I wouldn't be here as the first African-American Police Commissioner if we didn't have folks, people like you who worked hard for everyone's rights, who exercised their First Amendment rights.
A
Commissioner Gross, I have a clarifying question, and then I just wanted to note this, since it impacts the timing. For folks watching: Commissioner Gross has alluded to a working session; just to give some clarification, that is when we actually get down to the language of the ordinance and work through the comments, is what I'd say, and how to do that. And so what Commissioner Gross is asking is that we have that conversation within the next 60 days, which I'm fine with doing.
A
I'll just check with the lead sponsors. I did want to make sure I understood what, in terms of timing, is going on with the contract: is BPD entering into a contract that has this technology? There was a timing issue that I thought was warranting this hearing happening very fast. Maybe the lead sponsors could also answer this question; I was confused, so if someone could explain what that was. There's a sense of urgency that I felt that I was meeting, and I'm happy to meet it.
T
Kade: yeah, counselor, I'd be happy to address that. We obtained public records from the city a while back showing that the Boston Police Department has a contract with a company called BriefCam that expired on May 14th, and the contract was for version 4.3 of BriefCam software, which did not include facial surveillance algorithms. The current version of BriefCam's technology does include facial surveillance algorithms, and so that is one of the reasons that the councillors and the advocacy groups behind this measure were urging you, chair, to schedule...
T
N
And thank you; that's exactly what I was going to comment. The older version of BriefCam did not have facial recognition; I believe that the newer version does. BriefCam, in the old version, works on condensing objects, cars, but I would definitely check on that contract. I am not, I don't want any technology that has facial recognition.
G
Z
N
AA
N
Of anything that allows for facial recognition? No. And if we did go into a contract, I would have the ability to show you that the portion that does have facial recognition can be censored; that is, excuse me, poor choice of words, that can be excluded. And so I'm not comfortable with that contract until I know more about it. I don't want any part of facial recognition. Thank you.
Y
N
I heard the testimony, right? The technology that's going forward now in many fields is like, hey, we have facial recognition, and I'm not comfortable with that. So, BriefCam: yeah, the old version, we hardly ever used it, and there is discussion on the new version, and I'm not comfortable with the facial recognition component of that.
A
Thank you very much, and thank you very much, Kade, for the background and understanding, both of you, and for understanding why I felt a sense to move fast. I'm moving fast, but I was trying to make sure I understood what for.
A
A
N
Thank you very much; thank you for your testimony, everyone. Were it not for advocates such as everyone on this Zoom call, I wouldn't be here in this capacity. So thank you. Thank you.
N
A
Thank you very much. And so we still have a robust conversation to have amongst ourselves about the language, about the goals, whether it goes far enough, whether anyone opposes this ban. And I do want people to understand: if there's someone who disagrees with a lot of us, this will be a respectful conversation, and that person will be welcome to have their opinion and express it, as every single one of us has been able to do.
A
I'm going to continue now. There's a list of folks who have signed up to speak; I'm going to continue down that list and keep them to no more than three minutes, and then we're gonna open up. Oh, I'm so sorry, I apologize: before I go down the list, I know my colleagues also may have questions or concerns or may want to voice certain things about the legislation, and I apologize, I'm just so amped to get to the public testimony. So I'm gonna go ahead and go in order of arrival.
A
D
The only question I would have, which is more, I think, for Kade from the ACLU, is whether or not she has any idea where we are on that contract. From what I heard from Commissioner Gross, it sounded like he couldn't confirm the timeline for that contract, whether or not it's been signed or not signed, what's going on with that contract. So do any of the panelists here have any information on where that contract stands? Thanks.
T
Thanks for the question, Councilor Arroyo. Regrettably, no. We filed the public records request with the Boston Police Department on May 14th, so that was about a month ago, asking just for one simple document: the existing contract that the Boston Police Department has with BriefCam. And they haven't sent it to us yet. And now, you know, we've prodded them multiple times about it, including on Friday, saying, you know, it would be a real shame if we had to go into this hearing on Tuesday without this information, and we still haven't received it.
G
T
I can speak to that; Joy Buolamwini, if you're still on, you may want to speak to some of this too. So the ACLU, nationwide, we have been filing FOIA requests, Freedom of Information Act requests, with various federal agencies to learn about how the federal government is using this technology and how they're sharing information derived from the technology with state and local law enforcement.
T
So one concern that we have is that even if the City of Boston bans face surveillance technology for city employees, it would certainly be the case that the FBI and other federal agencies would be able to continue to use this technology, and likely to share information that comes from it with the Boston Police Department and other city agencies. Unfortunately, there's nothing we can do about that at the city level, which is why the ACLU is also working with partners in Congress to try to address this at the federal level as well.
G
H
I apologize; thank you. Thanks, Councilor Edwards. My question was actually just for the panelists, whoever wants to jump in. You know, the police commissioner was making a distinction between facial surveillance and facial recognition systems and what we might be banning or not, and I'm just wondering. I don't know the literature in this world well enough to know if that's a strong existing distinction, whether you think this piece of legislation in front of us bans one or the other, whether you think we should be making that distinction.
O
So when we hear the term facial recognition, oftentimes what it means is not so clear. And so the way the current bill is written, it actually covers a wide range of different kinds of facial recognition technologies, and I say technologies, plural, to emphasize we're talking about different things. So, for example, you have face recognition, which is about identifying a unique individual; when we're talking about face surveillance, that is the issue. But you can also have facial analysis.
O
That's guessing somebody's gender or somebody's age, or other systems that might try to infer your sexuality or your religion; so you can still discriminate even if it's not face recognition in the technical sense. So it's important to have a really broad definition. You also have face detection: if you think about weapons systems where you're just detecting the presence of a face, without saying it's a specific individual, that can still be problematic. You also have companies like Faception...
O
...that say we can infer criminality just by looking at your face: your potential to be a pedophile, a murderer, a terrorist. And these are the kinds of systems that companies attempt to sell law enforcement. So it's crucial that any legislation that is being written has a sufficiently broad definition covering a wide range of facial recognition technologies, right, with the plural, so that you don't get a loophole which says: oh, we're doing face identification or face verification, which falls under recognition, so everything we're doing over here doesn't matter. But it does.
H
I just want to thank the panelists for educating us beforehand and always sharing all this amazing information and data and research that informs my thinking every day. I am just curious, and at the same time a little bit worried, about just kind of how we can prevent this from ever passing, ever, because it seems to me that the way it's been positioned is that it's banned right now because it's not, you know, accurate. But I'm just curious what has happened, if you've seen other cities, or, you know, just even around the world, there...
T
You get what I'm trying to say? I do, Councilor Mejia, and thank you for that. I can address that quickly. The council can't bind future City Councils, right? So I agree with you: I think that this technology is dangerous whether it works or it doesn't, and I think that's why the City of Boston ought to take the step to ban its use in government. I can promise you that, as advocates, we will show up to ensure that these protections persist in the city of Boston as long as I'm alive, and I
T
think that's true of many of my friends and colleagues in the advocacy space. But, you know, some cities, some states have considered approaches of, for example, passing a moratorium that expires after a few years. We did not choose that route here, in fact, because of our concerns, which are, I think, identical to yours: it's our view that we should never want to live in a society where the government can track us through these surveillance cameras by our face wherever we go, or automatically get alerts.
T
I
Thank you for that. And the other question that I have is: I know that the City Council's jurisdiction is all things that deal with the city, but I'm just wondering what, if any, examples have you heard of others, like maybe use in the private sector, that are utilizing or coming to utilize this as a forum? Is there anything, any traction or anything that we need to be mindful of happening outside?
AB
O
Sure, so I'm so glad you're asking this question, because facial recognition technologies, broadly speaking, are not just in the government realm. So right now, especially with the COVID pandemic, you're starting to see more people make proposals for using facial recognition technologies in different ways. Some that I've seen start to surface are: can we use facial recognition for contactless payments, right, so your face becomes what you pay with. You also have the case of using facial analysis in employment.
O
So there's a company, HireVue, that says: we will analyze your facial movements and use that to inform hiring decisions, and guess what, we trained on the current top performers. So the biases that are already there can be propagated, and you actually might purchase that technology thinking you're trying to remove bias. And so there are many ways in which these technologies might be presented with good intent, but when you look at the ways in which they actually manifest, there are harms, and oftentimes harms that disproportionately fall on a marginalized group. Even in the healthcare system.
O
I was reading research earlier that was talking about ableism when it comes to using some kinds of facial analysis systems within the healthcare context, where it's not working as well for older adults with dementia. So the promises of these technologies really have to be backed up with the claims, and I think the council can actually go further by saying we're not only just talking about government use of facial recognition technologies, but about any use where these technologies impact someone's life in a material way, right.
O
A
G
K
It sounds to me that obviously there's a lot of support across the board, and it looks as though the commissioner is asking for one more working session, which seems to make sense, and his comments were very positive, so I think this is a good thing. So thank you to the advocates, thank you to my colleagues, particularly the lead sponsors, and the Commissioner for his participation this afternoon, and thank you to all the folks from whom we will hear shortly. That's all I've got. Thank you.
L
Thank you to the advocates and to the Commissioner for being here today, and thank you to the advocates who have met with me over a period of a few months; I've just really learned a lot through our times together, especially from the younger folks. I am interested in some information, if there is any. The Commissioner is no longer with us, so perhaps I'll just forward this question to him myself.
L
But my question is around the FBI's evidence standards around facial recognition technology; I wonder if they've released that. And then, through the chair to the makers: we had some questions around Section B, and perhaps we can discuss this in the working session, or I'll send this ahead of time, but it does talk about... Section B, part two reads, quote: nothing in B(1) shall prohibit Boston or any Boston official from using evidence...
L
...related to the investigation of a specific crime that may have been generated from surveillance systems. We're just wondering what types of evidence we are allowing here and what situations we are trying to accommodate; that's just the specific question that we came up with as an office looking through that. So I'll share that with the makers of this ordinance to understand that a little bit better. But if any of the panelists have information on the FBI standards, whether they've shared their standards, then that is my question for this time.
T
I can answer that quickly, and then Joy may also have an answer. But one of the issues here is that police departments across the country have been using facial recognition to identify people in criminal investigations, for decades now actually, and then not disclosing that information to criminal defendants, which is a due process violation. It's a serious, serious problem in terms of the integrity of our criminal justice process and the courts and defendants' rights to a fair trial.
T
So, the FBI standards that you're referencing: one of the reasons, at least as far as we understand it (and, you know, I don't work for the police department), that police departments have not been disclosing information about these searches to criminal defendants is that there is no nationally agreed-upon forensic standard for the use of facial recognition technology in criminal investigations. And so my understanding is police departments and prosecutors...
T
...fear that if they were to disclose to courts and criminal defendants that this technology was used to identify people in specific investigations, those cases would basically get thrown out of court, because the people who performed the analysis would not really be able to testify to their training under any agreed-upon national standard for evaluating facial recognition results, or for using the technology more generally. I can't answer your other question about that exemption.
T
Obviously, the BPD is not going to be in a position to ask every single police department in the country: where did you get the name attached to this picture? And so that's what that exemption is meant to address. There have been some concerns from some of our allied organizations that we need to tighten that up with a little extra language to make clear that it does not allow the BPD to use the RMV system for facial recognition, and we will be suggesting some language to the committee to that effect.
T
L
M
M
T
Well, my understanding is that government agencies, including the Boston Transportation Department, use surveillance cameras for things like traffic analysis and accident reconstruction, so there are non-criminal uses for surveillance cameras. I'm not aware of any non-criminal uses in government, at least for facial surveillance, except in, I would say, the so-called intelligence realm, right? So that would mean not a criminal investigation but rather an intelligence investigation, for example, of an activist, or, you know, an activist group or a protest, or something like that. Okay.
M
O
So, speaking to private sector uses of facial recognition: oftentimes you see it being used for security, or for securing access to a particular place, so only particular employees can come in. Something that's more alarming that we're seeing with commercial use is the use in housing. And so we have a case, even in Brooklyn, where you had tenants saying to the landlord: we don't want to enter our homes with our face,
O
don't install facial recognition technology. They actually won that case, but not every group is going to be so successful. So you can be in a situation where your face becomes the key; but when your face is the key and it gets stolen or hacked, you can't just replace it so easily. You might need some plastic surgery. So we see facial recognition technologies being used for access in certain ways, and again the dangerous use of trying to predict something about somebody: are you going to be a good employee?
O
You have companies like Amazon saying we can detect fear from your face, so think about how that might be used as well. The other thing we have to consider when we're talking about commercial uses of facial recognition technologies is that oftentimes you'll have companies with general-purpose facial recognition; this then means other people can buy it and use it in all kinds of ways, from your Snapchat filter to putting it in lethal autonomous weapons. You have a major range of what can happen with these sorts of technologies.
T
O
T
Counselor, I would just jump in to also reiterate that this ordinance only applies to government conduct. It would not restrict in any way any entity in the private sector from using this technology. But that said, you know, some other uses: I don't know if you've ever seen the film Minority Report, but in that film Tom Cruise enters a mall, and, you know, some technology says to him: hello, sir,
T
how did you like the size-small underwear you bought last time? So that's actually now becoming a reality as well. In the commercial space, at stores, for marketing purposes, if laws do not stop this from happening, we are likely going to see the application of facial surveillance in a marketing and commercial context, to basically try to get us to buy more stuff, and to...
O
To underscore that point: you have a patent from Facebook. Basically, Facebook has so many of our faceprints; we've been training their systems for years, right. The patent says: given that we have this information, we can provide you background details about somebody entering your store, and even give them a trustworthiness score to restrict access to certain products. This is a patent that has been filed by Facebook, so it's certainly within the realm of what companies are exploring, and you already have companies that use facial recognition technologies to assess demographics.
O
You have cameras that can be put into shelves; you have cameras that can be put into mannequins. So you already have this going on. In Tacoma, in Washington, they even implemented a system where you had to be face-checked before you could walk into a convenience store. So this technology is already out there. Well...
M
A
Thank you. So it's between me and public testimony, so I'm going to just put my three questions out there, and then we're gonna go right down the list of folks who have signed up and RSVP'd. Again, to those folks: we're going to come to you after this testimony. Again, we were maxed out at our limit of people who could participate, so if you're willing to testify and then also sign out and watch through YouTube or through other mechanisms, to allow somebody else to testify, that would be very helpful to this process. I'm
A
looking specifically at the issue of enforcement in this statute, or in this ordinance, and it says no evidence derived therefrom may be received in evidence in any proceeding in or before any department, officer, agency, or regulatory body. I want to be clear: if the police received this evidence, could a prosecutor use it in court? Is that permitted? Okay.
E
A
So that's my issue: there seems to be no actual evidence prohibition for use in court against somebody. The other issue or concern I have is the provision that violations of this ordinance by a city employee shall result in consequences that may include retraining, suspension, or termination, but they're all subject to provisions of collective bargaining agreements. So all of that could go away if their union hasn't agreed to that being part of the disciplinary steps for that particular employee.
A
So, to me, that signals that the patrolmen's and other unions of folks who are part of investigating need to have this as part of their violation of contract or violation of standard. I think... maybe I'm wrong. And then finally, you know, Joy, your testimony about the private sector has really hit me, thank you so much. I mean, I'm particularly concerned about this: I'm a twin, and Facebook tags my sister in all of my pictures automatically, right. So I mean, I'm a twin. I mean, Councillor
A
Essaibi George has multiples too, you know. So for those of us who are called multiples... you are all singletons, you were born by yourself; we share a birth, which is what multiples have, you know. And so naturally I'm concerned. It's not that my sister is inclined to do any criminal activity, but she may go to a protest, or to, I don't know, but either way.
A
The point is, I'm concerned about the private actor and how they interact, and I'm particularly concerned as to how far this will go down the line, the supply chain, right. Because we have 600 million dollars in contracts for all sorts of things for the city of Boston. You know, what if we have a contract with a cleaning company that requires the workers to sign in with facial recognition, right? So how far down the line can we go with our money? Now, I understand.
A
T
T
Thank you, councillor. So just very quickly: we are unclear on what the City Council's power is to control the prosecutor's office, right, so I think we'll look more into that; we'll look closely at that. On the second question, I agree: we need to look at the BPPA agreement, which I understand expires this summer, the BPPA.
AA
T
AA
Q
T
Then, finally, on the private sector concerns, I share them. Again, you know, we want to get this government ban passed as quickly as possible. The ACLU supports approaches like the one that the state of Illinois has taken. They passed the nation's strongest consumer-facing biometrics privacy law; it's called the Biometric Information Privacy Act, BIPA. BIPA essentially prevents private companies, any private companies, from collecting your biometric data without your opt-in consent, and that's not "I clicked a button."
T
A
Thank you. So at this point, I'm going to turn it over to public testimony. I have some folks who are ready, committed, and asked to speak today. I'm going to go through them. It's already quarter to five; I'm going to try and end this hearing no later than six, so that's an hour and 15 minutes to move through public testimony, and I'm going to try... I'm going to have to keep folks to two minutes, as I said before, because there are a lot of people signed up.
A
So all the folks who've signed up, I will go through that list. Then I'm going to invite people to raise their hands who may not have signed up directly, who would also like to testify. All right, so I have on the list Bonnie Tenneriello from the National Lawyers Guild; after her I have Kalyn Vignola from the Library Freedom Project, and Maddy Crawley. Those are the three folks lined up. Let's see, Kalyn, is there? Bonnie?
A
J
Sure. Hi, I am Kalyn Vignola. I am a resident of West Roxbury and a librarian in the Boston area, and I am speaking on behalf of the Library Freedom Project today. So thank you to the chair, and thank you to all the City Council, for the chance to testify today on this important piece of legislation. The Library Freedom Project is a library advocacy group that trains library workers to advocate and educate their library community about privacy and surveillance. We include dozens of library workers from the US, Canada, and Mexico, including eight librarians from Massachusetts.
J
These library workers educate their colleagues and library users about surveillance by creating library programs, union initiatives, workshops, and other resources, and inform and agitate for change. Facial recognition technology represents a class of technology that is antithetical to the values of privacy and confidentiality of library workers. As library staff, we understand that part of our work is to represent these values in the services and resources we provide to our library community, in order to reduce the harm of state and corporate scrutiny.
J
As we understand this, we can see the social and economic imperatives that privilege some and marginalize others that are encoded into many computer systems and applications. This is a result of the structure of our technology industry, which prioritizes the interests of management, venture capitalists, and stockholders, who are mostly white men.
J
J
A
Q
Thank you very much. Thank you, Councillor Edwards, and thank you to the City Council and all the panelists for taking up this important issue. My name is Maddy Crawley; I use they/them pronouns. I'm a teen librarian and a bargaining unit member of the Boston Public Library Professional Staff Association, MLSA Local 4928, AFT. Our union of library workers supports a ban on facial recognition in Boston.
Q
In voting to do so, our union recognizes that facial recognition and other forms of biometric surveillance are a threat to the civil liberties and safety of our colleagues and of library users. Our library ethics of privacy and intellectual freedom are incompatible with this invasive technology. We recognize that facial recognition and other biometric surveillance technologies are proven to be riddled with racist, ableist, and gendered algorithmic biases. These systems routinely misidentify people of color, which can result in needless contact with law enforcement and other scrutiny, essentially automating racial profiling.
Q
We recognize the harm of surveillance for youth in our city, especially black, brown, and immigrant youth. Unregulated scrutiny by authorities leads to early contact with law enforcement, resulting in disenfranchisement, marginalized futures, and potential death by state violence. We recognize that public areas such as libraries, parks, and sidewalks exist as spaces in which people should be free to move, speak, think, inquire, perform, protest, and assemble freely without the intense scrutiny of unregulated surveillance by law enforcement. Public spaces exist to extend our rights and provide space for the performance of our civil liberties, not policing.
Q
A ban on face surveillance technology is critically important for residents and visitors to the City of Boston. Our union encourages you to ban the use of face surveillance in the city of Boston by supporting and passing this very crucial ordinance. I'd like to thank you for your time. We are the Boston Public Library Professional Staff Association. Thank you very much.
A
E
E
I'm speaking on behalf of the National Lawyers Guild, Massachusetts chapter, which for over 80 years has fought for human rights, and we're deeply concerned over the dangers of facial surveillance. We strongly support this ordinance. Facial recognition, deployed through cameras throughout a city, can be used to track the movements of dissidents and anyone attending a protest, as others have observed. Our history in the National Lawyers Guild shows that this danger is real.
E
We have defended political dissidents targeted by law enforcement, such as members of the Black Panther Party, the American Indian Movement, and the Puerto Rican independence movement, and more recently, here in Boston, Occupy Boston, climate change activists, those opposing a Straight Pride event, and Black Lives Matter. We know that the dangers of political surveillance and political use of this technology are real. The National Lawyers Guild also helped expose illegal surveillance in the 1975-76 Church Committee COINTELPRO hearings. We know from this experience that this can be used to disrupt and prosecute political protest.
E
AD
No problem, you got that right. So thank you, chair, and thank you to the amazing speakers before me. My name is Nikhil; I work at Google Brain in Cambridge, and I specifically work in the area of machine learning fairness and interpretability, so this is sort of the stuff that we do. But I'm here as a citizen, and I am very much in favor of the ordinance banning facial recognition by public officials here in Boston. Modern AI algorithms are statistical machines, right: they base predictions on patterns in the data sets that they're trained on. There's no magic here.
AD
So any bias that stems from how a data set is collected, like non-representative collection of data for different races because of historical context, gets automatically reflected in the AI model's predictions. What this means is that it manifests as unequal errors between subgroups, like race or gender. Now, this is not theoretical; you know, it's been studied very deeply by many AI researchers in this field who have been on this call.
AD
You know, there's a federal NIST (National Institute of Standards and Technology) study that looked at a large number of facial recognition algorithms and found significant demographic disparities in most of them, and there's another paper by Joy here, you know, Gender Shades, that looked at three commercial classification systems and found incorrect results thirty-five percent of the time for dark-skinned females versus only 0.8% incorrect for light-skinned males. This is horrible, right. So these mistakes, you know, can't be viewed in isolation.
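To make the "unequal errors between subgroups" point concrete, here is a minimal sketch of how an audit like Gender Shades reports results: compute a classifier's error rate separately per demographic group. This is my illustration, not anything presented at the hearing; the group labels and audit records below are hypothetical.

```python
# Toy per-subgroup error-rate audit. Group names and records are
# hypothetical; real audits use thousands of labeled faces.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_label, predicted_label)."""
    errors, totals = defaultdict(int), defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit of a binary gender classifier:
audit = [
    ("darker_female", "F", "M"), ("darker_female", "F", "F"),
    ("darker_female", "F", "M"), ("darker_female", "F", "F"),
    ("lighter_male", "M", "M"), ("lighter_male", "M", "M"),
    ("lighter_male", "M", "M"), ("lighter_male", "M", "M"),
]
rates = error_rates_by_group(audit)
# Same model, very different error rates depending on the subgroup:
# here 0.5 for "darker_female" and 0.0 for "lighter_male".
```

A single aggregate accuracy number hides exactly this kind of disparity, which is why the audits cited in the testimony break errors out by group.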
AD
They will perpetuate bias issues by informing the collection of the next generation of datasets, which are then again used to train the next generation of AI models. Moreover, when mistakes are made, either by a machine or by a human being, we need to be able to ask the question: why was that mistake made? And we simply haven't developed robust tools to be able to hold these algorithms to an acceptable level of transparency
AD
to answer any of these critical questions. Deployment of these systems will exacerbate the issues of racial bias in policing and accelerate the very social inequalities that we're protesting today. Innocent people will be targeted. You know, the threshold for deploying these systems should be extremely high, one that's arguably not reachable: beyond accuracy, the underlying accuracy data would have to be public, and there would have to be actionable algorithmic oversight. This is an incredibly high bar, and we are nowhere near it, both algorithmically and as a society.
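The feedback mechanism described a moment ago, where a biased model's outputs seed the next generation's training data, can be sketched as a toy, deterministic simulation. This is my illustration only; the amplification factor and starting share are arbitrary assumptions, not figures from the hearing.

```python
# Toy feedback-loop simulation: if each generation's training set is
# sampled from the previous model's deployment outcomes, a group that
# starts over-surveilled gets over-sampled again, and the skew
# compounds instead of washing out.

def next_generation_share(share, amplification=1.2):
    """One feedback step: the over-surveilled group is sampled
    `amplification` times more often into the next dataset."""
    boosted = share * amplification
    return boosted / (boosted + (1.0 - share))

share = 0.5          # the group starts as half of the dataset
history = [share]
for _ in range(10):  # ten dataset generations
    share = next_generation_share(share)
    history.append(share)

# history rises monotonically toward 1.0: each generation of data is
# more skewed than the one it was trained from.
```

The point of the sketch is only that the update is self-reinforcing whenever the sampling is biased (amplification above 1), which is the "next generation of datasets" problem the speaker describes.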
AD
AE
Thank you, council members and authors of the ordinance, for your service to the city. My name is Norris Lima Langguth Arthur, and I use she/her pronouns. I'm a resident of Jamaica Plain, and I'm here in my capacity as a private citizen. I won't repeat what everyone else said, but I will say that the use of facial recognition by law enforcement today is especially disturbing due to the police violence in this country and the dysfunctional relationship it has with black and brown people in the city. And with all due respect...
AE
AE
The ordinance should establish a total ban on the use of facial recognition software in all public spaces, including by law enforcement; in malls and shopping centers, retail spaces, and restaurants open to the public; in state government buildings, courts, public schools, and schools funded by the state; on public streets; in places of employment as a condition of employment; and as a condition of landlords leasing rental properties. The ban should also apply to the use and purchase of facial recognition.
AE
A
AF
Thank you so much, dear councillor. Thank you for allowing me the opportunity to speak in support of the ordinance banning face surveillance in Boston. I am a professor of law and computer science at Northeastern University who has been researching and writing about the risks of facial recognition technologies for over seven years. I make these comments in my personal academic capacity; I'm not serving as an advocate for any particular group.
AF
Please allow me to be direct: facial recognition technology is the most dangerous surveillance tool ever invented. It poses substantial threats to civil liberties, privacy, and democratic accountability. Quite simply, the world has never seen anything like it. Traditional legal rules, such as requiring legal process or people's consent before surveillance can be conducted, will only entrench these systems and lead to a more watched society where we are all treated as suspects all the time. Anything less than a ban will inevitably lead to unacceptable abuse of the massive power bestowed by these systems.
AF
That is why I believe that this ordinance is justified and necessary. There are many ways that law enforcement use of facial recognition technology can harm people. Government surveillance has disproportionately targeted marginalized communities, specifically people of color. Facial recognition will only make this worse, because it is inaccurate and biased along racial and gender lines. And while facial recognition is unacceptable when it is biased and error-prone, the more accurate it becomes, the more oppressive it will get.
AF
Accurate systems will be more heavily used and invested in, further endangering the people of Boston. Use of facial recognition seems likely to create a pervasive atmosphere of chill: these tools make it easier to engage in surveillance, which means that more surveillance can and will occur. The mere prospect of a hyper-surveilled society could routinely prevent citizens from engaging in First Amendment protected activities, such as protesting and worshipping, for fear of ending up on government watch lists.
AF
Facial recognition also poses a threat to our ideals of due process, because it makes it all too easy for governments to excessively enforce minor infractions as pretext for secretly monitoring and retaliating against citizens, who are often targeted for speaking up, like journalists, whistleblowers, and activists. The net result would be anxious and oppressed citizens who are denied fundamental opportunities and rights. For the reasons that I outline above, I strongly support the ordinance banning facial recognition. It's very simple.
AF
A
AG
I don't wish to echo what other excellent advocates have been saying, but I'm here representing Digital Fourth / Restore the Fourth Boston, which is a volunteer-based advocacy group on surveillance and policing issues in the Boston area, and I want to highlight some comments of Commissioner Gross in addition.
Q
AG
There have been a number of studies now that show that current facial recognition technologies are perfectly capable of misidentifying legislators as criminals, and these systems being 100% accurate would pose even more worrying risks. Think about that as city councilors: where are you going when you're out in public? Are you meeting with social justice groups? Are you meeting with immigrant advocates? Are you meeting with, God forbid, police reform and accountability groups? Are you doing something embarrassing?
AG
Perhaps that could be used to influence your votes on matters of public interest or matters relating to the police budget. The risks of these things happening are very real if this technology is implemented in the future, and the only appropriate response is for today's City Council to ban this technology.
AH
What are we talking about? We are talking about a force whose actions and racial profiling techniques have been documented, tried in court, and are notoriously known for targeting and terrorizing, you know, those from disinvested communities, right. You know, we're talking about militarization, you know, to intimidate and force their will on black and brown citizens of the Commonwealth; a force whose own Field Interrogation and Observation records show almost seventy percent of them were black residents of the city.
AH
AH
You know, people of color and children and the elderly and women. So, you know, I say that it needs to be banned, because we can't continue condoning and reassuring and fueling, you know, for other generations of black and brown men and women from Dorchester, Roxbury, and Mattapan, you know, more trauma, right, and more incarceration.
AH
A
Very much; I really do appreciate it. I think, again, when folks can speak from the heart about their own experience and specifically talk about how it would impact them, I do appreciate that. Not that I don't appreciate the experts, but I'm telling you, there's nothing better than hearing about how residents are directly impacted. Thank you. I also have on the list Cynthia Pages; I don't have a name, but I have an organization called For the People Boston. I don't think we have either one.
AI
Hey everyone, all right. Thank you. Thank you to the council and the panelists for allowing me to speak. So I'm a machine learning engineer working in an adjacent field and a resident of Jamaica Plain, and I'm here today as a private citizen. This is my first time speaking at a public hearing, so cut me some slack. Although I'm not an expert on the surveillance aspect and applications, I am familiar with the technology aspect of this stuff.
AI
So I felt the need to testify today because of the gravity of this issue. Although the technology may seem benign at first glance, this is an extremely powerful and dangerous tool. To me this amounts to basically a form of digital stop-and-frisk, and it's even more dangerous, since it can be constantly running, constantly working, anywhere, at all times, tracking folks. This seems like a clear violation of the Fourth Amendment to me, as many commenters and panelists today have noted.
AI
In addition, as many folks have noted today, the technology is certainly not perfect, and it's riddled with biases based on the data that it was trained on, how it was trained, and the folks that trained it. And although this is an important thing to note, this would really be a moot point when evaluating these technologies for deployment, right: because while a poorly functioning system is bad and introduces issues, a
AI
perfectly functioning system would be much, much worse, and would basically signal a world where we could be tracked at any time the police or our government wanted. So that's really what I wanted to say: I wanted to state my support for a ban on any type of facial recognition technology for use by the police or the government, and that we should disincentivize it.
AI
A
Perfect on time for the first time participating; we really appreciate it. I hope that this is not your last time; we are dealing with a lot of different issues, and your voice is definitely welcome. Thank you so much. Up next, sorry, up next I have, and I'm going to try the name, Will Luckman, followed by Lisette Medina, and then also Nathan Sheard. I apologize again if I miss your name; feel free to just state your name again. For folks who have already testified, again, since we're over capacity,
A
if you sign out and then watch through YouTube, that allows for someone else to get in line to also speak, so just letting you know that that's what we're trying to do. So I will now turn it over. Will is ready to go, and you have two minutes. Thank
AJ
you. Good afternoon, my name is Wil Luckman, and I serve as an organizer with the Surveillance Technology Oversight Project, or STOP. STOP advocates and litigates to fight discriminatory surveillance. Thank you, councilors, for holding this hearing today and for proposing this crucial reform. Today my oral remarks are an excerpt of the written testimony being entered into the record. Facial recognition is biased, broken, and, when it works, antithetical to a democratic society. And crucially, without this ban, more people of color will be wrongly stopped by the police.
AJ
At a moment when the dangers of police encounters have never been clearer, the technology that drives facial recognition is far more subjective than many realize. Artificial intelligence is an aggregation of countless human decisions codified into algorithms, and as a result, human bias can infect AI systems, including those that supposedly recognize faces, in countless ways. For example, if a security camera learns who is "suspicious-looking" using pictures of inmates, the photos will teach the AI to replicate the mass incarceration of African-American men.
AJ
In this way, AI can learn to be just like us, exacerbating structural discrimination against marginalized communities. As we've heard, even if facial recognition worked without errors, even if it had no bias, the technology would still remain antithetical to everything the city of Boston believes. Facial recognition manufacturers are trying to create a system that allows everyone to be tracked at every moment, in perpetuity. Go to a protest? The system knows. Go to a health facility? It keeps a record. Suddenly, Bostonians lose the freedom of movement that is essential to an open society.
AJ
If the city fails to act soon, it will only become harder to enact reforms. Companies are pressuring local, state, and federal agencies to adopt facial recognition. BriefCam, the software powering Boston's surveillance camera network, has released a new version of their software that would easily integrate invasive facial recognition tools.
AJ
Although the commissioner says he will reevaluate the new version on offer, he also expressed interest in waiting until the accuracy of the technology improves, or making exceptions for facial recognition use right now. To definitively and permanently avoid the issues with facial recognition that I've raised here, we need a comprehensive ban now. And I'll conclude on a personal note: I live and work in New York City, but I was born in Boston and raised in Brookline.
AJ
It pains me to see the current wave of protests roiling the area, because it demonstrates that the biased and unequal law enforcement practices I remember from my youth have yet to be addressed. I know that the people of the Commonwealth want to see a change, and I believe that the council is on their side. In practice, inaccuracies aside, facial recognition systems lead to increased stops for people of color, and increased stops mean an increase in opportunity for police abuse.
AJ
We must recognize that black lives matter, and to do so, we must realize that technology doesn't operate in a neutral vacuum. Instead, it takes on the character of those building and deploying it. I encourage the council to respond to their constituents' demands for police reform by immediately banning the use of this harmful technology in Boston. Thank you.
AK
Hi, good afternoon. My name is Lisette Medina; I'm here as a private citizen. I'm actually pretty new to all of this; just this week I found out from the Student Immigration Movement kind of what's going on, and I just wanted to voice my support for this ban. I think, personally, as an immigrant, and speaking toward the presence of facial recognition technology in the school system, I think especially where we have so many conversations about, like, the school-to-prison pipeline.
AK
This tool could be used to facilitate that kind of work in racially profiling our students, and I think, now more than ever, we are at this point in time and have a decision to make in protecting our residents of color, our black and brown residents and students, as they live and try to navigate the various systems that are already placing pressure on them. And so I think, especially in light of recent events,
AK
this is really a stand to support residents of color in Boston over whatever possible pros could be seen in this. And I also think it's troubling that the Commissioner doesn't seem to be clear about what's kind of going on with this; in addition, you know, it isn't clear that there is even any oversight of whether or not this could be used already.
AK
A
AL
You said it perfectly, thank you. So, my name is Nathan Sheard, and thank you for allowing me to speak on behalf of the Electronic Frontier Foundation and our 30,000 members. The Electronic Frontier Foundation strongly supports legislation that bans government agencies and employees from using face surveillance technology. We thank the sponsors of this ordinance for their attention to this critical issue. Face surveillance is profoundly dangerous for many reasons. First, in tracking our faces, a unique marker that we cannot change, it invades our privacy. Second:
AL
Government use of the technology in public places will chill people from engaging in First Amendment and other protected activities. Research shows, and courts have long recognized, that government surveillance of First Amendment activity has a deterrent effect. Third, surveillance technologies have an unfair, disparate impact against people of color, immigrants, and other vulnerable populations. Watch lists are frequently over-inclusive, error-riddled, and used in conjunction with powerful mathematical algorithms, which often amplify bias. Face surveillance is so dangerous that governments must not use it at all.
AL
We support the aims of this ordinance and respectfully seek three amendments to assure that the bill will appropriately protect the rights of Boston residents. First, the bill has an exemption for evidence generated by face surveillance that relates to the investigation of a crime. EFF respectfully asks that it be amended to prohibit the use of face-surveillance-enabled evidence generated by, or at the request of, the Boston Police Department or any Boston official. Second, the private right of action does not provide fee shifting for a prevailing plaintiff. Without such fee
AL
shifting, the only private enforcers will be advocacy organizations and wealthy individuals, so EFF respectfully asks that language be included to award reasonable attorney fees to a prevailing plaintiff. Third, the ban extends to private sector use of face surveillance conducted with a government permit. The better approach, as was offered earlier, is to make sure that face surveillance is used only with informed opt-in consent.
AL
EFF respectfully requests that the language be limited to prohibiting private sector permitting for the use of face surveillance at the government's request. In closing, EFF once again thanks the sponsors, looks forward to seeing an ordinance passed that bans government use of face surveillance in Boston, and will enthusiastically support the ordinance banning face recognition technology in Boston if it is amended in these ways. Thank you.
A
And I want to make sure I have all three, the first of which is to ban the use by the BPD.
A
By the BPD, excuse me, no exceptions, period. Two, that currently, because it doesn't allow for fee shifting or attorney fees, it really does put it on individuals to take it on their own, without having any kind of, I guess, protection or support that helps it to be funded. And then three, to extend it to private actors down the supply chain, where the city of Boston can tell private actors that contract with us... because I'd
N
A
AL
Right now it prohibits the offering of a permit to a private company, agency, person, what have you, whereas we would tighten it up to say that a permit could be provided as long as it wasn't being provided for face surveillance to be collected at the behest, or at the request, of the Boston Police Department.
AL
So really making sure that, you know, we're speaking back to the fact that private use would be better protected by something similar to the Biometric Information Privacy Act in Illinois, which requires informed opt-in consent from the individual, rather than simply saying that no private actor can do face surveillance.
A
Thank you, that was a great clarification. I'm going to go now to the next speakers. Thank you very much. Sabrina Barroso, Melda, Christina Vasquez.
AM
Thank you so much. Just to let folks know, Christina is not with us; some folks had to leave because they have to work. So hi everyone, hi city councilors. My name is Sabrina Barroso; I am the lead organizer for the Student Immigrant Movement, the first undocumented immigrant youth-led organization in the state of Massachusetts.
AM
We believe that all young people are powerful, and that when youth are supported and have space and investment to grow, they flourish into the leaders of today and tomorrow. At SIM we are invested in protecting and working on behalf of the undocumented youth and families of Massachusetts. That's why I'm here with you all today, and I would like to make clear that we must prohibit the use of facial recognition technology and software in the city of Boston.
AM
For far too long, the Boston Police Department has been allowed to have full flexibility and free will over how they police and surveil our communities, and the consequences are devastating. Our people are being criminalized and funneled into prisons, detention, and deportation. We need full community control over the police, and this is a key move to bringing power to our people. Right now, there are no laws that control what the BPD can purchase for surveillance or how they use it.
AM
When I learned this, I was scared, because I thought about the history between the BPD and families like mine. I thought about the people that I love and the people I care about, and I thought about the people who are constantly harassed and targeted by law enforcement simply for existing. So, if you ask me, facial recognition is not to be trusted in the hands of law enforcement that tracks, monitors, and hyper-surveils black and brown communities, and that puts immigrant and activist youth under constant scrutiny and criminalization, councilors.
AM
Would you really trust a police department that shares information with ICE, that shares student information with ICE as well, that has led to the deportation of students, that exchanged emails saying "happy hunting" with ICE? This department has historically targeted black and brown youth, invading their personal space, their bodies, and their privacy with stop-and-frisk. They have not been ashamed to separate youth from their families and from their communities, and then end up pinning injustices and crimes on us.
AM
The BPD says that they don't use facial surveillance now, but they have access to it, and they have a four hundred fourteen million dollar budget to buy it right now. Our communities need access to funding for health care, housing, and so many other things; the list goes on. For years, residents of Boston have been demanding to fund things that truly matter to them, and we have to invest in things that are not face surveillance, which doesn't even work and is irrelevant to keeping our community safe.
AM
We must invest in intentional restorative justice practices, health care, mental health, and resources to prevent violence and distress. Facial surveillance would end privacy as we know it and completely throw off the balance of power between people and their government. Listen to youth and listen to our community, and let's follow the lead of young people who are addressing the deeper issue of what it is to criminalize our people, and let's put the power in the hands of the people, especially when it comes to governing surveillance. Thank you all so much. Thank you.
AM
She's not available? Okay — is he?
AN
Yes, okay. Hi, my name is Melda and I am a member of the Student Immigrant Movement. Today I'm here to testify in support of the ordinance banning facial recognition technology in Boston. This will establish a ban on government use of face surveillance in the city of Boston. Boston must pass this ordinance to join Springfield, Somerville, Cambridge, Brookline, and Northampton in protecting racial justice, privacy, and freedom of speech. A federal government study published in December 2019 found that face recognition
AN
algorithms were much more likely to fail when attempting to identify the faces of people of color, children, the elderly, and women. That means the technology only reliably works for middle-aged white men — a very small fraction of Boston residents. As a young person in the city of Boston, it is clear to me that this type of technology is not okay to use in our communities.
AN
The fact that this technology is used for control, typically in communities of color, makes my family and me feel unsafe, as well as the other families in these communities. Black and brown kids are put in databases that are used against them without warrant or reason. This interrupts our education and limits our futures, and it keeps kids from growing up feeling safe. Black and brown kids should be able to walk on the street without the fear of not knowing what's going to happen.
AN
This shows how this surveillance is used to target Black and brown communities through racial profiling and criminalization — one of the many forms of systemic racism. Immigrant youth do not trust police, because we know that they are very close with ICE. We know that they sent student information to ICE, and this has led to the deportation of young people and pain in our communities. Black and brown communities are constantly targeted.
AN
We face increased surveillance and criminalization, and it's not right. Everyone on the council, at some point, has talked about supporting young people, and young people are our future leaders. We are leaders right now. We are telling you to do what everyone knows is right and ban the use of facial recognition surveillance. We can do more to protect immigrant families in Boston. Stop selling youth and families out.
AN
A
Two minutes.
AP
Oh — thank you so much.
AP
This is Eli. Thank you so much. My name is Eli Harmon, I live in Mission Hill, and I've been doing work with the Student Immigrant Movement; I'm testifying on their behalf today in support of the ordinance. The issue of police surveillance is enormously important to people all across the city of Boston, though it's of particular concern to the communities of color it overwhelmingly targets. A study done by MIT research showed that this type of surveillance
AP
technology is far less accurate for people of color — in fact, it was inaccurate for thirty-five percent of black women and was most accurate for white adult men. Law enforcement today, of course, already overwhelmingly targets Black and brown communities, and this type of technology is likely to only make that more the case. Additionally, this type of technology is a complete violation of people's privacy.
AP
Many cities that neighbor Boston, including Somerville, Brookline, Cambridge, and Springfield, have chosen to ban facial surveillance because they understand that it is dangerous and harmful to communities of color, that it is a violation of people's privacy, and that the money currently being used to fund that technology would be better spent in places that actually improve lives in Boston. For these reasons, I ask the council to pass a prohibition on the use of facial recognition surveillance and give the community control over surveillance technology and its usage. Thank you so much.
A
Thank you very much. I do see Lonnie.
AC
A ban on face surveillance technology is important in the city of Boston because this technology is dangerous and leads to serious misuse in the community. Face surveillance is proven to be less accurate for people of darker skin, and Black people already face extreme levels of police violence and harassment; the racial bias in face surveillance will put lives in danger. But even if it were accurate, we should still be doing all that we can to avoid a future where, every time we go to the doctor, a place of worship, or a protest,
AC
the government is scanning our faces and recording our movements. We already know that law enforcement is constantly sharing and misusing information. What would happen to our people, who are already under constant surveillance and criminalization? Using face surveillance, the government could begin to identify and keep a list of every person at a protest, and of people who are driving or walking the street every single day across the city, on an ongoing, automatic basis. It would keep the same harmful surveillance going, like what happened with the use of the gang database.
AC
It would criminalize anyone who makes a move in our city. If we want this city to be safe, we need to put our money where our mouth is and invest in practices such as restorative justice to help restore our community. Facial surveillance is not the answer. If anything, it is distracting us from addressing our real concerns about safety in our communities.
A
Thank you very much. I want to say thank you to all of the youth who have testified and will testify, and to SIM and all those folks showing up — I just want to say thank you so much on behalf of the City Council. You are our future, and you're doing an amazing job. I have Anelady Diaz and then Jose Clas, and I see that Emily Wright, who had signed up before, might also be available — so those are the next three names. I think Nathalie may no longer be with us.
AQ
Hi everyone, I'm here on behalf of SIM to testify in favor of the ban on facial recognition software in Boston. I am an MIT alum working as a software engineer in Boston, and I am a DACA recipient as well. I want to tell you firsthand, as a software engineer working on AI- and machine-learning-enabled robots, that the software used for facial recognition is far from perfect.
AQ
In fact, as Joy found and others have noted, one in three black women is likely to be misclassified by this technology, and our own federal government has found that face recognition algorithms were more likely to fail when attempting to identify the faces of people of color, children, the elderly, and women. The software, after all, is written by people, and it is inherently filled with mistakes and bias.
AQ
Despite the detailed quality assurance processes that all production-level software undergoes, bugs will always exist — this is an accepted fact in the software industry. No matter how good your software or your testing process is, bugs in the software will always exist. Consequently, the use of facial recognition technology by government entities, especially police departments, must be banned, because the consequences of errors in this technology are a matter of life and death. A false identification by facial recognition
AQ
software could lead to an unwarranted confrontation with a police department. A seemingly small error in facial recognition software may not mean much to you, but for an undocumented immigrant and a person of color, any confrontation with the police could mean deportation, or even my life.
AQ
The police have a record of systemic racial bias leading to the deaths of thousands of people. Government entities, especially police forces, do not need any more tools to further their racial bias and, consequently, the systemic killing of the very people they are meant to protect. I encourage you to press pause on face surveillance by government entities in the city of Boston by supporting this crucial ban. Thank you.
AB
All right — yeah, sorry about that, I was having internet issues. So, my name is Emily Wright and I work in a machine learning research group, although of course my views are my own and I'm here as a citizen. I strongly support the ban on facial recognition technology. Many people have said similar things to what I'm about to say, and in much better words, but I just want to reiterate: even when working correctly, there are so many major privacy concerns with these technologies. By definition, they're designed to track us wherever and whenever we're in public spaces, and to aggregate that data.
AB
It's a well-documented phenomenon — not a good one — and that's all about facial recognition technology when it's working perfectly. But again, as people have noted, that's so far from the truth. There's a ton of research on this by the ACLU, Timnit Gebru, and Joy, who spoke earlier, and I've seen this in my own work — our team's research is related to this area: these models are just totally not reliable for data that is different from what they're trained on, and these inaccuracies don't make the technology any less dangerous.
AB
I mean, one might say, oh, that makes it potentially less dangerous — but it makes it much, much more dangerous when the inaccuracies reinforce the biases and disparities that have always been there in the way that different groups are surveilled and policed and incarcerated. And we saw with Breonna Taylor's death what happens when you falsely think that someone is attacking, or that you're entering the right house — if you are wrong about that, there are just horrible, unspeakable consequences. That's just it.
A
That's perfectly on time. I'm just going to go down the list and also call on Christian DeLeon, Clara Ruiz, Yvette Serrano, and Oliver DeLeon.
AR
Can you hear me fine now? Thank you. My name is Christian de Leon, and I'm here today as a member of the Student Immigrant Movement in support of banning facial recognition in Boston. Surveillance is not only a huge invasion of privacy, but in the wrong hands can be used to even further oppress minorities. It can be used to track immigrants and protesters, and it is inaccurate when tracking people of darker complexion.
AR
As we know, this can be very problematic, since these tools already target these communities through racial profiling. Having surveillance tools like facial recognition will contribute to an even greater systematic oppression. This also has no business being in our schools. To my knowledge, some schools are already investing in face recognition software — investing millions in this new technology — and for what, exactly? To track their students'
AR
every move. I believe schools should spend that time and money on bettering their school and education systems, rather than spending it on software and tools that will in no way improve students' ability to be educated. We are already living in a time when technology can do scary things, and the last thing we need is people watching our every move on a day-to-day basis. During this pandemic, we have grown accustomed to wearing masks and covering our faces in public.
AR
Well, if we now have facial recognition on every street corner, then this temporary thing the world has adapted to may just become permanent — not only to protect our own privacy, but so as not to be falsely accused of crimes due to faulty recognition software. 2020 has had a great display of racism and brutality against people of color, particularly the black community, and I fear that systems such as facial recognition software in the hands of law enforcement can just be another tool they use against us. Thank you for listening to my concerns.
AS
Thank you so much. My name is Clara, I'm a SIM leader, and I'm writing in support of the ordinance banning facial recognition technology in Boston, presented by Councilors Wu and Arroyo. This ordinance establishes a ban on government use of face surveillance in the city of Boston. Boston must pass this ordinance to join Springfield, Somerville, Cambridge, Brookline, and Northampton in protecting racial justice, privacy, and freedom of speech —
AS
free speech and association, equal protection, and government accountability and transparency. The government must be aware of the technical limits of even the most promising new technologies in the face of relying on computer systems that can be prone to error. Facial recognition is creeping into more and more law enforcement agencies with little notice or oversight, and there are practically no laws regulating this invasive surveillance technology.
AS
Meanwhile, ongoing technical limitations to the accuracy of facial recognition create serious problems like misidentification and public safety risks. I refuse to sit back and see my community harmed more than it already is. These surveillance technologies work to the disadvantage of Black and brown communities: the technology has a better success rate when it comes to white males, meaning it is unreliable and would do more harm if we don't ban this surveillance tool.
AS
This technology, if we allow it, will enable misidentification against black and brown people. A ban on face surveillance technology is critically important in the city of Boston because we need to keep the community safe, because all lives matter. A federal government study published in December 2019 found that face recognition algorithms were most likely to fail when attempting to identify the faces of people of color, children, the elderly, and women. That means the technology only reliably works for middle-aged white men.
AS
That is a very small fraction of Boston residents. I encourage you to press pause on the use of face surveillance by government entities in the city of Boston by supporting and passing this crucial ban. We cannot allow Boston to adopt authoritarian, unregulated, biased surveillance technology. Thank you for the opportunity to testify. By the way, I know Yvette is also going to be testifying after this.
AO
Okay, hello — my name is John Yvette Serrano. Good evening. I'm writing on behalf of SIM in support of the ordinance banning facial recognition technology in Boston. I urge my city of Boston to pass this ordinance to join Springfield, Somerville, Cambridge, Brookline, and Northampton in protecting racial justice, privacy, and freedom of speech. It is our duty as a free and just city to ban technology like facial recognition. As a victim of social media
AO
hacking, I have seen how technology can be used as a tool to defame and criminalize, with pictures that feel like proof to the viewers, even when the picture is of an innocent civilian. I refuse to allow those in my community to be exposed to a highly problematic technology like facial recognition. Not only is privacy important and something that should be valued by our law enforcement — face surveillance is proven to be less accurate for people with darker skin. With the high levels of racial discrimination forced on our black and brown communities by law enforcement,
AO
the racial bias in face surveillance will put lives in danger even further than they already are. A ban on face surveillance technology is critically important in the city of Boston, because the technology really is only reliable for a very small fraction of the population. It's highly important to me to live in a city where leaders consider it critical to keep our city and its people safe. Although it may seem as though facial recognition could keep us safer, it has proven to do the opposite.
AO
I urge you to press pause on the use of facial surveillance by government entities in the city of Boston by supporting and passing this crucial ban. We cannot allow Boston to adopt authoritarian, unregulated, biased surveillance technology. Thank you for your attention.

Danny, we're going to — thank you very much.
AT
Just — you know, facial recognition: I can sincerely tell you that it terrified me, the thought of somebody accessing this technology without my consent or knowledge. I don't know if this would have allowed me to finish school at that time, and it would have been hard to finish — possibly higher education too. Without that education, I would not have been able to secure job opportunities like I have now, with a paying job that provides for my family. So when we start thinking about that,
AT
you really start seeing how this facial recognition can actually have a negative impact on the education of our youth and our future. People already talk about how the U.S. is riddled with racial bias, and this software is just giving people already in power the tools to abuse it even more. For example, ICE has already shown up at the protests and detained a few people.
AT
Now we have to ask ourselves: how the heck did they pick out these few people in the sea of thousands of protesters? One can only speculate, but one of the few answers can be facial recognition software. Do they have it? How can we know that they're not using it? Well, it's pretty hard to pick out two or three individuals out of thousands and thousands of protesters, so it's up to us to speculate about what that is. The coronavirus has caused a major epidemic.
AT
Police brutality has caused a major epidemic. I guess we have to ask ourselves: are you allowing the next epidemic to happen? Maybe facial recognition in the city of Boston will bring us to the middle of the next battle — and which side do we want to be on? Let's not approve this software. I urge you to ban facial recognition technology, as it also affects our youth.
A
Thank you very much. It's ten to six, and I'm going to have to stop chairing at six o'clock. That only means that I will be turning over the gavel to one of the lead sponsors, Michelle Wu, at six o'clock. But I'm going to call the next names of individuals who are signed up to speak before I go, and she will continue as chair. I have Angel, Michelle, Leon Smith, and Amy Wu.
AU
It can get the wrong person. I am young, and as I get older I will not look the same. I am seven now; when I turn 13, I will look different, and when I turn 18, I will look different. I get older and my face changes. If these tools are used in school, I will not feel safe. We are just children — our teachers are already taking care of us and watching us. We do not need police watching us. Thank you, SIM, for your love and support.
A
Well, I have to say, I think you're the youngest person that's ever testified in anything I've ever chaired or participated in. I know we're all on Zoom, but I will give you a round of applause. Thank you so much — that was absolutely beautiful, very powerful testimony, and we're so proud of you. Your City Council and your city are so proud of you. Thank you so much for your testimony, Angel. I'm going to now call on — I don't think I see Michelle or Leon; I don't hear them either.
AV
Okay — and I'll also say, following Angel is a bit much! So, hello: my name is Michelle Barrios, and I am a social studies teacher in the Boston area. Today I am here speaking on behalf of the Student Immigrant Movement, myself as an educator, and my diverse student population, in support of the ordinance banning facial recognition technology in Boston, presented by Councilors Wu and Arroyo. This ordinance establishes a ban on government use of facial surveillance in the city of Boston.
AV
Boston must pass this ordinance to join Springfield, Somerville, Cambridge, Brookline, and Northampton in protecting racial justice, privacy, and freedom of speech. In my training as a social studies teacher, I became increasingly aware of the trauma that students of color develop as they grow up in a society that seeks to police them on the basis of their ethnicity, economic status, and place of residence.
AV
This trauma is exacerbated when a young person comes from an immigrant family whose status in this country places them at risk of family separation, a loss of education, and a stripping of basic human rights. We know that students struggle to succeed academically when they face these daily existential traumas, which, in essence, is a violation of their rights in the U.S. to pursue survival, let alone social mobility and a future of stability.
AV
In addition to my training and work as an educator, I am the wife of a Latino immigrant and the mother of two mestizo children. While I am a white Latina and can live under the presumption of innocence due to my white privilege, I have witnessed firsthand what it means to live in a society that has not yet made good on its promise of equality, liberty, and justice for all. My husband is followed in stores.
AV
Teachers hesitate to release our children to him when they first meet him, and he has been asked for his immigration status by customers in his place of work. We are in the fortunate position of knowing that his life and our family can count on his permanent residency to keep us together — for now. However, I cannot say the same for many Latinx families in the Greater Boston area.
AV
A ban on face surveillance technology is critically important in the city of Boston, because face surveillance harms our privacy and our freedom of speech — a fundamental right in our republic's Constitution. This type of surveillance threatens to create a world where people are watched and identified as they exercise their right to protest, congregate at places of worship, and visit medical providers as they go about their daily lives. This matters especially for my students of color — specifically, the young men growing up in a world that still does not fully accept that their lives matter.
AV
Face surveillance technology is not only another threat to their lives, but a response saying to these young people: you still don't matter. I teach world history as well as comparative government and politics, and when I teach about authoritarianism, I teach about
AV
its impact on marginalized groups in countries that are still considered developing. I teach that in these recently free countries, surveillance is one of the most common tools to manage dissent, and that it is often followed by state-sanctioned violence as a means to silence any dissidents. I urge each of you to consider the statement that you would make to the public by not banning the use of facial surveillance technology in Boston.
AV
I urge each of you to imagine a fourteen-year-old student of color, or a Latino essential worker, in front of you, asking if their lives matter. What is your response? I encourage you to consider that moment and to press pause on the use of face surveillance by government entities in the city of Boston by supporting and passing this crucial ban. We cannot allow Boston to adopt authoritarian, unregulated, biased surveillance technology.
A
Thank you very much. At this point I'm going to turn over the microphone — or the gavel, if you will — to my colleague Michelle Wu. I'll just say that the next speakers should be getting ready: Julie McNulty, Zachary Loan, Cristina Rivera, and Joseph Friedman. I want to say thank you so much to the lead sponsors of this ordinance. I will be planning to have a working session as soon as possible.
A
I do understand the urgency, and I do believe that there is a great opportunity before us to lead in a way that values not only the lives of our Black and brown residents but, in general, our civil liberties. So I want to thank the lead sponsors again for their leadership. I look forward to getting this done, and with that, I am going to have to sign off and turn it over to Michelle Wu. Thank you.
F
Just a note: could council central staff give me administrative rights, so I can help check whether folks are stuck in the waiting room and move people over into the panelist section? Thank you. So again, it was Julie, Zachary, Cristina Rivera, Joseph Friedman, Danielle, and Sam up next for the next few.
U
Yes, thank you for having me. As stated, my name is Christina Rivera and I am an East Boston resident. I'm also the founder of the Latinx Coalition for Black Lives Matter. It is a support organization to help end racism within our community, but also to help advance the work of the Black Lives Matter movement. I want to say that we as an organization fully support the ordinance for a ban on facial recognition technology, as stated all throughout this meeting earlier.
U
We know that automation bias exists and that facial recognition technology clearly shows bias toward already highly targeted and hyper-policed communities, specifically those of color. I just want to pose the question: how can we begin to trust, even with the development of algorithmic accuracy, that those who will use it in the future will not abuse its capabilities?
U
We already have a police force that has shown, time and time again, that some of their own procedures are still abused to this day by bad actors, so adding a tool that can be used to further affirm racial bias only provides another route to target our communities. Using this technology also enforces a policing system that is based on control, and that is a direct symptom of the mistrust that already exists between police and their own citizens — the ones that they serve. And so, as some people stated earlier, specifically Mr.
U
de Leon's point about which government agencies are already allowed to use this versus places that have enacted bans: what is the legislative power by which any federal agency will be able to use it in individual states, specifically when it comes to ICE and immigration? So this is something where we are looking to make sure that it not only is banned now, but also cannot be allowed in the future. With that said, thank you so much to all the panelists who've shared all this great information.
AW
Yes, thank you for your time. I'd like to tell you a little bit about the science behind emotion detection technologies — technologies that use information captured from face surveillance to supposedly recognize emotions.
AW
I'll be speaking here as a private citizen, but I'll tell you that for the past three years I've been working at a lab at Northeastern University which studies emotions, and I've been a research assistant at the Harvard Kennedy School's Belfer Center for Science and International Affairs thinking about this same type of technology. At the Interdisciplinary Affective Science Laboratory at Northeastern, one of my bosses is Dr. Lisa Feldman Barrett.
AW
Dr. Barrett has research appointments at MGH and Harvard Medical School; she's the chief science officer at the MGH Center for Law, Brain and Behavior, the author of over 230 peer-reviewed scientific papers, and she just finished a term as president of the Association for Psychological Science, where she represented tens of thousands of scientists around the world. Three years ago, this organization commissioned her and four other experts — psychologists, computer scientists, and so on —
AW
to do a major study on whether people across the world express emotions like anger, sadness, fear, disgust, surprise, and happiness with distinctive movements of the face. All of these scientists started with different perspectives, but they reviewed a thousand research articles and came to a very simple and very important consensus. The question is: can you reliably read emotions in the human face? And the answer is no. Here's an example from Dr. Barrett: people living in Western cultures scowl when they're angry about 30% of the time.
AW
This means that they make other facial expressions — frowns, wide-eyed gasps, smiles — about seventy percent of the time. This is called low reliability. People scowl when angry more often than chance, but they also scowl when they're not angry — if they're concentrating, or if they have something like gas. This is called low specificity. So a scowl is not the expression of anger; it's one of many expressions of anger, and sometimes it doesn't express anger at all. In this paper, Dr. Barrett and her colleagues showed that scowling in anger, smiling in happiness, frowning in sadness —
AW
these are all Western stereotypes of emotional expressions. They reflect some common beliefs about emotional expressions, but those beliefs don't correspond to how people actually move their faces in real life, and they don't generalize to cultures that are different from ours. So it's not possible for anyone or anything to read an emotion on our face, and we shouldn't count on inferring happiness from a smile, anger from a scowl, or sadness from a frown. There are technologies that claim to do this, and they're misrepresenting what they can do.
AW
According to the best available evidence, people aren't moving their faces randomly, but there's a lot of variation, and the meaning of a facial movement depends on the person, on the context, and on culture. Let me give you one example of how making this assumption can be really harmful. A scientist, Dr. Lauren Rhue, when she was at Wake Forest University, published work reviewing two popular emotion detection algorithms. She ran these algorithms on portraits of white and black NBA players, and both of them consistently interpreted
AW
the black players as having more negative emotions than the white players. Imagine a security guard who gets information from a surveillance camera that there's a possible nuisance in the lobby of their building because of the facial movements somebody is making. Imagine a defendant in a courtroom who is scowling as they concentrate on the legal proceedings, but an emotion AI based on facial surveillance tells the jury that the defendant is angry. Imagine that the defendant is black and the AI says they're contemptuous.
AW
Imagine, at worst, a police officer receiving an alert from a body camera based on this technology that says that someone in their line of sight is a threat, based on a so-called reading of their face. Councilors, I think the question we should be thinking about is whether we would want someone to use this technology on us, given the major chance that it has of misinterpreting our face — of being wrong. I think the clear answer to that is no.
AA
Hi, good afternoon, everybody. My name is Danielle. I'm a Boston resident — I live in the North End — and I just wanted to fully support and put my voice behind the ban on facial recognition technology. I feel as though the expert witnesses and activists who have gone before me have clearly and eloquently expressed the views that I also share. I find this technology to be incredibly dangerous, and as a cybersecurity professional, I have a lot of questions about what it means to be collecting data on people and storing it.
AA
I certainly think that this is a measure that should be taken to set a precedent for all future developments of the technology. As was just stated, there are upcoming and new ways to extend facial recognition — through emotions or otherwise — that we can't even comprehend yet, because it hasn't started. So I just wanted to support it, thank all the other people for putting their voice behind it, and I hope to see it take place. Thank you.
AX
Hi, I will start the video. I'm DJ Hadfield. I am a professor of anthropology and history at the Berklee College of Music in Boston; however, what I have to say is as a private citizen and a resident of East Boston. As a shout-out to Lydia: East Boston pride, yes. As an anthropologist, I'm aware that all technologies, including digital technologies, reflect and augment the social structures and systems of value of particular societies. They are not objective but reflect the biases of those societies as they develop through history.
AX
Therefore, in the city of Boston, where we take pride in our civic freedoms, we should be wary of technologies that would make our Fourth Amendment meaningless and would erode the First Amendment. I would urge the City Council and the city of Boston to pass a complete ban on facial recognition systems and technology.
AX
F
Thank you; we're excited to have you at our hearing and hope you'll come back. Denise is next, and then after Denise there are a few names signed up that I didn't see in the room. So I'm going to call out the folks that I was able to cross-reference between the waiting room and the list, which was Charles Griswold and Nora Paul-Schultz, and then after that I'm going to try to admit everyone else who's in the room and go in order, even if you weren't on the original list. So, Denise, please.
S
I do not think that we should underestimate how this type of technology could be used to further racist and authoritarian agendas. Here, the use of facial surveillance is used to oppress and control citizens under the guise of public safety. This is a slippery slope. Government surveillance is used to silence, to suppress speech, and to violate human rights. As an educator, I worry about this type.
S
AY
Thank you. My name is Charles Griswold. I'm a resident of Brighton and I'm a philosophy professor at Boston University, and I'm offering my thoughts here; in offering my thoughts, I am not speaking for my university and not acting on its behalf. I'm speaking on my own behalf only. I'm very grateful to all of you in the council for taking action on this matter, and to the police commissioner for his impressive and in some ways reassuring testimony. It's imperative,
AY
I think, that you approve this ban on facial recognition technology, or rather technologies, as we've learned today, by government, before the relevant software is upgraded. So far as I can tell, three basic arguments for the ban have been discussed today, the first two extensively and the third less so, and taken together they seem to me to be decisive in making the case for the ban.
AY
In sum, the technology puts tremendous additional power in the hands of government and its agents, all the more so when it actually does work, and our privacy, among other things, is certainly threatened by that. Where could that lead? Look no further, as people have suggested, than the nightmare that is the surveillance apparatus in China today. The third point has been mentioned but was less emphasized, and I
AY
think it's just worth underlining: the awareness of being personally tracked outside of one's home by this technology can, and I think will, have a chilling effect on our exercise of our civil rights, including our rights of free speech, freedom of association, and freedom of religion. So I strongly urge the council to pass this ban, but I also ask that you not stop
AY
there. Expand the ban even further going forward, to include other kinds of remote biometric surveillance systems, for example gait and voice recognition systems, and also the private sector, as has been suggested. So I hope that you will consider expeditiously passing this, but also going further. Thank you so much for listening. Thank you.
AZ
Good evening. My name is Nora Paul-Schultz, and I am a physics teacher in Boston Public Schools. I've had the pleasure of teaching Eli and Esmelda, who spoke earlier, and hopefully, one day, Angel, and I'm a resident of Jamaica Plain. As someone who studied engineering in college, has taught engineering, and is one of the robotics coaches at the O'Bryant, I understand the impulse to feel like technology can solve all of our problems.
AZ
I know that for many there is a desire that if we have better, smarter, and more technology, then our communities will be safer. But the reality is that technology is not perfect; it is an imperfect, biased tool. As much as we wish it would be otherwise, our technologies reflect the inherent biases of their makers, so technology isn't going to save us. It does not take much to see that racism is infused in all parts of our country, and that is true about these facial surveillance technologies.
AZ
Through my work with Unafraid Educators, the Boston Teachers Union's immigrant rights organizing committee, on the information sharing between Boston school police and the Boston Police Department, I know how detrimental observation and surveillance can be. Practices of surveillance, like camera recording in the hallway or officers watching young people, are not passive. Surveillance is active.
AZ
It leads to the creation of law-enforcement profiles about young people, and that can impact their lives in material ways for years to come. And what are those impacts? Being in the system can make it harder for young people to get a job or get housing; it can lead to incarceration; it can lead to deportation. Our city government does not need more tools to profile the people of our city, and it especially doesn't need tools that are inherently racist. This is why the ban is so important.
AZ
This face surveillance technology harms our community. The fact that the technology only identifies Black women correctly one third of the time lays bare both the ineffectiveness and the implicit bias built into the system. As a teacher, I know that that is not a passing grade. We need money to go to services that will protect our community, and we need to stop and prevent the use of ineffective, racist, and money-wasting technologies. Thank you.
F
Thank you, Nora, and we appreciate all of your work. I'm going to quickly read the list of names that were signed up to testify but that I didn't see matching the names of folks left in the waiting room, just in case you're signed in under a different name, though I'm expecting that these folks probably aren't here anymore: Julie McNulty, Zachary Lone, McKenna Kavanagh, Christine Doherty, Sarah Nelson, Amber Theriot, Amanda Mia, Chris Peron, Bridget Shepard, Lena Papaji. Is anyone here and signed on under a different name? Nope.
F
BA
Yes, this is Marc Gurevich, presenting testimony on behalf of Jewish Voice for Peace, a local and national organization that's guided by Jewish values to fight racism, sexism, Islamophobia, LGBTQ oppression, and anti-immigrant policies here, and to challenge racism, colonialism, and oppression everywhere, with a focus on the injustices suffered by Palestinians under Israeli occupation and control. In recent days in the U.S.,
BA
we have again witnessed the impact of militarized policing on communities of color, as Councillor Wu has pointed out in a recent publication about this militarization. Much of this militarization has been through the transfer of billions of dollars' worth of equipment from the military to civilian police forces, and the training of U.S. police in the use of military tactics in controlling the communities most heavily impacted by racism, economic depletion, and health crises. Our view is that the use of facial recognition technology is a key component of the militarization of policing.
BA
We know this because of our knowledge and experience of the Israeli occupation of Palestinian people and land. Facial recognition is a core feature of that occupation, and much of the technology being marketed to U.S. law enforcement has been field-tested on Palestinians under military occupation. At this historic and critical time, when the effects of militarized policing in the U.S. have resulted in nationwide and worldwide condemnation, the very last thing we need is another tool adopted from the toolbox of military occupation. Jewish Voice for Peace supports the face surveillance ban. Thank you.
Z
F
Z
No problem. I just wanted to thank everybody, and I want to say that I am an associate professor at UMass Boston, and both as an educator and as a researcher on issues of surveillance, I am very much in favor of the ordinance. I would also like the council to consider wider regulations, because, as many have said before, there are a lot of issues in the private sphere as well.
Z
So this is a very important first step, but a lot of the use of facial recognition in law enforcement really interacts with the use of surveillance technologies in the private space. For example, when the Brookline Police Department were trying to argue against the Brookline ban, they used an example of police using Snapchat. A lot of speakers before have talked about the importance of anonymity in public space, and I am fully in support of that.
Z
But this is not only about public space; the use of facial recognition software is, of course, invading all our private spaces as well, so I would like people to be aware of that. As a teacher right now, with the pandemic, teaching on Zoom, you can only imagine the deployment of voice and face recognition in all our spaces, and so I think this is very important. Another thing: my background is also in philosophy of neuroscience and in the philosophy
Z
of autonomous action. Our autonomous action depends on being in a space that we can understand, and one of the reasons why facial recognition is one of the most dangerous inventions is that it precisely ruins the integrity of any space, because we don't know who's there, and it can do so retroactively. So I thank you very much for all your work on this, and I'm fully in support of the ordinance. Thank you.
W
I don't have a lot to add. I just want to support the work they're doing and their student advocacy. I know they all spoke for themselves, and their allies did as well, and I just want to let you know that Boston teachers also support this ban. So thank you for listening to all of their testimony, and know that the teachers are on their side too, and that we support this ordinance. Thank you.
F
L
Thank you, madam chair, at this point, and thank you to you and Councillor Arroyo for bringing this before the council. I've really appreciated everyone's testimony today, in both hearings, and there is certainly a very direct correlation between some of the testimony today. Of particular interest to me was the testimony of the young people and the testimony of the teachers. So thank you both, and thank you, Councillor Wu, for staying after the chair had to leave to extend an opportunity for this continued testimony.
L
D
Just a sincere thank you to our panelists. I think many of them are off now, but thank you so much to them and to everybody who gave testimony; much of that was deeply moving. Certainly Angel was a highlight, so thank you, Angel, and anybody who can get that message to Angel. I'm not sure if it's bedtime, but thank you so much for raising your voices and really being focused at a real municipal level. This is what we've always asked for. This is what, you know,
D
I grew up, at about Angel's age, watching these meetings and having folks basically beg for this kind of interaction from our communities, and so I'm just deeply grateful for whatever brought you here specifically. I hope you stay engaged; I hope you bring this energy to other issues that really affect our communities on a daily basis, because it really makes change. It really changes things. And so thank you so much to everybody who took part in this.
D
Thank you to all the advocates who took part in this, and thank you, Councillor, for making sure that there was a chair here, and to Councillor Edwards for first running it so well earlier, but thank you to you for ensuring that we could continue to hear public testimony, because it really is valuable. I thought that was all very valuable, and I also thank the Commissioner for taking the time to be here and to really endorse this idea. I thought that was fantastic. So thank you.
D
F
Thank you. This was a great, great hearing. I think we all learned so much, and I want to echo the thanks to the coalition, all the community members who gave feedback along the way and over this evening, my colleagues, the Commissioner and, of course, central staff for making sure all that went off very, very smoothly. So we'll circle back up with the committee chair and figure out next steps, but hoping for swift passage, and very grateful again that we have an opportunity in Boston to take action.