From YouTube: Committee on Government Operations on June 9, 2020
Description
Docket #0683 - Ordinance banning facial recognition technology in Boston
A: Coming in... okay, yeah, all right then. So I'm gonna go ahead and start, everyone. It's the Committee on Government Operations; today is Tuesday, June 9th, 2020. This is a hearing on Docket 0683, an ordinance banning facial recognition technology in Boston. I am Lydia Edwards, and I'm the chair of the Committee on Government Ops. This matter was sponsored by Councilors Wu and Arroyo on May 6, along with Councilors Mejia and Essaibi-George. In accordance with Governor Baker's executive order modifying certain requirements of the Open Meeting Law...
A: ...we are able to have this hearing and conduct it via Zoom. This enables the City Council to carry out its responsibilities while adhering to public health recommendations. The public may watch this meeting via livestream at www.boston.gov/city-council-tv. It will also be rebroadcast at a later date on Comcast 8, RCN 82, Verizon 1964. For public testimony, written comments may be sent to the committee email at ccc.go@boston.gov, and they will be made part of the official record.
A: This is an ordinance that would ban the use of facial surveillance by the City of Boston or any official in the City of Boston. The proposal would also prohibit entering into agreements or contracts to obtain face surveillance with third parties. I'm going to turn it now over to the lead sponsors, Councilor Wu and Councilor Arroyo, and then I will turn it to my colleagues...
A: ...in order of arrival for opening remarks. And just taking note of my colleagues, I believe the order that I have so far — again, it will be Councilor Arroyo, Councilor Wu, Councilor Braden, Councilor Bok (Bok, excuse me), Councilor Mejia, Councilor Campbell, and then I had Councilor O'Malley, Councilor Essaibi-George, and then — I'm sorry, they started talking; some other councilors have joined us.
E: Shall I start? Sure. Thank you, Madam Chair. I think we have excellent panelists to kind of go into the intricacies of the facial recognition ban, but to just kind of summarize it: this facial recognition ban creates a community process that makes our systems more inclusive, requires community consent, while adding democratic oversight where there currently is none. We will be joining our neighbors in Cambridge and Somerville. And specifically, when it comes to facial recognition tech: it doesn't work. It's not good.
E: It's been proven through the data to be less accurate for people with darker skin. A study by one of our panelists, Joy Buolamwini, a researcher at MIT, found that Black women were 35 percent more likely than white men to be misclassified by face surveillance technology. The reality is that face surveillance isn't very effective in its current form.
E: According to records obtained by the ACLU, one manufacturer promoting this technology in Massachusetts admitted that it might only work about 30% of the time. Currently, my understanding is that the Boston Police Department does not use this. This isn't a ban on surveillance; this is simply a ban on facial recognition surveillance, which I think the data has shown doesn't work.
E: BPD doesn't use it because — I'll now allow Commissioner Gross to speak on it, but my understanding, from what I've heard in past conversations with Commissioner Gross, is that they don't want to use technology that doesn't work well. And so this ban isn't necessarily even a permanent ban. It just creates a process: we are banning something that doesn't have a process, that doesn't have any consent, and we require some democratic oversight if they ever do want to implement it.
G: So I am proud that Boston has the potential to join our sister cities across Massachusetts in taking this immediate step to ban a technology that has been proven to be racially discriminatory and that threatens basic rights. We are thankful that the Boston Police agree with that assessment and do not use facial recognition or facial surveillance today in Boston, and are looking to codify that understanding — so that, with the various mechanisms for technology and upgrades, and, you know, changing circumstances of different administrations and personalities, we just have it on the books.
G: That protections come first, because we truly believe that public health and public safety should be grounded in trust. So, you know, the chair gave a wonderful description of this ordinance. I am looking forward to the discussion, but most of all I'm grateful to colleagues and everyone who has contributed to this conversation, as well as the larger conversation about community-centered oversight of surveillance, which will be in a separate but related docket that will be moving forward at the council. Thank you very much.
H: Oops. Thank you. Thank you, Madam Chair. Thank you to the makers of this ordinance. I really look forward to the conversation this afternoon; I'm very proud to participate in this discussion. I think it's the proper and right thing to do to really consider deeply all the implications of new technology that hasn't been proven to be effective before introducing it in any civilian setting in Massachusetts. So I really look forward to learning more and hearing from the panelists this afternoon. Thank you.
I: Yes, thank you, Madam Chair. I hope everyone [is well]; [I was] outside for a minute. I want to just say I'm excited about the conversation today; I think it's a course of action which we should absolutely take. I just want to say that, you know, I think it's really important that this technology has major shortcomings, and that also it has, you know, a racial bias built into it. Those are really good reasons not to use it. But honestly, even if those did not hold, I think that in this country we run the risk sometimes of just starting to do things that new technology can do without asking ourselves, as a democratic populace, as a community: is this something that we should do? And I think in the process we as Americans have given up a lot of our rights and privacy — frankly, as often as not to private corporations as to government. But I think that it's really important to just remember that there's a true public interest in a society that is built more on trust than on surveillance. And so I think it's really important for us to draw this line in the sand, so that, you know, if the technology improves and some of those arguments that are based on its functionality start to slip away...
K: Yes, thank you — sorry about my audio. Thank you, Chairwoman Edwards, and thank you to the makers, Councilors Wu and Arroyo, for your leadership on this issue. For me, this issue is personal and professional. As a woman who claims her Blackness, facial recognition technology misidentifies people like me by 35%. We're in a time where our technology is outpacing our morals. We've got 2020 technology with a 1620 state of mind, and that's no way to be thinking as a city.
K: [Banning this] technology is a good first step, but I'm happy to hear there's pushback against this kind of surveillance on all sides of the issue, like the makers of the ordinance mentioned. Right now there is not a desire to put facial recognition technology in place, and this ordinance is an opportunity to put that into law. I look forward to the discussion, and I hope to continue to push on this issue alongside Councilors Wu and Arroyo. Thank you so much.
B: Thank you, Madam Chair, and the sponsors, for the hearing. As I've stated, technology updates and advances have provided us with many advantages, but at the same time they have also proven to be deeply flawed, unreliable, and disproportionately impacting communities of color. This technology — just for everyone listening, full disclosure — this technology is currently not being used by the BPD.
B: I want to make sure that you know that I'm gonna continue to remain supportive of studying transparent limits and creating a public process by which we assess the use of these technologies. I know that we've been able to solve a lot of crimes through video surveillance — and also through video, quite frankly — and they've led to justice for families. And I'm someone that has lost a cousin due to murder.
B: But it wasn't, you know, surveillance — it was DNA that led to my family getting justice. So I'm looking forward to learning more about this and the Commissioner's perspective on it, but also recognizing that we collectively need to make sure that we're putting forth decisions that make the most sense for Boston. I'll get real parochial here: quite frankly, I'm not really concerned about Cambridge and Somerville. I want to make sure that whatever we're doing in Boston is impacting, you know, all of our neighborhoods — the neighborhoods I represent.
B: This is a citywide City Council, also — and making sure that we have community input. Most importantly, we work for the people of Boston. This council, obviously, as does the Commissioner and his police officers — we work for the residents and the taxpayers of the city, so they need a seat at the table.
C: Thank you, Madam Chair, and thank you — I know you're gonna run a tight ship, given all that's happening today, including the funeral, of course, for George Floyd. So thank you very much. Thank you also, of course, to the Commissioner for being here, given all that's going on. I know you've been on the record in the past with respect to this issue — that not only does the department not have it, but you understand the flaws of it. So thank you for being here.
C: Thank you to my council colleagues, but also, importantly — most importantly — the makers, for putting this forward. I have been supportive of this since the very beginning, when I hosted, along with Councilor McCarthy, our hearing on this very issue, along with some other surveillance technologies and things that were possibly coming down the pike. So this was an opportunity for us to have this conversation on what we could do proactively to make sure that surveillance tools were not put into place without a robust community process and, of course, council input.
L: Thank you, Madam Chair. Just very briefly, I want to thank the makers, Councilors Wu and Arroyo, for their leadership here. Obviously, I support the facial-recognition ban. I'm also grateful, and want to echo my thanks to the Commissioner for his comments and his opposition — in effect, this isn't used, so it's important that we codify that in such a way, which again is the purpose of this hearing and this ordinance. I stand ready, willing, and able to get to work. Thank you, and I look forward to hearing from a number of residents, our panelists, as well as public testimony. Thanks very much.
M: I think that our colleagues have brought to the forefront over the last few years, as have the advocates, the concerns regarding this kind of technology. So, you know, I appreciate the real concerns around accuracy and biases against people of color, women, children, the elderly — certainly reflective of our city, our city's residents. So we need to make sure that we're proactive in protecting our residents and having a deeper understanding and appreciation of this technology, and understanding what this ordinance would mean around that ban and following through with it.
A: (reading a letter from Councilor Janey) ...technology in Boston. I will continue to work on the technical challenges on my end and will join the hearing at a later time if I am able; if I'm unable to join, I will view the recording at a later time. I want to thank the makers, Councilors Wu and Arroyo, for their advocacy and continued leadership on this critical issue. I especially want to thank all the advocates who are participating for their tireless advocacy on this issue. I look forward to working with advocates to incorporate their ideas on how we can push this issue moving forward.
A: (continuing the letter) It is clear we need policies in place to protect communities that are vulnerable to technologies that are biased against them. I look forward to working with my colleagues on this critical issue and will continue to fight for an agenda that promotes and protects Black lives. Please read this letter into the record. Thank you. Sincerely, Kim Janey. — I'm not sure if we've been joined by Councilor Flynn; I do know he was trying to get in. In the meantime, I did want to just do some quick housekeeping as part of my introductory remarks.
A: Just to let people know again: today's hearing is specifically about this proposed ordinance that would ban facial recognition technology. We had a robust conversation about defunding the police, discussing all aspects of the budget — all of those different things just happened, and it was chaired by Councilor Bok. And I want to make sure, even though the conversations are related, that we do not take this precious moment on this legislation and have it covered by the other conversation. So I just want you to know that, and again, I will move the conversation towards this ordinance.
A: There should be a copy of it; people should have the ability to discuss this language and this ordinance. Questions I have — I do want to talk about whether time is of the essence; there seemed to have been some issue and concern about whether we were entering a contract that allowed for this.
A: So when the Commissioner speaks, I'd like for him to speak on that — I'm not sure if there's a contract pending, what's going on. I also want to make sure that folks understand: due to the fact that I think we have possibly a hundred people signed up to speak, and we have a lot of folks who are going to be speaking on the panel, I'm going to limit my colleagues to three minutes in their first round, and I'm going to ask the public to keep to no more than two, and I'm gonna—
N: Thank you. Thank you, Councilor Edwards, and to the sponsors. I'm looking forward to this hearing, learning more about the proposal, and also interested in hearing from the Police Commissioner, Commissioner Gross, and the public as well. Again, this is an issue that we all can learn from. We can listen to each other, we can be respectful and hear each other's point of view, and that's what this city is all about: coming together and treating each other fairly.
N: You know, with respect, especially during difficult days and on difficult issues. I also know that our Police Commissioner is someone that listens to people. He's in the neighborhoods consistently; he talks and listens to a lot of the concerns of the residents. That's what I like most about the Police Commissioner: his ability to listen and treat people fairly. Again, I'm here to listen and learn about the proposal. Thank you, Councilor Edwards — we've got strong leadership.
A: We're under a person limit on the Zoom — we have 118 people trying to get in — so once you've testified, could you then sign out? That allows other people to get in. Commissioner Gross, you've been waiting, so I'm going to turn the microphone over to you. I'm gonna ask you, you know, to get right to the ordinance and give your thoughts on that, discuss any procedure; then we're gonna try and get to the panelists and move back to the City Council while you're still here. Okay? Yes.
O: You know, good afternoon, everyone. You know, God bless Mr. George Floyd — may he rest in peace. And for the record, I'm proud of Boston, as they paid great homage in peaceful protest, and anything we can do to move our department forward working with our community, you have it on record right here and now: I'm willing to work with you councilors, the mayor, and our great city. So thank you. So I will start. Thank you, Chair, and the committee members, and the makers, Councilors Wu and Arroyo.
O
Oh
I,
really
thank
you
for
inviting
me
to
participate
in
today's
hearing
regarding
the
ordinance
banning
facial
recognition
technology
in
Boston.
As
you
can
imagine,
the
Boston
Police
Department
has
been
extremely
busy
meeting
the
current
demands
of
Public
Safety
from
the
coab
at
nineteen
pandemic
and
the
public
rallies.
However,
the
department
and
I
understand
the
importance
of
the
given
topic.
O: I would like to request that this hearing remain on the given topic, and that we work together to find other times to discuss the important issues regarding police relations, national issues on police reform, and the ways that we can improve our relationships here in Boston. As has been practiced in the past, we welcome a future opportunity to discuss the ordinance language and City Council concerns in a working session regarding technology policy and potential privacy concerns. My testimony today is meant to serve as a background to BPD's current practices and to identify potential technological needs.
O: The department, for the record, does not currently have the technology for facial recognition. As technology advances, however, many vendors have and will continue to incorporate automated recognition abilities. We have prohibited these features as we have moved along in our advancements, with the intention to have rich dialogue with the community prior to any acquisition of such technology.
O: Video has proven to be one of the most effective tools for collecting evidence of criminal offenses, for solving crimes, and for locating missing and exploited individuals. Any prohibitions on these investigative tools, without a full understanding of potential uses under strict protocols, could be harmful and impede our ability to protect the public.
O: The proposed ordinance defines "facial surveillance" to mean an automated or semi-automated process that assists in identifying or verifying an individual, or in capturing information about an individual, based on the physical characteristics of an individual's face. And to be clear: the department has no desire to employ a facial surveillance system to generally surveil the citizens of Boston. The department, however, notes a distinction between facial surveillance systems and facial recognition technology.
O: The department would, as technology advances and becomes more reliable, like the opportunity to discuss the utilization of facial recognition technology to respond to specific crimes and emergency situations. And I would like to revisit everyone's thoughts and remembrances of the Boston Marathon and the most recent kidnappings: facial recognition technology, to the best of our knowledge, would have greatly reduced the hours necessary to review video evidence. This would also allow investigators to move more quickly in identifying victims, missing persons, or suspects of crime through citywide camera recordings.
O: Facial recognition technology, used with well-established guidelines and under strict review, can also provide a safer environment for all in the city of Boston. Creating a system in which evidence is reviewed in a timely manner, and which efficiently allows for police to intervene and reduce levels of crime, would be beneficial.
O: It is important to note the department shares our community's concerns around privacy and intrusive surveillance. As is the current practice with existing technology, our intentions are to implement any potential future advances in facial recognition technology through well-defined protocols and procedures. Some areas where we consider this technology may be beneficial are in identifying the routes and locations of missing persons, including those suffering from Alzheimer's and dementia, as well as other missing individuals.
O: It is important to note we would like to work with the council and our partners to clearly articulate the language that could best describe these uses in Section B(2) of the proposed ordinance. The language indicates there is some intention to account for the use of this technology to advance investigations, but the department believes some additional language referencing its permitted uses is necessary.
O
That's
my
official
statement
that
I
read
in
and
I'm
telling
you
right
now
as
an
african-american
male
I'm.
The
technology
that
is
in
place
today
does
not
meet
the
standards
of
the
Boston
Police
Department.
Nor
does
it
meet
my
standards
and
until
technology
advances
to
a
point
where
it
is
more
reliable
again,
we
will
need
your
input.
Your
guidance
will
work
together
to
pick
that
technology,
which
is
more
conducive
to
our
privacy
and
our
rights
as
citizens
of
Boston.
Thank
you.
A: Thank you very much, Commissioner Gross. Just before I go on — we're gonna go now to Joy Buolamwini, and I'm so sorry for that, Joy; I apologize profusely for the mispronunciation of your last name — then Mikel McMillan, Karina Hamm, Kade Crockford, Erik Berg, and Joshua Barocas. But I wanted to ask the Commissioner just very quickly: you said you have suggested language, or that you would want to work on suggested language, for B(2)?
O: I would want to work on it. So keep in mind, anything we do going forward — as I promised for the last four years, we want your input and your guidance. Councilor Mejia, you hit it on point: this technology is not fair to everyone, especially African-Americans and Latinos. It's not there yet. So moving forward, we have to make sure everybody is comfortable with this type of technology.
A: Thank you. So now I'm gonna turn it over to some of the folks I named and called forward to speak. We're gonna try and get through as many of them as possible. Commissioner, these are the folks who helped to push and move this ordinance; we really hope you can stay as long as you can to hear them.
P: Good afternoon. Thank you so much, Madam Chair Edwards, members of the Committee on Government Operations, and members of the Boston City Council, for the opportunity to testify today. I am the founder of the Algorithmic Justice League and an algorithmic bias researcher. I've conducted MIT studies showing some of the largest recorded gender and racial biases in AI systems sold by companies including IBM, Microsoft, and Amazon. As you've heard, the deployment of facial recognition and related technologies has major civil rights implications.
P: These tools also have technical limitations that further amplify harms for Black people, indigenous people, other communities of color, women, the elderly, those with dementia and Parkinson's, youth, and trans and gender non-conforming individuals. In one test I ran, Amazon's AI even failed on the face of Oprah Winfrey, labeling her male. Personally, I've had to resort to wearing a white mask to have my dark skin detected by some of this technology. But given mass surveillance applications, not having my face detected can be a benefit.
P: When the tech works, we can't forget about the cost of surveillance; but in other contexts it can fail, and the failures can be harmful. Misidentifications can lead to false arrests and accusations. In April 2019, a Brown University senior was misidentified as a terrorist suspect in the Sri Lanka Easter bombings. The police eventually corrected the mistake, but she still received death threats. Mistaken identity is more than an inconvenience, and she's not alone. In the UK, the faces of over 2,400 innocent people were stored by the police department.
P
Without
their
consent,
the
department
reported
a
false
positive
identification
rate
of
over
90
percent
in
the
US.
There
are
no
reporting
requirement,
generally,
generally
speaking,
so
we're
operating
in
the
dark.
Further.
These
tools
do
not
have
to
identify
unique
bases
to
be
harmful.
An
investigation
reported
that
IBM
equipped
the
NYPD
with
tools
to
search
for
people
in
video
by
facial,
hair
and
skin
tone.
In
short,
these
tools
can
be
used
to
automate
racial
profiling.
The
company
recently
came
out
to
denounce
the
use
of
these
tools
for
mass
surveillance
and
profiling.
P
Ibm's
moved
to
stop
selling
facial
recognition.
Technology
underscores
its
dangers
due
to
the
consequences
of
failure.
I
focused
my
MIT
research
on
the
performance
of
facial
analysis
systems,
I
found
that
for
the
task
of
gender
classification,
IBM
Microsoft
and
Amazon
had
air
rates
of
no
more
than
1%
for
lighter
skinned
men.
In
the
worst
case,
these
rates
were
to
over
30%
for
darker
skinned
women.
Subsequent
government
studies
complement
this
work.
They
show
that
continued
air
disparities
in
facial
analysis
and
other
tasks,
including
face
recognition.
P: The latest government study of 189 algorithms revealed consequential racial, gender, and age bias in many of the algorithms tested. Still, even if error rates improved, the capacity for abuse, lack of oversight, and deployment limitations pose too great a risk. Given known harms, the city of Boston should ban government use of face surveillance. I look forward to answering your questions. Thank you for the opportunity to testify.
A: Thank you so much — and to correct my horrible mispronunciation of your name, would you mind saying your full name for the record? "Sure, my name is Joy Buolamwini." Thank you very much, Joy. I'm gonna now turn it over to Mikel McMillan, a youth advocate. I'm also gonna set a clock: three minutes.
Q: Mikel McMillan — I'm the kid [inaudible]. Yeah, so I'm very, very pleased [inaudible]. I want to talk about three years ago, roughly January, around 2017: there were residents in my neighborhood who came up to me, saying they kept seeing a drone flying in the area. They asked numerous times, and nobody could figure out who was flying it.
Q
Upon
receiving
this
information,
I
live
across
the
street
from
a
solar
company
and
they
don't
and
they
asked
an
office,
was
playing
with
a
drone
and
I
end
up
taking
the
photos
of
their
Joe
and
and
it
kind
of
jogs.
My
memory
back
that
this
was
the
drone
that
was
the
residence
in
my
neighborhood.
Has
he
not
talked
about,
and
no
one
knew
who
was
find
it
so,
with
doing
so,
much
research
are
reaching
out
to
a
friend
of
rays.
Q
Do
you
and
we
try
to
file
a
public
records
request
which
we
were
denied
our
first
time
and
admits?
We
got
it
to
work.
Boston
Police
has
I
spent
the
amount
of
seventeen
thousand
five
collisions
and
nobody,
nobody
from
City
Council
that
the
public
kind
of
missed
and
kept
secret,
and
there
was
no
committee
oversight.
A
few
drones
were
flying
illegally
inside
of
a
neighborhood,
just
the
creature
of
privacy,
the
underneath
there
not
knowing
what
information
was
being
politically.
What
was
what
was
being
stored?
I
was
being
stored.
Q
Q
Q: There has to be somebody — regardless if it's myself or another individual — we have to hold people accountable. And accountability really has no color: whether you've got a white collar or a blue uniform, you have to be held accountable. [It was troubling] to see that no one was held accountable for this very thing.
A
You
see
much
perfect
testimony
and
I
really
appreciate
you
speaking
from
the
heart
from
your
own
experience
and
also
demonstrating
how
you've
actually
a
conversation
for
a
child
bill
through
your
leadership.
So
I
want
to
thank
up
next.
We
have
Karina
him
from
the
student
immigrant
movement,
three
minutes
Karina.
R: Hi everyone. Thank you, everyone, for joining us today at this hearing. My name is Karina Hamm and I'm the field organizer for the Student Immigrant Movement. I would like to also thank our partners who've been working on the facial recognition ban: City Councilors Michelle Wu, Ricardo Arroyo, Andrea Campbell, Kim Janey, and Annissa Essaibi-George, along with the ACLU of Massachusetts, the Muslim Justice League, and Unafraid Educators. SIM is a statewide grassroots organization...
R
By
and
for
the
undocumented
youth
we
work
closely
with
our
members
or
our
young
people
to
create
trusting
and
empowering
relationships
at
them.
We
fight
for
the
permanent
protection
of
our
undocumented
youth
and
their
families
through
collective
action
and
one
of
the
major
role
of
the
work
and
the
organizing
that
we
do
is
to
challenge
these
systems
that
continue
to
exploit
and
disenfranchise
the
undocumented
in
immigrant
communities.
R
Their
face
surveillance
system
has
been
adopted
in
their
school
district
and
they've
already
spent
over
a
million
dollars
in
the
system,
and
so
you
ask
yourself,
but
what
is
exactly
the
intent
to
have
these
surveillance
that
you
still
have
the
right
to
their
privacy
and
have
the
right
to
live
without
a
constant
infringement
and
interference
of
law
enforcement?
If
another
city
allows
facial
surveillance
and
this
in
their
schools
to
happen?
R
We
know
that
black
and
brown
youth
to
experience
depression
everyday
and
they
already
are
deemed
suspicious
by
law
enforcement
based
on
their
skin
color
and
the
clothes
they
wear.
The
people
they
interact
with
and
more
and
so
adding
facial
recognition,
that's
not
even
accurate
to
surveil
our
youth
and
families
will
just
be
used
to
justify
their
arrests
or
NT
or
deportation
and
through
the
policing
system,
we
are
already
asking
for
police
officers
to
engage
with
the
youth
in
ways
that
are
just
harmful
to
the
youth
development.
R: The City of Boston needs to stop criminalizing young folks. Whether they engage in criminal activity or not, arresting them will not get to the root of the problem. There are so many options that are much safer and more secure for our youth, for the families, and their futures. Banning facial recognition would mean law enforcement has one less tool to criminalize our youth. Thank you so much, everyone.
T: ...as our health care providers lack the PPE they need to protect themselves and us. But our government cannot continue to act as if it is at war with its residents, waging counterinsurgency surveillance and control operations, particularly in Black and brown neighborhoods and against Black and brown people. We must act — we must not merely speak — to change our laws, to ensure we are building a free, just, and equitable future for all people in Boston. So, that said, banning face surveillance in Boston is a no-brainer.
T: It is one small but vital piece of this larger, necessary transformation. Face surveillance is dangerous when it works and when it doesn't. Banning this technology in Boston now is not an academic issue. Indeed, the city already uses technology that, with one mere software upgrade, could blanket Boston in the kind of dystopian surveillance currently practiced by authoritarian regimes in China and Russia — and, as Joy said, even some cities right here in the United States. Boston has to strike a different path by protecting privacy, racial justice, and First Amendment rights.
T
We
must
ban
this
technology
now
before
it
creeps
into
our
government,
in
the
shadows,
with
no
democratic
control
or
oversight,
as
it
has
in
countless
other
cities
across
the
country
and
across
the
world.
So
today
you
already
heard
from
pretty
much
the
world
renowned
expert
on
racial
and
gender
bias
and
facial
recognition.
Thank
you
so
much
joy
for
being
with
us
today.
Your
work
blazed
the
trail
and
showed
the
world
that
yes,
algorithms,
can
in
fact
be
racist.
T
You're
gonna
hear
a
similar
set
of
sentiments
from
Erik
Berg
of
the
Boston
Teachers
Union
and
shortly
medical
doctor
and
infectious
disease
expert
Joshua
Baracus,
testify
to
how
this
technology
in
Boston
would
inevitably
undermine
trust
among
the
most
vulnerable
people
seeking
medical
care
and
help.
Obviously,
that's
the
last
thing
that
we
need
to
do,
especially
during
a
pandemic
for
too
long
in
Boston
city
agencies
have
acquired
and
deployed
invasive
surveillance
technologies,
with
no
public
debate,
transparency,
accountability
or
oversight.
T
These technologies, which, as Michael said, include drones but also license plate readers, social media surveillance, video analytics software, and military-style cell phone spying devices, among many others, were in most cases purchased and deployed without the City Council's knowledge, let alone approval. We have seen this routine play out over and over and over for years. This is a problem with all surveillance technologies, but it is especially unacceptable where the technology is as dangerous and dystopian as face surveillance.
T
It is not an exaggeration to say that face surveillance would, if we allow it, destroy privacy and anonymity in public space. Face surveillance technology, paired with the thousands of networked surveillance cameras already installed throughout the city, would enable any official with access to the system to automatically catalog the movements, habits, and associations of all people at all times, merely with the push of a button.
T
The police could photograph demonstrations, run those images through a computer program, and automatically populate a list of each person who attended to express their First Amendment rights to demand racial justice. The police could also ask a computer program to automatically populate a list of every person who walked down a specific street any given morning and share that information with ICE. It could also be used to automatically alert law enforcement whenever a specific person passes by a specific surveillance camera anywhere in the city, and again, there are thousands of these cameras, with more going up each week. Yes, so.
U
Thank you, and I'll get right to the point in the interest of time. Thank you to the council for taking this up and for allowing this time today. I'm here to speak today on behalf of the Boston Teachers Union and our over 10,000 members in support of the ordinance banning facial recognition technology in Boston that was presented by Councilors Wu and Arroyo. We strongly oppose the use of this technology in our public schools. Boston's public schools should be safe environments for students to learn, explore their identities and intellects, and play.
U
Face surveillance technology threatens that environment. The technology also threatens the rights of our BTU members, who must be able to go to work without fearing that their every movement, habit, and association will be tracked and catalogued. To our knowledge, this technology is not in use in our schools, but we've already witnessed some experimenting with it.
U
Face surveillance in schools transforms all students and their family members, as well as employees, into perpetual suspects whose each and every movement can be automatically monitored. The use of this technology in public schools will negatively impact students' ability to explore new ideas, express their creativity, and engage in student dissent, an especially disturbing prospect given the current youth-led protests against police violence. Even worse, as we've heard, the technology is frequently biased and inaccurate, which raises concerns about its use to police students of color. Academic, peer-reviewed studies show face surveillance algorithms are too often racially biased, particularly against black women, with inaccuracy rates up to 35% for that demographic. We know black and brown students are more likely to be punished for perceived misbehavior; face surveillance will only perpetuate and reproduce this situation when used to monitor children. This technology also fails in an essential sense because it has difficulty accurately identifying young people as their faces change. Research that tested five top-performing commercial off-the-shelf face recognition systems shows there's a negative bias when they're used on children: they perform worse on children than on adults.
U
On top of this, face surveillance technology regularly misgenders transgender people and will have a harmful impact on transgender young people in our schools. Research shows that automatic gender recognition consistently views gender in a trans-exclusive way and consequently carries disproportionate risk for trans people subject to it. At a time when transgender children are being stripped of their rights at a national level, Boston must protect transgender kids in our schools. Moreover, face surveillance in schools will contribute to the school-to-prison pipeline, threatening children's welfare, educational opportunities, and life trajectories.
U
I'll get right to the end. Finally, face surveillance technology will harm immigrant families. In this political climate, immigrants are already fearful of engagement with public institutions, and face surveillance systems would further chill student and parent participation among immigrant communities in our schools. Boston schools must be welcoming and safe spaces for all families. The City of Boston must take action now to ensure children and BTU workers are not subject to this unfair, biased, and chilling scrutiny. In order to protect young people in our educational community, we must stop face surveillance in schools before it begins.
V
I'll be brief; I will read quickly. First of all, thanks for allowing me to testify today regarding this important issue. I should say that the views that I'm expressing today are my own and don't necessarily represent those of my employer. That said, I'm an infectious disease physician and an addictions researcher here in Boston. As such, I spend a large majority of my time working on solutions to improve the health and well-being of vulnerable populations, including people who are experiencing homelessness and people with substance use disorders.
V
One thing that we know is that stigma and bias are pervasive throughout our community and lead to disparities in care for these populations. These disparities were laid bare and exacerbated by the ongoing COVID-19 pandemic. While we're all searching for answers on how to best protect ourselves from this virus, I truly fear that the use of facial recognition technology will only serve to exacerbate existing disparities, including racial, gender, and socioeconomic ones.
V
Additionally, Boston provides expansive services for substance use disorders, including methadone treatment, and it's imperative that we ban facial recognition software to ensure that people seeking substance use disorder treatment and mental health care, among other stigmatized health services, may do so privately and without fear. If we allow this on many of our public cameras, it would enable the government to automatically compile lists of people seeking treatment for substance use, mental health, and, as I said, other stigmatized health conditions. This would have a chilling effect and disproportionately affect our most vulnerable populations.
G
Thank you, Madam Chair, and thank you so much, Commissioner, for making the time to be here. We all know how many demands are on your time, and now we're asking for even more than you had allotted, so we're very, very grateful; it means a lot that you took the time and are the one at this hearing. I also want to note that you were the one at the hearing in June 2018 as well, so we know that you've been part of this conversation and supportive of this.
G
I'm gonna try to get you out of here as quick as possible. So, secondly: you drew a distinction between face surveillance systems and facial recognition technology. Just to clarify, BPD doesn't use or own it right now, because you haven't done that upgrade, but does BPD use or work with other state or federal entities that have face surveillance or facial recognition technology? Do you ever use the results of that technology from any such agency?
G
So when State Police, or, you know, we'll find out more, potentially BPD works with the RMV to run matches of photos against their database. Do you need any sort of warrant, or rather approval, to do that beforehand?
O
If you are a victim of a crime and we don't have that person under arrest, but we do have a potential suspect, you are allowed the opportunity to try to identify the suspect of the crime. And so, to be fair, to ensure that no innocent person is selected, you put the suspect's picture along with several other pictures, up to seven or more, and then you see if the victim of the crime can pick the suspect out of that photo array.
G
And so, given the absence of guidelines right now at the federal, state, or city level, and, you know, the need to upgrade and the use by State Police, for example: we're hearing you loud and clear that you'd like to have further conversation, but just wanting your opinion now, are you supportive of a city-level ban until there are policies in place, potentially at other levels or specifically at the city level, to rein in surveillance?
Overall, yes.
O
I am; I've been clear for four years: we need your input, your guidance. And I thank everyone who testified today; my team is taking notes. I didn't forget that I'm African-American and I could be misidentified as well, and I believe that we have one of the most diverse populations in the history of our city, and we do have to be fair to everyone that is a citizen of Boston. We don't want anyone misidentified, and again, we will work with everyone here, because we'll be educated. Yes.
E
So, actually, is that right? All right. So, one, I just want to thank you, Commissioner, for supporting the facial recognition surveillance ban and for taking the time to be here for the questions; Councilor Wu asked many of them. I have one specific one, which is: has the BPD used facial recognition in the past in any capacity?
O
No.
A
Before you go, I'm gonna do this one call-out to the panelists that just spoke, or to any city councilors: one question that you may have for the Commissioner that you need to ask. Do any of you have that, either Joy, Michael, Karina, Kade, Erik, Joshua, or the other city councilors? I'm trying to be mindful of his time, but I also understand you guys also gave time to be here today as well, if you wanted to ask him, representing BPD, a question. I see Julia.
K
Yes, I just wanted to quickly thank Commissioner Gross for your time and just your steadfast leadership, but I do have just a quick question before you go, and I just need some clarity. You mentioned that you hope to keep this hearing on the topic of the ordinance and to find other ways to address issues of police relations. A lot of people draw a direct link between facial recognition and relationships with the community. Do you see that link, and can you talk about how the police department sees that link? I'm just really curious about that.
O
You have to listen to the people that you serve, because, you know, everyone throws around the moniker of community policing; I believe that's become jaded, and so I've created a Bureau of Community Engagement so that we can have these discussions. And again, I don't forget my history. I haven't forgotten my history. I came on in 1983; I've gone through a lot of racism.
O
I was in a tough neighborhood growing up, and I didn't forget any of that. And, as some of the city councilors alluded to earlier, I'm always in the street talking to people; they had a lot of criticism of law enforcement in general. But I believe: if you want change, be the change. You can only change with the people.
O
So I am looking forward to further conversation and to listening to testimony on how we can improve our community relations, especially not only during this time of civil unrest, but we're in the middle of a pandemic as well, and so police departments should not be separated from the community. You have to have empathy, sympathy, care, respect. A lot of people have lost jobs, and we must keep in mind socioeconomics, fairness in employment, a lot of that too.
O
All of those things directly affect the neighborhoods of color, and I'm glad that we are increasing our diversity in the communities. We currently have organizations in Boston that directly speak to the people: the Latino law enforcement organization, the Cabo Verdean police association, and the Benevolent Asian Jade Society. Our department is increasing in diversity, and the benefit of having representation from every neighborhood we serve is that it will improve relationships.
O
So again, that's why I'm looking forward to a working session, and my team is taking notes, because everybody's testimony is important. Trust me: without people like you that worked hard for everyone's rights, I wouldn't be here as the first African-American police commissioner. I wouldn't be here if we didn't have folks that exercised their First Amendment rights.
A
Commissioner Gross, I have a clarifying question, and then I just wanted to note this, since it impacts the timing. So, for folks watching, Commissioner Gross has alluded to a working session. To give some clarification: that is when we actually get down to the language of the ordinance and work down to the commas, as I say, and how to do that. And so what Commissioner Gross is asking is that we have that conversation within the next 60 days, which I'm fine with doing.
A
I'll just check with the lead sponsors. I did want to make sure I understood what, in terms of timing, is going on with the contract: is BPD entering into a contract that has this technology? There is a timing issue that I thought was warranting this hearing happening very fast. Maybe the lead sponsors could also answer this question; I was confused, so could someone explain what that was? There's a sense of urgency that I felt I was meeting, and I'm happy to meet it.
T
Councilor, I'd be happy to address that. We obtained public records from the city a while back showing that the Boston Police Department has a contract with a company called BriefCam that expired on May 14th, and the contract was for version 4.3 of BriefCam's software, which did not include facial surveillance algorithms. The current version of BriefCam's technology does include facial surveillance algorithms, and so that is one of the reasons that the councilors and the advocacy groups behind this measure were urging you, Chair, to schedule this hearing.
O
And thank you; that's exactly what I was going to comment on. The older version of BriefCam did not have facial recognition; I believe that the newer version does. BriefCam, in the old version, works on condensing objects, cars, but I would definitely check on that contract. I do not want any technology that has facial recognition, and that's exactly what we were going to check on. My answer right now is no, at this point.
O
No. And if we did go into a contract, I would have the ability to show you that the portion that does have facial recognition can be censored, or, excuse me, that's not the right word, can be excluded. And so I'm not comfortable with that contract until I know more about it. I don't want any part of facial recognition. Thank you.
O
As I read into the testimony, right, all the technology that's going forward now in many fields, it's like, hey, we have facial recognition, and I'm not comfortable with that. So BriefCam, yeah, the old version, we hardly ever used it, and there is discussion on the new version, and I'm not comfortable with the facial recognition component.
O
Thank you very much, and thank you for your testimony, everyone. Without advocates such as everyone on this Zoom call, I wouldn't be here in this capacity. So thank you.
A
Thank you very much. And so we still have a robust conversation to have amongst ourselves about the language, about the goal, whether it goes far enough, and whether anyone opposes this ban. And I do want people to understand: if there's someone who disagrees with a lot of us, this will be a respectful conversation, and that person will be welcome to have their opinion and express it, as every single one of us has been able to do.
A
I'm gonna continue now. There's a list of folks who have signed up to speak, and I'm going to continue down that list and keep them to no more than three minutes, and then we're gonna open up. Oh, I'm so sorry, I apologize. Before I go down the list, I know my colleagues also may have questions or concerns or may want to voice certain things about the legislation. I apologize; I'm just so amped to get to the public testimony. So I'm gonna go ahead and go in order of arrival.
E
The only question I would have, which is more, I think, for Kade from the ACLU, is whether or not she has any idea where we are on that contract. From what I heard from Commissioner Gross, it sounded like he couldn't confirm the timeline for that contract, whether or not it's been signed, and what's going on with that contract. So do any panelists here have any information on the status of that contract? Thanks.
T
Thank you for the question, Councilor Arroyo. Regrettably, no. We filed the public records request with the Boston Police Department on May 14, so that was about a month ago, asking just for one simple document: the existing contract that the Boston Police Department has with BriefCam, and they haven't sent it to us yet. And now, you know, we've prodded them multiple times about it, including on Friday, saying, you know, it would be a real shame if we had to go to this hearing on Tuesday without this information, and we still haven't received it.
T
I can speak to that. Joy Buolamwini, if you're still on, you may want to speak to some of this too. So the ACLU nationwide has been filing FOIA requests, Freedom of Information Act requests, with various federal agencies to learn about how the federal government is using this technology and how they're sharing information derived from the technology with state and local law enforcement.
T
So one concern that we have is that even if the City of Boston banned face surveillance technology for city employees, it would certainly be the case that the FBI and other federal agencies would be able to continue to use this technology, and likely to share information that comes from it with the Boston Police Department and other city agencies. Unfortunately, there's nothing we can do about that at the city level, which is why the ACLU is also working with partners in Congress to try to address this at the federal level as well.
H
I apologize. Thank you, thanks, Councilor. My question was actually just for the panelists, whoever wants to jump in. You know, the police commissioner was making a distinction between face surveillance and facial recognition systems and what we might be banning or not, and I'm just wondering, I don't know the literature in this world well enough to know if that's a strong existing distinction, whether you think this piece of legislation in front of us bans one or the other, and what do you think: should we be making that distinction?
P
So when we hear the term facial recognition, oftentimes what it means is not so clear, and so, with the way the current bill is written, it actually covers a wide range of different kinds of facial recognition technologies, and I say technologies, plural, to emphasize we're talking about different things. So, for example, you have face recognition, which is about identifying a unique individual; when we're talking about face surveillance, that is the issue. But you can also have facial analysis.
P
That's guessing somebody's gender or somebody's age, or other systems that might try to infer your sexuality or your religion. So you can still discriminate even if it's not face recognition in the technical sense, which is why it's important to have a really broad definition. You also have face detection; if you think about weapon systems where you're just detecting the presence of a face without naming a specific individual, that can still be problematic. You also have companies like Faception.
P
They say: we can infer criminality just by looking at your face, your potential to be a pedophile, a murderer, a terrorist. And these are the kinds of systems that companies are attempting to sell to law enforcement. So it's crucial that any legislation that is being written has a sufficiently broad definition covering a wide range of facial recognition technologies, right, with the plural, so that you don't get a loophole which says, oh, we're doing face identification or face verification, which falls under recognition, so everything we're doing over here doesn't matter. But it does.
K
I just want to give thanks to the panelists for educating us beforehand and always sharing all this amazing information and data and research. I am just curious and, at the same time, a little bit worried about how we can prevent this from ever passing, ever, because it seems to me that the way it's been positioned is that it's banned right now because it's not, you know, accurate. I'm just curious what has happened, whether you've seen it in other cities or, you know, even around the world.
K
Are there any other incidences where this has kind of slipped under the radar and has passed? I don't know if I'm making any sense here, but basically, the bottom line I'm trying to understand is that here we are standing firm and banning facial recognition, and from what I understand at this point, our commissioner is in agreement because of the issue of accuracy, right? How can we get ahead of this situation down the road?
T
I understand what you're trying to say, I do, Councilor Mejia, and thank you for that. I can address that quickly. The council can't bind future city councils, right? So I agree with you: I think that this technology is dangerous whether it works or it doesn't, and I think that's why the City of Boston ought to take the step to ban its use in government. I can promise you that, as advocates, we will show up to ensure that these protections persist in the city of Boston as long as I'm alive.
T
We did not choose that route here, in fact, because of our concerns, which are, I think, identical to yours. It's our view that we should never want to live in a society where the government can track us through these surveillance cameras by our face wherever we go, or automatically get alerts just because I happen to walk past a camera in a certain neighborhood. That's a kind of surveillance that should never exist in a free society. So we agree that it should be permanently banned.
K
Thank you for that. And then the other question that I have is: I know that the City Council's jurisdiction, as with all things, deals with the city, but I'm just wondering what, if any, examples you have heard of others, like maybe uses in the private sector, that are utilizing or working to utilize this. Is there any traction or anything that we need to be mindful of happening outside of city government, if you will?
P
So I'm so glad you're asking this question, because facial recognition technologies, broadly speaking, are not just in the government realm. Right now, especially with the COVID pandemic, you're starting to see more people make proposals for using facial recognition technologies in different ways. Some that I've seen start to surface are: can we use facial recognition for contactless payments, right, so your face becomes what you pay with. You also have the case of using facial analysis in employment.
P
So there's a company, HireVue, that says: we will analyze your facial movements and use that to inform hiring decisions, and, guess what, we train on the current top performers. So the biases that are already there can be propagated, and you actually might purchase that technology thinking you're trying to remove bias. And so there are many ways in which these technologies might be presented with good intent, but when you look at the ways in which they actually manifest, there are harms, and oftentimes harms that disproportionately fall on marginalized groups.
P
Even in the healthcare system: I was reading research earlier that was talking about ableism when it comes to using some kinds of facial analysis systems within the healthcare context, where it's not working as well for older adults with dementia. So the promises of these technologies really have to be backed up, with the claims checked, and I think the council can actually go further by saying we're not only just talking about government use of facial recognition technologies, because these technologies impact someone's life in a material way, right.
L
Thank you. Oh yeah, I will be brief, because I think it's important to go to the public testimony, but it seems to me that this sounds like a very productive Government Operations hearing. It sounds to me that obviously there's a lot of support across the board, and it looks as though the Commissioner is asking for one more working session, which seems to make sense, and his comments were very positive, so I think this is a good thing. So thank you to the advocates.
M
Thank you to the advocates and to the Commissioner for being here today, and thank you to the advocates who have met with me over a period of a few months; I really learned a lot in our times together, especially from the younger folks. I am interested in some information, but the Commissioner is no longer with us, so perhaps I'll just forward this question to him myself.
M
But my question is around the FBI's evidence standards around facial recognition technology; I wonder if they've released that. And then, through the chair to the makers: we had some questions around Section B and Section 2, and perhaps we can discuss this in the working session, or I'll send this ahead of time.
T
I can answer that quickly, and then Joy may also have an answer. One of the issues here is that police departments across the country have been using facial recognition to identify people in criminal investigations for decades now, actually, and then not disclosing that information to criminal defendants, which is a due process violation, and it's a serious, serious problem in terms of the integrity of our criminal justice process in the courts and defendants' rights to a fair trial.
T
So, the FBI standards that you're referencing: one of the reasons, at least as far as we understand it, and, you know, I don't work for the police department, but one of the reasons, as far as we understand it, that police departments have not been disclosing information about these searches to criminal defendants is that there is no nationally agreed-upon forensic standard for the use of facial recognition technology in criminal investigations. And so my understanding is that police departments and prosecutors fear that if they were to disclose to courts and criminal defendants that this technology was used to identify people in specific investigations, those cases would basically get thrown out of court, because the people who performed the analysis would not really be able to testify to their training under any agreed-upon national standard for evaluating facial recognition results or using the technology more generally. I can't answer your other question about that exemption.
T
Obviously, the BPD is not going to be in a position to ask every single police department in the country, "Where did you get the name attached to this picture?", and so that's what that exemption is meant to address. There have been some concerns from some of our allied organizations that we need to tighten that up with a little more language to make clear that it does not allow the BPD to use the RMV's system for facial recognition, and we will be suggesting some language to the committee to that effect.
P
And to just echo a little bit of what Kade is sharing here: I do want to point out that in April 2019, with the example of the student being misidentified as a terrorist suspect, you had that photo shared, and even though it was incorrect, you have the ramifications of what happens because of the presumption of guilt. And so this is why I especially believe we should also be talking about systems that try to infer criminality using AI systems in any kind of way.
T
I'm not aware of any non-criminal uses in government, at least for face surveillance, except in, I would say, the so-called intelligence realm, right. So that would mean not a criminal investigation, but rather an intelligence investigation, for example, of an activist, or, you know, an activist group, or a protest, or something like that. Okay.
P
So, speaking to private-sector uses of facial recognition: oftentimes you see it being used for security, or for securing access to a particular place, so only particular employees can come in. Something that's more alarming that we're seeing with commercial use is the use in housing. We have a case even in Brooklyn where you had tenants saying to the landlord: we don't want to enter our homes with our face.
P
Don't install facial recognition technology. They actually won that case, but not every group is going to be so successful. So you can be in a situation where your face becomes the key, but when your face is the key and it gets stolen or hacked, you can't just replace it so easily; you might need some plastic surgery. So we see facial recognition technologies being used for access in certain ways, and again, the dangerous use of trying to predict something about somebody: are you going to be a good employee?
P
You have companies like Amazon saying we can detect fear from your face, and so thinking about how that might be used as well. The other thing we have to consider when we're talking about commercial uses of facial recognition technologies is that oftentimes you'll have companies offering general-purpose facial recognition, so other people can then buy it and use it in all kinds of ways, from your Snapchat filter to putting it on lethal autonomous weapons. You have a major range of what can happen with these sorts of technologies.
T
Thank you, Councilor. I would just jump in to also reiterate that this ordinance only applies to government conduct. It would not restrict in any way any entity in the private sector from using this technology. But, that said, some other uses: I don't know if you've ever seen the film Minority Report, but in that film, Tom Cruise enters a mall, and, you know, some technology says to him, "Hello, sir."
T
"How did you like the size-small underwear you bought last time?" So that's actually now becoming a reality as well in the commercial space, at stores, for marketing purposes. If laws do not stop this from happening, we are likely going to see the application of facial surveillance in a marketing and commercial context to basically try to get us to buy more stuff.
P
To underscore that point: you have a patent from Facebook. Basically, Facebook has so many of our faceprints; we've been training their systems for years, right. The patent says: given that we have this information, we can provide you background details about somebody entering, say, your store, and even give them a trustworthiness score to restrict access to certain products. This is a patent that has been filed by Facebook, so it's certainly within the realm of what companies are exploring. And you already have companies that use facial recognition technologies to assess demographics.
P
You have cameras that can be put into shelves; you have cameras that can be put into mannequins, so you already have this going on. In Tacoma, Washington, they even implemented a system where you had to be face checked before you could walk into a convenience store. So this technology is already out there.
A
Thank you. So it's between me and public testimony now, so I'm gonna just put my three questions out there, and then we're gonna go right down the list of folks who have signed up. And to those folks: we're gonna come to you after you've testified. We were maxed out in our limit of people who could participate, so if you're willing to testify and then sign out and watch through YouTube or through other mechanisms, to allow somebody else to testify, that would be very helpful to this process.
A
I'm.
Looking
specifically
at
the
issue
of
enforcement
in
this
statute,
earned
this
ordinance
and
it
says
no
evidence
derived
therefrom-
may
be
received
in
evidence
in
proceeding
or
before
any
department
or
officer
agency
regulatory
body.
I
want
to
be
clear
if
the
police
received
this
evidence,
could
a
prosecutor
use
it
in
court?
A
So that's our question. Okay, sorry, so that's my issue: there seems to be no actual prohibition on use in court against somebody. The other issue or concern I have is the enforcement provisions. Violations of this ordinance by a city employee shall result in consequences that may include retraining, suspension, or termination, but they're all subject to the provisions of collective bargaining agreements, so all of that could go away if their union hasn't agreed to that being part of the disciplinary steps for that particular employee.
A
So,
to
me
that
signals
the
patrolmen
and
other
unions
of
folks
who
are
part
of
investigating
need
to
need
to
have
this
as
part
of
their
a
violation
of
contract
or
violation
of
standard
I.
Think
maybe
it's
wrong
and
then
finally
I,
you
know
joy.
Your
testimony
about
the
private
sector
has
really
hit
me.
Thank
you.
So
much
I
mean
I'm
particularly
concerned
about
this
I'm,
a
twin
me
and
Facebook
tags
sister
in
all
of
my
pictures.
A
Automatically
right
so
I
mean
I'm,
a
twin
I
mean
council,
Asami
George
has
triplets,
you
know
like
so
for
those
we're
called
multiples,
you
are
all
Singleton's.
You
were
born
by
yourself.
We
share
birth.
What
multiples
have
you
know,
and
so
naturally
I'm
concerned
about
the
second
op,
that
my
sister
is
inclined
to
do
any
criminal
activity,
but
she
may
go
to
a
protest
or
to
I,
don't
know,
but
either
way.
A
The
point
is
I'm
concerned
about
the
private
actor
and
how
they
interact
and
I'm
particularly
concerned
as
how
far
this
will
go
down
the
line
the
chain
supply
chain
right,
because
we
have
six
hundred
million
dollars
in
contracts
for
all
sorts
of
things
for
the
city
of
Boston.
You
know
if
my
contract,
we
have
a
contract
with
a
cleaning
company
that
requires
the
workers
to
sign
in
with
facial
recognition
right.
So
how
far
down
the
line
can
we
go
with
our
money
now?
I
understand.
A
T
We are unclear on what the City Council's power is to control the prosecutor's office, right? So I think we'll look more into that; we'll look closely at that. On the second question, I agree: we need to look at the BPPA agreement, which I understand expires this summer. And then finally, on the private sector concerns, I share them. Again, you know, we want to get this government ban passed as quickly as possible. The ACLU supports approaches like the one the state of Illinois has taken: they passed the nation's strongest consumer-facing biometrics privacy law.
T
It's called the Biometric Information Privacy Act, BIPA. BIPA essentially prevents private companies, any private companies, from collecting your biometric data without your opt-in consent. And that's not "I clicked a button when I was scrolling through a Terms of Service"; you actually have to sign a piece of paper and give it to them. So we need a law like that right here in Massachusetts.
A
Thank you. So at this point, I'm going to turn it over to public testimony. I have some folks who already committed and asked to speak today; I'm gonna go through them. It's already quarter to 5:00, and I'm going to try and end this hearing no later than 6:00, so that's an hour and 15 minutes to move through public testimony. I'm gonna have to keep folks to two minutes, as I said before, because there are a lot of people signed up.
A
So all the folks who've signed up, I will go through that list; then I'm gonna invite people to raise their hands who may not have signed up directly, who would also like to testify. All right, so I have on the list Bonnie Tenneriello from the National Lawyers Guild; after her I have Callan Bignoli from the Library Freedom Project, and Maddy Crawley. Those are the three folks lined up.
X
Sure, hi, I am Callan Bignoli. I am a resident of West Roxbury and a librarian in the Boston area, and I am speaking on behalf of the Library Freedom Project today. So thank you to the chair, and thank you to all of City Council, for the chance to testify today on this important piece of legislation. The Library Freedom Project is a library advocacy group that trains library workers to advocate for and educate their library community about privacy and surveillance. We include dozens of library workers from the US, Canada, and Mexico, including eight librarians from Massachusetts.
X
These library workers educate their colleagues and library users about surveillance by creating library programs, union initiatives, workshops, and other resources to inform and agitate for change. Facial recognition technology represents a class of technology that is antithetical to the values of privacy and confidentiality of library workers. As library staff, we understand that part of our work is to represent these values in the services and resources we provide to our library community, in order to reduce the harm of state and corporate scrutiny.
X
As we understand this, we can see the social and economic imperatives that privilege some and marginalize others that are encoded into many computer systems and applications. This is a result of the structure of our technology industry, which prioritizes the interests of management, venture capitalists, and stockholders, who are mostly white men.
Y
Thank you very much. Thank you, Councilor Edwards, and thank you to the City Council and all the panelists for taking up this important issue. My name is Maddy Crawley; I use they/them pronouns. I'm a teen librarian and a bargaining unit member of the Boston Public Library Professional Staff Association, AFT Local 4928. Our union of library workers supports a ban on facial recognition in Boston.
Y
In voting to do so, our union recognizes that facial recognition and other forms of biometric surveillance are a threat to the civil liberties and safety of our colleagues and of library users. Our library ethics of privacy and intellectual freedom are incompatible with this invasive technology. We recognize that facial recognition and other biometric surveillance technologies are proven to be riddled with racist, ableist, and gendered algorithmic biases. These systems routinely misidentify people of color, which can result in needless contact with law enforcement and other scrutiny, essentially automating racial profiling.
Y
We recognize the harm of surveillance for youth in our city, especially Black, brown, and immigrant youth. Unregulated scrutiny by authorities leads to early contact with law enforcement, resulting in disenfranchisement, marginalized futures, and potential death by state violence. We recognize that public areas such as libraries, parks, and sidewalks exist as spaces in which people should be free to move, speak, think, inquire, perform, protest, and assemble freely without the intense scrutiny of unregulated surveillance by law enforcement. Public spaces exist to extend our rights and provide space for the performance of our civil liberties, not policing.
Y
A ban on face surveillance technology is critically important for residents of and visitors to the City of Boston. Our union encourages you to ban the use of face surveillance in the city of Boston by supporting and passing this very crucial ordinance. I'd like to thank you for your time. We are the Boston Public Library Professional Staff Association. Thank you very much.
Z
I'm speaking on behalf of the National Lawyers Guild, Massachusetts chapter, which for over 80 years has been defending human rights, and we're deeply concerned over the dangers of face surveillance. We strongly support this ordinance. Facial recognition, deployed through cameras throughout a city, can be used to track the movements of dissidents and anyone attending a protest. As others have observed, our history in the National Lawyers Guild shows that this danger is real.
Z
We
have
defended
political
dissidents
targeted
by
law
enforcement
such
as
members
of
the
Black
Panther
Party
American,
Indian
Movement
for
two
Rican
independence
movement
and
were
recently
here
in
Boston
occupy
Boston
climate
change
activists,
those
opposing
a
straight
prime
event
and
black
lives
matter.
We
we
know
that
these,
that
the
dangers
of
political
surveillance
and
political
use
of
this
technology
are
real.
Our
the
National
Lawyers
Guild
also
helped
expose
illegal
surveillance
in
the
church,
commissioned
1975-76,
COINTELPRO
hearings.
We
know
from
this
experience
that
this
can
be
used
to
disrupt
and
prosecute
political
protest.
Z
People
must
be
able
to
demonstrate
without
fear
being
identified
and
tracked
afterwards.
I
will
not
repeat
when
others
have
said
very
articulately
about
just
the
basic
privacy
concerns
of
being
able
to
walk
through
a
city
without
having
cameras
tracked
your
identity
and
movements
throughout
the
city.
AA
No problem, you got that right. So thank you, chair, and to the amazing speakers before me. My name is Nikhil; I work at Google Brain in Cambridge, specifically in the area of machine learning fairness and interpretability, so this is sort of the stuff that we do. I'm here as a citizen, and I am very much in favor of the ordinance banning facial recognition by public officials here in Boston. Modern AI algorithms are statistical machines, right: they base predictions off patterns in the datasets they're trained on. There's no magic here.
AA
So
any
bias
that
stems
from
how
a
data
set
is
collected
like
non
representative
collection
of
data
for
different
races
because
of
historical
context
gets
automatically
reflected
in
the
AI
models
predictions.
So
what
this
means
is
that
it
manifests
as
unequal
errors
between
subgroups
like
race
or
gender.
Now
this
is
not
theoretical.
You
know
it's
studied
very
deeply
by
many
AI
researchers
in
this
field
who
have
been
on
this
call.
AA
You
know:
there's
a
federal
n
ist,
a
National
Institute
of
Standards
and
Technology
study
that
looks
at
almost
200
facial
recognition,
algorithms
and
found
significant
demographic
disparities
and
most
of
them,
and
is
another
paper
by
joy
here.
You
know
gender
shades
that
looked
at
three
commercial
classification
systems
invalid.
There
are
35
percent
of
the
time
incorrect
results
for
dark-skinned
females
versus
only
0.8%
incorrect
for
light
signals.
This
is
this
is
horrible
right,
so
these
these
mistakes,
you
know,
can't
also
be
viewed
in
isolation.
AA
They
will
perpetuate
bias
issues
by
informing
the
collection
of
the
next
generation
of
datasets,
which
are
then
and
again
used
to
train
the
next
generation
of
AI
models.
Moreover,
when
mistakes
are
made
either
by
a
machine
or
by
a
human
being,
we
need
to
be
able
to
ask
the
question:
why
was
that
mistake
made?
And
we
simply
haven't
developed,
robust
tools
to
be
able
to
hold
these
algorithms
to
an
acceptable
level
of
transparency?
AA
Deployment of these systems will exacerbate the issues of racial bias in policing and accelerate the very social inequalities that we're protesting; innocent people will be targeted. The threshold for deploying these systems should be extremely high, one that's arguably not reachable: beyond accuracy, the training and accuracy data would have to be public, and there would have to be actionable algorithmic oversight. This is an incredibly high bar, and we are nowhere near that, both algorithmically and societally.
AB
All good. Thank you, council members and authors of the ordinance, for your service to the city. My name is Norris Lima, and I use she/her pronouns. I'm a resident of Jamaica Plain, and I'm here today as a private citizen. I won't repeat what everyone else said, but I will say that the use of facial recognition by law enforcement today is especially disturbing due to the epidemic of police violence in this country and the dysfunctional relationship it has with Black and brown people in the city. And with all due respect.
AB
I ask the council to consider supporting, beyond this ordinance, a total ban on the use of facial recognition software in all public spaces: including by law enforcement, in malls and shopping centers, retail spaces, restaurants open to the public, state government buildings, courts, public schools and schools funded by the state, on public streets, in places of employment as a condition of employment, and as a condition of landlords leasing rental properties. The ban should also apply to its use and purchase.
AC
Thank you so much, dear councilors, and thank you for allowing me the opportunity to speak in support of the ordinance banning face surveillance in Boston. I am a professor of law and computer science at Northeastern University who has been researching and writing about the risks of facial recognition technologies for over seven years. I make these comments in my personal academic capacity; I'm not serving as an advocate for any particular organization. Please allow me to be direct: facial recognition technology is the most dangerous surveillance tool ever invented.
AC
It
poses
substantial
threats
to
civil
liberties,
privacy
and
democratic
accountability.
Quite
simply,
the
world
has
never
seen
anything
like
it.
Traditional
legal
rules,
such
as
requiring
legal
process
or
people's
consent
before
surveillance
to
be
conducted
will
only
entrench
these
systems
lead
to
a
more
watched
society
where
we
are
all
treated
as
suspects
all
the
time.
Anything
less
than
a
ban
will
inevitably
lead
to
unacceptable
abuse
of
the
massive
power
bestowed
by
the
decisions.
That's
why
I
believe
that
this
ordinance
is
justified
and
necessary.
AC
There
are
many
ways
that
law
enforcement
use
of
facial
recognition
technology
can
harm
people.
Government
surveillance
has
disproportionately
targeted
marginalized
communities,
specifically
people
of
color.
Facial
recognition
will
only
make
this
worse
because
it
is
inaccurate
and
bias
based
along
racial
and
gender
lines,
and
while
facial
recognition
is
unacceptable
in
its
biased
and
error-prone,
the
more
accurate
it
becomes,
the
more
oppressive
it
will
get.
AC
Accurate
citizens
will
be
more
heavily
used
and
invested
in
further
endangering
the
people
of
Boston
use
of
facial
recognition
seems
likely
to
create
a
pervasive
atmosphere
of
chill
these
tools
make
it
easier
to
engage
in
surveillance,
which
means
that
more
surveillance
can
and
will
occur.
The
mere
prospect
of
a
hyper
surveilled
society
could
routinely
prevent
citizens
from
engaging
in
First
Amendment
protected
activities
such
as
protesting
and
worshipping
for
fear
of
ending
up
on
government
watch
lists.
AC
Facial
recognition
also
poses
a
threat
to
our
ideals
of
due
process,
because
it
makes
it
all
so
easy
for
governments
to
excessively
enforce
minor
infractions
as
pretext
for
secretly
monitoring
and
retaliating
against
our
citizens,
who
are
often
targeted
for
speaking
up
like
journalists,
whistleblowers
and
the
net
result
would
be
anxious
and
oppressed.
Citizens
who
are
denied
fundamental
opportunities
and
rights
for
the
reasons
outlined
above
I
strongly
support
the
ordinance
banning
facial-recognition
Boston.
A
Thank you so much, professor. We have coming up Alex Matthews, Emily Reif, and Jebel Row. Now, those three folks, I don't know if they're ready to go, but we'll start. I see Alex; go ahead, Alex Matthews.
AD
Hi. There have been a number of studies now that show that current facial recognition technologies are perfectly capable of misidentifying legislators as criminals, and these systems being 100% accurate would pose even more worrying risks. Think about that as city councilors: where are you going when you're out in public? Are you meeting with social justice groups? Are you meeting with immigrant advocates? Are you meeting with, god forbid, police reform and accountability groups? Are you doing something embarrassing, perhaps, that could be used to influence your votes on matters of public interest or matters relating to the police budget?
AE
Thank you. Thank you, Boston City Council, for having this platform, and shout out to the expert advocates on the subject for shedding light on a subject that, you know, me and many of us are not fully versed in. What I am well versed in, though, is the reality surrounding the Boston Police Department's history and the negative impact that they have had on Black and brown Bostonians.
AE
For
decades
my
name
is
Darrell
R&L
I'm,
a
formerly
incarcerated
black
man
from
a
heavily
policed
comic
way:
neighborhood
of
Dorchester
I'm,
a
community
organizer
for
families
for
justices,
Halon
and
I'm
here
to
support
and
the
ordinance
banning
facial
recognition,
technology
in
Boston,
presented
by
a
councilors
rule
and
a
royal
first.
What
are
we
talking
about?
We
are
talking
about
a
force
whose
actions
and
racial
profiling
techniques
have
been
documented,
tried
in
court
and
the
Taurus
Li
known
for
targeting
and
terrorizing.
You
know
on
those
from
disinvested
communities
right.
You
know
we're
talking
about
militarization.
AE
You
know
to
intimidate
force
their
will
on
black
and
brown
citizens
on
a
Commonwealth,
a
force
by
their
own
field,
interrogations
and
observation
records
show
almost
70%
of
them
were
black
residents.
In
the
say,
that's
only
around
twenty
five
point:
two
percent
black,
so
we're
talking
about
taking
the
same
force
and
further
funding
them
and
equipping
them
with
another
weapon.
That
would
definitely
be
used
to
further
oppress
target
and
incarcerate.
You
know
our
sons
and
fathers
and
daughters
and
mothers.
AE
You
know
people
of
color
and
children
and
elderly
and
women,
so
you
know
I
see
that
it
needs
to
be
banned
because
we
can't
continue
condoning
and
reassuring
and
fueling.
You
know
other
generations
of
black
and
brown
men
and
women
from
Dorchester
Roxbury
Mattapan.
You
know
it's
a
more
trauma
right
and
more
incarceration.
A
Thank you very much. I really do appreciate you. I think, again, when people speak from the heart about their own experience and specifically talk about how it would impact them, I do appreciate that. Not that I don't appreciate the experts, but I'm telling you, there's nothing better than hearing about how residents are directly impacted. Thank you. I have also on the list Cynthia Pages. I don't have a name, but I have an organization called For the People Boston. I don't think we have either one there.
AF
All
right,
thank
you.
Thank
you,
the
council
and
the
panelists
for
allowing
me
to
speak.
So
I
am
a
machine
learning
engineer
working
in
an
adjacent
field
and
a
resident
of
Jamaica
Plain,
so
and
I'm
here
today,
as
a
private
citizen.
So
this
is
my
first
time
speaking
at
a
public
hearing,
so
cut
me
some
slack.
So,
although
I'm
not
an
expert
and
serve
on
the
surveillance
aspect
and
the
applications,
I
am
familiar
with
the
technology
aspect
of
this
of
this
stuff.
AF
So
I
felt
the
need
to
testify
today
because
of
the
gravity
of
this
issue.
Although
the
technology
may
seem
benign
upon
first
glance,
this
is
an
extremely
powerful
and
dangerous
tool.
So
to
me
this
amounts
to
basically
a
form
of
digital
stop-and-frisk
and
it's
even
more
dangerous,
since
it
can
be
kind
of
constantly
running
constantly
working
anywhere
at
all
times
kind
of
tracking
folks.
This
seems
like
a
clear
violation
of
the
Fourth
Amendment
to
me,
as
many
commenters
and
panelists
today
have
noted.
AF
So
that's
really
what
I
wanted
to
say
and
I
wanted
to
kind
of
state
my
support
for
a
ban
on
any
type
of
a
to
any
type
of
facial
recognition,
technology
for
use
by
the
police
or
the
government,
and
that
we
should
disincentivize.
The
research
and
development
of
you
know,
send
clear
signal
to
distant
disincentivize,
the
research
and
development
of
these
types
of
systems
in
the
future.
Thank
you.
A
Perfect
time
for
the
first
time
participating,
we
really
appreciate
it.
I
hope
that
this
is
not
your
last
time
we
are
dealing
with
a
lot
of
different
issues.
Your
voice
is
definitely
welcome.
Thank
you.
So
much
up
next
sorry
up
next
I
have
I'm
gonna
name,
will
Luckman
followed
by
Lisette
Medina
and
then
also
Nathan
shear
assured
I
apologize
again
if
I
miss
your
name
feel
free
to
just
state
your
name
again
for
folks
who
have
already
testified
against
Ince
we're
over
capacity.
A
AG
Thank you. Good afternoon, my name is Wil Luckman, and I serve as an organizer with the Surveillance Technology Oversight Project, or STOP. STOP advocates and litigates to fight discriminatory surveillance. Thank you, counselors, for holding this hearing today and for proposing this crucial reform. My oral remarks today are an excerpt of written testimony being entered into the record.
AG
Facial-Recognition
is
biased,
broken
and
when
it
works
antithetical
to
a
democratic
society
and
crucially,
without
this
ban,
more
people
of
color
will
be
wrongly
stopped
by
the
police.
At
a
moment
when
the
dangers
of
police
encounters
have
never
been
clearer.
The
technology
that
drives
facial
recognition
is
far
more
subjective
than
many
realize.
Artificial
intelligence
is
an
aggregation
of
countless
human
decisions
codified
into
algorithms
and,
as
a
result,
human
bias
can
infect
AI
systems,
including
those
that
supposedly
recognize
faces
in
countless
ways.
AG
For
example,
if
a
security
cancer
camera
learns,
who
is
suspicious-looking
using
pictures
of
inmates,
the
photos
will
teach
the
AI
that
replicate
the
mass
incarceration
of
african-american
men.
In
this
way,
AI
can
learn
to
be
just
like
us,
exacerbating
structural
discrimination
against
March
highs
communities,
as
we've
heard.
Even
if
facial
recognition
worked
without
errors,
even
if
it
had
no
bias,
the
technology
would
still
remain
antithetical
to
everything.
The
city
of
boston
believes
facial
recognition.
Manufacturers
are
trying
to
create
a
system
that
allows
everyone
to
be
tracked
at
every
moment
in
perpetuity
go
to
a
protest.
AG
The
system
knows
go
to
a
health
facility.
It
keeps
a
record
suddenly
Bostonians
lose
the
freedom
of
movement
that
is
essential
to
an
open
society.
If
the
city
fails
to
act
soon,
it
will
only
become
harder
to
enact
reforms.
Companies
are
pressuring
local
state
and
federal
agencies
to
adopt
facial
recognition.
Signals
brief,
cam
the
software
powering
Boston
surveillance.
Camera
network
has
released
a
new
version
of
their
software
that
would
easily
integrate
invasive
facial
recognition
tools.
AG
Although
the
Commissioner
says
he
will
re-evaluate
the
new
version
on
offer,
he
also
expressed
interest
in
waiting
until
the
accuracy
of
the
technology
improves
or
making
exceptions
for
facial
recognition,
use
right
now,
the
definitively
and
permanently
avoid
the
issues
with
facial
recognition
that
I've
raised
here.
We
need
comprehensive
ban
now
and
I'll
conclude
on
a
personal
note.
I
live
and
work
in
New,
York
City,
but
I
was
born
in
Boston
and
raised
here
acquired.
AG
It
pains
me
to
see
the
current
wave
of
protests
roiling
the
area
because
it
it's
the
biased
and
unequal
law
enforcement
practices.
I
remember
from
my
youth
have
yet
to
be
addressed.
I
know
that
the
people
of
the
Commonwealth
want
to
see
a
change
and
I
believe
that
the
council
is
on
their
side
in
practice.
Inaccuracies
aside,
facial
recognition
systems
lead
to
increased
stops
for
people
of
color.
Increased
stops
mean
an
increase
in
opportunity
for
police
violence
and
abuse.
AG
We
must
recognize
that
black
lives
matter,
and
to
do
so,
we
must
realize
that
technology
doesn't
operate
in
a
neutral
vacuum.
Instead,
it
takes
on
the
character
of
those
building
and
deploying
it
I
encouraged
council
to
respond
to
their
constituents.
Demands
for
police
reform
by
immediately
banning
the
use
of
the
sample
technology.
Boston
Thank.
AH
Hi, good afternoon. My name is Lisette Medina; I'm here as a private citizen. I'm actually pretty new to all of this; this week I found out from the Student Immigrant Movement kind of what's going on, and I just wanted to voice my support for this ban. I think, personally, as an immigrant, and speaking toward the presence of facial recognition technology in the school system, I think especially where we have so many conversations about the school-to-prison pipeline.
AH
This
tool
could
be
used
to
facilitate
that
kind
of
to
work
in
racially
profiling
and
our
students
and
I
think
now
more
than
ever.
We
have
this
in
this
point
in
time
and
decision
to
make
and
protecting
our
residents
of
color
of
black
and
brown
residents
and
students
and
as
they
live
and
and
try
to
navigate
the
various
systems
that
are
already
putting
pressure
on
them
and
so
I
think,
especially
in
light
of
recent
events.
AH
This
is
really
a
stand
to
support
residents
of
color
in
Boston
over
possible
pros
that
could
be
looked
at
in
this,
and
I
also
think
it's
troubling
that
the
Commissioner
doesn't
seem
to
be
clear
about
what's
kind
of
going
on
with
this.
In
addition
that
you
know
so,
it
isn't
clear
that
there
is
even
a
surveillance
of
whether
or
not
this
could
be
used
already.
AI
You said it perfectly, thank you. So my name is Nathan Sheard, and thank you for allowing me to speak on behalf of the Electronic Frontier Foundation and our 30,000 members. The Electronic Frontier Foundation strongly supports legislation that bans government agencies and employees from using face surveillance technology. We thank the sponsors of this ordinance for their attention to this critical issue. Face surveillance is profoundly dangerous for many reasons. First, in tracking our faces, a unique marker that we cannot change, it invades our privacy.
AI
Second
government
use
of
the
technology
and
public
places
will
chill
people
from
engaging
for
some
other
protective
activities.
Research
shows
and
courts
have
long
recognized
that
government
surveillance
of
person,
women
activity,
has
a
deterrent
effect.
Third
surveillance
technologies
have
an
unfair,
disparate
impact
against
people
of
color
immigrants
and
other
vulnerable
populations.
Watch
lists
are
frequently
over-inclusive:
error,
riddled
and
used
in
conjunction
with
powerful
mathematical
algorithms,
which
often
amplify
bias.
The
Space
Surveillance
is
so
dangerous
that
governments,
governments
must
not
use
it
at
all.
AI
We
support
the
aims
of
this
ordinance
and
respectfully
seek
three
amendments
to
assure
that
the
bill
will
appropriately
protect
the
rights
of
Boston
residents.
First,
the
bill
has
an
exemption
for
evidence
generated
by
face
event
that
relates
to
investigation
of
crime.
If
that
respectfully
ask
that
it
be
amended
to
prohibit
the
use
of
face
around
enabled
evidence
generated
by
or
at
the
request
of
the
Boston,
Police
Department
or
any
Boston
official.
Second,
the
private
right
of
action
does
not
provide
free
shipping
for
a
prevailing
plaintiff
without
such
fee.
AI
Shifting
the
only
private
enforcers
will
be
at
see,
organizations
and
wealthy
individuals.
Yes,
BFF
BFF
respectfully
asked
that
the
language
that
language
be
included
toward
reasonable
attorney
fees
to
a
prevailing
plaintiff.
Third,
the
ban
extends
to
private
sector
used
to
face
of
violence
conducted
with
the
government.
Permit.
The
better
approach
asked
as
cages
or
as
offered
earlier
is
to
is
to
use
space
surveillance
to
make
sure
that
face
surveillance
is,
is
used
only
with
informed
opt-in
consent.
AI
AI
EFF respectfully requests that the language be limited to prohibiting private sector permitting for the use of face surveillance at the government's request. In closing, EFF once again thanks the sponsors, looks forward to seeing an ordinance passed that bans government use of face surveillance in Boston, and will enthusiastically support the ordinance banning face recognition technology in Boston if it is amended in these ways. Thank you.
A
So I guess, to summarize your suggestions: the fee shifting helps enforcement be funded, and the third would extend to private actors in the supply chain, where the city of Boston can tell private actors that contract with us. Because one of the issues is how far we can go; I mean, we can't regulate Facebook, but we can regulate those we contract with, so I'm assuming that's the point.
AI
So, to reiterate: the first amendment that we would suggest is that right now there's an exclusion for evidence, and Kade spoke earlier to the fact that that's in reference to, you know, if they received something from the federal government or want to post or what have you. We ask that it be amended to tighten that up a bit and say that it applies as long as that information isn't requested, or created at the request of, the Boston Police Department. That's the first one.
AI
So, really making sure that, speaking back to the fact that private use could be better protected, there is something similar to the Biometric Information Privacy Act in Illinois, which requires informed opt-in consent from the individual, rather than simply saying that no private actor can do face surveillance.
A
Thank you so much. Just to let folks know, Christina is not with us; some folks had to leave because they have to work.

AJ
Hi everyone, hi City Councilors. My name is Sabrina Barroso; I am the lead organizer for the Student Immigrant Movement, the first undocumented-immigrant-youth-led organization in the state of Massachusetts.
AJ
We believe that all young people are powerful and that when youth are supported and have space and investment to grow, they flourish into the leaders of today and tomorrow. At SIM we are invested in protecting and working on behalf of the undocumented youth and families of Massachusetts. That's why I'm here with you all today, and I would like to make clear that we must prohibit the use of facial recognition technology and software in the city of Boston.
AJ
For
far
too
long,
the
Boston
Police
Department
has
been
allowed
to
full,
have
have
full
flexibility
and
free
will
over
how
the
police
and
surveil
our
communities
and
the
consequences
are
devastating.
Our
people
are
being
criminalized
and
into
prisons,
detention
and
deportation.
We
need
full
community
control
over
the
police,
and
this
is
a
key
move
to
bringing
power
to
our
people
right
now.
There
are
no
laws
that
control
what
the
BPD
can
purchase
for
surveillance
or
how
they
use
it.
AJ
When
I
learned,
this
I
was
scared,
because
I
thought
about
the
history
between
the
BPD
and
families
like
mine
and
I
thought
about
the
people
who
are
constantly
about
the
people
that
I
love
and
that
the
people
I
care
about
and
I
thought
about,
the
people
who
are
constantly
harassed
and
targeted
by
law
enforcement
simply
for
existing.
So,
if
you
ask
me,
facial
recognition
is
not
to
be
trusted
in
the
hands
of
law
enforcement,
that
tracks
monitors,
hyper
surveillance,
black-brown
communities
and
that
puts
immigrants
and
activists
youth
under
constant
scrutiny
and
criminal
criminalization
counselors.
AJ
Would
you
really
trust
a
police
department
that
shares
information
to
ice
that
shares
student
information
to
ice
as
well?
That
has
led
to
the
deportation
of
students
that
exchange
emails,
saying
happy
hunting
with
ice?
This
Enda
has
historically
targeted
black
and
brown
youth
to
invade
their
personal
space,
their
bodies
and
their
privacy
with
stop
and
frisk.
There
is
not
ashamed
to
separate
youth
from
their
and
their
families
from
their
communities
and
then
end
up
pinning
in
justices
and
crimes
on
us.
AJ
The BPD says that they don't use facial surveillance now, but they have access to it, and they have a four hundred fourteen million dollar budget to buy it right now. Our communities need access to funding for health care, housing and so many other things; the list goes on. For years, residents of Boston have been demanding that we fund the things that truly matter to them, and we have to invest in those things, not in a surveillance technology that doesn't even work and is irrelevant to keeping our community safe.
AJ
We must invest in intentional restorative justice practices, health care, mental health, and resources to prevent violence and distress. Facial surveillance would end privacy as we know it and completely throw off the balance of power between people and their government. Listen to youth, listen to our community, and let's follow the lead of young people who are addressing the deeper issue of what it is to criminalize our people, and who are putting the power in the hands of the people, especially when it comes to governing surveillance. Thank you all so much.
AJ
AK
Melda? Yeah, okay. Okay, hi, my name is Melda, and I am a member of the Student Immigrant Movement. Today I'm here to testify in support of the ordinance banning facial recognition technology in Boston. This ordinance would establish a ban on government use of face surveillance in the city of Boston. Boston must pass this ordinance to join Springfield, Somerville, Cambridge, Brookline and Northampton in protecting racial justice, privacy and freedom of speech. A federal government study published in December 2019 found that face recognition
AK
algorithms were much more likely to fail when attempting to identify the faces of people of color, children, the elderly and women. That means the technology only reliably works for middle-aged white men, a very small fraction of Boston's residents. As a young person in the city of Boston, it is clear to me that this type of technology is not okay to use in our communities.
AK
The fact that this technology is used to surveil and discipline communities of color makes my family and me feel unsafe, as well as the other families in these communities. Black and brown kids are added to databases that are used against them without evidence or reason. This interrupts our education, limits our future, and leaves us either criminalized or not feeling safe. Black and brown kids should be able to walk down the street without the fear of not knowing what's going to happen.
AK
This shows how this surveillance is used to subject Black and brown communities to racial profiling through technology. It's one of the many prongs of systemic racism. Immigrant youth already do not trust the police, because we know that they are very close with ICE. We know that they have sent student information to ICE, and this has led to the deportation of young people and pain in our communities, Black and brown communities. Councilors: filling our streets with cameras and facial surveillance means constant surveillance and criminalization of us. It's not right. Everyone on the council at some point has talked about supporting young people and young people being our future leaders. We are leaders right now, and we are telling you to do what everyone knows is right. We have to ban the use of facial recognition surveillance. We can do more to protect immigrant families in Boston. Stop surveilling youth and families.
AK
A
AL
Oh, thank you so much. My name is Eli Harmon, I live in Mission Hill, and I've been doing work with the Student Immigrant Movement; I'm testifying on their behalf today in support of the ordinance. The issue of police surveillance is enormously important to people all across the city of Boston, though it's of particular concern to communities of color, which it overwhelmingly targets. A study done by MIT researchers showed this type of surveillance
AL
technology would be far less accurate for people of color; in fact, it was inaccurate for 35% of Black women and was most accurate for white adult men. Law enforcement today already, of course, overwhelmingly targets Black and brown communities, and this type of technology is likely to only make this more the case. Additionally, this type of technology is a complete violation of people's privacy.
AL
Many cities that neighbor Boston, including Somerville, Brookline, Cambridge and Springfield, have chosen to ban facial surveillance because they understand that it is too dangerous and harmful to communities of color, that it is a violation of people's privacy, and that the money currently being used to fund that technology would be better spent in places that actually improve lives in Boston. For these reasons, I ask the council to pass a prohibition on the use of facial recognition surveillance and give the community control over surveillance technology and its usage. Thank you so much.
AM
AM
I am writing on behalf of SIM in support of the ordinance banning facial recognition technology in Boston. I urge my city of Boston to pass this ordinance and join other cities in Massachusetts, like Springfield, Somerville, Cambridge, Brookline and Northampton, in putting community power before the police. The residents, families and youth of Boston deserve transparency, respect and control. A ban on face surveillance technology is important in the city of Boston because this technology is dangerous and it leads to serious mistrust in communities.
AM
Face surveillance is proven to be less accurate for people of darker skin. Black people already face extreme levels of police violence and harassment; the racial bias in face surveillance will put lives in danger. But even if it were accurate, we should still be doing all that we can to avoid a future where every time we go to the doctor, a place of worship, or a protest, the government is scanning our faces and recording our movements. We already know that law enforcement is constantly sharing and misusing information.
AM
What would happen to our people, who are already under constant surveillance and criminalization? Using face surveillance, the government could begin to identify and keep a list of every person at a protest, every person driving or walking the street, every single day, across the city, on an automated basis. It would keep the same harmful surveillance going, like what happened with the use of the gang database. It would criminalize anyone who makes a move in our city.
AM
If we want this city to be safe, we need to put our money where it matters and invest in practices such as restorative justice to help restore our community. Facial surveillance is not the answer; if anything, it is distracting us from addressing our real concerns about safety in our communities. I urge you to prohibit the use of facial surveillance in the City of Boston, because we cannot allow law enforcement to target Black and brown immigrants, activists and community members. Thank you so much for your time.
A
Thank you very much. I want to say thank you to all of the youth who have testified and will testify, and to SIM and all those folks showing up. I just want to say, on behalf of the City Council: you're our future and you're doing an amazing job. Next I have Diaz and then Jose, and I see that Emily Reif, who had signed up before, might also be available, so those are the next three names. I think Natalie will no longer be with us.
A
D
Hi everyone, I'm here on behalf of SIM to testify in favor of the ban on facial recognition software in Boston. I am an MIT alum working as a software engineer in Boston, and a DACA recipient as well. I wanted to tell you firsthand that, as a software engineer working on AI- and machine-learning-enabled robots, the software used for facial recognition is far from perfect.
D
In fact, as Joy Buolamwini and others have noted, one in three Black women is likely to be misclassified by this technology, and our own federal government has found that face recognition algorithms were more likely to fail when attempting to identify the faces of people of color, children, the elderly and women. The software, after all, is written by people, and thus is inherently filled with mistakes and bias.
D
This is true despite all of the detailed quality assurance processes that production-level software undergoes; it is an accepted fact in the software industry that no matter how good your software or your testing process is, bugs will always exist. Consequently, the use of facial recognition technology by government entities, especially police departments, must be banned, as the consequences of errors in this technology are of life and death. A false identification by facial recognition software could lead to an unwarranted confrontation with the police department. This seemingly small error rate in facial recognition
D
software may not mean much to you, but as an undocumented immigrant and a person of color, a confrontation with the police could mean deportation, or even my life. The police have a record of systemic racial bias leading to the deaths of thousands of people. Government entities, especially police forces, do not need any more tools to further their racial bias and, consequently, the systemic murdering of the very people they are meant to protect. I encourage you to press pause on the use of facial surveillance by government entities in the city of Boston by supporting this crucial ban. Thank you.
A
AN
Hello, sorry, can you hear me? Yeah, two minutes, all right. Sorry about that, I was having internet issues. My name is Emily Reif and I work in a machine learning research group, although the views here are my own and I'm here as a citizen. I strongly support the ban on facial recognition technology. Many people have said things similar to what I'm about to say, and in much better words, but I just want to reiterate: even if the technology were working correctly,
AN
you know, it's potentially less dangerous, but it becomes much, much more dangerous when the inaccuracies reinforce the biases and disparities that have always been there in the way different groups are surveilled, policed and incarcerated. And we saw, with recent police killings, what happens when police think that you're attacking or that they're entering the right house; if they are wrong about that, there are just horrible, unspeakable consequences. That's just it.
AN
A
That's perfectly on time. I'm just going to go down the list and also call on Christian DeLeon, Clara Rias, Dean Serrano and Oliver DeLeon.
AO
My name is Christian DeLeon, and I'm here today as a member of SIM in support of banning facial recognition in Boston. Face surveillance is not only a huge invasion of privacy, but in the wrong hands it can be used to even further oppress minorities. It can be used to track immigrants and protesters, and it is inaccurate when tracking people of darker complexion. As we know, this can be very problematic, since these tools already subject these communities to racial profiling; having surveillance tools like facial recognition will contribute to an even greater systemic oppression.
AO
This also has no business being in our schools. To my knowledge, some schools are already investing millions in facial recognition software and this new technology, and for what, exactly? I believe schools should spend that time and money on bettering their school and education systems, rather than spending it on software and tools that will in no way improve students' ability to be educated.
AO
We are already living in a time where technology can do scary things, and the last thing we need is people watching our every move on a day-to-day basis. During this pandemic, we have grown accustomed to wearing masks and covering our faces in public. Well, if we now have facial recognition on every street corner, then this temporary thing the world has adapted to may just become permanent, not only to protect our own privacy, but so as not to be falsely accused of crimes due to faulty facial recognition software.
AO
2020 has put on full display the racism and brutality against people of color, particularly the Black community, and I fear that systems such as facial recognition software in the hands of law enforcement can just be another tool they use against us. Thank you for listening to my concerns.
AP
Thank you so much. My name is Clara Rias, and I'm writing in support of the ordinance banning facial recognition technology in Boston, presented by Councilors Wu and Arroyo. This ordinance would establish a ban on government use of face surveillance in the city of Boston. Boston must pass this ordinance to join Springfield, Somerville, Cambridge, Brookline and Northampton in protecting racial justice, privacy and freedom of speech.
AP
Facial surveillance threatens free speech and association, equal protection, and government accountability and transparency. The government must be wary of the technical limits of even the most promising new technologies and of the risks of relying on computer systems that can be prone to error. Facial recognition is creeping into more and more law enforcement agencies with little notice or oversight, and there are practically no laws regulating this invasive surveillance technology. Moreover, ongoing technical limitations to the accuracy of facial recognition create serious problems, like misidentification and public safety risks. I refuse to sit back and watch.
AP
My community is already vulnerable as it is. These surveillance technologies are set up to the disadvantage of Black and brown communities: the technology has a better success rate when it comes to white males, meaning it is unreliable and would do more harm than good. If we don't ban this surveillance technology, we will allow misidentification to happen against Black and brown people.
AP
A ban on face surveillance technology is critically important in the city of Boston, because we need to keep the community safe, because all of our lives matter. A federal government study published in December 2019 found that face recognition algorithms were most likely to fail when attempting to identify the faces of people of color, children, the elderly and women. That means the technology only reliably works on middle-aged white men, a very small fraction of Boston residents. I encourage you to press pause on the use of face surveillance by government entities in the city of Boston.
AP
AQ
Okay, hello, my name is Johnny Beth Serrano. Good evening. I'm writing on behalf of SIM in support of the ordinance banning facial recognition technology in Boston. I urge my city of Boston to pass this ordinance to join Springfield, Somerville, Cambridge, Brookline and Northampton in protecting racial justice, privacy and freedom of speech. It is our duty as a free and just city to ban technology like facial recognition. As a victim of social media
AQ
hacking, I've seen how technology could be used as a tool to defame and criminalize, with pictures that feel like proof to the viewers. I refuse to allow those in my community to be exposed to a highly problematic technology like facial recognition. Not only is privacy important and something that should be valued by our law enforcement, but face surveillance is proven to be less accurate for people with darker skin. With the high levels of racial discrimination forced on our Black and brown communities by law enforcement,
AQ
the racial bias in face surveillance will put lives in danger even further than they already are. A ban on face surveillance technology is critically important in the city of Boston, because the technology really is only reliable for a very small fraction of the population. It's highly important to me to live in a city where it is critical for leaders to keep our city and its people safe; although it may seem as though facial recognition could keep us safer, it has proven to do the opposite.
AQ
S
My name is Oliver and I live in Jamaica Plain. I was undocumented when I attended the Boston Public Schools in the early 90s. Yes, it's a while back; I became a citizen in 2016. I look back on my days in high school while contemplating this new technology, facial recognition, and I can sincerely tell you that it terrified me: the thought of somebody having access to this technology.
S
With my concerns acknowledged, I don't know if this would have allowed me to finish school at that time, and it would have made it hard to finish, possibly higher education too. Without that education I would not have been able to secure job opportunities like I have now, with a paying job that provides the ability to provide for my family. So when you start thinking about that, you start seeing how facial recognition can actually have a negative impact on the education of our youth and on our future.
S
People already talk about how the U.S. is riddled with racial bias, and this software is just giving people already in power the tool to skew it even more. For example, ICE has already shown up at the protests and detained a few people. Now we have to ask ourselves: how the heck did they pick out these people in a sea of thousands of protesters? I mean, one can only speculate, but one of the few answers could be facial recognition software. Do we know? Do they have it?
S
How can we be sure that they're not using it? Well, it's pretty hard to pick out two or three individuals out of thousands and thousands of protesters, so it's up to us to speculate how that happened. The coronavirus has caused a major epidemic; police brutality has caused a major epidemic. I guess we have to ask ourselves: are we allowing this to happen?
S
Maybe facial recognition in the City of Boston will bring us to the middle of the next battle, and which side do we want to be on? Let's not approve this software. Facial recognition technology also affects our youth: we do not need a system or software that can pick out individuals, potentially erroneously.
S
A
Thank you very much. It's ten to six, and I'm going to have to stop chairing at six o'clock. That only means that I will be turning over the gavel to one of the lead sponsors, Michelle Wu, at six o'clock, but I'm going to call the next names of individuals who are in line and have signed up to speak before I go, and the hearing will continue. I've got Angel, Michelle Barrios, Leon Smith, Amy Anderson (excuse me), Julie McNulty and Zachary Long.
AR
It can get the wrong person. I am young, and as I get older I will not look the same. I am seven now; when I turn 13 I will look different, and when I turn 18 I will look different. I get older and my face changes. If these tools are used in school, I will not feel safe. We are just children; our teachers are already taking care of us and watching us. We do not need police watching us. Thank you, SIM, for your love and support.
A
Well, I have to say, I think you are the youngest person that's ever testified in anything I've ever chaired or participated in. I know we're all on Zoom, but I will give you a round of applause. Thank you so much; that was absolutely beautiful, very powerful testimony, and we're so proud of you. Your City Council and your city are so proud of you. Thank you so much for your testimony, Angel. I'm going to now call on... I don't think I see Michelle or Leon; I don't hear them either.
A
AS
A
AT
It's not easy following Angel; that's a tough act to follow. Hello, my name is Michelle Barrios and I am a social studies teacher in the Boston area. Today I am here speaking on behalf of the Student Immigrant Movement, on behalf of myself as an educator, and on behalf of my diverse student population. I am speaking in support of the ordinance banning facial recognition technology in Boston, presented by Councilors Wu and Arroyo. This ordinance would establish a ban on government use of facial surveillance in the city of Boston. Boston must pass this ordinance to join Springfield, Somerville...
AT
Excuse me: Springfield, Somerville, Cambridge, Brookline and Northampton in protecting racial justice, privacy and freedom of speech. In my training as a social studies teacher, I became increasingly aware of the trauma that students of color develop as they grow up in a society that seeks to police them on the basis of their ethnicity, economic status and place of residence.
AT
This trauma is exacerbated when a young person comes from an immigrant family whose status in this country places them at risk of family separation, a loss of education, and a stripping of basic human rights. We know that students struggle to succeed academically when they face these daily existential traumas, which in essence is a violation of their rights in the U.S. to pursue survival, let alone social mobility and a future of stability.
AT
In addition to my training and work as an educator, I am the wife of a Latino immigrant and mestizo, and the mother of two mestizo children. While I am a white Latina and can live under the presumption of innocence due to my white privilege, I have witnessed firsthand what it means to live in a society that has not yet made good on its promise of equality, liberty and justice for all. My husband is followed in stores.
AT
Teachers hesitate to release our children to him when they first meet him, and he has been asked for his immigration status by customers at his work. We are in the fortunate position of knowing that his life and our family can count on his permanent residency to keep us together, for now. However, I cannot say the same for many Latinx families in the Greater Boston area.
AT
A ban on face surveillance technology is critically important in the city of Boston, because face surveillance harms our privacy and our freedom of speech, a fundamental right in our Republic's Constitution. This type of surveillance threatens to create a world where people are watched and identified as they exercise their right to protest, congregate at places of worship, and visit medical providers as they go about their daily lives. Consider my students of color, specifically the young men, growing up in a world that still does not fully accept that their lives matter.
AT
Face surveillance technology is not only another threat to their lives, but a response saying to these young people: you still don't matter. I teach world history as well as comparative government and politics, and when I teach about authoritarianism, I teach about
AT
the impact on marginalized groups in countries that are still considered developing. I teach that in these countries, surveillance is one of the most common tools to manage dissent, and that it is often followed by state-sanctioned violence as a means to silence any dissident. I urge each of you to consider the statement that you make to the public by not banning the use of facial surveillance.
AT
Sorry: the statement that you would make by not banning the use of facial surveillance technology in Boston. I urge each of you to imagine a fourteen-year-old student of color, or a Latino essential worker, in front of you, asking if their lives matter. What is your response? I encourage you to consider that moment, and to press pause on the use of face surveillance by government entities in the city of Boston by supporting and passing this crucial ban. We cannot allow Boston to adopt authoritarian, unregulated, biased surveillance technology. Thank you for your attention and your consideration.
AT
A
Thank you very much. At this point I'm going to turn over the microphone, and the gavel as well, to my colleague Michelle Wu. I'll just say that the next speakers, who should be getting ready, are Julie McNulty, Zachary Long, Christina Rivera and Joseph Friedman. I want to say thank you so much to the lead sponsors of this ordinance; I will be planning to have a working session as soon as possible.
A
I do understand the urgency, and I do believe that there is a great opportunity before us to lead in a way that values not only the lives of our Black and brown residents, but our civil liberties in general. So I want to thank again the lead sponsors for their leadership. I look forward to getting this done, and with that, I'm going to have to sign off and turn it over to Michelle Wu. Thank you.
G
And just a note: could council central staff give me administrative rights, so I can help see whether folks are stuck in the waiting room and bring people over into the panelists section? Thank you. So again, that was Julie, Zachary, Cristina Rivera, Joseph Friedman and Danielle for the next few.
AU
Yes, thank you for having me. As stated, my name is Christina Rivera and I am a Boston resident. I'm also the founder of the Latinx Coalition for Black Lives Matter; it is a support organization that works to help end racism within our community, but also to help advance the work of the Black Lives Matter movement. I want to say that we as an organization fully support the ordinance for a ban on facial recognition technology. As stated throughout this meeting earlier,
AU
we know that automation bias exists and that facial recognition technology clearly shows bias toward already highly targeted and hyper-policed communities, specifically those of color. And I just want to pose the question: even with the development of algorithmic accuracy, how can we begin to trust that those who will use it in the future will not abuse its capabilities?
AU
Using this technology also enforces a policing system that is based on control, and that is a direct symptom of the mistrust the police already have of their own citizens, the very ones that they serve. And so, as some people stated earlier, specifically Mr. DeLeon's point about which government agencies are already allowed to use this versus the places that have enacted bans: what is the legislative power by which any federal agency will be able to use it in individual states, specifically when it comes to ICE and immigration?
AU
So this is something that we are looking to make sure is not only banned now, but also cannot be allowed in the future. With that said, thank you so much to all the panelists who've shared all this great information, and thank you so much to Councilors Wu and Arroyo for doing this work. I cede my time. Thank you.
AU
AV
This technology is also supposed to recognize emotions. I'll be speaking here as a private citizen, but I'll tell you that for the past three years I've been working at a lab at Northeastern University which studies emotions, and I've been a research assistant at the Harvard Kennedy School's Belfer Center for Science and International Affairs, thinking about this same type of technology. At the interdisciplinary affective science lab at Northeastern, one of my bosses is Dr. Lisa Feldman Barrett.
AV
Dr. Barrett has research appointments at MGH and Harvard Medical School; she's the chief scientific officer at MGH's Center for Law, Brain and Behavior; she's the author of over 230 peer-reviewed scientific papers; and she just finished a term as president of the Association for Psychological Science, where she represented tens of thousands of scientists around the world. Three years ago this organization commissioned her and four other experts, psychologists, computer scientists and so on,
AV
to do a major study on whether people across the world express emotions like anger, sadness, fear, disgust, surprise and happiness with distinctive movements of the face. All of these scientists started with different perspectives, but they reviewed a thousand research articles and came to a very simple and very important consensus. The question is: can you reliably read emotions in the human face? And the answer is no. Here's an example from Dr. Barrett: people living in Western cultures scowl when they're angry about 30% of the time.
AV
This means that they make other facial expressions, frowns, wide-eyed gasps, smiles, about 70% of the time. This is called low reliability. People scowl when angry more often than chance, but they also scowl when they're not angry, if they're concentrating or if they have something like gas. This is called low specificity. So a scowl is not the expression of anger; it's one of many expressions of anger, and sometimes it doesn't express anger at all. In this paper, Dr. Barrett and her colleagues showed that scowling in anger, smiling in happiness, frowning in sadness,
AV
these are all Western stereotypes of emotional expressions. They reflect some common beliefs about emotional expressions, but these beliefs don't correspond to how people actually move their faces in real life, and they don't generalize to cultures that are different from ours. So it's not possible for anyone, or anything, to reliably read an emotion on our face, and we shouldn't confidently infer happiness from a smile, anger from a scowl, or sadness from a frown. There are technologies that claim to do this, and they're misrepresenting what they can do.
AV
According to the best available evidence, people aren't moving their faces randomly, but there's a lot of variation, and the meaning of a facial movement depends on the person, on the context and on the culture. Let me give you one example of how making this assumption can be really harmful. A scientist, Dr. Lauren Rhue, when she was at Wake Forest University, published work reviewing two popular emotion-detection algorithms. She ran these algorithms on portraits of white and Black NBA players, and both of them consistently interpreted
AV
the Black players as having more negative emotions than the white players. Imagine a security guard who gets information from a surveillance camera that there's a possible nuisance in the lobby of their building because of the facial movements somebody's making. Imagine a defendant in the courtroom who is scowling as they concentrate on legal proceedings, but an emotion AI based on facial surveillance tells the jury that the defendant is angry.
AV
Imagine that the defendant is Black and the AI says they're contemptuous. Imagine, at worst, a police officer receiving an alert from a body camera based on this technology that says that someone in their line of sight is a threat, based on a so-called reading of their face. Councilors, I think the question we should be thinking about is whether we want someone to use this technology on us, given the major chance of it misinterpreting our faces, of it being wrong.
AV
AW
Hi, good afternoon everybody. My name is Danielle, I'm a Boston resident, and I live in the North End. I just wanted to fully support and put my voice behind the ban on facial recognition technology. I feel as though the experts, witnesses and activists that have gone before me have clearly and eloquently expressed the views that I also hold: I find this technology to be incredibly dangerous. As a cybersecurity professional, I have a lot of questions about,
AW
you know, what it means to be collecting data on people and storing it. I certainly think that this is a measure that should be taken to set a precedent for all future developments of the technology. As was just stated, there are upcoming and new ways to apply facial recognition, through emotions or otherwise, that we can't even comprehend yet because it hasn't started. So I just wanted to support it, thank all the other people for putting their voices behind it, and hope to see it take place. Thank you.
AX
G
We are excited to have you come back. Denise is next, and then after Denise there are a few names signed up that I didn't see in the room, so I'm going to call out the folks that I was able to cross-reference between the waiting room and the list, which was Charles Griswold and Nora Paul-Schultz. After that, I'll try to admit everyone else who is in the room and go in order, even if you weren't on the original list. So Denise, please.
AY
I do not think that we should underestimate how this type of technology could be used to further racist and authoritarian agendas. Here, the use of facial surveillance is used to oppress and control citizens under the guise of public safety. This is a slippery slope. Government surveillance is used to silence, to suppress speech, and to violate human rights. As an educator, I worry about this type.
AY
AZ
Thank you. My name is Charles Griswold. I'm a resident of Brighton and a philosophy professor at Boston University. I hasten to add that in offering my thoughts here, I am not speaking for my university and not acting on its behalf; I'm speaking on my own behalf only. I'm very grateful to all of you in the council for taking action on this matter, and to the police commissioner for his impressive and in some ways reassuring testimony. It's imperative,
AZ
I think, that you approve this ban on facial recognition technology, or rather technologies, as we've learned today, by the government before the relevant software is upgraded. So far as I can tell, three basic arguments for the ban have been discussed, the first two extensively and the third a bit less so, and taken together, they seem to me to be decisive in making the case for the ban.
AZ
In sum, the technology puts tremendous additional power in the hands of government and its agents, all the more so when it actually does work, and our privacy, among other things, is certainly threatened by that. Where could that lead? Look no further, as people have suggested, than the nightmare that is the surveillance apparatus in China today. A third point has been mentioned, but was less emphasized.
AZ
So I strongly urge the council to pass this ban, but I also ask that you not stop there, expanding the ban even further going forward to include other kinds of remote surveillance systems, such as other biometrics, for example gait and voice recognition systems, and also to the private sector, as has been suggested. So I hope that you will consider expeditiously passing this, but also going further. Thank you so much for listening.
BA
Good evening. My name is Nora Paul-Schultz, and I am a physics teacher in Boston Public Schools. I've had the pleasure of teaching Eli and Esmelda, who spoke earlier, and Angel, and I'm a resident of Jamaica Plain. As someone who studied engineering, has taught engineering, and is one of the robotics coaches at the O'Bryant, I understand the impulse to feel like technology can solve all of our problems.
BA
I know that for many there is a desire that if we have better, smarter, and more technology, then our communities will be safer. But the reality is that technology is not perfect; it is an imperfect, biased tool, as much as we wish it weren't. Our technologies reflect the inherent biases of their makers, so technology isn't going to save us. It does not take much to see that racism is infused in all parts of our country, and that is true of facial surveillance technologies.
BA
Through my work with Unafraid Educators, the Boston Teachers Union's immigrant rights organizing committee, on the information sharing between Boston school police and the Boston Police Department, I know how detrimental observation and surveillance can be. Practices of surveillance, like camera recording in the hallway or officers watching young people, are not passive.
BA
Surveillance is active. It leads to the creation of law enforcement profiles about young people, and that can impact their lives in material ways for years to come. And what are those impacts? Being in the system can make it harder for young people to get a job or get housing; it can lead to incarceration; it can lead to deportation. Our city government does not need more tools to profile the people of our city, and it especially doesn't need tools that are inherently racist.
BA
This face surveillance technology harms our community. The fact that the technology only identifies black women correctly one third of the time lays bare both the ineffectiveness and the implicit bias built into the system. As a teacher, I know that that is not a passing grade. We need money to go to services that will protect our community, and we need to stop and prevent the use of ineffective, racist, and money-wasting technologies. Thank you.
G
Thank you, Nora, and we appreciate all of your work. I'm going to quickly read the list of names that were signed up to testify but that I didn't see matches for among the folks left in the waiting room, just in case you're signed in under a different name, though I'm expecting that these folks probably aren't here anymore: Julie McNulty, Zachary Loud, Makenna Cabin, Christine Doherty, Sarah Nelson, Amber Theriot, Amanda Mia, Crispr Rome, Bridget Sheppard, Lena Papagiannis. Anyone here who is signed on under a different name?
G
BB
Yes, this is Mark Gurvitch, presenting testimony on behalf of Jewish Voice for Peace, a local and national organization that's guided by Jewish values, fighting racism, sexism, Islamophobia, LGBTQ oppression, and anti-immigrant policies here, and challenging racism, colonialism, and oppression everywhere, with a focus on the injustices suffered by Palestinians under Israeli occupation and control. In recent days in the U.S.,
BB
we have again witnessed the impact of militarized policing on communities of color, as Councilor Wu has pointed out in a recent publication about this militarization. Much of this militarization has come through the transfer of billions of dollars' worth of equipment from the military to civilian police forces, and the training of U.S. police in the use of military tactics in controlling the communities most heavily impacted by racism, economic depletion, and health crises. Our view is that the use of facial recognition technology is a key component of the militarization of policing.
BB
We know this because of our knowledge and experience of the Israeli occupation of Palestinian people and land. Facial recognition is a core feature of that occupation. Much of the technology being marketed to U.S. law enforcement has been field-tested on Palestinians under military occupation. At this historic and critical time, when the effects of militarized policing in the U.S. have resulted in nationwide and worldwide condemnation, the very last thing we need is another tool adopted from the toolbox of military occupation. Jewish Voice for Peace supports the face surveillance ban. Thank you.
G
W
No problem. I just wanted to thank everybody and to say that I'm an associate professor at UMass Boston, and both as an educator and as a researcher in issues of surveillance, I am very much in favor of the ordinance. I would also like the council to consider wider regulations, because as many have said before, there are a lot of issues in the private sphere as well.
W
So this is a very important first step, but a lot of the use of facial recognition in law enforcement really interacts with the use of surveillance technologies in the private space. For example, when the Brookline Police Department were trying to argue against the Brookline ban, they used an example of police using Snapchat. A lot of speakers before have talked about the importance of anonymity in public space, and I'm fully in support of that.
W
BC
I don't have a lot to add. I just want to support the work the students are doing in their advocacy. I know they all spoke for themselves, and their allies did as well, and I just wanted to let you know that Boston teachers also support this ban. So thank you for listening to all their testimony, and know that the teachers are on their side too, and that we support this ordinance. Thank you.
G
M
Thank you, Madam Chair, at this point, and thank you to you and Councilor Arroyo for bringing this before the council. I've really appreciated everyone's testimony today in both hearings, and there was certainly a very direct correlation between some of the testimony today. Of particular interest to me was the testimony of the young people and the testimony of the teachers, so thank you both. And thank you, Councilor Wu, for staying after the chair had to leave to extend an opportunity for this continued testimony.
M
E
Just a sincere thank you to our panelists. I don't think many of them are on now, but thank you so much to them and to everybody who gave testimony. Much of it was deeply moving. Certainly Angel was a highlight, so thank you, Angel, and to anybody who can get that message to Angel; I'm not sure if it's bedtime, but thank you so much for raising your voices and really being focused at a real municipal level.
E
This is what we've always asked for. You know, I grew up, at about Angel's age, watching these meetings and having folks basically beg for this kind of interaction from our communities, so I am deeply grateful for whatever brought you here. Specifically, I hope you stay engaged; I hope you bring this energy to other issues that really affect our communities on a daily basis, because it really makes change. It really changes things. And so thank you so much to everybody who took part in this.
E
Thank you to all the advocates who took part in this, and thank you to the council for making sure that there was a chair here, and to Councilor Edwards for running it so well earlier. But thank you to you for ensuring that we could continue to hear public testimony, because it really is valuable. And thank you to the Commissioner for taking the time to be here and to really endorse this idea. I thought that was fantastic. So thank you.
G
This was a great, great hearing. I think we all learned so much, and I want to echo thanks to the coalition, all the community members who gave feedback along the way and over this evening, my colleagues, the Commissioner, and of course central staff for making sure all of it went off very, very smoothly. We'll circle back up with the committee chair and figure out next steps, but we're hoping for swift passage, and we're very grateful again that we have an opportunity in Boston to take action.