From YouTube: SES Meeting: Records and Tuples with Jordan Harband
Description
In this conversation, we take a holistic look at how badly each of the options for Records and Tuples would break existing code.
A: All right, welcome to this special session of SES, where we've got all of the interested parties in records and tuples in one place. We left off last week having made some progress on at least understanding the scope of the problem and the nature of the trade-offs. Who would like to take it away from here?
B: Okay, so last week we ended up agreeing that, with the current membrane implementations, it's not possible to have a typeof of anything different from object.
B: Did this... did your notification? I should try to select them.
B: However, having typeof object brings up other problems, which are how to handle them in WeakMaps and in proxies.
D: So we either continue to hold the current case, where an authority can only be nested within something that has typeof object, or the third option is willingly breaking that — and potentially breaking code that depends on this behavior — so that there can be an authority nested within something that is not typeof object.
E: So you're talking about, like, three specific options — can I step back a bit and sort of come at it from a more high-level, first-principles angle? The language has a number of axioms that potentially apply to every use case, right. There are maybe some people who don't care about them, but, like, any part of JavaScript is eligible to rely on these axioms. For example, Object(x) === x is only true when x is an object; it is false when x is a primitive. That's an axiom. We cannot ever break that for any reason; it would make the language more confusing. So there are a lot of other things that I would put in that list of axioms. One is that if x and y are objects and they are triple-equals, they must not be distinguishable.
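The two invariants described above can be demonstrated directly; this is an illustrative sketch, not code from any of the participants:

```javascript
// Object(x) === x is true exactly when x is already an object, and false
// for every primitive (Object() wraps a primitive in a brand-new wrapper
// object), so it works as a primitive-vs-object test.
function isObjectByWrapping(x) {
  return Object(x) === x;
}

console.log(isObjectByWrapping({}));    // true: an object passes through Object()
console.log(isObjectByWrapping([]));    // true
console.log(isObjectByWrapping("str")); // false: Object("str") is a fresh wrapper
console.log(isObjectByWrapping(42n));   // false: BigInt is a primitive too

// The second invariant: two triple-equal objects are the same object,
// so there is no way to tell them apart.
const x = { n: 1 };
const y = x;
console.log(x === y); // true: x and y are indistinguishable
```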
E: Yes — I mean invariant, whether explicitly documented and agreed to or not, yeah.
E: Then I'll switch from "axiom" to "invariant", because that is indeed what I mean. So there was the one about Object(x) === x: it tells you if it's a primitive or an object. That's an invariant. Then, if x === y and x and y are both objects — you could argue that's an invariant, but I think it would be a difficult argument that code authored by people who aren't sharing this call's concerns actually ever has that mental model. And so let—
F: Let me — let me take a counterexample that you're aware of. (Please, yeah.) The counterexample was found in response to your challenge, yeah: I took a look at our own code at Agoric for places where we were counting on this invariant, and I found one that has nothing to do with authority per se. It was just a cycle check: I wanted to know if the reachability graph rooted in a particular object had cycles in it, and when I encountered a primitive I just skipped further investigation, because I know that a primitive can't lead to a pointer back into the graph — records and tuples by themselves.
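The cycle check described here could look roughly like the following. This is a hypothetical reconstruction of the pattern, not the actual Agoric code; the load-bearing assumption is the line that skips primitives:

```javascript
// Sketch of a cycle check over the reachability graph rooted at an object.
// It relies on the invariant under discussion: a primitive cannot contain
// a pointer back into the graph, so primitives need no further inspection.
function hasCycle(root) {
  const onPath = new Set(); // objects on the current traversal path
  function visit(node) {
    // The assumption that records/tuples with boxes would break:
    // anything that is not typeof "object"/"function" is a dead end.
    if (node === null || (typeof node !== "object" && typeof node !== "function")) {
      return false;
    }
    if (onPath.has(node)) return true; // reached a node already on the path
    onPath.add(node);
    for (const value of Object.values(node)) {
      if (visit(value)) return true;
    }
    onPath.delete(node);
    return false;
  }
  return visit(root);
}

const node = { name: "a" };
node.self = node;
console.log(hasCycle(node));             // true
console.log(hasCycle({ x: [1, 2, 3] })); // false
```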
E: So, thank you for reminding me of that. That is — the cycle checks, and particularly in JSON.stringify, which is like the built-in version of that — that is a use case. Now I'm—
E: I would still be interested in further investigation there. Like, I think that everyone on this call, and in the associated code bases, has the curse of knowledge about lots of things, and so I would be very persuaded by examples of code where the intention of the code is to think about carrying mutable state in such a way that records and tuples with boxes would break it — if those examples exist in code authored by someone who doesn't have the curse of knowledge that all of us on this call share. Now, that doesn't erase the use case or the breakage; I'm just discussing the widespreadness of the damage, and I think that, in the list of invariants, that one will probably have the least widespread damage out of all of them, if we pick one to break. And if we break none of them — I — like, we—
H: I just want to ask — I think we all have a lot of shared agreement, but where it might get more gray is when something stops being an invariant and is just a consequence. As in: I agree that there are invariants in the language, but then there might be other things that are true and break, but aren't necessarily an invariant. As in, you could say that someone had code where they assumed the set of strings returned from typeof was fixed in stone and would never update — and then, obviously, BigInt comes along.
H: So it's like: was the set of things returned from typeof an invariant, or was it just a consequence of the language? And does that depend on — yeah, like, I think this is what Mark was saying — the things which are crucially important just because of how widespread they are, versus things that someone, you know, somewhere could have done: a typeof where the default case is to throw. Was that an invariant just because someone did it out there and we can find a website that does it? So, yeah.
H: So do we have a clear understanding of when something is an invariant? How can we separate things in the language into the things which are invariants, and the things which may be true today but might not be true in the future? And maybe that is what Mark was saying about the consequence of breaking it.
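The hypothetical code H describes — a typeof dispatch with a throwing default, written when the set of typeof results was smaller — might look like this (an illustrative sketch, not any real website's code):

```javascript
// Code written against a "closed" set of typeof results. The default was
// unreachable when this was authored; the later additions of "symbol"
// (ES2015) and "bigint" (ES2020) made it reachable, and a new "record"
// or "tuple" result would make it reachable again.
function describe(x) {
  switch (typeof x) {
    case "undefined": return "undefined";
    case "boolean":   return "a boolean";
    case "number":    return "a number";
    case "string":    return "a string";
    case "function":  return "a function";
    case "object":    return x === null ? "null" : "an object";
    default:
      throw new TypeError(`unexpected typeof result: ${typeof x}`);
  }
}

console.log(describe(1)); // "a number"
try {
  describe(10n); // "bigint" was not in the list above
} catch (e) {
  console.log(e.message); // "unexpected typeof result: bigint"
}
```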
I: Yeah, I guess I'm trying to relate this back to Jordan's point overall. Well, the real answer is that anyone can claim an invariant about anything, and the real answer is: we have to come to a consensus on whether an invariant is truly invariant, given that the alternative is some sort of disastrous outcome that the community doesn't want to have, right.
I: It is not too dangerous — Jordan's original argument being that, because the amount of lines of code in those use cases is relatively small, it's lower risk, which is true in a sense. And then Jordan makes a really interesting sub-point: that the entire subset of people who might care about this is on this call.
I: There is certainly a way to solve this problem such that you are aware of the known set of possible typeof values, so it seems feasible that, in cases like this, we could simply update these instances to fix this problem ahead of time, right?
F: So, first of all, I find it implausible that everybody who would be broken by this is on this call or is thinking about security. I think the cycle check is a great example of something that I wasn't even thinking of until I went to look for it, and the number of ways one can count on something being primitive implying some things — and then have that break — I think is—
I: Yeah, I guess saying the words "entire set" is maybe a bit of a strong assertion. I guess, if I'm to make the same argument, it's that the majority is the case. But I mean — I guess that requires a counterexample where this exists as prevalent in the wild, and then it comes down to: do we think this invariant is truly something people consider an invariant?
F: Yeah, I think in general, with the invariants, the frequent case that we need to be thinking about as language designers is where somebody's counting on the invariant without the invariant being articulated in their head as "oh, this is an invariant of the language" — rather, it's just a regularity that they happen to be counting on. And even if you ask them — I was, I think, the one who authored the cycle-checking code — if you ask them, "are you counting on it?"—
J: Yeah, I think the object crawling is an interesting thing; I'm more worried about serialization and what people are gonna do with that.
J: No matter what we do with these, I think serialization is gonna break in some way. Naive serializers are gonna use typeof to determine how they're gonna serialize things. So I'm not sure we should just say there's only two use cases that we know of.
J: I could probably go and think of arbitrarily many use cases where people are using typeof — they use it for more than just, you know, cycle checks and things like that. So just be aware, while we're discussing: don't ever go down the route of "we've encountered all the use cases." Yeah — wait.
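The kind of naive, typeof-dispatching serializer J is worried about could be sketched as follows. This is a hypothetical illustration, not any particular library:

```javascript
// A naive serializer that dispatches on typeof. A record or tuple that is
// typeof "object" would be mis-serialized as a plain object; a record or
// tuple with a new typeof would fall through to the default and be
// silently dropped — one of the failure modes JSON.stringify shows too.
function serialize(value) {
  switch (typeof value) {
    case "number":  return String(value);
    case "string":  return JSON.stringify(value);
    case "boolean": return value ? "true" : "false";
    case "object":
      if (value === null) return "null";
      if (Array.isArray(value)) {
        return "[" + value.map(serialize).join(",") + "]";
      }
      return "{" + Object.entries(value)
        .map(([k, v]) => JSON.stringify(k) + ":" + serialize(v))
        .join(",") + "}";
    default:
      // symbol, bigint, or any future primitive: dropped.
      return undefined;
  }
}

console.log(serialize({ a: [1, "two", true] })); // {"a":[1,"two",true]}
console.log(serialize(10n));                     // undefined: dropped
```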
G: Yeah, so if I may, I just wanted to try to reason about this problem space more from the angle of who's going to be impacted by what we decide to break here. And if we decide to break the typeof invariant that this group is concerned about, effectively — I'm not going to pretend to know anything about the set of people that are going to be impacted.
G: So I don't know if there is a way, from either TC39 or even the implementers implementing the spec, to signal to that infrastructure.
G: That being said — again, I don't know what JavaScript code is out there in the world; maybe there is some more application-facing code that's probably going to break — but I have more the intuition that if we make records typeof object, for example, we might end up hurting application code — or at least the comprehension of people writing application code — more than the alternative would.
D: Okay, quick reply: this is not only platform code executing other code; it is also libraries used by app code — code that you may not be able to push an update to unless, of course, the app directly updates.
G: And to Bradley's point, this might also break in the other situation as well: you could break library code if you make records return typeof object, right. And so — yeah, it's just trying to state that — yeah, exactly what Jordan is saying in the chat, I guess, is exactly the same kind of breakage you could expect. The thing is — yeah.
G: Just — I'm finishing my point fast — but the thing is: if you have old library code running in an old application, that old application is also unlikely to introduce records and tuples, right. And yeah, that's it.
F: It's both, you know. There are, you know, lots of libraries that are widely used, that are no longer actively maintained, and that are considered to be robust and correct, and breaking those is something that we need to take seriously.
G: Okay, I'll let Nicola maybe either answer that or add to this.
B: Yes, thanks. So, yeah, I think we should be really careful to differentiate between things that will be broken regardless of the typeof and things that will be broken with a specific typeof — because, for example, I believe both the cycle detector and the serialization use case would be broken regardless of the typeof.
B: And the reason is that when you check for cycles, you currently never call functions. So even if boxes were objects, you would not call the unbox function, and so you would not detect the cycle. And similarly for the serialization one: probably you would be able to serialize it as if it were an object, but then you wouldn't be able to deserialize it back anyway, because you wouldn't know how to build back a record if it's serialized as an object.
I: I think generalizing that into an invariant — one that maybe doesn't exist — or maybe a claim that you can make is that this invariant doesn't exist: that the set of primitives that the language has stays the same, right. Like, for the serializer case, your serializer is broken if BigInt is added to the language too, right, because you can't — you can't know how to serialize a BigInt if you only know how to serialize numbers.
F: So the issue of reasoning about invariants is somewhat sensitive to the issue of what precedents programmers are aware of, which helps shape what they consider to be invariants. And, speaking as a user of the language—
F: I'm aware that the typeof of primitives has expanded over time, but the typeof of objects has not. I'm also aware that the typeof values for objects are a short list, such that I can sort of notationally afford to write typeof x !== "object" and typeof x !== "function" — but I'm not going to similarly list all of the typeofs of primitives if I want to check for a primitive. So that's why there's widespread code that checks against "object" or "function" to make the distinction.
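The widespread pattern Mark describes, written out (an illustrative sketch of the idiom, not code from a specific library):

```javascript
// Because typeof for objects is a short, closed list ("object" and
// "function"), primitive checks are written by exclusion instead of by
// enumerating every primitive typeof. The null special case exists
// because typeof null is "object" even though null is a primitive.
function isPrimitive(x) {
  return x === null || (typeof x !== "object" && typeof x !== "function");
}

console.log(isPrimitive("hi"));     // true
console.log(isPrimitive(10n));      // true: no need to list "bigint" explicitly
console.log(isPrimitive(null));     // true
console.log(isPrimitive({}));       // false
console.log(isPrimitive(Math.max)); // false
```

Note how this idiom automatically classifies any *new* primitive typeof as a primitive, which is why adding primitives has historically been survivable, while a record that is typeof "object" would land on the other side of this check.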
F: Is it an object, versus is it a primitive? Primitives themselves have a sufficiently uniform behavior right now that, for much code — not for all code, but for much code — adding new primitive types with new typeofs has been something we've been able to survive, and each one has been, you know, a gamble. But of all of the record and tuple proposals, the one that I have the most confidence would not break things—
F: —is that records and tuples become primitives that can only contain primitives, and that they have new typeofs, "record" and "tuple" respectively. That will break some things, but I think it'll do the least damage to existing code.
D: I think, from reading some of the use cases and so on, it feels like the overarching use of boxes is really to be able to clearly identify that there's a place in the record or tuple structure where there is data associated — which otherwise you would have to do awkwardly with a side table.
D: If you have a symbol, the symbol can be used for something else, so you also have to include some other kind of data indicating at which path that symbol is, or you have to, like—
D: —do something awkward, like finding all the symbols in your record or tuple and comparing them to entries in your symbol set. It's basically awkward.
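The side-table workaround being described might look like the following sketch. Records and tuples are not implemented in current engines, so a frozen array stands in for a tuple here; `ref`, `deref`, and `sideTable` are hypothetical names for the bookkeeping, which is the awkward part either way:

```javascript
// Side-table workaround: a unique symbol sits inside the (primitive-only)
// structure as a placeholder, and a separate Map carries the actual object.
// Note the Map holds the object strongly — one reason this pattern also
// feeds into the WeakMap discussion.
const sideTable = new Map();

function ref(obj) {
  const key = Symbol("ref"); // unique placeholder that can live in the tuple
  sideTable.set(key, obj);
  return key;
}

function deref(key) {
  if (!sideTable.has(key)) throw new TypeError("not a registered ref");
  return sideTable.get(key);
}

const handler = () => "clicked";
// Stand-in for a tuple like #["button", box(handler)]:
const tuple = Object.freeze(["button", ref(handler)]);

// To get the object back, you must already know which path holds a ref,
// or scan the structure for symbols — the awkwardness pointed out above.
console.log(deref(tuple[1])()); // "clicked"
```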
F: So, my comment was not advocacy; it was an observation.
F: We're discussing invariants and breakage and hazards to existing code, and the observation I was making is that records and tuples as primitives, with new typeof types, is the one that has the least hazard. Now, the direction that we've pursued as we've discussed this — I think records and tuples with boxes as objects—
F: —so that they can contain boxes as objects — I think that is still something that has attractions, but I take Jordan's observation about other breakage very seriously. So I'm stating the observation that the position that has the least breakage hazard is primitives with no boxes, and then for the other ones—
D: I would actually say that the example of serializers is one where primitives would break a little bit more than typeof object, since a serializer seeing typeof object would just end up treating it as if it were a regular object: you just wouldn't be able to deserialize it back to a record, but you would serialize it as a regular object.
J: So I'd agree with Matthew that typeof object is probably less likely to break stuff, because you'll actually send the data across the wire, at least in that case. With primitives it'll probably be dropped, or throw. In the case of JSON.parse — or stringify — it's probably going to end up with a drop for most things. The language itself might support it, but that's unlikely; I feel like we'd never really update JSON.parse or stringify.
K: So I'm wondering, for the people who support boxes and typeof object — which I guess includes Matthew and Bradley — what do you think about the invariants that Jordan is positing? Like, how should we be making this trade-off?
D: My opinion is what Jordan stated in chat: you either canonicalize minus zero to plus zero, or you make them differentiated to keep it equal.
K: So Jordan's also encouraged us to allow everything to be WeakMap keys, which Bradley has encouraged that we not do, because it could lead to memory leaks. So, I mean, I wonder if we could — you know, since there are many different proposals floating around about how to change the semantics — maybe work out some kind of compromise position, because I'm personally okay with many different points on this kind of spectrum.
D: Bradley, you have more experience in the realm of tooling: how feasible, and how much of a mitigation, do you think it would be for developer tooling to warn the developer when they've added an object that doesn't bear identity to a WeakMap?
D: That is, a record or tuple that doesn't contain a box — or, if we decide boxes can contain non-identity objects, a record or tuple that doesn't carry within it an object with identity, so that the record or tuple doesn't have an identity by itself. If you add that to a WeakMap, basically warn in developer tooling.
E: For my use cases, every time I'm doing it I have to write special-case code. My preference is: I just want an optimistically weak map. I want it to be weak when possible, and I don't care if it's strong when not. I want something I can stick everything in that will hold on to the least memory possible. And so, if—
D: I get it — you'll get spammed in your developer tooling, but if you want to update your code to be more responsible — sorry, I don't have a better word — you can update it, because the language also provides the tool for you to make that check and avoid putting it in there.
J: There's some confusion here; there's also a hand up.
F: Yeah, this is actually an interesting data point, which is: for essentially this reason, I have actually written a map abstraction that does that — but rather than try/catch, what it does is it says: if it's an object, use it as a key in a WeakMap; otherwise, use it as a key in a Map. So yeah, this case itself becomes another example. We do something like that in—
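The "weak when possible" abstraction Mark describes could be sketched like this. It is an illustrative reconstruction (the class and method names are invented), not the actual implementation he refers to:

```javascript
// Keys that are objects go into a WeakMap, so they can be garbage
// collected; primitive keys, which a WeakMap rejects, go into an ordinary
// Map and are held strongly. The dispatch reuses the typeof idiom from
// earlier in the discussion.
function isObjectKey(key) {
  return key !== null && (typeof key === "object" || typeof key === "function");
}

class HybridMap {
  constructor() {
    this.weak = new WeakMap();  // object keys, held weakly
    this.strong = new Map();    // primitive keys, held strongly
  }
  set(key, value) {
    (isObjectKey(key) ? this.weak : this.strong).set(key, value);
    return this;
  }
  get(key) {
    return (isObjectKey(key) ? this.weak : this.strong).get(key);
  }
  has(key) {
    return (isObjectKey(key) ? this.weak : this.strong).has(key);
  }
}

const m = new HybridMap();
const objKey = {};
m.set(objKey, "weakly held");
m.set("name", "strongly held");
console.log(m.get(objKey)); // "weakly held"
console.log(m.get("name")); // "strongly held"
```

A record that is typeof "object" but has no identity would land on the WeakMap side of this dispatch, which is exactly the case the warning discussion above is about.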
J: I think we've kind of jumped — we've written it twice. I think we're trying to jump too hard on solving something that's potentially better solved not at runtime. We have various kinds of type checkers, linters and stuff like that, and you can just lint and see if someone's trying to put something into a WeakMap — usually these days you warn there, rather than writing a bunch of code — because people can do it intentionally; there are use cases for it.
D: Actually, that does raise a good question: would most tooling be updatable to deal with records and tuples as objects?
J: Narrowing with things like TypeScript or Flow is not — you would suddenly have everything that is guaranteed to be a type that is object no longer have that guarantee. If typeof is "object", you are changing their types.
E: I think narrowing — like, in a type system — I think it would work the same. My guess is it would be more complex if it's typeof object, but I'd want someone on the TS team to confirm that either way. But it feels like, in either case, a type system could distinguish them, and without a type system you cannot distinguish them.
D: It would be good to ask the question of a TypeScript implementer; I don't think anyone is on the call or has been involved in these discussions.
H: Right, so in terms of TypeScript tracking it: I think it could track certain things, but, as with all things in TypeScript, it's always a trade-off of when they could be more accurate about something. The problem with accuracy is that you then start to have false positives, and it could start complaining about something incorrectly. Like, you know, it could go—
H: "It's not possible to tell if this is a record with a box or just a record with anything in it" — so it could assume it could have a box and fail for those cases, because it thinks that's the safer thing to do; or it could think that's going to cause too many people an error. TypeScript doesn't, you know, have any guarantees about being sound, you know.
H: I think that, as records and tuples hit stage three and TypeScript starts implementing this stuff, you know, we'll have to involve the TypeScript team a lot to figure out how this will mesh together. But I don't think we need a guarantee that this is going to mesh well, because the TypeScript type system being unsound means that there are just so many opportunities for different ways to hack around this that it's not as big of a concern to me.
D: And, thinking about this — anyway, it would be a problem regardless of the typeof issue.
I: Do we want to loop back to the other blocking invariant for objects: still being the fact that triple-equals does not behave the same as Object.is under the current equality semantics?
A: So I want to ask a procedural question, since it's my job to facilitate. It seems like — does everyone here agree that it would be a useful way to conduct the conversation about records and tuples in terms of what breaks for each of the options, so that after we've collected what breaks for each option, we can weigh that against our values? Is that—
A: —a reaction to our consensus from the last meeting: that one of the options has sufficiently condemning properties — because it breaks things that this group of people cares about deeply, an invariant this group cares about deeply — that it is not an option on the table, and that we can make progress sort of dynamically off of that. But I think Jordan's call is for us to more holistically evaluate how bad all three of the options we're investigating are. As Mark said, the least bad is obvious, but it's also the least palatable, because it gives us the least expressivity, et cetera, et cetera. So, if we could use the remaining time — does everybody agree?
A: —that we could best use the rest of the time in this meeting coming up with: what is the worst thing that happens — what is the worst consequence — of each of these options, and then start actually digging into the ones that we've discussed the least, which are things like the object identity and object equality invariants, etc.
F: I would go — I like that. I would put a fourth option on the table — not because I'm in favor of it, but because it is one of the things that we might end up doing — which is just not doing records and tuples at all. That obviously does the least damage, and so the pros and cons of that should be weighed against the other three.
A: Yeah, yeah — I think that actually goes without saying. Sure, we clearly do the least damage by doing nothing; you cannot break the web by doing nothing.
B: In the history of TC39, has there ever been a way to evaluate how bad some breakage is? Because, for example, Mark mentioned that it's not enough to just see how much code is broken, because some breakages might be worse than others. So did we ever have some way to evaluate this?
A: Yeah — as Jordan and I were going back and forth earlier on the call about what kinds of cases may be consequential, one of the things that I'd observe about that is that there are pieces of code that folks on this call are intimately familiar with—
A: —that very few people off of this call are intimately familiar with, but that are no less consequential, because they are baked into the lowest layers of the most primitive libraries that absolutely everything uses. And that certainly is — like, if there's one library that everyone uses, I think that is of more consequence, and we just need to have some concrete evidence.
I: In terms of discovering those — actually, I don't know if the laboratory thing you shared doesn't just give us exactly those answers, like enumerating through those things. I don't know if y'all have played with it, but going there and switching the various options will give us the enumeration of all possible breakages, right? It will tell you, I think—
D: This tool is great. However, having come to the discussion on records and tuples a little late, I am missing a lot of the background on the equality discussion, and that thread is very, very long and I'm declaring bankruptcy on it. If same-value-zero is not possible — I believe same-value-zero is not possible for typeof object.
K: So, for the history: the same-value-zero equality semantics — that was quite a lengthy discussion. The proposal initially started with just same-value semantics: triple-equals would be the same as Object.is — that is, there wouldn't be canonicalization.
K: We heard a lot of complaints from web developers — I mean, I guess especially from Gus Caplan, and maybe people who he invited to these forums — and people were saying (I think Kevin Gibbons was also sympathetic, and other people) that this would create extra developer friction: developers would have to start caring about the difference between zero and negative zero, through triple-equals, through records and tuples.
K: I would be okay, personally, with going back to the original equality semantics, but I think if we want to make that change, we would want to bring in the stakeholders who had that other point of view. We have very extensive GitHub threads, and you can figure out who to get in touch with through those.
D: Could someone, if they remember, summarize what the argument against canonicalization was?
K: The argument against canonicalization was — I mean, I believe this argument, actually — that it's weird if we have a container data structure and you put something in the data structure and take it out and it's different. I just don't expect, you know, the operation of making a tuple of something and then taking it out to change what the value is.
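Today's array behavior, which this round-trip expectation is based on, can be checked directly (tuple syntax itself is not implemented in current engines, so an array stands in for the tuple):

```javascript
// An array stores -0 and gives back -0: the value is === to 0, but still
// distinguishable with Object.is and via division.
const arr = [-0];

console.log(arr[0] === 0);          // true: === equates -0 and 0
console.log(Object.is(arr[0], -0)); // true: the array preserved -0
console.log(Object.is(arr[0], 0));  // false: -0 did not become 0
console.log(1 / arr[0]);            // -Infinity: the sign is observable
```

A tuple that canonicalized -0 to +0 would flip the last three results, which is exactly the "put it in, take it out, and it changed" objection.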
H: They are kind of collection classes that give you a certain functionality, whereas with a record or a tuple you could say there's an actual value that I'm creating. It's hard — it's very subjective. In my gut, if I put something in an array, and putting something in an array changes it, that feels very different to if I try to store something in a map, and the way that map is hashing means that it kind of actually doesn't store that exact value.
A: Is there a useful insight to be had that perhaps a tuple could contain the original value, but canonicalize it for the purposes of its own identity? That's what breaks—
D: That means you can create two objects that are actually different but that have the same-looking identity.
D: They're triple-equal, but you get it out and it really isn't the same thing. Okay.
H: If you're doing physics, you know, and you're working out what's the current vector of this player — and, you know, if suddenly you've swapped the sign, you know, you're suddenly gonna have something shooting off in the other direction. That was the example given in that long thread, I think.
D: So the issue there would be some code switching from a regular array to a tuple and not realizing there's a semantic difference for zero and minus zero between the two.
H: Yeah, I think that's what I was trying to get at earlier. My guess on how people approach this proposal is that tuples and records map to objects and arrays at, like, a very high level in terms of how you would use them — and obviously, visually, they have similar literals — as opposed to thinking that records and tuples map to Map, like capital-M Map, and a non-existent capital-L List thing in the library. Which is why, you know — arrays definitely don't change values when they're put in them.
F: The reason Kahan introduced negative zero into floating point was because of numerical algorithms that he had in mind that would error in one direction or the other — accumulate error in one direction or the other, or something; I'm not a numerical analyst, I don't really understand it. I sure wish he hadn't done it, but he did do it for the purpose of algorithms that he then published and got into widespread use.
B: Oh, I didn't realize my hand was up, but anyway, I just want to say that maybe those algorithms are actually implemented in JavaScript. I mean, there are different libraries doing numerical stuff in JavaScript, so it wouldn't surprise me if some of them rely on the zeros — like, there are different JavaScript libraries for maps.
I: The thing I wanted to mention — we're probably out of time, but coming all the way back to the earlier thing (I guess maybe we should take this offline, Jordan): something that occurs to me is that maybe the invariant breakage around objects with different — like, you know, a negative zero in a place where the positive zero is on the other side — like, two records that are typeof object, that are "objects" quote-unquote, that triple-equal the same—
I: —but they're not Object.is-the-same — ignoring the fact that triple-equals and Object.is are different — the real gross part there is that you triple-equal those two objects, and if you access those two properties, they are observably different. The interesting — or, like, that, I guess, is a way to soften that invariant.
E: I mean, that's fair — and especially with proxies and stuff, that can be different. I would say that, like — if I wanted to come up with something generic, and not do the rational thing and avoid those edge cases, I would probably say something about, like, Object.getOwnPropertyDescriptors of it being—
I: Well, take the example where you have two objects — two standard objects — that are triple-equal to each other, them being the exact same object. Then you could make the case — one that I am attempting to disprove — that all of the properties of said objects, when you access them, will be the same as well, like, via the same operator. But that's not strictly the case, because an object could have a getter property that will return an arbitrary value on each access.
F: Right — the idea that an unfrozen object's properties are stable was never an invariant.
F: And that's — and the invariants for frozen objects are very strong. The—
E: Right, but given that they're the same object: if I do x === y and, on the very next line — like, modulo getters that can do random things, because I could choose to reflect and not invoke the getter — those two objects are indistinguishable, because they're the same object. Currently, whether they're frozen or not has no bearing on it.
E: Right, yeah — the invariant that I am concerned with is that, immediately, before anything else has had any chance to mutate any aspect of that object — and modulo getters — they're the same object and they're interchangeable; and if there exists any case where they're not, then that is—
I: Right — I guess my point is to soften that: like, yes, they are still — clearly, observably — they do contain different values, but that—
F: The "interchangeable" is the really important point about Object.is: if Object.is(x, y), that means that if you take all occurrences of the value of x and substitute y, or vice versa, you can't cause any observable change in the computation. And we do have one exception to that, which helps clarify how strong the invariant is: the one exception is the bits that a NaN serializes to, right.
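The substitutability relation being discussed, and where it diverges from triple-equals, can be checked directly:

```javascript
// Object.is treats all NaNs as the same (interchangeable) value, while
// === says NaN is not equal to itself; conversely, === equates 0 and -0,
// which Object.is distinguishes because 1/x observably differs.
console.log(Object.is(NaN, NaN)); // true: interchangeable values
console.log(NaN === NaN);         // false: === diverges here
console.log(Object.is(0, -0));    // false: not interchangeable (1/x differs)
console.log(0 === -0);            // true: === diverges here too
```

The exception Mark mentions is that two NaNs equated by Object.is can still carry different bit patterns, which serialization (for example, writing into a typed-array buffer) can expose.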