From YouTube: Polonius WG: Initialization and move tracking
Description
Discussed how the current borrow checker tracks moves and initialization and how to move that logic to polonius.
Dropbox paper link:
https://paper.dropbox.com/doc/Polonius-and-initialization--AeKw2KzJcfEftnZDZ5C9cV8iAg-mNvR4jqITCdsJDUMEhFbv
A
So, let's see. I figured I'd start by, you know, just sort of explaining a little bit about what the Rust rules are — you may not have internalized them too much. What we're trying to model, from the user's point of view — to get our terminology consistent and so on — is basically, we're trying to check... This is quite a bit different, in some sense, from the stuff about Polonius we've been doing.
A
It has nothing to do with regions or lifetimes, for example. We're just trying to check that all the data that you use is actually initialized — at least that's the way I think about it. There are different terminologies you can use, and in a way it's a very similar check to — like, you know, if you ever wrote Java code, there's some...
A
But if we try to access `tuple.1`, that's okay. But we don't track things at infinite granularity, for various reasons. There are some cases where we... Basically, there are various cases where it's not actually legal to deinitialize memory in the first place, right. So one of them is you can't move out from borrowed data — move the referent of a reference.
A
You can't move fields of that structure. Ordinarily, you can move fields of a struct, just like with the tuple. However, if it implements `Drop`, you can't, and the reason is — now it's sort of confusing and a little unclear: do we run drop? We can't run the drop now, because all the fields aren't there. So we could poke the destructor, but then what happened? The destructor just never runs? I don't know, it's just weird, so we don't like to do it. And I think there's a third case... oh — slices.
A
Similarly, even if you have an array or a slice or something like this, we don't let you move out of it, because that would require us to sort of know — well, let's come to the next point. We don't allow this — and why don't we allow it? Well, there's a thing that's only tangentially related, which is dynamic drop. So basically, here we say some...
A
So if it's true, then the vector is going to be dropped here, and obviously that means, you know, it would be an error to access it from the code here, because it may or may not be initialized. But the question is — so the memory gets freed here, so everything's kind of fine in that case. The question is: if it doesn't go down that path — if it goes down this path — when do we free the memory for the vector?
A
Do we free it sort of somewhere in this else branch, or do we free it as we exit this block? And the answer in Rust is that `v` gets freed here. For better or worse, that's the semantics we chose, and a side effect of that is that actually `v` only *maybe* gets freed here, right — because it depends on which path we took through the program. So we call that dynamic drop, because it's dynamically decided whether we will drop the vector `v` as we exit this function.
A
Right. So what you see is that actually in the MIR, as we see it, there are two drops for `v`, and it's certainly possible for one to reach the other, because you can go from [block] zero to one, two, three, and now you've got two drops. So MIR's kind-of semantics is that you can drop the same thing multiple times, and that's okay.
A
You can sort of imagine that the runtime semantics includes a flag for whether it's initialized or not, and drop has the semantics of checking this flag and then dropping only if it is set. And then we later sort of lower that by making the flag explicit where it's necessary, and otherwise not, right. So, after that big tangent — what does that have to do with this?
A
Well, that relates to these arrays: because if we allowed you to move out of an array, we would have to remember which index you moved out of and which indexes you didn't, and that means we can't just use boolean flags — they'd have to be something more complicated — and so we didn't want to support that. So we just don't. There are a couple of other situations. I think the other one is you can't move... oh.
A
Well, no — never mind. There's something about enums and matching which doesn't really matter here. You can't sort of drop some parts of an enum and not others — or rather, we will drop all the rest for you, which is kind of what happens in any case. All right, so, coming back to this: this is what we wanted. This is the analysis we want to implement.
A
Basically, we want to give these errors when you try to move — what borrow check currently does is it will give you errors when you try to move from things you're not supposed to move from, but, more importantly, it'll give you errors when you access things that may not be initialized. And actually these errors we probably won't do in Polonius, at least not to start — maybe never — because they're not really... I mean, what I mean is: the errors about moving from a path you're not supposed to move from, those are not very interesting.
A
So — oh, why is my screen sharing paused? Has my screen share been paused this whole time?
A
And so what these bits are is they're basically tracking what sets of paths have been — or "may be", as it says — initialized at a given point, and which ones may not be. And so the way we do this — so if you recall, I said that we have to track this, not just at the resolution of...
A
It turns out they would have to terminate in an `Option`, actually, but nonetheless there could be quite a lot: like, you could imagine that if you just started with all the paths that a user might name, there could be quite a lot. So what we actually do is we construct the paths that we actually see, and only those, within the source function we're analyzing, right. So for this one — well, first of all, we'll always make... well, we have this.
A
We have this set of interesting move paths, and we're gonna give each one an index. And the first thing we do is we make one for every local variable, because that just makes our lives easier. So there would be a path for `tuple`, but then we're gonna analyze the program and we're gonna see: oh, the user actually accesses `tuple.0`, so I'll make a path for `tuple.0`; and then later we'll see that they access `tuple.1`, so we'll make a path for `tuple.1`.
A
We only construct move paths for paths that you actually could move from. So paths like `*r`, a deref through a reference, or a field of a struct that implements `Drop` — we're gonna see later, we will never have what's called a move path for those, because we don't really need to track their initialization: they can't be deinitialized. So such a path is basically initialized if the parent path is initialized, and otherwise not, right.
A
Right, so that's what a move path is. Essentially, it's just an index that maps to a path that you might have moved from, that you've used. And we have these arranged in a couple of different ways, but one of them is a tree, so that you can go from the parent to each of the children and vice versa. So let me show you that — or, let me... the reason I'm talking about move paths is: once we have indices for all these...
A
Now, when we do these analyses, like "is it maybe initialized" — these are data flow analyses, basically computing bit sets. So we now have bits to express them in terms of, right: they are sets of these indices. So "maybe initialized" — if we look at this function, you might say, okay, at this point...
A
There's nothing initialized. At this point, we've initialized zero, one and two, because by assigning to `tuple`, we also assigned to all the sub-paths of `tuple`, sort of implicitly, if that makes sense. And then at this point we moved out of [path] one, so we only have zero and two as maybe-initialized, right? Makes sense? Okay.
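A hypothetical version of the function being walked through might look like this (my reconstruction, not the code on screen): assigning the whole tuple initializes the move paths for `tuple`, `tuple.0` and `tuple.1`; moving out of `tuple.0` then kills only that sub-path, so `tuple.1` can still be used afterwards.

```rust
fn example() -> (String, String) {
    let tuple = (String::from("a"), String::from("b"));
    // maybe-init = {tuple, tuple.0, tuple.1}
    let x = tuple.0; // move out of `tuple.0`
    // maybe-init = {tuple, tuple.1}
    let y = tuple.1; // still legal: `tuple.1` was never moved from
    (x, y)
}
```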
A
So, let's look at this move path real fast. Just to be clear on my plan: I'll walk through what the code does today, and then I will sort of pivot to what part of this Polonius will do — which shouldn't take that long, I hope, so we'll see how far we get, right. So, this `gather_moves` — what this is actually doing is creating those move paths and also indexing them. It's kind of computing, in the Polonius sense...
A
...the base input facts from the program, actually. And if you see — let me find where that code actually lives; I think this code lives in `dataflow`, yeah. So this code, this move analysis code, is not part of borrow check, and that's because it was initially written before we had the MIR borrow check. We still had to do this dynamic drop analysis, and it used a lot of this code — it was like the first consumer, because it also had to answer similar questions. Anyway. So, the move paths...
A
You see that it contains a place. This is a MIR `Place` — or what I've been calling a "path"; in MIR we called it `Place`. Something I might change, or would possibly change, but anyway. So this is the MIR version of, you know, `tuple.0` or whatever, so we're kind of interning them into these move paths. And then these three fields — these encode the tree. Not sure if you've seen this encoding of a tree before, but it's super useful.
A
So this is a link to the parent — the index of our parent. This one is a link to our first child, and then there's a linked list — that's what the next-sibling field is — of our children. You can sort of go down the list of the siblings of the first child. So this way you store... the nice thing about this encoding of the tree...
A
...is you can store two fields per node — if you ignore the parent, two fields per node — but you can build an arbitrary-size tree. You don't need to have, like, a vector of children, so it's more condensed — well, depends on your point of view — but in any case, that's how it's encoded: it's a linked list threaded through this next-sibling field. And you can kind of see that — like, oh well, this is some code that walks up the parents of a path. That's very easy!
A
You just get the parent and iterate, loading the next parent each time. But there's some other code — where is it, I don't know, somewhere or other that I don't see right now, probably in another module — there'll be some code that, for example, walks all the children of a given index, and it'll do that by first loading the first child and then, from there, loading each next sibling. Okay. So that's what these move paths are, and where they're built — that's what I wanted to show you.
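The first-child / next-sibling encoding being described could be sketched like this (assumed field and type names, not the actual rustc definitions): two links per node, plus a parent link, represent a tree of arbitrary arity.

```rust
#[derive(Default)]
struct MovePathData {
    parent: Option<usize>,
    first_child: Option<usize>,
    next_sibling: Option<usize>,
}

struct MovePathTree {
    nodes: Vec<MovePathData>,
}

impl MovePathTree {
    // Push a new child onto the front of `parent`'s child list.
    fn add_child(&mut self, parent: usize) -> usize {
        let id = self.nodes.len();
        let sibling = self.nodes[parent].first_child;
        self.nodes.push(MovePathData {
            parent: Some(parent),
            first_child: None,
            next_sibling: sibling,
        });
        self.nodes[parent].first_child = Some(id);
        id
    }

    // Walking up the parents is the easy traversal mentioned above.
    fn ancestors(&self, mut path: usize) -> Vec<usize> {
        let mut out = Vec::new();
        while let Some(p) = self.nodes[path].parent {
            out.push(p);
            path = p;
        }
        out
    }

    // Walking the children: load `first_child`, then follow `next_sibling`.
    fn children(&self, path: usize) -> Vec<usize> {
        let mut out = Vec::new();
        let mut cur = self.nodes[path].first_child;
        while let Some(c) = cur {
            out.push(c);
            cur = self.nodes[c].next_sibling;
        }
        out
    }
}
```

Note that with this scheme the child list comes out in reverse insertion order, which is fine for the traversals described.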
A
And what this is going to do is walk over all the statements and the terminators in the MIR and gather information about them, by calling, you know, `gather_statement`, which invokes some visitor — the gatherer — and the gather-statement code will look... it's basically looking for what the effects of each kind of MIR statement are. So this is what I meant: this is basically building up your Polonius facts, or what Polonius would call the input facts.
A
So, for example, here it says: okay, we have an assignment into a place; we'll first create a move path. What does that do? That will take this MIR place and create a move path corresponding to it — and we're gonna look at this in a second. But if you remember, I told you that not all places have move paths, because they can't all be moved out of. So this function doesn't always succeed — but here we don't care.
A
The move data that we're constructing — I should have talked about this, actually. There's this thing called the `MoveData`, and I've spent a lot of time talking about the move paths. So one of the big things that move data has is an indexing: for each move path index, here's the actual move path data. But it also has all this other stuff, right. And this reverse lookup — that's just a map...
A
...that, I think, helps you quickly map from a place to its move path, rather than searching linearly through this array. But this stuff, like `moves` and the location map and so on — this is detailing the effects. So the location map says: at a given location, here are the paths that get moved out of at that location. And the way we do that is: each time there's a move, we create an entry in this vector, which we call a `MoveOut`, and it stores some information.
A
It also has, like, a list of moves, or move-outs — I forget what letters we used for these things; I'll call them MP and P — so it's sort of saying: there was a move out from this move path at this point, right. And then we have that data indexed by where the move occurs, because you sometimes want to access it that way. So, just like in Polonius, you end up with two versions of the same thing.
A
You have the same here. And then, similarly, we have a map the other way: if you want to know every place that a given path is moved, there's a mapping. Okay — they both just index into this vector, which has the data. And similarly, we have a list of inits: these are all the places that a path is assigned to.
A
Yeah, so this is the function that makes a new move path, I think. And note that we only ever create move paths in this first pass. There are other functions later that do the lookup, but we assume that we've made all the move paths that are ever interesting in this first pass, and after that it's just looking up. And so basically what it does is...
A
What does it do? This code has changed a little bit — probably thanks to Santiago, who's been refactoring these things. So — a place in MIR has a base, like a local variable usually, and then a set of projections, right. So this `place.iterate` — this is code I have never seen before, but this is what I think it does: it walks over the projections, invoking this function for each one.
A
Like, if you have `a.b.c`, then `.b` and `.c` are projections — they call it that because you're, like, projecting out from the base path, that's what it is. And so what does it do... or no, it looks like this is a list? Okay, so it's iterable. Anyway, the point is we can sort of build this starting from one side and moving forward. And so we start with the base. I said that for every local variable we always have a path — we just make one.
A
That's the easiest thing to do — you can always move out of a local variable, so that's fine. So we do that. Then we walk down each projection. So if you started out with `a`, then we're now looking at the `b` (if you have `a.b.c`), and we look at the type of the thing we're projecting from.
A
So if we're looking at `a.b`, we're looking at the type of `a` here, and we'll give errors in some cases, right. So you can't move, for example, out of a borrowed pointer. So if we see that you're trying to build a move path that goes through a borrowed pointer, we'll just stop and return an error at that point.
A
We will, along the way, I think, have built up paths for all the things that led up to that point, but we don't actually build one for the referent itself. Similarly, if the ADT is a struct that has a destructor, we'll stop, and so on. All right — but otherwise we succeed. So we're going to first check if it already is in there or not.
A
This is basically just a lookup in the hashmap; if it's not in the hashmap yet, we'll construct a move path, and that's it, right. So that's how we do it. So then, the main point is that this function basically builds all the move paths that it can, and then either gives you back an `Ok` result if it got all the way through, or gives you an error if somewhere along the line it couldn't. And then that error — that's why you'll notice — remember that code...
A
...that said `let _ = ...`. So sometimes that error doesn't mean anything: if this code is not actually moving out from that path — it's just initializing that path — that's fine; we don't need to have constructed all the paths. But if it is an actual move, we would report that error to the user. And then I think we just — I don't know what we do in terms of — well, we could look at whether we actually generate a `MoveOut` data structure or not.
A
So, okay, the point is: we basically make one of these for every move from a legal path. Okay — all right, so far, so good. Then what do we do? I think that's enough about move-path building; let's talk about how we actually use this stuff. I'm assuming you'll stop me if anything is confusing.
A
What it's doing right now is it's using this dataflow framework, which is kind of a generic framework-y thing, and the idea of this framework is that you define analyses — like, see if I can find one. Well, the framework is going to be parameterized by a `BD`, or "bit denotation": basically, what are we computing. There's a struct that corresponds to computing maybe-initialized, and a struct for maybe-uninitialized, and that's what this `BD` is, and you see we're building them here.
A
Right — essentially, all of these are dataflow analyses, right. They all work the same way: they have these kill and gen sets, and you propagate them around. It's just a matter of what you consider a kill and what you consider a gen, and that's exactly what this denotation thing is all about.
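A gen/kill transfer step of the kind being described can be sketched in a few lines (a deliberately simplified model, nothing like rustc's actual framework): each analysis only chooses which bits to gen and which to kill.

```rust
// Apply a gen/kill transfer function to the bitset on entry to a statement.
// For "maybe initialized": an assignment gens a path's bit, a move-out
// kills it. Kills are applied first, then gens, so a statement that both
// kills and gens a bit leaves it set.
fn transfer(entry: &[bool], gens: &[usize], kills: &[usize]) -> Vec<bool> {
    let mut out = entry.to_vec();
    for &k in kills {
        out[k] = false; // e.g. a move-out kills "maybe initialized"
    }
    for &g in gens {
        out[g] = true; // e.g. an assignment gens "maybe initialized"
    }
    out
}
```

Replaying the earlier tuple example: assigning `tuple` gens paths 0, 1, 2; moving out of path 1 then kills just that bit.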
A
So, let's see if I can find where that is. So, for each of these, they're going to define this bit denotation, which has: what is the index type — so this is like... — and then a bunch of random stuff that's not that important; and then this stuff, which says, basically, we're gonna call these callbacks as we walk the control-flow graph, and they're supposed to add and remove things from the sets of what's present or not present, right. Okay. And so there's one for, like, when you start the block, before the statement...
A
To be honest, I have never understood why this trait has that name [laughs] — it's Felix's. Usually a denotation is, like, something you denote — you denote something when you... you know. But I guess it denotes the bits? I don't know — maybe it gives meaning to them, probably something like that. I would have called it something else, but I'm not sure my name would have been better — just different.
A
"Dataflow operator", anyway — whatever. You get the idea: it's a very generic thing. It's a dataflow factory-builder-constructor. So I'm trying to find the interesting part — this is all, like, super generic code. There's somewhere I found earlier where it sort of bottoms out. But where is it?
A
Oh, here we go: `update_bits`. I think that's it. There are some helper functions, but at the end of the day they kind of boil down to killing and gen-ing, and you can sort of see that this code references that dynamic drop thing. These helper functions basically say: there was some stuff — the statement or terminator or whatever else — that caused us to change the drop-flag state for this path to either absent or present.
A
Maybe-uninitialized is exactly reversed, but okay — that's how this code works. What we'll do in Polonius, I think, is, you know, we'll have like a move-out and an init [relation], and then the actual analysis is basically exactly the same as liveness: just copy and paste it and put the right things in the right place. Right, like...
A
You know — and it's also maybe-initialized if there's a predecessor [where it was] or whatever, and it hasn't been moved — something like that. I guess one difference from liveness would be that these are forward analyses, not backwards. So, you know, there'll be some cfg-edge (P, Q)... oops, dammit — how'd this become a link?
A
Let me think — what am I not telling you... so, right, okay, one last thing. I'm gonna assume that you've figured out that we're gonna define some sort of maybe-uninit — some sort of (MP, P) thing — here. Then the question is: what is an error? We can compute these in Polonius readily enough, but now, where do we start to get errors?
A
So when you see an access — this basically comes down to: you see an access to a path, it must not be maybe-uninit, right? Yeah. And — oh, let me point that out — you'll notice that maybe-init is not actually needed for the errors. The only thing we use maybe-init for, you've already come across, which is the dynamic drop stuff.
A
We can come back to — let's not worry about this just now — but we compute it for everything, though we might be able to optimize by doing less, or maybe we can get smarter. The point being, you know: if you've moved from one half of the tuple but not the other, I don't think we're smart enough to understand that only the half that you haven't moved from has its drop flag live and the other half does not. But we do do some sort of smart things like that.
A
I forget which file — everything is in this file, probably; there's a lot of code in this file. Let's take a look — maybe under here, yeah. So there's some code in between the stuff I was showing you and what I'm looking at right now; it kind of walks over the MIR again, and along the way we have this notion of flow state. Let me show you that — the flow state. So this is the top-level code...
A
...that's like the entry point to borrow check that we've been looking at, where it computes the move data and it computes the maybe-inits — and you see that it does these kind of early; that's because they're needed for liveness, and then later they just get dropped. Then later it computes the maybe-uninits and a bunch of other random stuff, and it puts all of those things in this `flows` [struct].
A
And part of the reason we do it this way in this code is that we don't store all the values for all the points. We only store the values on entry to each basic block, and then we sort of recompute the intermediate states as we go — which is something we don't do in datafrog, and it would be nice if we could just never do that, because it's really complicated, but maybe we'll have to reproduce it someday. I don't know. Anyway...
A
So, right — so then, as we walk the dataflow, we're gonna call this `check_if_full_path_is_moved` if we see a move — or no, if we see an access to a given place at a given location. These are lumped together, just for clarity, I think — say that that's the span of this place. So what that does is it calls this helper...
A
It calls this helper, `move_path_closest_to`, and what that basically does is it says: well, maybe `a.b.c` is not something we built a move path for, because `c` is a field you couldn't move from. Then we want the longest path we can get. So if `a.b.c` doesn't exist, then `a.b` — maybe that exists; in the limit, `a` should exist. And this usually succeeds. You can see it can actually error, because the paths don't always begin with a local variable.
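The lookup being described might be sketched like this (assumed behavior and names, not the rustc implementation): model a place as its chain of projections, e.g. `a.b.c` as `["a", "b", "c"]`, and walk the prefixes from longest to shortest until one has a registered move path.

```rust
use std::collections::HashMap;

// Return the move path of the longest prefix of `place` that has one,
// falling back toward the base local variable.
fn move_path_closest_to<'a>(
    paths: &HashMap<Vec<&'a str>, usize>, // place -> move path index
    place: &[&'a str],
) -> Option<usize> {
    for len in (1..=place.len()).rev() {
        if let Some(&index) = paths.get(&place[..len].to_vec()) {
            return Some(index);
        }
    }
    None
}
```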
A
And, yeah, the only special thing — the other MIR local variable that's a bit special is the return slot, and that would never be initialized... or, certainly not initialized on entry. So, yes: an access to a path must not be maybe-uninit, right. And note — actually, it might be that — well...
A
No — I was going to say: if you access `a.b.c`, it might not exist for many reasons — like, it might be that you can't move from that path. But I think in principle we could choose not to generate move paths for things that are never moved from, but are only accessed in other ways, because then they are initialized if the base structure is initialized, and otherwise not. However, I think at the moment we are simpler than that, and we just generate move paths for everything.
A
Depending — right, so there might be a dup: like, if there are sub-paths, they're also being initialized, and there's some code in the — I think this logic is probably... I have to double-check, but I think the `gather_moves` logic kind of already has this covered for the existing moves. In other words, there would already be entries in the array for all the sub-paths. But I'm not sure if that's true, because...
A
Actually, that's probably not true — that probably can't be true, I realize now — because you would need to do two passes. You need to first generate all the move paths and then come back and figure out what all the sub-children are; you might not know them all at the time. If you see what I mean: when you see `a = foo`, you may not yet have seen the sub-paths being used. So I think that expansion probably happens in the bit denotation stuff today.
A
So I was looking it over today, and this datafrog stuff — it's, like, so nice, but it's kind of write-only; you have to be very careful documenting. So, anyway, these are probably your base facts, and then we can compute from these. Let's just start with maybe-uninit, I guess.
A
Yeah, this stuff is fairly straightforward, I think. The one saving grace here is that we're actually kind of already generating facts like that: the move data structure has kind of already got everything you need, so you just have to iterate over it and spit it out. I was just gonna note that type, just for reference — I probably already linked to it, but just for my own edification.
A
Yeah, these are doing similar things — tracking the... it is tracking the uninitialized, right. We'll probably wind up — I don't think you should read this, but I'll probably go reread it and try to remember, because I suspect we're gonna wind up computing all the same things. But one of the things that came up that I do remember is there's a lot of phasing here, because the datafrog stuff likes to distinguish closed relations from...
A
You can see that, sort of, you know, there's a first pass here that computes — I think this is saying `overwritten` is my version of initialized. If you write to a given path at a given spot, then you also write to all the children of that path, right. And I compute that sort of as one thing and get a relation out of it, so that I can then later do leapjoins and stuff like that with it.
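The closure being described might look like this (hypothetical relation names and a naive fixed-point loop; a real datafrog version would use `Relation`s and joins): from direct writes and the parent-child edges of the move path tree, derive that a write also overwrites every descendant path.

```rust
// writes:   (path, point) pairs where the path is directly assigned.
// children: (parent, child) edges of the move path tree.
// Returns the closure: if (p, pt) is overwritten and c is a child of p,
// then (c, pt) is overwritten too.
fn overwritten(
    writes: &[(usize, usize)],
    children: &[(usize, usize)],
) -> Vec<(usize, usize)> {
    let mut out: Vec<(usize, usize)> = writes.to_vec();
    let mut i = 0;
    while i < out.len() {
        let (p, pt) = out[i];
        for &(parent, child) in children {
            if parent == p && !out.contains(&(child, pt)) {
                out.push((child, pt));
            }
        }
        i += 1;
    }
    out
}
```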
A
Similarly, for accesses — all the different ways to access a given path; this may not be relevant, but I'm sort of combining, like, oh, it's a direct access, or it's a transitive access. So I computed the transitive relation, and then I can just kind of join them together, and so forth. So we may wind up trying to do stuff like that.
A
I think that's kind of all I've got in terms of giving the high-level picture. I imagine that you sort of have a feel for the drill by now. Probably we would approach it in this order, more or less: start by adding the atoms and generating the base facts, and then write the Polonius code. Speaking of which — I don't know, I haven't looked today, but I guess I need to start merging things. Is this ready — is it ready for me to merge, or not yet?
A
Okay — I don't like that terminology anymore, and I've been trying to move away from it, but what it refers to is the named lifetimes on the function's parameters. And the reason I called them "universal" is that they're universally quantified, meaning that you have...
A
You sort of — you don't know what they are, but it's some set of things, you know. I think for the purposes of Polonius, they're not very different from any other region, except that they're sort of live all the time, yeah, and they're a little special in some ways, yeah.