From YouTube: wg-incr-comp mtg #13 (2021-03-09)
Description
The thirteenth meeting of the incremental compilation working group. The T-compiler sprint was last week and we're all in recovery mode. :)
A: Okay, all right: this is the 13th meeting of the incremental compilation working group. We have Wesley here, and Oliver... I can't tell from your silence whether you're going to participate or not; you're totally silent. Oh, oh! No! Microphone! Okay, that's the issue, all right! Well then, you're unlikely to see a participation list; that's going to happen in the chat. All right, so it's just Wesley and Felix today. So yeah, I don't have any items on the agenda; this agenda is what it is.
A: Oh, that's not true, actually. I was going to mention the other agenda that I was making in parallel with Wesley's. I'll add my item on there; it's mainly a sort of side note that I'll put at the bottom.
A: Okay, all right, let's get started then. So we've got this new issue, #82920, filed very recently, and it's only P-high. Is it not P-critical just because we figure people aren't going to run into it? Like, it's relatively hard to hit, because you have to have a certain setup, you know.
A: Yeah, okay, that definitely is bad. Okay, yeah, it's tricky. I agree that usually stable-to-stable regressions are not treated as critical, because we figure people can work around them, but this is the kind of thing where... oh, David is waiting in the lobby.
Oh
man,
I
am
just
out
to
lunch
about
who's
waiting,
all
right,
so
I'm
letting
in
santiago
and
david
now
all
right.
So
you've
all
didn't
miss
very
much
other
than
me
hitting
the
record
button.
So
wesley
has
an
agenda
that
he
put
together
and
we're.
A
Looking
at
the
first
item.
The
agenda,
you
too
can
add
stuff
to
the
agenda
if
you
want
to
in
parallel,
but
we're
talking
about
eight
two,
nine
two
zero.
This relatively newly filed incremental compilation bug is a stable-to-stable regression. Has the cause actually been identified? Yeah, I guess they did identify the actual... no, wait, it just hasn't been bisected, so we don't know. Okay, so it needs bisection, as well as an MCVE.
A: If it truly is reproducible, then we should be able to bisect it. Oh, I can't tag things; I'm on my work machine right now, and I'm not logged into GitHub for some reason. I guess I could... what?
A: I just wanted to add the needs-bisection label, or whatever, in terms of the priority here. Yeah. So what I was just musing about was what priority to use. It's tagged as I-prioritize, meaning they're asking for help prioritizing it, and historically, as Wesley pointed out, stable-to-stable regressions don't get as high a priority, because we figure, okay, it's out in production anyway, so we shouldn't be scrambling to fix it. I am curious.
A: The other interesting part is this detail about verifying the ICH, which I assume is the incremental compilation hash. Is that right? Yeah. So that's a sign that our hash is actually unstable, right? Is that the correct way to interpret that flag? Yes, unstable fingerprints.
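The flag in question is rustc's `-Z incremental-verify-ich` debug option. As a rough illustration of the idea only (not rustc's actual implementation; the types and function names below are invented for the sketch), verification amounts to re-hashing a value and comparing against the fingerprint stored in the incremental cache:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Fingerprint of a query result, as it would be stored in the incremental
/// cache. DefaultHasher::new() uses fixed keys, so this is deterministic.
fn fingerprint<T: Hash>(value: &T) -> u64 {
    let mut h = DefaultHasher::new();
    value.hash(&mut h);
    h.finish()
}

/// Sketch of the verification step: when a cached result is about to be
/// reused, recompute its hash and compare it to the stored fingerprint.
/// A mismatch means the hashing is unstable, the bug class discussed above.
fn verify_ich<T: Hash>(recomputed: &T, stored_fingerprint: u64) -> Result<(), String> {
    let now = fingerprint(recomputed);
    if now == stored_fingerprint {
        Ok(())
    } else {
        Err(format!("fingerprint mismatch: {now:x} != {stored_fingerprint:x}"))
    }
}

fn main() {
    let result = vec!["fn main", "fn helper"];
    let stored = fingerprint(&result);
    // The same value re-hashed on the next session: verification passes.
    assert!(verify_ich(&result, stored).is_ok());
    // A value that hashed differently between sessions would be caught.
    assert!(verify_ich(&vec!["fn main"], stored).is_err());
    println!("ok");
}
```

If the same logical value produces two different fingerprints across sessions, the `Hash` impl feeding the fingerprint is unstable, which is exactly what the verification flag is designed to surface.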
A: So that's good! That might be a clue as to what it would take to fix it. Let's call it P-critical and move along. And then, should someone be assigned to it? I'm happy to take charge of it for the short term.
A: All right, do you want me to assign you? Yes, please. I'm saying this while I'm literally not logged into GitHub, because I'm on this machine that I don't use; I try to avoid doing GitHub stuff on my work machine. So yeah, if you could assign me, that'd be good. All right, let's move along then. If I'm assigned and that's taken care of, then we can move on to...
B: Neither cjgillot nor I have any good ideas for making it faster or less red or whatever. So the current state is basically that there are some regressions, up to 2%, I think.
A: Well, that's better than four percent. There are some large...
B: ...actual wins in incremental scenarios, and then, as noted in the PR, it also has the effect of breaking the hard dependency that currently exists between codegen units.
B: What happens is, if the hir_owner and hir_owner_nodes queries change right now, then we just recompile CGUs, even if we don't need to, because of the way the query system works. That's my understanding. So anyway, by breaking the dependency there's more opportunity to reuse codegen units, which is, I think, probably what's happening in the tokio-webpush-simple debug benchmark.
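A minimal sketch of why breaking that edge helps (hypothetical structures; rustc's real red-green dependency graph is far more involved): a CGU can only be reused when none of the dep nodes it recorded a read from have changed, so removing a spurious hir_owner edge directly increases reuse:

```rust
use std::collections::HashMap;

/// Simplified view of a codegen unit and the query nodes it was
/// recorded as depending on (hypothetical structure, for illustration).
struct Cgu {
    name: &'static str,
    deps: Vec<&'static str>,
}

/// A CGU can be reused only if none of its recorded dependencies changed.
fn can_reuse(cgu: &Cgu, changed: &HashMap<&str, bool>) -> bool {
    cgu.deps.iter().all(|d| !changed.get(d).copied().unwrap_or(false))
}

fn main() {
    // hir_owner changed (say, a comment shifted an item), but the MIR that
    // the CGU actually codegens from did not.
    let changed: HashMap<&str, bool> =
        [("hir_owner", true), ("optimized_mir", false)].into_iter().collect();

    // With the hard dependency in place, the CGU is recompiled needlessly.
    let before = Cgu { name: "app.0", deps: vec!["hir_owner", "optimized_mir"] };
    assert!(!can_reuse(&before, &changed));

    // With the dependency broken, as in the PR discussed here, it is reused.
    let after = Cgu { name: "app.0", deps: vec!["optimized_mir"] };
    assert!(can_reuse(&after, &changed));
    println!("{} reusable", after.name);
}
```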
B: And I think architecturally this is what we want. So if there are regressions, we should try to fix those eventually, at some point, or find some other improvement that would negate them. But I think this is architecturally what we want.
A: Okay, yeah, I think that's the way to go; I'll approve it. Great, great. Okay, even wins on max-rss. Oh, I guess, yeah: if you don't recompile stuff, you're not necessarily going to have as high a max-rss, because you don't waste that effort. Sorry, I was just looking at the other metrics and thinking about them. Okay, great, let's merge it. So the next...
A: I just want to point out that the sprint is over, and I'm hoping to have a retrospective about it for the whole team. One of the big things I feel, and this was no surprise, is that we didn't have enough preparation for the sprint, and so we spent the sprint itself doing work that arguably should have been done before the sprint. It's fine that that was what happened for the first sprint.
A: I just want to learn from it for any upcoming sprints. We'll have a planning meeting at some point and decide what the next sprint will be, but on the off chance it's about incremental compilation, this is a case where it would be good for our team to take the lead in terms of any prep work that might need to be done. That's all. Santiago!
C: So, as it says there, I'm kind of working on a metadata dumper. I'm also doing other stuff related to traits and other things in the compiler, so it's not the first thing I'm tackling, but I'm doing it as a side thing, and at some point it will be ready.
C: Yeah, like, I don't need to decide exactly what... or I could even ask: what would we like to dump from our metadata files? The idea is to read these kinds of files and dump whatever is in there, but I don't know; dumping everything might be massive, or maybe we would like to have a way of filtering that, like with an external tool. Also, I don't know, do you want to say... I don't know, but there's...
A: I don't know enough about the metadata format to speak to what makes sense there. I can guess that, depending...
A: ...on how it's structured, I could imagine doing interesting stuff with filtering, either as a post-processing pass or, perhaps better still, as a sort of online filter, feeding through. But yeah, there are lots of question marks there. I think just getting the simplest thing going is the way to go there; don't try to iterate too much on it. Make the thing that dumps, and find out.
A: There's a tool that exists, made by bijon, if I'm pronouncing that right.
A: I don't know the answer there either. It's interesting, because the thing I was going to say is that, on the one hand, I often opt to dump human-readable stuff, but I've found there's some cultural stuff there. There's a crate called tracing that tokio has, and one thing I noticed watching a talk about it is that the author made a point of saying: we're not dumping human-readable output, we're dumping JSON or some other format.
A: It's meant to be parsed by their tools, and that was a pro from their perspective, because if you dump human-readable stuff, then any time you want to work with it you have to make ad hoc greps or other tools. As soon as the data becomes large, it's not human readable no matter how you dump it. Right, that's...
A: The basic issue with a large data set is that you end up having to use other tools anyway to make sense of it, and once you're at that point, you have the question of: okay, wait, is it actually buying us anything to have it be human readable? So that's the only thing I'm mentioning offhand. Best of all might be to make it simple to toggle between the two options, right, something where the output stream...
A: However you generate the output, try, if you can, to make it something that can easily produce JSON or human-readable output without too much trouble. I don't know how hard that is to do; I honestly don't know. Right now the compiler, for example, doesn't have a dependency on serde, so you can't just try to use their systems, because we don't want to add that dependency as of now. But there are other things in the compiler that do dump JSON.
A: That's what I'm trying to say, I believe, and so you might be able to leverage whatever they're doing to make it easy to produce one or the other. That way, at least, you can just use human-readable output in the terminal when you're doing little tests, but when you're working with a large crate, the JSON might be more useful. That's all; I'm just trying to point out a potential issue.
A: Don't worry about it, Santiago, I wasn't saying anything that important. Okay, all right, let's move on. Wesley, you've got a patch from the sprint last week, right?
B: So I wrote a patch that basically just looks for the outlier largest CGUs in a crate and splits them roughly in half, and then we lean on the current merging process to actually give us reasonably sized results. The data I was seeing for rustc_mir is that that seemed to work great, in that the resulting CGUs we got were all maybe within two or three x of the same size, the same size estimates.
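A toy version of the heuristic described here (assumed parameters; the real partitioning logic lives in rustc and is driven by real size estimates, not these numbers): split any outlier CGU roughly in half, then reuse the existing smallest-pair merging to even things out:

```rust
/// Split any CGU whose size estimate exceeds `factor` times the average
/// roughly in half, repeating until no outliers remain. `factor` is a
/// hypothetical knob, not a value from the actual patch.
fn split_outliers(mut sizes: Vec<u64>, factor: u64) -> Vec<u64> {
    loop {
        let avg = sizes.iter().sum::<u64>() / sizes.len() as u64;
        // The `s >= 2` guard just keeps the sketch from splitting forever.
        match sizes.iter().position(|&s| s > factor * avg && s >= 2) {
            Some(i) => {
                let s = sizes.remove(i);
                sizes.push(s / 2);
                sizes.push(s - s / 2); // keep the total size estimate intact
            }
            None => return sizes,
        }
    }
}

/// Greedy merge of the two smallest CGUs until we are down to `target`,
/// mirroring the existing merge step the patch leans on.
fn merge_smallest(mut sizes: Vec<u64>, target: usize) -> Vec<u64> {
    while sizes.len() > target {
        sizes.sort_unstable_by(|a, b| b.cmp(a)); // smallest two at the end
        let a = sizes.pop().unwrap();
        let b = sizes.pop().unwrap();
        sizes.push(a + b);
    }
    sizes
}

fn main() {
    // One outlier CGU dwarfs the rest.
    let sizes = vec![1000, 60, 50, 40];
    let split = split_outliers(sizes, 2);
    let merged = merge_smallest(split, 4);
    // Total estimated work is preserved, and the spread between the
    // largest and smallest CGU shrinks dramatically.
    assert_eq!(merged.iter().sum::<u64>(), 1150);
    assert!(merged.iter().max().unwrap() / merged.iter().min().unwrap() < 20);
    println!("{merged:?}");
}
```

The point of leaning on the existing merge step is that splitting only has to be crude: halving outliers puts everything on a more even footing, and the merge pass then produces the "within two or three x" spread mentioned above.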
D: [inaudible]
B: Yeah, I can look at that, because I did have the timer. I didn't look at that yet, because I don't think the pass itself is taking the time.
B: I think it's just the effect that it's having on LLVM. What's interesting is that it looks like there is a reduction in peak memory usage, but I'm unconvinced that that has anything to do with my patch, other than it made crates take long enough to compile that it changed which crates were compiling at the same time, and that has the effect of reducing peak memory usage.
A: Well, so, as part of this, have you looked at it without compiling the crates in parallel, just to evaluate the peak memory? Or, I guess, you don't need to do it that way. I was musing to myself while you were talking. Okay: you saw these changes in peak memory usage, and you're hypothesizing that it's about the concurrent runs, but can you eliminate that from the measurement?
A: Yeah, it will take some time just to set that run up, yeah. Okay, that's cool, though. All right, it'd be good to see what comes of this. I mean, obviously, the other thing I would think is that, once again, this comes back to our merging algorithm being so simple-minded and not taking anything else into account; once you're doing the splitting, you may need to revisit it.
A: I don't know how the merging is done; trying to make it more clever... although you would hope that anything that was split doesn't get re-merged. What I was going to say is that there may be cases where you need to re-merge things.
A: No, I guess this might make sense. Still, you see, the whole point here is the initial thing you're doing with all the splitting.
A: It's just starting everything on a more even playing field before the merging happens, right? And so the re-merging could, hypothetically, very well end up with things being re-merged together that were previously merged together, but it doesn't necessarily end up that way. In fact, you may want to bias it towards trying to re-merge things that were already merged together, because of the effects of them having the same inline functions, not duplicating that effort, right? That kind of thing.
A: Right, and that's a place where a more intelligent split, a more intelligent merge, could potentially be useful, because it avoids that blow-up. Okay, cool, all right. For issue triage, we basically had very little go on.
A: We haven't added anything to our own issue set, because we haven't had a meeting and were busy with other things for the most part. There are three new incremental compilation issues, one of which was part of the things above, right: the miscompilation with incremental compilation is one of the three new ones.
A: What are the other two? Oh, I see, it's probably #82115 and #82027. I think they were opened 23 and 25 days ago, which you would think is a long time ago, but we also haven't had a meeting in like a month, so that's actually entirely plausible. Yeah, they're both ICEs, I think.
D: It looks like one of them can maybe be closed. On #82115, Aaron commented that this is fixed in the latest nightly.
A: Yeah, I'm curious whether it's because of the thing that... yeah, I'd be a little curious to know where it actually got fixed, but...
A: Yeah, Wesley, you might... well, it's fine, I guess we can close it or leave it open. I was going to say I'd like to bisect it, but I suspect it might have to do with... it could very well have to do with the patch that Michael put in recently that I reviewed, but I don't know if I want to actually verify that. Oh, and that only landed like a couple of nights ago.
A: Yeah, it's not clear to me, when I scan over the PRs, which one it would have been.
B: It's literally just: I was running rustc at some point and I hit this. Yeah. Okay.
A
I
just
close
it
yeah
well.
Well,
I
just
maybe
first
ping
aaron
just
to
get
insight
as
to
what
pr
they
think
fixed
it.
First,
just
have
a
little
bit
of
back
trace.
Wild
still
might
be
fresh
in
aaron's
mind,
because
if
someone
asks
in
a
year
it's
not
gonna
be
they're
not
going
to
know
all
right
beyond
that,
then
there
is
the
classic
computer
she's
be
triage
question
and
yeah.
A: Once again, I want to get out of this rut of enqueuing them and not doing it, but I also don't want to set us up for failure. Let's take care of that asynchronously, maybe, and see if we can do some of this triaging of the old issues.
A: It doesn't have to happen here. David, was there anything you wanted to report, by the way? I assume not.
A: All right, no problem. Okay, all right! Thank you all for attending; we'll wrap it up here. Okay, all right.