From YouTube: OMR Compiler Architecture Meeting 20180606
Description
Compiler Architecture Meeting agenda:
* Revirtualize extensible classes [ @samasri ]
Please add any comments/questions to the GitHub agenda issue: https://github.com/eclipse/omr/issues/2571
A
Okay, so welcome everyone to this week's compiler architecture meeting. Today we have one topic to discuss: Samer, from the University of Alberta, will be talking about a proposal to change some of the ways we're doing extensible classes. We've called it revirtualization, so I'll let Samer take it away. If you need to take the screen over, just let me know, or just take it. Thanks.
C
Yeah, so our proposal today is to change the polymorphism from static to hybrid, and it is based on trying to understand what a new developer or contributor to OMR would do. As new developers get started, they want to extend OMR for their language, but they would not have a good understanding of which functions should be overridden.
C
That's one issue: in other words, there is no documentation for which functions are part of the API, or which functions are meant to be changed. The second issue we found is that the self() function the user sees is a bit confusing. The self() function is used to solve a problem inherent in static polymorphism; however, for new developers it takes time to understand exactly how and why they should use it.
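For context on the pattern being discussed: OMR's extensible classes use static polymorphism, where self() downcasts `this` to the most-derived type so that calls resolve to the project's overrides without any virtual dispatch. The following is a minimal sketch of the idea only; the class names and two-namespace layering are illustrative, not OMR's actual hierarchy.

```cpp
#include <cassert>

namespace TR { class Node; } // forward-declare the most-derived layer

namespace OMR {
// Base layer: no virtual functions; the hierarchy is connected at compile time.
class Node {
public:
    // self() downcasts `this` to the most-derived type, so a call written as
    // self()->kind() statically resolves to the project's shadowing override.
    TR::Node *self();

    int kind() { return 0; }   // base implementation
    int describedKind();       // defined below, after TR::Node is complete
};
} // namespace OMR

namespace TR {
// Project layer: "overrides" by shadowing the name, not via `virtual`.
class Node : public OMR::Node {
public:
    int kind() { return 42; }  // shadows OMR::Node::kind
};
} // namespace TR

TR::Node *OMR::Node::self() { return static_cast<TR::Node *>(this); }

// Calling through self() picks up TR::Node::kind at compile time; calling
// kind() directly here would silently use the base implementation instead.
int OMR::Node::describedKind() { return self()->kind(); }
```

Because dispatch is resolved statically, a call that forgets to go through self() silently uses the base implementation, which is exactly the newcomer confusion described above.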
C
So that's why we have our proposal to change from the static polymorphism that forces us to use self() to a more hybrid polymorphism. The proposal is actually made of two parts. The first part is to mark all functions that are expected to be overridden (in other words, API functions) with an annotation. The name of the annotation is still open; something similar to OMR_EXTENSIBLE was suggested, or OMR_API.
C
It's similar to OMR_EXTENSIBLE, which is used next to the class declaration; we would have another annotation next to the function declaration, and for now I think I'm going to call it OMR_API. That would help developers: when new language developers or client developers want to start extending OMR, they would just search for OMR_API and find all the functions that they should, or are expected to, extend.
We
realized
that
there
might
be
a
performance
issue
by
when
we
visualize
some
of
the
functions.
However,
the
hypothesis
we
have
so
far
is
that
performance
and
the
functions
that
are
expected
to
be
expanded
by
developers.
The
API
functions
are
not
are
not
used
enough
to
actually
degrade
the
performance.
However,
we
can
always
give
like,
after
changing
after
virtual
villages,
virtualizing
any
class
hierarchy,
we
can
run
benchmarks
again
and
make
sure
that
this
is
actually
the
case.
A
Okay, well, I guess I'll start with one. If we are talking about measuring performance with virtualized dispatch, the evaluation will need to include a broad set of compilers. It's not enough to say that, oh yeah, it works with GCC or whatever, because different platforms have different build compilers, and those compilers can have radically different capabilities when it comes to devirtualization or optimization.
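The concern about build compilers is concrete: whether a virtual call costs anything depends on how well the C++ compiler can devirtualize it, for example when the static type is `final` so no further override can exist. A small illustration of the code shape involved (names are illustrative):

```cpp
#include <cassert>

struct Simplifier {
    virtual ~Simplifier() = default;
    virtual int simplify(int n) { return n; }
};

// Marking the most-derived class `final` lets a modern optimizer replace
// the indirect call with a direct (possibly inlined) one, because no
// further override of simplify can exist below this type.
struct JavaSimplifier final : Simplifier {
    int simplify(int n) override { return n * 2; }
};

// A call site that an optimizer may or may not devirtualize,
// depending on what it can prove about the dynamic type of `s`.
int run(Simplifier &s, int n) { return s.simplify(n); }
```

Older build compilers do much less of this kind of devirtualization than current GCC or Clang, which is why the evaluation should span the compilers actually used on each platform.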
D
I'll say something about evaluating the performance. What we discussed earlier was that Samer will make the changes and pass the standard tests he can run with the GitHub CI, and then we'll take his change set, do all the compiles that we do internally, produce the builds, give them to the performance team, and run all the benchmarks that we normally do. I don't think there will be a huge performance impact.
A
We're not going to be stuck on GCC 4.4.7, or whatever we are on right now, forever, right? So if you were to do an experiment with that compiler and you get a certain level of performance that isn't necessarily great, you might actually get a lot better performance if it were run with something like GCC 7.3.
A
I think you want to keep the future in mind as well if you're going to be doing any kind of performance investigation, just so that you know the possibilities. If this is a solution that we're going to go with longer term, it's potentially going to be with us for a while, so we want to make sure we at least keep up with what the latest compilers are able to do with it. Yeah.
A
But I think, let's see the data first. I think the tolerance for degradation is proportional to the benefit that we perceive we're getting from it. So if there are engineering and implementation benefits and runtime costs, those things have to be traded off, and they have been traded off at various points.
A
On the other hand, I don't think we can draw a line in the sand, because you may end up with most benchmarks having no problem and one of them having a massive problem, or you may end up with a small regression that's sort of all over the place. Rather than saying we'll accept this or accept that, I think it has to be evaluated in the context of the benefits that the proposal has versus the drawbacks that it has from a performance perspective.
A
I,
don't
want
to
rule
anything
in
or
out,
I,
don't
want
to
be
unfair
and
dismissed
it
on
some
arbitrary
threshold,
because
o
it
regressed.
One
thing
that
we
see
all
these
other
benefits
that
wouldn't
be
fair.
It's
also
not
fair
to
say:
well,
it
met
all
these
thresholds,
but
it's
actually
kind
of
bad
all
over
the
place.
We're
simply
going
to
take
it.
Okay,.
A
I mean, admittedly, we did not do a great job when we were architecting some of this stuff. With the current extensible-class design, we didn't really go through and identify what we think should be the extension points of the classes, and which things we think should not be extension points.
A
Right, there's potentially not a project that's currently extending them, but it might make sense for some project that we haven't thought of yet, or that hasn't worked with us yet, to extend them. So we're going to have that problem as well, even if you're going to use an annotation.
A
That's just current performance, the way things are right now. If this is now going to be the new extension mechanism that's being proposed, we want to make sure that virtual is added to as many functions as we think are extension points in the classes, not just to however many are currently being extended.
A
Item
to
the
current
work,
I
I
would
disagree
with
that
because,
if
you
have
say,
the
class
currently
has
five
points
that
are
extended,
but
it
actually
has
15
points
that
Omar
from
a
design
perspective
has
to
allow
extension.
Then
the
10
things
that
are
not
currently
extended
would
not
be
marked
virtual
and
would
not
have
the
Omar
API
annotation.
If
you
mark
those
extra
10
things
as
virtual,
you
may
significantly
increase
the
number
of
virtual
dispatches
being
in
the
C++
code.
A
That could have a significant impact on compile time as a result of the quality of the code generated by the C++ compiler. So you need an upper bound set on the number of methods that need the virtual keyword to gauge the maximum impact of switching to a virtual inheritance model from the current static polymorphism.
A
I guess the thing is, though, Leo, that if we went from static to dynamic, there's a cost there, and my concern is that if we do the evaluation confined to the subset of the extension points that are currently overridden, you may miss a cost that you're actually going to have to incur if you're going to include a language that's radically different from Java.
D
I agree.
A
What I was getting at was: we need to understand what the extension points truly are, and that does require input, probably from all of us, to identify where they are, then make that change, and then after that re-evaluate performance. But an assessment of the performance of just the extension points that we have right now can certainly go ahead. In fact, that data would be useful anyway.
D
We may add virtual to all the API calls at that point and then do another evaluation of all the benchmarks, whatever is in the bag of performance testing that we're required to do, to evaluate what the performance is at that point. I feel like this is a necessary step in the evaluation saga, given that we're switching the programming paradigm from static to dynamic polymorphism.
D
The other question I have is this: today we have dynamic polymorphism with virtual calls in the codebase already, and we have static polymorphism with self()-invoking calls. If we went ahead and did all the modifications to virtualize, did all the performance evaluation and assessment, and it was performance-neutral, or within some threshold we're happy with, can we not commit that then, just for that set, and leave the rest? Or is the goal of this issue to implement everything in one go?
A
I
think
that
the
thing
is
we
need
to
have
some
concept
of
what
we
believe
the
upper
bound
is
now
one
way
to
get
the
upper
bound
is
to
identify
them
all
mark
them.
All
virtual.
Do
all
the
performance
right,
the
other.
The
other
way,
if
that
is
not
tractable
in
the
scope
of
the
item,
is
that
we
do
what
we
have
at
the
moment.
A
So while we may not have to get to the point where every extension point is identified, we may need to try to come up with some means of at least estimating what we think that point would be. I don't know how we do that, but I strongly believe that we need to have some idea of the upper bound.
D
I agree.
E
Another thing on assessing that: it's tons of effort if you go through every class and try to figure out what the API is. But if we use this heuristic, okay, has it been overridden somewhere, then we at least reduce the work. Instead of having to go through, say, 100 functions, we're now down to 50 that we know for sure are API extension points, and then you only need to look at the remaining 50 for a cost.
E
What I'm saying is: say a class has a hundred functions and only 50 of them have been overridden. That doesn't rule out the other 50, because they could still be extension points, but now you don't have to go through all 100 of them; you only have to go through 50. So when Samer, let's say, submits the pull request.
E
He's
only
submitted
suggestions
for
50
of
them,
so
once
there's
a
Costas
we're
trying
to
virtualize,
you
could
go
through
through
others
and
see
if
those
other
functions
are
extension
points
or
not.
I'm,
just
trying
to
say
that
using
the
quote-unquote
heuristic
I've
seen
what's
been
over
with
and
reduces
the
number
of
functions
you
need
to
look
at
as
I'm
still
API
or
not
they're,
ready
to
give
you
a
set
of
four
short
part
of
the
API.
B
For a particularly thorny one, maybe leave those out, but for the most part, try to capture everything. If we're going to do that kind of across-the-board evaluation, different platforms, different compilers, we can't be doing endless runs of permutations. Yeah, we could, but it wouldn't be fun.
B
If we see a big problem, or the signs of a big problem, we can extrapolate, based on what you think you would add if you went and really added virtualization points across the API, to a number, and decide whether we're still comfortable with that number before going ahead. But what about the other side? Let's see what they come back with.
B
What do we do then? I don't think we're necessarily able to say anything at that point.
D
They're
couple
points
to
I
guess
in
the
proposal.
If
you
look
at
the
issue,
there's
one
where
we
say:
if,
if
if
the
functions
are
intended
to
you
to
be
used
only
in
OMR,
we
keep
static
morphism.
The
other
thing
is
we
declare
this
mackerel,
whatever
it's
called
our
API
or
extension,
and
it's
possible
for
the
downstream
projects
to
override
that
that
mackerel
to
declare
a
final
or
whatever,
so
that
for
faster
as
a
dispatch.
D
We don't have a checker for that at this point; we'll need to modify the checker, I expect. Samer, I think that's the first bullet point in your proposition section. I would change that to virtualize all the calls that are overridden. Then the other point.
D
So the macro can be used for that purpose, but probably the other use is that, say, OpenJ9 wants to declare a virtual method final because it knows no one will override it anymore, so the compiler can do a quicker direct call dispatch. We can also use it for that. I think there was some discussion about using the same macro for OMR_EXTENSIBLE versus this other macro, and if we do things in stages, that can't happen.
D
And this new thing, let me just call it OMR_API: we proposed that for all the virtual calls. So initially, in OMR, we can just declare OMR_API as virtual, and then we can also use it as an annotation later on, for some tool or plugin or whatever to generate an API set for the OMR compiler. Then, later on, this opens it up so that another project, say OpenJ9 or whatever other project, can override that macro to change that virtual to virtual final.
A
I don't see how that's going to work, because OMR_API is going to be all over a bunch of stuff, and not all of those functions are going to be final at any given point in time. If you put the virtual keyword there, you're going to have to change that macro on a particular function. That's what you'd want to do, right? I know that for that particular function, I don't want it to be extended.
A
From my perspective, if we're going to have something called OMR_API and it's going to get sprinkled over methods in some way, I'm not sure how you come up with the criteria. I need a crisp one-to-two-sentence description that allows me to make a decision, as a reviewer, as to whether it is correct, and to be able to convey to people contributing to the project when they need to apply it and when they need to not apply it. And I'm not sure that I understand what that crisp description is.
D
The other exercise, I mean, is to say: when we decide on what the OMR API is, we also look at all the virtual functions we have today and ask, do they have to be there? Are they right? Today we're not confident enough to say that all the virtual functions should be virtual, and hence that they will be part of this API, because we don't have some other annotation to say what the API is.
A
That
confuses
the
hell
out
of
everybody,
so
I
think
that
we
need
to
have
some
kind
of
disciplined
way
that
this
is
supposed
to
be
used
and
a
mental
model
of
how
it's
supposed
to
work
so
that
people
doing
reviews
can
be
consistent
in
criticizing
the
presence
or
absence
of
the
annotation
right.
I
I
would
hate
for
one
reviewer
to
go
put
on
our
API
on
there
and
then
somebody
else
comes
around
with
their
pitchfork
and
says:
hell,
no
get
it
off
there.
A
There's virtual inheritance in there, but the virtual inheritance is not meant as an extension point; it's a means of recycling code between related implementations of optimizations that, from a software engineering point of view, should share code. But if you go and start replacing those methods, good luck to you: it probably isn't going to work.
A
Most
of
the
most
of
the
off
do
not
are
not
intended
to
be
extended.
If
you
need
radically
different
behavior
from
quite
a
number
of
the
are
you
need?
A
different
ought,
the
only
ones
that
have
a
notion
of
extension
really
are
like
things
like
simplifier
and
and
things
where
you
have
a
table
handlers
for
particular
off
code,
and
if
the
opcodes
are
expensable,
then
you
need
some
special
handling
for
those
or
where
the
meaning
of
an
opcode
is
possibly
refined
by
a
project,
a
write
barrier.
A
Being
an
example
where
a
write
barrier
is
a
side
effecting
right,
the
side
effect
may
be
reporting
to
the
GC.
It
may
be
doing
something
else.
The
particular
side
effect
attached
by
a
particular
implementation
to
that
may
vary
and
how
you
want
the
simplifier
to
work
with
your
write.
Barrier
may
be
different,
but
that's
a
very
constraint
like
it
already
a
table
is
a
function
table
right
because
different
examples
from
different
office
version.
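The simplifier shape being described, a table of handlers indexed by opcode where a project refines only the entries whose semantics it defines (such as a write barrier), can be sketched roughly as follows; the opcode names and handler signatures are illustrative:

```cpp
#include <cassert>

enum Opcode { OP_ADD, OP_WRTBAR, OP_COUNT };

using Handler = int (*)(int operand);

int simplifyAdd(int v)           { return v + 1; }
int simplifyWrtbarGeneric(int v) { return v; }      // unknown side effect: leave it alone
int simplifyWrtbarForGC(int v)   { return v * 10; } // project defines the side effect

// Base table: one simplification handler per opcode.
Handler handlers[OP_COUNT] = { simplifyAdd, simplifyWrtbarGeneric };

// A project refines only the entries whose meaning it defines,
// e.g. teaching the simplifier about its own write barrier.
void installProjectHandlers() { handlers[OP_WRTBAR] = simplifyWrtbarForGC; }

// Dispatch goes through the table, so the extension surface is the
// table entries rather than virtual methods spread across the class.
int simplify(Opcode op, int v) { return handlers[op](v); }
```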
B
Of
looper
say
you
have
a
method
called
version
natural
loop,
so
that
version
natural
loop
today
knows
about
certain
checks
that
are
there
in
java,
so
how
the
version
was
respected
and
it
understands
the
semantics
of
codes
in
the
context
of
that
say.
Well,
just
make
virtual
version
natural
loop
virtual
and
you
just
get
to
define
everything,
but
practically
everything
we
have
in
version.
Er
is
callable
out
of
our
natural
loop,
so
that
would
essentially
delegate
the
whole
task
of
the
versioning
to
front
for
a
new
product
which
you
don't
want.
B
You
do
want
to
help
them
with
that
fast,
but
that
may
mean
extracting
our
parts
of
what
are
currently
it.
What
is
currently
in
we're
a
natural
loop,
creating
virtual
function
here
for
those
and
putting
the
rest,
so
some
of
the
infrastructural
stuff
in
on
work
yeah,
the
sort
of
complication,
I'm
tanja
right
that
will
hit
you
across
the
board
in
many
places.
Let's
just
call
it
that
maybe
you
wanted
to
retain
some
very
simple
paper's
like
value
profit
1/3,
even
for
value
propositions.
D
The other thought, I guess: one option is to change the code to identify certain queries that can be different for this project versus that project, whether to take this action or that action, and projects can override those queries. The other thing I was thinking about is this upper bound: let's say today we're not ready.
D
But take, say, this versionNaturalLoop, for instance. I guess what I'm trying to say is, with the upper bound that we were proposing earlier: if today we decide, okay, this should not be part of the API, we don't make it virtual.
A
The optimizer does not use static polymorphism; optimizations have never used static polymorphism. The discussion here was about switching from static to dynamic polymorphism, which is my understanding of what Samer and the others are working on. If the discussion is around that, then the area of the code you have to be concerned with is the set of stuff that's currently using static polymorphism.
B
I'm just saying we're retracing our path from 10 minutes ago. Yes, there are some classes, the foundational classes, where there's a lot of static polymorphism use and where you can analyze and come to a conclusion; that's one category of classes. There's a different category, which are not foundational stuff but which are more complex in terms of extracting out an API from them, and in terms of what's currently overridden and so on.
A
I just picked those off the top of my head, but that's sort of why I also asked whether you knew, for all the extensible classes, which ones call self() the most. If you wanted to put together a prioritized list of all these fifty classes, I think the ones with the most self() calls in their functions would be the ones you want to start with. Even doing, say, the top ten.
D
I think that's it. So we have a build environment that will build on the different platforms; that's how we build our products, and that's how we'll build OpenJ9. Then we'll run those with the benchmarks that are required and see whether they satisfy some performance degradation threshold. If that goes well, then it stops there. If it doesn't go well, then we will look at using some of the more modern compilers on those platforms and see if there's improvement.
D
Yeah, I think the first step is to make the changes for those classes. I personally think the classes with the most overrides will probably be a good first cut: take the top 10 or 15, or whatever number of classes you choose to change, and then we'll do this evaluation once and see how it goes.
F
Wouldn't we be changing our startup APIs to accommodate dependency injection, and wouldn't that be a good way to control and, sort of, express exactly what our extension points are? So, you know, maybe the JIT compiler startup API takes a pointer to some kind of node factory, or your symbol factory, or something like that.
A
The symbol factory would have a virtual call that constructs some kind of concrete, language-specific implementation of node, and that is how an application would provide concrete behavior to the compiler: by constructing the dependency and then passing it into something like the JitBuilder factory.
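The dependency-injection idea amounts to this: the front end constructs a factory with virtual creation hooks and hands it to the compiler at startup, so the compiler never needs compile-time knowledge of the concrete types. A rough sketch, with all names hypothetical:

```cpp
#include <cassert>
#include <memory>

struct Node {
    virtual ~Node() = default;
    virtual int languageTag() const { return 0; }
};

// The injected dependency: one virtual hook per concrete type the
// front end wants to specialize.
struct NodeFactory {
    virtual ~NodeFactory() = default;
    virtual std::unique_ptr<Node> createNode() { return std::make_unique<Node>(); }
};

// Front-end side: supplies language-specific behavior via the hook.
struct JavaNode : Node { int languageTag() const override { return 9; } };
struct JavaNodeFactory : NodeFactory {
    std::unique_ptr<Node> createNode() override { return std::make_unique<JavaNode>(); }
};

// Compiler side: receives the factory at startup and uses only the
// abstract interface, never the concrete types.
int compileSomething(NodeFactory &factory) {
    return factory.createNode()->languageTag();
}
```

The design choice here is that virtual dispatch is confined to a handful of factory hooks supplied at startup, rather than sprinkled across every class in the hierarchy.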
A
But we stripped all that out when we open-sourced. The simplifying assumption that we made in order to get this out the door was that you're going to cook up a new compiler for whatever language, front-end, and architecture pairing you care about. But if we're going to come back a bit from that, and I'm not certain we are.
E
I was just going to say: I guess the summary of the action points is that we try those classes that were already mentioned. Once we finish those changes, we pass them on to Shelly to run on the benchmarking infrastructure that you have available, and then we basically report back those numbers and see what we think about them before doing anything else. Great.
D
And yeah, sure. Say you looked at the set of, I don't know how many classes, five hundred plus or something altogether, and out of those, however many have overridden functions, you pick the top ten. Once you have the top ten, let us know; I guess we can get it on the agenda to talk about what it is.
A
Okay,
I
guess
we'll
adjourn
thanks
very
much
summer
for
talking
through
that.
Thank
you
for
being.