From YouTube: OMR Architecture Meeting 20191114
Description
Agenda:
* Make target environment local to the compiler (#4518) [ @dsouzai ]
A: Okay, so to give some background: the motivation for this came from OpenJ9, where I mostly spend my time. It started off with the fact that in OpenJ9 we now have multiple SCCs — and actually, even before that, it has to do with the fact that people want to put an SCC in a Docker image and ship it.
A
If
you
build
the
docker
images
popular,
that's
the
see
on
my
machine
I
try
to
run
it
on
another,
depending
on
how
you
populate
the
second
one
and
try
to
run
the
back.
You
can
run
into
weird
static
compiler,
like
issues
where
the
validations
for
checking
the
processor
features
are
not
exactly
complete,
for
the
most
part
is
overly
conservative,
which
is
going
to
defeat
the
purpose
in
that
it
was
not
load
code
that
you
want
to
be
loaded.
A: Basically, the relocatable code does not target some base set of features, or whatever features are specified on the command line. When I first opened this issue, I sort of figured we may need that — it's not talked about in the issue, and in OpenJ9 now we may not necessarily need it — but I figured it was worth bringing up anyway, because if we go down this road it would help OpenJ9 as well.
A: The idea here is that the target environment in the compiler singleton is a global, and there's also a host, which is used to answer queries about where the compiler is running, versus the target, which is where the JIT'd code is targeted to run. In OpenJ9 we have the ability to generate relocatable compiles dynamically and optionally — by that I mean the AOT...
A: ...compilation actually happens in the process, and the runtime can decide if it wants to do a relocatable compile. But the problem there becomes: if the target is global, then the target processor features you're targeting have to apply to both AOT and JIT compilations. So I opened this issue to discuss in OMR whether it's feasible, and whether it makes sense, to make the target compilation-local — or at the very least to have the ability to have different targets that a particular compilation can choose from, depending on how it's set up.
B: The intent there was to begin to redesign the way that we — traditionally, we've used the front-end structure to describe interactions with the VM, interactions with the front end, or interactions with the environment, to ask certain questions. And I guess early on in the OMR days, the way that we were approaching...
B: ...this was to replace a lot of the front-end functionality with this sort of environment. That work had started and then stopped when things got really hard, and there is a lot more work that needs to get done there — the front-end interface still exists. Since that time, the thinking has changed a bit from when that work was done: to try and resurrect some of the front-end capabilities and make it more...
B: Different compilation threads can have different kinds of environments, right? You could be doing an AOT compile on one thread, for example, a normal JIT compile on another, and a JITServer compile or something like that on a third. So there are different requirements there, and the thinking was to actually go back toward a more dynamically polymorphic hierarchy. None of that has actually happened yet.
B: I am somewhat in favor of a per-compilation-thread environment, where there are questions that a compilation thread can answer that are specific to the kind of compile it's doing — so all three of those apply. It could even, at some point, be like a cross-compilation thread, where it's answering questions about a completely different target than the host that you're on.
B: Outside of that, though — and this is how the code is used, not only in OMR but also in OpenJ9 — there are questions being asked of the front end or the compilation environment, but more so the front end, outside of any compilation thread. The assumption seems to be that there is this sort of dummy front end — a headless front end — that's created that can answer those kinds of questions. So if we were going to go down the path of actually having per-...
B: ...thread environments, I think we also need to figure out what questions are being asked that could be asked outside of those per-thread compilations, and sort of have those be part of a global environment — you ask the global environment those questions, and everything else that pertains to a particular compile, you ask the one on the compilation. That's kind of the way I was thinking about it going forward, but zero work has been done on that yet.
A: The way I was looking at it, with the compiler singleton, there's both the target and also the host. The host feels like it could solve that problem: when you need to ask the global front end a question, it's probably the host that you want the answer about. Whereas if you answer a question about the target, then it would be good to have the notion that, well, some questions are always going to be true of the target — you can't really know which questions can be global.
B: I guess the question is whether or not you're going toward a recompilation approach — are you building something that would eventually support recompilation, or something that doesn't quite support it? This supports different targets, but it's not really a recompilation thing. I think your motivating example is: you have different processor features, but it's the same architecture — the same bitness and everything — but, you know, features might be enabled or disabled in certain...
B: ...or this is a compile server that can handle different kinds of targets, right? Ultimately you might need to be able to support something like that, but there's a halfway step on the way there, which is where you're targeting the same environment but you wish to selectively allow yourself to use, or not use, particular capabilities of the underlying hardware.
B: The compiler, if it answers yes to a capability query, will use it, right? If you're on Z, or you're on P and the architecture query responds POWER9 or whatever, you will get whatever exploitation we have for POWER9. There isn't an easy way to say: well, actually, you're running on nine, but what I really want you to generate is code that would run on eight — the subset of the ISA that was on the previous generation — even outside of all the bigger environment issues.
A: It isn't that bad, yeah, because if you look at the actual target objects, they're mostly plain data, and the copy construction is super simple — it's just basic fields, for the most part. So you'd make a copy in the comp object, which is constructed in place, and modify it as you want. It's actually not hard to do.
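The "copy the target and modify it" idea can be sketched roughly as below. This is a minimal illustration, not the actual OMR API: `TargetEnvironment`, `TargetCPU`, the feature-bit names, and `makeCompilationTarget` are all hypothetical, standing in for a trivially copyable target description that a compilation copy-constructs from the global singleton and then tweaks.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical feature bits; names are illustrative only.
enum FeatureBits : uint32_t {
   kFeatureTM   = 1u << 0,  // transactional memory
   kFeatureSIMD = 1u << 1,  // vector facility
};

struct TargetCPU {
   uint32_t features;
   bool supports(uint32_t f) const { return (features & f) != 0; }
   void disable(uint32_t f) { features &= ~f; }
};

// "Just basic fields": the implicit copy constructor does a field-wise copy.
struct TargetEnvironment {
   TargetCPU cpu;
};

// Global singleton describing what the process is running on.
static TargetEnvironment globalTarget = { { kFeatureTM | kFeatureSIMD } };

// A compilation makes its own copy and modifies it, e.g. a relocatable
// (AOT) compile that must not bake in a transactional-memory dependency.
TargetEnvironment makeCompilationTarget(bool relocatable) {
   TargetEnvironment local = globalTarget;  // cheap field-wise copy
   if (relocatable)
      local.cpu.disable(kFeatureTM);
   return local;
}
```

The point of the sketch is that because the copy is trivial, a per-compilation target costs almost nothing, and the global singleton is left untouched for other threads.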
A: And it's not bad because, for the most part, other parts of the code already have access to the comp — the compilation object — so it wouldn't make much sense to keep it anywhere else. It would just be, like, a couple of hours of work; I just didn't want to go down that road and then realize it's not what we want to do, I think.
B: Most places you can get access to the compilation object — when the compiler is compiling, you can get at the compilation object through various means. The tricky parts, at least where I've seen them, are actually in, like, the AOT runtime, where it's not obvious — at least to those that don't go through that code very often — whether you actually can derive the compilation object from various things.
B: There is some stuff that's done — this is an OpenJ9-specific thing — there are things that are done via the patching runtime, or the sampling threads, or things that get called by the various hooks, like for class loading and class unloading: all that kind of stuff that actually gets invoked through the hook mechanism.
A: That's sort of why this one was targeting specifically CPU features, because when you start getting into that broader question, it's harder. Where our CPU features are concerned, usually the things that query outside a compilation — the hook-driven stuff — don't ask about processor features, right? So it's a smaller step to take. The bigger question becomes, you know — like what you're saying — whether you want to go down that road, or something else that's front-end-like.
B: But isn't what he's doing — wouldn't that actually be used by the compilation environment? Like, OMR would ask the port library to give it the information, as opposed to deriving it itself. You'd like the feature detection to live in the port library, and then cache the results in a compiler target and query that.
B: I don't think it has to. If we're going to go to the model where you're able to turn features on and off, there shouldn't be a restriction that I can only turn off stuff that the CPU has. There's one way of generating the target, which is to ask the port lib to populate the thing from the processor, and the other one is to stand back and say: I'm going to take care of this — because you may wish to generate code that is not even going to run on the current target, right?
B: You may wish to have the x86 codegen emit, you know, TSX instructions on a processor that doesn't have the TSX extension enabled. There's nothing wrong with having the compiler do that — it's just that if you run the code, it'll fall over. But you can do that with GCC, right? I could be sitting on a Pentium 3 and tell it to generate AVX2, and it will.
A: For context, up until last year we've supported four compile targets on Z, and we are able to turn on features which the processor family supports. That's been hugely useful when you're implementing new processor features and you need to debug on a per-compile basis. So if you have bugs in a certain file: I'll turn off the feature on a certain compile, rerun, and figure out whether...
B: Right, yes. Well, I mean, sure, we can have one that is populated and then modify it, but at the end of the day, if it's going to be abstracted, and you're going to have the equivalent of -march flags or -f feature flags — saying turn on SSE2, or turn it off — then you're going to need to be able to just sort of cook one up out of nowhere.
A: I think both of these can be done in the same way: you query the port lib, and you either remove features or add features. But if it can be done in the same way, then in that case, why do we need to query the port lib in the first place? If we've said what features we have, we don't need to go to the port lib at all, right?
B: For the arch levels — if you have arch levels, right, like what GCC has — that is essentially a statically built bit pattern: this is what the CPU does. For those you don't need to call the port lib; it doesn't matter what you're running on, and there's no reason to ask the CPU — you just copy-construct from that bit pattern. Or: port lib...
B: ...get me the real bit pattern, put it in here, and I'm going to dork with it. It's kind of six of one, half a dozen of the other, but it needs to support both modes. That works on x86, where I can get handed back the results of the CPUID. What do we have on Power, say — what comes built in?
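The two construction modes being contrasted — a statically built arch-level bit pattern versus asking the port library for the real bits — can be sketched as follows. Everything here is hypothetical scaffolding (`CPUDescription`, `ArchLevel`, the bit values, and the stubbed detection routine), not OMR or port-library API.

```cpp
#include <cassert>
#include <cstdint>

// A CPU description reduced to a single feature bitmask for illustration.
struct CPUDescription {
   uint64_t featureBits;
};

// Mode 1: named arch levels as prebuilt bit patterns, like GCC's -march=.
// No need to ask the machine — the pattern is fixed at build time.
enum ArchLevel { kBaseline, kModern };

CPUDescription cpuForArchLevel(ArchLevel level) {
   switch (level) {
      case kModern:   return { 0xFFull };  // illustrative pattern
      case kBaseline:
      default:        return { 0x0Full };
   }
}

// Mode 2: runtime detection through a port-library hook (stubbed here;
// a real implementation would issue CPUID on x86, etc.).
CPUDescription detectHostCPU() {
   return { 0x3Full };
}

// The target constructor supports both modes.
CPUDescription buildTarget(bool useArchLevel, ArchLevel level) {
   return useArchLevel ? cpuForArchLevel(level) : detectHostCPU();
}
```

The design point from the discussion: both paths must exist, because a static pattern answers "what does this named generation do" without caring what you're running on, while the detected pattern is a starting point you may then modify.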
B: AArch64 and RISC-V are both much more modular ISAs, even more than x86, right? They are highly capabilities-driven, and they're going to have big vectors of: I have this core, and I have this, and I have that; I don't have this, and I don't have that. You're going to need to toggle all that stuff on and off and make sure that you can compile for whatever target you're actually targeting.
A: The moment I saw it, I did think I saw a bug there — oh, it's not a bug so much as there's no virtual patching.

B: Well, it depends what you're doing with that information, again, but yeah.

A: I was searching for the TR compiler target, you know, and there weren't great responses.
A
There
were
some
that
we're
in
this
was
target.
Who
was
going
to
CP
used.
It
was
asking
I
think
the
past
equality
probably
doesn't
mean
to
know
something
and
the.
B: See if we can ask — or if there's some way that you could prevent it. If you have a model where you have multiple compilation threads that could be generating code for different targets, what does it mean to ask the global compilation environment for the target? What answer would you expect to get back? And it sounds like we do have places where we ask that question right now.
B: For those ones — if the first step on this evolution is just to get that tactical level going, then I don't personally see why we would impose a higher overhead on Irwin, or whoever from OpenJ9 wants to contribute the first step of the refactoring, so that they can get what they want out of it, right? It's one step...
A: ...closer to what we would like. So it looks like that leaves an open item, and there are a few places — like options preprocessing, where we'll do stuff like prefer-TLH-prefetch depending on the CPU, the target CPU; the JIT hooks and initialization, where we set the target; and another one where, you know, heuristic counts vary depending on the CPU. They seem to be heuristic decisions that we're making. I would imagine this...
A: ...count at zero — you know, heuristic decisions about whether the target supports something or not. It's also very coarse-grained; it's not doing anything like feature detection at that point, and it's definitely not being asked outside — although, of course, there is one: floating point, over in the feature section.
B: If we're allowing the target — the CPU — to be overridden, we're potentially allowing somebody to create code that isn't going to run in the process, right? So whatever the actual JIT and its associated runtime chooses to do in the context of the current process — if you've overridden the target for a compile, that shouldn't really change those decisions. Because if you're going to remove capabilities, well, sure, you might have prioritized compiling something that you can't run on the hardware...
B: ...but we usually have a fallback path for all that kind of stuff when it's capability-driven — there usually was one added at some point, unless somebody deleted the support. And then you have the other kinds of queries, which are really about what should I get running first, or what should I prioritize on this VM — in which case the kinds of things you're talking about dorking around with...
B: Don't get me wrong, I completely agree, and there's a huge, thorny software engineering and architecture problem around how to disentangle all that properly. But baby steps, right? There's a strategic question here about how we pivot all of this stuff so that it all makes more sense, but there's also a tactical problem, and I guess my point is: let's focus on the tactical problem, in the hope that it gets us a little bit closer to seeing the solution to the strategic problems.
B: I mean, that's potentially a global options question, but different compilations can do different things anyway, and perhaps it shouldn't even be set on the global options — it should actually be on the current compilation's options, right? So maybe that's the wrong place for the read, mostly.
D: We could imagine much further than this, yeah. But so the thing is, we could imagine, say, you have a serverless application that's built with JIT-as-a-service, and you have something that is running on, like, an IoT device that's running some version of Arm. Then at some point you decide that you want to, I don't know, deploy a hardware upgrade, and now all of your devices have this extra encryption unit or whatever — and now you want your server to start generating code that takes advantage of that.
A: The platform one — this is the AArch64 one; there's one per architecture (some underscore-prefixed helper, I think), and there's one for X as well — doesn't query the environment. It just knows. OK, so here are the AArch64 queries: the environment says, you know, target.cpu.isTargetWithinConditionalBranchImmediateRange.
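A query like the one named here can be illustrated for AArch64, where a conditional branch (B.cond) encodes a 19-bit signed word offset, giving a reachable displacement of roughly ±1 MiB from the branch. The function below is written from scratch to show the shape of such a per-architecture query — the name mirrors the transcript, but the body is an illustrative sketch, not OMR's implementation.

```cpp
#include <cassert>
#include <cstdint>

// AArch64 B.cond: imm19 word offset => byte displacement in
// [-2^20, 2^20 - 4], and the displacement must be 4-byte aligned.
bool isTargetWithinConditionalBranchImmediateRange(intptr_t branchAddr,
                                                   intptr_t targetAddr) {
   intptr_t distance = targetAddr - branchAddr;
   if (distance & 3)            // instructions are word-aligned
      return false;
   const intptr_t kMin = -(intptr_t{1} << 20);      // -1 MiB
   const intptr_t kMax =  (intptr_t{1} << 20) - 4;  // +1 MiB - 4
   return distance >= kMin && distance <= kMax;
}
```

A codegen would consult a query like this to decide between emitting a direct conditional branch and a longer branch sequence through a trampoline.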
B: I think the answer to that question is partly how much work you're going to have to do, and how much work you're going to contribute to the project in doing it. If the thing that you need is primarily the CPU, and you cannot, for whatever reason, provide the time to do the whole target — because it's a much bigger thing, or there's some really ugly thing about it that means deeper work — I...
B
Don't
think
that
the
project
should
reject
outright
the
possibility
of
you
moving
cpu
as
step
one
of
moving
the
target,
but
if
you
can
do
the
target
as
a
whole,
then
that
would
probably
be
cleaner
and
nicer
and
probably
be
less
confusing
for
people.
So
if
you
can
do
that,
that
would
be
fantastic,
A: When I started on it and went into it, it didn't seem like it was that much harder. I never hit — well, I mean, I did hit this one thing. I think this is one of those judgment calls where I stopped: whether you guys are at the point where we can remove it. It's one of those things where, rather than bring it up in a PR, I'd open an issue and just say: here's what's remaining, what do you guys think about all these places — and then act accordingly.
B
Is
a
problem
with
that
right
now
some
of
them
may
end
up
in
the
category
of
it's,
not
really
the
target
or
the
host.
It's
the
thing
that
you
actually
are
introspecting.
What
the
hell
was
that,
in
which
case
the
bits
or
whatever
it
is
that
we're
using
to
store
this
representation
may
need
to
end
up
in
the
persistent
method
body
in
those.
B: It may even make sense to think about putting those bits into the method info — it's, like, you know, a few words. Yeah, exactly — that's probably justifiable for the sake of everybody's sanity. If these can really be polymorphic, and you're really going to have AOT methods in the cache, or whatever, that are configured for a particular target — then OK, maybe the header of the cache knows something about what the overall scheme should be, but at least knowing exactly what the compiler was running with wouldn't actually be that bad.
B
A
A
A
B
No
II
think
he
said
start
from
scratch
and
I
think
that
this
is
there
have
been
so
many
inconsistencies
introduced
in
the
code
base.
Since
we
had
used
this
probably
literally
15
years
ago,
was
a
lost
time.
It
was
properly
used
that
it
probably
just
makes
sense
to
start
from
scratch
and
think
about
every
place
where
use
host.
Where
you
use
target,
where
you
use
human
prj,
where
you
don't
use
you
can
you
in
prj.
B: Yeah, so related to this in some way — and I want to open an issue for this, or maybe it means a PR — there's an object model query called sizeOfReferenceAddress, and the intent of that API is to answer: what is the size of a reference on this target? The answer, for example, should be eight on 64-bit — unless you're using compressed references, in which case it should be four. But I think it's always returning eight on 64-bit, and people are using it...
B: ...in contexts where you're not asking about a reference — you're just asking about the size of a pointer on this target. So it's being used incorrectly, and there are a gazillion of these queries all over the place. In some cases I'd suggest sizeof(uintptr_t), as opposed to asking the object model what the size of the reference address is. So there's some serious consistency work, I think, that needs to get cleaned up there — but it's kind of related to your project, to this thing. So.
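The distinction being drawn — heap-reference size under compressed references versus native pointer size — can be sketched like this. The `ObjectModel` struct here is illustrative, standing in for OpenJ9's object model; only the query name `sizeOfReferenceAddress` comes from the discussion.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Stand-in for the object model: what sizeOfReferenceAddress is
// *intended* to answer — the size of an object reference in the heap.
struct ObjectModel {
   bool compressedRefs;
   size_t sizeOfReferenceAddress() const {
      // 4 bytes on 64-bit with compressed references, else pointer-sized.
      return compressedRefs ? 4 : sizeof(uintptr_t);
   }
};

// Callers asking "how big is a native pointer on this target?" should
// not go through the object model at all:
size_t sizeOfNativePointer() { return sizeof(uintptr_t); }
```

The consistency cleanup described is essentially sorting every call site into one of these two buckets: references in the heap go through the object model, native pointers use `sizeof(uintptr_t)` directly.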