From YouTube: Lang Team Design Meeting 2020.03.26: Sealed Rust
C: So many of these devices that you're shipping on, for example avionics, might honestly have a service lifetime of 30 to 40 years, where you're expected to still push updates to them, still maintain the toolchains that are being used by them, and things like that. So there's typically a couple of requirements, and different industries care more and less about different parts of these guarantees, but they're looking for a specific set of requirements that doesn't necessarily align with what your average user of a programming language needs, but is still very important to these niche industries.
C: So, pretty much as long as I've been using Rust, my hope was that Rust could be a better language for these areas, for the many various reasons why I feel Rust is a better foundation for that. I probably don't need to talk too much to this crowd about that, but basically there's a lot of work necessary to have a language that is going to be both.
C: Well, let me just say the goal is: while I want this to happen, while I want safety-critical industries to be able to choose Rust, I want to make sure that we're not placing an undue burden on open-source contributors, volunteers, things like that, who, if they're not in a safety-critical industry, have no particular interest in this and could see it as particularly high overhead.
C: So definitely having a full entity rather than an open-source project is a must, and I'd say generally, unless it was a very large consortium of very major industry players, having an actual company at the other side of the table is also a particular plus, I'd say, for most of the potential customers, most of the people interested in using this that we've talked to.
D: Some of the things that you mention here and in the blog post, they are pretty well aligned with goals of the open-source team as well, right? I mean, we've wanted, at least I have wanted, and I know Centril has wanted, a proper spec for parts or all of Rust eventually, obviously, for quite a while. So if that's part of this work, that's something that would not just not be a burden, it would be very welcome, I guess, to have something like this, yeah.
C: I definitely think there's a lot of alignment. There are some things like the specification, there are things like the test suite that would shake out of validating that specification against any implementation, whether it's rustc or any other implementation, or rustc plus LLVM versus rustc plus Cranelift, or any other combination of those things. The overhead that I'm talking about is less about writing the initial spec and more about doing the full validation of that spec, as well as doing all the paperwork that's associated with that.
C: For this effort, I see Rust as being best positioned. But yeah, all the paperwork and certification work that goes into that is definitely the long tail of effort that provides very little value to the open-source project, other than perhaps building confidence in the open-source project, but is fundamentally required for people who are in a safety-critical industry.
C: Yeah, definitely. I think there are a couple of hard prerequisites that are very aligned with the language. That's things like writing the specification; that's things like validating the specification, at least on the front-end level. There are probably some things that are going to need to shake out in terms of specification of core libraries, or even some tooling, like how rustc or Cargo work, or some subset of rustc or Cargo, that is basically just enough functionality for what a safety-critical user would use.
D: So this question may be naive: I'm an academic, I have no background in safety-critical, so I have ideas in my head of what it means when you say specification, but it's probably not the definition you have in mind, which is probably something much more practically minded.
C: So it's interesting, because the safety-critical specifications spend a lot of time talking about what your application, what the actual system that you're shipping, needs to do. Their actual guidance on what you need to do to prove things... typically this is called tool qualification: any tool that's used as part of the engineering process, whether that's a unit-testing framework or a compiler, a linker, a static analyzer, anything that takes in an engineering action and puts out a program, is a tool that needs to be qualified to some level.
C: Their specification for what needs to be done for a tool is actually super, super vague, but essentially, I summarized it in the last post as: you need to have, at an engineering level, a level of due diligence performed, and a level of confidence that the tool you're using will perform to the level that is necessary. So historically, languages like C and C++, and even Ada I believe, have used a much more prose-driven specification; ISO C or ISO C++ is very much like a human legal document of bullet points and some examples and things like that. What we have in mind for Sealed Rust, and I will admit a fair amount of the Sealed Rust planning is fairly open-ended, where we're kind of both figuring it out and learning from people as we go: I see the human prose-driven specification as objectively less valuable than having something that can be at least partially or entirely machine-driven, whether that's some kind of formal grammar, whether it's some kind of proof.
C: I don't think that we will pragmatically get to the point where we're able to write an entire proof for the Rust language. I think for certain parts of it, like the borrow checker, the type system, components of the language that are possible to isolate and formally prove, having that does build confidence, and we can use that as a composite specification language, where we have some parts that are formally proven, we have some parts that are machine-specified, and, for the really hard parts...
C: Maybe we have some parts that are prose-specified, those kinds of things. But the more we can push toward machine-checkable or formally verifiable parts, the better; I think there is understanding, both from the safety-critical and the academic community, that these are highly valuable. But in terms of how that scales to an entire compiler, formally specifying every single part of rustc would be a combinatorial explosion of difficulty, in my opinion. But, I think, what I was going to say...
C: I would be very honest: the goal is to be as pragmatic as possible, and to lean on whatever academic research we can, and to lean on whatever best-of-breed tooling exists today. I'm certainly probably not the right person to be talking about exactly this; I'm certainly not a compiler engineer, I'm certainly not a language engineer, I'm very much more a systems person.
C: But the hope is to bring in people who are working on the Rust language, people who are working on qualified compilers, and kind of bring them all under the Sealed Rust umbrella, and find the most pragmatic approach, both in terms of time to qualification and in terms of what brings the most value and has the least impact on the Rust language in general. If that even makes sense.
D: I think even in the operational semantics there are large fragments that wouldn't be too difficult to specify. There are some really nasty corners, like integer-pointer casts and stuff like that, but there are large parts of the language where a precise specification shouldn't be too complicated. I mean, I have a whole bunch of that stuff in my head; it's just that I'm not sure exactly where to write it down, and also how to justify spending a month on doing that.
C: So I've kind of always billed Sealed Rust as a subset of the stable language. So, while I don't want to fork or diverge from Rust's goals, there are certainly some things that we could just totally refuse to do to make our lives way easier. For example, we have a chance to say: okay, cool, `as` is not allowed to be used at all, or something like that. You know, there are portions of the language we could throw overboard and just say: yeah, those are rough edges, we just won't have them in safety-critical.
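By way of illustration only (nothing here was settled in the meeting), a subset that bans `as` casts would push users toward the standard library's fallible conversions instead:

```rust
use std::convert::TryFrom;

// A lossy `as` cast silently truncates: `300_i32 as u8` yields 44.
// A subset that forbids `as` forces the fallible conversion instead,
// turning the truncation into an explicit, handleable error.
fn to_u8(value: i32) -> Result<u8, std::num::TryFromIntError> {
    u8::try_from(value)
}
```

`to_u8(300)` returns an `Err` rather than a silently wrapped value, which is exactly the kind of behavior a safety-critical subset wants to make impossible to miss.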
C: That's actually, at the end of the day, all MISRA C is: you take the C specification, which is theoretically well specified, and things like that, and MISRA C constrains it down to an even more limited set, where it says you must only do these things, or must not do these things. My hope, ideally, is that Sealed Rust is the MISRA Rust, in that there is no additional validation, additional tool, or specification you have to buy.
E: It's very easy to see the places where we are adding the gating for unstable features, but doing it for things that are already stable might be a bit more difficult, might require more checks, because normally you are starting from a setting where you are removing the unstable things and then adding them back in; gating like this might require additional checks, so the mechanism isn't quite the same, yeah.
C: Pragmatically, I'm really not sure how to achieve that totally, but that's kind of the Rust-subset metaphor that I've been going for, and it does seem to help explain to people what we're going for: that you can have stable Rust, but if you then switch to Sealed Rust, some of the features that you're used to may not be available.
E: From a technical perspective, you would like to use some sort of "sealed" attribute or something in the standard library, and a list of features that are sealed, and if it's not in that list, then you could put on some gating, like: if you are on the sealed channel, then this is unstable, something like that. So it would be like the unstable channel, but with the difference that the features and the gating are in different places, for things that are already in stable. That's what I mean, yeah.
C: And the process that I see for that is basically: before you apply the sealed attribute to something, you need to have specified the behavior, so you have a specification for that component and you have the validation component for it. So basically that's the step of taking something from stable to sealed: you've done the specification work. So if, at any point, it's a really small feature and the specification and validation for it end up being fairly small, that's something that could be PR'd and then included in the next sealed release.
C: But, you know, basically that sets the barrier: anything we want to include needs to opt into this level of specification and opt into this level of validation, essentially. So it can be a pay-as-you-go model, but yeah, we'll have to figure out what the minimum is, like: how little of rustc can we get away with sealing and have it still be a useful compiler for people in safety-critical?
D: Yeah, I mean, I guess there's a risk here that, I don't know, let's assume NLL was sealed but Polonius isn't. Depending on how things land, this could mean that I would have to keep living with NLL for three more years while all stable channels already use Polonius, because it takes three years to specify and validate.
C: An edition every three years; we'd have the Sealed Rust release offset 18 months from that, basically enough time to break in the new release and then have a Sealed Rust release in there, and essentially really only aim for one major release per edition, so the channel certainly doesn't lag as long. In terms of... so let me answer a couple of things real quick. In terms of the specification: the specification definitely should not get down to the implementation level, for example, from the outside perspective of the language, the behavior of NLL versus Polonius.
C: Basically, if we specify the behavior of borrow checking in a certain way, and Polonius just allows us to be more permissive within those bounds, then we're refining that. For the actual qualification, yes, you may need to have coverage and testing all the way down through NLL, but it doesn't necessarily mean that at the language level we need to specify how NLL works, because that's an implementation detail of the compiler. So there is some ability to keep the specification at a higher level than that; it's basically the difference between implementation details and non-implementation-details. In terms of long-term support: that's definitely something I see being done by the folks that are being funded through the Sealed Rust channel. So, very honestly, the qualified compiler is going to be a fairly high-priced item, at least the one that comes with the sticker and the paperwork.
C: So the idea is to drive LTS and all of this initial work for sealing through the funding that we would get for that, and, depending on how this funding happens, anything that benefits the upstream project, like the specification, like the validation tests, would be part of the open-source project. There's certain paperwork and things like that...
C
That
would
not
be
part
of
the
open-source
project
just
because
you
can't
really
have
an
open-source
qualified
compiler,
the
way
that
the
regulations
are
written
today,
you
need
kind
of
one
responsible
entity,
that's
done
the
qualification
and
is
basically
carrying
the
paperwork
that
you
do.
But
that
being
said,
but
there's
no
reason
that
the
specification
itself
and
like
the
compiler
validation
of
that
specification,
can't
live
in
an
open-source
project,
but
the
final
paperwork
that
says.
Yes,
we
have
done
this.
We
reviewed
this
as
engineers.
C: That is absolutely the goal, for both academic and quality reasons. My goal, and I'm going to say my goal: if we had all the money in the world, or we got all the funding from all the governments or all the companies and they just threw it at us, and we didn't have to worry about making money, the goal would be to have all of this live in the upstream Rust project, because it gives the chance for people like academics to do serious research on what a qualified compiler means and ideally continuously improve...
C
What
best-in-class
qualified
compiler
meets.
I'll
go
the
other
way,
speaking
pragmatically,
depending
on
how
we
can
get
funding
for
this.
Basically,
there's
differing
levels
of
what
do
we
have
to
keep
to
be
able
to
sell
whether
that
means
just
the
end
sticker
and
paperwork,
whether
that
means
portions
of
the
validation
sweet,
whether
that
means
some
static
analysis
tools
that
go
along
with
it,
basically
to
be
able
to
pay
people
to
be
able
to
do
this
work.
C
We
have
to
sell
something
and
we
have
to
figure
out
what
companies
are
willing
to
do,
whether
that
means
just
joining
a
consortium
which
means
they're
paying
some
money
into
this
to
be
able
to
use
it,
so
they
can
get
the
sticker
at
the
end
or
whether
it
means
that
really
like.
We
need
to
be
more
selling
the
bigger
parts
of
it,
but
essentially
non-negotiable.
For
me,
for
be
able
to
be
part
of
this
is
that
the
design
goals
and-
and
the
specification
almost
certainly
has
to
be
open,
source
and
part
of
the
compiler.
C
The
kind
of
middle-ground
I
could
see
getting
going
back
and
forth
are
things
like
the
validation,
sweets
and
things
like
that,
or
at
least
one
reviewed
quality,
controlled
validation.
Suite
I'd
like
that
to
be
in
the
upstream
Russ
project.
It
is
traditionally
very
much
a
part
of
a
sold
qualified
compiler
set
like
if
you
look
at
what
solid
sands
sells
in
their
super
test,
suite,
which
is
for
actually
clanging
in
GCC
C
and
C++,
compilers,
they're,
essentially
selling
at
a
compiler
test
suite
and
that's
what
they
sell
to
companies
to
get
them
able
to
use.
C
Gcc
or
clang
for
certain
safety,
critical
areas
yeah
these
kind
of
things.
These
are
the
details
that
were
not
super
clear
on
we're.
Certainly
aiming
for
I
think
that
it
makes
the
most
sense
to
have
everything
in
an
open
source
and
be
part
of
an
open
standard
for
rust
and
things
like
that,
but
we're
really
figuring
out
what
it
would
take
to
get
this
project
done.
If
that
makes
sense,
so
actually
gives
me
a.
B: ...a question I wanted to ask for a little while now: what are the models? You listed out the languages for which qualified tools are available, you said C and C++; are there other languages that were invented, you know, I don't know when, I was going to say after the 80s, but I think it predates that, so let's say the 90s to be safe? What are the models, basically, for new languages or new tooling getting qualified? What did you look at?
C: In terms of languages, I'm not familiar with any languages that have been qualified in recent history. There have been dialects of C and C++ that have been qualified in recent history. Basically, in the 80s and 90s there were a couple of languages that were qualified to different levels; since then, however, the industry has more or less homogenized on C and C++, or mostly C, and Ada. Beyond those two, there's a handful of others, and there's some tooling.
C: There are vendors we are talking to who have a qualified C, C++, or Ada back end; the idea is to have a very short hop between Rust's IR and their current existing compiler IR, which looks nothing like LLVM IR but is still a compiler IR, allowing them to shorten their last mile of qualification tooling. Basically, if the Rust side is concerned with qualifying Rust as a front end and a language specification, they only have to qualify a very small IR-to-IR transformation and can then use their existing qualified IR back-end generator.
C
That
is
qualified
for
things
like
that
and
to
be
honest,
C
is
a
lot
of
people
have
been
using.
C
adds
that
I
are
like
instead
of
LLVM,
IR
or
Mir,
or
something
like
that.
They've
essentially
been
using
C
and
IR,
which
is
kind
of
a
weird
thing,
but
is
that's
kind
of
like
the
8010
hack
for
a
lot
of
compiler
engineers
where
they
go?
Ok
well,
I'll
just
figure
out
how
to
make
my
Python
tool
spit
out,
see
and
then
yeah
I
get
that
passed
by
a
regulator
and
they
go.
C: Yeah, where at least C has a specification, where LLVM IR doesn't. My hope, or my personal belief, is that that's kind of a gross hack that does work, but is suboptimal. But it passes the sniff test, because at the end of the day, for most of this stuff, getting it through a regulator, your job is to prove to them that you've done due diligence and to not surprise them, because any time they're surprised, they're going to be very skeptical.
C: Although now we're getting a little bit off track, in terms of what I think the industry should be doing versus what they are doing; but at the end of the day, I do think that we can be doing better, and that just degrading down to C is not necessarily the most pragmatic or the most valuable thing that we can be doing here... How much, sorry?
C: So the path to shipping a qualified product is a multi-layer item, in that you need to have done your development due diligence, and all of the tools that you've used need to be of proper quality. At the end of the day, no one expects that you could put garbage application code in and, just because you ran it through a qualified compiler, it will work; it is still necessary to state requirements and do functional testing on the source code that you use.
C: Typically, you have a composite validation of an application, where you have unit or module testing, you have some level of integration testing, and you have some level of system testing, as well as peer review of your requirements, the source code, the tests that you use to validate them, and the end by-products. So, theoretically, there are multiple approaches to this, and the use of a qualified compiler is only one facet of your engineering development plan, essentially, or your qualification plan.
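As a minimal illustration of the layering described here (module-level versus composed system-level checks), with the function, the requirements in the comments, and the tests all invented for the example:

```rust
// Module under test: each comment ties a branch back to a (made-up)
// written requirement, the way a qualification plan traces code to spec.
fn saturating_percent(part: u64, whole: u64) -> u64 {
    if whole == 0 {
        return 0; // requirement: division by zero must not panic
    }
    part.saturating_mul(100) / whole // requirement: overflow must not panic
}

// "System-level" composition of the unit above; an integration test
// would exercise this function rather than the unit in isolation.
fn system_report(readings: &[(u64, u64)]) -> Vec<u64> {
    readings.iter().map(|&(p, w)| saturating_percent(p, w)).collect()
}
```

In the scheme C describes, the unit tests, the integration tests, and the peer review of both are separate layers of evidence; the qualified compiler only guarantees that this source is translated faithfully, not that it meets its requirements.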
E: Backtracking a bit to things we talked about earlier: the one thing I didn't fully follow was the NLL versus Polonius bit. So, when you say implementation detail: for example, we would expect the compiler, under NLL, to reject some programs, and once we've moved to Polonius, those programs are now accepted but previously were rejected.
C
All
of
these
things
is
not
what
you
should
specify,
because
then
you've
over
constrained
your
system
in
that
changing
the
system
becomes
a
breaking
change.
It's
kind
of
like
the
difference
between
API
level,
stability
guarantees
and
under
the
hood
stability
guarantees
like
I'll
use,
the
like
Ross
structure
layout
as
an
example
in
that
like
like
you,
can
specify
that
certain
types
will
exist
in
there,
but
they're
ordering
may
not
be
specified
in
that.
C
Where
maybe
that's
something
that
you
added
at
stage
two
where
you
say
before,
we
didn't
specify
anything
about
alignment,
but
in
generation
2
we
do
specify
things
about
alignment,
so
a
level
two
or
like
the
second
generation
of
specification
is
always
compliant
with
the
original
specification,
because
the
original
specification
was
more
vague,
but
there
may
have
been
items
that
were
compliant
with
the
first
generation
of
specification,
but
are
not
in
in
the
second
iteration.
If
that
makes.
C: Yes, there may certainly be test cases that you write to validate the specification that will become incorrect by switching from NLL to Polonius, because maybe you said "it will reject this", but now we've expanded the definition to be more permissive, as long as we can prove that the more permissive one is still correct. So there's still refinement that can happen, and yes, we will likely have to revise our test cases, but the hope is that the specification of borrowing, for example, or lifetimes rather, will be refined, rather than it being a breaking change.
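For context on the NLL-versus-Polonius difference being discussed, a sketch of the well-known get-or-insert pattern (function and names invented; which exact programs Polonius will accept is not something this meeting settles):

```rust
use std::collections::HashMap;

// The "obvious" version of this function returns the borrow from
// `map.get(&key)` directly in an early-return branch; NLL rejects
// that, while Polonius is expected to accept it. The rewrite below,
// which looks up the key twice, compiles under NLL today, so a spec
// written against NLL's behavior would only be *refined*, not broken,
// by the more permissive checker.
fn get_or_insert(map: &mut HashMap<u32, String>, key: u32) -> &String {
    if !map.contains_key(&key) {
        map.insert(key, String::from("default"));
    }
    map.get(&key).unwrap()
}
```

This is the shape of test case C mentions: one asserting "this program is rejected" would have to be revised, while every "this program is accepted" case keeps passing.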
C: If there were breaking syntax changes, like if we had pulled in a keyword or something like that, I don't see that being used significantly differently than editions are used in stable, other than, yeah, I plan to keep it pretty coupled to the stable editions' definition, or the stable train's edition definition, if that makes sense, yeah.
C: I don't have any particular... it really was timeline-based, having a version that doesn't change too, too often. And, theoretically, I could see some customers being interested in having, like: you know, "we qualified this library in Sealed Rust 2021", and we want the same thing that stable Rust has, where you can specify that the compiler will compile this crate in the 2021 edition, and it'll...
C
Compile
this
written
in
2024
edition
and
I
want
them
to
still
seamlessly
interact,
and
it's
it's
especially
important
for
safety,
critical
people,
because
if
they've
done
all
this
effort
to
validate
a
library,
they
literally
do
not
want
to
touch
that
library
like
no
changes
to
that
code.
So
if
they
can
keep
that
code
totally
unchanged,
then
they're
they're
gonna
be
pretty
happy
about
that
as
well.
Right.
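The stable-Rust mechanism being referenced is the per-crate `edition` key in Cargo.toml, which lets crates on different editions link together (the crate names below are invented, and a "sealed" variant of this is entirely hypothetical):

```toml
# validated-lib/Cargo.toml: qualified once, then frozen
[package]
name = "validated-lib"
version = "1.0.0"
edition = "2021"

# app/Cargo.toml: newer code can still depend on the frozen crate
[package]
name = "app"
version = "0.1.0"
edition = "2024"

[dependencies]
validated-lib = { path = "../validated-lib" }
```

Because editions are a per-crate property, `app` can move forward while `validated-lib`'s source stays byte-for-byte unchanged, which is the property the safety-critical users here care about.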
B: I think that's true, of course, but you could also imagine that we certify it, at least from the point of view of the test suite and so on: you can say the behavior of 2021 code is not affected by Sealed Rust 2024 in any way that we can see, and therefore, if you haven't changed your 2021 code, it's at least an easier path to get it in, right? If not...
C: This is all true, except that maybe the 2024 compiler will still throw a compile error for valid 2021 code if it's really a soundness problem or something like that. But yeah, like I said, I kind of want to punt on this, too; we've got enough problems for the next three years. But I definitely think this will be one of the first questions when we start going to the second Sealed Rust edition, or things like that.
C: You were cutting out a little bit there, but I think I got the gist of that. So, interestingly enough, most qualified compilers don't start life as optimizing compilers. Usually they start from a non-optimizing standpoint, where the goal of the compiler is to be as simple and as verifiable as possible, and then you slowly start adding provably valid optimizations.
C: Could be. I'll answer somewhat high-level: "qualified" is not one goal. There's a big difference even within one industry; there are typically different levels. Like, when you're in an airplane, the coffee machine is still qualified, but it's qualified to a much lower level than the flight control system, and even your weather radar is kind of in the middle of those two. I don't think rustc is likely to be qualified to the highest level, because rustc was not designed with qualification in mind.
C
I
would
say,
and-
and
things
like,
the
language
specification
are
not
going
to
be
different.
However,
I
see
if
you,
if
rust,
wants
to
move
into
the
very
strictly
safety
critical
areas
like
the
highest
level
of
avionics
and
things
like
that,
it's
likely
Russy
would
need
to
be
rewritten
or
a
new
version
would
need
to
be
rewritten.
That
is
not
optimising
base
that
is
willing
to
make
a
lot
of
performance
and
even
complexity.
Trade-Offs
in
terms
of
simplicity.
C
Just
so
that
it
is
much
easier
to
qualify,
but
my
hope
is
that
by
the
time
that
we
do
that,
we've
laid
a
really
strong
foundation
in
that
we
have
a
language
specification
and
essentially,
we've
set
up
rust
C
as
the
benchmark
of
what
that
can
be
done
in
terms
of
like
we
have
a
validation
suite
that
works
for
for
Russy.
So
if
we
start
writing
rust,
c2
or
sealed
rust
see
that
it
can.
C
This
might
be
an
area
where
we
can
actually
make
up
a
lot
of
interesting
places
and
things
like
mere
optimizations,
where,
if
we
can
specify
in
a
qualified
way,
Mir
two
Mir
transforms
or
IR
to
our
transforms
that
are
sound,
then
maybe
that's
a
room
for
doing
things
like
inlining
or
NR,
vo
or
things
like
that.
That
could
be
specified
on
that
level
and
not
in
a
really
gnarly
machine
specific
way.
But
this
is
a
little
more
theoretical.
C
Tl
DR
is
we'll
probably
avoid
optimizations
sorta
on
day
one,
and
even
though
we
lean
on
LLVM
optimizations
will
probably
lean
on
whoever's
optimizations
we're
doing
whether
it's
crane,
lifts,
optimizations
or
proprietary
compiler
company's
optimization
passes
that
are
qualified.
Most
qualified
compilers
sit
at
an
optimization
level
between
o1
and
o2
in
terms
of
practicality.
I
know,
that's
a
super
hard
thing
to
judge.
C: These are conversations where most plans are not set in stone; we're very willing to take opinions from people who are smarter than us, of which I count this room. Everyone in this room is smarter than me at a lot of facets of the compiler, and I'm very willing to take feedback on that. That being said, we're also hoping, as soon as we can get funding, to start looping in the people who are already doing this work to be doing the Sealed Rust work too, and hopefully be paid for that time, or...
C: Hopefully, if you're doing Sealed Rust work, you would be paid for that time. So the goal is, like I said: take the people who know the most about Rust, and the other people outside of this group who know the most about qualified compilers, and put them in the same room and make it happen. But if you have ideas, if you want to bounce things off us, if anything that I've been saying sounds really wrong, it could be that I...
C
Just
don't
understand
it
very
well,
and
I
will
very
readily
admit
that
I
don't
have
comprehensive
knowledge
of
rust,
qualified
compilers.
All
of
these
things
and
if
you
ever
I've
had
a
lot
of
productive
talks
with
Nikko
and
central
already
and
I'm
very
willing
to
have
more
productive
talks
too,
to
refine
this
plan
because
it
is
really
a
play.