From YouTube: GitHub Quick Reviews
A: Hello friends, it's time for another API review. Today we have three blocking issues, and then we will see what the day brings after this; at least the first two should be relatively quick. I don't think James is here, because of time zones and stuff, but I think I can represent this one. So basically, what we have added is this RequiresPreviewFeatures attribute.
A: The problem with the attribute is that right now it's just the parameterless constructor, so you don't get to do anything with it — you just apply it, and then this marks an API as being in preview. What James is suggesting is that we make it a bit more similar to how Obsolete works. Obsolete has a message, so you can optionally say why the API is obsoleted. In this case we could say: why is this particular API marked as preview?
A: What is actually in preview here? And then the other suggestion I had, based on the discussion and the way he described it, is that we should probably also have a Url parameter. Then, when we actually report the diagnostic in Roslyn, we can use the URL parameter to surface it in Roslyn, which basically means in Visual Studio: when you click on the diagnostic ID, instead of just going to the generic preview-features documentation, it would go to whatever URL the developer put on the API.
A: This way we can, for example, have a landing page that is very specific to Kestrel HTTP/3 support, rather than the generic preview-feature description. And that's pretty much it. So the idea would be: Url is optional, so you set it via the property syntax; and the message is also optional, but instead of setting it explicitly — because that's fairly ugly — we would just have an overloaded constructor where you can pass in the message, which is also consistent with how Obsolete works.
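Put together, the shape being discussed looks roughly like the following sketch. This is a reconstruction of the proposal as described above, not the final surface; the AttributeUsage targets and the example URL are assumptions for illustration.

```csharp
namespace System.Runtime.Versioning
{
    // Sketch: message via constructor overload (like Obsolete),
    // URL via optional property syntax.
    [AttributeUsage(AttributeTargets.All, Inherited = false)]
    public sealed class RequiresPreviewFeaturesAttribute : Attribute
    {
        public RequiresPreviewFeaturesAttribute() { }

        public RequiresPreviewFeaturesAttribute(string? message)
            => Message = message;

        public string? Message { get; }

        // Optional: where the diagnostic ID should link to in the IDE.
        public string? Url { get; set; }
    }
}

// Usage as discussed (hypothetical URL):
// [RequiresPreviewFeatures("HTTP/3 support is in preview.",
//     Url = "https://example.com/docs/kestrel-http3-preview")]
// public class Http3Options { }
```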
A: The reason there is no diagnostic ID is because we want preview to be viral; we generally don't want people to arbitrarily suppress certain things. You could argue that we are violating this ourselves in the platform, because we basically turn it off in particular areas to say: yep, Int32 implements the preview interface, for example.
A: But we don't want Int32 to be in preview, for obvious reasons; you only want the things that are related to the math interfaces to be in preview. The problem with that is, I think this is intentionally meant to be an expert-level scenario, because it's not easy to say when it is safe versus when it is not safe — that's something a human has to determine. And I feel like as soon as we introduce a diagnostic ID...
A: ...we would now probably have to add some rules. Say you put the attribute on something yourself: what diagnostic ID are you supposed to use? If you use a different one, that may not play very well with the suppression for the other one. If you use the same one, you may inadvertently suppress something, which is also not necessarily obvious. So I'm not sure we ever want to do that, to be honest.
A: If we change our mind later and add a diagnostic ID, then it's kind of sucky that we named the property Url rather than UrlFormat. But we can probably just say: if the URL contains a placeholder, we will use it, no matter whether it's called Url or UrlFormat, if we ever add that later — because it doesn't strike me as very likely that somebody has a literal {0} in their URL.
B: And Frederick's right — you missed your chance to put your garbage carts out to the street. Yeah.
B: I mean, back when we were first proposing this attribute, we did talk about being able to put a name of the feature on there, and being able to opt into or out of individual features, and we concluded not to — we concluded that it was a global switch, and I think part of that... yeah. I'm wondering, though: if I try to replay those conversations, I don't think we considered diagnostic IDs at that time.
A: One feature has the name X, the other one has the name Y; then Int32 itself, if we were to mark it as preview, would say: well, this requires preview features X and Y. And so this way we can aggregate this up and say: this application really depends on X and Y. The intention was that if another developer later on calls a third preview API, and the app has only opted in to X and Y, they still get the diagnostic when they now start using Z.
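The per-feature design being recalled here was considered and rejected in favor of a single global switch; a minimal sketch of what it might have looked like. Every name below (RequiresPreviewFeature, EnablePreviewFeature, the feature strings) is hypothetical and never shipped.

```csharp
// HYPOTHETICAL: the rejected per-feature design described above.
[AttributeUsage(AttributeTargets.All, AllowMultiple = true, Inherited = false)]
public sealed class RequiresPreviewFeatureAttribute : Attribute
{
    public RequiresPreviewFeatureAttribute(string featureName)
        => FeatureName = featureName;

    // The formal name of the preview feature this API belongs to.
    public string FeatureName { get; }
}

// APIs would declare which feature gates them:
// [RequiresPreviewFeature("GenericMath")]
// public interface INumber<TSelf> { }

// An app would opt in per feature, e.g. at assembly level:
// [assembly: EnablePreviewFeature("GenericMath")]
// [assembly: EnablePreviewFeature("Http3")]
//
// Calling an API gated on a third feature ("Z") would still produce a
// diagnostic, because the app only opted in to X and Y.
```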
A: So the idea is that you're not necessarily opting into everything wholesale — whereas today, once you opt into previews, every preview feature is fair game. But then the discussion we had was: if you think about how previews generally work, we don't really have that many independent features to begin with.
A: We generally try to deliver a package, and within that package the features tend to gel with each other anyway, in which case it's kind of weird if they would be disjoint. And on top of that, the concern from the runtime team was: what's going to be the test matrix...
A: ...if we say people can turn things on and off individually? And to me the diagnostic ID would be reintroducing this through the backdoor, because now you suppress half of them but not the other half. That's why I'm not sure — to me the nice thing about the feature names was that they were at least a formal definition of what it would mean.
D: There's both. I guarantee that at some point in the future we'll have some feature where people want to differentiate it from the other features we've shipped in the past, and maybe the diagnostic ID provides an easy way for us to do that today, where that's appropriate.
D: We have at least two places that are using it today, however: generic math and the System.Runtime.Intrinsics experimental APIs. They're very different features, with very different reasons why they are in preview, and so at least from that perspective it'd be worth having the granularity to differentiate between them.
E: So is the expectation that, if we did a diagnostic ID, we could turn off different parts of the analysis? Because the diagnostic ID is tied to the analyzer itself, and adding it to the preview attribute doesn't mean you can suppress it using the preview attribute's diagnostic ID. I don't know if that makes sense.
A: Anything else doesn't show up. The reason it works for Obsolete is that Obsolete isn't an analyzer — the compiler does it — and the compiler isn't subject to the same limitation. So I'm not sure an analyzer can actually have an open-ended, customer-configured ID space.
D: It can, because you don't report different IDs to the customer. You have some other opt-in mechanism, such as via the .editorconfig, to be able to say: I have my own subset of configuration switches that I'm aware of. So I can report CA1000 to Roslyn, but I can have internal IDs that I can suppress myself, or not suppress. And I believe we have at least one analyzer — or the community has an analyzer — that does that for features. And of course .editorconfig directly supports exposing configuration options to analyzers.
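The mechanism described here can be sketched with the real Roslyn configuration API (AnalyzerConfigOptionsProvider): the analyzer reports one public ID but consults an .editorconfig key for finer-grained switches. The option key and the feature names are assumptions for illustration, not real settings.

```csharp
using System.Collections.Immutable;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Diagnostics;

[DiagnosticAnalyzer(LanguageNames.CSharp)]
public sealed class PreviewFeatureAnalyzer : DiagnosticAnalyzer
{
    // One public diagnostic ID reported to Roslyn.
    private static readonly DiagnosticDescriptor Rule = new(
        id: "CA2252",
        title: "Opt in to preview features",
        messageFormat: "'{0}' requires preview features",
        category: "Usage",
        defaultSeverity: DiagnosticSeverity.Error,
        isEnabledByDefault: true);

    public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
        => ImmutableArray.Create(Rule);

    public override void Initialize(AnalysisContext context)
    {
        context.EnableConcurrentExecution();
        context.ConfigureGeneratedCodeAnalysis(GeneratedCodeAnalysisFlags.None);
        context.RegisterSyntaxTreeAction(ctx =>
        {
            // Internal "sub-IDs" read from .editorconfig, e.g.:
            //   my_analyzer_disabled_features = GenericMath,Intrinsics
            var options =
                ctx.Options.AnalyzerConfigOptionsProvider.GetOptions(ctx.Tree);
            if (options.TryGetValue("my_analyzer_disabled_features",
                                    out var disabled))
            {
                // ...skip reporting for any feature listed in `disabled`,
                // while still reporting CA2252 for everything else.
            }
        });
    }
}
```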
F: Didn't we have some sort of feeling, staying within just the library space, that once two preview features touched the same thing, you can't describe why — it's both things? So is that a union, an intersection — what's the suppression mechanism? I feel like that's part of why it was just: no, it's library. You're opted into runtime features and/or library features and/or language features; those were our three levels of granularity, and everything finer than that was too hard to describe.
F: I mean, putting a description on things doesn't seem like it would hurt, but trying to tool it, right? This is preview because it uses generic math. Well, it uses generic math and some other thing that's in preview. And okay, great: we remove generic math, we go to remove the feature, it doesn't compile, and now we update the description. It's just a string — there's no tooling for "which preview features are you okay with". You either get library preview features or you don't; nothing beyond that.
E: Yes, the only thing to consider here is what happens if somebody derives — oh, it's a sealed class. Okay, never mind.
A: More data on it; it's also more expensive in reflection if you look things up by open-ended types, because now the type checks are more expensive. But yeah.
A: See, this is the problem with typing into something that is not an actual code editor. All right — so we would have Message be nullable and Url be nullable.
A: All right. So then, the next one — hopefully an easier one. Who can represent this one?
I: Okay, so this one is about allowing custom converters to work correctly for dictionary keys. A bit of background: today, custom converters are provided for types that appear in input object graphs, but on serialization or deserialization as dictionary keys, types — mostly primitives like strings and numbers — cannot have their custom converters used. This manifests today as a NotSupportedException being thrown by the serializer when a custom converter is provided for these types and the types appear as keys in object graphs.
I: So yeah, for sure — thanks, that's my next bullet here; exactly that. That's why I want to do this. So, the API proposal: we have a couple of internal methods that handle this in the serializer today, and this proposal is basically about exposing them, with some minor changes. We propose two new virtual methods, read- and write-as-dictionary-key, which have the same signatures as the existing Read and Write methods on JsonConverter&lt;T&gt;, except for the Write method...
I: ...where we disallow null, because dictionaries cannot have null values as keys — so we do the appropriate checks before calling the converter, just like the normal annotation. There are some notes about the APIs.
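Using the names as read out in this review (which may differ from what finally shipped), the proposed surface would sit on JsonConverter&lt;T&gt; next to the existing Read/Write pair. The default-throwing bodies are an assumption; the DisallowNull annotation reflects the "no null keys" point above.

```csharp
using System.Diagnostics.CodeAnalysis;
using System.Text.Json;

// Sketch of the proposal: same signatures as the existing pair, with
// the write-side value disallowing null (dictionary keys can't be null).
public abstract partial class JsonConverter<T> : JsonConverter
{
    // Existing pair:
    public abstract T? Read(ref Utf8JsonReader reader, Type typeToConvert,
        JsonSerializerOptions options);
    public abstract void Write(Utf8JsonWriter writer, T value,
        JsonSerializerOptions options);

    // Proposed pair for dictionary keys (JSON property names):
    public virtual T ReadAsDictionaryKey(ref Utf8JsonReader reader,
        Type typeToConvert, JsonSerializerOptions options)
        => throw new NotSupportedException();

    public virtual void WriteAsDictionaryKey(Utf8JsonWriter writer,
        [DisallowNull] T value, JsonSerializerOptions options)
        => throw new NotSupportedException();
}
```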
I: So, beyond the first point about not allowing null values: we are proposing new methods because the new functionality assumes we are reading and writing JSON property names. That's the property name that appears before the value — quote, property name, quote, colon — the left-hand side of the JSON key/value pair.
I: By adding new methods, we avoid users having to check the JSON token type on the read side; they would also need to use WritePropertyName for the dictionary key when writing. So these new methods, being appropriately named, can suggest to users what the correct behavior should be.
I: The other thing is that the introduction of these APIs means that all serializable types can be serialized and deserialized as dictionary keys, provided users supply implementations of the new methods. Today, complex types like POCOs and collections are not supported — although I think that's a pretty edge case. But there's...
I: Sorry, okay — so now complex types can be used as dictionary keys; I don't think there's any reason to limit that. It also gives us some functional parity with how this feature is supported, and the signatures are based on the existing Read/Write methods, e.g. giving access to the options instances.
I: I guess you could do stuff like honoring the DictionaryKeyPolicy — the naming policy for your string keys — or enum member values, if you wanted to, that way. So I have some API usage examples; let's scroll down. So yeah — this converter doesn't do anything...
I: ...fancy; it just demonstrates that we expect a token type of PropertyName on deserialization, and that users write with WritePropertyName on serialization. In this example, rather than throwing NotSupportedException, the custom converter the user provided will kick in. So I think this feature, one, is useful — people want it — and it provides a reasonable...
I: ...mitigation to the breaking change we talked about. And it's not too bad, because users are already writing custom converters for these types; it's not like they have to write a lot of new code — just maybe a couple more lines to get the previous behavior back. So yeah, that's it.
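A minimal converter along the lines of the example being screen-shared can be reconstructed from the description (this is a sketch, not the actual slide; the key type and parsing logic are made up, and the override names follow the proposal as read here). The read side sees a JsonTokenType.PropertyName token; the write side emits via WritePropertyName.

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical converter for int keys, sketched from the description above.
public sealed class IntKeyConverter : JsonConverter<int>
{
    public override int Read(ref Utf8JsonReader reader, Type typeToConvert,
        JsonSerializerOptions options)
        => reader.GetInt32();

    public override void Write(Utf8JsonWriter writer, int value,
        JsonSerializerOptions options)
        => writer.WriteNumberValue(value);

    // Proposed dictionary-key overrides: the reader is positioned on a
    // PropertyName token, and the writer expects a property name.
    public override int ReadAsDictionaryKey(ref Utf8JsonReader reader,
        Type typeToConvert, JsonSerializerOptions options)
        => int.Parse(reader.GetString()!);

    public override void WriteAsDictionaryKey(Utf8JsonWriter writer, int value,
        JsonSerializerOptions options)
        => writer.WritePropertyName(value.ToString());
}

// With this registered in JsonSerializerOptions.Converters, a
// Dictionary<int, TValue> round-trips through the custom converter
// instead of throwing NotSupportedException.
```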
C: Layomi, can those methods be generalized by just saying string — like, write a string?
C: Yeah, I mean, like a conversion-to-string operator, maybe, instead of the raw JSON — at least for the write case.
I: Yeah, I think that was very similar to a previous suggestion. We could do that; I'm still thinking about it.
I: That would work, and then the serializer would be responsible for writing the value according to the context. But for the quoted-numbers scenario, it needs a special argument — the JsonNumberHandling enum — and we don't have the infrastructure to fetch that in general, because it can come not just from the serializer options but also from attributes. We already fetch that data when creating JsonPropertyInfo instances.
I: So I don't think that, even if we generalize it this way to return a string, it would necessarily apply directly to the quoted-number scenario. I think we'd still need to build different infrastructure to handle that, because the signature would not be compatible — obviously we'd need additional information. Yeah.
I: The reason why — so, I considered just forwarding to the built-in converters: detect the T and figure out what we used to do. But now, with source generation and the efforts to reduce size, I don't see how I can do this without referencing the converters statically, which increases size. We'd have to say "oh, StringConverter is a converter", and that would affect a bunch of types that don't use those converters.
A: Would it be reasonable to say: if you use source generation, maybe this is one of those things where, yep, sorry, we can't do that in source generation? I mean, that would be unfortunate, because every discrepancy adds up over time too. I don't have a good handle on how impactful that breaking change is, but in general it seems unfortunate to make a breaking change and then tell people: oh yeah, you can get the old behavior back if you're willing to write some code.
I: Yeah — it was sort of an important breaking change we had to make just for things to function correctly and be linker-friendly. That's why we made the change in the first place. And my thought is that the mitigation isn't too bad, because you only experience it if you have a custom converter.
F: Right. So it's just — if somebody has to override this new method to get back to a functioning state, then that's a problem; the virtual needs to work. If you can do it so the virtual just works, then it's cool. But for an existing NuGet package, we can't say: hey, we've published the version-6 version of System.Text.Json, and anything that published a converter using version 5 can't work — it will break, right?
J: For what it's worth, that's currently the behavior in 6.0 for the last few previews, and this proposal is merely providing an escape hatch. So the alternatives here are: either provide the escape hatch and keep the breaking change, or revert the breaking change and root all the default converters, which we explicitly wanted to avoid.
J: Sorry, go ahead. No, no — I personally don't think it's a fringe case. I think many people override the string converter, for reasons that are probably not great, let's say; but as a side effect, all these people will get NotSupportedExceptions should they need to deserialize something that contains a dictionary with string keys. That would be a fairly common scenario, in my view.
I: Yeah, I just think that if we were going to make changes like this, now is the time. We were writing the serializer with somewhat different goals in the past, where a lot of dynamic functionality just worked and we didn't really care about size and things like that. But now we want to figure out how to write the serializer in a way that...
A: But I kind of disagree at the same time, because the problem is that we know people treat .NET as a super-dynamic system, and we know that doesn't play well with some of the goals we have. But in many cases people legitimately don't care about size: if I write an ASP.NET web app, I'm not very likely to use trimming in the first place. So why would my app regress...
A: ...because somebody might use the same technology to build a mobile app? That's the point: I feel like the trade-off is that we are regressing a scenario where the gains are not visible to the customer. I don't have a good handle on how impactful that is, but I would say: if it impacts anybody who has a dictionary, regardless of the type of the key, then that's bad.
I: Yeah, we can special-case the string converter — that was the initial approach: just special-case that one. It's a single converter, and I think it's the most common case; I think it'd be rare to have object graphs that don't have string in the first place. So even if we root the string converter, in our case it's actually [inaudible].
I: I did make a note — when I mentioned here that we made that breaking change — that people would appreciate this feature as a workaround. I guess that's why we're here now, but...
I: We did, yeah. One person had a string converter that tried to do some unique transformations on the string, and I imagine they would want to do the same thing for their dictionary string keys. Well, maybe they're not broken on 5, so maybe not — but yeah, we did get feedback around this.
A: I think I'm okay with saying we break the one-in-ten-thousand case and say: yeah, sorry, we do this for simplicity and source generation, even if you don't care; we think it's fringe enough that if you do care, sorry, you have to write two lines of code. But if we say: sorry, you did the super-common thing in ASP.NET and you can't have it anymore because of mobile — that seems like a weird message, right?
A: That's not to say we never trim ASP.NET apps, but it's fair to say not in the .NET 6 timeframe. And even if we have trimming — the more dynamic the system, the more babysitting we expect from the developer, and given how dynamic ASP.NET apps are, I don't think this will ever be free of babysitting. So I think it's okay...
A: ...if those people, once they have to do trimming, also have to do some work. It would be okay to say: sorry, once you move to .NET 8, where we have first-class trimming support for some of this stuff, what happens to work for you today no longer works. But if I'm not trimming, then why are you taking something away from me? That's the thing I'm struggling with.
F: Yeah — to me the real difference is this. I'm okay with a major upgrade breaking an application; that's fine: you move forward to a new version, something's different in your application, and you move on. The problem with this one is it feels like it's really set up to go across NuGet package boundaries. You, in your application, wanted some feature from the v6 version of the serializer...
F: ...and now some NuGet package that you were talking to, you just literally cannot use, and you need to go hound them to upgrade — to do a split compile so they can, in the .NET 6 version of their thing, see that there are methods here to override. I guess really this is all .NET Standard, but they need to upgrade their package version to see the new virtuals to override, to get back to the functionality they had previously.
A: Yeah, I think in general the problem is that it's very attractive to think in terms of library space versus application space — similar to exchange types — but it's not a binary thing; it's a very fluid thing that depends on the concrete case. It's just a mental shortcut for us to put things on a spectrum, but we have to understand that what is an application scenario for one customer might be a library scenario for another customer. So it's...
A: What is fair to say, though, is that it's not a platform problem, because it's logically a NuGet-upgrade problem: the NuGet package is already provided down-level for .NET Standard 2.0. So, logically, if you upgrade your NuGet package, you can write the correct code now, even if you still target the older platform. It's not like you have to say "if on the older platform, don't do the override; if on the newer platform, do the override" — at least you get to write one set of code that keeps working everywhere. But it certainly is a problem if you think of the upgrade experience where somebody just retargets from .NET 5 to 6 and didn't use the NuGet package; from their point of view, it is an upgrade issue.
A: Well, no, that's not what I'm saying. What I'm saying is: for platform-provided functionality — say, in System.Runtime — if we do a breaking change like this there, the problem is you can't write code that works down-level and up-level; you have to special-case it. Here you wouldn't have to: you could just say "use the newest version of System.Text.Json on .NET 5". So there's a path you can take where your code works everywhere — but it doesn't, right?
F: But it requires that everyone who currently — maybe this number is zero, and then it doesn't actually matter — but it would require that everyone who currently has a custom converter that gets used in, or published via, a NuGet package immediately goes out, grabs the new version, and overrides these things. And they'd have to do that before anyone uses their library from a .NET 6 runtime — because on the .NET 6 runtime you can't downgrade back to the v5 functionality; it's part of the platform.
A: People tend to be reactive, right? It's also true for us: we tend to throw stuff over the wall and then consider it working unless people scream. If you have enough customers — once you're at hundreds of thousands of downloads — then if nobody complains, it either means nobody uses the feature or the feature works for everyone. That's probably fair, but that's why there's a problem with previews...
A: If we've had zero signal during the previews that the breaking change impacted anybody, that might just mean the previews haven't been adopted broadly enough. But if we have received some amount of feedback on the previews, then I think it's fair to say that once we ship stable, or RC, it will get worse.
A: Many people would consider this an upgrade non-starter, right? That's why I'm saying: to me it really depends on how strongly we feel this is necessary. This is not a breaking change I would make lightly. There are some other ones where I'm like: yeah, it's fine, it's a one-liner change, it's obvious what the correct behavior is. But messing with serialization data seems impactful and hard for the customer...
A: ...to reason about and fix — especially when you consider library boundaries, where you are not actually the person who has to make the fix. You're just an app that uses Cosmos DB; they own the converter, they didn't override it, it now stops working for you, and you're like: oh, what the heck do I have to do now? I think that's the challenge. Yes — when everybody upgrades and everybody uses the new functionality, it's fine.
F: Because the mentality we generally suggest people have is: if you are a library component, you target the lowest version that you can of the dependencies that you have. So a library component with a converter may be targeting, say, the version-3 version of the package, because they're like: this is the most compatible with the universe; I'm not forcing anyone to upgrade.
I: They're fine with the old behavior, and it's for string. And I do think string is the dominant case — even from the feedback, people didn't seem to care about other types; they were fine with the behavior of the built-in converter handling it. If we make the change and then root that converter for string, they'll be fine; and because it's virtual, if they wanted to provide their own logic, they could override it and still get there.
J: To be clear, we did discuss mitigating the case of the string converter, and we speculate that that would address 80 percent of use cases. But it does not make the problem go away: for every other thing that can serialize as a dictionary key, the issue will remain. So the question is: do we do it just for string, or do we revert for every converter?
F: Yeah. And now that that's come up, we can talk about a specific point that's confusing me. If this is to allow — or is part of a pipeline that lets — a Dictionary&lt;int, whatever&gt; work, how is it that the signature for write-as-dictionary-key takes string? That means we've already done a transformation of int into string, and I thought that's what the converter was supposed to do.
F: And — just because we're looking at a JsonConverter&lt;string&gt; here, that's the string. Okay, all right, never mind.
J: I should point out that this API proposal is highly orthogonal to the breaking change we're discussing. We could ship this regardless of what we do with the NotSupportedExceptions, and in fact I don't see any reason why we shouldn't be adding it.
A: Yeah, so this would be my next question. If, let's say, we undo the thing and basically say: yep, we get back the old behavior — my understanding is that means we effectively regress the source generator. Are these methods effectively a way for the source-generation customer to get better linking by using those overrides?
I: Yeah — because if you just specified a custom converter, the serializer isn't referencing the built-in converters directly. So yeah, we'd undo some of the linking-size efforts; most of it.
I: If we don't do it this way, and we ever wanted to go back — and size became important to a lot of people — I don't think we should do this without things like linker feature switches that say "hey, disable these things". So it's not going to be as clean in the future. And that was something we considered for source gen in this release and didn't want to do: make it based on a linker switch at this level.
A: I mean, I like that model. In that model, basically, the idea is: if you compile against 5, stuff works on 6 — it's just less ideal in source gen. If you want things to be more ideal in source gen, you should override those methods, and if you want, we could ship an analyzer that tells people to do that. I think that would be a good model — I think that would be a good API, then.
A: Sorry, what was it? I think I missed what the good thing was. — Well, the good thing is that we don't regress customers, and people who care about source gen can tweak it further to get to the footprint we want. So it's kind of pay-for-play: if you don't care, stuff just works; if you do care, you can make it better.
I: The answer is no — I mean, okay, maybe correct me here, but I don't think the linker will trim out the virtual method, because it doesn't know that every converter does that, basically. So it will still leave the...
I: I think the way it would work for linking in the future is with linker feature switches, where you specify a property in your csproj...
I: ...saying, you know, "JSON source generation". And again, we didn't do that, because it's harder to tell when things work and why things fail — like whether it's the library's fault, or, if it's a library author who specifies it, that affects the app user's scenario. So I think it sets a messy precedent if we start relying on a different programming model that involves linker switches, rather than just writing the source generator to serialize in a way...
I: ...that is trim-friendly as a first-class architecture. Put a different way: I think the breaking change that pushes the responsibility to the converter is the only way to do this cleanly for source-generation scenarios — to basically allow source generation to work for dictionary cases in general, not just string. The way it's written is such that source generation can support dictionaries.
I: So I think the breaking change is fundamental to that functionality working. But whether we can register the built-in converters for this specific scenario in a different way — in a somewhat loosely-bound way that doesn't root them explicitly...
F: In the old world — .NET Native — we had the XML file you could edit that said: keep this type alive, even though you don't think it should be. Do we have something like that with the current trimmer? Can we make an app that wants trimming have to go register all the internal types for non-elimination?
F: Well, it would be that the virtual — if the virtual looked up the type via reflection, so if the base implementation of this new virtual had a complicated "go look up a type by reflection" that trimming could have deleted — then that lets apps that did the override not need the type, or not need all of the converters to be rooted. And then, if you did turn on trimming and you did use a library that didn't override one of these, it would be fine as long as it gives back a good message.
G: Indeed — but this actually is how we handle some of the other collections, like Stack. The serializers are all marked as trim-unsafe already — the reflection-based serializers, that is — so all bets are already off that it's just going to work when you turn on trimming. So this could just be...
F: Yeah — personally, I'm fine with trimmed apps being broken. It's people with existing applications that get broken by a transitive version dependency who concern me.
G: I'll just add one clarifying phrase to your statement: trimmed apps that produce warnings being broken is totally fine. If we've already given you a warning — because you're using the serializer in reflection mode, or whatever you want to call it — those being broken, for all sorts of different reasons, is totally fine.
F
Yeah,
so
I'm
saying
where
my
the
the
behavior
of
assuming
that
the
virtual
here
is
a
throw
of
you
need
to
override
this
to
get
the
scenario
working
again.
If
that
happens,
to
only
throw
for
trim
daps
that
reduces
my
braking
change
impact
feeling.
I
Yeah, just my own personal stance on it is: if we just find out that we forgot to root one, I think it's not going to be as impactful, and for the few people that do it, it's unfortunate that they might have to update their code and release a new version and things like that. But I don't think there will be a lot of cases.
F
Yeah. So now — Eric asserted that, you know, the API is independent of the feature progression.
I
So, as proposed, you just write the value as a string. But there was a suggestion around returning a string instead — and I don't know; doing that doesn't necessarily apply to, like, the quoted-numbers feature. So I don't know that we have to do it that way.
F
Well, but you do have the problem that, when you call the thing that's currently called "write as dictionary key," if you're in an object and you're in the property-name state, you can't call write-string — you have to call write-property-name. So you need to call the right method depending on where you are, and only the serializer really knows what state it's in.
F
So returning a string is more flexible, because you could do the same thing later for quoted numbers if you wanted. And otherwise, if we wanted it to be quoted numbers in a value as well as quoted numbers in a property, now we need another method to talk about the fact that we're in a different state, and the only difference is which method you call on the writer.
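As a sketch of the writer-state problem being described — the extra member name here follows the names floated in the discussion ("write as dictionary key" / "write as JSON property"), not necessarily the final API:

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

public class GuidConverter : JsonConverter<Guid>
{
    public override Guid Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
        => reader.GetGuid();

    // Value position: WriteStringValue is the legal call here.
    public override void Write(Utf8JsonWriter writer, Guid value, JsonSerializerOptions options)
        => writer.WriteStringValue(value);

    // Hypothetical member under discussion: in the property-name state only
    // WritePropertyName is legal, so the converter — not the caller — must
    // pick the right writer method for the state the serializer is in.
    public void WriteAsDictionaryKey(Utf8JsonWriter writer, Guid value, JsonSerializerOptions options)
        => writer.WritePropertyName(value.ToString());
}
```

Returning a string from the converter instead would let the serializer pick the correct writer call itself, at the cost of materializing the string.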
I
If we use the reader — I mean, the Utf8JsonReader allows you to process the data bytes directly and not have to materialize the string, and it's actually the same case on the write side: you don't have to materialize a string, you can just write some bytes. So I guess there are performance benefits to doing things this way. Fair enough.
J
I was considering, given some feedback from a community member in the issue, that maybe we should just call it write-to-string and read-from-string.
J
I mean, assuming that we keep the current signatures — and indeed most implementations call writer.WritePropertyName — then, you know, we should keep a name like write-as-dictionary-key or write-as-json-property. But if we're not doing that, if we're just returning a string, then we could call it write-as-string.
I
Yeah — I think "property name" sounds okay to me.
F
Yeah, I mean, because the problem would be: if you have, you know, write-as-string and write-as-json-property-name, is one of them calling the other? Is that reasonable to do in a later version? You have virtuals here, and that's, you know — the most complicated thing you can do in .NET is be virtual, especially once you think one virtual might ever want to call another virtual, because once you've written that order, you can't change it.
F
So I think with something like this, there's not really a "maybe we want to in the future" — you kind of really need to have your final plan before you start. I mean, it's not totally absolute, but for best results, design everything and then see what you can do incrementally. "Thinking in the future we might want to blank" means you probably have a reason why it's not going to work.
I
I mean, we already have the internal infrastructure for the quoted-numbers case, so I kind of know what that would likely look like if we want to expose it, and that's why I'm thinking there's no direct overlap — because I did try to write the implementations in a way that's shared; they can't call each other and things like that. We might call some common logic in some cases, but as far as the scenarios to which they apply —
F
Right — well, because my concern would be: if, as you said, the quoted numbers would need things that are not currently in the signature, and if that's the case, then if you have a dictionary of int to whatever, is there going to be another virtual for writing the property name that has the extra parameter in case you're an int? And if that's the case and you're doing it in the next version, I would say: no API in this version; just do the whole feature together.
I
No, not for this property-name scenario, right — it would be for the property-value scenario. So, you know, what the number-handling setting was, like write-as-string, or allow — what do they call it — named floating-point constants, like, so —
F
My statement is just — maybe I misunderstood — that for the custom number formatting thing you would need to alter the signature of this method. Because if it's hanging off JsonSerializerOptions, then it's fine: you have a natural upgrade, you can get in here, you can see the things that are supposed to happen, and you move on. If you need to alter the signature or make a new virtual, then it concerns me that we would add this right now and immediately have a more complicated version of it in the next version.
A
We've discussed this now for like 45 minutes; it doesn't seem like we will come up with a design here. So let's maybe just push this back — you guys think about it, and then we can either do Thursday or next week, depending on how fast you can come back, because I feel like the other one we want to look at here is probably also quite large, right?
I
Yeah, okay, thank you! So — I guess, sorry — for this one: we have reviewed the source generation APIs previously, and since then — so, switching to the new issue.
I
So we looked at the source generation APIs — if we can scroll to the bottom of this issue, I mean, like, the last comment here.
A
Sorry, the last comment — you mean the last?
H
Quite literally. Oh yeah, okay.
I
Yeah, so we've reviewed the source generation APIs, and since the last time we looked at this, I've been adding functionality to the source generator — adding some functionality and building the features that we need. So now let's come back and review the changes. Most of this is already prototyped in runtime, but I'll just come back, take a look at the new APIs, and get them reviewed and assessed.
I
So, if you scroll down, there are two classes of APIs. First, APIs for the source generator to use — the source-generated code that initializes all the type metadata and sets, you know, the serialization logic and things like that. So that's this set of APIs.
I
We encapsulate them into structs, because it's cleaner than having like 14 parameters in the method, and it also helps with the issue of, you know, forward compat in the future. Like, hey, what if we add a new feature? We can add the metadata to represent that feature to these structs in a non-breaking way, rather than having to add a new overload of, say, the create-JSON-property-info method, for instance.
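A minimal sketch of that forward-compat argument — type and member names here are illustrative, not the final surface:

```csharp
using System;

// "Info values" struct as an options bag: adding a feature later means adding
// one init-only member — already-generated code keeps compiling — instead of
// introducing a 15th parameter and a whole new method overload.
public struct PropertyInfoValuesSketch<T>
{
    public string? Name { get; init; }
    public bool IsProperty { get; init; }
    public bool IsPublic { get; init; }
    public bool IsVirtual { get; init; }
    // A vNext feature slots in here without breaking callers.
}

public static class MetadataServicesSketch
{
    // The single creation method's signature never has to change:
    public static void CreatePropertyInfo<T>(PropertyInfoValuesSketch<T> values) { /* sketch only */ }
}
```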
I
So these are just supposed to contain values that are set once by the source-generated code, and they're read-only — they're not supposed to be mutated — and then the serializer receives this information and initializes the metadata as required. So it's pretty much — most of it is the same things that we saw last time, just a few things where I added flags saying: is this a property? Is it public? Is it virtual?
I
What's the declaring type, the type info for the property — to all of this, I think the only thing I added since last time is this stuff about "public" and "virtual." That just helps us handle things like not supporting non-public members and throwing relevant runtime exceptions.
I
When, you know, people do things like putting JsonInclude on a non-public member — which should be, I guess, an exception, since for now we're not supporting that. And "is virtual" helps with things like collisions between property names — like, a derived class versus a base class — and, you know, collisions need to be detected at runtime for the metadata scenario, because things like the naming policy being applied can force collisions.
I
So, I guess, scrolling down to JsonObjectInfoValues — a different struct. This here is information that we need to create a JsonTypeInfo for you.
I
Most of these things were represented in the overloads that actually create JsonTypeInfo instances. So, like, a create-object func — how to create the object with a parameterless constructor — and another func below that — this one needs to be renamed to something like "create object func, parameterized constructor" — which allows you to create the object
I
with the parameterized constructor. Then a func that we can call lazily to initialize property metadata, a func we can call lazily to initialize constructor parameter information, the number handling for the type, and then a serialize func that contains the serialization logic.
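Putting those members together, the struct being described would look roughly like this — a sketch assuming the names mentioned in the discussion; the final shapes may differ:

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Text.Json.Serialization.Metadata;

// Metadata the source-generated code hands to the serializer for an object type.
public struct ObjectInfoValuesSketch<T>
{
    public Func<T>? ObjectCreator { get; init; }                       // parameterless ctor
    public Func<object[], T>? ObjectWithParameterizedConstructorCreator { get; init; }
    public Func<JsonSerializerContext, JsonPropertyInfo[]>? PropertyMetadataInitializer { get; init; }
    public Func<JsonParameterInfoValues[]>? ConstructorParameterMetadataInitializer { get; init; }
    public JsonNumberHandling NumberHandling { get; init; }
    public Action<Utf8JsonWriter, T>? SerializeHandler { get; init; }  // fast-path serialization logic
}
```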
I
So the create method will figure out — it can do some analysis on this input data to figure out what information is needed or not, like when to throw exceptions and things like that. The next struct is JsonCollectionInfoValues, which specifies the information we need to create JsonTypeInfo for collections:
I
how do we create an instance on deserialization? What is the metadata for the key? What's the metadata for the element type — which kind of relates to the conversation that we had previously about custom dictionary keys, but only somewhat.
I
What is the number-handling setting, and the serialize func — so a few things in common with the previous struct. But I don't think that we should introduce interfaces and things like that, because I don't think it's necessary. Then, scrolling down, we have JsonParameterInfoValues.
M
It is used in a few places, generally when we refer to things like metadata. We don't generally tend to shorten it to "ctor" within the reflection stack; we only tend to do it when we're talking about cracking open ECMA stuff.
I
For sure — then, as you say, we should take the spelled-out names. So we have JsonParameterInfoValues, which has, you know, stuff like DefaultValue, HasDefaultValue, Name, ParameterType, and Position — just information that we need in order to initialize constructor argument information.
I
Similarly, for JsonPropertyInfo creation: instead of taking "is boolean," "is property," "is public," blah blah, we just take this struct. Then we can scroll down a little bit to the collection JsonTypeInfo creators.
I
So the first three here are the built-in collections that we supported in the first pass of API review — array, dictionary, and list — and then I said I'd come back with the rest. So, similarly, instead of all these long parameter lists, we just take a JsonCollectionInfoValues, and for the other types most of them have the same
I
signature. Just, with the immutable collections, we also pass in a create-range func, which follows the CreateRange pattern — so we take a create-range func which allows us to create a temporary list and then copy it into the immutable collection.
I
Okay, so yeah — the create-range func. And then, finally, the stack and queue creation methods: the very last method here takes a delegate to an add method, like Push, so you can add to the collection. And that's pretty much it for creating the collection info. So, similar to JsonCollectionInfoValues and the property-info struct: with new features, we can add those to the struct and initialize the metadata, so at least these APIs might be a little more stable.
I
I think we might still have to deprecate some in the future, but it won't be as bad as it would have been if we left things as is. So, moving down to another class of the APIs: APIs to configure the source generator, that users interact with. So, if we can scroll down past the collection info stuff.
I
Sorry — scroll down, sorry. So yeah, we have APIs to configure the source generator.
I
Prior to this, we had a constructor that took two options instances. One of them was supposed to specify the options instance for the current context, and the other was supposed to say what the options are that map to the design-time-specified serialization options. So, rather than take both of them in the constructor, we just have an abstract property which each concrete context instance is supposed to override — the source generator will provide this override.
I
So you just don't have to worry about it. And then we also have this "ignore runtime custom converters" that says whether the generated source code should ignore converters at runtime.
I initially added it because of size considerations — to avoid a custom-converter check — but I don't see a scenario where people would want this, and that check is going to be pretty fast, because we can first look at the length of the converter list.
I
So basically we don't need this anymore, because I don't think it's useful. And that is all I had in mind for now — and I think there are some comments from Steve about some things we should do. Sorry.
I
So we have a mode to tell the generator whether it should generate the metadata-initialization code, the serialization logic, or both. The design-time options map to what is specified with the JsonSourceGenerationOptions attribute. So, like, if you say the naming policy is camelCase, this design-time options instance will have, you know, the policy set to camelCase, and the serializer uses that to determine whether we can use the fast-path logic or not.
I
So you compare the options instances — it's a one-time check at the start of serialization, saying, like, hey, does this context instance align with the source-generated code's predefined logic — just the features that we specify with these design-time attributes.
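That one-time check amounts to something like the following — an illustrative sketch, not the actual implementation; "generated serializer options" is the name floated later in this discussion:

```csharp
using System.Text.Json;

static class FastPathCheckSketch
{
    // One-time check at the start of serialization: does the runtime options
    // instance align with what the code was generated against? The exact set
    // of compared settings here is assumed for illustration.
    public static bool OptionsAlign(JsonSerializerOptions runtime, JsonSerializerOptions generated)
        => runtime.PropertyNamingPolicy == generated.PropertyNamingPolicy
        && runtime.NumberHandling == generated.NumberHandling
        && runtime.Converters.Count == 0; // runtime custom converters disable the fast path
}
```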
I
So Options is whatever options the user wants to use at runtime for a given context class — you can pass in an arbitrary, customized options instance, right — and that might differ from the design-time options specified using the JsonSourceGenerationOptions attribute. The class right below this is a context class.
A
I see. So the reason why I'm struggling with "design-time options": "design time" usually refers to things that run inside of VS, you know, in the designer surface or something — think of a, you know, Forms control or a property-grid instance or something — and that seems confusing here as a terminology. I think what you want here is more like — I don't know — "instance options" or "static options" kind of conveys that a bit better, but I don't know that that makes it better.
I
I don't mind "static options" — just something that's specified ahead of time.
A
That would be good. I mean, it's basically the bifurcation here, right: there are different runtime options from what the source-generated serializer options are, and so we need to come up with a terminology here that is consistent, because people have to understand that these are different things, right. And so I feel like, yeah, maybe calling —
F
Yeah — I mean, "generated serializer options" is the only other thing that I've come up with, because it's for when your serializer/deserializer was generated instead of dynamically built with reflection. But all of the options sound a little off, and I don't know which one sounds the least off.
I
Hopefully not — well.
F
Because, using the noun form instead of the verb form — I guess I can also say "dative" and "accusative," and it just feels super more German — the serializer is the thing that does both serialization and deserialization.
F
But if we say "serialize options" and "deserialize options" in the verb form, then that would apply to only one or the other. The serializer, right — "the serializer" is the word that all serializer tech I've seen uses to refer to the thing that can do both. Only if it splits does "serializer" mean one and "deserializer" mean the other; but if you talk about "the serializer," it's the thing that goes both directions.
I
The source generator — the serializer needs it and can only get it from there, and it needs to be specified. So it's the source generator that needs to override this — I mean, the generated code needs to write this, and it needs to be visible to that code, yeah.
I
Yeah — okay, yeah, I'll just rename it; that sounds good.
I
We do know — it's just that we would be creating two methods which would do the exact same work. So I guess that's fine, because it's the same converter, but then we'd already have two methods.
I
Yeah — but we do care about app startup as one of the goals for the future, and I took some measurements where, if we make them classes, we'd be allocating more than the dynamic parts of the serializer for this type-info creation. I mean — not more overall, but just more in some.
F
I mean, it will allocate, because you, you know, called new — but if you're calling it once and then you're done. Whereas the structs are basically — you know, Tanner or someone could correct me, but the hand-wavy version is: they're basically unpacked as if they were just an expanded list of parameters and passed that way. So you're still paying the cost of having a, you know, 23-parameter method, which is a very expensive call, because of the way the data has to be aligned before the method call can happen.
F
So, passing a struct: if you're creating it a bunch and passing it into one thing which is doing all the work, and then you're done, then a struct is probably better. If you're passing it through, like, two methods, then a class was probably better. And if you ever need to put it in an array, a class is guaranteed better, because you're bigger than the suggested 24 bytes — past that point, classes are almost always better.
F
There are very few cases where a struct that big is good. If it has to be, because of something like the ref struct Utf8JsonReader — like, then, you know, now you have a reason it has to be — but it's a ref struct, so you should always be passing it by ref, which avoids most of those problems.
D
But yeah, I think Jeremy described it pretty well: if we had a struct that was bigger than 24 bytes and it's not specifically a SIMD type — so if it's not one of the types that I'm exposing — then you should be passing it by ref, and even then allocating a class is probably better, especially if you're going to have, like, 23 parameters and stuff.
F
Right, but when you pass all this, it goes somewhere and does something, right? So you're holding all 23 pieces of information somewhere — and what kind of thing are you holding it in? How's it getting into that place? How many copies get made of it along the way? How many copies get made just when you try reading it?
I
Yeah, I mean, I think the choice will have no bearing on throughput — it's all about startup-time considerations.
D
Yeah, you have to — so, if you're going to do a struct, you have to pass it by ref, or by in, which will make it immutable. But then you have to make sure that the struct itself is readonly, or that all members that you might want to invoke on it are themselves annotated readonly, because trying to mutate anything will cause a copy. There are just all kinds of additional considerations — which the JSON stack is already aware of, because it's using ref structs everywhere — but yeah. We do — I say, go ahead.
C
There is a copy made — I mean, we did discuss internally whether we should expose property setters and getters on the underlying type that is created, which is the internal JsonPropertyInfo&lt;T&gt; class.
C
But we decided that we want a layer in between that and modifying the existing internal type that these structs are copied to — exposing setters there is not trivial, and anyway there's not necessarily a one-to-one between the two. So that's why we have a separate type here.
F
The struct doesn't need to be a ref struct — that was me misspeaking. It needs to be a struct: if you need to be a ref struct, you need to be a struct. And then, if it's a struct and you only pass it by ref — so you build the struct and then you're passing it to a thing —
F
if that thing took it by ref or in, then it's ref passing instead of value passing, which is the thing that will help speed up the perf. But then you get all the complications of, you know, ref passing a struct — it's still super easy to accidentally make copies, because it's a struct, and other things — but the struct doesn't need to become a ref struct.
C
I mean, the allocations are going to be on the stack, probably — in both cases, probably — but it'd be interesting
A
to measure, right? Because it seems like it's still a lot of data, right? And, I mean, all these things have fields — sorry, I guess fields or properties, right — and they themselves are still classes, right? The JsonTypeInfo stuff. But I guess those are all singletons, right? They are — yeah, they are.
A
I don't know what to do here, but I mean, in general I would say: every time we have used structs, it backfired eventually. But, I mean, they're all read-only, so maybe that's fine — but there certainly are usability issues with them. But I guess those are also not the types we expect anybody to actually interact with a lot, so again, maybe that's fine.
I
Yeah, this is just for the generator. I can do some measurements to see, you know, readonly struct versus class. But just my assumption of how it worked was that there wouldn't be allocations, and because we're not mutating it, there won't be copies, right?
I
So yeah — I mean, we do copy the values to the corresponding JsonPropertyInfo, but that part is not changing, because that's what we were doing with the previous approach of passing the parameters.
F
I mean, right now, with the way that it's a struct and you're passing it flat — it is worse than what you were doing previously, because you're causing the compiler to make sure that a thing is built and aligned in the right way in memory, and then we're basically exploding that at the call site — you can think of it as exploding it and then kind of reassembling it — because you're passing a copy of it instead of passing it.
F
So there's a lot of forcing things to go in the right order that doesn't need to happen, because structs have guarantees about address layout — and yes, there was a discussion of auto versus sequential and whatnot — but, like, structs make things complicated in ways that classes don't, and they definitely make the perf more complicated in ways that classes don't. They're good if you're building it in a tight loop and it's small and passed shallow — then a struct is good.
I
I think the primary goal of the struct-or-class here is just the encapsulation of all of these values — just encapsulation, rather than having them as parameters — for the sake of forward compat.
C
I mean, go back to the previous approach, right: we had, you know, 12 parameters — we started out with eight and then eventually went to 12. If, thinking of the next release, we add another one, now we basically have a method signature that's only used by older — you know, already-compiled — code.
C
Okay, well — we can maybe take it offline when we show the measurements. I mean, really it is to avoid the heap allocs, and we're going to pass it by ref or in only, probably a single hop. And more or less all of these types, for the most part, are essentially — we consider them internal, but we can't make them internal. We don't expect people to use these.
A
I think this was pretty much it. I think the conclusion here is "needs work," right, and then I think we gave a bunch of feedback, and that's pretty much it. Let me just paste it —
G
This is like the factory, right? Like, this is the thing — you know, when you have a collection and you want to create a new element to start setting stuff into — like a collection, like a List of Customer — this is the callback to create a Customer. I think that's kind of weird; just say it that way.
A
I mean, we've also sometimes just used the term "factory," right, if it's very clear that it's only used for instantiating stuff. But it seems like in your case it's a bit more complicated, right, because you have a getter, you have a setter, you have — you know, some stuff, it's kind of clear what it is, right. So —
A
I know, but, like, that was — I mean, that's kind of — I mean, to me there's a difference between what you see in C# and how it's encoded in the metadata, right. That's what I'm saying: to me, you can go either way; it depends on what semantics you have attached to what the prop thingy does, right — if you want, in the —
I
Yeah — just like all those funcs do. So it basically helps us easily create metadata for JsonPropertyInfo and JsonTypeInfo, and that's for when we're not using the fast-path serialize method, where the serializer needs to understand the types so it can use them. So —
I
I had one quick question that came up before, around the Serialize method on JsonTypeInfo&lt;T&gt;. The reason why I made this — or we made this — public initially was also so that users could call it directly, thinking that there might be some performance boost over the serializer calling it under the hood. From measurements,
I
that's not the case — they perform just as fast, because we get to that code pretty quickly in the serializer. I think one benefit of exposing this is that more of System.Text.Json can be trimmed out if you call that method directly. Are there any thoughts about whether we should expose it or not?
I
Actually, I think you can see that — but it's just the question of: if the only reason to have this Serialize method public on the context surface is that it allows more of System.Text.Json to be trimmed out, is that something we should keep? We could make this internal so that only the serializer has access to it — so everybody that wants to use generated serializers would need to go through the serializer.
I
If we removed it — any thoughts on that? If not, then we can leave it.
I
So JsonTypeInfo&lt;T&gt; has this Serialize method — a serialize action. Do you see the —
I
So, at first it was looking like that would be faster than going through the serializer itself, which essentially does the same thing under the hood. But from perf measurements we see that there's no difference — it's just as fast calling this directly versus passing the type info instance to the serializer to do it. So —
A
I guess you need some sort of data, right — but, I mean, given that this is get-only, clearly you have some sort of internal mechanism to set it today, right? So it could be a private thingy that happens to be an action, right — but why does the public API have to return an action? That just seems odd.
C
Well, I mean, if you had a method — I mean, you'd put an implementation there; this is a not-supported exception or something.
I
Yeah, I mean — so we have — we don't — maybe we need to document it, but there are some cases where you might say "I want fast-path logic for a type" and we will ignore it because it's not supported — like, you have a dictionary of object to — like, anything that has object in its object graph, because it needs some extra polymorphic support to do that detection, and we're not doing that for 6. So there are some cases where, yeah, you might want to check, because —
A
We should probably not call it just Serialize, because, I mean, to me the expectation would be that it's always there. The fact that it's nullable is odd — I mean, sure, people that have nullable enabled will get a warning when they just try to invoke it, but it's not obvious to me what I would have — like, why would that be null — that kind of thing, right. So, I don't know.
I
Yeah — I was kind of relying on the nullability to suggest something, and then maybe documentation.
I
So yeah — it's like: no, since we want people to trim more, we do. There's one thing that we need to make in the internal serializer path, which is to add, like, a try/catch for some extra exceptions that might be thrown — so if you go through the serializer, we should be doing the same exception handling. I'm not sure — I don't think I've added a lot of overhead to the point where it makes this func faster.
A
I don't know — CanFastSerialize and FastSerialize, right? Then CanFastSerialize would just return whether the underlying action is null or not, and the other one just does the thing, right — because, I mean, the programming experience might be slightly better when you do that. To me, the nullability is just very subtle — it's kind of like: it's null, but why? That doesn't seem obvious, right. The vast majority of people will just ignore it; they'll just do the typical `?.Invoke` kind of thing.
A
Right — but is that obvious to the caller? Like, "the property is null, deal with it" — is this obvious? I mean, it's obvious to me that I should check that it's null, but it's not obvious to me what code I would put in the else. And the nice thing with an exception is you can actually tell the person what to do, right. And the downside of exceptions is that they are somewhat more expensive, potentially, but you don't really expect anybody to write a catch handler for this.
A
You expect this would be a programming error, right? The idea would be: no, you need to check CanFastSerialize, and if the answer is no, then you call into the regular serializer; and if the answer is yes, then you can call FastSerialize, right? That would be the way you expect people to write the code, yeah.
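The calling pattern being proposed would look something like this — `CanFastSerialize`/`FastSerialize` are the names floated in the discussion, not a shipped API:

```csharp
using System.Text.Json;
using System.Text.Json.Serialization.Metadata;

static class FastPathCallerSketch
{
    // Hypothetical caller-side shape: probe for the generated fast path,
    // fall back to the general serializer otherwise.
    public static void Write<T>(Utf8JsonWriter writer, T value, JsonTypeInfo<T> typeInfo)
    {
        if (typeInfo.CanFastSerialize)              // assumed property: generated fast path exists?
            typeInfo.FastSerialize(writer, value);  // assumed method: invoke the generated logic
        else
            JsonSerializer.Serialize(writer, value, typeInfo); // general path always works
    }
}
```

Compared with a nullable `Serialize` action, this check-then-call pair makes the "what do I do in the else?" answer explicit, and `FastSerialize` can throw a descriptive exception if called without checking.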
A
That's fair, but that's also not what the blog post says, right — the blog post kind of says, like: well, you have an option here; you can call this method, or you can call through the serializer, right. So it's more like an aesthetic design decision, almost, rather than an advanced scenario — and that's kind of my concern: people will just do that, and they don't have nullable enabled, and then bad things happen, right, and they have no idea what's going on.
A
I can just do dot-dot-Serialize and then stuff is happening, right? And that might work for some cases, and suddenly it stops working for them, but they have no idea what's wrong, because all they get is a NullReferenceException, and then it's like, clearly that stuff is not ready yet, right? So the takeaway isn't "I did something wrong."
A
You know, this is fast-path serialization, which may or may not be supported based on the shape of your types, and you're expected to check for that, and then, in the else clause, just go through the regular serialization method that has more overhead, potentially, but will work. And so I think that's kind of where exceptions will probably be more helpful than just "oh yeah, this thing can be null."
A
A
That,
well,
maybe
maybe
not
right,
because
you
can
argue
that
that's
an
easier
fix
for
the
customer.
You
would
just
tell
them
like
hey
the
shape
of
your
type.
Doesn't
you
cannot
be
serialized
unless
you
include
metadata
so
configure
your
options
to
include
that
right,
and
I
mean
that
would
be
the
fix
right,
because
I
mean
if,
if
you
need
to
serialize
this
thing,
you
need
to
serialize.
This
thing
right,
I
mean,
like
your
option
is
not
to
not
serialize
right.
A
It's
just
like
I
would
have
to
have
it.
I
would
like
to
have
it
fast,
but
if
fast
doesn't
work
well
then
slow,
it
is,
I
guess,
and
if
slow
doesn't
work
because
you're
missing
the
data
for
it.
Well,
then
I
have
to
include
the
data
right.
So
it's
not
like,
like
I
don't
know
it
doesn't
seem
like
the
option
of
not
serialization,
is
really
viable
for
the
customer
right.
I
Yeah, it's needed, it's needed for good, so we shouldn't remove it. Maybe we can discuss if we should refactor it, but it needs to be visible within the source-generated code, like public on the generated type.
I
Well, the source generator itself knows whether the other, like, serialize methods will be null or not, because it's generating everything, so the usability improvements wouldn't be for the source generator itself; it would be something for users. But yeah, that's a good point about, like, us needing it, unless we do actually, unless we do, like, call the serializer, but that way I guess we're sacrificing.
I
So, overall, yeah, we could, instead of calling the serialize methods directly, go through the serializer, just like.
A
Yeah, but it seems there's no other alternative, right? To me, the problem is, I mean, what you don't want to do, generally speaking, is fall back if the customer could just write different code and have it faster. In that case, sometimes it's better to just fail and not make things work, to actually tell them, like, yeah.
A
It's better to do that than always go through the slow path just in case. So, like, if you think from a customer's standpoint: if some methods throw sometimes, or work sometimes and sometimes don't, eventually I just stop using this thing and always go through the other thing, because that at least always works. But now you're teaching people to always do the slow thing, even though the fast thing would have worked in maybe 80% of cases, right? So in that case, it's better to just say: always call this thing.
M
Just to confirm, Naomi: there are no, like, subtle behavioral differences between the two methods, correct? Between the two techniques?
M
Okay, that's actually a much stronger case for Immo's argument then, because I know in the past, when we have had multiple different ways of doing something, they have had, you know, subtle differences between them that would perhaps make one technique not appropriate as a generalized case.
C
Well, I'm thinking, if we don't generate the source code, it just calls into the serializer; there's no if-check necessary.
A
Unless you can say the serializers are split into different methods that would not do that, right? And I understand you may not be able to hide that, because, you know, it needs to be public API that the generator can call, but then you would just give it a slightly less attractive name, right? I think the problem today is that this looks like the "oh, if I want to serialize this thing, I can just call this and it will always work", right? And the answer today would be no, it wouldn't.
I
I guess it's just a point-in-time thing, for right now. I think even, like, primitives, like integer and, like, string, although it's a little weird, but we can have the fast path for those, so it could actually be never null eventually, like, assuming that you specified the serialization in question.
A
Decision what to do with it? I don't think we know what the perfect answer is, but we know that, you know, it's in the ballpark. I mean, I think it looks good, you know, except for the feedback that we have here. So I'd say, let me just click comment on this one, put this back with api-needs-work, and then, you know, let us know what you have planned for this. Sure, thank you. Alrighty, so then this is it for today then. Thanks, everybody, and I see you either on Thursday.