From YouTube: TCK call #6
January 27, 2022 Jakarta EE Platform TCK call #6. Minutes can be viewed via https://drive.google.com/file/d/1ITRLSSPfkZuwJ82hAvEaIV-QdM7znsUq/view?usp=sharing
Note that previous TCK call #5 was not recorded
A
Awesome. Let me paste the agenda into the chat room. Okay, cool, that's done. And oh, hi Dmitry, good to see you.
A
I'll just do a quick introduction and mention that I added some notes based on what we had discussed in the platform call this week on Tuesday, where we talked about split-out TCKs. I included the link to the platform call agenda so that anyone who wants to look at it afterwards can; if they don't, that's fine, but that's where this came from.
A
The question came up of: isn't it a problem that the platform TCK and the new split-out TCKs, the new standalone spec TCKs, may have the same tests, duplicated in both places? And, oh, hold on, let me let people in.
A
Okay, so...
C
Yeah, this is actually not a question of whether it's possible that both bundles, I mean the split-out TCKs and the Jakarta platform TCK, have the same tests. It's definitely possible, but it's not what we want to have, because it introduces two sources of truth and increases the support effort, right?
A
Yes, but so I added a point here, and it's arguable, debatable; that's why we're here. But I did try to add a point about the constraints and costs introduced.
A
If the standalone spec TCK contains tests to validate platform requirements, or if the standalone spec TCK doesn't contain those, those are two different situations we could find ourselves in, and I think that covers both sides. If the standalone spec TCK doesn't contain the platform validation tests, then the standalone spec doesn't have dependencies on the other platform specs.
A
If the standalone TCK does have platform-level tests, then that standalone spec has a dependency on platform specs, and it creates chicken-and-egg type dependencies. I'll just summarize quickly with the example I mentioned: let's say the Batch TCK depends on EJB, and the Batch spec goes final, but then for the EJB spec, let's say there were some changes in flight for that release.
A
The EJB spec then makes a change that breaks that TCK, because the Batch TCK contains EJB tests. That breaks the TCK, so the Batch spec has to release again: it needs a new TCK, probably a new spec release, and I call that the cost. There are probably other considerations too; it's not just straightforward that it has to be one way or the other.
B
So, Scott, I was going to say, at least in the JSON-P and JSON-B cases, the dependency is only that the containers, you know, the JSP and servlet containers, must be compatible. So that if Servlet or JSP were to change after, say, JSON-P and B had finalized, JSON wouldn't need to change; the servlet container is just responsible for assuring that its compatibility requirements are still met.
B
I think your conclusion is absolutely correct: the dependency relationships can't be ignored, and they must be considered by the corresponding spec teams. But it isn't always the case that, just because there are requirements across different component specs, every change is going to have a domino effect.
A
Definitely, it's a worst case. It won't always be the case that we're removing the spec, but the scenario I mentioned happens if, and only if, in the JSON-P case, let's say, some Servlet spec API that the JSON-P TCK was validating as a platform requirement changes, and that Servlet spec API is used in the TCK.
B
That's right, so in this case I think the dependency is in the opposite direction. If Servlet were to go final and JSON-P... actually, I'm not even sure if the Servlet spec has this requirement or if it's just a platform one, but in any case, let's say the servlet container had somehow become final and there was a change in JSON-P. That could inflict a retest on Servlet.
B
But if it's the platform spec that says the Servlet implementation in the platform must be compatible with JSON-P, then it's up to the platform implementations to make sure that they have achieved those requirements, and that may or may not have a corresponding ripple effect into Servlet.
B
The spec teams have to do the analysis and make sure that they have covered the dependency relationships correctly.
A
Yeah, so for JSON-P, because of the platform validations that it does, from a dependency point of view, yeah, 100% correct: this scenario is not likely to happen with JSON-P. But for Batch...
A
Where Batch validates EJB, Enterprise Beans, that could happen, because it's an inverse dependency relationship: Batch would go final before EJB does, because EJB always goes final at the end, since it depends on the platform TCK.
B
So in the case of JSON-P, I think the main sort of technical issue we have to resolve here is how we are going to stage the JSON-P TCK so that it can run in the two required containers. For me, that's the fundamental technical problem that needs to get resolved today.
D
Right, so for me the question is this: CDI has done this for a while. It has a set of tests, and it relies on Arquillian to spin up the containers that it needs to test. So when you're testing the platform, it has already put the CDI implementation library into whatever platform is being tested, and then it uses Arquillian to deploy those tests. So the question for me, for EE 10 JSON-B and P, is: do we have a test framework that can do that?
C
Yeah, but this is what you just said, Scott: it requires Arquillian, right? And we actually didn't use Arquillian in our standalone tests, and that's the breakdown.
B
Well, I think this is how the validation works today: they take the tests in EE 9, they take the JSON-P standalone TCK, they spin up JSP, and they verify that the compatibility tests all pass. And what I think they actually do is interleave all the tests: running one in JSON-P, one in the servlet.
B
Then they run the next one in JSON-P, and then in Servlet. But in this case I think it'll be easier for the test development effort to just spin up Servlet, run all the tests, hopefully they pass, then spin up JSP, run all the tests, and hopefully they pass. So, if we can, you know...
B
I can't believe that JavaTest cannot do that, but I guess I would have to defer the answer to that question to the people that actually have expertise in that. For me, that'd be Guru or...
A
So, right now users set up the CDI and other standalone TCKs; they'll now run additional standalone TCKs with their EE implementation.
A
For the standalone TCKs that do contain the platform validation: when it comes to running the platform TCK, you'll only run what's in the platform TCK. The platform TCK itself is not, at least not for EE 10, going to wrap all the other TCKs that in the past we've always run separately.
A
The difference is that now there are new standalone TCKs that also have to be set up and run. And again, doing that integration as part of the refactoring was something we wanted to do, but I don't think we would have completed it; maybe we would have completed it by now, I don't know. We basically stopped pushing on that in mid-fall.
D
Sorry, I only see three possibilities for EE 10. The simplest, frankly, is just to keep the duplicate tests, and you have to run both. The next would be: is there any way that you can get the JavaTest harness to bridge to JUnit tests, you know, an external library of JUnit tests?
D
The third is that you have to update the JSON-P TCK to use Arquillian to spin it up. Longer term, the ideal thing would be that the TCK harness has the ability to run external JUnit or TestNG tests in the target containers, but that's going to require a lot of investigation; I don't think there's any possible way we can do that for EE 10.
E
So an option could be to create new tests in the current TCK, which would be implemented as a JUnit runner that runs the list of tests and somehow reports the results.
B
Traditionally, there are some TCKs that are just run directly against GlassFish, and then there are some that use a porting kit. As I understand it, the porting kit is basically a preliminary set of setup and configuration actions that are scripted to get the platform into the necessary state; then the TCK itself is run.
F
Basically, a porting kit is what we use in the CDI TCK case: we run the CDI TCK with GlassFish using a porting kit. But the scenario in the platform TCK with JSON-B and JSON-P, which we discussed a couple of times in the platform TCK meeting, is that vehicles are not supported: the EJB vehicle and the app client vehicle are not supported for a standalone TCK. That's the major issue we will face.
F
For JSON-P and JSON-B, there are vehicles defined. Scott, can you share the vehicle.properties for Jakarta?
A
Yeah, I'll do that in a second. Let me just highlight from the spec text: yes, it's there. And then I'll open up the...
A
Let's see, it's all on the agenda, but this is where I copied it from.
A
You didn't see them? Okay, let me just look; let me look in my... let's see.
C
It should be fine, because currently, for Jakarta EE 9.1, the JSON-B and JSON-P tests were part of the CTS, basically, right? I understand.
A
I'll share with you after the call what I've looked at; I'll look again and give you a link to it, but we can do that after this call.
A
I guess you're trying to eliminate the duplication, and that's great, but we just don't have the JavaTest harness integrated to use JUnit test classes.
A
That's something we don't have integrated. I could see running a standalone TCK, but it would just be like the user doing it, so that doesn't help. We're really talking about running them against the vehicles that we have set up with JavaTest, and we have this huge TCK that's been over 20 years in the making. If we bolted in JUnit, it would be a real... we talked about it, but it seemed like a problem we didn't need to solve. If we tried to solve it now, I guess you could run a JUnit test, but the JUnit test doesn't know how to deploy.
D
You do need the vehicle components, an EJB, a servlet, a servlet filter, etc., actually making use of the API under test; otherwise, in other words, you're not really validating that it integrates in that container. So it's not just as simple as running existing JUnit tests, unless they're already referencing those components, which is what CDI does: CDI has tests that are using EJB components to exercise CDI features.
C
To avoid the TCK cyclic dependencies, the ideal thing is that a given platform, I'm sorry, specification team can write: this is our expectation for JSON-B and P integration with Servlet, etcetera. But we don't have to deal with the Arquillian integration; the platform, or whatever higher-level container wants to validate the behavior, could just pull those tests in.
A
The tests will be duplicated tests then, you know.
B
Based on everything you're saying, I don't see how we can avoid it, and I guess I haven't heard anybody speak up with an alternative.
C
Covering everything that was in the previous JSON TCK?
C
So why can't we, in this case, just require in the user guide for the platform that the implementation passes the standalone tests, which can be found at that URL, and have in the platform tests just one, or maybe several depending on the spec, simple tests which test that JSON Binding is accessible in this container?
B
So that's what the tests do now, right? The tests explicitly run against each container, and I think what's been stated in this discussion is that it's not possible to pre-configure that so that each test in the JSON-B test suite will be run for each of those four containers.
C
No, I mean something different. What I meant is that we have a bundle of split-out tests, standalone tests, and they are not run as part of the platform tests. As part of the platform tests, we have just one test, which tests that JSON-B is accessible, is callable, in the servlet container. That's it.
D
Dmitry, there's not, like, a requirement for manipulating EJB method parameters or anything like that type of stuff? It's just strictly...
B
Well, that's what the porting kit would do, but just a minute ago Scott said that we couldn't... well.
B
I don't think that's it. Scott, when you said that you can't run those vehicles, what did you mean? Could you just clarify that?
A
Sure. So when you run a test, you already have an app server up and running, and you basically are invoking an Ant script that calls into the user's deploy; it's like a Java class in the porting kit that does that.
A
It does a deployment, and it's told this is the particular deployment; the deployments are all on disk, and the porting kit script that gets invoked can do whatever it likes to handle the deployment, and it can add things like vendor deployment descriptors at that point, everything that's needed. So basically, I guess, for these JUnit tests, they would have to...
A
They would just... you know, the JavaTest harness just throws an exception, basically, so a higher level gets an exception.
A
So in the case of JUnit, we would just have to go by the exit code and throw some fake exception to say that the test failed. In terms of results collection, the results are just kind of on the disk in an unknown location, and that's kind of not an issue.
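[Editor's note: the exit-code bridging idea discussed here could look roughly like the sketch below. This is a hypothetical illustration, not actual harness code; for self-containment the external command is the JVM's own `-version` flag, standing in for a real JUnit console-launcher invocation against a standalone TCK.]

```java
import java.io.IOException;
import java.nio.file.Paths;

public class ExternalRunnerSketch {

    // Hypothetical bridge: run an external test process (e.g. a JUnit
    // console launcher against a standalone TCK jar) and translate its
    // exit code into a simple pass/fail result, as suggested above.
    static boolean runAndCheck(String... command) {
        try {
            Process p = new ProcessBuilder(command)
                    .inheritIO()      // surface the external runner's output
                    .start();
            return p.waitFor() == 0;  // exit code 0 means the tests passed
        } catch (IOException | InterruptedException e) {
            return false;             // treat launch failures as test failures
        }
    }

    public static void main(String[] args) {
        // For illustration we invoke the JVM itself; a real bridge would
        // run the JUnit launcher for the standalone TCK instead.
        String java = Paths.get(System.getProperty("java.home"), "bin", "java").toString();
        System.out.println(runAndCheck(java, "-version"));
    }
}
```

The harness side would then raise its usual test-failure exception whenever `runAndCheck` returns false, so results still flow through the existing reporting path.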
C
But just hang on a second. We keep talking about JUnit, but the idea is that we use JUnit only in the standalone tests, and in the platform tests we just have one test, which is not JUnit but JavaTest harness based, yes. And we actually have a bunch of these tests already; we delete everything and keep only one. Okay, yeah.
A
What Dmitry is saying is what can work now, and I think, Scott, you said it as well: just have some platform...
C
It's out of scope, out of the scope of this meeting. So we really need to decide what to do with the JSON Binding and JSON Processing tests, which are already JUnit 5 based, right? And whether we agree that the solution is that we keep the standalone TCKs JUnit 5 based, and in the platform we just keep a minimum set of tests, ideally one or two, which just test accessibility of the spec within the container. We already have a full set, but we would reduce them to a minimum number of tests.
B
I can only guess why the requirements state what they state, but I think the goal of the requirement is to assure that within each of these containers, regardless of how they are implemented within the container, all the requirements of JSON-P are met; let's just use JSON-P for now. So the tests right now will run each of the TCK tests against each container, and so whether each container has a completely independent implementation or shares a common implementation, those tests will verify that the container meets the requirements. And so, Dmitry...
B
What you're suggesting sort of has an implication on how the platform is implemented, and I guess I would rather we didn't go down that road.
C
I'm making some assumptions, yes, but in general: if we require the JSON-B implementation which is used in the platform to pass all the TCK tests, and the JSON-B specification itself doesn't have any dependency requirements on, and doesn't use, any other specifications used in the platform, which is true, then it would be sufficient.
C
I think it would be sufficient that we declare in the user guide that the platform implementation of JSON-B should pass the standalone tests, which are JUnit based, and that it passes the minimum number of platform tests, which just test that JSON-B is accessible, because JSON-B doesn't use any additional features and doesn't depend on other specifications.
E
Oh yes, there is, at least for JSON-P. There is one test in the platform which is very specific to it, and with JSON-P we had some issues passing it in the past. It basically verifies that a user-provided implementation, in a web app for instance, through WEB-INF/lib, can be loaded and used instead of the server-provided one.
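[Editor's note: the plugability test described here exercises provider discovery of the kind `jakarta.json.spi.JsonProvider` performs, where a `ServiceLoader` lookup prefers a user-supplied implementation and falls back to a default. The sketch below illustrates that lookup order with a hypothetical `JsonProviderSpi` interface standing in for the real SPI, so it runs without any Jakarta dependencies.]

```java
import java.util.ServiceLoader;

public class ProviderLookupSketch {

    // Hypothetical SPI, standing in for jakarta.json.spi.JsonProvider.
    public interface JsonProviderSpi {
        String name();
    }

    // Default provider used when no user-supplied implementation is found,
    // analogous to the server-provided JSON-P implementation.
    static class DefaultProvider implements JsonProviderSpi {
        public String name() { return "default"; }
    }

    // Mirrors the lookup order the plugability test exercises: a provider
    // registered via META-INF/services (e.g. bundled in WEB-INF/lib) wins;
    // otherwise the default is returned.
    static JsonProviderSpi lookup() {
        for (JsonProviderSpi p : ServiceLoader.load(JsonProviderSpi.class)) {
            return p; // user-provided implementation takes precedence
        }
        return new DefaultProvider();
    }

    public static void main(String[] args) {
        // No provider is registered in this sketch, so the fallback is used.
        System.out.println(lookup().name()); // prints "default"
    }
}
```

In a container, the class-loader hierarchy decides which registrations are visible, which is exactly why this behavior has to be verified per container rather than once standalone.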
A
So, for the validation of the platform requirements, we have one proposal, which is to have a minimum number of tests, which in some cases could be just the signature tests, in each of the vehicles. And I think for the case where we already have platform tests that are doing more than that...
A
If we went to the minimum, whether that's only a signature test or some small subset of the current platform tests, we would, as Ed said, be losing some testing. Is that a platform discussion that we should just have separately from this, in terms of driving which tests we remove or don't remove from the platform TCK for JSON-P and B? Is that...
D
The platform team certainly has to say: well, are you testing such-and-such? If that's a requirement, okay, how are we testing it? They have to understand that. So again, to the point: that's why I was trying to understand what exactly the requirements are, whether we're just requiring that I can invoke the API.
D
That's one thing, but, as Lukas said, you do want the ability to customize the behavior of the API with classes that are coming from the container-level deployment: something that's been bundled with an EJB jar, or been bundled in a WAR, etc.
A
I mean, personally, I'm in favor of keeping tests in the platform, only because I've seen many cases where it helps implementations find bugs that they otherwise wouldn't have found if those tests weren't there. I've mentioned this example before, but say you have a servlet-controlled user transaction that invokes persistence context operations over multiple invocations, and a different thread is used, and you find some bug that you otherwise wouldn't have found.
C
Yeah, and I tend to agree with you. For specs like Servlet, it's definitely better to have all the TCK tests in the platform TCK, because Servlet basically integrates with many things, but for JSON Binding and JSON Processing...
C
I'm not sure about that. And just remember that the convenience of standalone JUnit tests is that when you make your implementation, you can run the TCK as part of your Maven build and easily detect problems.
C
So it's kind of a trade-off, as I understand it, and we need to solve it somehow, because the release date is close. This is exactly it: this is a blocking issue for these two specs, so the sooner we solve it, the sooner we submit the specs for a ballot.
C
Yeah, I would agree with that. I'm just not sure how we make this decision, because Ed, for instance, disagrees, right?
D
Well, no, I think that's the proposal that we take to the platform: we have Scott representing the platform TCK, we have others representing the implementations, and we have people representing the specs. Can we just make that decision? This is the proposal, and we decide whether it's acceptable for EE 10.
A
The big change we're making for the refactoring is switching to a multi-module Maven project, moving off of Ant and the porting kit, and being able to integrate with JUnit or TestNG; we didn't actually pick one, but the idea was to basically start creating the same...
A
...the same TCKs standing in the platform TCK that could be easily extracted into standalone TCKs with the platform integration already ready to go, if we wanted to split them up, because we'll already have all the standalone TCKs almost ready to go; they just happen to live in the platform TCK repository. That's kind of the idea, and that should integrate more easily with separate TCKs as well.
D
Okay, so my proposal would be that we have the spec team validate that some subset of the existing platform TCK tests are really just implementation tests that are now...
D
...superseded by the standalone TCK and can be removed, and that whatever integration tests remain are not duplicate tests; they are additional tests dealing with the platform-level requirements. So we're going to have to run the standalone TCK, plus those integration tests, sans the tests that have been superseded by the standalone TCK.
B
So I think for the platform test requirements we will run the standalone TCK, and the platform TCK will contain some tests that are copied or ported from the extracted... what did you call them, the plugability tests? Those will actually continue to exist inside the platform TCK project and be tested against each container. Is that right?
B
I mean, we've just got to port the tests back.
C
So I'm reading these bullets, and this is something different from what you're talking about: "standalone TCK will contain tests extracted from the platform TCK, like JSON-P plugability tests." Yeah, it can, but it should not, right? So it's kind of up to the spec team. For instance, in JSON Binding we have new functionality which we added to the standalone, I mean the extracted, tests only at the moment, and it doesn't exist in the platform TCK, because we are not testing anything platform-related.
E
No, well, in the code there are no changes, but the difference is really the behavior of class loaders in each container, and the availability of implementations and stuff like that. Because within the web container, this test basically adds a custom implementation of JSON-P into the deployment unit and expects that what's provided in the WAR is used instead of the server-provided one; and this is different from the behavior in Java SE.
C
What I'm trying to say is that there are platform tests which should test interoperability; in the case of your, how is it called, plugability test, it should be in the platform tests, and there may be something else. And there are standalone tests which are testing your implementation, the standalone TCK, and in the case of JSON-B...
C
Your plugability test should be there too, and the standalone tests should contain a bunch of other tests which are testing that the implementation is compatible with the specification, and that bunch of tests should potentially not be a part of the platform tests.
E
In my opinion, it doesn't have to be. From my own personal perspective, what's important at the platform level is to execute the signature tests within each container, to execute these plugability tests in each container, and probably one or two simple tests, and that's basically it. But that's just my opinion.
A
I guess the question was, yeah, there are two different options here. The question mark is either that the tests need to be identical in both, or just that the platform TCK needs to be synced and updated to contain the updated API usage.
D
At the container level, that's what I need to test. All the other details have already been tested; there's no point in retesting them within the container, unless there's an expectation that things like a security context or transaction context or persistence context are going to affect it, and that's not the case for JSON-P and B, because they're relatively focused specifications.
B
If there weren't, then we already have the tests we need in the platform TCK, and we don't need to copy anything back. If there have been changes since it was extracted, if they're doing something different to verify compatibility, then whatever those modifications are, they need to get back into the platform.
B
So, if that's the case, then we just delete all the other... you know, we just disable all the other tests, run the standalone vehicle, and then run the plugability test in each of these vehicles, and I think we're done.
B
If you want to make a note that we need to remind ourselves that those plugability tests are shared between these two TCKs, that wouldn't be a terrible thing, because eventually one of them will change; most likely somebody will add a new plugability test in the standalone TCK, and we want to make sure that gets reflected back. These test suites need to remain synchronized.
B
So I'm good with that for JSON-B and P. I don't know, Scott, I don't know about CDI, or do you have it?
B
Well, see, the thing is, Scott, the concern I have is that in many cases the component specs don't necessarily have an intimate understanding of what the platform requirements are, right? So I don't think we can entirely rely on the component teams to do that analysis, right? No.
D
Now, whoever wrote the integration tests needs to verify that there aren't additional integration tests required for a given update to a component specification.
B
Anyway, okay, we're over time, and I think at least I have to get on to another call.