From YouTube: IETF109-DISPATCH-20201116-0500
Description
DISPATCH meeting session at IETF 109
2020/11/16 0500
https://datatracker.ietf.org/meeting/109/proceedings/
A: All right, we're at the top of the hour, so let's start with a few of the preliminaries while we see who drifts in. Welcome: IETF 109, I guess, is underway here with the first of the working group sessions for the week. My name is Patrick McManus. I've got my co-chairs here, Ben Campbell, and an incoming co-chair, Kirsty; we'll talk about that in a moment as we go. Here's our agenda for today. We can have just a look over it, but from the top we of course need a note taker for this session. Bron, I believe, offered to do that on the mailing list. I'm going to need a note taker before we get going, so if Bron isn't here yet, would anyone else be willing to pitch in? I know it's easier to avoid eye contact in this virtual format than it is in the room.
A: Ah, in Jabber; all right. With those preliminaries out of the way, you can see our agenda in front of us. We're going to have a short discussion from our ADs, and we have two dispatch items to talk about. As a reminder, the purpose of DISPATCH is really to be a friendly and welcoming place for work that is new to the IETF, and to help it find the permanent home where it will really be developed. This working group itself very rarely adopts a draft. The more likely and desirable outcomes are to find an existing working group for a draft, to discuss whether perhaps a mini working group would be in order, or perhaps a BoF would be in order, and to make a recommendation amongst those things, or even perhaps a recommendation not to proceed at all with the work, to the ADs, who of course get to make the actual decisions.

A: This is a joint meeting with the ART area, which means we'll also highlight some other stuff going on in the ART area of the IETF this week, and have an FYI discussion of work that may be even too early for being dispatched, but is of interest and could take feedback from this audience as well; Alexey will be presenting on media types, or rather content types, to distinguish between forwarded and encapsulated encrypted emails, I believe. Okay, a reminder of our mailing lists: art@ietf.org and dispatch@ietf.org. And we don't have our Note Well in these slides; I'm sorry, that's my oversight, but I will refer everyone, especially given that this is the first meeting of the week, to look up "IETF Note Well" in your favorite search engine. Those are the important terms under which your contributions to the IETF are governed. A contribution is almost anything you do that interacts with other folks in a semi-public IETF context, so that could be a mailing list, that could be this meeting, and those rules cover things related both to IPR and to your code of conduct during these meetings. So if you're not familiar with that, please do take a look. All right, that's the usual getting-started stuff out of the way, and with that we can move over to our first speaker, which I think is going to be our ADs.
C: That will be me, and you should have my video now, so hi. Yes, everybody, welcome to the online IETF 109. For some people, like our Asian friends and the down-under people, it's a good time zone; for some of us it's going to be an interesting week. But I'd like to start by... Most of you know that fairly recently our beloved colleague and good friend, Jim Schaad, died, and I'd like to just take a moment of silence to recognize Jim and his contributions: the working groups he's chaired, the documents he's edited, the smiles he's shared, and the wine he's made. So, a bit of silence for Jim.

C: Thanks. I won't do the full minute for that, but we all remember him fondly. Sorry. The next thing is a bit of happier news: we have a new chair, as many of you also know. We've been interested in rotating chairs in this particular working group, to give different people a shot at having something to do with bringing new work into the ART area and being more exposed to things in the ART area. So Ben is rotating out, and Kirsty Paine from the UK National Cyber Security Centre is rotating in, and she will be our new chair starting at this meeting. As we speak, right now, I am clicking the button to update the datatracker for Kirsty, and at the end of the meeting, at the end of the week, I will remove Ben from the chair list. And there you have some video from Kirsty. Kirsty, say hi.
D: Hi everyone, yeah, it's nice to be here this morning. It's nice to see you all, thanks! I'm looking forward to chairing DISPATCH.
A: Awesome, Kirsty, that's great! I don't want to overlook thanking Ben either, who's really done the yeoman's work here the past few months, which I've really appreciated, and you've left us in a great place. So thank you, Ben, for your efforts. We can do sort of the Meetecho golf clap; I'm sure everyone who's muted is showing appropriate appreciation, but sincerely, Ben.

A: Yes, yeah; hi, hello. So I'm going to show your slides, because disaster always happens if we don't have the chairs do that, but when you want a new slide, just shout out "next slide." Okay?
A: That's up to you, but I think the trend is towards more video, and I personally think it's nice. Would you like questions during your talk, or questions at the end?
E
Questions
during
is
fine.
If
I'm
not
able
to
answer
the
question,
I
will
ask
until
I
get
through
the
slide
and
then
answer
them,
but
questions
during
the
talk
is
fine
and
I
am
assuming
I
have
about
half
an
hour
according
to
the
agenda.
Is
it
correct.
A
Including
the
question
time,
that's
correct
and
ben
and
christy
will
monitor
the
the
queue
and
they'll
just
like
pipe
in
directly
with
any
questions
that
need
to
be
addressed.
So
with
that
take
it.
E: Thank you. Great, thanks, Patrick, and hi everyone: good morning, good evening, and good afternoon if you are in a time zone where it is afternoon. My name is Yeshwant Muthusamy, I work for Immersion, and I'm going to be presenting our I-D on a proposal for haptics as a top-level media type. Next slide, please, Patrick.
E: It's used widely in a number of consumer devices, for when you're using your virtual keyboard or getting any kind of vibrating notifications, and of late, in many cars, the dashboard buttons are usually replaced by a large touchscreen that provides a bidirectional touch interface. The thing about haptics is that it requires some kind of actuation in order to create that tactile sensation. In a mobile device that actuation is provided by small vibrating motors, or actuators; in larger or automotive touchscreens, these actuators are piezoelectric materials. And haptics these days seem to be everywhere. The most recent example is the Sony PS5 DualSense controller, which has just been released and is getting a lot of good press because of its haptics. Pretty much all of the smartphones on the market have haptics in them, and rightly so: haptics is considered pretty essential for enhanced media experiences.
E: Next slide, please. Okay, this is just an indication of the differences between version 00, which has been on the dispatch mailing list since the middle of October, and version 01, which I uploaded a few hours ago. I apologize for this late-breaking update, but until late yesterday evening I didn't know I was going to be presenting here. Thanks to an email snafu at our end, I wasn't aware of this presentation, so I kind of had to scramble and get the new version out. The new version tries to address all of the comments we received on version 00 on the dispatch mailing list. Specifically, I've added prose to dispel the inadvertent misconception, which some people seem to have gotten from version 00, that the following haptic subtypes were already in use.
E: That is not the case. The haptic data formats, like AHAP and OGG and IVS and HAPT, are indeed widely in use, and we expect them to live under the haptics top-level type once it's approved, so I made that point clearer. And based on feedback from more than one person, I've added sections on the subtype registrations for haptics/ivs and haptics/hapt, just to illustrate how the subtype registrations would be handled once we have haptics approved at the top level. I've also added a number of new references that show the progression of the standardization of haptics in MPEG, and the availability of documents for the various haptic data formats that we envision living under the haptics top-level type. So those are the basic differences. If people have read version 00, I just want to point out that my presentation is going to be based on version 01, and this slide gives you an indication of what is different, which you may not have gone through. Next slide, please.
E: Okay. In the next series of eight slides I basically go through various justifications for why we believe haptics needs to be treated as a top-level type; that's what you're going to be seeing. The first slide here is basically more of an introduction, so to speak. Haptic signals provide an additional layer of entertainment and overall sensory immersion, and the user experience and enjoyment of media content can be significantly enhanced if you add haptics to the audio and video content in ISO [base media format] files.
E: To date there hasn't been a registration of formats for haptics, but starting in April, haptics was proposed as a first-order media type, which means that it's at the same level as audio and video tracks in the ISO base media file format. This proposal that we made was accepted by the MPEG Systems file format group, and it has since progressed to DAM, which is a draft amendment, at the October meeting (which ended October 16th), after passing a worldwide CD ballot. So it is moving on to the next stage in the standardization, and if it progresses at this pace it's expected to issue as an amendment, which would be the equivalent of an international standard, in October 2021. When that happens, it will become part of ISO/IEC 14496 part 12, sixth edition. And this is a question we get asked often: what is going to be the designation for haptics in MP4 files, given that video/mp4 is already taken and audio/mp4 is already taken? We do realize that, and so if an MP4 file has video, audio, and haptics, we are not going to mess with the designation; it's still going to be video/mp4. The same goes for a file that has audio and haptics and happens to be a .mp4 file; we are not going to change the designation. But we do envision MP4 files with just haptic tracks in them. Examples of such files would be MP4 files used for streaming games, or haptic files for haptic vests and bodysuits and belts and gloves, and for those files we envision the designation haptics/mp4. Next slide, please. Okay.
E: Vibrotactile, which is touch vibration, is just one of them [the haptic submodalities]. The force feedback which you will feel in the Sony PS5 DualSense controller is called kinesthetic haptics. Surface haptics is for friction, where you can actually get the feel of different textures: you can get the feeling that you're actually touching rock, or soft silk, or a sharp object, because of the surface friction that you feel on your fingertips. Then you have spatial non-contact haptics, which is based on ultrasound, and last but not least you have thermal haptics. So these are all the different submodalities of haptics that you need to take into account, and designating haptics as a top-level media type would allow us [to cover them all].
E: Next slide, please. And this, in my opinion, is a pretty compelling reason; your mileage might vary. For the human sense of hearing, we have the top-level media type audio. For the human sense of seeing, we have the top-level media type video. And for the equally important human sense of touch, I think it makes perfect sense to have the top-level media type haptics. And more to the point...
E: Okay, and there is a considerable amount of commercial uptake, as I mentioned in passing earlier. It is rapidly becoming a standard feature of many CE devices, and these bullet points basically give you an idea of the extent of the uptake. iPhone, as you see here, is over 191 million (this is data from last year, the last year for which full-year data is available), and it does have native support for haptic encoded data, the AHAP format, as well as the Core Haptics API. The W3C has got an HTML vibration API, which provides haptic support in mobile web browsers. I've already mentioned the game consoles, and those remain a very popular CE device, with close to 40 million units sold. And then XR is an upcoming market, helped in large part by the first version of the Khronos OpenXR haptics API. So haptic media is expected to be commonly exchanged between these devices, and these devices in total represent the majority of CE devices around the world.
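[Note: the W3C Vibration API mentioned above is a one-call browser interface. A minimal TypeScript sketch; the pattern durations are arbitrary example values, not from the talk:]

    // Vibrate 200 ms, pause 100 ms, vibrate 200 ms (durations are arbitrary).
    if ("vibrate" in navigator) {
      navigator.vibrate([200, 100, 200]);
    }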
E: And haptic data formats are already in use. These are prevalent in a large number of devices around the world, and they would live as subtypes under the proposed haptics top-level media type once it's registered. The AHAP format, which is the Apple Haptic and Audio Pattern data format, is the de facto standard encoding on all iOS devices and iOS-connected game peripherals, and of late it has seen usage and adoption beyond Apple devices as well. Very recently, in Android 11, Google introduced a proprietary extension to the OGG format specifically to enable the encoding of haptic media in OGG files. The IVS haptic data format is a vendor-specific format, as of now, that is in use in mobile phones from LG Electronics (a number of models like the V30, V40, and the newest V50 that are sold worldwide) and in gaming phones from Asus, especially the ROG, ROG II, and ROG III, which are also available worldwide.
E: Now, these subtypes haven't yet been standardized, but they live in an informative annex that we made as part of our MPEG ISO BMFF proposal, and we proposed the following distinct 4CC codes for them. Once again, these codes are not registered yet, but the standardization pertaining to them is ongoing in some cases, and the plan is to standardize these haptic coding formats in the near future. The first one is HMPG: this would be the 4CC code for the coding format that is chosen as the winner of the recently issued MPEG call for proposals on the coded representation of haptics. This draft call was issued at the October meeting, MPEG 132.
E: [HAPT is a] haptic coding format that's based on MIDI, and HAVC is an audio-to-vibe haptic coding format, which basically refers to the growing area of automatic A2V, or audio-to-vibration, conversion algorithms. Once standardized, these formats will live as subtypes under the proposed haptics top-level media type. Next slide, please.
E: Okay. We got this feedback when we first put it on the media-types list, and from talking to others as well, but in our view haptics really doesn't belong under any other media type, and we see the following three reasons as the main reasons why the haptics media type does not fit under the application top-level type. Haptics connects to a sensory system, touch or motion, directly, and is more specific than the abstract application type. Application historically has been used for applications, that is, code, which means it's viewed and treated with great care for security; haptics is not code, in much the same way that audio and video are not code either. And haptics is a property of a media stream; it's not an application under any normal definition.
E: This is a somewhat involved slide, but I felt it necessary to make these points, because we were told, as part of the instructions for RFCs, that we need to be very explicit. So I'm going to just touch on the highlights here. Haptics data can be represented as collections of signal data, or it can be represented as descriptive text in XML or JSON or a similar format. Signal data is typically not executed by endpoint processors, so there is not much risk there. But descriptive text is parsed and represented in memory using XML data structures; this data is then further utilized to construct one or more signals, which are then sent to the actuation hardware. And because of the media-rendering nature of the data path for haptic coded data, you can think of the security profile of haptic data as pretty consistent with the security profile of visual or audio media data.
E
So
there
is
some
security
risk,
but
it's
no
more
or
no
less
than
the
security
risk
that
you
need
to
be
aware
of
when
you're
dealing
with
audio
and
and
visual
media
data,
and
then
media
rendering
systems
are
normally
implemented
with
a
mix
of
user
and
kernel
space
execution,
since
this
media
must
ultimately
make
their
way
to
a
hard-based
system.
E
So
in
theory
you
could
have
a
you
could
have
malicious
instructions
that
are
present
in
the
descriptive
haptic
media
and
those
could
execute
arbitrary
code
in
kernel
space
effectively.
The
high
passing
system,
permission,
structures
or
execution
sandboxes,
and
so
you
so
it
all
is
so
haptics
audio
video
and
media
data,
widespread
use
and
careful
attention
need
to
be
paid
by
the
operating
system
and
device
driver
implementers
to
ensure
that
the
synthesis
and
rendering
parts
do
not
provide
attack,
surfaces
or
malicious
payloads.
E
Any
coded
representation
of
media
haptic
media
is
insufficient
to
implicitly
provide
sufficient
security,
and
this
protection
should
really
be
enforced
by
the
os
implementer
next
slide.
Please.
E: [Next] are the IANA considerations. Here, when you say that haptics is a primary media content type, it implies that the content defined by it requires certain haptic subsystems, such as the low-level haptics APIs, and those in turn require hardware capabilities, such as one or more actuators or vibration motors, to actually render the haptic media.
B: Okay, on timing: there are 20 minutes left in the slot; please be sure to leave room for [questions].

E: Sure, I'll just go a little fast.
E: Okay, these are the last two slides, I believe. These just give examples from the I-D: the haptics type, where the subtype would be ivs and the file extension would be .ivs or .ivt; this is the device-independent haptic effect coding. The rest of that part I will let you read on your own time. I will point out one thing: there is a...

E: This is the HAPT format, where the file extension would be .hapt; this is a device-dependent haptic effect coding that's based on the RIFF standard, the Resource Interchange File Format standard. And that's pretty much all I want to say about that, and I believe that brings me to the last slide. Next slide, please. That's it.
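[Note: as used in the talk, the proposed registrations would look like the following; these were proposals at the time, not registered media types:]

    haptics/ivs   (extensions .ivs, .ivt; device-independent effect coding)
    haptics/hapt  (extension .hapt; device-dependent, RIFF-based)
    haptics/mp4   (an ISO BMFF file carrying only haptic tracks)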
G: You don't let them in directly anymore; just tell them to go, and they unmute their own audio. Tim, go ahead when you're ready.
H: I just wanted to hear a couple of scenarios where you'd send haptics data over the wire all by itself.
E: Okay. I think I've given a few examples there, but in the case of streaming games, if you wanted to send the haptic data by itself to the controller, it will not have any audio or video, because it's mainly meant for your controller, which most likely deals with kinesthetic feedback. The other time you can expect haptics to be sent over the wire just by itself is if you had a wirelessly connected bodysuit or a vest, for example. This might seem a bit far-fetched, but in reality there were at least three or four companies at last year's CES which were actually showing these kinds of haptic bodysuits, with between 12 and 24 different [actuators].
I: ...RFC 6381, so the codecs parameter. For example, we just use [it with] ISO base media file formats, and you have the 4CC [codecs parameter] for haptics, which already allows you to say "this is a container file with these haptics types in it." But do we have a direction the other way that we need to do something about?
E
Okay,
I
mean
I
will
have
to
admit
to
not
having
gone
through
6381,
but
if
you
are
referring
to
haptic
codec
specific
parameters,
we
do
envision
those
being
being
standardized
once
the
mpeg
cfp,
which
deals
specifically
with
an
mpeg
coding
format
standard
once
that
standard
is
finalized
in
the
in
next
april
or
in
the
june
that
cut
time
frame.
So
I
would
at
that
point
I
would
envision
6381
to
be
needed
to
be
updated.
I
Okay,
yes,
look
at
it,
let's
consider
if
it's
we're
gonna
need
to
do
something
in
the
other
direction,
also,
actually
updating
that
format.
If,
if
this
happens,
okay,
thank
you.
Okay,.
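[Note: RFC 6381 defines the "codecs" parameter carried on media types such as video/mp4. A hypothetical example of how a file with video, audio, and a haptic track might be labelled once a haptic 4CC such as the proposed hmpg were registered; the haptic entry is illustrative, not a registered value:]

    Content-Type: video/mp4; codecs="avc1.42E01E, mp4a.40.2, hmpg"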
A: So this might be a good time to jump in and remind folks to try and be responsive to the fundamental dispatch questions we're dealing with, as well as the merits of the proposal. If we get fairly nuanced questions, this might not be the right venue to work them through; actually, what we're trying to do is find the right venue. So please be responsive to that question as we explore this.
J: Hi. Think about applications that use media types: the idea that you could say haptics/blah instead of application/blah or audio/blah. You have to make up some new MIME types for the individual formats, because audio doesn't carry them; there's nothing [haptic] in their character. But think about generic applications that know that, if something is audio/*, audio, they can do something with it...
E: Okay, yeah, I can respond to your questions and comments. The first point is: haptics is not typically encoded in the same way as audio. And point number two is that the whole point of our MPEG ISO BMFF proposal, which has progressed to the DAM stage, the draft amendment stage, is to treat haptics as a first-order media type at the same level as audio and video, and we actually have a working implementation of our proposal that works on both iOS and Android devices. That shows that you could have an MP4 file which now has three tracks. And as far as your point about the audio tools: all of those tools ingest MP4 files. Now, it is true that once you have haptics as part of the ISO BMFF standard, they are going to be a third track in an MP4 file, and there might need to be changes made in the media player engine...
E
That
now
recognizes
the
hacked
box
in
in
iso
bmf
in
the
iso.
Bmf
header
and
make
allowance
for
that,
but
that's
just
the
progression
of
the
standard,
it's
no
different
than
the
next
version
of
a
any
other
and
mpeg
standard.
Now
that
being
the
case,
the
the
the
third
point
I
would
make
is
haptics
is
or
in
many
cases
like,
if
you
take
the
ahap
format,
they
have
for
format.
If
you've
ever
looked
at
an
ahab
file,
it's
actually
a
bunch
of
json.
E
So
it's
a
very
it's
a
very
descriptive
file
format
that
basically
says
at
what
offset
with
respect
to
the
audio
timeline.
Do
you
want
to
play
out
a
haptic
effect
which
could
be
a
short
click
or
a
or
a
buzz
or
some
kind
of
a
transient,
and
that
and
at
what
percentage
strength
you
want
that
transient
to
be
played.
E
So
those
kinds
of
things
you,
you
can't
really
do
with
a
with
an
audio
format
and
it's
for
those
reasons
we
didn't
want
to
shoot
on
a
haptics
into
any
of
the
existing
or
audio
audio
formats.
We
felt
that
you
could
get
the
most
flexibility
and
benefit
from
the
growing
field
of
haptics
if
it
were
its
own
top
level
type.
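[Note: for readers who haven't seen one, an AHAP file is JSON of roughly this shape. This sketch is paraphrased from memory of Apple's published format and is illustrative, not authoritative:]

    {
      "Version": 1.0,
      "Pattern": [
        { "Event": {
            "Time": 0.5,
            "EventType": "HapticTransient",
            "EventParameters": [
              { "ParameterID": "HapticIntensity", "ParameterValue": 0.8 },
              { "ParameterID": "HapticSharpness", "ParameterValue": 0.4 }
            ]
        } }
      ]
    }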
A: Thank you, Yeshwant. Do you have an opinion on how you think this could be best dispatched, or are you really in a position where you need advice on that?
E: That seems like you are asking me if I know the procedure in the IETF; I would be looking to you for that. What would you consider to be the next step? We have made a case, and we have addressed all of the comments we have gotten so far.
A: I really appreciate the thorough presentation. I think I asked the question just to make sure to refocus the conversation we're having on that dispatch question, and to make sure, if you had an opinion on that, as some folks do when they're making their presentation, that it was clear to the group. But I want to make sure...
E: My question back to you (I'm sorry to respond to a question with a question) is: what is needed to convince this group that this is a viable top-level type, and does the group think there is something else missing?
A: Fundamentally, that will be driven through IETF consensus, and the purpose of this exercise here is just to get it into the room that will start forming that consensus; we will not reach consensus in DISPATCH on the merits of it. So, Jana, you're next. If you can speak to this part of the question, I would appreciate it; we've only got about six minutes left.
K: Okay. I was going to ask, and this is probably a question for the area directors or the other dispatch chairs: given that this draft has gone through one iteration, and there's been, presumably, some engagement on the mailing list (that's what I gathered; I've not paid attention to that), are there other folks besides Immersion Technologies that are interested? Presumably there are; and are they willing to come here and engage at the IETF? The reason I ask this question is because, to me at least, where this lands kind of depends on who the other interested parties are. I can read the draft, and to me it makes sense to have haptics as a high-level thing, but it doesn't matter, because I'm not the one building this stuff. So I'm curious to hear: are there others who have expressed interest in having a standard on this, and have they expressed an interest in actually building to the standard? Because, ultimately, what you're trying to do is generate [consensus] among people who are engaged on the details of the draft, and for that you'd need people who are not just interested on the side, but are actually implementing and deploying this stuff. Are there people like that?
E: There is actually an IEEE effort which is ongoing; as I mentioned, it's P1918.1.1, which is a haptics codec for the tactile internet. That working group is actually in the process of finalizing a vibrotactile coding standard as well as a kinesthetic haptic coding standard, and there are a number of academic institutions as well as companies who are active there and are actually implementing those two standards. Apple has been quite active in MPEG ISO BMFF; in fact, the chair of the MPEG Systems file format subgroup is David Singer, he's from Apple, and Apple has their own haptic format. They have actually been quite active, because they have a big business interest in haptics, so they've been quite active in creating this MPEG CFP which I mentioned, the call for proposals. Now, outside of that, a little bit of anecdotal evidence maybe, but we, Immersion, founded...
A: ...So Stephan just jumped into the end; that'll have to be the end of the queue. I'm going to ask people to be real quick. I think we're going to be okay on overall agenda time, but we do need to wrap this up. So, Spencer, you're next.
M: Thank you. So the thing I'm seeing in the chat is basically, you know: this is a top-level registration, so maybe a working group, or a short-lived working group, is appropriate for that. Does anybody disagree with that? Then I'll get out of the queue.
N: (Are you with me, Richard?) Hey, yeah, it's just busy granting permission prompts. Real quick comment here: I think this is a very natural AD-sponsor. I think we should not waste the bandwidth of a working group on this; IETF last call, and the ensuing discussion, will be plenty to nail down any of the questions that have been raised here. That's it.
L: I think the top-level type makes sense, so I do support that. On the question of where to dispatch to, and whether there is any point in a working group, I think the relevant question is: what do you plan to do beyond the top-level type? Do you plan to have any of these elementary stream formats standardized within the IETF, or do you plan to have payload formats in AVTCORE carrying these, or do you plan to describe them in SDP, and then MMUSIC?
E: Okay, I can try to answer that. At least one of the four formats we mentioned is not going to be under the IETF; it's going to be done by IEEE. And the second one, the HMPG, will probably fall out of the work that's ongoing in MPEG, which leaves the remaining two, the [HAPT] and the HAVC.
L: And one personal comment: is there any format that supports spatial haptics encoded along with the video stream, so that you have spatially and dynamically variant haptic fields? Is that standardized anywhere?
E
Not
yet,
but
but
it's
interesting,
you
should
bring
up
spatial
haptics,
because
that
is
part
of
phase
two
of
our
mpeg
cfp
and
it's
so
under
the
overall
umbrella
of
mpi
or
mpeg
in
immersive
media.
So
once
phase
two
of
the
ccfp
is
over,
we
do
expect
something
on
spatial
haptics
to
be
standardized.
But
but
it's
probably
the
short
answer
to
your
question-
is
it's
too
early
to
say.
B
So
I'm
going
to
jump
in
real,
quick
with
pete's
comment
from
earlier
that
he
was
unable
to
make,
which
is
that
he
doesn't
think
a
work
group
is
necessary
that
a.d
sponsored
would
be
reasonable,
and
some
of
the
commentary
we've
seen
in
the
chat
is
along
the
lines
that
a.d
sponsoring
plus
calling
out
to
the
right
experts
as
a
way
to
approach
it.
D: So, two things. One is: I'm supportive of getting this top-level type done; that's my personal comment. One comment as [the] IETF liaison to MPEG: this activity is by no means the most active, but also by no means the quietest, that MPEG has. There is a significant amount of industry interest visible that goes well beyond one startup company. Thank you.
C
Yes,
I
I
think
this
is
the
the
the
primary
discussion
of
this
will
be
on
the
media
types
list.
I
think
we
don't
need
a
working
group
to
do
this,
but
I
think
I
I
do
want
to
defer
to
the
community
for
the
decision
on
that.
So
I'm
hearing,
mostly
people
thinking
ad
sponsored,
is
fine
and
I'm
good
with
that
and
murray
being
next
he'll
he'll
he'll
tell
you
whether
he's
willing
to
be
the
sponsor
or
not
and
he's
a
media
type
reviewer
anyway.
So
I'm
done.
O: Yeah, I'm happy to sponsor it, if that's the consensus on how the community wants to handle this. It's fine with me. Great.
A
So
the
chairs
are
huddling
in
our
chat.
I
think
we're
we're
hearing
that
that
is
the
most
common
view
and
are
leaning
towards
suggest
suggesting
that
as
the
rough
consensus
outcome
of
this
meeting,
does
anyone
want
to
stand
up
and
say
that
that's
doesn't
match
what
they've
heard.
P
Go
ahead,
colin,
so
yeah.
I
would
like
to
have
a
chance
to
speak.
Please,
as
I
was
in
the
queue
before
the
tool
dropped
me
for
the
second
time,
is
that
okay
yeah
go
ahead,
so
I
think
that
this
is
draft
is
going
to
require,
I
mean
so.
First
of
all,
I
think
we
should
do
this.
Okay,
no
question
about.
P
We
should
move
forward
with
some
sort
of
topical
type,
but
the
hint
of
magnus's
question
several
other
peoples
that
have
worked
in
this
space
before
has
been
like
there's
some
really
complicated,
tradeoffs,
because
nothing's
perfect
in
this
space.
The
audio
plus
video
is
already
a
disaster
to
explain
when
you
use
one
and
when
use
the
other,
and
this
is
going
to
expand
that,
and
I
think
that
it's
not
that
this
drafts
complicated.
P
It's
that
the
set
of
contradictory
rules
we've
already
set
up
in
this
space
is
going
to
be
complicated
and
there's
going
to
be
no
way
to
get
this
draft
to
meet
every
single
one
of
our
existing
rules
simultaneously,
and
I
think
that
that
complexity
is
one
of
the
reasons
that
I
would
argue
that,
given
we
already
have
very
experienced
media
type,
people
expressed
that
this
shouldn't
even
be
done
at
ietf.
You
may
not
want
to
have
that
whole
conversation
on
the
itf
last
call
list,
which
is
where
it's
going
to
end
up.
P
P
A: Ben, Kirsty, do you have anything you want to add to the wrap-up here? Is this a case where we need to consult the minutes and write a summary of where we're at?
B
I,
what
I
would
take
this
has
been,
I
think,
with
cullen's
last
comment
there
and
it
it
probably
makes
sense.
This
is
a
com.
The
dispatch
conversation
needs
to
continue
because
I
don't
think
we've
necessarily
reached
a
resolution,
so
we
probably
need
some
discussion,
but
among
the
authors,
chairs
and
area
directors
in
the
very
short
very
in
the
very
near
future
and
and
see
if
we
can
resolve
that.
B
But
I
don't
hear
a
resolution
right
now
between
80
sponsored
and.
A
Presumably
a
new
working
group,
I
agree.
I
think
we
should
continue
to
have
that
discussion
and
further
input
on
the
list
is,
I
think,
appropriate
right.
C
I'd
like
to
stick
one
quick
thing
in
in
response
to
culinary
yeah,
I
suppose,
and
the
ads
can
decide
to
ad
sponsor
anything
they
want
to,
but
one
of
the
reasons
we
have
dispatches
exactly
so
we
will
listen
to
what
dispatch
has
to
say
about
these
things.
So,
thanks
for
your
comments
on
that
and
yeshwand,
I
want
to
make
it
clear
to
you
that
the
message
I'm
getting
from
this
is
we're
going
to
do
this.
Q: Okay. So WebRTC is still the best media transport for streaming real-time audio and video, and while it is possible to use other transports for ingesting WebRTC content into a media server, using WebRTC both for ingesting and for delivering the media allows the stream to have several good properties. Mainly, it can work in browsers for both the publishing side and the viewer side, and you avoid having a protocol translation between something that is not WebRTC and WebRTC. If you have to do that, you will most probably add delay and have to increase the complexity of the implementation, because you are already supporting WebRTC for streaming out, so implementing something different for ingest, and doing the protocol translation, is going to be more difficult than just having WebRTC [end to end]. And you will most probably have problems with the codecs.
Q: For example, RTMP does not support Opus at all, so you will have to transcode between other codecs and the codecs that WebRTC supports. Also, some features that are WebRTC-only you will not be able to support end to end; for example, some servers may decide to just do RTX or FEC end to end, from whatever the sender (the producer) supports, through to the viewer.
Q: However, there is no current standard signaling protocol available to do this. The most obvious would be SIP or XMPP, but they are not used for broadcasting or streaming services, and there is no sign at all of any kind of industry adoption; also, they are complex to implement, so they will not likely be used at all. And RTSP, which could be the closest thing that we have for the broadcasting industry, is not compatible with the WebRTC SDP offer/answer model; also, for publishing, RTSP RECORD, doing a push to an RTSP server, is not very widely supported. So the consequence is that now each service has its own implementation of custom ad hoc protocols.
Q: So if you want to have an encoder that supports several WebRTC streaming services, you will have to implement one protocol per streaming service, and this is not something that is likely to happen. So, at least in our opinion, the consequence is that we need a new reference signaling protocol for this use case.
Q: So what would be the requirements for this new streaming protocol? It needs to be simple to implement and also simple to use. Setting up or provisioning an encoder with this new protocol will have to be as easy as it is now with an RTMP URI, so you will not have [much] to do.
Q: You should only need to point your encoder at a URI and have the streaming happen. And the ingest use case is quite specific, so you are only required to implement a subset of WebRTC functionality; mainly, you only need to support unidirectional flows, from the encoder into the media [server].
Q: If you are broadcasting to a server, the server is assumed either to have a public IP or to be deployed in the same private network as the publisher, so the NAT restriction is less difficult than in normal WebRTC peer-to-peer. Also, you don't need to support renegotiation; unlike, for example, RTMP or other protocols that are used for ingesting today, you just send the media from the start, and if you want to change things, you stop and restart again, so you don't need to support renegotiation. So the requirements, really: it has to be fully compliant with the WebRTC and RTCWEB specs; it must support authentication for the stream; and it has to be [implementable] by both web browsers and native encoders.
Q: So the proposed solution is something that is really, really simple. It is just doing an HTTP [POST] with an SDP offer from the encoder to the media server, and receiving the SDP answer from the media server. With this SDP offer/answer exchange, you set up a WebRTC session.
Q: Normally, on both sides, authentication and authorization will be supported natively by HTTP, with the HTTP Authorization header carrying a bearer token. So it is just standard HTTP stuff, and it will also support HTTP redirections for load balancing.
Q: You create a PeerConnection and you get the offer; you just do a fetch with it, sending the POST to the media server, receive the answer, and pass it to setRemoteDescription. And once it is set, you can track the state of the feed just via the connection state change events received: so you have connected, disconnected, failed, and closed.
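[Note: a minimal TypeScript sketch of the client flow described here, as it would run in a browser. The endpoint URL and token are placeholders, and the protocol details were still a draft at the time:]

    // Minimal WHIP-style publish flow (endpoint and token are hypothetical).
    async function publish(stream: MediaStream): Promise<RTCPeerConnection> {
      const pc = new RTCPeerConnection();
      stream.getTracks().forEach((t) => pc.addTrack(t, stream));

      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);

      // POST the SDP offer; authorization is plain HTTP with a bearer token.
      const res = await fetch("https://example.com/whip/endpoint", {
        method: "POST",
        headers: {
          "Content-Type": "application/sdp",
          "Authorization": "Bearer <token>",
        },
        body: offer.sdp,
      });

      // The response body is the SDP answer from the media server.
      await pc.setRemoteDescription({ type: "answer", sdp: await res.text() });

      // Track the feed state via connection state change events.
      pc.onconnectionstatechange = () => console.log(pc.connectionState);
      return pc;
    }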
Q: We have tried to reduce the complexity, mainly because if it is going to be implemented by native encoders and hardware-based solutions, reducing the requirements for implementing it was important. So we have restricted it in this way: first, SDP bundle is always required, and RTCP muxing as well.
Q
So
we
don't
support
nonviolent
num
bonding
and
also
no
rtcp
in
different
boards,
and
the
server
may
implement
a
slide
and
the
codeman
implements
must
implement
full
full
eyes
and
the
the
other
minor
change
that
we
haven't
done
is
that
we
allow
the
the
encoder
to
just
set
up
then
implement
the
the
active
setup
so
not
to
have
to
implement
active
and
paste
it,
and
the
server
must
support
both
at
a
thin
as
a
only
as
a
spaces
fan
in
case
that,
for
example,
if
they
connect,
they
are
interconnecting
with
something
that
is
supporting
also
at
past,
because
it
can
has
to
support
atheist
or
basic.
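[Note: in SDP terms, the restrictions above correspond to attributes like the following; an illustrative excerpt, not taken from the draft:]

    a=group:BUNDLE 0 1   (media sections always bundled)
    a=rtcp-mux           (RTCP shares the RTP port)
    a=ice-lite           (a server may answer as ICE lite)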
Q: If there is any other optionality that we could remove from current WebRTC in order to make it easier to implement, it would be welcome, but these are only the ones that we have found for now. And I think that may be my last slide, yeah.
Q: So the conclusion is this: I think this is the lowest-effort thing that we can have for a WebRTC [ingest] protocol, with standards that are already available; just use them in order to create something that is really simple and really easy to implement. There has been some discussion about whether this should be made a bit more complex, to be able to control the state of the stream by HTTP REST or something like that, but this is the bare minimum that is required in order to have this thing working.
A: Thank you for all that. Before we open the queue, would you like to address the core dispatch question? Are you familiar with the possible locations for this work, or should we open the queue for suggestions on that from the floor?
Q
I'm
I'm
not
aware
what
is
the
best
scene.
I
mean.
I
think
that
this
is
really
simple.
I
mean
the
draft
is
available
and
it
is
really
small.
So
I
think
that
the
the
at
least
my
my
opinion
should
be
that
you
should
be
the
the
whatever
is
best
for
for
having
a
fast
solution,
and
I
don't
expect
to
have
much
discussion.
F
Sergio
I
mean,
as
you
said,
you
showed
this
is
a
very
simple
web
page.
You
know
it's
probably
not
a
lot
more
complicated
on
the
server
either.
Q
Mainly
because
if
I
going
to
to
speak
to
a
hardware
encoder
a
band
to
implement
it,
I
need
to
to
say:
please
implement
it,
so
it
is
possible
to
use
it
with
a
more
than
one
service.
I
mean
I
already
have
implemented
it,
but
if
it
is
not
a
standard,
it
would
be
just
another
thing
that
and
that
my
server
implements,
but
no
other
server
implements.
So
if
we
can
agree
that
having
this
as
a
specification
within
the
encoder,
well
they're
going
to
can
be
say
that
I
just
need
to.
U: I mean, Karen and I were trying to do something like this way back when; it failed to gain traction at the time, probably because we were trying to incorporate it into the spec instead of just having it [as] a minor item. So I think we should just let this be published.
Q
Yeah,
so
I
think
that
I
may
not
have
answered,
probably
or
very
carefully
to
today
completely
to
the
bernard
question
before
I
mean
that
the
situation
in
in
web
in
streaming
is
different
with
that
in
common
webrtc
cases
that
you
have
one
vertical,
that
you
control
the
clients
and
the
and
the
streaming
services.
Typically
in
streaming,
you
will
have
a
hardware
encoders
that
are
nor
are
even
that.
Q
Works
is,
for
example,
with
a
with
you
support
rtmp,
you
just
implement
rtmp,
and
you
are
able
to
talk
to
facebook
to
twitch
to
to
a
broad
range
of
of
service.
So
the
idea
is
here
is
the
same:
to
be
able
to
to
have
it
in,
for
example,
in
in
obs
or
in
in
the
vendor,
for
the
for
the
encoder
is
different,
that
the
one
that
is
deploying
in
the
the
streaming
service.
Q: Yeah, it is already implemented in both Janus and Medooze; I mean, in our, in Millicast's streaming platform. And we have implemented it in an OBS fork that we are trying to get into the main one, but the complexity there is having to compile the WebRTC library apart from the web browser. We have already been in OBS and trying to get traction with the people who implement it in hardware encoders.
W
I
just
wanted
to
say
I
I
think
this
is
a
good
good
thing.
We
should
be
doing.
I
think,
there's
some
still
discussion
to
be
had
on
how
much
simpler
it
could
be.
I
think,
there's
still
some
optionalities
that
could
be
taken
out,
so
I
kind
of
think
that
publishing
it
as
is,
might
it
needs
another
round
of
discussion
in
terms
of
the
detail
from
my
point
of
view,
but
I
I
think
it's
eminently
implementable,
even
as
it
is.
O
Intended
this
is
intended
to
be
a,
I
know
you
wrote
a
a
replacement.
You
know
flash
standardization
replacement
for
our
rpmp
is
this
feature
equivalent
to
rtmp
has
used,
because
one
thing
I
did
notice
is
it
didn't
seem
other
than
you
know
the
stp
proper.
It
didn't
seem
as
you
wrote
it
terribly
extensible.
O
O
Q: Yeah, well, the goal is not to replace RTMP. RTMP and the other broadcasting protocols can and will be in use for different things; I don't expect this to replace the current broadcasting solutions for ingest. But the problem is that if, for example, you use RTMP for WebRTC: first, you are adding delay, because, in the best case, when you don't have a buffer on the encoder side, you will have to wait for a full frame before you start doing [anything].
O
I
guess
that
makes
sense
to
me.
I
guess
my
question
is:
are
going
the
other
direction
if
somebody
did
want
to
say
okay,
that
sounds
great.
I'm
going
to
take
my
exis
thing.
That's
using
you
know
rtmp
for
this
and
replace
it
with
whip.
Would
there
be
things
that
would
run
into
that
you'd
say?
Oh
well.
I
do
need
to.
I
can't
do
this
in
web.
This
thing
that
I'm
doing
today.
Q
I
don't
know
I
mean
the
the
main
goal
is
to
have
webrtc
and
from
end
to
end,
so
I
I
I
don't
know
if
this
makes
sense,
for
example,
to
ingest
from
in
webrtc
and
and
do
hls
out,
I
mean
probably
it
can
make
sense.
I
don't
know
how
that's
compared
to
other
solutions
that
they
all
ended,
but
if
you
are
done
trying
to
do
webrtc
for
webrtc
in
jason,
webrtc
out,
there
is
no
other
solution
that
that
really
works.
I
mean,
for
example,
if
you
want
to
to
use.
Q
A: Okay, we'll have to cut the queue after Mo and Ted. Before you start: Sergio, can I just get some clarity on how much development work, IETF contribution, you think needs to be done here? As opposed to: is this, say, a static or informational document you're really putting forward for the purpose of publishing as a static reference?
Q: I would like to have some more control over how to disconnect the session, and not just rely on the delays in ICE, because in the worst case you will just have to wait for the ICE disconnection; and in the case of an abrupt disconnection, [you only find out] when the DTLS [session] is torn down or lost.
H: Ted Hardie speaking, after he hits the allow button again. Thanks. I just wanted to go back to the requirements slide for a moment. When I look at the requirements slide, you have two things on here which make me feel like maybe this actually does need a working group to think about, and that's where you have, under the second bullet, that the server is assumed not to be behind a NAT, and also that it supports load balancing and redirections. I think, if that's something you do want to hash out, then you need an actual working group to work through this, and I have no objection to a working group to work through this. But I think, unless you're going to take out some of that complexity by removing some of these requirements, you probably can't go the AD-sponsored route. So that's what I would say from the dispatch-question point of view.
Q
Yeah,
but
first
what
I
mean
support
load
balancing
is:
it
is
just
relying
on
the
the
load,
balancing
and
radiation
that
http
already
supports.
Q
So
I'm
not
talking
about
the
load
balancing
about
the
media.
It's
just
about
the
the
load
balances
of
the
http
supporters,
for
example.
You
will
be
able
to
to
do
an
entity
lcd
post
to
a
factory
and
that
that
sends
an
already
relation
to
to
that
pal
media
server.
So
it
is
not
anything
more
complex
than
that.
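[Note: an illustrative exchange of the HTTP redirection being described; the hosts and paths are invented for the example:]

    POST /whip/endpoint HTTP/1.1
    Host: ingest.example.com
    Content-Type: application/sdp

    HTTP/1.1 307 Temporary Redirect
    Location: https://edge3.example.com/whip/endpoint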
A
If
this
is
a
situation
where
we're
really
not
going
to
give
the
iatf
change
control
over
the
document,
and
it's
a
static
done
thing,
then
you
know
the
ise
is
the
the
common
recommendation
in
those
circumstances,
but
it
it
sounds
like
we
start
to
go
down
that
road,
and
then
we
talk
about
like
a
number
of
fairly
tricky
things
that
actually
probably
require
a
bunch
of
different
viewpoints,
and
you
know,
are
more
suitable
really
for
a
for
a
mini
working
group.
If
you
will,
instead
of.
Q
A
L
L: Yeah, so on the surface it seems like a good idea. If I understand right, you want to have simple devices be able to have a way to push things up, even when they can't run a full-blown browser with arbitrary JavaScript, and you want services to always expose some common ingestion point. But the thing that I question is: why do you think this stops at ingestion? Seemingly, a much, much bigger demand would be for receiving, WebRTC real-time streaming, and that hasn't happened.
Q: I completely agree; that is also something that I am working on. But I think those are different issues, and the target is different. Ingest and playback relate to two different audiences: to talk about ingest, I have to talk with encoder people, and to talk about the viewing side, with the website [people].
A: Great, thank you for that. Okay, I'm reading the chat, and the chair chat over on the side, trying to decide what we've heard here. I think there's still a little bit of confusion, but people are leaning towards trying to discuss on the list, and to build what a charter for a small, mini working group might look like, to try and give clarity to the work that would actually be done under IETF change control; and of course we can dispatch things directly from the list, should that look like suitable work that the IETF wants to take on. [Whether we] want to charter [it], I think that's probably the next step. This has been, I think, a very useful discussion, and I'm glad we've had it, but I think that probably needs to be done to crystallize what we're asking. We've got probably three more minutes we can spend on commentary. Okay, Sergio, would that work for you? We could try and build a charter, a proposed charter, and that might clarify our thinking, whether that's...
A: We'd like to take a moment in each ART area meeting (as we're now in the ART area section of our agenda) to mention some new working groups and when they next meet. We have EMAILCORE and ASDF, which have been chartered since the last time we met. I believe HTTPAPI has also been chartered since last time we met, and they meet on Friday morning. I'll highlight the other interesting meetings listed down here, and then, if anyone would like to step up and offer a 30-second "you should really come to this meeting" testimonial, for any of these or for something else you think would be of interest to this working group, I would encourage you to do so. We have SECDISPATCH, GENDISPATCH, the IAB open meeting, the RFC Editor Future Development Program, and SHMOO, which is of continued interest to all of us, I am sure. There's also one BoF this week: MADINAS, MAC Address Device Identification for Network and Application Services, and they are meeting on Wednesday. Does anyone know... isn't...
X: Am I making sounds this time? Yes? Excellent. Just ten seconds: GENDISPATCH. The only item on our agenda is an update to RFC 7221, which is the RFC about how working groups may wish to adopt documents, or un-adopt documents. So, not terribly exciting stuff, but you're all welcome to come and chat about that if you like.
U: Can you hear me? Yes? Sounds good. Yeah, this is obviously the most interesting presentation today, so I'll try to be quick. So Bernie and I are working, with DKG, in LAMPS on header protection for S/MIME and PGP, and this sort of came out of that: whether a signal saying that a particular message is being encapsulated is useful, and we want to talk about whether it's useful in other contexts.
U: When protecting headers, we need to make a copy of the header fields within the signature or in the encrypted block, so they basically need to be duplicated. The way this is described in S/MIME just says to wrap it in message/rfc822, which is unfortunately ambiguous, because it's unclear whether it's a forwarded message or whether it's a mechanism for encapsulating headers and the message. Next slide.
U: So the draft at the moment says that, in order to distinguish from a forwarded message, there is an extra parameter on the content type, which is forwarded=yes or no. There was some discussion on the mailing list; Martin suggested that maybe Content-Disposition is actually a good fit for this: for example, Content-Disposition inline, or maybe a new content disposition type like "encapsulated," to make it specific for this purpose. And there was also a suggestion to maybe have a new content-semantics header field to signal that. The first two proposals have the advantage that they work very well with IMAP, because all parameters to Content-Type and Content-Disposition are returned within the protocol; the last one doesn't work as well. But I'm happy to hear opinions on this. Next slide.
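[Note: concretely, the three options being discussed would look roughly like this on the encapsulating MIME part; the header values are illustrative, and none of them were registered at the time. "Content-Semantics" is a hypothetical rendering of the new-header-field suggestion:]

    1) Content-Type: message/rfc822; forwarded=no
    2) Content-Disposition: encapsulated
    3) Content-Semantics: header-protection   (hypothetical new header field)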
U
So
question
for
this
group
is:
are
there
are
any
other
use
cases
other
than
so?
Basically,
is
this
very
s
mine,
pgp,
specific
or
are?
Are
there
other
ways
that
this
can
be
used
and
what's
the
best
mechanism
for
doing
this,
I
think
what's
the
next
slide,
yeah.
Y: By [order of] approval of things: Content-Disposition feels like the right thing to me. It feels like that's exactly what this is, a disposition for the message rather than the content type of the message. But other than that, it seems like something that could be generally [useful].
Z: Hi folks, this is DKG; I'm with [the] ACLU. So I've been trying to figure out how to use this forwarded=no, or any kind of distinction like this, in terms of header protection, and the difficulty is that we can't tell old clients that don't know about it to respect it, right? Sort of by definition, the already-deployed clients are behaving differently. So I'm fine with making an indication that says "yes, this is exactly how I meant it, and it wasn't supposed to be an attached message or a forwarded message," but there's still going to be a challenge when dealing with the large number of legacy clients that are out there. In terms of... I mean, we'll be discussing header protection in LAMPS, I think tomorrow, and I think that there are other options for how to do header protection, including stuff that's actually been deployed; Enigmail and several others have been doing header protection in a different way that has some level of backward compatibility, that makes things work with existing clients. So, you know, I'm fine with any of these ways of doing it, but I think it doesn't fundamentally solve the problem of interacting with a legacy client, and we don't know, when you're sending the message, whether you are interacting with a legacy client or not. So I just think this is a very challenging space to work in, and I'm not sure that this is the right spot to do the signaling. But any of the proposals seem fine to me.
AA: It says I have audio now. Can you hear me? Yes? Oh, good. Another use case is mailing lists and DMARC, everybody's favorite mailing list thing. Back when we were trying to fix the IETF list to work around DMARC, one of the things I tried was wrapping single messages, very much the same way this thing does it: so there's an outside header that passes DMARC, and the inside header actually describes the message. It's basically the same scenario. My question is: why add [a parameter]? It seems to me the MIME type here would be something like message/encapsulated, rather than adding a flag to message/rfc822. And I agree with DKG that the current behavior of MUAs with any sort of attached or wrapped messages is pretty poor. So, on the one hand, I imagine partly it's like, well, message/encapsulated is likely to work worse than message/rfc822... yeah, [or] message/global, but message/global already works pretty badly.
AB: I support doing something on this. I mean, I think everybody in the queue, you know, everybody who's played with S/MIME, has proposed something of the sort over the past 25 years, and there's always been the problem that the legacy S/MIME clients don't do what they should, or what we would want them to. You know, there are headers that really should have always been part of the content, and there are routing headers, and SMTP just smushed the two together. So maybe what we need to do is to disentangle those two, and say which are the two kinds of things and which are different, and, you know, fix RFC [822], before trying to layer more stuff on top. Because, as I said, we've tried this before and it didn't get very [far].
U
I
have
lots
of
good
comments.
Sorry,
I
I'm
sort
of
still
waking
up
in
my
time
zone,
so
yeah
I'll
I'll
review
the
comments
and
we'll
continue.
You
know
if
you're
interested
in
the
header
protection
topic
specifically
please
come
to
lamps,
which
is
tomorrow.
A: So that completes our structured agenda. We do have 10 minutes available if there's any other business, relevant to the ART area, that anyone would like to step forward with; otherwise we can conclude.
Y: [This] didn't get ready in time for this meeting, but there was discussion (I emailed the list a few weeks ago) about a time format, basically updating RFC 3339. The concept is to add, basically, sigils afterwards that include both the time zone that a time is in, and also the calendar format it's in, if it's not Gregorian, so that there are other [calendars covered]. We've done recurrence rules and RSCALE, which is to say that this recurrence rule applies in a different calendar system than Gregorian, but what hasn't been addressed is the format itself: so the ISO 8601... sorry, not that one, the RFC 3339 time format, year-year-year-year, dash, month-month, dash, day-day, and then T and [hours,] minutes, and seconds.
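[Note: an illustration of the kind of suffix being proposed. The exact bracket syntax shown here, a time zone name plus a calendar tag appended to an RFC 3339 timestamp, is one sketch of the idea and was not settled at the time of the meeting:]

    2020-11-16T05:00:00+07:00[Asia/Bangkok][u-ca=buddhist]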
A: Thanks, Bron. Thanks. All right, Neil, we've got time for one more, I think, here.
T: Just going to clarify on this: yes, the TC39 group, developing ECMAScript, JavaScript, are currently looking at this. I think we do need a serialization format for dates with time zones; lots of people do this, and if we don't define something, then we'll have lots of different incompatible versions out there. It seems an obvious extension to 3339 to...
A: Okay, so if you're interested in continuing this discussion, CALEXT... sounds great, yeah.
A: I mean, I think the most important part of the dispatch chair job is actually reaching out and integrating folks who just have no idea how this IETF process works, and Ben is just an exemplary instance of making that work as a chair, so I've learned quite a bit from him, and I really appreciate it. We'll be sad to see you go, but you know the IETF is always happy to have you. So thanks, Ben, and thank you, everyone, for a good meeting, and we'll see you on the list.