From YouTube: Features of WebXR - Ada Rose Cannon, Samsung
We have a focus on privacy and security. We also have built-in tracking blocking and extensions for ad blocking. As a developer advocate, I try to help developers use the latest features of the web platform to build websites which make the most of everything the web browser can offer. If you have any questions about Samsung Internet or WebXR, please reach out to me, either through email or social media. As well as being a developer advocate, I'm also co-chair of the W3C Immersive Web Working Group and the Immersive Web Community Group.
These two groups work together to develop web standards for working with immersive hardware such as VR headsets. The core part of this effort is the WebXR Device API, or WebXR for short. WebXR is, at its core, an API designed to access the sensors and displays of immersive hardware such as VR headsets and AR headsets. By knowing what the user is looking at, we can render a scene using WebGL from the user's point of view.
We can then send this view to be displayed on the headset, so that when users move their head, they see what they would be seeing if they were inside the virtual scene. This is how you get the virtual reality effect. The really difficult part in building these APIs is supporting the wide range of immersive devices, with their wide range of capabilities.
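In code, that session-based access to the hardware looks roughly like this. This is a minimal sketch of entering an immersive session with the WebXR Device API, not code from the talk; the function and callback names are mine, and `navigator.xr` is passed in as a parameter so the function is easy to exercise.

```javascript
// Minimal sketch of entering an immersive session with the WebXR Device
// API. `xr` is navigator.xr in a real page; `onSessionStarted` is an
// illustrative callback name, not part of the spec.
async function enterVR(xr, onSessionStarted) {
  if (!xr) throw new Error('WebXR is not available in this browser');
  // Feature-detect before asking: not every device does immersive VR.
  const supported = await xr.isSessionSupported('immersive-vr');
  if (!supported) throw new Error('immersive-vr is not supported here');
  // In a real page this must be triggered by a user gesture (e.g. a click).
  const session = await xr.requestSession('immersive-vr');
  session.addEventListener('end', () => {
    // Tear down rendering state here when the user exits.
  });
  onSessionStarted(session);
  return session;
}
```

In a real page you would call `enterVR(navigator.xr, session => { /* hook up WebGL rendering */ })` from a button's click handler.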
The first type of headset we supported were immersive VR headsets, like the HTC Vive and the Oculus Rift. These kinds of immersive VR headsets are tethered to computers and are very powerful, but usually require a direct wired connection. They usually both have very high-quality tracking and very good graphical fidelity. There are other VR devices. The old Gear VR, which is no longer supported, was a very popular headset, but it was driven just by a phone. Modern standalone headsets like the Oculus Quest are similar in power, because they are also driven by phone chipsets.
Because of this, they have great tracking, giving a very good immersive experience, and a very good user experience thanks to their ease of setting up; these have proven to be a very popular type of headset nowadays. A bit later than immersive VR headsets came immersive augmented reality headsets, also known as mixed reality headsets. Headsets like the HoloLens and the Magic Leap showed the potential of immersive augmented reality. These work by overlaying virtual content onto your physical surroundings. They scan your environment, so objects can be placed on the floor, walls and tables. It's a really incredible experience if you ever get to have a go at it.
Something which also needs to be supported today through WebXR is handheld augmented reality. A few years ago, Pokémon Go took the world by storm, and it is still pretty popular. It was a huge commercial success, and it showed that if you combine handheld augmented reality with a really engaging user experience, you can build a fantastic product. This is another kind of experience the WebXR Device API should support, and these examples really show that immersive content on the web isn't just head-mounted anymore.
I mentioned just now the API that came before, called WebVR. WebVR had just targeted VR headsets; it sits in the top-left quarter of this chart. Because of the wide range of devices that now needs to be supported, WebVR wasn't going to work long into the future. A new API needed to be developed to handle augmented reality and handset-based experiences, and that's where WebXR comes in.
This video was recorded in Firefox Reality on the Oculus Quest, but because it uses WebXR, it'll work in any VR headset, provided the browser supports WebXR. You can try it out yourself: if you don't have a headset, try using it just on your phone; if you have a cardboard headset, try opening it up on your phone, trying it just on the phone first and then in the cardboard headset.
The first feature I'd like to point out is a really important one. When I recorded that demo, I was using the Oculus Quest headset, whose controllers look like this. But if you remember what the controllers looked like in my hand in that demo, I was holding those gray sticks. Ideally, I want to see the real hardware I'm holding in my hand in the virtual scene. The WebXR Device API has two things which work together to accomplish this. The first is the WebXR Gamepad API.
Because it's a Gamepad API, it's polling-based, not event-based, but it behaves exactly the same as you would expect from the JavaScript Gamepad API. The only difference is that you can't access the XR controllers from the Gamepad API itself; these are exposed on the XRInputSource object.
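A minimal sketch of what that polling looks like inside a frame loop (my example, not code from the talk; the function name is illustrative):

```javascript
// Sketch: polling XR controllers each frame. Their gamepads do not appear
// in navigator.getGamepads(); each one is exposed on an XRInputSource,
// but the gamepad object itself has the familiar shape (buttons, axes).
function pollControllers(session) {
  for (const inputSource of session.inputSources) {
    const gamepad = inputSource.gamepad; // null for inputs with no gamepad
    if (!gamepad) continue;
    // Button 0 is conventionally the trigger on XR controllers.
    if (gamepad.buttons[0] && gamepad.buttons[0].pressed) {
      console.log(`trigger pressed on the ${inputSource.handedness} hand`);
    }
  }
}
```

You would call this once per rendered frame, which is why the API is polling-based rather than event-based.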
Although you can render any content you want onto the user's environment, you don't know where in the environment you should render it, because you can't tell where the tables, walls and other objects you might want to place things on are. We need to work out where they are, and to do this we use the hit testing API. The hit testing API allows you to cast a ray out into the world and get the real-life position where it lands, relative to the origin.
Here I tap on the screen to stop using the hit test API and just leave the dinosaur wherever it last was; it stays in place, and I can interact with it. By combining the ability to understand the real world with the ability to display virtual content over the real world, we have the cornerstones of augmented reality and can build many experiences.
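A sketch of that hit testing flow in code (my example, under the assumption of a session requested with the `'hit-test'` feature; the helper names are mine): create a hit test source from the viewer space once, then query it every frame.

```javascript
// Sketch of the WebXR Hit Test API flow. The session must have been
// requested with 'hit-test' among its features.
function setUpHitTest(session) {
  // Cast the ray from the centre of the user's view out into the world.
  return session
    .requestReferenceSpace('viewer')
    .then(viewerSpace => session.requestHitTestSource({ space: viewerSpace }));
}

// Called once per frame; places an object on the first real surface hit.
// `placeObject` is an illustrative callback that positions scene content.
function placeOnFirstHit(frame, hitTestSource, referenceSpace, placeObject) {
  const results = frame.getHitTestResults(hitTestSource);
  if (results.length === 0) return false; // nothing recognised under the ray
  const pose = results[0].getPose(referenceSpace);
  placeObject(pose.transform); // real-world position, relative to the origin
  return true;
}
```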
There's A-Frame, which is built on top of three.js, providing a web component wrapper for it. Babylon.js is another really good 3D engine; it's maintained by Microsoft and has many features for doing high-fidelity graphics. React 360 is kind of like A-Frame in that it is a wrapper around three.js, but it's a React wrapper rather than an HTML one.
With A-Frame you can use the same techniques you would use on a normal HTML document. It's also extremely extensible, and it's built on three.js: if you're handy with JavaScript and okay with three.js, you can start writing your own components to take it even further. It has a large and active community, so finding help isn't too bad. Out of the box, it's ready to use virtual reality and augmented reality, whenever the browser supports it.
But since this is a JavaScript conference, you'd probably prefer something more imperative, such as three.js, rather than declarative like A-Frame. I'm going to go through how to set up WebXR in three.js, but this is assuming you already have some kind of 3D scene set up and ready to go in three.js already.
three.js has really good documentation, loads of examples and a large and active community, so finding some boilerplate code to start from wouldn't be too bad.
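The setup the talk describes boils down to two renderer calls. A minimal sketch, assuming you already have a three.js `renderer`, `scene` and `camera` (the wrapper function name is mine, not a three.js API):

```javascript
// Sketch: switching an existing three.js app over to WebXR.
function enableWebXR(renderer, scene, camera) {
  renderer.xr.enabled = true; // let three.js present to the headset
  // Use setAnimationLoop instead of window.requestAnimationFrame, so
  // three.js can hand the loop over to the XR session's own frame
  // timing whenever an immersive session is active.
  renderer.setAnimationLoop(() => {
    renderer.render(scene, camera);
  });
}
```

three.js also ships a ready-made `VRButton` helper (in `three/examples/jsm/webxr/VRButton.js`) that requests the session and attaches it to the renderer for you.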
If your computer has two graphics cards, the VR headset will be driven by the more powerful one; but if your scene is rendered on the lower-powered graphics card, then it won't be able to be displayed on the headset. We need to make sure the scene is rendered on the high-powered one. You also need to use the built-in animation loop, and this may seem strange, because if you've used three.js, or you've done any kind of animation on the web before, you may be used to the requestAnimationFrame API.
Well, that is what this is using under the hood. But WebXR provides two different requestAnimationFrames: there's the one for your normal monitor, the one you use normally, but there's also one for the headset, because the headset will have its own frame timing. The headset will often run a lot faster than the frame timing of your monitor: your headset can run not just at 60 frames per second, but sometimes at 90, 100 or even 140 frames per second.
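The headset's loop is driven by calling requestAnimationFrame on the XR session rather than on window. A minimal sketch (my function names; `drawScene` stands in for your rendering code):

```javascript
// Sketch: the XR render loop. requestAnimationFrame is called on the
// *session*, not on window, so callbacks fire at the headset's own
// refresh rate and receive an XRFrame carrying the latest pose data.
function startFrameLoop(session, drawScene) {
  function onXRFrame(time, frame) {
    session.requestAnimationFrame(onXRFrame); // schedule the next frame first
    drawScene(time, frame);
  }
  session.requestAnimationFrame(onXRFrame);
}
```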
Next is the DOM Overlay API. It lets you make controls and interfaces using just HTML and CSS, which can be interacted with in the usual fashion: you can use JavaScript, add event listeners, set classes and everything else you would normally do. The DOM Overlay API is very similar to the existing JavaScript Fullscreen API: the user agent will take care of where and how the overlay gets displayed.
Using HTML like this is really good, because it's rendered by the user agent in a way that looks much clearer, especially for doing stuff with text. You'll get a much better result than if you were just rendering text to a bitmap texture and then displaying it on a 3D model.
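A sketch of how a session with a DOM overlay is requested (my example; `xr` is `navigator.xr` in a real page and `overlayElement` is whatever HTML element you want composited over the camera view):

```javascript
// Sketch: requesting an AR session with the DOM Overlay module. The
// element passed as `root` is ordinary HTML/CSS that the user agent
// composites over the view, so its event listeners keep working.
function requestARWithOverlay(xr, overlayElement) {
  return xr.requestSession('immersive-ar', {
    optionalFeatures: ['dom-overlay'],
    domOverlay: { root: overlayElement },
  });
}
```

Once the session starts, `session.domOverlayState` tells you whether (and how) the overlay is actually being shown.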
It also gives you all the 2D layout power of CSS, so you can build really fancy interfaces, the kind of thing that would be very difficult to build from scratch in WebGL. Next is lighting estimation. Lighting estimation enables virtual objects to approximate the lighting from the surrounding scene. When virtual objects are lit in a similar way to the real objects in the user's environment, they feel like a much more natural part of it. This enables a much more engaging augmented reality experience. Few features have had a similar effect: it is very subtle, but gives a huge improvement.
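The lighting-estimation flow is: request a light probe once (the session needs `'light-estimation'` among its features), then read an estimate each frame and map it onto a light in your scene. A sketch, where `sceneLight` is an illustrative object of mine rather than a WebXR type:

```javascript
// Sketch of the WebXR lighting-estimation flow.
function setUpLighting(session) {
  return session.requestLightProbe();
}

function applyLightEstimate(frame, lightProbe, sceneLight) {
  const estimate = frame.getLightEstimate(lightProbe);
  if (!estimate) return; // the platform may have no estimate yet
  // primaryLightIntensity holds RGB in x/y/z; use the brightest channel
  // as a rough scalar intensity for the scene's main light.
  const { x, y, z } = estimate.primaryLightIntensity;
  sceneLight.intensity = Math.max(x, y, z);
  sceneLight.direction = estimate.primaryLightDirection;
}
```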
Next are anchors, which look to solve an issue with how augmented reality sessions run. As an AR session runs, the underlying platform's understanding of the environment improves, and the coordinate system might need updating. But if you do this, what happens to the objects you've already placed in the scene? If you have to change the scale, or slightly move the coordinate system, all of the objects are going to get slightly shifted; they're going to drift slightly.
If you placed something on a table ten minutes ago and you come back to it, you might find it has moved by a few centimeters, and this just doesn't feel that realistic. But by placing an anchor, items can be attached to that instead. When the environment is updated, the underlying system can make sure that the anchor stays in the same real-world place it would have done, and any objects connected to that anchor will also maintain their position.
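A sketch of the anchors flow (my helper names and object shape; a real renderer would consume the pose matrix differently): create an anchor from a hit test result, remember which objects hang off it, and re-read its pose every frame so attached objects follow any corrections the platform makes.

```javascript
// Sketch of the WebXR Anchors flow. `object` here is any scene object
// with a `matrix` property; that shape is illustrative, not a WebXR type.
const anchoredObjects = [];

async function placeAnchored(hitTestResult, object) {
  const anchor = await hitTestResult.createAnchor();
  anchoredObjects.push({ anchor, object });
}

// Called once per frame: keep every attached object on its anchor.
function updateAnchoredObjects(frame, referenceSpace) {
  for (const { anchor, object } of anchoredObjects) {
    const pose = frame.getPose(anchor.anchorSpace, referenceSpace);
    if (pose) object.matrix = pose.transform.matrix;
  }
}
```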
Next are layers. The rendering of these layers is handled by the user agent instead of by your WebGL code, which provides two significant benefits. It is far more efficient, because you don't have to copy the textures from the video element into WebGL just to render them to the headset; the user agent can just render them directly.
It also gives you clearer rendering, because the layers can be rendered a lot closer to the time when the actual pixels light up on the screen. This gives you a much crisper experience, because they're not subtly swimming as the user's eyes and head move. Normally, reading text in VR is something that's not recommended, and it is usually a very poor experience; but if you do it using this technique, it's actually not too bad. The text really does stay in place and is very comfortable.
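A sketch of what using the Layers module for video looks like, assuming a browser that supports it (`XRMediaBinding` and `XRRigidTransform` only exist in such browsers, and the function name is mine):

```javascript
// Sketch: playing a video through the WebXR Layers API. The user agent
// composites the quad layer itself, so the video frames never round-trip
// through our WebGL code.
async function showVideoLayer(session, videoElement) {
  const space = await session.requestReferenceSpace('local');
  const mediaBinding = new XRMediaBinding(session);
  const layer = mediaBinding.createQuadLayer(videoElement, {
    space, // where the quad sits in the scene
    transform: new XRRigidTransform({ z: -2 }), // two metres in front
  });
  // Composite the video on top of the existing layers.
  session.updateRenderState({
    layers: [...session.renderState.layers, layer],
  });
  return layer;
}
```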
If you want to see what's being worked on right now in the Immersive Web Working Group, check out our GitHub. Everything we do is done in the open: you can read our old minutes, you can look at the standards as they're being written, and comment on the PRs. And if you want to see what's coming, and some of the further features which are on our radar, check out our charter, in which we outline some of the stuff we intend to look at over the next couple of years.