From YouTube: WebPerfWG call 2023 02 02 - Event Timing updates
The slides you're sharing and the recording have started, so take it away.
Okay, so there are just a bunch of little Event Timing updates. I wasn't even sure if it would fill the time slot, but there are a bunch of issues filed that aren't getting all that much attention, so I thought I'd do a rapid-fire walkthrough.
So if anybody does have feedback and wasn't aware of the conversation, this will be your prompt, and given the previous conversation, maybe there is more to discuss. Some of the topics revolve around the concept of interactions, so I thought I'd start with a quick review so we all have the context.
Events can have different timestamps and different render times, meaning they can appear in different animation frames, and there isn't always a simple mapping where these events always map to these interactions. It's context dependent: not all pointer events turn into interactions, because they could become scrolls, they could become canceled, not all input actually comes from keyboard sources, etc.
So what we wanted was to basically represent the way users think of interacting with a page: I tap three times and I click on my keyboard two times, so I should have five interactions. And just to simplify grouping. Like I said, though, event timing only reports on events, and as we looked, we realized that there are a lot of overlapping events in time, because they sort of get dispatched one after the other. You have this originating input, which has a hardware timestamp, you dispatch a whole bunch of events for it, and then all of them update the DOM or the UI, and then appear in the next paint.
So if all you care about is counting interactions and knowing the total duration of interactions, it suffices to take the first event, hand-pick some important ones and handle them accordingly, and that's what we do. We expose an interactionId on event timing, at least in Chromium these days, and this is sufficient to highlight those important events. It's sufficient to group different events by interaction across animation frames.
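As a minimal sketch of that grouping, assuming plain objects in place of real PerformanceEventTiming entries (in a browser these would arrive via a PerformanceObserver observing `{ type: "event" }`; the entry shape and sample data here are illustrative):

```typescript
interface EntryLike {
  name: string;
  startTime: number;
  duration: number;
  interactionId: number; // 0 means the event is not part of an interaction
}

// Collect events into buckets keyed by interactionId, across frames.
function groupByInteraction(entries: EntryLike[]): Map<number, EntryLike[]> {
  const groups = new Map<number, EntryLike[]>();
  for (const e of entries) {
    if (e.interactionId === 0) continue; // skip events with no interaction
    const group = groups.get(e.interactionId) ?? [];
    group.push(e);
    groups.set(e.interactionId, group);
  }
  return groups;
}

// Summarize each interaction by its longest event duration, even when
// the grouped events span several animation frames.
function interactionDuration(events: EntryLike[]): number {
  return Math.max(...events.map(e => e.duration));
}
```

With this, "I tapped three times and pressed two keys" comes out as five map entries regardless of how many individual events each input dispatched.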
It's sufficient to measure INP. But when you want to dig deeper, and this is where we've gotten a lot of developer feedback and a lot of developer confusion, it isn't always sufficient; things get messier. Every event, every task on the main thread, everything that keeps the page busy plays a role in delaying responsiveness. To the context of the previous conversation, multiple rage clicks on the page could affect the overall responsiveness of the page, and the types of events that fire depend on the target of the event.
They depend on the state of the page, and it begins to make you wonder what really is part of an interaction. Let me just give you one quick example to preview what we'll talk about. On mobile, you can't actually hover over elements like you would on desktop with a mouse, and the moment you click on an element, we will start by firing hover events. So hover is not a discrete interaction, but it will always fire right before a pointer or click event, so it will always come first and it will always delay. And it's very common on many rich sites: we try to add very nice animations, it takes a long time to get an animation going, and that always delays the interaction. So was that part of the interaction? Not strictly, by our definition.
But if you look at traces, it is obvious that that's the source of many performance issues. So what would help developers, given how much confusion we have here? Right now we're labeling it like "this pointerdown was a slow interaction", but that specific event might actually have played a very small role; it just helped you find a time range that was interesting.
We considered expanding how many events we expose interactionId on; maybe every event we think is part of the interaction gets an interactionId. But there are actually a lot, and it can differ, and there could be situations like rage clicks where there are multiple interactions in flight and it's hard to know what caused what. So there isn't a simple mapping; we're not always going to get this right. Another option is just to treat everything as part of an interaction, but again there could be multiple weird overlaps.
So this is a bit of an open-ended conversation, but there's a lot of effort in this space, especially on the tooling and documentation fronts, and we get lots of developer confusion here, so that's just an FYI. Here's another picture of what this looks like: three different pages, different elements I interacted with on mobile and desktop, and you can see the types of events that will fire. I tried to keep it as stable as possible.
So I just did one single, simple interaction and you still get a whole bunch of different events. And here's another example where one of these is an interaction and the other one is not, and you'd be hard-pressed to figure that out with just a count of events or just by tracking events. The browser is doing a lot to track, to the best of its ability, the right context to expose interaction IDs for you. So that's just a quick reminder.
We have also had questions about event timing, the performance timeline, buffering and interactions. Right now, events only report when they exceed the duration threshold, which is 104 milliseconds by default. If you register your own PerformanceObserver, you can override that down to 16 milliseconds.
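A sketch of that reporting rule, using illustrative entry objects. In a browser the registration looks like `new PerformanceObserver(cb).observe({ type: "event", durationThreshold: 16 })`; here we only simulate which entries a given threshold would surface, assuming 16 ms is the lowest threshold the API honors:

```typescript
interface TimedEntry {
  name: string;
  duration: number;
}

// Only entries at or above the (clamped) threshold are delivered.
function surfacedEntries(entries: TimedEntry[], requestedThreshold = 104): TimedEntry[] {
  const threshold = Math.max(16, requestedThreshold); // clamp to the API minimum
  return entries.filter(e => e.duration >= threshold);
}
```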
The problem is the buffering: the default buffer from initial page load will always use 104 milliseconds, and you cannot override that in any way, so we often get confusing feedback. It happens to me on a regular basis: I will drop a snippet on a page that has already been around for a while.
It's already been loaded, and I will expect to get my list of events and I will see none, even though I interacted, even though I knew something happened. The reason is I've asked for a duration threshold of 16, but you can't apply that retroactively. So we have this issue filed to try and think about options here. One option is to decrease that default duration threshold, but then it applies to all events.
Another thing we've talked about before is exposing interaction counts, and we decided to simplify that into interactionCount; I think we've talked about that as a group. This was recently added to Chromium, and it's related to this problem with buffering and event counting. If you really want to measure the number of interactions, which is one of the inputs to the algorithm for INP, this API makes that easier, and it's recently been added, so FYI. All right.
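To show why a direct counter helps: estimating the number of interactions from observed entries undercounts whenever entries were filtered out by the duration threshold or never buffered. In Chromium you can instead read `performance.interactionCount` directly; the helper below only sketches the entry-based estimate it replaces, over illustrative interactionId values:

```typescript
// Count distinct interactions seen in a batch of entries.
function distinctInteractions(interactionIds: number[]): number {
  // interactionId 0 marks events that belong to no interaction.
  return new Set(interactionIds.filter(id => id !== 0)).size;
}
```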
So those were the topics on interactions that are currently in flight, and maybe we should talk about Dan's question on hydration and measurements afterwards. Another topic is first input delay versus interactions. So event timing exposes both first-input entries and event entries.
The major difference is that first input is always exposed regardless of the time it takes, so regardless of duration; the duration threshold is never applied there, which is really good. It's really good to be informed of the first interaction especially, and even though we don't love that it's only the input delay of the first input, measuring the first input is very valuable.
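First input delay is conventionally computed from the first-input entry as `processingStart - startTime`: the gap between when the input occurred and when its handlers began running. The entry shape below is illustrative; in a browser the entry comes from observing `{ type: "first-input", buffered: true }`:

```typescript
interface FirstInputLike {
  startTime: number;       // when the input happened
  processingStart: number; // when event handlers began running
}

// Delay only, not the full duration through the next paint.
function firstInputDelay(entry: FirstInputLike): number {
  return entry.processingStart - entry.startTime;
}
```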
The actual criteria in the spec, of when to measure what constitutes a first input versus what constitutes an interaction, have diverged; they're different. The first-input criteria are much simpler. They sufficed for a long time and they informed how we came up with the concept of interactions, but now you can get situations where first input is not reported even though you have events which are labeled interactions, or vice versa, and this can be confusing. It also makes the spec more complicated.
So there's a bunch of open issues and questions around: can we redefine first input delay to just be the delay of the first interaction? That seems, I think, to make sense to me. All right, events and render time. Our paint APIs expose a renderTime property, or at least element timing does on Chromium, and the renderTime property is meant to be the closest approximation of the presentation time of an animation frame that has these interesting paints.
The way it's actually exposed, though, is that all of these entries expose it as their startTime. They will have a renderTime if the element passes Timing-Allow-Origin checks, and these entries always have a duration of zero; they're always relative to the time origin of the navigation.
Now, event timing doesn't actually expose renderTime, but it does have a similar concept, because the duration value of events is expressed in terms of the render time that the event had: you take the render time, you subtract the event's timestamp, and you get a duration. And it's common advice that you see in blogs, in tooling and in snippets that if you want to know when an event ends, you just take its startTime and add its duration.
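That common advice, as code: reconstruct an event's end (its render time) as `startTime + duration`. The catch discussed next is that event timing durations are rounded (8 ms granularity per the spec), so two events that actually ended in the same frame can reconstruct to different end times. Entry objects here are illustrative, not real PerformanceEventTiming:

```typescript
interface EventLike {
  startTime: number;
  duration: number; // already rounded by the browser
}

// Approximate render time, as blogs and snippets commonly suggest.
function approxEndTime(e: EventLike): number {
  return e.startTime + e.duration;
}
```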
What this means is it just creates weird behaviors: the duration could get rounded up or rounded down, and there could be different events with different start times, so some events get rounded up and some get rounded down. If you do have LCP entries or other timing entries, they don't have the same behavior, so you just get different things ending at different times. It makes it really difficult to group.
It becomes impossible, actually, on higher-refresh-rate monitors, because you don't actually know if you're bleeding across animation frames, and it's also just an inconsistency between the way these things are exposed. So here's a screenshot that tries to showcase what I'm talking about, in case it's too confusing: at the top is what I measured using a PerformanceObserver, and on the bottom is what Chrome exposes through tracing.
If you actually recorded a profile of a page, you'll get the accurate times, and all events end in the exact same spot, so they nicely line up. Whereas if you measure the performance timeline, they all sort of stagger like a waterfall, and it's even weird that this last event ended before the previous one; that could not have been possible. It's not too bad if you do a bunch of clever math to try and group these as best you can, but it isn't so easy; you can't just throw them in a map keyed by time. So we've had some feedback about that, and there's this issue proposing we could get consistency by actually exposing renderTime. If we chose to do that, it would be a good opportunity to also fix the previous issue.
If we still feel it's important for event timing to round off this property, which we don't do for other paint timings, we could just round it off at the renderTime property, so all event timings that appear together in the same frame would have the same value, and it would let you group by frame very easily.
There are, of course, times where an event happens, an interaction happens, and nothing actually changes: there will not be a next paint, rendering will not be queued, there's no rendering opportunity, and so we actually fall back to measuring a lesser time. But the spec is not really clear on when that happens, so there are some simple requests to change that. There are also times where there's some ambiguity as to what really is the next paint. As an example, I recently patched visibility-change handling.
If you click a link that opens a new tab because it has target="_blank", you'll automatically get a visibility change, perhaps before the page is able to render. That's fine; the user is not waiting for that next paint opportunity. But technically there was a next paint requested that never gets there, and so we used to expose the time when the tab switched back to visible, which doesn't make sense; it would be an arbitrarily long time that the tab was in the background. So, changes like that.
There's also the case of page unload, which we just talked about: rage clicks that lead to reloading or unloading don't get reported. In those cases you might never have gotten a next paint, but we think we should still count the duration that was delayed; perhaps unload itself was delayed, etc., or maybe the user unloaded because of the long responsiveness issue. So those are still important to record, and we'll clear those up.
And finally, another one is History API usage; I think I'll talk a bit more about that in a sec. So yeah, there might be changes where we don't always want to measure, technically, until the next paint. All right. And then finally, we've had requests to add more support for exposing different event types.
Some of these seem like just oversights. There are different types of focus events, and we report some of them but not all of them. Forms have different events: we report on input types, or rather change and input, but we don't report form submit or reset, and these are discrete events typically triggered by interactions with the web content, so we could still filter on trusted discrete inputs.
There are also other types of discrete events that technically aren't interactions with the web contents, like pressing the back button in your browser, which will fire History API events.
You didn't really interact with the web contents itself, but it was a discrete user interaction, and there could just as well have been a back button on the page, so there isn't much conceptually different, and so that should be measured. Same with the drag and drop API; it's very similar. So we might be doing an audit of those. We've gotten some feedback; Microsoft recently filed some requests to add a few of these.
But if anybody is measuring event timing right now and missing things that you think should be there, let us know. I think with that, that's it for the presentation. Cool, so let me stop the recording.