From YouTube: WebPerfWG F2F TPAC 2019 - day 1 early afternoon - CompressionStreams and JS profiling API
B: Other use cases: quick decompression for native file formats; keeping network responses compressed in memory rather than having HTTP decompress them eagerly. Databases are something I've been thinking might benefit. Also lazy decompression for downloads: normally downloads end up decompressed by HTTP, but you might actually want to get the compressed input and decompress it later, when you actually need it, saving memory. And things we haven't thought of yet.
B: Basically, if you have these APIs, then [gzip and deflate are the obvious starting point]: they're both implemented in the same library, and switching between them is trivial. There are no options in the first version, for reasons I'll get to, and we're expecting to make this available behind an experimental flag in Chromium; I can't speak for any other browsers.
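As a concrete illustration of the first-version API being described — just two format strings, no options dictionary — a minimal round trip might look like the sketch below. `CompressionStream`/`DecompressionStream` are also available in recent Node.js, which is how this was checked.

```javascript
// Round-trip a string through the CompressionStream / DecompressionStream
// transforms. Only "gzip" and "deflate" exist in the first version
// described in the talk; there is no options dictionary yet.
async function roundTrip(text, format = 'gzip') {
  const bytes = new TextEncoder().encode(text);
  const source = new ReadableStream({
    start(controller) {
      controller.enqueue(bytes);
      controller.close();
    },
  });
  const decompressed = source
    .pipeThrough(new CompressionStream(format))
    .pipeThrough(new DecompressionStream(format));
  // Response is a convenient way to collect a byte stream back into text.
  return new Response(decompressed).text();
}
```

As noted in the discussion, switching between the two formats is just a matter of changing the string, since both are backed by the same zlib code.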
B: Okay, so the reason, more generally, for [having no options] in the first version is that the more exotic encodings have more difficult decisions involved in whether to support them or not: the cost-benefit analysis in terms of keeping around static dictionaries, and in terms of the extra cost in the binary of linking them in properly. Brotli, for example, brings a static compression dictionary of about 140 kilobytes.
B: Sorry — so, yeah, the first version, to explain: we only support gzip and deflate, and a bunch of people told us no, "deflate" is terrible, because of the long-standing confusion over what "deflate" means (in HTTP, "deflate" actually denotes the zlib format). Okay — and so we decided to ship the two algorithms [anyway]. Yes, so if "deflate" doesn't really mean anything, then we're going to have to come up with a new name for what we've called deflate.
B: [You might] accumulate [output] until you have — like, there are various strategies that the format defines, and not every library [supports all of them]. Going back to the SSH client example: SSH requires a specific strategy that, for example, the gzip library for Java hadn't supported, so it was impossible to implement SSH in Java for the longest time, unless you rewrote the [library] or did a bunch of weird stuff. So.
B: Basically, the idea is that the options will support pretty much everything that makes sense. So this is not supposed to be an exhaustive list of the options that will be supported and, in particular for flushing, there's an issue open on the GitHub repository for the explainer about adding flushing options.
B: Another option is to specify the implementation very closely: to say, okay, [level] one means you should produce precisely this byte output for this input and nothing else. I don't want to restrict implementations to either be zlib or to emulate zlib exactly, but there's a discussion on the GitHub about actually standardizing the exact byte output of the compression. So.
A: That goes back to the Brotli [quality] 11 use cases. It's the same use case, and I think it's potentially a footgun: would you spend 10 or 100 times more CPU time — like, user CPU time — on compression that gives you 20% [better ratio] for a single upload? I mean, the use case for Brotli 11 and zopfli is that you do that on the server once, ahead of time, and then you serve it to a million people. This is a single upload, so the trade-off [is different].
J: [It's similar to how] we don't specify the [exact] encoding algorithms for [PNG] or JPEG; there's a little wiggle room there and that's fine, because there are good-enough standards for those formats. Specifically, for canvas: if you call toDataURL, there's an optional [quality] parameter that goes from zero to one, and implementations just interpret that however they want.
I: There are variant scenarios that have different limits. So, for instance, [Brotli] as a file format currently has the ability to use a window size that is larger than the window size available to [brotli] as used in browser downloads. I think that has a consequence: you may want to not have that option on the table, because you don't want a footgun where you compress something in a way that some browser can't decompress — but, simultaneously, you want to have it available, because —
K: The decompressor may not be able to deal with some compressed content that would have been [produced elsewhere]. In the most extreme case, you can imagine you compress something in a browser and then send it up to the server; the server then serves the exact same compressed content to another client, and that client won't be able to decompress it.
I: Yes, I mean, you certainly can always generate something that can't be decompressed — insofar as you can do [raw] deflate yourself and give it something that only you have access to. But the question is to what extent the API should guide people toward what browsers have standardized as their acceptable range of [encodings].
I: So, like, if I call CompressionStream and don't pass any options, I'm guaranteed that every modern browser is going to be able to decompress that content — if it were returned as an HTTP response — and only when I go off [the defaults], say by specifying a window size, might I potentially generate a compressed stream that the browser could indeed [reject]. Yeah.
A: But other browsers that are using [DecompressionStream] could [handle] it, if they specify the same options? Yes — similarly to compression dictionaries: if you control both ends, you can do weird things. And so I think this is a good point. I also think that maybe we do want footguns there, in terms of large window sizes or Brotli [quality] 11 or stuff like that, but it should also be mediated behind [flags or explicit options].
A: [We] need to consider that use case and, if that use case is viable, then [support it]. But I'm currently not convinced that there [is one] — I mean: [a] human reader, SSH, EPUB — those are capabilities. Compressing the contents, you know, for a one-time upload, compressing it twenty percent more while spending a thousand times more CPU on it, doesn't sound like a new capability. But we should [look at] the use case and then decide whether we want to go there or not.
B: So interesting ones [might be]: snappy, which is already compiled into Chrome and Firefox — which kind of changes the cost-benefit calculation a bit — and [Brotli] which, as we've mentioned, has the decompression side compiled in, but the compression [side of the] library and compression dictionary aren't, and so it has a hundred-and-forty-kilobyte cost if we want to support compression for it. Yes.
B: Right — oh yes, so I was discussing what criteria to use to pick algorithms to [add], particularly given the last item, the last bullet: [which] algorithms should probably [be] supplanted in future. The good thing about the [deflate family] is that they're incredibly old and deeply baked into the platform. We can be really confident that we will probably never manage to get rid of them; they'll always be compiled into your browser.
M: So the question — [Brotli is] both reusable [and] clearly [useful to] expose — [is really] the question of when [something] should or shouldn't be [exposed]. I think that's sort of the core of what we're getting at here, and the core of that, at least from my perspective, is: do the positives outweigh the cost of addition across all browsers? And so, [for Brotli], it gets you [better] text [compression with] good throughput on decompression; [the question is] whether or not you can get sufficient compression on upload.
B: But that comes along [with the binary regardless]: you happen [to] pay the cost [whether or not it's used]. But yes — the same thing is how you would add [Brotli] yourself, if you had your [own] build. My idea is that you just add it to the algorithms dictionary with a factory [function], and then it works just like a built-in.
B: There are a few other ways we could make user-defined ones work. One alternative was to have a separate constructor for each algorithm: you just test for the existence of the constructor, and you add an algorithm by adding a new constructor. I was concerned about adding an abundance of new constructors.
B: Another option is to have some kind of registration method or static function that you use to register an algorithm, but then I have concerns about the scoping of that: if you register an algorithm, where have you registered it? Is it registered to the window object? Have you registered it to your origin? If you've registered it to your origin, what [does that mean]? I mean, if you try to do it from a worker, does that call back into the main thread?
K: [What] I'm saying, as part of the process of [deciding what] the standard [adds], is that we should use a pretty strict [bar] before [something] moves in — maybe like snappy, in [whatever] case it gives us — [because] you might want to [remove it] in future; so [additions] should be considered on that kind of [basis], I think.
B: I think it would be good to use the [intent-to-ship] process, or any other [review venue], but I think even just within Chrome we're going to get pushback any time we add a new algorithm, just because the binary size increases. So that's going to be a limiting function that will stop us going crazy and bringing in new algorithms.
B: So [Domenic Denicola], who is not here, had the wonderful suggestion of actually randomizing the results of the feature test and the actual availability of compression formats — so we only make Brotli available 50% of the time. Yes — so we basically force them to have a working fallback, because —
K: [I] think it'd be good if there was a list of, like, the things you want to do well — a very concrete scenario of why and when a JS [library] would use [this], ideally one that's already being [done] by a bunch of these others. It would be really good to have such a [write-up of] why they're doing that; that's a complete — yes, yeah — example.
A: If your individual chunk is, I don't know, a gazillion megabytes, what do you do in terms of memory consumption of the renderer? Is there a way for this API to just say "no, you're not getting anything, because I killed it" — after I decompressed a few bytes, I realized that this is a [zip] bomb, and I killed it.
B: So we don't make [that call] — I don't want to make that judgment [for users]. In the past I [toyed] with adding options to say, like, throw if the decompression ratio is over 100 or something, but that seems extreme. So the aim is to make things work intelligently. If you're reading the output in a streaming fashion, then hopefully you know what you're doing. If you are using the response —
B: And you could — because this is streaming, the output is streaming — do something more [intelligent] using the [stream]: say, abandon reads after, like, up to ten megabytes or whatever, and then throw. [Reading everything at once] is only if you actually want to not care — you just want to read everything into memory and go ahead — and [then] you get [a failure] with an error, but that's kind of what you [asked for]. Yeah.
A: Okay, but it would also be nice if there were easy ways for the consumer to handle that. I mean, arguably the bigger ecosystem question is: will everybody allow this, because they're worried [about] their [own] servers? Because the server-side one is the more interesting one: if I could [attack] your server, because your server-side [implementation] was insufficient[ly careful] about the [decompression], then that's at the center of the concern, or the reason why —
B: Yeah — if [the server] is weak that way, then it's going to be subject to [abuse], and I'm [already able to inject] my data into it anyway. So yeah: "don't compress stuff" — like, don't support compressed uploads — or "make [your server] super robust" is, I think, the [answer]. I mean, generally speaking, people need streaming [decompression] on the server, in my experience.
D: [What we want is profiling] tools exposed [to pages] — so, [for] example, [profiling] periodically for [performance] analysis. Of course, [developers] want to be able to better understand the long tail of users: so, like, if someone has a slow load and they're executing [long] tasks — yes, even with, like, [the] long-tasks [API] it's still pretty hard to figure out exactly what [JavaScript] is [making things] slow. So we [filed] this a while back; [if] you take a look at the explainer, which is in [a tab] that hasn't loaded —
D: — yet, you can see what we did for our polyfill, which is to inject a bunch of instrumentation everywhere. It was a lot of fun, but we want [to bring] this down. So yeah, basically we've [built] this in Chrome, built on the [V8] dev-tools profiler, with these great new additions such as context isolation, and only showing same-origin or [CORS]-passing script. So that way [you get] similar protections to what you see in [error stacks]. You can check out the explainer there for the full scoop.
D: So this is what the current API looks like. You say to the browser: hey, kick off a profile, [and you] specify your [desired] sample interval. The browser's like: no, take this sample interval instead — which is a best effort based on the underlying implementation — and begins profiling. So, at some point, when you want to stop, you stop the profiler, and you send [the trace] to the server for analysis later. So that was a simple [starting point] for us, to learn what SpiderMonkey has, as well as the Chrome trace format.
D: So, in doing this, we learned a lot and we arrived at several constraints, which I think we found to be pretty universally applicable across [engines]. One thing [we found] interesting is that, on Windows, we're basically stuck with using [low-]resolution timers, unless we want to drastically increase power [usage]. Yeah — so Windows has a default clock [interrupt] interval of 15.625 milliseconds, so on Windows [we] clamp the profiler's returned values to a multiple of this, so that we don't chew through the user's battery.
M: [You're] leaving the default at [15.625] — but if you were to use, [say], media: the media team on [that] platform did work to explicitly add code [that bumps up the clock], and the renderer changes the timer [resolution]. Yes — and if you [measured] at that time, [the interval] would be, I think, eight [milliseconds], not sixteen.
D: We didn't actually use higher-resolution timers at all. This was mostly based on how Chrome does its current [sampling in the V8] layer: it does actually specify a constant which it uses for all of those timers already, which was equal to this number. The media [case] is an exception; it does activate higher resolutions. I don't know which — like, in the past, [what] was done was to [accept higher] power consumption for dev-tools usage.
D: Another constraint is that V8 — and SpiderMonkey as well — do not [by default] build the information that we need to actually collect source-position information in regular JavaScript compilation. So, what that meant in practice is, when we actually wanted to fire up a profiler, we had to build a source-[position] map, which was fairly expensive: we had to do a walk of the heap, since we didn't [have] it [stashed] elsewhere, and actually [collect it].
D: Yeah — and we blocked this on Site Isolation; our initial experiments showed that it had many of the same concerns as you might have with SharedArrayBuffer with regards to timing attacks. So yeah, we also gathered a bunch of feedback in the process. Because of that latter point we were directed — you can check out [the issue] — toward [COOP and COEP, cross-origin isolation], which, at the bottom, is a whole can of worms, but effectively it's a similar set of mitigations.
D: Another issue that we saw was that people were interested in doing some kind of pre-[warming], to avoid that initial source-[position-map] allocation overhead, as well as ensuring that you're not [letting] external scripts basically chew through, like, some extra cycles of yours by [invoking the] profiler all the time. So yeah, it makes it a little bit harder to [just turn on].
D: If you require a header, or maybe a [meta tag], to indicate that you [opt in] to get profiling later on — and here's the big question — it does make it more challenging for people to just drop in, like, a third-party analytics [script] and be able to use [it]. For us: do you have any comments regarding what sort of usage you would see yourselves [adopting] going forward — like, what would be [the right opt-in], or [would it] suffice to warm up [on demand]? So.
D: [On] overhead: at one [point] we saw around, like, four to five percent of cycles being spent inside of [the profiler]. This was with the ten-millisecond sample rate and stack unwinding, and we could not [break it down further], but that's likely to be highly implementation-dependent. Other implementations might want to constrain their sample intervals as well, if their stack [unwinding] is faster [or] slower. So we do [leave] that [flexibility]; it's possible.
M: "Possible" doesn't mean that, you know — [there's a price] you pay, [and] there are lots of reasons why, right? It [only] pays [off] at the end, and there's nothing at the beginning. And ensuring that Chakra — you know, our previous JavaScript engine — could enable that [would] then [have] require[d it] to be built [in]. So.
D: So another thing that will be necessary, in order to start [writing] web-platform tests for this, is [integrating] this with WebDriver, so that we could actually [get] deterministic, non-[flaky tests] for this. We've already started writing up some tests — [WPT] tests — and we don't have this type of [hook yet, but we want it] ASAP, so [we can] start moving those over to [web-platform tests].
A: Got [it] — so both — look, the first bullet point makes it harder for analytics to measure; the second one is, okay, there's an opt-in from the first party, [which] is the thing to do. But then the first one is probably similar to the [TAO] situation. Like, I don't remember — [Nic showed] some data about [TAO] usage at some point, but it's still not a very high percentage, [tens of] percent or something like that. Yeah — so, and yes, shared array buffers and other things will also be blocked behind this.
H: But what do you get — what's the result, actually? Do you just get JavaScript frames, [or do] you also get, you know, rendering information? And my second question is: in JavaScript you get the name of the function — if you have an extension, would you be able to see that code as well? So, if the extension [injects scripts] —
D: Yes — there are no renderer internals exposed, other than top-level, like, [DOM] callbacks. So if you call [setTimeout and it gets sampled], you'll see that; [but you] won't see any native frames or anything like that. I'm also happy to dive into what [the trace format] looks like; it's loosely inspired by the [Chrome] tracing format.
D: So you can see [an example trace here]. You get your JavaScript stack frames; the samples are associated with stacks, and each stack frame is associated with [a function, script resource, line and column].
D: Yeah — there's a whole specification of this in the draft spec attached to the [explainer]. The function names, by the way, use the [ECMAScript] standard['s] function-name [semantics], so it basically falls back on that. I presume that most users of this will use the source-mapping infrastructure — like, in our testing on facebook.com, we've already got our infrastructure [set] up to ingest these traces, and, yeah, it works.
D: Right — so it depends on your sample interval, [and] depends on how much unique [script] is present on the page; it could be a little [degenerate], or have a fair amount of redundancy. Ultimately, I think we saw somewhere in the range of — I want to say — 500 to 700 kilobytes per [second] of sampling. It's been a while since [I] visited [it] last; that was on [a] page which has lots of [hot] JavaScript. It's likely to be less for idle traces, and even less compressed — I believe the compression ratio we [saw] is 0.25.
D: It was [JSON], of course, and [JSON] is a [bit] heavyweight in general — and we also hope to be leveraging [Compression] Streams to figure out whether or not [gzip], or whatever compression algorithm we use, can suitably mitigate that cost, as well as start landing [web-]platform tests and WebDriver support for forcing sampling, and long-task attribution. So, Tim, do you want to chat a little bit about that? Sure, yeah.
D: [The] question [was whether] you [could] have long-task [attribution] or not. My hope is we can leverage this same infrastructure. You can imagine a world in which, when you're in a long task — you cross the long-task threshold, so you've [been] executing a single task for greater than, say, 50 milliseconds — we can, if you're opted in, start taking samples.
A: Okay, so whenever someone registers a [PerformanceObserver] for long tasks, that could be a point in time where you say: okay — or, for long-task attribution, that could be a point in time where you say: okay, I'm turning on [the] profiler for this purpose — [rather] than just running during all tasks, short [ones] included, [and] the same [data delivered in the entry] instead of trying to [ship it over the] network. But once people have registered for long tasks with attribution, I would [expect it to] have performance costs similar to [the] profiling, no? No — I believe [so].
D: Oh, okay — so [50 milliseconds] into a long task we start sampling, and as soon as that specific task is over, we stop sampling. Yes — so there is a small issue [with] that: at least in most [engine] implementations, the first time [you] start JavaScript profiling [we have] to [do a] heap walk to set up the symbol [map], and [so] I think you would need to opt in in advance. [Betting] on this, I think maybe you'd use the same [opt-in] to say [so].
O: What [we'll] usually try to [do is] enable [it selectively] — A/B testing, [say] — you know, instead of [blind] compares for a particular user. The way we will do it — probably, or I shouldn't [say; the way] most people will do it — is per-user [selection], right: you [flip] switches rather [than] enable it when you [are] hitting this magical [threshold]. [There's] also, like, a memory [cost], right: if it is enabled, you know, [do] those not [enrolled still] pay the [cost]? I don't [want a] 5% [overhead] — yeah, but then the problem is that it costs.
D: So I think that, from [a] game-theoretic perspective, it doesn't make sense for content authors to be very aggressive with profiling: people don't want [their sites to be] slow. Like, at Facebook, we could [never tolerate] that much percent [of] slowdown for all users. But I understand your inconsistency argument.
D: [Putting] on my web-developer [hat] here: we don't serve users different versions of [our] instrumented [JS], for the most part, so we effectively serve this expensive instrumentation even though it might [only] be activated [for], say, maybe, like, one percent [of] users. So using profiling [lets us remove] a lot of [instrumentation] that everyone's getting anyway, and replace it with just, like, an on-demand, [UA]-side overhead.