From YouTube: 2021-04-22 Frontend Performance Discussion: Source Editor (with Monaco) in Blob viewer refactor
Description
On April 22, 2021 we had a discussion to align the path forward of using the Source Editor in read-only mode to improve performance on the overall user flow of using the repository browser and viewing blobs (file content).
Agenda
https://docs.google.com/document/d/18qidhvZksXJ7TxdgmJQEJB1IVISr-aSki00bONhbAf4/edit#
A
Hi everyone, good morning, welcome. We're discussing performance with the Source Editor, which is potentially a path forward for the refactor of the blob viewer in the repository browser of GitLab. We identified some hurdles, especially around loading the Source Editor for the first time, which has to load Monaco.

A
It's a heavy piece of machinery, but it does allow very amazing things to happen, like quickly switching from view to edit. Also important, and Tim mentioned this in the issue: we could stop rendering the diff markup on the server and send the raw data of the file instead, which is much simpler on the infrastructure side of things.

A
So that's kind of the pros in a nutshell. The cons: it's just big. A lot of code, which is, one, the asset itself, and the other is the weight on the frontend, which has to be somehow mitigated to the point of being faster when rendering read-only and not worsening the metrics too much. I think we're okay with accepting a little bit of a hit because of the benefits, but it's important to weigh it. How can we?

A
So I don't know if this was a cold cache, but we had that regression in production, when we were loading it inadvertently, of the browser crashing. I think it was cold cache, but I think it happens even on a warm cache: a very large file kind of froze the browser for many seconds. It went from 11 seconds to 40 seconds of total blocking time, which would be a very high total blocking time. But personally, I don't think Monaco is the culprit.
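The Total Blocking Time figure quoted above follows the standard definition: for every long task, only the portion beyond 50 ms counts as blocking. A minimal sketch of that computation (the function name is ours, not GitLab code; in a browser the durations would come from a `PerformanceObserver` for `longtask` entries):

```javascript
// Total Blocking Time: for each task longer than 50 ms, the excess over
// 50 ms is "blocking". Input: array of task durations in milliseconds.
function totalBlockingTime(taskDurations) {
  return taskDurations
    .filter((duration) => duration > 50)
    .reduce((sum, duration) => sum + (duration - 50), 0);
}
```

A single frozen 40-second task would therefore contribute almost the whole 40 seconds to TBT, which matches the regression described here.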
A
I think the culprit is the conjunction, the joining of the current code on the page with Monaco. So there are probably some areas there that we can work on in joining those two code bases where it now blocks. There are probably some listeners on the page being triggered when we're rendering, when we're loading Monaco somehow, or importing Monaco, because I think that was the mistake, right, Jack? Is there anything else for or against?

C
Well, I think that sums it up nicely. I'm not aware of the large-file loading being related to the Source Editor, or Monaco in particular, but that's an interesting thing. I do not think there has to be any...

C
There shouldn't be anything Source Editor- or Monaco-related in the large-file problem, because technically there is nothing like that: the way Monaco loads files, it renders just the visible number of lines on the screen at a time, so it doesn't process more than that unless you start scrolling, and then it reuses the existing lines.

C
So it shouldn't be that. Apparently the biggest hit was just the network request and then, depending on the file type, how that file has been processed by the other scripts. But from the editor's side it shouldn't be an issue; at least, as I said, I haven't seen any issue related to this.
A
So what I was aiming at, or mentioning, is the thing that... I blanked on the performance discussion we had recently, because it popped up in production. Grant...

A
He mentioned on Slack that we had a big regression in the numbers, and the theory was that your MR, Dennis, that stopped importing Monaco when the feature flag was off, would mitigate it. So just by importing it onto the page, somehow, that made the page block from 11 seconds to 40 seconds, and that was a very large file.

A
So that was the case I was talking about, and we've seen multiple times that Monaco handles that well. That's why I'm, well, 98% sure that it's our code that is doing that, not Monaco. So one of the things we can take a look at is what exactly caused it from our perspective, from our side of the code. Is there, like...

A
I don't know, some popover code or some listeners being triggered that are causing that? We've seen cases in the past where even v-safe-html has caused a lot of TBT to go up. So maybe it's a v-safe-html that is causing that to begin with, I don't know, but...
C
That would be an interesting experiment, to see the scalability of the Source Editor, because I actually didn't do that myself yet: to figure out how the size of the file affects the loading. But again, this is a pretty secondary issue. I think the main concern is the loading performance of the editor itself, right? That's the main thing, and how we deal with the large files we can sort out.

B
When we fixed the SVG icon problem, this already improved a lot, but I've seen that emoji JSON can take 70 seconds on the backend to render the API request, and that's ridiculous, right? I did some tests with bigger files that take 10 to 15 seconds to get the API response, and then, on top of that, it's sent as one huge chunk. So we can't do any performance optimization on the frontend; we are just injecting a huge DOM element with thousands of DOM nodes that get rendered and styled and everything at once. Yeah.

B
It's a huge amount of complexity, and the servers are already under a lot of pressure. You can see this very clearly if you just look at our performance measurements: you have a seesaw. You see it on a lot of pages, some bigger, some lower, but you see day-and-night cycles, which, in my opinion, should not happen. If you get your system to 100%, the impact of load shouldn't be visible straight away; it should only show up on very specific days. But we need to get to a point where we can. Right now...

B
The main focus is on the database, but I think there are tons of other topics. Two of the biggest topics that I've seen from the backend side are rendering files and code highlighting of those, because what we even do is: files that are not even highlighted on the backend are going through the pipeline, et cetera. And the other thing is markdown comments, so doing the live markdown...

B
...on every comment, a very heavy process. So yeah. And the other thing that I saw was: we could decouple metadata from content, because we can simply go to the raw endpoint on the API, which is also nice. There we rather have the biggest problem with authorization. I think I mentioned it at some point: if you take the same file and you put it in a public project, a thousand-line file, it takes 20 milliseconds to load from the backend.
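The raw endpoint mentioned here serves file content without server-side markup, under GitLab's `/:project/-/raw/:ref/:path` route. A small sketch of building such a URL on the frontend (the helper name is ours, not existing GitLab code):

```javascript
// Hypothetical helper: build the URL of GitLab's raw blob endpoint,
// which returns file content without any server-rendered markup.
function rawBlobUrl(base, projectPath, ref, filePath) {
  // Encode path segments but keep "/" separators intact.
  const encodedPath = encodeURIComponent(filePath).replace(/%2F/g, "/");
  return `${base}/${projectPath}/-/raw/${ref}/${encodedPath}`;
}
```

Fetching from this endpoint and handing the text straight to the editor is what makes the "decouple metadata from content" idea cheap on the infrastructure side.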
B
If you take the same file and you put it in a private project, it takes around 500 milliseconds just to get authenticated and authorized to actually take a look at the file. So if we are able to improve this by as much as I want to have it improved, this should also go down to a maximum of 100 milliseconds.

B
This means that if we do this right, with startup.js and prefetching and preloading Monaco and mitigating, we should be able to get the whole file content of a 1,000- to 2,000-line file in 100 milliseconds, which is nothing compared to what we are doing right now. So those were my main targets: to really take the load off the backend. The highlighting library that we are using is, as far as I know, quite outdated, because the person who was the maintainer left it ages ago and is not really maintaining it anymore.

B
And we can reuse that with the VS Code extension, so I think we should shift our focus there. And yes, I know there is an impact on loading the specific file straight away, but for the rest, if you think about it: if you go to a project, you will never have a code file on the first hit. If you go to the project, you will not have a code file.

B
So as soon as we have rendered the readme, we could already start loading Monaco for the whole application, in that sense: we go for eager webpack loading of Monaco, which we have already, as soon as you go through the directories or to a file.

B
The other thing that we currently have in preparation, and I hope that I have some time to finalize this, is the whole PWA thing. One of the first targets to get this going and using it would be caching our bundles, and we could also prioritize. I would do hand-picking at the beginning, and one of the things would then be Monaco.
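The "hand-picking" idea above can be sketched as a filter over the asset manifest: instead of having a (hypothetical) service worker precache every chunk webpack emits, only a short allow-list of bundle names is cached first. All names below are illustrative, not real GitLab chunk names:

```javascript
// Hand-picked bundle prefixes that a service worker would precache first.
const HANDPICKED = ["main", "monaco"];

// `manifest` maps chunk names to asset URLs (e.g. from webpack's manifest).
// Returns only the URLs whose chunk name matches a hand-picked prefix.
function bundlesToPrecache(manifest) {
  return Object.entries(manifest)
    .filter(([name]) => HANDPICKED.some((prefix) => name.startsWith(prefix)))
    .map(([, url]) => url);
}
```

The returned list would then be passed to something like `cache.addAll(...)` inside the service worker's `install` handler.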
B
If you go, it completely throws. So I think there are a couple of nice ways of mitigation, and we have a clear target. When I did the PoC, one of the base things, and we discussed this in the application performance call, is that this is like an evolution of all our performance measurements. We started with loading because we needed to start somewhere: loading measurement is the easiest, and if you don't have anything on screen, you can't do anything. So, let's start with loading.

B
If we really want to improve the user experience, we need to have the full experience measured. What are the tasks? What are the actions that users are doing? We need to measure those, and we should get those down in total, where loading is, yes, the starting point, but it's a classic trade-off: you can exchange a little bit more loading time, especially if you mitigate it nicely and do this nicely, plus add enough guards and checks in between, so that...

B
...we are not including the bundle by mistake. With the quality stuff in place, that should get us to the point that, if we include it by mistake, we already block it on staging, or at least see it very, very early, and that's what also happened to some extent now and then. Then we are fine: we can make that exchange if we get everything past the loading 50% faster.

B
I don't care if we load a little bit longer on cold cache, and that is something I'm very happy to bring to the rest of the company, so that people are not going: "Oh my god, this graph is going up." No: the loading graph went up on purpose. But even there I see that, apart from one use case, which is cold cache on one single file, I think we shouldn't have a problem.
A
I think we would need help there in spreading the message that this is an exception. I like the idea of eager loading as soon as the repository browser boots up. It might not be the project home page, it could be a folder as well, so I'm going to put "or a folder" here. The one thing that I did not mention yet, and I don't see it as a con, but it is a cost: after this is in place, we'll have to reimplement the code...

A
...intelligence tools we currently have on top of the markup. Sourcegraph would have to be updated, but they have a layer for how to implement that, so we would probably have to implement that as a Source Editor extension, and the same thing for the GitLab code intelligence that we already have built anyway.
C
About Sourcegraph, I might be confusing things, but we do have Sourcegraph in the Web IDE, right? So it does have the layer for the editor. Oh.

A
Sure, yeah, but that's what I think we do. Basically they add callouts to their platform: when you click the link, you go there. The one that we built stays within our code base, stays within gitlab.com. But that's just to mention that; on the call, going back to what Tim was saying...

A
I feel like that's a way forward for us. Do you think they'll be able to put the metrics... so, to have SiteSpeed, for example, test only a second visit? Or, I mean, they already do multiple loads, right? Yes.

A
Okay, I'm not really aware of that configuration on the metric side, but that would be an interesting thing.
B
I have discussed this for a long time with Christy and others, and they are totally aware of it: rather, let's improve the overall performance for the user and the whole experience, rather than specific loading times. But yes, loading time is our starting point; we need a common understanding of where to start. And yeah, it's really about that. That's what I want to investigate, also for all other areas: what are the common use cases, what are the things that people do, and how do we make those faster?

C
...is faster, right? I would like to make just one comment about measuring only the cold cache. By default, SiteSpeed, I think, does two rounds now, right? So the first one is cold and the second one is hot already, right? So, you know... No? Both?
C
Okay, that's great to know, but the point is that, in my opinion, we should not rely purely on the hot cache. We have to know if things go nuts on the first load as well, so we have to do both things. Otherwise we might miss something really, really dramatic.

C
Like in this case that Grant posted, when we had the LCP over 40 seconds: if we measured only on the hot cache, we might have missed this problem, right? So we should not ignore cold cache results.

B
And with the cold cache, for example, the GitLab project home has an LCP of 2.498 seconds, and if you do the same thing with a warm cache — which, quite simply, goes to the gitlab.com/explore page and afterwards loads the specific page, so it's not even hot cache on the same page, that is also something to remember — then we come down to an LCP of 1.4 seconds.
A
Yeah, I'm cool with that. Saying that, we don't really know where the user might be clicking the link to the blob. For example, it could be on a merge request, but it could be outside of GitLab completely. But as soon as they go through a GitLab page, we should be able, if we have the service worker in place, to mitigate that through the service worker by loading Monaco.

A
So, do you think that we will fall into the risk of adding everything into the service worker to cache? Because for us Monaco is really important, but I'm pretty sure other groups have other big code applications that they would like to load immediately. Do you see anything else that might be... just to make sure that we're not loading like a hundred megabytes on the first load of GitLab.
B
So let me share my screen. If you look at it, this is all Monaco: that's the TypeScript worker, that's Monaco itself. The Balsamiq viewer, I'm pretty sure, is not used very much across the app, so I think this is something I would definitely hand-pick. And then you have already the merge request chunk; then we have the notebook viewer; the OpenAPI viewers are really special stuff.

B
The YAML worker is, I think, also Monaco, if I'm not mistaken; PDF viewer, design page, yeah. So, to be honest, the biggest thing that I would add on top of that is this, which is most probably only GitLab UI, and take this into its own library, but you'll need it anyhow on the first load, because yeah, every page should use it. But I'm not really seeing anything different. Apart from that, there are no bigger libraries used across the application that are not in main at the moment. We...

B
...get to a certain stage where we can take jquery-slim out of the main bundle and then rather have it preloaded by the service worker; that might be something at some point, which feels weird, but as long as we then have a couple of extra plugins and special viewers that still rely on it, yeah, why not? But for everything else, it seems that Monaco would in reality be the perfect starting point, and maybe also the main thing anyhow.
B
We are working very heavily also on the Plan side in that direction of having one single application for the issue detail, the issue list, plus the issue board, for example, so that you are also able to switch very, very fast there. And maybe at some point we will also preload the whole issue application, in that sense, so that if you hit an issue, it's also very fast, and maybe the same for MRs. Those are my three main targets.

A
Okay, I was typing, that's cool. So I think there are a couple of things here that we need to look into more deeply. So this will essentially... the way to mitigate it would be to cache it through the service worker and optimize...

A
...for, like, a warm cache testing approach. We kind of put this as a requirement, so we need to have the service worker in place before we can ship this refactor, and that's okay, because we do have a long way ahead of us regarding rewriting the viewers and everything. We are preparing it in a way that we'll be able to enable it whenever we want, even if all the viewers aren't re-implemented already: we'll just send those to the HAML view while they're being implemented for that type, and the other ones will go to the new ones.
B
If we go... yeah, we could also try simply eager webpack loading, so that we simply take it out of the bundle but stay on the optional, conditional loading of Monaco. We could already use eager loading, which basically starts as soon as you're not doing anything anymore. That might already mitigate a lot of cases anyhow; that is something we should try out.
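The "take it out of the bundle but still fetch it early" idea maps to webpack's dynamic `import()` with a prefetch hint. A minimal sketch, assuming a wrapper of our own naming (not existing GitLab code); the memoization makes repeated calls share one in-flight load:

```javascript
// Keep Monaco out of the main bundle via a dynamic import; the
// webpackPrefetch magic comment tells the browser to fetch the chunk
// during idle time, well before an editor route is hit.
let monacoPromise;

function loadMonaco(
  importer = () => import(/* webpackPrefetch: true */ "monaco-editor"),
) {
  // Memoize so every caller shares the same in-flight request.
  monacoPromise = monacoPromise || importer();
  return monacoPromise;
}
```

Calling `loadMonaco()` once after the readme renders would warm the chunk; later calls from an editor component resolve instantly from the same promise.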
B
Then perhaps the service worker is not as high of a priority, or a blocker, but I believe that if we have the service worker in place... The only thing that currently looks weird is that a couple of tests are breaking for very weird reasons; I need to look into that. But apart from that, it would be ready: you just add the URL and that's it, and we'd be out there with caching Monaco, because we have...

A
Okay, I think I wanted to say something else and... yeah, we can go on with the development of that, but I think we still need to validate a few things. We need to talk to Grant, I guess in the quality team, about this, about setting up a measurement, or at least having them informed of the plan, of the current path forward.

B
Yeah, this is also a question: what we might get there from upgrading, because we are more than a year behind the current version of Monaco. The latest came out in March this year, and we are currently using the version from last year, February, and they are...
C
Yeah, there were reasons for that, because at some point mid last year we tried to upgrade Monaco and, technically, anything related to editing was crashing. So there were some breaking changes introduced there. So I will just need to take a look at that from the Source Editor perspective. So yeah, absolutely; I checked the latest version yesterday, so yeah, according to the changelog...

C
There are a lot of things that might help us make the Monaco footprint and impact less intrusive, but upgrading isn't the...

B
...the easiest part, yeah. I was expecting that; I was mainly checking in on whether there might be something fixed by now that we have run into, so that we keep an eye on it and know that we are making progress there. Because I think, if you look at such a huge... and that's the nice part for me of also using Monaco.
B
This is a repository, a library, that has so much invested into it, as it's the baseline of VS Code, that we get this for free, in the sense of reusing a lot of engineering spend and insight. And I think that's a huge, huge advantage that we could have there. So cool, if we have this at heart and in mind, that's awesome.

A
I'm putting at the bottom of the agenda two lists. One is future investigations, which are spikes or things we need to create PoCs for; the other is future work, which will translate into issues, and then we can find assignees for these. Then it would be great if you could...

A
I can talk to Roman if you need me to get you some time for this work, because I think it benefits everyone, you being a staff engineer.
A
I already talked to him a little bit, so he already knows that there's going to be some incoming work. So, Tim, when you say that you're eager loading in webpack, would that be on specific pages? Like, would we put that on every visit to the repository browser area, like a folder or the project homepage, so it would also load Monaco as an asset?

C
All right, I think it would be... for the loading perspective, again, to account for the hot cache measurements.
C
Technically, we could start loading Monaco — give instructions to start loading Monaco — whenever we have any project context, because once the user gets into the context of a project, there is a very high probability that they will hit one route or another that includes the editor, now that we have the Source Editor everywhere.
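That trigger condition is simple to express. A minimal sketch with hypothetical names (neither the helper nor the `context` shape is real GitLab code): the preload fires for any page that carries a project context, folder and homepage included, not just editor routes.

```javascript
// Hypothetical: start the Monaco preload whenever the current page has
// any project context, since an editor route is then likely to follow.
function shouldPreloadEditor(context) {
  return Boolean(context && context.projectPath);
}
```

At boot this would gate a call such as `if (shouldPreloadEditor(ctx)) loadMonacoChunk();`, where `loadMonacoChunk` is whatever deferred import is in place.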
C
And yeah, that's exactly the thing that we talked about during the first call with Andre: that there are...

C
There are two things, technically, that we can do to improve the loading performance. The first one is to start loading, to preload, the Source Editor, or Monaco in particular, on the project level, right. But then again, now that we have all the editing experiences unified, and they are all based on Source Editor and Monaco, I think it's absolutely the proper time to make at least the Monaco chunks first-class citizens that we just preload on the domain level, with just a resource hint or anything like this.
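A domain-level resource hint is just a `<link>` tag. A small sketch of describing it as data, so it could be rendered server-side or appended on the client (the function name and chunk URL are made-up placeholders, not real GitLab asset paths):

```javascript
// Hypothetical: attributes of a <link> resource hint that fetches the
// Monaco chunk at the domain level, before any editor route is visited.
function monacoPrefetchHint(chunkUrl) {
  return { rel: "prefetch", as: "script", href: chunkUrl };
}
```

On the client the object would be spread onto `document.createElement("link")`; `rel="prefetch"` is deliberately low priority, so it never competes with the current page's critical resources.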
C
So no matter whether we hit the editing route or not, getting it would be fine. This would affect mobile users, but our product is not very mobile-friendly anyway, especially when it comes to editing, unfortunately. But in order to improve... you're muted, Tim.

B
Sorry, sorry to jump in. I even saw a couple of changes that they did to Monaco, fixing it on mobile and making it better. It was down and they just improved it; yeah, it's Android stuff. So I think it's more...

C
...more about tweaks. There is an ever-going thread in the Monaco issue tracker where, technically, it's been abandoned for several years; people leave comments like "Is there any progress on this front?" and the core team just refuses to answer there, because no, it's not on the radar.
C
So Monaco is not a mobile editor. But anyway, in order to benefit the whole product, I think making it the first-class citizen and loading it right when we hit the domain would be beneficial: just do it with a resource hint, do it with... I don't know, anything else. And then, combining these things — preload it with a resource hint, put it into the service worker — and then...

C
...we have the combo of things that, technically, on the hot cache will make things super fast. But that's sort of the decision that we may have to make: that the Source Editor now becomes not just something related to the editor routes; it's a product-wide asset, it's important for the whole product, and it has to be treated properly, in accordance with this value.
A
Yep, especially since it goes to read-only, just to display the blob information. Since you mentioned mobile, just to make sure we cover this: do we know of any problems with displaying the content, not the editing? No?

A
Right, yeah. During this week I was thinking of alternative ways for us, like: the cold cache version of the blob would be rendered with the old code base, the HAML, and then the second one would be rendered with Monaco. But that would just keep us having to babysit two code bases.

C
We would still have the load on the backend, and then we would still have the hit on the loading performance. So...
A
Yeah, I agree, so we're not going to go for that. And overall, I think we have some pretty good direction here on things to explore: making sure that the code base of Source Code doesn't clash with Monaco and make it worse and block the whole UI. So that's going to be one for us, Jacques, to see how that really happens, which could be this. So that's one.

A
How does the Source Code codebase "interact" — that's not the right word — with Monaco?

A
That's something for us to look into, Jacques, to see if we can investigate and find what's causing that hold-up, because that, besides the Source Editor, is really specific to our hosting area, right. So I can take care of engaging with the quality team, getting them up to speed with what we're trying to do.

A
We need them on board for this, so I can set up a call with Grant and kind of inform him and get him up to speed with everything. Then, Tim, if I need help, I'll probably loop you in, just to get everybody on board with the whole...
B
...measuring. I will, yeah, and I will keep everyone informed on those things and the things that we want to tackle. David is completely on board — David DeSanto, from product. He really sees the advantages of the approach in total, and that we all perhaps need to invest some cost, but overall, yeah, I think we should be able to do it in a way that you're not even really recognizing that we are loading. Yeah.
C
I just want to make sure... Dennis, I'm sorry to interrupt, but... As I mentioned during the application performance session, when you demoed this: this is awesome, this is wonderful. I just have one question about handling the other types of files that are not code, so markdown... We saw those viewers, for Balsamiq, PDF and things, so for those we will still have to use the existing viewers, right?
B
Yeah, right, yes. The code files were my top-one priority; for all the other files, I would say we are not that great, and it's a little bit chaotic, because I was looking — with the students from the warehouse course, also, before I did that — into the different types that we have. I think, if we can reiterate and organize them, we might even be able... I think the long-term target might be to load all files from raw endpoints, because they are so much faster.

B
So as soon as we have the biggest, but most important, hurdle done with code, and organize this in a nice Vue application with reusable components for real, then I think we can extend them also very nicely, give this also to the community, make this better organized, and go down that road. But yeah, sorry, there will be some time where we need to...
B
No, basically: the code file loads Monaco; for everything else, we need to see if we can load a Vue component that loads the raw files directly — a.k.a. an image viewer — because I really don't believe that we need the backend to render us some HTML to then initialize some JavaScript; that doesn't sound right. So we should be able to replace them with specific Vue components in there. Yeah.

A
I think there's still a decision to be made — it doesn't have to be made now — about the viewer for an image or the viewer for a PDF: should we do that in the blob view app, or should we still load the Source Editor, where the Source Editor doesn't render Monaco and instead renders its own viewer inside Source Editor? That's kind of a separate conversation, but we can have that later; it doesn't really... yeah. Basically, there are no roadblocks.
B
The only thing that I would think about in the long term, and just as a funky idea in the back of my head, is what I saw when I built the PoC. There was this bonus thing that really stood out to me; it was nothing that I thought of in the first place. I had it, and then seeing it — clicking the edit button and having it three milliseconds later — is such a change.

B
What about all the other file types? I would really foresee that at some point it would be really awesome if you are not only able to edit code files. I mean, how many funky libraries are out there by now where you can do very, very fast image compression and image conversion with WebAssembly?

B
If you just look at an image, you click edit, and you can resize, rescale, run it through an image optimizer and hit commit again — and the same for all the other interesting file types. I think that is something we need to keep in mind, and if we do this right, and then structure it right, we should be able to extend this really nicely and get an advantage over those other products that focus 100 percent on code. Yeah.
A
Yeah, that's kind of part of the conversation that we had in the past call: if we do these at the Source Editor level, it will benefit the whole of GitLab instead of just improving the blob. And again, I don't think the roadblock here is Monaco itself, so we can talk about that later, in perspective.
C
That's the interesting thing. Technically... First of all, I would definitely evaluate the idea of making the viewers extensions to the Source Editor, simply on the premise that we will then have those in a very isolated, scoped manner.

C
That would allow us to prevent things like what we had last week, when a shared component got updated and then a whole bunch of routes were affected, because it is a shared component and the component is shared among all of them. Keeping it in an isolated manner, as an extension to Source Editor, would help us have more control, technically, and it's just a safer way of preserving this functionality, to my taste.
C
But there is also one thing I'm working on: figuring out how to asynchronously load Monaco at the Source Editor level, so that we would not be tied to only dynamically loading the Source Editor in the consumer components.

C
So I'm trying to sort that out at the component level, and in this case we might even say that we set a parameter. Like, when we render a blob, we do know the type of the file — whether it's a code file or a viewer file, sort of — and based on that param, which we send when we initialize the Source Editor, we might even say: okay, we do not need Monaco here.

C
We are just in the viewer type, or we need some other sort of frontend-ish technology. So Monaco is used for editing the code files; then another library is used to edit, say, images, for example. So we can sort of create this set of backbones handling different types of files, and then have the Source Editor as the frontier for all those, like a gatekeeper for all those libraries. That would be awesome, actually. Okay.
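The "backbones behind one gatekeeper" idea sketched above could look like a file-type dispatch table. Everything here is hypothetical naming, not existing GitLab code; each entry is a lazy loader thunk so only the backbone that is actually needed gets fetched:

```javascript
// Hypothetical dispatch table: one lazy-loaded "backbone" per file type,
// with code files falling through to Monaco. Neither import path below
// is real GitLab code; they illustrate the per-type split.
const BACKBONES = {
  code: () => import("monaco-editor"),     // code files → Monaco
  image: () => import("./image_editor"),   // e.g. a WebAssembly image tool
};

// The Source Editor, as gatekeeper, asks for the backbone by file type;
// unknown types fall back to the code/Monaco path.
function backboneFor(fileType) {
  return BACKBONES[fileType] || BACKBONES.code;
}
```

The consumer component would then `await backboneFor(blob.fileType)()` and hand the result to the matching viewer or editor, keeping each backbone isolated from the others.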
A
No
we'll
keep
that
in
mind
as
we
go
forward.
I
think
our
biggest
hurdle-
and
this
is
kind
of
a
fork
in
the
road
that
I
kind
of
want
to
mention
before
the
call
is
it's
happening-
is
ending,
which
I
see,
there's
there's
two
two
ways
forward.
A
When we look at the competition, one is the example that Tim gave, github1s.com, which loads a very quick editing experience on top of a GitHub file. The other alternative is how GitHub itself renders large files and handles large MRs, for example: they go in the direction of what is now a challenge for us, the server-side rendering of things. They just put it on the page and then hydrate it.
A
I believe this is a fork in the road for us: once we go this route, we go more in the direction of the github1s.com experience rather than the original server-side rendered version. We won't be able to server-side render Monaco, for example, and then hydrate it. Is that correct?
B
That's the easier one. The best example of what you can do, if you do this right, is linear.app. I'm not sure if you have seen or used it; that is something I'm discussing all the time on the Plan side. It was built by a couple of former Coinbase engineers, who built especially the Coinbase Pro stuff, which handles quite some data in milliseconds. Yes, linear.app. I'm not sure if I can log in here, give me a second, because then I can show you at the end.
B
It's a single-page application, and they do a lot of client-side caching. They use IndexedDB, and then they basically tell the back end: look, that's my version of it, just give me the diff. They start the application itself by having everything already in place and rendering from there. Can you see my window here, just that window, if I move the whole screen?
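The sync idea described above can be sketched abstractly (linear.app's actual protocol isn't public, so the record and diff shapes here are assumptions): the client keeps a versioned snapshot locally, sends its version to the back end, and applies only the returned diff.

```javascript
// Hypothetical diff-sync sketch (not linear.app's real protocol):
// the client holds a versioned snapshot and asks the server only
// for what changed since that version.
function applyDiff(local, diff) {
  // local: { version, records: { id: record } }
  // diff:  { version, changed: { id: record }, removed: [id] }
  const records = { ...local.records, ...diff.changed };
  for (const id of diff.removed) delete records[id];
  return { version: diff.version, records };
}

// In the browser, `local` would be read from IndexedDB on startup,
// so the app can render immediately and reconcile afterwards.
```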
A
B
If I have an application that I'm working in on a daily basis, where I want to get connected to all the areas I work in, that's the experience I want to see on my issues when I'm working with stuff there. That's how I want to see my MRs, and that's how I want to see source code files. I strongly believe we have the right people for this and we have the technology for this; we should be able to do this.
B
A
B
I mean, the only last thing: I don't want to bring GitLab as a whole to that, but I believe the best strategy is to have specific, high-usage areas of the application work like that. For me, that's one for issues, one for MRs, and one for source code; those are my three major areas. Everything else we can keep doing as we do.
A
Right now I feel like that's a more achievable goal in the short term. Yeah, that sounds good, and I think that's kind of a more philosophical thing for us to be aware of. Okay, right! So I think we have some outcomes here to explore, and Dennis, I would need your help with some of these investigations, particularly the webpack one. Do you wanna...
A
You wanna take this investigation of eager loading in webpack, and maybe reach out to Lucas or someone to validate this approach?
C
Yeah, sure, that's possible. I'm not sure how this corresponds to loading Monaco as a resource; well, we do need to know the chunk name anyway, right? So yeah, we have to go through webpack. I can do that. And while we are talking, on my second screen I'm already doing the research on performance with large files.
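For context on why the chunk name matters: webpack's "magic comments" are the usual way to name a dynamically imported chunk and hint eager fetching. The chunk name and import path below are illustrative, not the actual GitLab configuration.

```javascript
// Illustrative only: naming the chunk lets other tooling reference it,
// and `webpackPrefetch` asks the browser to fetch it during idle time.
const loadSourceEditor = () =>
  import(/* webpackChunkName: "source_editor", webpackPrefetch: true */
    '~/editor/source_editor');
```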
A
Nice, cool. So basically what I'll do is create issues for all of these next steps, except the conversations, and then you can post your findings there so they're visible to everyone who is interested in this. I'll create one for that investigation of large files as well. And then Tim, just keep pushing the PWA service worker; that will be on Tim to push, and we can support anytime we can. So, updating the version of Monaco, Dennis?
C
A
C
When it comes to upgrading Monaco, I think it will make sense to do it iteratively and, first of all, to create a technical discovery of how it goes, and from that list all the issues discovered, at least all the specs that fail.
C
A
I would just add what Tim already mentioned: Q2 is around the corner, and so are the OKRs. I feel like they should definitely be... no, no pressure at all, right.
A
A
C
A
C
I would... we can phrase it as eager loading in webpack, but I will do my best to investigate different possible ways of eager loading, yeah, like uber-eager loading. That's right, so it's sort of a combination of different things.
C
About the service worker, we don't have it in place yet, do we? No? It's in...
B
Review, right. Yeah, basically I have two or three failing RSpecs for some really weird reasons. This can take five minutes or, as we already know, it can also take five days, and it depends a little bit on what explodes apart from that in the meantime. But once that stuff is done, we should really have everything in place to just add the URL, and it should go.
C
Okay, okay, sounds cool. So yeah, okay, got it. Thank you very much. We'll keep an eye on that tomorrow.