From YouTube: Grafana Community Call 2021-05-20
A: All right, so just to recap: in Grafana 7, a lot of what we did was front-end streaming abilities, so that from a data source API you could refresh the data and that would flow down to update a graph. In Grafana 8, that now works from backend plugins as well. There's a new API, extended to push data out, so that you can send a stream to Grafana core and it will broadcast it to all listeners of that channel.
A: So here is data coming in; in this case we're aiming at trying to do 20 hertz. What you see here is a very slow-moving GIF of that, but that's the idea: this streaming infrastructure allows us to do the kind of data streaming you would expect from Grafana, but it also allows us to start thinking about some new types of applications.
A: So this was a quick hack at trying to build a Trello-style board where you have multiple dashboards and want to move panels around and see real-time updates between multiple clients. Standard stuff in 2020, right?
A: The other part this is working for is within Grafana itself. Now, when you save dashboards, that action is broadcast to everyone who's there, so saving a dashboard actually updates it everywhere else, like many other applications. I also want to acknowledge Alex on this: a huge part of this is that we're using a project called Centrifuge, which Alex wrote. It's a Go-based WebSocket library that's now part of Grafana core. He's on this call and you'll talk to him a bit later.
A: You should check out that project; it's quite popular. So the kind of system that we're looking at is how to get data into Grafana. You know, we've always had this path where Grafana queries data, but this is looking at: how do we push data into it?
A: And what can we do with that data once we have it? I'll come back to some of these ideas. Within this streaming idea, you'll see that everything we refer to has a channel, and a channel is really just a string that specifies a stream of data. In this new API, we've got that split into three compartments: the scope, a namespace, and a path.
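As a sketch of how such a channel string might split into those three parts (the slash-separated layout and the scope values here are illustrative assumptions; Grafana core's actual parsing may differ):

```go
package main

import (
	"fmt"
	"strings"
)

// Channel mirrors the three compartments described above:
// scope (who owns the stream), a namespace, and a free-form path.
type Channel struct {
	Scope     string // e.g. "grafana", "plugin", or "ds" (assumed values)
	Namespace string // e.g. a plugin ID or data source identifier
	Path      string // the rest of the address; may contain slashes
}

// parseChannel splits "scope/namespace/path..." into its parts.
// This is an illustrative sketch, not Grafana's actual parser.
func parseChannel(s string) (Channel, error) {
	parts := strings.SplitN(s, "/", 3)
	if len(parts) != 3 {
		return Channel{}, fmt.Errorf("channel %q must have scope/namespace/path", s)
	}
	return Channel{Scope: parts[0], Namespace: parts[1], Path: parts[2]}, nil
}

func main() {
	ch, _ := parseChannel("ds/my-datasource/sensor/temperature")
	fmt.Println(ch.Scope, ch.Namespace, ch.Path)
}
```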
A: The scope is kind of who owns that, meaning which part of the program is in control of it. That's going to be either grafana, for the core of Grafana; a plugin, where each plugin ID can control a namespace; or a data source, where each instance of a data source has its own namespace.
A: Within that, a namespace would be, for example, in the grafana scope, a whole area dedicated to dashboard events or to data source events. For plugins, it's open-ended: any plugin can control what happens under its namespace. For a data source, the namespace would be the data source ID. The path, again, goes further down.
A: For a data source or for a dashboard, the namespace might be the dashboard ID, and then events can be based on that. In addition to this, there's the org ID: this is prepared for a multi-tenant setup, so there's also an org ID that's transparently prepended in front of there.
A: On the Go side, within our API, you'll see that we have a channel, and everything that we write to gets prefixed with scope, namespace, and path. Our gRPC interface is a new API added in 8, and we'll walk through it a bit more, but there's a callback for when someone tries to subscribe to a stream.
A: So you can respond and say what should happen, whether they should be allowed in or not, and it also lets you send the first response back. Then there's another call, RunStream. So SubscribeStream is where you get a callback and get to say whether someone should have access or not; RunStream is another call that lets you supply the data.
A: From the data source side, this is actually kind of nice. All we do now, from a normal data query, is return the channel address that you want to point to: literally, in your query response, you set the frame metadata to a channel, saying which channel it should connect to. Essentially, the initial request comes through as a query that returns all the data, and then each update gets delivered by that channel.
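A rough sketch of that idea: the query response carries a frame whose metadata names the channel, and the front end subscribes to that channel for updates. The struct shapes below only imitate the SDK's frame/metadata types, and the `ds/<uid>/<path>` channel layout is an assumption:

```go
package main

import "fmt"

// FrameMeta is a simplified stand-in for the SDK's frame metadata.
type FrameMeta struct {
	Channel string // which live channel the front end should subscribe to
}

// Frame is a simplified stand-in for a data frame in a query response.
type Frame struct {
	Name   string
	Meta   *FrameMeta
	Values [][]float64 // initial data returned by the query
}

// queryResponse builds the initial response: full data plus the channel
// address, so subsequent updates arrive over the stream instead of polling.
func queryResponse(dsUID, path string, initial []float64) Frame {
	return Frame{
		Name:   path,
		Meta:   &FrameMeta{Channel: "ds/" + dsUID + "/" + path},
		Values: [][]float64{initial},
	}
}

func main() {
	f := queryResponse("my-ds", "sensor/temperature", []float64{20.5, 20.7})
	fmt.Println(f.Meta.Channel) // the channel the front end will connect to
}
```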
A: So everything that channel pushes out at its interval is then appended to the initial query response; that's the part that's pretty nice about this. You'll see that each message comes along here: if you inspect the WebSocket communication, there's a long list of values spitting out there.
A: What's nice about this is the front-end work: this is the whole front-end implementation, and it all kind of comes out of the box.
A: If your backend tells it which channel to connect to, the front end will manage connecting to it and appending the data to the stream, kind of automatically. Let me simplify that. For today, in terms of working data sources, there's an initial MQTT implementation that it would be great to check out and get some feedback on.
A: It's got pretty basic support for a topic structure that expects a time and a value; a topic with a time and a value, we can read that, and it's pretty early support. This other one is the signal generator that we've been using, essentially to help stress-test our platform, and again that's one you can check out for examples of how to work with this.
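The transcript only says the topic is expected to carry a time and a value, so the exact wire format is an assumption here. A sketch parsing a hypothetical `<unix-milliseconds>,<value>` payload into a sample:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
	"time"
)

// Point is one sample from a hypothetical MQTT payload of the form
// "<unix-milliseconds>,<value>". The format is assumed for illustration;
// the actual MQTT data source may expect something else.
type Point struct {
	Time  time.Time
	Value float64
}

func parsePayload(p string) (Point, error) {
	parts := strings.SplitN(p, ",", 2)
	if len(parts) != 2 {
		return Point{}, fmt.Errorf("payload %q must be time,value", p)
	}
	ms, err := strconv.ParseInt(strings.TrimSpace(parts[0]), 10, 64)
	if err != nil {
		return Point{}, err
	}
	v, err := strconv.ParseFloat(strings.TrimSpace(parts[1]), 64)
	if err != nil {
		return Point{}, err
	}
	return Point{Time: time.UnixMilli(ms), Value: v}, nil
}

func main() {
	pt, _ := parsePayload("1621500000000,42.5")
	fmt.Println(pt.Time.UTC(), pt.Value)
}
```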
A: Then, as we're preparing for the v8 release, we're working on some tutorials that will walk you through this. Let's see. Now, to get to some more exciting stuff, I'll just switch over to a live demo. What I showed you was kind of how you would work it from a data source level, but that may be a little overwhelming if you aren't building data sources. So we also have, oops, where's...
A: All right, I'll add a panel and switch over to the Grafana data source with live measurements, and here I can pick which channel I see. The list of channels that populates is kind of what it's seen over time, so you'll see. Let me pick this system one, go to the last 30 seconds, and maybe let's do a table view here. All right, so you see I've got one point here; its value is three.
A: Three. So each update gets pushed and appended to that. That's kind of nice because it essentially requires no updates to your core: you can essentially pipe your data to the Grafana server and it will broadcast it to everyone listening on that channel.
A: This leads us to, let's see, the next part: if you're using Telegraf, you can essentially broadcast your Telegraf output to that channel. I'm going to pass things over to Alex now, who is going to show how you can connect a Telegraf instance straight to Grafana and stream live data to it.
B: Yeah, cool. So what we will do now is run Grafana and run a Telegraf which will stream some real-time data to a Grafana time series panel, and in addition to this, we will show how we can send metrics from a local Grafana to hosted Grafana.
B: So, let's run Grafana. This is not yet available in v8; we have just started working on it. I mean, the push to hosted Grafana doesn't work in v8, but all the other stuff will be available with the v8 release. So, Grafana is running.
B: The WebSocket output plugin is not yet merged into Telegraf, but we hope it will be very soon. So here we are using a custom branch of Telegraf with the WebSocket output plugin, and with an admin token we will send data in Influx format to Grafana.
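A Telegraf output section along these lines would do what's described. The plugin was on a custom branch at the time, so the section name, the Grafana Live endpoint, and the header shape here are all assumptions for illustration:

```toml
# Hypothetical Telegraf config for the (then-unmerged) WebSocket output.
[[outputs.websocket]]
  ## Assumed Grafana Live push endpoint; "telegraf" is an arbitrary stream id.
  url = "ws://localhost:3000/api/live/push/telegraf"
  ## Payload sent in Influx line-protocol format, as mentioned above.
  data_format = "influx"
  ## The demo authenticated with an admin API token.
  [outputs.websocket.headers]
    Authorization = "Bearer <admin-api-token>"
```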
B: Okay, so this is CPU information, and here we filtered it by the usage_user measurement. What's interesting is that we also output this to hosted Grafana at a lower resolution.
B: As I said, output to hosted Grafana is not yet ready for v8, but we will work on improving it and introduce it at some point in future releases. To demonstrate how it actually reacts to changes in CPU, let's put some load on the system. Here is a small program that does some math: for 10 seconds it simply increments counters and loads six of the 12 cores on this MacBook.
C: I have a question. So the difference between using the WebSocket and the HTTP output plugins: you're saying that it can be at a higher resolution, or...
B: Yeah, actually you can use both: the ready-to-use HTTP plugin, which is already part of Telegraf, or the WebSocket plugin, which we hope will be part of Telegraf very soon. The only difference is that the WebSocket output plugin should put a lot less load on the Grafana instance, because the HTTP output plugin sends every request through all the Grafana middlewares, of which we have quite a lot, while WebSocket just connects to Grafana once and streams data over a TCP connection.
C: Okay, sounds nice. I'll have a look at your pull request; I'm one of the maintainers of Telegraf, so I will look to push some things forward.
B: Yes, that would be very nice. We are waiting on this, so thank you.
D: Is it all available in beta2? What's new there?
B: Yeah, I think the most noticeable change between beta1 and beta2 is that we are now validating channel symbols, allowing only specific characters: ASCII alphanumerics and several additional symbols. It's still not documented yet, so we are going to document this before June, I suppose.
A: Yeah, and if you run into issues, let us know and we'll see what we can do to make it easier. The other thing that was added, if you are writing plugins: the instance manager has always been, I'd say, overly complicated. To have a backend plugin that knows when the instance settings change, we had to handle that manually.
A: The new backend example uses a different flavor of that, which takes care of it all for you, so you just never think about it: it manages closing itself and creating a new instance as part of its normal workflow.
A: So the streaming part of it just won't work in 7: if you stick a channel on your query response, 7 won't do anything with it, while 8 will. There are various tricks in your plugin to, say, look at the Grafana version and add new elements to the UI, but it's sort of up to the plugin developers to figure out how they want to manage 7 and 8 independently, or which features show up in which versions.
A: Yeah, I wish there was a better answer, but you can't go back in time with these APIs, which is why we've been pretty cautious about pushing out this streaming support, though it's been there in a certain form for a long time. I think it's in a state now where we're pretty confident that at least this initial round of APIs can stick around and be stable. I mean, you will notice, I'd say, that the backend APIs have been pretty stable over the years.
A: We have put a lot of effort into it. Your mileage is going to vary, but we've put a lot of effort into, in particular, the switch from the graph-based panel. Leon's on here. Internally, the CPU use of the old Flot-based graph panel just could not handle getting updated, so we've essentially replaced that with a uPlot-based panel and done as much optimization as we could around throughput on that. Leon?
E: Well, I switched over to my phone, so maybe you can hear me now; I don't know what's up with my mic. But yeah, the performance gain is not just from switching the visualizations, although a big part of it is that. We also optimized to skip some of the middleware layers.
E: On the front end, there's a whole stack of things it does: potentially a lot of transforms, and then it uses React to re-render and pass everything through. There's a lot of that we now bypass, so a lot of stuff has been removed for the streaming case specifically.
E: Before, I don't know, Ryan, you were saying: what was the max that squad was able to sustain previously? Something like maybe two updates per second, and we're now able to do 20 updates per second. So for a lot of cases we're much faster, but you'll have to try to push the same stuff through version 8 to really find out for your use case. It should be much better.
A: Yeah, I'll say we've done what we can so far. I think there's still a little room, but we've improved: previously it would work, but you'd set your laptop on fire, right? And with all of these features, as well as they work, you can just ask them to do more stuff. So it's this trade-off of how do we...
A: How do we increase that limit but not let you hurt things too much? For streaming updates, we've currently got a process where, if the refresh rate isn't keeping up, we just stop rendering. So it can get choppy, but we try not to freeze the system, which is what was happening before.
D: Right, thank you, because when we did the streaming in the Redis plugins, we capped it at only one query per panel; I didn't want to support more. But people started to like it, and now I have a couple of feature requests from people who want more than one query in the panel, and especially when you have six panels it became a nightmare.
E: So one thing that's still true right now, even in beta 2, which just got tagged today: if you have the legend enabled, it may be significantly less performant. Hopefully by beta 3, or definitely the final release, we'll put in more optimizations so that a static legend in a streaming scenario does not impact performance. But right now, if you disable the legend, you'll probably be able to squeeze a few more frames out of it.
D: Okay, thank you. Still, there is a problem when you have to support dashboards for version 7 and version 8, because if you switch to the time series panel, which is not available in previous versions, it's harder to manage, especially with enterprise users: you cannot just say there is a new time series panel, you have to switch to 8.
A: You know, we also have to prepare for the future some, too. I'm super sympathetic to how long it takes to build those things, but we also need to fix it at some point; the Angular Flot-based panel is not going to last.
A: What is nice in many ways is that what we're doing with 8 is actually using the infrastructure we put in place for 7: the whole field config system, where you have a data frame and each frame has configuration attached to it that applies regardless of which visualization you use. That was always kind of a weird thing, because the graph panel doesn't use it, so the thresholds you set on the graph panel are actually different than they are in every other panel.
F: I've got a question about WebSocket. I'm not an expert in WebSockets, but from what I understand it can sometimes be difficult; there are some challenges, specifically with horizontal scaling. I'm just wondering if you've already worked through some of that, and what your thoughts are about it.
F: Just things like an upper limit on the number of connections per instance, or, I don't know; I'm not really up on a lot of that stuff, just some things that I've read.
B: Yeah, so regarding WebSocket scalability: WebSocket should be supported on mostly every system and across all browsers these days.
B: It's about 30 to 50 kilobytes per connection. And we need to say that regarding high-availability setups, where you have several Grafana instances behind a load balancer, we still need to work on this after the v8 release. Currently we concentrated mostly on introducing streaming for single-node Grafana, but we know that some people run several Grafana instances behind nginx, for example, so we will come up with a high-availability solution later; most probably it will be based on Redis.
A: And when he says it's not supported for high availability, the big caveat is that the plugin support will work. The one issue is that each node manages its own stream, so a request comes in on a channel and each node would have to fulfill it.
B: No, no; the initial data can be loaded from the plugin, for example from a data source, as initial data using a query.
A: But for plugins again: when someone connects to a channel, each plugin is able to send the first response, so plugins can manage their own cache. For example, in the Telegraf flavor of it, we do hold on to the last point, saved in memory, so when you connect you will get the last point that was sent there. But we don't hold on to the whole history, just the last point that was sent. But each plugin... oh, go on.
A: No, the alerting is based on the query response. It doesn't sit there and listen to the stream; alerting is based on queries.
D: The streaming which is available in version 7 has this advantage: you can actually do streaming and alerting at the same time. That's what we actually did recently, because if your data source returns any data using the user's time range, and it can be streamed on the front end, you can still do alerting.
A: But you're essentially implementing that with a poll loop, right? (Yep.) Right, so the advantage, I guess, of the approach in this streaming is literally that the back end is sending one request and broadcasting it to everyone, versus previously, where you could have stuff that looks like streaming but is really some version of polling.
D: Okay, because I think it's very, very useful: you don't watch the streaming data all the time, but if the alert happens you get a notification, you go there, and you see exactly what happened. And do they support all the panels? You demonstrated on the table and the time series; can we stream logs the same way? (Yeah, yeah.)
A: A major caveat with streaming data is that if you don't set an explicit minimum or maximum, we have so many pieces of the system that pick colors based on the range, and that has to get recalculated with each update.
A: So yeah, give it a try and let us know; there's still a little more time before we 100% freeze it as an initial release.
A: We had versions of this, and I think there's a good chance we can support this on the front end, but mainly that was based on having a cache that survives navigation and the like. Obviously, for a refresh you're disconnecting and reconnecting to the same stream, but if we're already connected, shouldn't we just stay connected? We've looked at some early implementations; that's what we did, but it got a little unpredictable in its behavior.
A: I'd say everything that we talked about where the Grafana server processes the data and does something with it. What he showed was: we receive it and send it to the dashboards; that's in 8. But in 8.1 we'll look at taking that and sending it to Grafana Cloud, perhaps sending it to the alerting system, or having an additional channel that runs it through transformations or through thresholds, to produce an additional channel based on the status, stuff like that.
D: So how does the authentication work? I saw the bearer token, but who is it generated for?
A: In the HTTP push flavor of it, in this version we just said: send an admin token and you can write to these streams. I suspect we're going to need something a little less extreme than giving an admin token to Telegraf, but as a first step that seemed appropriate. For plugins, you get a callback when someone tries to subscribe, so at that stage the plugin gets to decide whether to allow it or not; plugins can manage their own authentication at that level.
A: As part of our effort to increase streaming performance: on the back end we've sort of standardized on Arrow, as a common industry-standard format, so we should be able to support stuff that's in there. In 7, sort of as a shortcut, we ended up...
A: Well, we were trying to send the Arrow frames to the front end, and somewhere in there they got base64-encoded and sent to the front end, which is this sort of massive, very ugly, super non-performant structure. But given that about half of the data sources used it, it worked fine enough. In our effort to increase streaming performance, we explored what our best approach here would be: could we do binary Arrow?
A: Because Arrow is pretty good. It is a streaming format, but it's really made to stream megabytes or gigabytes of data between systems, not a couple of lines to your browser; that's not its sweet spot. So after going through a bunch of things, we settled on what is most efficient from JavaScript, which is JSON, and came up with a sort of JSON representation of this Arrow frame. That should be transparent to your users.
D: Can we just send the data frames as JSON? I would like that, because sometimes data frames are not easy to parse: you have to go through the two arrays, and there are multiple iterations of how you should parse this data if you're working on custom plugins or custom panels on the front end. So if you receive just JSON data, like from a normal REST API, it may be easier to deal with. It will still be...
A: So again, if you just use the standard response handler, it manages the parsing for you, so you can inspect the payloads and you'll see that the format now separates out the header, that is, the schema, versus the data.
A: But yeah, that change: we did a huge performance push a month back, and one of the things I'll say I'm super excited about for 8 is that you won't see the loading spinner anymore. You may see it, but it's not part of the normal loading process; so, no more bouncing logo.
A: I suspect it's going to be around for a long time, but my suspicion is that it will continue to work, though it may not be an option to create new panels with it, just because there's so much stuff that exists using it. So we'll see what happens with it, similar to the table panel: the old table panel still works, because people need it and we can't break everything, but you have to work really hard to create a new panel that uses it.
D: Okay, do you have examples of how to use the new components behind the time series panel, the new graph component that is used by this panel, and how to use it properly?
A: My answer for that is: don't. In 8, rather than trying to use those components directly, because they require so much prep and they're pretty finicky about how they get used, we have a wrapper for embedding a panel. I assume you're looking at an app and would like to put a graph there, so rather than trying to do all the component bits, you can just embed the panel there.
D: What if I want to create my own panel with this component? I mean, my case is that I have this data returned by the plugin, and I actually want to process it again.
A: Yeah, so there's a panel chrome where you can take essentially the time series panel, give it data and your configuration, and it will render as though it was that panel. That model seems a lot better than trying to hand out components that aren't really designed to be end-user-facing. For a panel, we do really care about what the API is: these are your JSON options, pass it this data and this config, and it should work.
D: So I can search in the documentation for panel chrome? Panel chrome, I think.
A: It's panels and their structures. If you're looking for a component library, that's when you use, you know, Ant Design or the like; there are plenty of libraries that just do that. But the panel library is what we really need to care about, which is where I'd say, if you want to embed stuff, you can embed panels within Grafana itself.
A: In version 7, yeah, use components. So again, what's nice about this with panel chrome is that you're insulated from changes to the component libraries. It allows us to treat the components a little more accurately to what they are: we want those to be as efficient as possible and well designed, but they're in a weird state of being both internal and external next to each other, and it's hard to know. Obviously our button library shouldn't change: as long as you import a button, it should keep the same API.
A: Great. Anyone else? We'll call it a day, then.