From YouTube: Simulating Quantum Systems with Qiskit Dynamics
Daniel Puzzuoli
B: Yeah, hi everybody. I just want to say thanks to the organizers for inviting me; this has been an enjoyable day so far. So I'm going to be talking about a newish package in Qiskit.
B: If people aren't familiar, Qiskit is the open-source umbrella of packages that IBM develops, and this newer package that I'm going to talk about, which has been out for about a year, is called Dynamics. I'll explain what it is, but I just wanted to say first: I'm kind of the lead developer, organizer, and maintainer of this package, but there have been a lot of other contributors, all listed here.
B: There we go, okay. So here's my outline: I'm just going to talk about what kind of simulation we're doing and why. Why do we want to build this package?
B: In particular, I'm talking about simulation of the dynamics of a quantum system, so the Schrödinger equation, Lindblad equations, things like that. Every time I say "simulate" in this talk, I mean that type of simulation, i.e. classical simulation of these equations, and not circuit simulation, just to clarify. So I'll talk about why we want to do that and what applications we have in mind.
B: I'll talk about what's out there in terms of open-source packages: if you want to do this kind of simulation, what do you want to look for in a package, and what kinds of packages are out there? Then obviously I'll talk about our package, but if you're going to do something, it's important to know what the available tools are.
B: Once I start talking about Dynamics, I'll generally describe the features, show some code examples doing some standard things, and then I have a couple of, let's say, quote-unquote "results" slides: how does it perform? And then a new feature that was added recently, some advanced computational features. So: simulate what?
B: This first sentence is kind of a tautology, but quantum devices are quantum systems. So when we want to understand what they're doing, work with them, or optimize them in some way at a physics level, we model these things using quantum theory and the differential equations of quantum theory: things like the Schrödinger equation, things like the Lindblad equation. That's the level we're talking about: simulating the physics from a Hamiltonian, quantum description of the physics. And why do we want to do that?
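As a concrete illustration of this kind of dynamics simulation (a minimal sketch with made-up parameters using SciPy, not Qiskit Dynamics itself), here is the Schrödinger equation for a driven qubit solved as an ordinary differential equation:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Pauli operators for a single qubit
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# illustrative model parameters: qubit frequency and drive amplitude
nu = 5.0
amp = 0.1

def hamiltonian(t):
    # H(t) = 2*pi*nu * Z/2 + 2*pi*amp * cos(2*pi*nu*t) * X/2
    return np.pi * nu * Z + np.pi * amp * np.cos(2 * np.pi * nu * t) * X

def schrodinger_rhs(t, y):
    # d|psi>/dt = -i H(t) |psi>
    return -1j * hamiltonian(t) @ y

psi0 = np.array([1, 0], dtype=complex)  # start in |0>
sol = solve_ivp(schrodinger_rhs, (0.0, 5.0), psi0,
                method="DOP853", rtol=1e-10, atol=1e-10)
psi_final = sol.y[:, -1]
print(np.abs(psi_final) ** 2)  # populations of |0> and |1>
```

This is exactly the kind of loop-over-a-time-dependent-Hamiltonian computation the talk is about; a real device model would have larger operators and far more structure in the drive terms.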
B: Well, there are certain workflows that people use simulation for. Simulating these things, solving these differential equations, shows you how your models play out in time, and that's how you realize your knowledge of the system, or build your knowledge of the system. So first, building device models.
B: This is one that I'm particularly interested in: you have a model, or a class of models, for your system, and you want to design a control sequence that will implement some gate, and maybe you want it to be robust within some parameter ranges or something like that. You can use simulation for that: you try to optimize some control sequence to do something you want, given your model information. And then another thing.
B: So this is a picture: there's also a package in Qiskit called Qiskit Metal for designing superconducting chips, and something we're working on with people internally is that at the design stage, you're trying to play around with things, see what you can design, and then you need to decide: are you going to build this thing or not? So you can use simulation to check that.
B: Does this design do what you want? Obviously, at the end of the day you need to build things and see what happens, but this can be very informative: even in principle, does this thing do what I want or not? And if it doesn't, then you don't have to spend all the money and time to build that device and watch it fail.
B: You need to simulate these things thousands and thousands of times, maybe sweeping the parameters a bunch of times, or doing optimization and continuously evaluating some objective function. So the faster you can simulate these things, the more complicated sorts of research you can do incorporating simulations.
B: The classic thing in quantum systems is that, as you add more and more subsystems, things grow exponentially in terms of the dimension of the differential equation. That's a big problem, and people are interested in HPC utilization for this reason. But another thing I wanted to point out is that there's the other curse of dimensionality too: even if you're looking at small systems, they might have a lot of model parameters, and you want to explore this parameter space for some reason, maybe optimizing controls or learning a model or something like that. So even for small systems, there are still benefits to continuing to improve and find faster methods and speed things up.

I make this point because a lot of people, rightly so, are interested in larger system sizes, but I think that sometimes obscures the fact that even at smaller system sizes, for these higher-level applications, you want to speed things up. If you just want to do one simulation, that's fine, but beyond that, speed still matters. Okay.
B: Obviously, the foundation of a package that does this kind of simulation is going to be the numerical tools: different differential-equation solvers, different array libraries and representations, and hardware utilization (using GPUs, HPC things). You want to have those built in to some extent. Then, at a slightly higher level, you're going to have model-building and analysis tools: being able to describe the simulation, to build your Hamiltonian in whatever way the package accepts. Can you do that conveniently or not?
B: Then, once you have built that and you do your simulation, can you analyze the results effectively or not? And at the higher level, you might be looking for things that enable certain higher-level workflows that incorporate these lower-level pieces: control optimization, model fitting, things like this.
B: The technical capabilities we need are speed and flexibility, so being able to specify things in a really generic way, because these systems have a lot of variation. And, this being a computation-focused audience, this is the point I was mentioning: you really want things to be compilable and automatically differentiable, so you can do these more complicated things more quickly, and in particular, with automatic differentiation, do optimization applications.
B: There are a lot of packages out there; I'm not going to talk about every single one, but I think the most famous one is QuTiP, which has been around for a really long time. It's a Python package, and it does much more than just simulate the dynamics of quantum systems, which is what I'm talking about now; it's a really feature-complete package, though maybe a bit older.
B: I think they're working on TensorFlow integration, but it doesn't have that right now. They do have some things for doing control optimization, and some more specialized, automatically differentiable versions of it. So yeah, there are Python packages; Torch Quantum was one that I learned about recently, and they're integrated with PyTorch, so you can differentiate everything, which is really nice.
B: Julia packages are obviously nice. We don't work with Julia, because Qiskit is just in Python; we default to Python at IBM, at least as the main interface language. But Julia is really nice because it has a really powerful ODE library, and, though I don't know all the details, there's very general automatic differentiation in the Julia language itself.
B: So there are things out there if you're doing this kind of simulation; I recommend looking around. But now I'll specifically tell you about ours.
B: Qiskit Dynamics is a new package, as I've said, in Qiskit, in Python, for Hamiltonian and Lindblad simulation. Central applications of interest are control optimization and this virtual-prototyping idea that I mentioned. The main features, and the main thing that we're really going for, I've already mentioned: being able to compile and automatically differentiate. But the first point here is really one of the main design elements of the package: you want to be able to configure everything.
B: There are lots of different elements to that. When you solve these differential equations, every problem is different; there's no best ODE solver or best array representation for all possible problems. So you want to be able, for any given problem, to just say: I'm going to use this solver, this array representation, and these transformations on the model, and find what the best configuration is for that application, before doing something that requires thousands of evaluations of these things.
B: The package is relatively new, so in terms of integration with the rest of Qiskit, we're still working on that, but that's the goal; we'll be integrating more with it. In terms of the features in the package itself: the current version is 0.3.0, and I would say the releases up to this point have really been about solidifying the numerical foundation.
B: In terms of solvers, there are the classic ODE solvers, but we also have geometric solvers available for this specialized class of differential equations, and there are other solvers that you can use, like matrix-exponential-based solvers. In terms of array types, obviously dense and sparse, and you can choose between NumPy and JAX as the main array representation.
B: All of our compilation, automatic differentiation, and whatnot is due to JAX compatibility, which I mentioned here. Something that we added in the last release is a module for doing time-dependent perturbation theory numerically, so you can compute very general expansions numerically, and there are certain specialized solvers built out of that, which I'll mention on a later slide.
B: On top of that, you can build models in standard decompositions; I think most packages will have some version of this. So, a generic Hamiltonian decomposition, for example, with a standard representation of mixed signals in terms of arbitrary functions for the envelopes and some carrier frequency. One thing which I think is unique to this package, at least currently, or as far as I'm aware, is a certain model transformation.
B: You specify a rotating frame, and it will do the computation in that rotating frame, doing things behind the scenes to make that efficient. There's something nice here for people familiar with these things: you might be familiar with the idea that if I do a rotating-wave approximation, so I enter a rotating frame and do an approximation, then the differential equation solver will run a lot more nicely.
B: You remove certain high-frequency components and stuff like this. And we've actually found in practice that just entering the rotating frame, which is a mathematical transformation that doesn't change the solution in any way (it transforms it, but it's a reversible transformation, unlike the RWA), gives you most of the speed benefits of doing the full RWA, in our experience so far. I don't have a nice plot for that right now, but this is kind of a nice thing.
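The reversibility point can be checked numerically. Here is a minimal sketch (illustrative parameters, plain SciPy rather than Qiskit Dynamics): solving in the rotating frame of the static Hamiltonian and mapping back reproduces the lab-frame solution.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

nu = 5.0
H0 = np.pi * nu * Z                                   # static Hamiltonian

def H1(t):                                            # drive term
    return np.pi * 0.1 * np.cos(2 * np.pi * nu * t) * X

def rhs_lab(t, y):
    return -1j * (H0 + H1(t)) @ y

def rhs_rot(t, y):
    # rotating-frame generator: U(t) H1(t) U(t)^dag with U(t) = exp(i H0 t);
    # the static part is removed by the frame transformation
    U = expm(1j * H0 * t)
    return -1j * (U @ H1(t) @ U.conj().T) @ y

psi0 = np.array([1, 0], dtype=complex)
T = 3.0
lab = solve_ivp(rhs_lab, (0, T), psi0, method="DOP853",
                rtol=1e-10, atol=1e-10).y[:, -1]
rot = solve_ivp(rhs_rot, (0, T), psi0, method="DOP853",
                rtol=1e-10, atol=1e-10).y[:, -1]
back = expm(-1j * H0 * T) @ rot   # map the rotating-frame solution back
print(np.max(np.abs(lab - back)))  # should be ~0: the transformation is reversible
```

The rotating-frame right-hand side has no fast Z rotation in it, so the solver can take much larger steps, which is the speed benefit being described.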
B: It's something that I think makes the package quite fast. As I mentioned, we're working on Qiskit integration, so when you build the operators for these models, you can use Qiskit and all the tools it has for analyzing things. A major feature that is coming in the next release is pulse simulation. If people are not familiar with that, Pulse is a language in Qiskit for describing time-dependent control signals.
B: So if you want to interact with a device through IBM's backend interface and send specific control sequences to it, you submit them as, quote-unquote, "pulse schedules" within Qiskit Pulse. We're working on making it so that it's easy to interact with the simulator in the exact same way as you would a real backend through Pulse.
B: That will be coming up in the next couple of months, and then workflows, that's still to come: we'll probably build control-optimization workflows into Dynamics, or maybe somewhere else in Qiskit, but we'll definitely do stuff like that. So that's the current snapshot.
B: Maybe I'll go through this really quickly; this is just a quick picture showing how you might define and solve a problem in Dynamics. Say we have some simple Hamiltonian with a static operator here, and then some other operator H1 with some time-dependent control signal. This is how you would run a simulation, and I'm going to highlight all the different options you can select.
B: Here you can choose between NumPy and JAX as the backend, so I'm telling it here to use JAX. You construct this Solver instance with the model information, and here I tell it to internally represent things with dense arrays and to solve in this rotating frame. I can put any operator here, but here I'm putting the static Hamiltonian. Then I build my time dependence: S1 is going to be the signal object, and I can specify the envelope as an arbitrary Python callable function. In this case, because things are being done with JAX,
B: you would want it to be a JAX function. You can specify the carrier frequency, and then you call solve with standard ODE-solver interface elements: the integration time span and the initial state. Here you pass in the time dependence, but then you can choose whatever solver you want to use; there are lots of different solvers that we integrate with, so you can make your choice there, and again, depending on the problem, we see very different performance from different solvers. So all of those configurable elements are shown here.
B: This slide is about the pulse integration. Again, if you're not familiar with this: you can use Qiskit Pulse, in this panel here, to describe a control sequence. This is two back-to-back DRAG pulses with a virtual Z, a phase shift, in the middle.
B: So you can define that, and you could send this to a real backend, but right now we do have the ability to simulate these in Dynamics. It's not the exact same interface as a real backend yet, but you can configure the solver to accept these things and simulate them.
B: There's interest in that for people who want to do control research and stuff like this. So instead of passing that signal object as before, you'll pass in this schedule and it'll solve. Okay. So, in terms of speed: we usually compare to QuTiP, because basically every time we show somebody this package, they ask how it compares to QuTiP as the sort of default. I'm not going to go into the details of the simulation, but it was a three-transmon model. I forget how the dimensions broke down, but it is a 160-dimensional simulation, and what's being shown here is a 2D scan of control parameters; what they're looking at, for this particular control sequence that was being investigated, is how much entanglement there is between two of the qubits.
B: If you think about this as a computational problem: you have a parameter-scan simulation problem, and I think there are a thousand pixels here, so you have to go pixel by pixel; obviously this is a very parallelizable operation. This little plot here on the side shows QuTiP: using the same accuracy parameters, or tolerances, it took about 54 seconds per simulation with QuTiP on CPU, whereas with Dynamics it took about 7.3 seconds. That's a nice speed-up.
B: I forget what CPU we were using; this was either on a V100 or an A100 GPU. But the nice thing is that, even for these small dimensions, we found that if we use JAX's vectorization transformations, you can vectorize the function that evaluates each pixel and then call it on an array that contains all the parameter values at once, and it'll run it all in parallel. If you do that, the whole scan takes about 140 seconds, so about two minutes.
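The vectorization pattern being described, wrapping the per-pixel function with `jax.vmap` and calling it on the whole parameter array at once, can be sketched on a toy model (the "simulation" here is just a matrix exponential standing in for the real workload):

```python
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)

X = jnp.array([[0.0, 1.0], [1.0, 0.0]], dtype=jnp.complex128)
Z = jnp.array([[1.0, 0.0], [0.0, -1.0]], dtype=jnp.complex128)

def final_population(amp):
    # toy "simulation": evolve |0> under H = Z + amp * X for unit time
    H = Z + amp * X
    U = jax.scipy.linalg.expm(-1j * H)
    psi = U @ jnp.array([1.0, 0.0], dtype=jnp.complex128)
    return jnp.abs(psi[1]) ** 2   # excited-state population

amps = jnp.linspace(0.0, 2.0, 100)           # the "pixels" of a 1D scan
scan = jax.jit(jax.vmap(final_population))   # vectorize, then compile
populations = scan(amps)
print(populations.shape)  # (100,)
```

On a GPU the vectorized call evaluates all parameter values as batched array operations rather than a Python loop, which is where the parallel speed-up comes from.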
B: This was something that was done with some internal researchers, and it was way, way faster than what they were doing before with QuTiP, which was nice. One other thing I'll mention quickly before concluding is that we've added a perturbation theory module in the last release. I don't have time to get into details, but perturbation-theory expansions are usually used in control-optimization applications to quantify robustness to various parameter variations, and to do this numerically, you obviously need code that will compute these perturbation terms numerically, in a very general way, to facilitate very general control parameterizations and whatnot.
B: We added this in the last release, and coincidentally, the paper with these algorithms was posted to the arXiv today, so that's good timing. One thing which I can easily explain is that we've built in some perturbative solvers, so solvers based on the Dyson series and the Magnus expansion. They're basically series-solution methods for differential equations, and I'll quickly say, before explaining this plot, that what we built was a variation of an algorithm from a paper.
B: It previously introduced this general idea; it's a really nice paper, and what we did is based on that.
B: The idea here is that you create these solvers, which are fixed-step solvers, by doing series expansions and then taking small time steps. What they observed in this paper, which was the key observation, was that the expansions at each time step can be translated to each other, so you only need to do the expansion computation one time, and then you can solve by doing very few array operations per step. This plot is just showing some configurations.
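To give the flavor of a fixed-step series solver, here is a toy first-order Magnus-style stepper (a midpoint-exponential rule on a made-up Hamiltonian; this is not the precomputed-expansion algorithm from the paper, just the simplest member of the family):

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def H(t):
    # illustrative time-dependent Hamiltonian
    return Z + 0.5 * np.cos(2 * np.pi * t) * X

def magnus1_solve(psi0, T, n_steps):
    # first-order Magnus fixed-step solver: one matrix exponential per step,
    # with the generator sampled at the midpoint of each step
    dt = T / n_steps
    psi = psi0
    for k in range(n_steps):
        t_mid = (k + 0.5) * dt
        psi = expm(-1j * H(t_mid) * dt) @ psi
    return psi

psi0 = np.array([1, 0], dtype=complex)
coarse = magnus1_solve(psi0, 1.0, 100)
fine = magnus1_solve(psi0, 1.0, 10000)
print(np.max(np.abs(coarse - fine)))  # error shrinks as the step size shrinks
```

Higher-order Dyson or Magnus terms buy accuracy at larger step sizes; the trick described in the talk is that the per-step expansion only has to be computed once, so each step then costs just a few array operations.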
B: This was the time that it took to run a hundred pulse parameters for a two-qubit gate, and again, this is a small dimension, 25.
B: This plot hides the details: we considered vectorizing over lots of different inputs, found which configuration was the fastest, and plotted only the fastest. Sorry, so what we're plotting on the y-axis is the average distance to a benchmark solution, so it's a measure of accuracy, and the x-axis is the total runtime. You see the traditional ODE solver is this line, and all of these perturbative solver instances are kind of living below it.
B: This is quite technical, I think; check out the paper if you're interested. I think more investigation needs to be done with these things, to understand how to use them profitably and whatnot, but it's still kind of cool and exciting, so check that out. Okay, I'll close now; hopefully I'm not over time. So, just to recap: I was talking about Qiskit Dynamics.
B: Its current version, 0.3.0, establishes sort of the core numerical foundation. It builds in things that allow you to do automatic model transformations, and to compile and automatically differentiate. In the future, we will have full Qiskit Pulse integration, so you can interact with the simulator in the same way you would with a real backend, and that'll be really nice for doing certain research things: it's nice to be able to just quickly swap them in and out and interact with them in the exact same way.
B: It's an open-source project, so if you're interested, if you use it, please submit issues, ask for help, and/or contribute. You can check it out; the GitHub is here, and there's lots of documentation: tutorials, a user guide, things like this. And if you're interested in chatting, there's also a Slack channel in the Qiskit Slack workspace where you can reach out, and I will respond personally. But okay, that's it.
A: It's very interesting. We do have time for maybe one quick question from the audience.
A: Can I ask you a quick question? I was wondering: if you use the NumPy backend, do you need to specify how you compute the derivatives in that case? I see that with JAX you always use their autodiff backend, but with NumPy, what are you using there?
B: With NumPy, we don't do that; if you want to compute derivatives, you'll have to just use JAX.
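For reference, here is what getting a derivative through JAX looks like on a toy control objective (a closed-form single-qubit propagator, illustrative only, not Qiskit Dynamics code):

```python
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)

X = jnp.array([[0.0, 1.0], [1.0, 0.0]], dtype=jnp.complex128)

def excited_population(amp):
    # closed-form propagator exp(-i * amp * X) = cos(amp) I - i sin(amp) X,
    # so the excited-state population is sin(amp)^2
    U = jnp.cos(amp) * jnp.eye(2, dtype=jnp.complex128) \
        - 1j * jnp.sin(amp) * X
    psi = U @ jnp.array([1.0, 0.0], dtype=jnp.complex128)
    return jnp.abs(psi[1]) ** 2

grad_fn = jax.grad(excited_population)  # d(population)/d(amp) = sin(2*amp)
print(excited_population(0.3), grad_fn(0.3))
```

In an optimization loop, a gradient like this is what lets you update control parameters directly instead of estimating derivatives by finite differences.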
B: Yes, yes, exactly. On the NumPy inclusion: whenever we want to do highly performant things, we just always use JAX; NumPy was kind of included just to make it so that a new user can get into things easily, I guess.
B: But we have found that, once you start going to some larger system sizes and you're using, say, the SciPy sparse representation internally, things actually do become comparable: the pure simulation time of compiled JAX versus the overhead of not being compiled with SciPy end up becoming somewhat comparable, once the array operations start fully dominating everything. But yeah, for derivatives and whatnot, it has to be JAX.