From YouTube: SimPEG meeting Jan 30
Description
- Plugging in empymod to SimPEG
- creating a Simulation class to replace the Problem + Survey
- parallelizing the code
- contributions to processing
B: And so he's got a model he wanted to invert, using empymod. So what we've done first is write the Problem and Survey classes that plug into SimPEG, and then just use finite-difference sensitivities. Within SimPEG we generally do analytic sensitivities, since that's faster, but for a 1D problem like this empymod is quite fast, so it doesn't really matter.
B: So right now, up to here, that's all plain empymod Python, and here is what the SimPEG wrapper looks like. We've got an empymod problem which inherits from the base Problem. For the first pass we're starting with resistivity being the thing that we'll invert for; we can hook up other parameters and make it all nice later, but for now we just wanted something that works, so it started with that.
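The wrapper idea described here can be sketched as a small class, with a toy kernel standing in for the actual empymod call; the names (`EmpymodProblem`, `fit_model`, `forward_1d`) are illustrative assumptions, not the real SimPEG or empymod API:

```python
import numpy as np

class EmpymodProblem:
    """Minimal sketch of a SimPEG-style wrapper around a 1D forward modeller.

    The real empymod call is replaced by a smooth toy kernel so the sketch
    is self-contained; the class and method names are hypothetical.
    """

    def __init__(self, depths, fixed_kwargs=None):
        self.depths = np.asarray(depths, dtype=float)
        self.fixed_kwargs = fixed_kwargs or {}
        self.resistivity = None

    def fit_model(self, model):
        # First pass: the model *is* the layer resistivities.
        self.resistivity = np.asarray(model, dtype=float)

    def forward_1d(self):
        # Stand-in for the wrapped empymod forward call.
        rho = self.resistivity
        return np.log(rho) + 0.1 * rho

    def fields(self, model):
        # Set the model, then hand off to the wrapped forward modeller.
        self.fit_model(model)
        return self.forward_1d()

problem = EmpymodProblem(depths=[0.0, 10.0, 30.0])
data = problem.fields([100.0, 10.0, 1000.0])
```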
B: So when we ask for the fields, we just first set the model and then call the existing empymod forward. We can expand this out if you actually want to have other variable parameters, because here it's taking all of these arguments, so we could let you change those if you want to. And then to get the sensitivity, this is using a centered difference, just doing a finite difference: we take the model, step on either side, and stack up the sensitivity columns. So it's pretty straightforward.
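The centered-difference sensitivity described above amounts to the following generic sketch (a stand-alone illustration, not the wrapper's actual code):

```python
import numpy as np

def centered_difference_jacobian(f, m, step):
    """Stack centered-difference sensitivity columns.

    f    : forward map R^n -> R^k
    m    : model vector
    step : perturbation per parameter (scalar or vector)
    """
    m = np.asarray(m, dtype=float)
    h = np.broadcast_to(np.asarray(step, dtype=float), m.shape)
    cols = []
    for i in range(m.size):
        dm = np.zeros_like(m)
        dm[i] = h[i]
        # Step up and down on either side, difference, and normalise.
        cols.append((f(m + dm) - f(m - dm)) / (2.0 * h[i]))
    return np.column_stack(cols)

J = centered_difference_jacobian(
    lambda m: np.array([m[0] ** 2, m[0] * m[1]]),
    np.array([3.0, 2.0]),
    1e-6,
)
```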
B: Yeah, plus a few more sanity checks and things like that: defining the perturbation that we're going to take, and a minimum floor for it. For this I figured we probably want to use some percentage of the model value as a default, but then also have a floor for when a model value is zero, so that we make sure we don't take a zero step there.
B: Yeah, I mean it started from one of the empymod examples, and at first I didn't have the derivative test hooked up, okay, yeah. So in that case the test just makes sure that, basically, if you're including the sensitivity in the Taylor approximation of the function, it should converge at second order. That's how we test the sensitivities.
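The second-order convergence check mentioned here is the standard derivative ("order") test; a minimal sketch, not SimPEG's exact implementation:

```python
import numpy as np

def taylor_orders(f, grad, m, v, hs):
    """Return |f(m + h v) - f(m) - h grad(m)·v| for each step size h.

    If grad is correct, this remainder shrinks at second order: it drops
    by ~4x each time h is halved.
    """
    f0, g0 = f(m), grad(m)
    return np.array([abs(f(m + h * v) - f0 - h * g0.dot(v)) for h in hs])

f = lambda m: float(np.sum(m ** 3))
grad = lambda m: 3.0 * m ** 2
hs = np.array([1e-1, 5e-2, 2.5e-2])
errs = taylor_orders(f, grad, np.array([1.0, 2.0]), np.array([1.0, -1.0]), hs)
rates = errs[:-1] / errs[1:]   # ~4 indicates second-order convergence
```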
B: A forward or backward difference is going to be cheaper, because you only have to calculate one extra forward model, so those are all options, and then there's a default choice, which right now I've just set to the centered difference. All of this would then spit out the sensitivity. I haven't tested this at all yet, but this is the sketch of one approach.
B: It needs to go through the mapping, and the model is always the thing we invert for; the model is sort of ours. That's the difference from the case where you didn't want to invert and just wanted to run, say, the frequency-domain EM forward. We have that hooked up, so you can invert for sigma, or mu, or both; if on the outside you just set mu equal to a fixed value, it's not going to invert for that. It only inverts for things where you provided a map.
B: So what I think we should do is combine those as a Simulation, so that the mesh, the survey, the solver options, all of that go onto the simulation, and each of those pieces can be modular. What I also like about this is that we can play with the order of operations. For example, think of the utils that were built for DC, like building a mesh based on where my electrodes are.
B: If we have everything in the survey class — sorry, in the simulation class — then strapping on a utility that just says "build a mesh from my survey", where I set the survey first, makes sense, and that's easy to do. Same thing generally: I think it gives you more flexibility in what you're defining and when. Like when I'm building up meshes for the casing stuff, for frequency or time domain, you first set out the survey parameters and then decide how far to pad, based on my latest time or my lowest frequency.
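Deciding how far to pad from the lowest frequency (or latest time) can be sketched with the usual skin-depth and diffusion-distance estimates; the three-skin-depth factor is a common rule of thumb, not a SimPEG constant:

```python
import numpy as np

def padding_distance(rho, lowest_freq, n_skin_depths=3.0):
    """Padding for a frequency-domain mesh, from the lowest frequency.

    Skin depth ~ 503 * sqrt(rho / f) metres; padding out a few skin
    depths is a common modelling rule of thumb.
    """
    skin_depth = 503.0 * np.sqrt(rho / lowest_freq)
    return n_skin_depths * skin_depth

def padding_distance_time(rho, latest_time, n_dist=3.0):
    """Time-domain analogue using the diffusion distance ~ 1261*sqrt(rho*t)."""
    return n_dist * 1261.0 * np.sqrt(rho * latest_time)

pad_fd = padding_distance(rho=100.0, lowest_freq=1.0)
```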
B: David, okay — right now this survey also has extra stuff on top of it. I think data needs to stay very lightweight: exactly sources, receivers, observed data, and that's it. Right now the survey also computes dpred and does all of those steps, which is, like, yeah. So I think that the data class should actually just be a container for your data, and then the simulation class actually knows how to go compute fields. From the fields it should compute your predicted data, and it should not hold the observed data.
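The split proposed here — a lightweight data container plus a simulation that owns `dpred` — can be sketched as follows; the class names are illustrative, not the final SimPEG API:

```python
import numpy as np

class Data:
    """Lightweight container: sources, receivers, observed data, nothing else."""

    def __init__(self, sources, receivers, dobs):
        self.sources = list(sources)
        self.receivers = list(receivers)
        self.dobs = np.asarray(dobs, dtype=float)

class Simulation:
    """The simulation, not the data, knows how to compute predicted data."""

    def __init__(self, forward):
        self.forward = forward  # callable: model -> fields

    def dpred(self, model):
        fields = self.forward(model)
        # A real simulation would project the fields onto the receivers;
        # here the toy forward already returns receiver-sampled values.
        return fields

data = Data(sources=["src0"], receivers=["rx0", "rx1"], dobs=[1.0, 2.0])
sim = Simulation(forward=lambda m: np.asarray(m) * 2.0)
pred = sim.dpred([0.5, 1.0])
```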
E: Because the source and receiver classes are fairly heavy, right? That's where you evaluate the projection and interpolation matrices and stuff. Ah, yeah — I'm not sure what the real difference is here, because that's actually where the expensive stuff is happening in the survey class: the sources and receivers are attached to the survey class, and that's where the complication, I guess, is coming from. But yeah.
B: So I think — the lightening of the data class I think is important, but the biggest thing is removing this need to pair and unpair the problem and survey. Because for empymod, or really any other package, they just provide you the entire forward simulation, and so us trying to then unnaturally tease it apart is, yeah, strange.
B: Like when you then instantiate this: right now, to actually set up a problem you have to provide it a mesh when you instantiate it, whereas with this, what we can do is let you fiddle around with the order of operations — how you want to construct that class. So you can set your survey parameters first and then build the mesh if you need to, or, if we have utilities to create a synthetic survey on a given mesh, you can do that instead.
B: And then what's also nice about that is we can put in some steps so that, before you actually run the simulation, we validate. Here we could actually go and check that you've padded far enough, or that you've got fine enough discretization around your sources and receivers, and we can throw warnings or errors depending on how severe something is. For example, I don't think we should kill the simulation just because you're skimping a bit, or if your padding doesn't sit quite far enough out.
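The warn-versus-error validation idea can be sketched like this; the check names and thresholds are illustrative:

```python
import warnings

def validate_simulation(pad_distance, required_pad, cell_size, max_cell_size):
    """Pre-run checks: warn on soft problems, raise on fatal ones.

    Skimping on padding only degrades accuracy, so it warns; an invalid
    discretization is an error. Thresholds here are illustrative only.
    """
    if cell_size <= 0:
        raise ValueError("cell size must be positive")
    messages = []
    if pad_distance < required_pad:
        messages.append("padding may be too small for the lowest frequency")
    if cell_size > max_cell_size:
        messages.append("discretization may be too coarse near sources/receivers")
    for msg in messages:
        warnings.warn(msg)
    return messages

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    msgs = validate_simulation(pad_distance=500.0, required_pad=1500.0,
                               cell_size=10.0, max_cell_size=25.0)
```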
E: Yeah, and then there are a whole bunch of options — MPI, multiprocessing, threading-only options. That's okay; it depends on what you want to do, that's fine. But the hard part was how to fit that into a sane syntax structure. That was really where I was having a hard time. How did you do that?
E
The
simple
the
simplest
example
that
I
was
thinking
is:
let's
say
you
got
like
there's
no
there's
no
connection
and
five
different
all
there
into
totally
the
time.
Let's
say.
Let's
say
we
got
the
previous,
the
main
game,
Table
five
agency,
and
then,
if
this
is
each
simulation,
it
I
just
need
to
pass
the
simply
require
that
we
are
all
the
required
information
and
that,
but
the
colonists
like
all
this
processing
packet
is
structured,
is
okay.
F: The awkward part is that once it starts parallelizing, it does a lot of memory management in the background, and before you start the parallelization you should have already designated how much each piece needs. So say the first payload you hand out comes to a hundred units or something like that; then you do that for the next one, and the next one, and the next one, before it actually starts the parallel execution.
F: So what you would do is call it — it wouldn't be p1 with an index, it would be your parallel problem with your input parameters — and you would have to designate your output, which would have, say, the equivalent number of cells. So each cell will have your predicted data, and then you'd have this appended and you should just be able to write it out.
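The pattern described above — independent payloads mapped over a pool, with one output cell per input — can be sketched as follows. For portability this sketch uses the thread-based `multiprocessing.dummy` stand-in; a real run would use `multiprocessing` or MPI, and the payload layout is an assumption:

```python
from multiprocessing.dummy import Pool  # thread-based stand-in for the sketch

def run_simulation(args):
    """Worker: receives everything it needs in one self-contained payload.

    Each payload is independent (no shared state), which is what makes
    the simple pool-map pattern work.
    """
    model, scale = args
    return [scale * m for m in model]

# Five totally independent simulations, e.g. five frequencies or lines.
payloads = [([1.0, 2.0], k) for k in range(5)]

with Pool(processes=4) as pool:
    dpred = pool.map(run_simulation, payloads)  # one output cell per input
```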
B: And especially because that's a piece that won't change, even if we do this forward-simulation upgrade, right? The objective function at this point — that's not changing. Or rather, we'll change some stuff under the hood, but its behavior — the fact that you can evaluate it, its derivative, and its second derivative — that doesn't change.
B: Yeah, and what's nice about that too, from the simulation side of things, is then having a utility that knows how to split up the problem in a sane way — like, for the frequency domain, split it up by frequencies and then just create the separate objective functions and parallelize over those. That's an easy step. So.
F: So that's sort of what I just worked through with one of those examples, I guess. There's that one example with three lines: you have a survey object, a class for each survey line, and then, instead of combining them and inverting, you just take each one of those and invert it. And then — yes, yes, so that'll be — yeah, I guess I'm just going to find a place to start from what we worked on.
B: So here, all this is is a list of objective functions, okay, and the steps that we can parallelize are in just a few places — like this loop, basically, this guy here. All this is doing is looking at the list and saying "please evaluate each one of these terms", and then the derivative and the second derivative are doing exactly the same thing. So these are the places to look.
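The "list of objective functions" with identical loops for the value and its derivatives can be sketched like this (an illustration of the pattern, not SimPEG's actual class):

```python
class ComboObjective:
    """Sketch of a summed objective: phi(m) = sum_i phi_i(m).

    Evaluation and the derivative are the same loop over a list of
    independent terms, which is exactly where a parallel map can slot in.
    """

    def __init__(self, terms):
        self.terms = list(terms)  # each term: (phi, dphi) callables

    def __call__(self, m):
        # The loop a parallel map could replace.
        return sum(phi(m) for phi, _ in self.terms)

    def deriv(self, m):
        # Identical structure: evaluate each term, then combine.
        return sum(dphi(m) for _, dphi in self.terms)

terms = [(lambda m: m ** 2, lambda m: 2 * m),
         (lambda m: 3 * m, lambda m: 3)]
combo = ComboObjective(terms)
value, slope = combo(2.0), combo.deriv(2.0)
```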
E
That
what
makes
it
complication
to
me
because
then
instantly
simple,
but
the
under
the
would
a
lot
of
things
going
on
and
how?
How
do
you
actually
really
pass
an
information?
It's
like
you
really
need
to
kind
of
explicit
about
passing
that
information
to
your
wall.
Walker.
Yes,
that's!
That's!
That's
what
I
having
a
hard
time,
because
although
that's
an
objective
function
but
below
that
problem,
yeah
survey
and
then
get
all
of
those
information,
it
needs
to
be
a
certain
container
and
then
need
to
pass
yeah.
So
the
thing
that.
B: It's actually a vector that describes the 3D model, and then each simulation essentially has its own mapping. So this projection basically takes your 3D model and projects it onto your local model, and then that simulation can evaluate everything itself, including how the derivative goes. And so in this case we could actually do that at the objective-function level, where each simulation then only sees its own resistivity.
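The global-to-local projection described here can be sketched with simple index slicing standing in for the projection matrix P (with a linear P, the local model is P @ m and the chain rule just inserts P into the derivative):

```python
import numpy as np

def local_models(global_model, index_sets):
    """Project one global model onto per-simulation local models.

    Each simulation only ever sees its own slice, e.g. the 1D column of
    resistivities under one sounding. Index-based slicing stands in for
    a sparse projection matrix here.
    """
    m = np.asarray(global_model, dtype=float)
    return [m[np.asarray(idx)] for idx in index_sets]

m_global = np.array([10.0, 20.0, 30.0, 40.0])
locals_ = local_models(m_global, index_sets=[[0, 1], [2, 3]])
```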
E: So once you've got that, it stops being scary. That's what I was working on, and where he's thinking about parallelization at the global level, I was thinking about it at a slightly smaller level: okay, I've got a global problem that contains all the information, and I can parse it out to the workers. Exactly. So that's kind of what I want to work on at this point — the mechanics at that level — because it's a little bit simpler there.
E: What's happening here, so f is — let's say you've got a DC problem with a thousand sources, and yeah, it generates the fields, the potential everywhere in your mesh. Okay, say that takes about ten [units of memory], and then the issue is I have all of those potential values, and then here the sensitivity routine is using that potential, because evaluating the sensitivities requires the fields.
B: Go up a level, yeah. So instead of going down and trying to take a problem and then split it up, I think if we start at the top level, then later on we can write utilities that know how to split up a simulation sanely. I think that's going to be a lot easier. Just thinking about ease, basically: each objective function is a completely independent thing at this point, and then we can take problems later on and actually split them up the other way.
B
Okay,
because
if
we,
if
we
sold
this
level,
then
because
every
problem
gets
going
to
be
sort
of
unnatural
way
to
split
it
up
right,
but
if
it
solves
at
this
level,
then
all
the
only
thing
that
we
had
R
and
s
per
game.
It's
kind
of
split
up,
MT,
vs.
DC
versus
yeah.
Most
of
the
things
can
be
handled
here
at
this
level.
B: We consider m the model that we're inverting for, and then the mapping is any steps that you take to get from that back to a physical property on our mesh. So if you're inverting for log conductivity below the surface, your mapping then turns that into conductivity, and it gives you the derivative as well.
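The log-conductivity mapping described here follows the usual exponential-map idea: sigma = exp(m), with derivative diag(exp(m)) supplied for the chain rule. A minimal sketch (the class itself is illustrative):

```python
import numpy as np

class ExpMap:
    """Mapping from log-conductivity (the inversion model) to conductivity.

    sigma = exp(m); the mapping also supplies d sigma / d m = diag(exp(m))
    so the chain rule can be applied in the sensitivities.
    """

    def __call__(self, m):
        return np.exp(np.asarray(m, dtype=float))

    def deriv(self, m):
        return np.diag(np.exp(np.asarray(m, dtype=float)))

mapping = ExpMap()
m = np.log(np.array([1e-2, 1e-1]))   # model: log conductivity
sigma = mapping(m)                   # physical property: conductivity
```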
B: So everything there is fairly straightforward — it's all factors of two — except the random 3D one. What I believe is going on there is that the underlying mesh is different: there are different cell widths in each dimension, so h is not the same for all of them, and it's stretched in some dimensions and not others. What it should be testing is that both of these converge at second order.
F: But how did you want it built — do you guys have specific file formats, or should I just build it so you can take a data file, read it in, and then it operates on that? What do you read in — what do you use daily? You use the one that comes in as a text file? Yeah, we created that one.
F: Yeah, so this is just the main-page intro. Nothing too fancy with the parallelization in this code, because we just have a source and two dipoles, so it's simple parallelization over all the sources — just a simple parallel for loop — and then some survey-specific things, and then again, this is where we've got to do some subtraction of the time series. So this chunk is, yeah —
F
This
is
where
it
actually
just
makes
so
I've
got
how
its
structured
is.
Is
we
have
things
about
patches,
which
is
what
would
be
like
a
line
and
we're
like
a
survey
day
kind
of
thing
and
then
patches
hold
readings
which
would
be
the
equivalent
of
your
sources,
and
then
each
source
has
another
subclass
of
dipoles,
and
so
right
now
we're
cycling
through
all
the
sources
and
we're
gonna
initiate
a
dipole
and
to
do
that,
there's
again,
there's
some
do
specific
stuff.
F: So essentially what's happening here in the code is I'm taking that one big, long time series and finding — the sort of two-second time base of 600 samples is going to be your max. Okay, yeah. I take that whole vector and then I reshape it into a matrix so that each column of the matrix is one half cycle. And then it's also creating your kernel stacking window, which by default is just a boxcar — brute force.
F
One
is
just
kind
of
like
a
a
bell,
a
bell
curve
and
then,
if
I've
also
allowed
it
so
then
you
can
play
around
with
that.
So
you
can
use
different
type
of
when
doing
together.
So
then
you
can
knock
off
frequencies,
undesired
frequencies,
lower
tier,
so
someone
so
in
that
case,
yeah
I
have
all
the
options
just
just
like
before
right
now,
yeah
just
handing
a
helpless
my
geyser
now
Jim
Kevin,
nothing,
nothing
too
fancy,
but
it
seems
to
do
the
trick
very
well.
So
it's
not.
F: We take each of those columns, then we throw this filter kernel at them, and when you do the matrix multiplication it ends up with just one — sorry, a vector that's the size of your half cycle. That's the stacked average, and hopefully all your noise is also removed, in a perfect world.
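The reshape-and-multiply stacking described above can be sketched as follows; this is an illustration of the idea (boxcar by default, or a taper over the half cycles), not the production code:

```python
import numpy as np

def windowed_stack(series, n_samples, window=None):
    """Reshape a long time series into half-cycle columns and stack them.

    Each column of the matrix is a successive half cycle; multiplying by a
    normalised kernel averages the columns into one stacked half cycle.
    """
    x = np.asarray(series, dtype=float)
    n_cycles = x.size // n_samples
    # Columns of M are the successive half cycles.
    M = x[: n_cycles * n_samples].reshape(n_cycles, n_samples).T
    if window is None:
        window = np.ones(n_cycles)   # brute-force boxcar
    w = np.asarray(window, dtype=float)
    w = w / w.sum()                  # normalise so the result is an average
    return M @ w                     # one vector, the size of a half cycle

series = np.tile([1.0, -1.0, 2.0], 4)   # four identical "half cycles"
stacked = windowed_stack(series, n_samples=3)
```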
F: Yeah, and so the rest of them all follow the same pattern — no fancy loops, simple matrix multiplication — and you end up with — assuming you have uniform sampling? Yes, and if the sampling does drift off, it's corrected for in the pre-processing, but ideally, yeah. 99.9 percent of the time it's on, but sometimes you don't notice the odd one — you look at the time series and all of a sudden it jumps back like that, where they decided to re-sync the time.
F: Yeah, so the brute-force one is typically just straight, with two little curves right at the ends, whereas if you start multiplying by these little Hanning windows, then you can start getting a bit more of a relaxed curve, and you can focus in on some frequencies that you want to notch out. For example, we had one case where there was a seven-and-a-half-hertz signal that was causing trouble and we wanted it gone.
F
F
F
Well,
I
think
about
where
it
is
I'll
just
continue.
My
mantra
here
so
after
the
stacking
is
done.
Yeah
we
just
do
the
we
calculate
the
VP
in
the
last
I
guess
this
is
from
50%
to
90%
of
the
long
time
and
that's
just
yeah.
Actually
that's
one
thing:
I
have
infect
your
eyes,
go
back
and
do
that
and
then
we
come
down
here.
F
This
sorry
yeah
doing
the
case
couple.
They
need
the
case
from
that
step.
I
used
to
have
another
piece
under
here
that
was
just
kind
of
a
group
for
us
and
it's
just
doing
an
average
over
each
more
than
their
loot,
each
one
of
individual
windows
from
them.
Okay,
whereas
this
get
tapered,
DK,
I'm
kind
of
applying
that
same
idea
of
the
staffing.
F: It reduces that, yeah. That's what this code here does: it takes your decay — you've got your little window chunk — and it just does another little Hanning taper on top, and so it leaks into the other windows, but those ones are weighted so much lower that, yeah —
F
Guess
when
you
think
about
it,
you
can
pass
a
straight
line
if
you're
just
a
virgin
/
to
kind
of
passing
all
frequencies,
whereas
if
you
give
it
a
little
bit
of
a
way
to
decay
earth,
if
you
waited
descending
into
do
anybody
Windows,
then
you're
actually
don't
bring
out
all
those
higher
frequencies
up
to
be
it's
kind
of
moving
average
in
time.
So
I
guess.
F
E
G
F
F
D
F
Create
window
actually
that's
what
I
was
I
was
gonna
go
to
next
was
I'm,
just
gonna
make
it
a
nice
little
separate
function
so
to
having
built
into
the
actual
title
builder.
It
would
just
call
it
get
just
got
any
Biscay
that
need
that,
but
again
part
of
the
code
a
little
bit.
It
was
very
very
times
that's
a
new
thing.
F: Yeah, right through here — we just give it an 8 by 8 matrix of a bunch of chargeabilities and taus, and I just kind of iterate through them, trying all of them on every one of those half cycles, and whichever half cycles don't have a good Cole-Cole fit, it just knocks out, and then everything else is averaged. Oh, the 8 by 8 is totally arbitrary — it's just a bunch of values that we found work for a range.
F
There
were
enough
of
a
range
that
it
was
fitting
to
all
of
our
data
that
we're
throwing
in
it.
So
it's
just
a
it's
just
yeah
a
matrix
of
cows
and
charge
abilities
that
go
into
the
cocoa
equation
and
then
yeah.
So
we've
won
through
all
of
those.
Perhaps
I
go
to
see
if
we
can
get
what
kind
of
Kokomo
curve
that
we
can
fix
with
it.
And
yes,
and
if
none
of
those
8
work.
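The grid-search-and-reject idea can be sketched like this; the template `eta * exp(-t / tau)` is a simplified stand-in for the actual Cole-Cole curve, and the grid values here (like the 8 by 8 grid in the talk) are arbitrary:

```python
import numpy as np

def best_fit_decay(t, decay, etas, taus, min_r2=0.9):
    """Grid-search a small library of template decays; reject bad fits.

    Each (eta, tau) pair gives a template eta * exp(-t / tau), a simplified
    stand-in for a Cole-Cole decay. A half cycle whose best R^2 falls below
    `min_r2` would be knocked out before averaging.
    """
    best = (None, -np.inf)
    for eta in etas:
        for tau in taus:
            pred = eta * np.exp(-t / tau)
            ss_res = np.sum((decay - pred) ** 2)
            ss_tot = np.sum((decay - decay.mean()) ** 2)
            r2 = 1.0 - ss_res / ss_tot
            if r2 > best[1]:
                best = ((eta, tau), r2)
    params, r2 = best
    return (params, r2) if r2 >= min_r2 else (None, r2)

t = np.linspace(0.01, 1.0, 50)
clean = 2.0 * np.exp(-t / 0.3)          # a noise-free decay for the demo
params, r2 = best_fit_decay(t, clean, etas=[1.0, 2.0], taus=[0.1, 0.3])
```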
E: The question is, when you're fitting that, is it after stacking or before stacking? F: That's before stacking — so before stacking them, you can reject bad half cycles. Yeah, exactly. And then — oh sorry, yeah, sorry — so you'd stack first and then do a statistical rejection. That's the statistical rejection; after that you just get a pretty clean decay.
F: So there are two different flows. The first one is you take the time series, you stack it, and then you compute your decay. The other one is ensemble stacking: you break the time series into, say, five chunks, then you stack each of those five chunks, average them together, and then get your decay.
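The ensemble-stacking flow just described can be sketched as follows (a plain mean over half cycles stands in for the per-chunk stack):

```python
import numpy as np

def ensemble_stack(series, n_chunks, n_samples):
    """Ensemble stacking: split the series into chunks, stack each chunk's
    half cycles, then average the per-chunk stacks together."""
    x = np.asarray(series, dtype=float)
    chunks = np.array_split(x, n_chunks)

    def stack_chunk(chunk):
        # Stack one chunk: average its half cycles sample-by-sample.
        n = chunk.size // n_samples
        return chunk[: n * n_samples].reshape(n, n_samples).mean(axis=0)

    return np.mean([stack_chunk(c) for c in chunks], axis=0)

series = np.tile([1.0, 3.0], 10)   # ten identical two-sample half cycles
decay = ensemble_stack(series, n_chunks=5, n_samples=2)
```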
F
The
last
one
is
statistical
rejection,
which
is
the
whole
time
series,
oh
yeah,
and
it's
all
raw
and
yeah.
You
just
cut
it
up
into
it's
half
cycle
chunks
and
fitting
Coco
there's
do
it.
There
are
best
fitting
colors
and
then
they
average
together.
Then
here
you
can
I
think
you
can
call
that
stack
the
multiplication
stuff,
the
actual
stack.
F
D
B
F
G
B
C
E
E
A
F
A
A
A
F
I
believe
there's
there
should
be
some
data
kicking
around
for
that
one.
He
is
didn't,
do
it,
but
I
feeling
the
story
might
might
have
done.
Something
like
that.
E: So there might be an idea there — they're pretty excited about it, and then we're realizing that this kind of approach might be effective. Although I guess that was pretty well known 30 or 40 years ago: people were validating all of this, writing papers left, right, and center about IP, and figuring out the effects of casing on EM.
F: And then on the error side, Jonathan says that that's okay, but we're just kind of working through it — it's giving us some problems today, and there are some weird data points that we're still going to chase down. So when we get that into a presentable state — I don't know if they want us to post that one, but you guys would be able to see it.